It's sad to say, but having been born and raised in the United States, I vaguely remember the phrase "diversity in the workplace" only becoming important after people won lawsuits proving "discrimination in the workplace". Isn't that sad? (My opinion.)
♦
"Workplace diversity training first emerged in the mid-1960s following the introduction of equal employment laws and affirmative action. Prior to this, many companies had known histories of racial discrimination."
REF: The History and Growth of the Diversity, Equity, and Inclusion Profession (https://insights.grcglobalgroup.com/the-history-and-growth-of-the-diversity-equity-and-inclusion-profession/)
♦
"Diversity is not just a moral issue, but also a business issue. Although a spotlight has been shone on diversity issues in the workplace in the past decade, workplace diversity calls-to-action had not started to emerge until the mid-1960s."
REF: Evolution of Workplace Diversity — Brief History | AllVoices (https://www.allvoices.co/blog/evolution-of-workplace-diversity)