Metrics, benchmarking, and indexing, oh my! What they reveal - and what they don’t - about diversity, equity, and inclusion
Dr. Kristen Liesch
Helping Leaders in Higher Ed and Beyond Drive Real Change | Forbes D&I Trailblazer | Co-Creator of Equity Sequence™ | Strategic Advisor to Progressive Deans | Expert in Equitable Innovation & Inclusive Strategy
In 1967, Christine Mann Darden - a Black woman - joined NASA as a data analyst.
It didn’t take long before she experienced and observed inequity in her workplace. What she saw was that “men with math credentials were placed in engineering positions, where they could be promoted through the ranks of the civil service, while women with the same degrees were sent to the computing pools, where they languished until they retired or quit.”
In Data Feminism, Catherine D’Ignazio and Lauren F. Klein share Darden’s story and explain how, when Darden brought this observation to her boss, she was told, “Well, nobody’s ever complained,” and her own complaint led to nothing. Later on in her career, Darden continued to experience inequity, watching as her male counterparts received promotions far more quickly. Finally, together with a White woman named Gloria Champine, Darden brought a bar chart to the director of her division who was “shocked at the disparity” and gave Darden a promotion.
Darden had experienced bias, discrimination, and inequity in her workplace. Racism. Sexism. She knew how it felt, she knew how it showed up. And yet her lived experience didn’t count until it was “proven” with data.
Today, organizations are turning to sophisticated people analytics to produce graphs, metrics, dashboards: visual representations of human experience. Or so it seems. Right there in the D&I toolkit, alongside unconscious bias training, diversity training, sponsorship programs, and ERGs, sits the "diversity and inclusion dashboard," with its HRIM-integrated, AI-backed natural language processing (NLP) that promises to take a wealth of quantitative and qualitative data and transform it into actionable insights.
But what story does the D&I dashboard really tell?
A better question might be, what story are your D&I metrics hiding?
#PotentialEvaluation #LeanHiring #TalentDevelopment Reliable Swiss software & Management Expert System
4y Great article, but it is not impossible to have hiring algorithms designed to evaluate human potential by success factors, ones that are also ethical and can foresee who the best candidate for the future is. We shouldn't be so biased as to believe we can't escape being biased. Keep in touch for more information please, this is our business www.leantalentsystems.com
Executive Leadership Accelerator and Advisory for Digital Transformation and Technology Leaders. | Author, Keynote Speaker, Consultant | Founder and CEO
4y There’s data and then there’s data. Do we clearly understand the implications behind the AI, what the algorithms are looking for and doing? Are there ethical checks and balances throughout development? A hiring algorithm may learn to look only for white Stanford grads because that’s what the data says the company has hired in the past; that makes it efficient but totally biased, as it learns from biased data. Good article, thank you.
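The "bias in, bias out" dynamic this comment describes can be shown with a toy sketch (hypothetical data and a deliberately naive scoring rule, not any real hiring system): a model that scores candidates by how often their profile appeared among past hires simply reproduces the skew in its training history.

```python
from collections import Counter

# Hypothetical, skewed hiring history: 90 past hires from one school,
# 10 from another. Any frequency-based model will inherit this skew.
past_hires = ["stanford"] * 90 + ["state_school"] * 10

def fit_score(history):
    """Return a scoring function based purely on historical hire frequency."""
    counts = Counter(history)
    total = len(history)
    return lambda school: counts[school] / total

score = fit_score(past_hires)

# Two equally qualified candidates receive very different scores,
# solely because of where past hires came from: bias in, bias out.
print(score("stanford"))      # 0.9
print(score("state_school"))  # 0.1
```

Nothing in the "model" encodes candidate quality at all; it only encodes the past, which is exactly why auditing the training data matters as much as auditing the algorithm.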
Strategic Advisor | Consultant focusing on equity and inclusion and projects with positive social impact | Board Member | Host of the If I Had Been Born A Girl podcast | Embracing my inner millennial
4y Another great article, Kristen. And thank you for slipping in the link to your article about Ros Atkins. I already knew of and admired the 50:50 Project, but I didn't know the details of its origin story. Fascinating, and, as you say, a perfect case study for exploring strategies for systemic, equitable change.