Building Diversity into Artificial Intelligence Software Development


The Problem

Many people are now discussing diversity, unconscious bias, and artificial intelligence in the same breath. Some of the most prominent examples of where AI, if these issues are ignored, can have a counterproductive impact on minority communities include its use in the criminal justice system, facial recognition, and voice recognition. And the list is growing as the applications for machine learning and artificial intelligence continue to expand. The question many organizations are beginning to ask is how to address diversity as they develop their AI.

Table Stakes--The Team and The Data Sets

The most fundamental place to begin is with the team. It is impossible for someone to truly represent a point of view by proxy. As a result, having a software development team that is diverse across multiple dimensions (ethnicity, sex, ability, age, etc.) is essential to ensuring that many points of view are truly integrated into the product. And to create a diverse team, an organization has to be deliberate--reconsidering how it fills its candidate pool at the top of the funnel, ensuring that its interview processes are fair and that no group drops out of the process at a higher rate than others, and creating an environment where a diverse workforce will thrive.

Data sets are fundamentally important to diversity--the considerations of different groups cannot be taken into account if those groups are not represented in the training data for applications. Including a wide range of cases can also serve as a source of innovation--leading to unexpected discoveries or applications that we would not otherwise have considered. We spend most of our time emphasizing the remedial aspect of diversity analyses (guarding against bias), but there is also opportunity that we may be missing out on.
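To make the data-set point concrete, here is a minimal, purely illustrative sketch (in Python) of what a first-pass representation check might look like: it compares each group's share of a training set against a reference distribution and flags groups that fall well below it. The record structure, group labels, reference shares, and threshold are all assumptions for the example, not a description of Dropbox's tooling.

```python
# Purely illustrative sketch: a first-pass check of how well a training set
# covers different demographic groups, relative to a reference distribution.
# The key name ("group"), reference shares, and the 0.8 threshold are all
# assumptions for the sake of the example.
from collections import Counter

def representation_report(records, group_key, reference_shares, threshold=0.8):
    """Flag groups whose share of the training data falls well below a
    reference share (e.g., census or user-base figures)."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    report = {}
    for group, expected in reference_shares.items():
        observed = counts.get(group, 0) / total if total else 0.0
        report[group] = {
            "observed_share": round(observed, 3),
            "expected_share": expected,
            # Flag if the group appears at less than `threshold` of its expected rate.
            "underrepresented": observed < threshold * expected,
        }
    return report

# Hypothetical usage with made-up data and reference figures.
training_rows = [{"group": "A"}] * 900 + [{"group": "B"}] * 80 + [{"group": "C"}] * 20
reference = {"A": 0.60, "B": 0.25, "C": 0.15}
for group, stats in representation_report(training_rows, "group", reference).items():
    print(group, stats)
```

In practice, choosing the reference distribution and deciding what counts as "underrepresented" are judgment calls, which is exactly where the kind of diverse review group described below can weigh in.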

Embedding diversity into the product development cycle

The more subtle, but significant, challenge is how to embed diversity into the software development cycle. We experimented with it at Dropbox this past year and learned a great deal about the process.

Identifying the right projects, at the right time

For Dropbox, many of the areas where we use machine learning and AI do not have bias or diversity implications. So taking the time to think through and find the right products or features for analysis is the initial step. We also needed to determine the correct stage in the cycle to conduct a bias or diversity analysis. Too late, and there is little time to adjust based on any feedback; too early, and it is difficult to even know the right diversity or bias questions to ask.

Preparing people to analyze for bias or diversity

For our pilot, we recruited volunteers from our seven employee resource groups (ERGs) to ensure a wide range of mindsets. We took them through an interactive training where we explored the dimensions of diversity and discussed specific methodologies for divergent thinking so they would have frameworks and tools to at least begin asking questions. Most importantly, the product manager prepared and presented a brief for the volunteers on the product, its stage in development, and an initial sense of the diversity questions to be considered. The volunteers were then given 1-2 weeks to reflect and develop feedback, with the instruction that they shouldn't limit their thinking, but should be certain to reflect on the specific implications for their identity group.

The Payoff--The Feedback Session

The product manager then hosted a debrief session where she heard the volunteers' questions and diversity comments related to the product. In our first pilot, the volunteers struggled, but in our second pilot (with some additional preparation) the feedback was very valuable. We also included another key member of the product team in the second debrief session, which made a difference in its quality. We are excited about what we learned and about embedding it into our cycle for the long term.

The Hard Part--Making it sustainable

We learned so much last year, but there are three elements that we know will be crucial to making this a permanent part of our cycles.

Product Manager buy-in

We were fortunate for our pilots to have a very forward-looking product manager who had been a vital partner from the beginning in developing the process. However, we know that we will need all product managers to approach their work with the same mindset and enthusiasm to make this sustainable. Based on our early discussions, they are supportive of the concept, but we know it will take some time to get them to the same place as our founding product manager.

Triggering Mechanism for Diversity Review

We have partnered with our legal team to identify the right time in the cycle for a product manager to determine whether something warrants a diversity review. We'd like teams to eventually treat it much like their privacy and other reviews, yet conduct it early enough that any feedback can be incorporated.

Review Group

Because this is a pilot, maintaining a review group that has the interest, training, and time will be crucial. Eventually the review should be integrated into the product teams themselves for long-term sustainability, but for now, keeping a group on call and prepared for when a project is identified in its early stages is another foundational step.

Final Thoughts and an invitation

I am a believer in what we've done, but I know that we can improve on it. We conducted a great deal of research to find existing models, without any luck, so this was our best (and first) attempt. If you have tried something similar or have insights, we'd welcome them; we think this work is important enough to open source so we can all get better together. Integrating diversity into the product development process is a responsibility that we as technology companies should take as seriously as privacy--and if we embed this thinking early, we will save ourselves from course correcting later on.
