Breaking algorithm-driven behaviors - Part Two.
Meenakshi (Meena) Das
CEO at NamasteData.org | Advancing Human-Centric Data & AI Equity
Welcome to Data Uncollected, a newsletter designed to enable nonprofits to listen, think, reflect, and talk about the data we missed and are yet to collect. In this newsletter, we will talk about everything raw data is capable of – from simple strategies for building equity into research+analytics processes to how we can build better communities through purpose-driven analysis.
Today we are continuing what we started last week – you and I probing our algorithm-driven behaviors.
I like to define algorithm-driven or algorithmic behaviors as us (humans) adopting intended or unintended behaviors that gravitate towards the actions expected by the algorithms. Now, do all such behaviors need to be kept in check? Not necessarily, no. To be fair, I am specifically talking about those that lead to unintentionally perpetuating the gaps and biases in data.
Last week, we started this discussion with examples of algorithm-driven behaviors – both within the nonprofit industry and outside. Within the nonprofit industry, some examples you and I listed were:
To be clear – these unintentionally picked-up behaviors do not come from a place of malicious intent. I believe what we lack here is purpose, clarity, and at times, the intentionality behind the ways we co-exist with the algorithms.
Then, the question is, what can we do? Or is there anything we can do? This week, let us focus on answering this question for those behaviors specific to our (nonprofit) industry.
Here goes the first draft of the list for breaking away from algorithmic behaviors such as those listed in the examples above:
#1. Create in-house modeling awareness: Regardless of whether you want to buy a donor segmentation product from the market or leverage in-house modeling skills, invest time for your team to at least build a proof of concept for the task. Of course, this translates differently depending on your specific modeling objectives. A proof of concept is the exercise of designing the bare-minimum solution for an objective to confirm or reject whether that objective is feasible. Building a proof of concept allows you to understand the entire lifecycle, from problem definition to algorithm output.
For example, if you intend to use predictive analytics for your mid-level or major gift donors, invest some time to perform this exercise in-house. It will give you an in-depth idea of how to approach the modeling, from defining an objective function to the data you must feed into such models. This step will also build collective awareness of which data points in the database could be collected better.
#2. Design consciously adopted mechanisms for community involvement: Communities most affected by an outcome must be placed at the core of the solution design. That means involving your community at every step of the solution. Create space for the community's voice to guide your process, not just be the audience for the result.
For example, building user personas of supporters from diverse communities has to include people directly from those communities. No approximation or guesstimation of their motivations/interests should be used as a proxy.
Another example would be including your community's input in your funding/grant applications, so that their voices and perspectives guide the strategy rather than appearing only in the evaluation metrics at the end.
#3. Promote a codesign-led data culture: That means promoting a culture of authentic and transparent joint conversation between three parties:
Allow conversations in this space to be led humanly and vulnerably, so everyone can bring their authentic feedback, experience, and knowledge of working with the data and algorithms you all use every day. This can help you understand the impact and implications of the algorithms you use, and give the technologists in the room a chance to hear how the algorithmic solutions could improve.
For example, hold a 60-90-minute conversation every quarter, depending on your organization's size, needs, and timing, to initiate such a codesign-led data talk.
And perhaps through these conversations, you will realize that simply bringing in a diverse audience does not, by itself, lead to diverse perspectives on data. These conversations allow you to learn about and explicitly specify (as needed and relevant) gender identity, sexual orientation, race/ethnicity, age, nationality, language, immigration status, and other identity-related components. Without such explicit specification, our algorithm-driven behavior often pushes us towards a dominant or homogenous social group.
#4. Invest in continuous educational opportunities (internal and external): Though this is somewhat related to the point above, it is a bit more than that. To recognize the indicators of algorithm-driven behaviors that can lead to biases in data, you and I need continuous education, from sources both within and outside our organizations.
For example, when you plan your budget for conferences and learning sessions over the year, include learning opportunities related to data fundamentals, cultural competencies, community needs, and so on. That is, areas that may not be directly related to your job description but that tangentially give you the knowledge to think about the data and algorithms in your work.
#5. Pause before pushing new (data-driven) "innovation": If you feel an urge or pressure to push for new (data-driven) "innovation", pause. You don't necessarily need one more dashboard, one more "all-in-one" report, or that one magic metric. I know it feels like you need to do something more, something extra, to be seen as innovative, but you don't.
Your innovation also comes from cutting out dashboards that don't work, re-evaluating the formulas behind metrics, or auditing what tech you have vs. use vs. need. Evaluate whether you genuinely need something new or need to remove or adjust something you already have.
For example, our behavior can often be attributed to the products we have been using in the organization. Before pushing for a new product, if you collectively establish what your existing products do and how they impact your internal and external teams, you will have a much better chance of catching any unhealthy, unintended behaviors picked up along the way.
*********************************
To remind you: across this two-edition, 2,500-word discussion on algorithmic behaviors, I am not asking you simply to push back on algorithms. They are, after all, only sets of rules behind our technologies. I am, however, saying that what gives power to those algorithms is our individual and collective behaviors.
Our behaviors lead to (sometimes) predictable and repeatable decisions, strategies, and actions from those algorithms. Some of those predictable and repeatable actions perpetuate biases. So, let's take conscious actions that lead to better data and better algorithms.
Break those algorithm-driven behaviors you find it necessary to break.
*** So, what do I want from you today (my readers)?
Today, I want you to
*** Here is the ongoing prompt for us to keep the list of community-centric data principles alive.
*** For those reading this newsletter for the first time, here is an intro to this newsletter for you. :)