Fairness reviews: identifying essential attributes
[Originally published in Oct 2024 here, where an audio version is also available]
In a previous article, we discussed fairness in algorithmic systems, equity, and equality.
When we're checking for fairness in our algorithmic systems (incl. processes, models, rules), we often ask:
What are the personal characteristics or attributes that, if used, could lead to discrimination?
This article provides a basic framework for identifying and categorising these attributes.
Anti-discrimination laws exist in most jurisdictions, so that's a good place to start.
If none apply to your country (e.g., South Korea, Japan), you could use existing human rights laws, or perhaps one of the international covenants or conventions.
The legal landscape
There's no shortage of definitions when it comes to discrimination.
For example, in Australia, there are at least 5 relevant federal laws.
Each state and territory has its own set of rules.
The definitions vary, but there's some overlap.
One example of a definition is detailed in the 2014 guide produced by the Australian Human Rights Commission, "A quick guide to Australian discrimination laws":
The Australian Human Rights Commission Act 1986 specifies "Discrimination on the basis of race, colour, sex, religion, political opinion, national extraction, social origin, age, medical record, criminal record, marital or relationship status, impairment, mental, intellectual or psychiatric disability, physical disability, nationality, sexual orientation, and trade union activity."
That's a lot to take in, and it's just one definition.
Simplifying the approach
To make this easier to work with, we can group these attributes into five main categories.
Each category contains several attributes.
Detailing the attributes can help provide context and support our efforts to reduce bias (a simple sketch of how the categories might be used follows the list below):
1. Age
2. Race
3. Sex / gender
4. Disability
5. Activity / beliefs
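As a practical starting point, a review checklist along these lines can be expressed in code. The sketch below is illustrative only: the category-to-attribute mapping and the column names are assumptions, and real reviews also need human judgement about proxy variables (such as postcode) that are not flagged by name matching alone.

```python
# A minimal sketch (assumptions only, not a complete review tool): map the
# five categories above to example attribute names, then flag any dataset
# columns that appear to match. The attribute lists and column names here
# are illustrative; proxy variables still require human judgement.

PROTECTED_ATTRIBUTE_CATEGORIES = {
    "age": ["age", "date_of_birth"],
    "race": ["race", "ethnicity", "nationality"],
    "sex_gender": ["sex", "gender"],
    "disability": ["disability", "impairment"],
    "activity_beliefs": ["religion", "political_opinion", "trade_union"],
}

def flag_protected_columns(columns):
    """Return dataset columns that appear to map to a protected category."""
    flagged = {}
    for category, attributes in PROTECTED_ATTRIBUTE_CATEGORIES.items():
        matches = [c for c in columns if any(a in c.lower() for a in attributes)]
        if matches:
            flagged[category] = matches
    return flagged

# Example: column names from a hypothetical credit-decision dataset
print(flag_protected_columns(["applicant_age", "income", "gender", "postcode"]))
# {'age': ['applicant_age'], 'sex_gender': ['gender']}
```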
Additional considerations
There are a few more attributes to think about; they are less frequently observed, but still need to be considered:
Putting it into practice
Consider whether, and how, each of the attributes might be influencing decisions.
Some key questions to ask:
Going a bit deeper, we may ask:
Regularly revisiting these questions helps ensure our systems remain fair and equitable as they evolve.
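One way to support that regular revisiting is to compare decision rates across groups of a single attribute. The example below is a rough illustration, not a prescribed method: the records, field names, and the simple "disparity ratio" summary are assumptions, and a real review would use the organisation's own data and agreed fairness criteria.

```python
# A rough illustration (assumptions only): compare approval rates across
# groups of one protected attribute and summarise the gap as a ratio.

from collections import defaultdict

def approval_rates(records, group_key, outcome_key):
    """Return the approval rate (approvals / total) for each group value."""
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for record in records:
        group = record[group_key]
        totals[group] += 1
        approvals[group] += int(record[outcome_key])
    return {group: approvals[group] / totals[group] for group in totals}

# Hypothetical decision records from an automated credit process
decisions = [
    {"age_band": "under_40", "approved": 1},
    {"age_band": "under_40", "approved": 1},
    {"age_band": "40_plus", "approved": 1},
    {"age_band": "40_plus", "approved": 0},
]

rates = approval_rates(decisions, "age_band", "approved")
disparity = min(rates.values()) / max(rates.values())
print(rates)                                 # {'under_40': 1.0, '40_plus': 0.5}
print(f"disparity ratio: {disparity:.2f}")   # a low ratio is a prompt for review
```

Similar comparisons can be run for each attribute category identified earlier, and repeated as the system changes.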
If you found this article helpful, feel free to share it with a colleague or friend.
Disclaimer: The information in this article does not constitute legal advice. It may not be relevant to your circumstances. It may not be appropriate for high-risk use cases (e.g., as outlined in The Artificial Intelligence Act - Regulation (EU) 2024/1689, a.k.a. the EU AI Act). It was written for consideration in certain algorithmic contexts within banks and insurance companies, may not apply to other contexts, and may not be relevant to other types of organisations.