Making Healthcare Technologies Accessible To All - Part 2: Considering a wider design scope with diversity and inclusion (a PharMedTech curtain-raiser)
Images are taken from sources presented as references in this article.

In the coming weeks, PharMedTech Magazine and Newsletter will be publishing a series of articles to serve as curtain-raisers to the formal launch of both publications. The key points and highlights of this article are:

  • For medtech, a spotlight on biased algorithms and the limited functionality of smartwatches on certain skin tones
  • Lack of diversity: addressing the gender, racial and ethnic issues in clinical trials
  • Regulatory approach: using regulatory requirements to address issues in the recruitment of clinical trial participants
  • Utilisation of strategies and sources of valuable demographic information to support decision-making from clinical trial data

Making healthcare technologies accessible to all is one of the universal goals of the World Health Organisation (WHO) through its Global Strategy on Digital Health and Public Digital Health Technology (PDH) initiative. The purpose of the Global Strategy on Digital Health is to promote healthy lives and wellbeing for everyone, everywhere, at all ages. To deliver on its potential, national or regional digital health initiatives must be guided by a robust strategy that integrates financial, organisational, human and technological resources.

With the advent of technological advancement in digital health and connected "smart" or "intelligent" medical devices, another spotlight falls on the biased algorithms of AI-enabled systems and devices. In healthcare there is little or no room for flawed systems or defective medical devices; yet there have been many reported cases of non-functioning devices, not because they are defective, but because they were not designed for, or tested on, certain skin tones. There are documented instances of wearables such as smartwatches and other devices working poorly, or not at all, on black or dark-toned and tattooed skin.

For the past years, efforts have been dedicated to studying developments and trends in medical technologies for the delivery of healthcare services. The global wearable technology market was valued at USD 32.63 billion in 2019 and is projected to expand at a compound annual growth rate (CAGR) of 15.9% from 2020 to 2027. The growing popularity of the Internet of Things (IoT) and connected devices, together with a rising technically literate population, is expected to drive demand.

With such scope for growth and demand, coupled with their applications in healthcare, wearable and connected intelligent medical devices are attracting increasing research interest "from concept to commercial". However, the appetite for growth in wearable technology is marred by issues of metrological accuracy, security and quality of data, and functionality.

Design flaws range from trivial ones to more serious ones that could lead to a fatal medical decision being made on the basis of a wrong device output. Examples of trivial design flaws include Zoom's virtual background and soap dispenser technology failing to recognise different skin tones.

There have been reports that Zoom's virtual background feature is not built for black or dark skin tones: black users fade into their Zoom backgrounds, supposedly because the algorithms cannot detect faces of dark complexions well or apply auto-adjustments for them. One might argue that, if a particular background does not work for you, it is fair enough to pick another suitable one.

The UK health secretary, Sajid Javid, has announced a review into systemic racism and gender bias in medical devices in response to concerns that they could contribute to poorer outcomes for women and people of colour.

Writing in the Sunday Times, Javid said: “It is easy to look at a machine and assume that everyone’s getting the same experience. But technologies are created and developed by people, and so bias, however inadvertent, can be an issue here too.”

"A picture is worth a thousand words", an English language adage that refers to the notion that complex and sometimes multiple ideas can be conveyed by a single still image, which conveys its meaning or essence more effectively than a mere verbal description. The following pictures speak volumes of where technologies have been limited in their intended use on certain skin tones from their fundamental designs to the rendition of their expected output:


A classic illustration of design limitation, unless of course the device is designed for white/lighter skin tones but not black/darker ones. Articles from The Economist highlight these issues and, further to the above, outline how widespread, and how serious, the lack of diversity can be.

A pulse oximeter measures blood oxygen saturation by passing a beam of light through the finger on which the device is worn. In people with dark skin, however, light transmission through the finger may be worse, as much of the light is absorbed rather than reaching the sensor for measurement; as a result, the pulse oximeter may report a higher saturation than the true value.
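
For context on how the measurement works, here is a minimal sketch of the classic "ratio of ratios" calculation behind pulse oximetry. The linear calibration constants below are illustrative textbook approximations, not any manufacturer's curve; real devices fit their calibration empirically on human volunteers, which is precisely where an unrepresentative study population can introduce bias.

```python
def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) from red and infrared photoplethysmography signals.

    ac_*: pulsatile (AC) amplitude at each wavelength
    dc_*: steady baseline (DC) level at each wavelength
    """
    # Normalising AC by DC is meant to cancel constant absorbers such as
    # skin pigment, but melanin absorbs differently at the two wavelengths,
    # so the cancellation is imperfect.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Common illustrative linear calibration: SpO2 ~ 110 - 25 * R.
    return 110.0 - 25.0 * r

# Example: a ratio of 0.6 maps to roughly 95% saturation.
print(spo2_estimate(ac_red=0.012, dc_red=1.0, ac_ir=0.02, dc_ir=1.0))  # 95.0
```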

Even if a pulse oximeter is not perfectly accurate, the dynamics of saturation can still indicate potential health problems. The blood oxygen level of a healthy person should be 95-99%, but with chronic diseases of the respiratory or cardiovascular system the figure can drop to 92-94%.

A drop in saturation is considered one of the manifestations of coronavirus disease, but it can also indicate respiratory failure of other kinds. In some cases saturation can imperceptibly reach a critically low 50-70%; this condition is called "silent hypoxia".
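
Taking the figures above at face value, a reading could be bucketed as in the sketch below. This is purely illustrative, not a clinical decision tool; note how a device that over-reads on darker skin can silently shift a reading into a healthier-looking bucket.

```python
def interpret_spo2(reading_pct):
    """Rough interpretation of an SpO2 reading, using the ranges quoted above."""
    if reading_pct >= 95:
        return "within the healthy 95-99% range"
    if reading_pct >= 92:
        return "92-94%: levels seen in chronic respiratory or cardiovascular disease"
    if reading_pct > 70:
        return "below 92%: possible respiratory failure"
    return "70% or lower: critically low ('silent hypoxia' territory)"

# A device that over-reads by a few points on darkly pigmented skin could
# report 95% when the true value sits in the 92-94% band.
print(interpret_spo2(93))
```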

A study using multivariate analysis indicated that oxygen saturation (SaO2) level, sensor type, skin colour and gender were predictive of errors in peripheral oxygen saturation (SpO2) estimates at low SaO2 levels. The data suggest that clinically important bias should be considered when monitoring patients with saturations below 80%, especially those with darkly pigmented skin.

The following startling image illustrates the deep-rooted biases of AI research: input a low-resolution picture of Barack Obama into an algorithm designed to generate depixelated faces, and the output is a white man.

No alt text provided for this image

But exactly what the Obama example reveals about bias, and how the problems it represents might be fixed, are complicated questions. Indeed, they are so complicated that this single image has sparked heated disagreement among AI academics, engineers and researchers. What sparked the larger arguments about the Obama image is that some researchers seem keen to address only the data side of the bias problem. Nor is this an isolated case: asking the same algorithm to generate high-resolution images of actress Lucy Liu or congresswoman Alexandria Ocasio-Cortez from low-resolution inputs produces faces that look distinctly white.

On a technical level, some experts are not sure when a dataset is biased. Some attribute the issues to the algorithm rather than the data, while others blame the combination of the two. But regardless of the cause, the outputs of the algorithm appear biased, something the researchers did not notice before the tool became widely accessible. This speaks to a different and more pervasive sort of bias: one that operates on a social level.
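
One way to make the "when is a dataset biased?" question concrete is to audit group representation directly. Below is a minimal sketch of such an audit; the demographic labels, counts and the 20% floor are hypothetical illustrations, not accepted standards.

```python
from collections import Counter

# Hypothetical metadata for a face dataset: one demographic label per image.
# In a real audit these labels would come from curated annotations.
labels = ["lighter"] * 8200 + ["darker"] * 1100 + ["other"] * 700

def representation_report(labels, floor=0.20):
    """Flag any demographic group that falls below a minimum share of the data."""
    counts = Counter(labels)
    total = sum(counts.values())
    for group, n in sorted(counts.items()):
        share = n / total
        flag = "  <-- under-represented" if share < floor else ""
        print(f"{group:>8}: {n:6d} images ({share:6.1%}){flag}")

representation_report(labels)
```

Even a dataset that passes such a check can still yield biased outputs, which is exactly the social-level point made above.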


In the case of soap dispenser technology, one may not have the option of using another if it is the only one available. Reports on biased technology and ethics in design reference a video, filmed at a hotel, that went viral on social media: an automatic soap dispenser is shown failing to detect a black customer's hand.

The dispenser used near-infrared technology to detect hand motions: the invisible light is reflected back from the skin, which triggers the sensor. Darker skin tones absorb more of the light, so not enough is reflected back to the sensor to activate the dispenser. This means that dark-skinned lavatory or restroom users have to skip washing their hands with this not-so-sensitive soap dispenser.
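
A minimal sketch of this failure mode, assuming a simple fixed reflectance threshold (the real sensor electronics are proprietary, and all values here are hypothetical), together with one possible mitigation: triggering on the change from an ambient baseline rather than on an absolute level.

```python
FIXED_THRESHOLD = 0.50   # dispense only when reflected IR exceeds this level

def fixed_threshold_trigger(reflectance):
    # Original design: absolute threshold on reflected infrared light.
    return reflectance > FIXED_THRESHOLD

def baseline_relative_trigger(reflectance, ambient_baseline, delta=0.05):
    # Mitigation: trigger on the *change* from the empty-sink baseline,
    # so lower-reflectance (darker) skin can still activate the sensor.
    return (reflectance - ambient_baseline) > delta

ambient = 0.10                                  # reflected IR with no hand present
readings = {"lighter skin": 0.65, "darker skin": 0.30}

for name, r in readings.items():
    print(f"{name}: fixed={fixed_threshold_trigger(r)}, "
          f"baseline-relative={baseline_relative_trigger(r, ambient)}")
# lighter skin triggers both; darker skin fails the fixed threshold.
```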

This epic design flaw may seem hilarious on the internet, but it demonstrates a major issue with many technology companies: diversity. The dispenser's maker unintentionally built a discriminatory product because no one at the company thought to test it on dark skin.

Risk Mitigation and Solutions

This problem is common in machine learning, and it is one of the reasons facial recognition algorithms perform worse on non-white and female faces. The data used to train AI is often skewed toward a single demographic, white men, and when a program sees data outside that demographic it performs poorly. Not coincidentally, it is white men who dominate AI research.
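
Before turning to mitigations, below is a minimal sketch, on synthetic data, of the kind of per-subgroup audit that would have exposed this skew before release; the group labels, sizes and accuracy rates are hypothetical.

```python
import numpy as np

# Synthetic illustration: a model that is accurate mainly on the majority group.
rng = np.random.default_rng(0)

groups = np.array(["A"] * 900 + ["B"] * 100)   # data skewed toward group A
y_true = rng.integers(0, 2, size=1000)          # ground-truth binary labels

# Simulate predictions: 95% accurate on group A, only 70% on group B.
correct_prob = np.where(groups == "A", 0.95, 0.70)
y_pred = np.where(rng.random(1000) < correct_prob, y_true, 1 - y_true)

# Per-group audit: report accuracy separately for each demographic group.
for g in ("A", "B"):
    mask = groups == g
    acc = (y_pred[mask] == y_true[mask]).mean()
    print(f"group {g}: n={mask.sum():4d}  accuracy={acc:.2%}")

# The overall figure looks healthy and hides the gap on the minority group.
print(f"overall: accuracy={(y_pred == y_true).mean():.2%}")
```

With such gaps made visible, mitigation strategies include: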

  • A better understanding of the issues from multifaceted perspectives, including scientific, technical, social and demographic considerations and gaps in expert knowledge, to help address them within product design and development processes.
  • Using a wider dataset with a diversely sampled population or demographic variations, so that systems work better than in the examples seen above. Efforts should be made to recruit test participants from a wider population with diverse backgrounds, to help define the scope of intended users (see the resampling sketch after this list).
  • Residual risks should outline what precautionary measures users must take. It may sound extreme, but the scope of risk management should perhaps cover limited or absent diversity during verification and validation, so as to present a specific cautionary note to users, e.g. "Not intended for use on skins of certain tones." This may not be the desired approach, but it may offer an opportunity to look into the issues more deeply and arrive at technologies that work for all.
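
As flagged in the second bullet above, the following is a minimal sketch of one data-side tactic: oversampling an under-represented group so that each group contributes equally to training. The group labels and counts are hypothetical.

```python
import random

# Hypothetical dataset: 900 samples from one group, 100 from another.
random.seed(0)
dataset = [("sample", "lighter")] * 900 + [("sample", "darker")] * 100

def balance_by_group(dataset):
    """Oversample each group (with replacement) up to the size of the largest."""
    by_group = {}
    for item, group in dataset:
        by_group.setdefault(group, []).append((item, group))
    target = max(len(items) for items in by_group.values())
    balanced = []
    for items in by_group.values():
        balanced.extend(random.choices(items, k=target))  # sample with replacement
    random.shuffle(balanced)
    return balanced

balanced = balance_by_group(dataset)
print({g: sum(1 for _, gg in balanced if gg == g) for g in ("lighter", "darker")})
# -> {'lighter': 900, 'darker': 900}
```

Oversampling equalises the counts but cannot create information the data never contained, which is one reason, as the next paragraph notes, that "fair" datasets can often be anything but.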

Many researchers acknowledge that AI is affected by wider social issues and that simply using "correct" data does not deal with the fundamental problems. Others note that, even as a purely technical fix, "fair" datasets can often be anything but: a dataset that accurately reflects the demographics of a particular area or population can still embed such biases, and an algorithm trained on data acquired predominantly from one skin tone will perform worse on skin tones on which it has never been trained or tested.

Scientists, engineers, designers, clinicians and intended end-users of wearables all have to be part of product design and development processes, which should take into consideration the diversity and inclusion of the intended users. As one article puts it:

  • "We're all different, and wearable sensors need to account for that"


PharMedTech is seeking experts in the different applications of various technologies in healthcare and in the pharmaceutical, medical device and technology sectors to write and submit articles for review and publication.

Please share and re-share with our network, recommend that all subscribe, comment and "Like"; in other words, please do something positive to help us reach out to all. For the icing on the cake, just connect with me: send an invitation and I will gladly accept.

To subscribe to PharMedTech:

PharMedTech: Introducing the new platform for the convergence of professionals in Pharma, MedTech and Tech Sectors | LinkedIn

https://www.dhirubhai.net/newsletters/pharmedtech-6902122349190361089/

References:

John R. Feiner, MD; John W. Severinghaus, MD; and Philip E. Bickler, MD, PhD: Effect of Gender and Skin Pigment on Pulse Oximeter Accuracy. International Anesthesia Research Society, Vol. 105, No. 6, December 2007. DOI: 10.1213/01.ane.0000285988.35174.d9

James Zou, Londa Schiebinger: Ensuring that biomedical AI benefits diverse populations. EBioMedicine, Volume 67, 2021. https://doi.org/10.1016/j.ebiom.2021.103358 (https://www.sciencedirect.com/science/article/pii/S2352396421001511)

5 Ways Technology Is Making Healthcare More Accessible (bustle.com)

Five Ways Technology is Making Healthcare More Accessible | by Craig Richardville | Becoming Human: Artificial Intelligence Magazine

Here’s how to improve access to healthcare around the world | World Economic Forum (weforum.org)

Can technology make healthcare more accessible? | WarwickshireWorld

Wearables still haven't solved the problems of skin science, but new ideas are coming (wareable.com)

How medicine discriminates against non-white people and women | The Economist

What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias - Business (ipsnews.net)

Covid: Pulse oxygen monitors work less well on darker skin, experts say - BBC News

Covid: Sajid Javid orders review of medical device racial bias - BBC News

Bias in medical devices may have led to avoidable UK Covid deaths, says Javid | NHS | The Guardian

UK-Race Bias in Covid Treatment, Summary, World Stats - The St Kitts Nevis Observer

From oximeters to AI, where bias in medical devices may lurk | NHS | The Guardian

Design bias is harmful, and in some cases may be lethal | The Economist

How to make sure technology doesn't leave people behind | World Economic Forum (weforum.org)

How biased algorithms perpetuate inequality (newstatesman.com)

AI-powered tool transforming pixelated images has 'racial bias' | Daily Mail Online

How accurate is smartwatch heart data? It depends on your skin tone (medicalxpress.com)

Smart wearable designs for today’s active lifestyle - Infineon Technologies

Bigotry Encoded: Racial Bias in Technology (rit.edu)

Smartwatch Heart Data May Be Less Accurate for Black Users (blackdoctor.org)

Smartwatch fitness trackers don’t work as well for people with darker skin, study says – Find Your New Smartwatch at the Best Price (saysmarter.com)

Racial Discrimination in Face Recognition Technology - Science in the News (harvard.edu)

Artificial Intelligence Has a Racial and Gender Bias Problem | Time

AI: How can we tackle racism in artificial intelligence? | World Economic Forum (weforum.org)
