April 24, 2024

Design bias is harmful, and in some cases may be lethal

SOME THINGS, you might think, are obvious. For example, if you design a device which shines light through someone’s fingertip to measure the oxygen level of their blood, then the colour of the skin through which that light is shining should be a factor when the device is calibrated.

But no. Research suggests that, with honourable exceptions, pulse oximeters, the machines which do this, overestimate oxygen levels roughly three times more often (about 12% of the time) in people with black skin than in those with white skin. When such readings inform decisions about whom to admit to hospital during a pandemic, more black patients than white ones are sent home in the mistaken belief that their blood-oxygen levels are within a safe range. That can have fatal consequences.

The pulse oximeter is only the latest example of an approach to design which fails to recognise that human beings are different from one another. Other recent medical cases include an algorithm that gave white patients in America priority over those from racial minorities, and the discovery that implants such as prosthetic hips and cardiac pacemakers cause problems more often in women than in men.

Beyond medicine, there are many examples of this phenomenon in information technology: systems that recognise white faces but not black ones; legal software which recommends harsher sentences for black criminals than for white ones; voice-activated programs that work better for men than for women. Even mundane things like car seat-belts have often been designed with men in mind rather than women.

The origin of such design bias is understandable, if not forgivable. In the West, which is still the source of most innovation, engineers have tended to be white and male. So have medical researchers. That leads to groupthink, quite possibly unconscious, in both inputs and outputs.

Input bias is particularly responsible for the IT cock-ups. Much of what is commonly called artificial intelligence is actually machine learning. As with any learning, the syllabus determines the outcome. Train software on white faces or men’s voices, and you will create a system that handles them well but stumbles over everyone else. More subtle biases are also in play, though. The faulty medical algorithm used prior medical spending as a proxy for current need. But black Americans spend less on health care than white Americans do, so it discriminated against them. Sentencing software may similarly conflate poor social circumstances with a propensity to reoffend.
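As a concrete, if simplified, illustration of the proxy problem, the sketch below (in Python, with made-up numbers rather than data from the actual study) gives two groups identical underlying medical need but lower historical spending in one of them; a rule that allocates extra care by spending then flags far fewer patients from that group.

    # Hypothetical illustration of proxy bias: need is identical in both
    # groups, but group B historically spends less for the same need, so a
    # model that ranks patients by spending under-prioritises group B.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # True medical need is identically distributed in both groups.
    need_a = rng.normal(loc=5.0, scale=1.0, size=n)
    need_b = rng.normal(loc=5.0, scale=1.0, size=n)

    # Prior spending tracks need, but group B spends ~30% less for the
    # same need (e.g. because of poorer access to care).
    spend_a = 1.0 * need_a + rng.normal(scale=0.5, size=n)
    spend_b = 0.7 * need_b + rng.normal(scale=0.5, size=n)

    # Flag the top 20% of spenders overall for extra care.
    threshold = np.quantile(np.concatenate([spend_a, spend_b]), 0.80)
    print(f"Group A flagged for extra care: {np.mean(spend_a > threshold):.1%}")
    print(f"Group B flagged for extra care: {np.mean(spend_b > threshold):.1%}")
    # Despite identical need, group B is flagged far less often.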

Input bias is also a problem in medicine. Despite decades of rules on the matter, clinical trials are still overloaded with white men. As far as sex bias is concerned, trial designers do have half a point. If a female participant became pregnant and the treatment under test harmed her baby, that would be tragic. But there is no excuse for failing to make trials big enough to detect statistical differences between relevant groups.
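What “big enough” means can be made concrete with a standard back-of-the-envelope power calculation; the sketch below uses illustrative figures, not numbers from any particular trial.

    # Two-sample approximation: participants per arm needed to detect a
    # standardised effect d at significance level alpha and power 1 - beta,
    # n ~= 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2.
    from scipy.stats import norm

    def n_per_arm(effect_size, alpha=0.05, power=0.80):
        """Approximate participants per arm to detect a standardised
        effect of `effect_size` (Cohen's d)."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return round(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

    # A moderate effect (d = 0.5) needs about 63 participants per arm.
    needed = n_per_arm(0.5)
    print("per arm, pooled analysis:", needed)

    # To detect the same effect within women alone when women make up only
    # 20% of enrolment, the whole trial must be roughly five times larger.
    print("total enrolment if women are 20% of participants:",
          round(2 * needed / 0.20))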

Output bias is more intriguing. In a well-ordered market, competition should introduce diversity quite fast. In the past, women and non-white people may have lacked purchasing power, but that is surely no longer so. This, however, assumes that they are the customers, and frequently they are not. Look at who buys medical equipment, and you will see a mix that is whiter and more male than the population in hospital wards and doctors’ waiting rooms. Nor are face-recognition systems or sentencing software bought by those who suffer from their failures.

Most consumer-led industries excel at generating choice by segmenting markets, so competition will probably sort things out. In other areas, though, boots may need to be applied to backsides. Regulators should, for example, factor in diversity when assessing clinical trials.

In both cases, however, it would behove firms to build diversity into their designs from the outset, which means including women and non-white people in design teams. Eliminating design bias is not just about equality or doing the right thing, important though both are. It is also about creating products that meet the needs of women and the vast, non-white majority of the world’s population. It is one of those welcome areas where the right path is often the profitable one, too.

This article appeared in the Leaders section of the print edition under the headline “Working in the dark”