Engineers are working to build better pulse oximeters

Whether any of these strategies will fix the bias in pulse oximeters remains to be seen. But it’s likely that by the time improved devices are up for regulatory approval, the bar for performance will be higher. At the meeting last week, committee members reviewed a proposal that would require companies to test the device in at least 24 people whose skin tones span the entirety of a 10-shade scale. The current requirement is that the trial must include 10 people, two of whom have “darkly pigmented” skin.

In the meantime, health-care workers are grappling with how to use the existing tools and whether to trust them. In the advisory committee meeting on Friday, one committee member asked a representative from Medtronic, one of the largest providers of pulse oximeters, if the company had considered a voluntary recall of its devices. “We believe with 100% certainty that our devices conform to current FDA standards,” said Sam Ajizian, Medtronic’s chief medical officer of patient monitoring. A recall “would undermine public safety because this is a foundational device in operating rooms and ICUs, ERs, and ambulances and everywhere.”

But not everyone agrees that the benefits outweigh the harms. Last fall, a community health center in Oakland, California, filed a lawsuit against some of the largest manufacturers and sellers of pulse oximeters, asking the court to prohibit the sale of the devices in California until the readings are proved accurate for people with dark skin, or until the devices carry a warning label.

“The pulse oximeter is an example of the tragic harm that occurs when the nation’s health-care industry and the regulatory agencies that oversee it prioritize white health over the realities of non-white patients,” said Noha Aboelata, CEO of Roots Community Health Center, in a statement. “The story of the making, marketing and use of racially biased pulse oximeters is an indictment of our health-care system.”

Read more from MIT Technology Review’s archive

Melissa Heikkilä’s reporting showed her just how “pale, male, and stale” the humans of AI are. Could we just ask it to do better?

No surprise that technology perpetuates racism, wrote Charlton McIlwain in 2020. That’s the way it was designed. “The question we have to confront is whether we will continue to design and deploy tools that serve the interests of racism and white supremacy.”

We’ve seen that deep-learning models can perform as well as medical professionals when it comes to imaging tasks, but they can also perpetuate biases. Some researchers say the way to fix the problem is to stop training algorithms to match the experts, reported Karen Hao in 2021.

From around the web

The high lead levels found in applesauce pouches came from a single cinnamon processing plant in Ecuador. (NBC)
