Artificial intelligence has been a hot topic in recent years, with a growing number of industries adopting a technology that is predicted to revolutionize some sectors.
The technology drives excitement, but it also raises concerns about potential negative effects on people and society.
In her new book, “More Than a Glitch: Confronting Race, Gender and Ability Bias in Tech,” AI researcher and data journalism professor Meredith Broussard highlights the implications of some of AI’s deeper problems.
Broussard spoke to ABC News about discovering how AI was used during her own clinical experience with breast cancer and some of the potential pitfalls of the technology being used in this capacity.
Lynsey Davies: So in the book you talk about being diagnosed with breast cancer and at some point you find out that the AI actually read your test. What was your reaction to that?
Meredith Broussard: I thought it was very strange, because I was depending on my doctors for care at this crisis point in my life. And I thought, what did this AI find? How is it being used? And because I’m an AI researcher, I thought, who made this particular AI, because I know there’s a lot of bias in artificial intelligence systems.
So I didn’t do anything with this knowledge right away, but then when I recovered, I went back into it. I actually took an open-source AI, ran my own films through it, and wrote about the state of the art of AI-based cancer detection.
Davies: Do you think this is something we’re going to see increasingly in the medical world?
Broussard: I think one of the things that researchers really want to do, that medical researchers really want to do, is cure more people and diagnose more people earlier and more accurately, and the real hope is that AI will help with that. Is it going to happen anytime soon? No. Will it eventually happen? Perhaps.
Davies: What are the potential risks of using AI in this capacity?
Broussard: One thing that people often don’t understand is that when you build an AI system, a machine learning system, for something like diagnosing breast cancer, you really have to tune it to have a higher rate of false positives than false negatives.
So a false positive would mean that it says, “Oh, you might have cancer,” when you really don’t have cancer, and it sends you for more testing. And a false negative would mean that it says, “No, there’s no cancer here,” but you actually do have cancer. So the cost of false negatives in medicine is very high. That is why these systems actually have more false positives than false negatives, which means the system is often going to say, “Oh yeah, I think there might be a problem here,” and then people will be referred for more testing. As many people know, that can mean waiting weeks or even months for additional tests and being on the edge of your seat with worry the whole time.
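The tradeoff Broussard describes here can be sketched with a toy example (this is purely illustrative, not the actual system from her hospital): lowering a classifier’s decision threshold catches more true cancers, at the cost of flagging more healthy patients.

```python
# Illustrative sketch of the false-positive/false-negative tradeoff.
# The scores and labels below are made-up toy data, not real patient data.

def confusion_counts(scores, labels, threshold):
    """Count false positives and false negatives at a given decision threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp, fn

# Hypothetical model scores (estimated probability of cancer) and true labels
# (1 = cancer present, 0 = no cancer).
scores = [0.95, 0.80, 0.60, 0.40, 0.30, 0.20, 0.10, 0.05]
labels = [1,    1,    1,    1,    0,    0,    0,    0]

# At a neutral threshold of 0.5, one real cancer case (score 0.40) is missed.
print(confusion_counts(scores, labels, 0.5))   # -> (0, 1): no FP, one FN

# Lowering the threshold to 0.25 catches that case, but now one healthy
# patient (score 0.30) is sent for unnecessary follow-up testing.
print(confusion_counts(scores, labels, 0.25))  # -> (1, 0): one FP, no FN
```

Because a missed cancer is far costlier than an extra test, medical systems are tuned toward the second regime, which is exactly why so many patients end up waiting on follow-up results.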
Davies: Okay, folks at home are going to wonder: did you have any false positives?
Broussard: So, I got really great medical care during my cancer experience, and I’m so grateful to all the people who took care of me. One interesting thing about the AI that read my scans is that it’s not actually used for diagnosis. It is used after the doctors in this particular hospital have already given their diagnosis, so it is more like a backup tool. The doctor, the radiologist, will enter what they think the diagnosis is, and then they’ll have access to the results of the AI. And they can either ignore the AI results or use them to say, “Okay, well, maybe I’ll go back and revisit that area of concern.”
So it is still in the hands of the doctors. Nobody has to worry that the AI is out there, you know, diagnosing people with cancer. And we’re certainly not in the situation that a lot of people imagine, where it’s going to be like a box that you go to, and it scans you and tells you, “Yeah, you have cancer. No, you don’t have cancer.” That wouldn’t be an ideal scenario anyway.
Davies: Not today.
Broussard: Not today. I mean, I hope it never happens, because I don’t want medical bad news from a machine. I want it from a doctor.
Davies: Meredith, we thank you very much. Of course, this is a conversation that will continue in the months and years to come. We want to let our viewers know that her book, “More Than a Glitch: Confronting Race, Gender and Ability Bias in Tech,” is now available wherever books are sold.