AI, Substance Misuse, Addiction, and the Tragedy of the Commons

May 30, 2023
Image: a chain-link fence hung with signs saying "don't give up"

Substance misuse and addiction is perhaps our most significant domestic challenge. In 2022, the U.S. Congress Joint Economic Committee (JEC) found that the opioid epidemic alone cost the United States nearly $1.5 trillion in 2020, or 7 percent of gross domestic product (GDP), an increase of about one-third since the cost was last measured in 2017. Factor in alcohol and other drugs, and their impact on lost productivity, healthcare, and beyond, and the losses to our nation's economic health move well into the double digits. Beyond the economic toll, addiction has reduced the length and quality of our collective lives. If this were any other issue, we would do everything in our power to normalize seeking help and to reduce barriers to care. Yet it is not any other issue; it is addiction. As a nation we have deep-seated negative attitudes about addiction and about people like me who experience it. As a result, we end up doing things that isolate and stigmatize anyone suspected of possibly having a problem. New technology will likely make these dynamics much worse.

Last April, I wrote about the Algorithm of Medical Care Discrimination. Software company Bamboo Health uses patient data to generate an overdose risk score. The software, NarxCare, deploys an algorithm that far too often labels patients as potential drug addicts. As this article notes, NarxCare gathers information like criminal records, sexual abuse history, distance traveled to fill a prescription, and even prescriptions written for a patient's pets to assign risk scores. BIPOC community members score higher because our criminal justice system has historically targeted them for sanctions at higher rates than whites. Women, who have more documented sexual trauma than men, also get scored higher.
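To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a risk score built on proxy features can encode bias. The feature names and weights below are assumptions chosen for illustration; this is not Bamboo Health's actual model. The point is only that when inputs like criminal-record flags or a pet's prescriptions carry weight, the score penalizes whoever those records over-represent, regardless of clinical need.

```python
# Hypothetical illustration only -- NOT the NarxCare algorithm.
# A linear risk score over proxy features: people over-represented in the
# underlying records (e.g., criminal-justice contact) score higher
# regardless of their actual risk.

WEIGHTS = {
    "criminal_record": 2.0,            # assumed weights, for illustration only
    "sexual_trauma_documented": 1.5,
    "miles_traveled_to_pharmacy": 0.05,
    "pet_prescribed_controlled_rx": 1.0,
}

def risk_score(patient: dict) -> float:
    """Weighted sum of proxy features; a higher value means 'riskier'."""
    return sum(WEIGHTS[k] * float(patient.get(k, 0)) for k in WEIGHTS)

# Two patients with identical clinical need; only the proxies differ.
patient_a = {"criminal_record": 1, "miles_traveled_to_pharmacy": 30}
patient_b = {"pet_prescribed_controlled_rx": 1}

print(risk_score(patient_a))  # 3.5 -- flagged mostly for a record, not for care
print(risk_score(patient_b))  # 1.0 -- flagged because her dog has a prescription
```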

As noted at Wired.com, a woman was cut off from services by her primary care provider. Her dogs had been prescribed opioids and benzodiazepines, and that gave her a high score for potential addiction. She became a person to be gotten rid of, not helped. She got the drug addict treatment: she was shown the door and terminated from care.

People in the crosshairs of this software far too often get treated like pariahs by medical professionals, not offered help but kicked to the curb as societal outcasts. This is the norm for people with SUDs. Two weeks ago a citizen petition was filed with the FDA to take NarxCare off the market. The CUSP FDA Citizen Petition to Protect Patients was filed by the Center for US Policy. The petition asks the FDA to issue a warning letter to Bamboo Health, initiate a mandatory recall, and inform healthcare providers not to use the NarxCare risk scores. It should come off the market.

The problem is much bigger than NarxCare. The data-gathering and AI tools increasingly used to sift through information and assist clinical care are rife with flaws. As noted recently by the Washington Post, denials of health-insurance claims are rising and getting weirder. They include things like an automated system, called PXDX, which "reviews" medical records at a rate of 50 charts in 10 seconds and leads to increased denials for things like substance misuse and addiction.

The book Weapons of Math Destruction by Cathy O’Neil, on the societal impact of algorithms, explores how big data is increasingly used in ways that reinforce preexisting inequality. One of the themes of the book is machine learning (ML) bias. Essentially, all the hidden and not-so-hidden biases ever held in human history are baked into the data. AI uses the internet and all the information on it like a huge library. It all happens in ways we cannot see, and we tend to falsely believe that these are objective tools. They are not.

Some ML models, like supervised learning models, learn by examining previous cases and how the data were labeled. The paper AI bias: exploring discriminatory algorithmic decision-making models and the application of possible machine-centric solutions adapted from the pharmaceutical industry noted that:

“The process of data mining, one of the ways algorithms collect and analyze data, can already be discriminatory as a start, because it decides which data are of value, which is of no value and its weight—how valuable it is. The decision process tends to rely on previous data, its outcomes and the initial weight given by the programmer. One example can be when the word woman was penalized, by being given a negative or a lower weight, on a CV selection process based on the data of an industry traditionally dominated by men like the tech industry. The outcome ended discriminating women in the selection process.”
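The CV example can be made concrete with a small sketch. This is an assumed toy model on synthetic data, not the system or dataset described in the paper: a résumé screener trained on historical hiring decisions from a male-dominated industry learns a negative coefficient for a token associated with women, so otherwise-equal candidates are ranked lower.

```python
# Toy illustration of learned bias in CV screening -- synthetic, assumed data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Features: years of experience, has a degree, CV mentions a women's
# organization (a proxy that correlates with gender).
years = rng.integers(0, 15, n)
degree = rng.integers(0, 2, n)
womens_org = rng.integers(0, 2, n)

# Historical hiring decisions from a male-dominated industry: equally
# qualified candidates flagged by the proxy were hired less often.
p_hire = 1 / (1 + np.exp(-(0.3 * years + 1.0 * degree - 2.0 * womens_org - 2.0)))
hired = rng.random(n) < p_hire

X = np.column_stack([years, degree, womens_org])
model = LogisticRegression().fit(X, hired)

# The model faithfully reproduces the historical bias: the coefficient on the
# proxy feature is strongly negative, so mentioning a women's organization
# lowers the predicted "hireability" of an otherwise identical CV.
print(dict(zip(["years", "degree", "womens_org"], model.coef_[0].round(2))))
```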

Training data that are biased can lead to discriminatory ML models. It can happen in at least two ways:

  1. The model learns from a set of prejudicial examples, or an under-represented group receives an incorrect or unfair valuation.
  2. The training data are non-existent or incomplete (the sketch after this list illustrates how that plays out for an under-represented group).
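Here is a minimal sketch of that second failure mode, using made-up numbers: when one group barely appears in the training data, the model's error concentrates on that group, even while overall performance looks acceptable.

```python
# Illustration of under-representation -- synthetic data, assumed numbers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, slope):
    """One feature whose relationship to the label differs by group."""
    x = rng.normal(size=(n, 1))
    y = (slope * x[:, 0] + rng.normal(scale=0.5, size=n)) > 0
    return x, y

# 95% of the training data comes from group A; group B is under-represented,
# and the feature means something different for its members.
xa, ya = make_group(1900, slope=1.0)
xb, yb = make_group(100, slope=-1.0)
X = np.vstack([xa, xb])
y = np.concatenate([ya, yb])

model = LogisticRegression().fit(X, y)

# Evaluate on fresh samples from each group.
xa_t, ya_t = make_group(1000, slope=1.0)
xb_t, yb_t = make_group(1000, slope=-1.0)
print("accuracy, group A:", model.score(xa_t, ya_t))  # high
print("accuracy, group B:", model.score(xb_t, yb_t))  # around chance or worse
```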

Both of these dynamics are in play with respect to SUD. There is significant bias against people with substance use disorders in our healthcare system, something we explored in OPPORTUNITIES FOR CHANGE – An analysis of drug use and recovery stigma in the U.S. healthcare system. Stigma is prevalent, but the majority of people with lived experience of SUDs did not need our paper to tell them that; it is part of our common experience. The attitudes are abysmal. We are seen as subhuman. We get “treated and streeted.” Labels are openly attached to us, like GOMERs (Get Out of My ER) and “frequent flyers.” AI weaponizes these biases, and there are no business or influential stakeholder groups to shape guardrails to protect us from these harms, even as the vitality of our nation continues to erode, a true tragedy of the commons.

AI, addiction, and invasive commercial surveillance

The selling of addictive drugs is highly lucrative. What we have historically done in America is allow these industries (both illicit and legitimate) to flourish and only address the fallout when the scope of the problem becomes too large to ignore. The “opioid crisis” is actually decades in the making and involves more than just opioids. It only got attention when it reached a magnitude at which it started reducing our overall life expectancy in the US. Remedial efforts have been anemic and not equitably available, leading to more devastation in marginalized communities. These dynamics have been shortsighted, but entirely consistent with how we have historically addressed substance misuse and addiction half-heartedly, even as the devastation saps the very vitality and welfare of our nation.

While such technology is always touted as improving care, because of the deep stigma, the lack of stakeholder inclusion, and the money to be made selling addictive drugs, it is much more likely these tools will be used to identify “abusers” and sanction them: to deny employment, housing, or educational opportunities. Profit off of selling drugs, then blame users when they develop a problem and sideline them, most often in ways that render the impacted person powerless.

Consider the woman I referenced above, denied painkillers because her dog was on benzos. If similar software were in use when she applied for a job, sought admission to a university, or tried to rent an apartment, she would never know that she was being redlined because her dog has anxiety. The odds are no one would ever know or do anything about it. For those of a certain age, it is reminiscent of Brazil, the dark dystopian mid-80s comedy in which the main character has his life ruined by over-reliance on poorly maintained machines, but this is no longer science fiction.

When being treated like a drug addict means being treated just like everyone else, we will be where we need to be. We have a long way to go to reach that destination. AI is not going to be good for persons with addictions, or for those of us in recovery in particular. We have not even fully understood the ways people with substance use conditions face discriminatory barriers at the dawn of the age of AI. This is one plane we should not be flying while we build it, but that is exactly what is happening.

Senior Microsoft executives suggest we should wait for “meaningful harm” from AI before regulating it. What would the threshold of harm be with respect to ferreting out persons with addictions and those of us in recovery, and who would determine it? Who is even looking? Those harmed rarely come forward, as doing so sets them up for further sanctions and economically benefits those who discriminate against us. The longer we wait, the higher the hurdles set up by those with vested interests in ferreting out people who experience a substance use disorder, and the greater the cost to our entire society. It is time to set up some guardrails before we proceed any further. NarxCare is a perfect example of software intended to help with prescription drug management that, in the quest for beneficence, led us down the road of debasement of persons suspected of substance misuse.

Setting up structures to reduce the harm of AI to marginalized communities

These emergent technologies are operating in a wild west environment and may not be containable in ways that have worked historically. Recently, a computer hired a human to defeat the very technology we put in place to ensure that a system is interacting with a human rather than a computer. The computer passed itself off as a person and lied when asked if it was a computer, in order to get access to the data it wanted. This is a world no human has experienced before, and the marginalized are the ones at greatest risk of maleficence. We should be pumping the brakes on AI and related technologies until we establish ethical limits on their use. Given the scope of our addiction problem and the market forces in play, AI will undermine healing and recovery efforts in a myriad of ways that will exacerbate what is already one of our most profound challenges. It will continue to erode our GDP while reducing US life expectancy as we sit on the sidelines. The only way we will get ahead of this curve is with intention and focus.

There is little evidence we have either.
