The Organization That’s Tracking People With Mental Illnesses

An experimental Florida program that aims to use big data to treat the mentally ill raises privacy questions

Illustration: Tara Jacoby
May 24, 2017 at 2:14 PM ET

What if a judge could order you to wear a tracker that logged not just your location but your mood? What if an algorithm could look at your medical records and predict how likely you were to commit a crime?

In Florida, a progressive database project — a collaboration between the county, the district court, medical clinicians, and a private company — is experimenting with both predictive algorithms and cell-phone apps as tools to track and treat people with mental illness. It’s part of an ambitious plan to use big data to get mentally ill individuals into treatment and keep them out of jail — but critics say the price could be their privacy or even their autonomy.

“An app that reminds me to take my medication is incredibly helpful,” said lawyer and disability rights activist Matthew Cortland. “An app that reports to the government that I did not take my medication is verging on tyranny.”

The initiative has roots that go back nearly 20 years, when Steven Leifman, a judge in Miami-Dade County, became troubled at seeing the same people pass through his courtroom again and again — people who were sick and needed help that prison could not give them. “This population was really hitting the system hard, without any good outcomes for them, society, or anybody,” Leifman told Wired last fall.

This is a national problem. Across the country, people with mental health diagnoses are incarcerated at an alarming rate. “We’ve, frankly, criminalized the mentally ill, and used local jails as de facto mental health institutions,” one county health director told PBS in 2014. That same year, a report from the Treatment Advocacy Center found that more than 10 times as many people with mental illness were held in prisons and jails as were in treatment in hospitals.

Leifman’s solution, beginning in 2000, was the Criminal Mental Health Project, which provides mental health crisis response training for police officers, as well as diversion programs for people with psychiatric diagnoses who face misdemeanor and lower-level felony charges. Just as some courts give drug offenders the option to enter detox programs rather than face charges, Leifman’s mental health court offers defendants with a history of mental illness the option to enter treatment.

The program has been hugely successful. According to the New England Journal of Medicine, “The average daily census in the county jail system has dropped from 7,200 to 4,000, one jail facility has been closed, and fatal shootings and injuries of mentally ill people by police officers have been dramatically reduced.” Leifman’s project became a national model.

Then, around 2012, Leifman had a chance meeting at a conference with a representative from the Japanese pharma company Otsuka. Otsuka had entered into a joint venture with IBM’s Watson supercomputer division, and was looking to diversify into big data approaches to managing behavioral health. Leifman saw an opportunity. So did Otsuka.

But first, they needed to get access to people using mental health services who weren’t in the court system. So Leifman turned to a local mental health service organization, the South Florida Behavioral Health Network, for help. “Judge Leifman volunteered us,” said John Dow, the network’s president and CEO. Leifman now chairs the organization’s board, and he arranged for the network to partner with Otsuka to create a database of health service consumers.

Today, anyone who walks into one of the network’s 36 clinics is asked to fill out a detailed questionnaire on their family background, medical and criminal history, and health habits. Unless they opt out, that questionnaire forms the basis of their file. As the person continues to get treatment, more data is recorded about them: what services they’ve needed and how they’ve responded. That information is shared between clinics — Otsuka is proud of “breaking down data silos” — through a dashboard that lets them “see the entire spectrum of their clients’ integrated behavioral health information,” according to a report by the unrelated Community Oriented Correctional Health Services (COCHS). The system raises a flag when an individual visits an emergency room or misses an appointment. Eventually, it will also flag encounters with police.

Otsuka is working to analyze all that data and create treatment protocols, Dow said, allowing caregivers to better tailor treatment to individuals based on the outcomes of other patients with similar profiles. He compares Otsuka’s database to cancer registries. “One of the biggest issues with mental health is there are no standards of care,” he said. “Unless you have a standard to apply to care, you have no basis of measuring success or lack thereof.”

Right now, Dow said, Otsuka’s data — and the conclusions its software draws about that data — are “on a need-to-know basis,” available mostly to therapists. “If you’re treating a person, you have access to it,” he said.

But eventually, Otsuka, the county, and the treatment network plan to work together to integrate this database with the county’s computer system. That means clinicians will have access to booking histories, court dockets, arrest records, and other data. And on the court’s side, an individual’s medical data may be used to track and predict the behavior of anyone coming into contact with the criminal justice system.

It’s still unclear how much access judges will have and at what point — whether medical information will come into play during bail hearings or sentencing — or what part predictive algorithms will play in legal or medical decisions, Dow said. Hypothetically, the system could predict the most effective treatment for an individual, or it could identify someone who was a risk to the community, he said.

Tracking options are also experimental: Right now, network providers are tinkering with a handful of apps that let people log their feelings and alert a counselor or a family member if their mood dips. The app can also send an alert if someone enters a particular geographical area. For example, said Johnny Guimaraes, the network’s VP of information technology and compliance monitoring, “If somebody is struggling with alcohol and they are pacing back and forth in front of a bar or nightclub.”

Currently, use of this app is voluntary — and popular: Guimaraes says the retention rate of people who try it is 70 percent. But there’s also the possibility that a judge could eventually order someone to use similar apps as a condition of parole or probation. “Judges need to be informed,” Dow said. “Most judges want to do the right thing.”

But for activists like Cortland, this is where the program potentially crosses a line.

Cortland, a lawyer in Massachusetts who specializes in disability law, has been following Judge Leifman’s progress for years. The judge is sincere in his intent, he said. “I respect his work a great deal … He seems to be a good actor within the constraint imposed by the criminal justice system,” Cortland said. “But that’s the wrong system to be addressing these problems.”

Meshing mental health care and the criminal justice system has the potential to compromise medical consent, he said. “We have a well-established tradition of the right to refuse treatment,” he said. “Being disabled does not take away that right. People with mental illnesses don’t lose the right to refuse treatment just because they have a mental illness.”

Dow stresses that individuals can currently choose to opt out of the Otsuka system at any time. When the courts become part of the system, he added, it will always be possible for a defendant to choose jail over treatment. “It is the consumer’s choice,” Dow said. “You can’t force them into something like that without their concurrence.”

Guimaraes concurred: “We want to ensure that this remains consumer-focused and consumer-centric,” he said. “It’s simply a tool to help the consumer get better. It would be detrimental and taking so many steps backward to use these types of tools for enforcement type of actions.”

Cortland, however, contends that choice is an illusion when the alternative is imprisonment. “The criminal justice system is an inherently coercive system,” he said. Nor does the addition of a private company to the mix inspire trust, he added. “One of the problems with outsourcing government to private corporations is that a private corporation isn’t accountable in the same way government is.”

Cortland is also concerned about the premise of Otsuka creating predictive algorithms from the data it gathers. “It seems like this company has had the bright idea of taking predictive algorithms used in sentencing and adapting them to mental health, to the intersection of mental health and the criminal justice system,” Cortland said. “And that is deeply concerning.”

Predictive sentencing algorithms have already come under fire for reinforcing racist and classist biases, he pointed out — and people with mental health diagnoses already face enormous stigma and misunderstanding, often stereotyped as criminal or violent even when they aren’t.

As for mood-tracking apps, “Monitoring bracelets are less invasive than an app that collects data on my most intimate thoughts and feelings,” Cortland said.

Kenneth R. Weingardt, scientific director of Northwestern University’s Center for Behavioral Intervention Technologies, has been studying similar apps — including ones that simply use cellphone data to extrapolate whether the user is having a manic or depressive episode. He agrees with some of Cortland’s criticisms. “All of us researchers are so deeply invested in informed consent, but we all see what’s happening in commercial spaces, and it’s of concern,” Weingardt said.

Tracking apps could be deeply appealing to law enforcement and parole officers, he said. “From the system’s side, it makes things more efficient,” he said. “If your job is to manage someone in the community, it would be helpful to know where they go, when they go to sleep.”

He added, “Let’s not forget that consumers with mental health issues are Americans also, and have free choice about whether they’re going to participate in these programs.”

All this could eventually have the effect of making people wary of seeking mental health care, worried that simply setting foot in a mental health center could create a file that might eventually be used to predict their behavior, track their movements, force them into medical treatment, or sentence them. And Cortland worries that the stigma of having a mental health diagnosis would increase if it was automatically linked to criminality or to the justice system. “It would be tragic if … people were worried that a mental illness diagnosis is putting them in some sort of legal jeopardy,” he said.

Of course, this is the opposite of what Leifman and Dow intend. “There are checks and balances in place,” Dow said. “Balancing patient choice with needed care is a very tricky thing, and we spend a lot of time on it … We spend incredible amounts of time discussing what’s the best way to do this.”

Other jurisdictions have been looking to Miami-Dade as a model, and Otsuka hopes to expand models piloted there to other courts and states. Dow sees their current experimentation as a way of testing best methods. And, he said, it’s better than doing nothing and letting the mentally ill fill prisons across the nation.

“I am incredibly lucky at this point in my career to have staff who will follow some crazy ideas and make a difference,” he said. “If you’re sitting here saying, ‘I don’t have the answers,’ nothing is going to happen. You’re going to be doing the same thing 40 years from now. And that, to me, is unacceptable.”