The Real Harm of Crisis Text Line’s Data Sharing
Another week, another privacy horror show: Crisis Text Line, a non-profit text message service for people experiencing serious mental health crises, has been using “anonymized” conversation data to power a for-profit machine learning tool for customer support teams. (After backlash, CTL announced that it would stop.) Crisis Text Line’s response to the backlash focused on the data itself and whether it included personally identifiable information. But that focus on the data is a distraction. Imagine this: say you texted Crisis Text Line and got back a message that said “Hey, just so you know, we’ll use this conversation to help our for-profit subsidiary build a machine learning tool for companies who do customer support.” Would you keep texting?
That’s the real travesty: when the price of obtaining mental health help in a crisis is to be grist in a machine learning mill. And it’s not just users of CTL who pay: it’s everyone who goes looking for help when they need it most.
Americans need help and can’t get it. The huge unmet demand for critical advice and help has given rise to a new class of organizations and software tools that exist in sort of a regulatory gray area: they help people with bankruptcy or evictions, but aren’t lawyers; they help people with mental health crises, but aren’t care providers. They invite ordinary people to rely on them, and often provide real help. But these services can also avoid taking responsibility for their advice, or abuse the trust people put in them. They can make mistakes, push predatory advertising and disinformation, or just outright sell data. And the consumer protections that would normally protect people against malfeasance or mistakes by lawyers or doctors haven’t caught up.
This regulatory gray area can also constrain organizations with novel solutions to offer. Take the example of Upsolve, a non-profit that develops software to guide people through bankruptcy. (Upsolve takes pains to note that its software does not constitute legal advice.) Upsolve wants to train New York community leaders to help others navigate the city’s notorious debt courts. One problem: these would-be trainees aren’t lawyers, so under the laws of New York (and nearly every other state), Upsolve’s initiative would be illegal. So Upsolve is suing to carve out an exception for itself. Upsolve claims, quite rightly, that a lack of legal help means that people effectively lack rights under the law.
The legal profession’s failure to meet Americans’ access to justice needs is well-documented. But Upsolve’s lawsuit also raises new, important questions. Who is ultimately responsible for the
advice given under a program like this, and who is responsible for a mistake—a trainee, a trainer, both? How do we teach people about their rights as a client of this service, and how to seek recourse? These are eminently answerable questions. There are lots of policy tools for creating relationships with elevated responsibilities: we could assign advice-givers a special legal status, establish a duty of loyalty for organizations that handle sensitive data, or create policy sandboxes to test and learn from new models for delivering advice.
But instead of using these tools, most regulators seem content to bury their heads in the sand. Officially, you can’t give legal advice or health advice without a professional credential. Unofficially, you can get advice in all but name from gray area tools and organizations. And while credentials can be important, regulators are failing to engage with how software has fundamentally changed how we give advice and care to one another, and what that means for the responsibilities of advice-givers.
And we need that engagement more than ever. People who seek help from experts or caregivers are vulnerable. They may not be able to distinguish a good service from a bad one. They don’t have time to parse out terms of service dense with jargon, caveats, and disclaimers. And they have little to no negotiating power to set better terms, especially mid-crisis. That’s why the fiduciary duties that lawyers and doctors have are so necessary in the first place: not just to protect a person seeking help once, but to give people confidence that they can seek help from experts for the most critical, sensitive issues they face. In other words, a lawyer’s duty to their client isn’t there just to protect the client from the lawyer; it’s to protect society’s trust in lawyers.