The Facebook Whistleblower, Frances Haugen, Says She Wants to Fix the Company, Not Harm It

Frances Haugen, a former product manager hired to help protect against election interference on Facebook, said she had grown frustrated by what she saw as the company’s lack of openness about its platforms’ potential for harm and unwillingness to address its flaws. She is scheduled to testify before Congress on Tuesday. She has also sought federal whistleblower protection with the Securities and Exchange Commission.

In a series of interviews, Ms. Haugen, who left the company in May after nearly two years, said that she had come into the job with high hopes of helping Facebook fix its weaknesses. She soon grew skeptical that her team could make an impact, she said. The team had few resources, and she felt the company put growth and user engagement ahead of what it knew through its own research about its platforms’ ill effects.

Toward the end of her time at Facebook, Ms. Haugen said, she came to believe that people outside the company—including lawmakers and regulators—should know what she had discovered.

“If people just hate Facebook more because of what I’ve done, then I’ve failed,” she said. “I believe in truth and reconciliation—we need to admit reality. The first step of that is documentation.”

In a written statement, Facebook spokesman Andy Stone said, “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

Ms. Haugen, 37 years old, resigned from Facebook in April. She stayed on another month to hand off some projects. She also sifted through the company’s internal social network, called Facebook Workplace, for instances where she believed the company had failed to be responsible about users’ welfare.

She said she was surprised by what she found. The Journal’s series, based in part on the documents she gathered as well as interviews with current and former employees, describes how the company’s rules favor elites; how its algorithms foster discord; and how drug cartels and human traffickers use its services openly. An article about Instagram’s effects on teenage girls’ mental health was the impetus for a Senate subcommittee hearing last week in which lawmakers described the disclosures as a “bombshell.”

Ms. Haugen kept expecting to be caught, she said, as she reviewed thousands of documents over several weeks. Facebook logs employees’ activities on Workplace, and she was exploring parts of its network that, while open, weren’t related to her job.

She said that she began thinking about leaving messages for Facebook’s internal security team for when they inevitably reviewed her search activity. She liked most of her colleagues, she said, and knew some would feel betrayed. She knew the company would as well, but she thought the stakes were high enough that she needed to speak out, she said.

On May 17, shortly before 7 p.m., she logged on for the last time and typed her final message into Workplace’s search bar to try to explain her motives.

“I don’t hate Facebook,” she wrote. “I love Facebook. I want to save it.”

Facebook’s headquarters in Menlo Park, Calif. Photo: Ian Bates for The Wall Street Journal

Rule-follower

Ms. Haugen was born and raised in Iowa, the daughter of a doctor father and a mother who left behind an academic career to become an Episcopal priest. She said that she prides herself on being a rule-follower. For the last four Burning Man celebrations, the annual desert festival popular with the Bay Area tech and art scene, she served as a ranger, mediating disputes and enforcing the community’s safety-focused code.

Ms. Haugen previously worked at Alphabet Inc.’s Google, Pinterest Inc. and other social networks, specializing in designing algorithms and other tools that determine what content gets served to users. Google paid for her to attend Harvard and get her master’s in business administration. She returned to the company in 2011 only to be confronted with an autoimmune disorder.

“I came back from business school, and I immediately started decaying,” she said. Doctors were initially baffled. By the time she was diagnosed with celiac disease, she had sustained lasting damage to nerves in her hands and feet, leaving her in pain. She went from riding a bicycle as much as 100 miles a day to struggling to move around.

Ms. Haugen resigned from Google at the beginning of 2014. Two months later, a blood clot in her thigh landed her in the intensive care unit.

A family acquaintance hired to assist her with errands became her main companion during a year she spent largely homebound. The young man bought groceries, took her to doctors’ appointments, and helped her regain the capacity to walk.

“It was a really important friendship, and then I lost him,” she said.

The friend, who had once held liberal political views, was spending increasing amounts of time reading online forums about how dark forces were manipulating politics. In an interview, the man recalled Ms. Haugen as having unsuccessfully tried to intervene as he gravitated toward a mix of the occult and white nationalism. He severed their friendship and left San Francisco before later abandoning such beliefs, he said.

Ms. Haugen’s health improved, and she went back to work. But the loss of her friendship changed the way she thought about social media, she said.

“It’s one thing to study misinformation, it’s another to lose someone to it,” she said. “A lot of people who work on these products only see the positive side of things.”

Recruited

When a Facebook recruiter got in touch at the end of 2018, Ms. Haugen said, she replied that she might be interested if the job touched on democracy and the spread of false information. During interviews, she said, she told managers about her friend and how she wanted to help Facebook prevent its own users from going down similar paths.

She started in June 2019, part of the roughly 200-person Civic Integrity team, which focused on issues around elections world-wide. While it was a small piece of Facebook’s overall policing efforts, the team became a central player in investigating how the platform could spread political falsehoods, stoke violence and be abused by malicious governments.

Ms. Haugen was initially asked to build tools to study the potentially malicious targeting of information at specific communities. Her team, comprising her and four other new hires, was given three months to build a system to detect the practice, a schedule she considered implausible. She didn’t succeed and received a poor initial review, she said. She recalled a senior manager telling her that people at Facebook accomplish what needs to be done with far fewer resources than anyone would think possible.

Around her, she saw small bands of employees confronting large problems. The core team responsible for detecting and combating human exploitation, which covered slavery, forced prostitution and organ selling, consisted of just a few investigators, she said.

“I would ask why more people weren’t being hired,” she said. “Facebook acted like it was powerless to staff these teams.”

Mr. Stone of Facebook said, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority.”

Ms. Haugen said the company seemed unwilling to accept initiatives to improve safety if that would make it harder to attract and engage users, discouraging her and other employees.

“What did we do? We built a giant machine that optimizes for engagement, whether or not it is real,” read a fall 2019 presentation from the Connections Integrity team, an umbrella group tasked with “shaping a healthy public content ecosystem.” The presentation described viral misinformation and societal violence as among the results.

Samidh Chakrabarti, left, and other Facebook employees at work on Oct. 17, 2018, ahead of a runoff election in Brazil. Photo: David Paul Morris/Bloomberg News

Ms. Haugen came to see herself and the Civic Integrity team as an understaffed cleanup crew.

She worried about the dangers that Facebook might pose in societies gaining access to the internet for the first time, she said, and saw Myanmar’s social media-fueled genocide as a template, not a fluke.

She talked about her concerns with her mother, the priest, who advised her that if she thought lives were on the line, she should do what she could to save those lives.

Facebook’s Mr. Stone said that the company’s goal was to provide a safe, positive experience for its billions of users. “Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business,” he said.

On Dec. 2, 2020, the founder and chief of the team, Samidh Chakrabarti, called an all-hands teleconference meeting. From her San Francisco apartment, Ms. Haugen listened to him announce that Facebook was dissolving the team and shuffling its members into other parts of the company’s integrity division, the broader group tasked with improving the quality and trustworthiness of the platform’s content.

Mr. Chakrabarti praised what the team had accomplished “at the expense of our family, our friends and our health,” according to Ms. Haugen and another person at the talk. He announced he was taking a leave of absence to recharge, but urged his staff to fight on and to express themselves “constructively and respectfully” when they see Facebook at risk of putting short-term interests above the long-term needs of the community. Mr. Chakrabarti resigned in August. He didn’t respond to requests for comment.

That evening after the meeting, Ms. Haugen sent an encrypted text to a Journal reporter who had contacted her weeks earlier. Given her work on a team that focused in part on counterespionage, she was especially cautious and asked him to prove who he was.

The U.S. Capitol riot came weeks later, and she said she was dismayed when Facebook publicly played down its connection to the violence despite widespread internal concern that its platforms were enabling dangerous social movements.

Mr. Stone of Facebook called any implication that the company caused the riot absurd, noting the role of public figures in encouraging it. “We have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism,” he said.

In March, Ms. Haugen left the Bay Area to take up residence in Puerto Rico, expecting to continue working for Facebook remotely.

Open forums

Ms. Haugen had expected there wouldn’t be much left on Facebook Workplace that wasn’t already either written about or hidden away. Workplace is a regular source of leaks, and for years the company has been tightening access to sensitive material.

To her surprise, she found that attorney-client-privileged documents were posted in open forums. So were presentations to Chief Executive Mark Zuckerberg, sometimes in draft form, with notes from top company executives included.

In Ms. Haugen’s view, allowing outsiders to see the company’s research and operations is essential. Photo: Stephen Voss for The Wall Street Journal

Virtually any of Facebook’s more than 60,000 employees could have accessed the same documents, she said.

To guide her review, Ms. Haugen said she traced the careers of colleagues she admired, tracking their experiments, research notes and proposed interventions. Often the work ended in frustrated “badge posts,” goodbye notes that included denunciations of Facebook’s failure to take responsibility for harms it caused, she said. The researchers’ career arcs became a framework for the material that would ultimately be provided to the SEC, members of Congress and the Journal.

The more she read, she said, the more she wondered if it was even possible to build automated recommendation systems safely, an unpleasant thought for someone whose career focused on designing them. “I have a lot of compassion for people spending their lives working on these things,” she said. “Imagine finding out your product is harming people—it’d make you unable to see and correct those errors.”

The move to Puerto Rico brought her stint at Facebook to a close sooner than she had planned. Ms. Haugen said Facebook’s human resources department told her it couldn’t accommodate anyone relocating to a U.S. territory. In mid-April, she agreed to resign the following month.

Ms. Haugen continued gathering material from inside Facebook through her last hour with access to the system. She reached out to lawyers at Whistleblower Aid, a Washington, D.C., nonprofit that represents people reporting corporate and government misbehavior.

In addition to her coming Senate testimony and her SEC whistleblower claim, she said she’s interested in cooperating with state attorneys general and European regulators. While some have called for Facebook to be broken up or stripped of content liability protections, she disagrees. Neither approach would address the problems uncovered in the documents, she said—that despite numerous initiatives, Facebook didn’t address or make public what it knew about its platforms’ ill effects.

Mr. Stone of Facebook said, “We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps.”

In Ms. Haugen’s view, allowing outsiders to see the company’s research and operations is essential. She also argues for a radical simplification of Facebook’s systems and for limits on promoting content based on levels of engagement, a core feature of Facebook’s recommendation systems. The company’s own research has found that “misinformation, toxicity, and violent content are inordinately prevalent” in material reshared by users and promoted by the company’s own mechanics.

“As long as your goal is creating more engagement, optimizing for likes, reshares and comments, you’re going to continue prioritizing polarizing, hateful content,” she said.
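Her criticism targets a specific optimization choice, and a toy sketch can make it concrete. The following is a minimal, hypothetical illustration of engagement-based ranking, assuming a model that predicts likes, reshares and comments for each post; the names, weights and scoring rule here are invented for illustration and do not come from Facebook’s systems.

```python
# A minimal, hypothetical sketch of the ranking dynamic Ms. Haugen
# describes. Field names, weights and the scoring rule are invented
# for illustration; this is not Facebook's actual system.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_likes: float     # model estimates of user reactions
    predicted_reshares: float  # reshares spread content the furthest
    predicted_comments: float  # comment threads often spike on divisive posts


def engagement_score(post: Post) -> float:
    # Pure engagement optimization: the score rewards whatever provokes
    # a reaction, with no term that penalizes toxicity or falsehood.
    # The weights are arbitrary placeholders.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_reshares
            + 3.0 * post.predicted_comments)


def rank_feed(posts: list[Post]) -> list[Post]:
    # Order the feed by engagement alone; if polarizing content is the
    # most engaging, it lands at the top by construction.
    return sorted(posts, key=engagement_score, reverse=True)
```

Because nothing in such a score penalizes harmful content, demoting it would require changing the objective itself, which is the kind of simplification Ms. Haugen argues for.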

Beyond that, she has some business ideas she’d like to pursue—and she would like to think about something other than Facebook.

“I’ve done a really good job figuring out how to be happy,” she said. “Talking about things that make you sad all the time is not the way to make yourself happy.”

Write to Jeff Horwitz at [email protected]

Copyright ©2021 Dow Jones &amp; Company, Inc. All Rights Reserved.
