Former Facebook employee says she aims to help spur change at the social-media giant
In a series of interviews, Ms. Haugen, who left the company in May after nearly two years, said she had come to the job with high hopes of helping Facebook fix its vulnerabilities. She soon grew skeptical that her team could make an impact, she said. Her team had few resources, she said, and she felt the company put growth and user engagement ahead of what it knew, through its own research, about its platform's ill effects.
At the end of her time at Facebook, Ms. Haugen said, she was convinced that people outside the company – including lawmakers and regulators – should know what she had discovered.
“If people hate Facebook more than I do because of what I’ve done, I’ve failed,” she said. “I believe in truth and reconciliation – we need to accept reality. The first step is documentation.”
In a written statement, Facebook spokesman Andy Stone said, “Every day our teams balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to combat the spread of misinformation and harmful content. To suggest that we encourage bad content and do nothing is simply not true.”
Ms. Haugen, 37, resigned from Facebook in April. She stayed on for another month to hand over some projects. She also scoured the company’s internal social network, called Facebook Workplace, for material showing where she felt the company had failed to take responsibility for the welfare of its users.
She said she was surprised by what she found. The Journal’s series, based on documents she collected as well as interviews with current and former employees, describes how the company’s rules favor the elite; how its algorithms fuel discord; and how drug cartels and human traffickers openly use its services. An article about Instagram’s effects on the mental health of teenage girls prompted a Senate subcommittee hearing last week in which lawmakers called the revelations a “bombshell.”
Ms. Haugen said she half expected to be caught as she reviewed thousands of documents over several weeks. Facebook logs employee activity on Workplace, and she was searching parts of the network that, while open to her, were not related to her job.
She said she began thinking about leaving a message for Facebook’s internal security team for whenever it eventually reviewed her search activity. She liked most of her colleagues, she said, and knew some of them would feel betrayed. She knew the company would, too, but thought the stakes were high enough that she needed to speak out, she said.
On May 17, shortly before 7 p.m., she logged in for the last time and typed a final message into Workplace’s search bar, trying to explain her motives.
“I don’t hate Facebook,” she wrote. “I like Facebook. I want to save it.”
Ms. Haugen was born and raised in Iowa, the daughter of a doctor father and a mother who left an academic career behind to become an Episcopal priest. She said that she prides herself on being a rule-follower. For the last four Burning Man celebrations, the annual desert festival popular with the Bay Area tech and arts scene, she served as a ranger, arbitrating disputes and enforcing the community’s safety-focused code.
Ms. Haugen had previously worked at Alphabet Inc.’s Google, Pinterest Inc. and other social-media companies, specializing in designing algorithms and other tools that determine what content is served to users. Google paid for her to attend Harvard and earn a master’s degree in business administration. She returned to the company in 2011, but soon began battling an autoimmune disorder.
“I came back from business school, and my health immediately started to deteriorate,” she said. Doctors were initially baffled. By the time she was diagnosed with celiac disease, she had permanent nerve damage in her arms and legs, leaving her in pain. Once able to cycle 100 miles in a day, she now struggled to move.
Ms. Haugen resigned from Google in early 2014. Two months later, a blood clot formed in her thigh, sending her to the intensive care unit.
A family acquaintance was hired to assist her with chores, and he became her main companion during a year she spent largely at home. The young man bought groceries, took her to doctor’s appointments, and helped her regain her ability to walk.
“It was a really important friendship, and then I lost it,” she said.
The friend, who once held liberal political views, was spending more and more time on online forums about how dark forces were manipulating politics. In an interview, the man recalled that Ms. Haugen had tried, unsuccessfully, to intervene as he drifted toward a mix of the occult and white nationalism. They broke off their friendship, and he later left San Francisco before giving up such beliefs, he said.
Ms. Haugen’s health improved, and she went back to work. But the loss of her friendship changed the way she thought about social media, she said.
“It’s one thing to study misinformation, it’s another to lose someone,” she said. “A lot of the people who work on these products only see the positive side of things.”
When a Facebook recruiter got in touch in late 2018, Ms. Haugen said, she replied that if the job touched on democracy and the spread of false information, she might be interested. During the interview, she said, she told managers about her friend and how she wanted to help Facebook stop its users from going down a similar path.
She began in June 2019, part of a nearly 200-person Civic Integrity team that focused on election issues around the world. While it was only a small part of Facebook’s overall policing efforts, the team became a central player in the investigation of how the platform could spread political lies, incite violence and be abused by malicious governments.
“I feel so sorry for the people who spend their lives working on these things.”
Ms. Haugen was initially asked to create tools to study the potentially malicious targeting of information at specific communities. Her team, which consisted of her and four other new employees, was given three months to build a system to trace the practice, a timeline she deemed impossible. The effort did not succeed, and she received a poor initial review, she said. She recalled a senior manager saying that people at Facebook accomplish what needs to be done with far fewer resources than anyone would think possible.
Around her, she saw small groups of employees facing big problems. She said the core team responsible for detecting and combating human exploitation – which included slavery, forced prostitution and organ selling – consisted of only a few investigators.
“I would ask why more people weren’t being hired,” she said. “Facebook acted like it was powerless to staff these teams.”
“We have invested heavily in people and technology to keep our platform secure, and have made fighting misinformation and providing authoritative information a priority,” said Facebook’s Mr. Stone.
Ms. Haugen said the company was unwilling to accept initiatives to improve safety if they made it harder to attract and engage users, which discouraged her and other employees.
“What did we do? We’ve built a massive machine that optimizes for engagement, whether it’s real or not,” read a fall 2019 presentation from the Connections Integrity team, an umbrella group that worked on “shaping a healthy public content ecosystem.” The presentation described the consequences as viral misinformation and societal violence.
Ms. Haugen had come to see herself and the Civic Integrity team as an understaffed cleanup crew.
She said she was especially concerned about the dangers Facebook could pose in societies just gaining access to the internet, and saw Myanmar’s social-media-fueled genocide as a template, not an anomaly.
She talked about her concerns with her mother, the priest, who advised her that if she felt lives were on the line, she should do what she could to save them.
Facebook’s Mr. Stone said the company’s goal was to provide a safe, positive experience for its billions of users. “Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business,” he said.
On December 2, 2020, Samidh Chakraborty, the team’s founder and head, called an all-hands teleconference meeting. From her San Francisco apartment, Ms. Haugen heard him announce that Facebook was disbanding the team and reshuffling its members into other parts of the company’s integrity division, a broader group that worked to improve the quality and credibility of the platform’s content.
According to Ms. Haugen and another person present, Mr. Chakraborty praised what the team had achieved “at the cost of our family, our friends and our health.” He announced he was taking a leave of absence to recharge, but urged his employees to keep fighting and to speak up “constructively and respectfully” whenever they saw Facebook putting short-term interests above the community’s long-term needs.