According to the Associated Press, Facebook’s user base is changing as younger people sign in less often and see it as an “outdated network” with “irrelevant content.”

As the company continues to make news for its scandals and privacy issues, CEO Mark Zuckerberg and his executives are trying to address the declining engagement and interest among teens and young adults.

According to a November 2020 internal document, young adults engage with Facebook far less frequently than older users and think the social network is “boring, misleading and negative”.

Facebook is aware of the risk of losing its youthful demographic and says its products are still widely used by teens, but according to the AP, its own researchers say the company faces “tough competition” for young users’ attention from rivals like TikTok and Snapchat.

Congress is now closely examining the social network, with many waiting to see what Zuckerberg’s next move will be to make Facebook relevant to young people again.

For more reporting from the Associated Press, see below.

Thousands of pages of internal documents provided to Congress by a former Facebook employee depict a company at war with itself, where evidence of the damage it has caused is abundant but the will to act on that evidence is scarce, halting at best.

The crisis highlighted by the documents shows how Facebook, despite its regularly declared good intentions, has been slow to act against, or has sidelined, efforts to undo the real-world damage the social network created as it grew. The documents reveal numerous instances in which researchers and rank-and-file workers uncovered deep-seated problems that the company then overlooked or ignored.

The ultimate responsibility for this situation rests with Zuckerberg, whom a former employee describes as wielding dictatorial power over a corporation that collects data on, and provides free services to, nearly 3 billion people worldwide.

“Ultimately, it rests with Mark and whatever his prerogative is – and it has always evolved to increase his power and his reach,” said Jennifer Grygiel, a Syracuse University communications professor who has followed Facebook closely for years.

Zuckerberg has a firm hold on Facebook Inc. He owns the majority of the company’s voting shares, controls its board of directors and surrounds himself with executives who do not question his vision.

To keep expanding its reach and power, Facebook has emphasized user growth outside the US and Western Europe. But as it pushed into less familiar parts of the world, the company failed to anticipate, let alone address, the unintended consequences of signing up millions of new users without providing the staff and systems needed to detect and limit the spread of hate speech, misinformation, and incitement to violence.

In Afghanistan and Myanmar, for example, a systemic lack of language support for content moderation – whether carried out by human reviewers or artificial intelligence – has allowed extremist language to flourish. In Myanmar, it has been linked to atrocities committed against the country’s Rohingya Muslim minority.

But Facebook seems unwilling to acknowledge that its unchecked growth has caused real-world collateral damage, and it has done little to prevent it. Those pitfalls include opaque algorithms that radicalize users, widespread misinformation and extremism, the facilitation of human trafficking, teen suicide, and more.

Internal efforts to mitigate such problems are often pushed aside or abandoned when the solutions conflict with growth – and, by extension, profit.

Backed into a corner by hard evidence from the leaked documents, the company has doubled down on defending its choices rather than trying to fix its problems.

“We don’t and we haven’t prioritized engagement over safety,” Monika Bickert, Facebook’s head of global policy management, told the AP this month. In the days after Haugen’s Senate testimony and her appearance on “60 Minutes” – during which Zuckerberg posted a video of himself sailing with his wife, Priscilla Chan – Facebook has repeatedly sought to discredit Haugen by noting that she did not work directly on many of the problems she disclosed.

“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook tweeted earlier this month from its public relations “Newsroom” account, after learning that a consortium of news organizations was working on stories about the internal documents.

“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own business interests lie,” Facebook said in a prepared statement last Friday. “The truth is we’ve invested $13 billion and have over 40,000 people to do one thing: keep people safe on Facebook.”

Such statements are the latest sign that Facebook has adopted what Sophie Zhang, a former data scientist at the company, describes as a “siege mentality.” Zhang last year accused the social network of ignoring fake accounts used to undermine foreign elections. As more whistleblowers – most notably Haugen – have come forward, that mentality has only hardened.

“Facebook is going through an authoritarian narrative spiral, where it becomes less responsive to employee criticism and internal dissent, and in some cases cracks down on it,” said Zhang, who was fired from Facebook in the fall of 2020. “And that leads to more internal dissent.”

“I have seen many colleagues who are extremely frustrated and angry, while at the same time feeling powerless and [disheartened] about the current situation,” one employee, whose name was redacted, wrote on an internal message board after Facebook decided last year to leave up incendiary posts by former President Donald Trump that suggested Minneapolis protesters could be shot. “My take is, if you want to fix Facebook, do it within.”

This story is based in part on disclosures Haugen made to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the AP.

They expand on the data collected about problems as far-flung as the trafficking of domestic workers in the Middle East; an overzealous crackdown on Arabic content that critics say muzzled free speech while hate speech and abuse flourished; and rampant anti-vaccine misinformation that researchers found could have been easily reduced with subtle changes to how users see posts in their feeds.

The company insists it “does not conduct research and then systematically and willfully ignore it if the findings are inconvenient for the company.” That claim, Facebook said in a statement, “can only be made by cherry-picking quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only one right answer.”

Haugen, who testified before the Senate this month that Facebook’s products “harm children, stoke division and weaken our democracy,” said the company should declare “moral bankruptcy” if it is to move past all of this.

At this point, that seems unlikely. There is a deep conflict between profit and people inside Facebook – and the company doesn’t seem ready to give up its assertion that it is good for the world, even as it regularly makes decisions aimed at maximizing growth.

“Facebook conducts regular surveys of its employees – what percentage of employees believe that Facebook is making the world a better place,” Zhang recalled.

“When I joined, it was about 70 percent. When I left, it was about 50 percent,” said Zhang, who was fired in the fall of 2020 after more than two years at the company.

Facebook did not say where that number stands today.