Facebook, following bruising testimony that its platform is harming children, will introduce a number of features, including prompting teens to take a break from its photo-sharing app Instagram, and "nudging" teens if they are repeatedly viewing the same content that is not conducive to their well-being.
Menlo Park, Calif.-based Facebook also plans to introduce new controls for adults on an optional basis so that a parent or guardian can monitor what their teen is doing online. The move comes after Facebook announced late last month that it was halting work on its Instagram for Kids project. But critics say the plan lacks details, and they doubt the new features will be effective.
The new controls were outlined Sunday by Facebook's vice-president of global affairs, Nick Clegg, who made the rounds on various Sunday news shows, including CNN's State of the Union and ABC's This Week with George Stephanopoulos, where he was questioned about Facebook's use of algorithms prior to the January 6 Capitol riots, as well as its role in spreading harmful misinformation.
"We're constantly iterating in order to improve our products," Clegg told Dana Bash on CNN's State of the Union on Sunday. "We cannot, with a wave of a magic wand, make everyone's life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use as possible."
Clegg said Facebook has invested US$13 billion over the years to keep the platform safe and that the company has 40,000 people working on these issues.
- Facebook whistleblower Frances Haugen could face legal retaliation for disclosure
The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens, and of being dishonest in its public fight against hate and misinformation.
Haugen's allegations were supported by thousands of pages of internal research documents she secretly copied before leaving her job in the company's civic integrity unit.
Josh Golin, executive director of Fairplay, a watchdog group focused on children's media and marketing, said he doesn't think introducing parental controls to help monitor teens would be effective, since many teens set up secret accounts anyway.
He was also skeptical about how effective nudging teens to take a break or steering them away from harmful content would be. He said Facebook needs to explain how it would implement the features and offer research showing that these tools are effective.
“There is tremendous reason to be skeptical,” he said. He added that regulators need to restrict what Facebook does with its algorithms.
Golin said he also believes Facebook should cancel its Instagram for Kids project outright.
- Ottawa urged to crack down on Facebook after damning whistleblower testimony before US Senate
- Whistleblower allegations against Facebook could be watershed moment, says former employee
When Clegg was grilled by both Bash and Stephanopoulos in separate interviews about the use of algorithms in amplifying misinformation ahead of the January 6 riots, he responded that if Facebook removed the algorithm, people would see more, not less, hate speech, and more, not less, misinformation.
Clegg told both hosts that the algorithm worked as a “giant spam filter.”
Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust and Consumer Rights, told Bash in a separate interview Sunday that it is time to update children's privacy laws and bring more transparency to the use of algorithms.
"I appreciate that he's willing to talk things out, but I believe the time for conversation is done," Klobuchar said, referring to Clegg's plan. "Now is the time to take action."