“Individual humans” are to blame for spreading misinformation on Facebook, a top executive at the company said in a new interview Sunday.
Andrew Bosworth, who’s developed a reputation in Silicon Valley as a trusted deputy of Facebook CEO Mark Zuckerberg, swatted down critics who say the company has amplified misinformation about COVID-19 and other topics.
“Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” Bosworth said in an interview with “Axios on HBO” published Sunday.
In the interview, Bosworth appeared to equate regulating misinformation on the platform with removing content that some people simply find unfavorable.
“I don’t feel comfortable at all saying they don’t have a voice because I don’t like what they said,” he told Axios.
Bosworth joined Facebook in 2006 and has worked on a litany of key initiatives, including boosting advertising revenue. Next year, he’ll be elevated to chief technology officer of Meta, Facebook’s parent company.
When pressed by Axios on the company’s role in spreading at least some misinformation that may have contributed to vaccine hesitancy or otherwise crippled the world’s pandemic response, Bosworth again shifted blame onto individuals who, he said, wanted to see such information.
“That’s their choice. They are allowed to do that. You have an issue with those people. You don’t have an issue with Facebook. You can’t put that on me,” he said.
“People want that information,” he added.
“I don’t believe that the answer is ‘I will deny these people the information they seek and I will enforce my will upon them.’”
“At some point the onus is, and should be in any meaningful democracy, on the individual.”
Bosworth went on to question whether Facebook can itself define what misinformation is.
“Our ability to know what is misinformation is itself in question and I think reasonably so, so I’m very uncomfortable with the idea that we possess enough fundamental rightness even in our most scientific centers of study to exercise that kind of power on a citizen, another human, and on what they want to say and who they want to listen to,” he said.
Notably, Facebook has weighed in on what it deemed misinformation earlier in the pandemic: in February, the company banned posts claiming that COVID-19 was man-made or manufactured.
The site’s fact-checkers slapped a “False Information” notification over shares of a Feb. 23 Post op-ed by Steven Mosher, which said that the US couldn’t trust China’s story about the origins of COVID-19 and argued the virus might have escaped from a lab in Wuhan.
But in May, the company reversed that decision and said it would no longer remove posts asserting that COVID-19 was man-made, after President Joe Biden ordered US intelligence agencies to investigate whether the virus came from a Chinese lab.
Bosworth, who often shares his views on social media and tech more generally on his blog, has previously suggested that Facebook may cause some harm, but that the good outweighs the bad.
In a leaked 2016 memo, he wrote about a belief among some Facebook employees that the platform is “de facto good,” though it may lead to some bad outcomes, like a “terrorist attack coordinated on our tools.”
Bosworth and Zuckerberg later backtracked, saying that the memo was meant to criticize Facebook employees who believed that.