Facebook’s former director of monetisation has said he worked to make the platform “addictive” by taking “a page from Big Tobacco’s playbook”.
“I fear we are pushing ourselves to the brink of a civil war”, Tim Kendall wrote.
Mr Kendall’s comments come in written testimony submitted to the House Consumer Protection and Commerce Subcommittee.
He said that while he believed his job was to figure out the business model for Facebook, it was instead to “mine as much attention as humanly possible and turn it into historically unprecedented profits”.
Mr Kendall compares Facebook to tobacco companies, which added sugar and menthol to cigarettes so that smokers could keep smoking for longer.
“At Facebook, we added status updates, photo tagging, and likes”, Mr Kendall says, features that he argues paved the way for the spread of “misinformation, conspiracy theories, and fake news”.
Mr Kendall also said that Facebook’s algorithms “rewired” users’ brains to maximise attention and, by extension, profits.
“Extreme, incendiary content—think shocking images, graphic videos, and headlines that incite outrage—sowed tribalism and division”, he wrote.
“When it comes to spreading extremist views and propaganda, these services have content moderation tools—but the tools can never seem to keep up, despite these companies having billions of dollars at their disposal”, Mr Kendall said.
Facebook, its subsidiary Instagram, and many other social networking sites have been criticised for not moderating content effectively.
Section 230 of the Communications Decency Act shields any website or service that hosts content – like news outlets’ comment sections, video services like YouTube and social media services like Facebook and Twitter – from lawsuits over content posted by users.
“I can think of few industries that enjoy such broad immunity and none that have profited so greatly from this lack of basic regulation”, Mr Kendall wrote.
The legislation has come under attack from President Donald Trump over Twitter’s fact-checking of his tweets.
Mr Kendall is not the only former Facebook employee to criticise the company.
A former data scientist who was fired by the company claimed Facebook had been ignoring evidence that fake accounts on its platform were disrupting political events across the world.
Sophie Zhang, a mid-level employee, said she had “blood on her hands” and claimed to have held power over national events that affected multiple elections.
The Independent has reached out to Facebook for comment.