Frances Haugen says she was hired to work at Facebook in 2019, before the highly contentious presidential election and before the unforgettable year of 2020. Over the months that followed, the social media giant that hired her became a top platform for the distribution of hatred and misinformation.
Her time there saw not only a presidential election. There was a global pandemic; a reckoning over racial justice following the death of George Floyd; the suspension of sports, school, business, and daily life; and the Capitol Hill insurrection that tried to overturn the 2020 election.
Haugen, who had previously worked at Google and Pinterest, said she was brought on to deal with misinformation circulating on Facebook. She appeared on 60 Minutes to talk about some of those issues.
“What I saw at Facebook over and over again was a conflict between what was good for the public and what was good for Facebook, and Facebook repeatedly chose to optimize for its own interests, like making more money,” Haugen said on the show.
Haugen filed complaints with the Securities and Exchange Commission (SEC) and shared documents with The Wall Street Journal. According to the report, the evidence was not just an email here or there, nor a slip of the tongue at the water cooler.
“I’ve seen a bunch of social networks, and it was worse at Facebook than anything I’d seen before,” Haugen said Sunday night. “At some point in 2021, I realized I had to do this in a systemic way, that I had to get out enough [documents] that no one could question that it was real.”
Haugen said that after the 2020 presidential election, Facebook dissolved its civic integrity team, a decision that twisted her gut about the company’s direction. She believes the demise of that “integrity team” opened the door for Facebook to be used as a staging ground for the January 6 riot at the Capitol.
“They basically said, ‘Oh good, we made it through the election, there weren’t riots, we can get rid of civic integrity now.’ When they got rid of civic integrity, that was the moment I thought, ‘I don’t trust that they’re really willing to invest what needs to be invested to keep Facebook from being dangerous.’”
Facebook spokesperson Lena Pietsch told CNN Business on Sunday that the company’s apps do more good and cause less harm than people think. “Eliminating misinformation is a top priority,” she said.
“Every day, our teams have to balance protecting the ability of billions of people to express themselves with the need to keep our platform safe and positive,” Pietsch said. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
A Facebook spokesperson also said Sunday that the company’s research is not only shared internally but is also viewed by outside parties to further the discussion.
“We do a lot of research, and we share it with external researchers as much as we can, but do remember that there is a difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform debate,” said Nick Clegg, Facebook’s vice president of global affairs.
In the end, Haugen’s message in Sunday night’s whistleblower interview was that Facebook cares more about dollars than about the emotions it stirs up.
“One of the consequences of how Facebook is picking out content today is that it is optimizing for content that gets engagement, a reaction. But its own research shows that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than to other emotions,” Haugen said. She added, “If they make the algorithm safer, people will spend less time on the site, they’ll click on fewer ads, and Facebook will make less money.”