According to reports, Facebook made several emergency changes to its algorithm in the days after the US presidential election.
The New York Times reported that the social media giant changed how it promotes and demotes content in users’ news feeds. The change came after the company’s employees noticed an alarming amount of misinformation circulating on the platform.
Under the change, the algorithm gives more weight to posts from established news organizations such as The New York Times, CNN, and NPR. The weighting is based on “news ecosystem quality” (NEQ), an internal metric Facebook uses.
Publishers with high NEQ scores benefit from the change. On the other hand, highly engaged hyperpartisan pages, such as Occupy Democrats and Breitbart, were demoted.
Three people with knowledge of the matter told the Times that Mark Zuckerberg, Facebook’s CEO, personally signed off on the change.
Reportedly, the change was welcomed by some Facebook employees. Two sources said that the company held a meeting one week after the election, during which employees asked whether the friendlier news feed could become a permanent feature.
Facebook’s Algorithm Helps Defend Truth and the News
It is not clear from the Times report whether the change has already been rolled back or, if not, when it will be.
The Times also said that the social media giant had conducted research earlier in November. The study was meant to determine whether the content shared on its platform is predominantly good or bad for the world. Facebook surveyed users, asking them to categorize particular posts as one or the other.
The survey found a correlation between posts that gained a lot of views and posts categorized as “bad for the world.”
After the research, Facebook trained an algorithm to predict which posts users would see as “bad for the world” and make those posts less visible in people’s feeds. According to an internal report, the algorithm achieved its purpose of making news feeds “nicer.” However, users opened Facebook less often as a result.
The Times viewed the report’s findings, which described the results as good but noted the decrease in sessions. This motivated Facebook to try a different approach.
The company then adjusted the algorithm to demote content rated “bad for the world” less stringently.
The Change in the Algorithm Was Part of Facebook’s “Break Glass” Plans
The recent change was in line with Facebook’s “break glass” plans. The company spent several months developing these plans to respond to the aftermath of a contested election day. The result was a surge in visibility for mainstream and major publishers and, as mentioned, reduced visibility for highly engaged hyperpartisan pages.
The tweaked algorithm offers a vision of what a calmer, less divisive Facebook might look like. Some employees urged the social media giant to make the change permanent. However, it is not clear how a permanent change would affect the amount of time people spend on Facebook.
Facebook Will Possibly Roll Back the Change in the Algorithm
But Guy Rosen, the Facebook executive in charge of cleaning up the platform and overseeing its integrity, weighed in on the request to make the change permanent. According to him, the algorithm change was always meant to be temporary, and there are no plans to make it permanent.
Meanwhile, John Hegeman, who oversees Facebook’s news feed, noted in an interview that the social media giant might roll back the experiment and then study what it had learned.
The debate over the news feed changes illustrates a central tension some Facebook employees are feeling these days: the social media giant’s aspirations of improving the world are often at odds with its desire for dominance.
Facebook has faced growing scrutiny in recent months for its role in amplifying divisive and false information, and its employees have clashed over the company’s future. On one side are the idealists, including some rank-and-file workers and some executives, who want to do more to limit polarizing content and misinformation. On the other side are the pragmatists, who fear that those measures might hurt the social media giant’s growth or provoke a political backlash leading to painful regulation.
According to Facebook spokesman Joe Osborne, there are tensions in virtually every product decision the company makes. That is why the social media giant developed a companywide framework called “Better Decisions.” The framework aims to ensure that Facebook employees make decisions rigorously and that their goals are directly connected to delivering the best possible experiences for Facebook users.
Facebook Followers Love the Safe and Trusted News
However, these battles have taken a toll on morale. An employee survey conducted by the company this month found that workers at Facebook feel less pride in the company than in previous years. About half of Facebook employees believed the platform was having a positive impact on the world, down from roughly three quarters earlier this year, according to a copy of Pulse, the employee survey. The New York Times reviewed the survey and found that employees’ “intent to stay” had also dropped, along with confidence in leadership. Many Facebook users, meanwhile, reportedly welcomed the move, saying it helps struggling news companies and curbs misinformation.
Some Facebook employees have quit since election day, saying that they can no longer stomach working for a company whose products they consider harmful.