March 28, 2024

Facebook and Journalism: Part Two


March 2018, Hamburg: Mathias Döpfner, CEO of Axel Springer SE and President of the Federal Association of German Newspaper Publishers, at the ‘Online Marketing Rockstars’ fair. Axel Heimken/Press Association


Facebook has fundamentally changed the news ecosystem.


By Mohamed Amdi / 08.10.2018

When asked in December 2016 what kind of company Facebook was, exactly, Mark Zuckerberg replied that “Facebook is a new kind of platform. It’s not a traditional technology company. It’s not a traditional media company.” Zuckerberg has always been reluctant to describe his company as a publishing house, but the control he has acquired over who gets to see what, and when, makes him the most powerful news editor the world has ever seen. And although Facebook does not produce any content of its own, the news publishers’ dependence on the social medium for the distribution of their content has turned it into a de facto news company.

One of the more serious consequences of this development is that neither news publishers nor news consumers have much control over the flow of news content. Instead, Facebook can decide which user sees what type of information, based on detailed psychographic profiles. The company therefore has a considerable impact on how its more than two billion active users perceive the world. How easily Facebook can affect the moods of its users has been demonstrated by a number of controversial experiments conducted without the users’ knowledge. In 2013, for example, a group of psychologists set out to test whether a change in the news feed could alter the mental states of Facebook users. By tweaking the algorithm, the scientists examined whether exposure to positive or negative posts would lead to emotional contagion. They found strong evidence for mood alterations on Facebook: “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred”, they concluded.
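To make the experiment’s design concrete: it amounts to down-weighting posts of one emotional polarity in a user’s feed and then measuring the sentiment of what that user writes afterwards. The toy simulation below is purely illustrative – the sentiment scores, the filtering rule and the “contagion” parameter are all invented, and it bears no relation to Facebook’s actual code:

```python
import random

# Toy illustration of the experiment design described above:
# suppress posts of one emotional polarity in a user's feed,
# then compare the sentiment of the posts the user writes afterwards.
# All numbers and the "contagion" rule are invented for illustration.

random.seed(42)

def make_feed(n=1000):
    """Each post gets a sentiment score in [-1, 1] (negative to positive)."""
    return [random.uniform(-1, 1) for _ in range(n)]

def filter_feed(feed, suppress="positive", rate=0.5):
    """Drop a fraction of posts of the given polarity, as the study did."""
    kept = []
    for s in feed:
        is_target = (s > 0) if suppress == "positive" else (s < 0)
        if is_target and random.random() < rate:
            continue  # suppressed by the tweaked ranking
        kept.append(s)
    return kept

def user_response(feed, contagion=0.3):
    """Assume the user's own posting mood drifts towards the feed's mean mood."""
    baseline = 0.0  # a neutral user
    feed_mood = sum(feed) / len(feed)
    return baseline + contagion * feed_mood

control = make_feed()
treated = filter_feed(make_feed(), suppress="positive", rate=0.5)

print(f"control user mood: {user_response(control):+.3f}")
print(f"treated user mood: {user_response(treated):+.3f}  (positives suppressed)")
```

Under these invented assumptions, the “treated” user’s output skews negative once positive posts are suppressed – a schematic of the pattern the researchers reported, not a reproduction of their study.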

There have been fears that Facebook could use the sway it holds over its users’ emotions to influence election results. Unbeknownst to its users, Facebook conducted several social experiments during the 2012 US presidential election campaign to see whether more political news in one’s news feed would increase interest in politics and, consequently, voter turnout. The data analysts concluded that Facebook did have an impact on voting behaviour, as turnout did indeed go up as a consequence of the experiment. Likewise, in 2016 Facebook launched its “get out the vote” campaign, which, according to a post by Zuckerberg, “helped as many as 2 million people register to vote”. What Zuckerberg presented as his company’s civic responsibility was seen by many as meddling in the election campaign. Facebook reacted by declaring that it would never use the platform to influence electoral results: “Voting is a core value of democracy and we believe that supporting civic participation is an important contribution we can make to the community. We encourage any and all candidates, groups, and voters to use our platform to share their views on the election and debate the issues. We as a company are neutral – we have not and will not use our products in a way that attempts to influence how people vote.”

Conservative groups in particular were not persuaded by these promises of neutrality – not only because Zuckerberg has never made any bones about his support for the Democratic party, but more crucially, because Facebook has been accused in the past of wilfully suppressing conservative news publishers. In January 2014, Facebook launched “Trending Topics”. Thanks to this news feature, which showed up on the right side of the News Feed, Facebook users could see which stories were most popular on the platform at a given moment. Publishers, for their part, were able to extend their reach and engagement, and advertisers could adapt their ad campaigns to the prevailing trends. In May 2016 the tech website Gizmodo published an investigative piece based on interviews with Facebook “news curators”, who claimed that conservative news stories were deliberately prevented from showing up in the Trending Topics list. At the same time, according to the same Facebook employees, news about liberal causes, such as posts relating to the Black Lives Matter movement, was artificially injected into the list to push its reach. The curators also said they had been instructed not to circulate news about Facebook itself via the Trending Topics feature. Gizmodo’s article not only accused Facebook of political bias but also showed that the distribution of news was not merely based on algorithms: human editors were at work as well.

The report seemed to corroborate the fears of Trump supporters, who worried about Facebook’s opposition to their candidate following a leak documenting how Facebook employees had toyed with the idea of stopping Donald Trump from becoming the next US president. Tom Stocky, who was responsible for the Trending Topics feature, and Mark Zuckerberg responded to the allegations by denying any wrongdoing, and an internal investigation found no evidence to support Gizmodo’s claims. “We have rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives”, Zuckerberg wrote in a May 2016 post. Nonetheless, following Gizmodo’s scathing article Zuckerberg met with prominent conservatives such as Glenn Beck. A few months after the meeting, Facebook decided to get rid of its news editors in favour of a distribution based entirely on algorithms.

While conservative groups have voiced concern over Zuckerberg’s support for the Democratic party, several newspapers have documented how the Republican party itself instrumentalised the platform during the 2016 elections by flooding the News Feed with fake news in order to promote Trump and denigrate his adversaries. Fake news is believed to have strongly influenced the outcome of the US presidential election – though its political impact is difficult to measure. According to a 2016 BuzzFeed study, the most successful election-related fake news stories generated more engagement on Facebook than the most successful posts from major newspapers, and Facebook’s algorithms are believed to have prioritised fake news over accurate reporting – mostly to the advantage of the GOP. The Cambridge Analytica (CA) and AggregateIQ scandals further corroborated the accusations against the Republican party. (Interestingly enough, CA’s parent firm Strategic Communications Laboratories Group has been hired by several national defence departments to disseminate military propaganda.) Fake news has also influenced political decisions in Europe: disinformation is believed to have strongly affected the outcome of the Brexit referendum. In April 2018, Facebook’s chief technology officer Mike Schroepfer had to give evidence to the UK’s Digital, Culture, Media and Sport Select Committee on the issue of fake news and its impact in Britain.

Political leaders worldwide have reacted to the danger that fake news poses to upcoming elections and have urged Zuckerberg to take measures against the scourge. What Gizmodo’s revelations, as well as the scandals around elections and referenda, show is that the distribution of news on Facebook is anything but impartial. The platform can be – and has been – misused for political manipulation and propaganda.

Monetizing

Financial incentives are an important factor in the propagation of fake news, as well as in the decline of the accuracy and quality of news. In the current system, publishers hope to attract as many clicks as possible in order to monetize the ads placed on their websites. Under this ad-based monetization system, quality and truth become subordinate to reach and engagement. As a consequence, online journalism has adopted the style and methods of tabloids and muckraking newspapers in order to attract the attention of Facebook users. Articles are often emotionally charged, strongly biased, hyperbolic or intentionally provocative. Phenomena such as clickbait and engagement bait have invaded the social network. Viral propagation, shitstorms and hashtag campaigns have become the lifeblood of online journalism. Even highly respected news publishers have preferred to produce trashy pieces and infotainment that may be low-quality but that nonetheless generate a great deal of engagement.

The Facebook pages of leading international newspapers are filled with so-called junk food news. There is, in fact, an enormous quality gap between the print and online versions of the same news publishers: the quality of online newspapers is considerably lower than that of the print versions, owing to the “eye-catching” effects needed to attract the attention and the clicks of Internet users.

News consumers have become very sceptical of the press in general and are questioning the objectivity and accuracy of news reporting. A 2018 global report by the Edelman Trust Barometer, which surveyed people in 28 countries, showed that the media in general are facing a crisis of trust: 66% of respondents agreed with the statement that news organisations are overly focused on attracting large audiences rather than on reporting, 65% found that news organisations sacrifice accuracy in order to be the first to break a story, and 59% thought that news media support an ideology instead of informing the public. The study also showed that seven in ten people worry about fake news being used as a weapon, and 63% of respondents admitted that they were unable to tell good journalism from rumour or falsehood.

Polarisation and echo chambers

Both fake news and partisan low-quality articles have led to two other dangerous and related developments: the polarisation of society and online echo chambers. Content with viral potential often deals with divisive and highly emotional issues such as immigration, as well as identity-related topics such as race or gender. By flooding the platform with partisan articles on these topics in order to generate more engagement, newsrooms have contributed to the further fragmentation of society along ideological and identity-based lines. Moreover, the result of this distrust is not that people read less news, or that they restrict their news consumption to renowned publishers, but rather that they only read news that confirms their pre-existing beliefs. Ironically, this only worsens the problem, as confirmation replaces accuracy as the yardstick for good journalism. News consumers are less likely to read and engage with content that does not conform to their pre-existing beliefs, which in turn leads to more polarisation.

Facebook plays a big part in this development. Algorithm-generated distribution may reduce the risk of biased human intervention, but it opens the door to other dangers, such as the risk of creating echo chambers. This concept denotes selective exposure to ideas that a given individual already agrees with, thereby reinforcing her belief system. As stated above, most news items that appear on a user’s wall tally with her perceptions, reinforcing the impression that her own beliefs are the predominant ones. The user is only confronted with topics she is interested in and with opinions she agrees with. The digital world makes it easier for individuals who share certain outlooks and beliefs to gather in virtual discussion spaces and mutually confirm their common bias. Several studies have indeed found evidence that echo chambers exist on Facebook and that Facebook users are highly polarised. This polarisation obviously has consequences in the offline world, too.
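To see why purely engagement-driven ranking tends towards echo chambers, consider a deliberately crude simulation: a user clicks more often on items close to her own position, and the ranker boosts whatever earned clicks. Every parameter below is invented; this is a caricature of the mechanism, not Facebook’s algorithm:

```python
import random

# Caricature of an engagement-driven ranking loop producing an echo chamber.
# Opinions are points in [-1, 1]; the user engages more with items near her
# own position; the ranker boosts whatever got engagement. All numbers are
# invented for illustration.

random.seed(0)
user_position = 0.8  # a user strongly attached to one pole
weights = {}         # learned boost per opinion "bucket"

def engagement_prob(item):
    """Closer to the user's position -> more likely to be clicked."""
    return max(0.0, 1.0 - abs(item - user_position))

def rank(items):
    """Score items by their learned boost; break ties randomly."""
    return sorted(items,
                  key=lambda i: weights.get(round(i, 1), 0.0)
                  + random.random() * 0.1,
                  reverse=True)

for day in range(30):
    inventory = [random.uniform(-1, 1) for _ in range(50)]
    feed = rank(inventory)[:10]  # show the top 10 items
    for item in feed:
        if random.random() < engagement_prob(item):
            key = round(item, 1)
            weights[key] = weights.get(key, 0.0) + 1.0  # ranker "learns"

shown = rank([random.uniform(-1, 1) for _ in range(50)])[:10]
print("day-30 feed positions:", [f"{i:+.2f}" for i in shown])
print("mean shown position:  ", f"{sum(shown) / len(shown):+.2f}")
```

After a few iterations, almost everything shown clusters around the user’s own position: selective exposure emerges from nothing more than engagement maximisation.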

After initially dismissing the scope and effects of fake and low-quality news on its platform, Facebook has promised to introduce new measures to curb these phenomena. Acknowledging that financial gain is one of the main incentives for spreading fake news, clickbait headlines and sensationalist articles, Facebook made several updates to make it more difficult to reap profits from such misleading posts. In August 2017, for instance, Facebook introduced an update to block ads from users who had repeatedly shared false or misleading content. Critics were quick to warn about possible abuses of power: Facebook should not take the spread of misinformation on its platform as an excuse to suppress content. In an interview with the Swiss publication “Blick”, Mathias Döpfner, CEO of the German publishing house Axel Springer, explained the dangers of Facebook turning into a news editor: “Those who are now calling upon Facebook to employ an editor-in-chief and to verify the accuracy of independent editorial texts, deleting them where appropriate, they are creating a kind of global super censor and are destroying precisely the diversity that makes up our democracy.” Dunja Mijatović, the Representative on Freedom of the Media for the Organization for Security and Co-operation in Europe, seconded Döpfner and argued that monitoring content “may just cause greater harm to free expression than any lie, no matter how damaging”. A number of NGOs advocating free speech, such as IFEX and Article19, have documented cases of government censorship justified on the grounds of protection against fake news, underlining the dangers of instrumentalising the debate for political goals.

Global super censor

Facebook has repeatedly pledged that, as a tech company, it was not going to make such editorial decisions. “We cannot become arbiters of truth ourselves, it’s not feasible given our scale, and it’s not our role. Instead, we’re working on better ways to hear from our community and work with third parties to identify false news and prevent it from spreading on our platform.” One possibility is to work with third parties who check the veracity of news posted online. In December 2016 Facebook announced its collaboration with the International Fact-Checking Network (IFCN) to fight fake news: once a news item has been identified as fake, the fact-checkers can issue a rebuttal that is then attached to the original link. Döpfner has criticized the initiative, arguing that journalists end up doing even more free work for Facebook to solve its fake news problem instead of producing quality articles. Some fact-checkers have themselves questioned the effectiveness of the initiative. It has also been argued that the project puts newspapers in a conflict of interest, given that selected media outlets are granted the privilege of assessing other newspapers’ articles.

Additionally, Facebook has introduced measures to expose users to a more diverse range of sources. In April and October 2017, it tested a function that shows related articles in order to offer different perspectives, as well as a function that provides context for articles shared in the news feed, such as information on the publisher or related articles, so that readers can make “an informed decision about which stories to read, share, and trust”. Another early attempt was a function that let users flag news as “disputed”. According to Facebook, this would make it easier to tell fake articles from authentic ones and thus prevent the spread of hoaxes.

A year after introducing the function, Facebook withdrew it, admitting that it had produced the opposite of the intended effect. In 2018, Zuckerberg introduced a new measure to rank publishers according to their trustworthiness: Facebook users could now decide for themselves which news publishers they deemed authentic, and publishers who are “broadly trusted across society, even by those who don’t follow them directly” would as a consequence be prioritised. This again drew criticism from those who accuse Facebook of “taking the path with least responsibility” – implying that Facebook should censor instead of continuing to make money from hoaxes.

“Meaningful interaction”

It is clear that, whatever the solutions offered, there will always be downsides. The press has become so dependent on Facebook that each adjustment may lead to undesired consequences. This was exemplified by Facebook’s introduction of a separate news feed that would display only non-promoted press content, while the original news feed would focus on posts made by friends and family in order to foster what Zuckerberg described as “meaningful interaction”.

Ads would still show up in the original news feed, as would news – if publishers paid for it to appear there. The feature was launched in six countries on 19 October 2017 and resulted in a massive loss of referral traffic for the news outlets concerned – especially smaller and local publishers relying on Facebook to reach their audience and make money. The reach of the Guatemalan publisher Soy502 dropped by 66%, according to one of its editorial board members, and publishers in Slovakia are reported to have lost between two thirds and three quarters of their Facebook reach. The failed experiment shows just how much the press is hooked on Facebook and what effects a change in Facebook’s policies can have on media pluralism – especially on local and small-sized media outlets in countries as far apart as Slovakia and Guatemala.

One of Facebook’s more ambitious and arguably less controversial initiatives has been the launch of its “Facebook Journalism Project” (FJP) under the aegis of Campbell Brown. The declared goal of this news initiative is to raise the quality of, and trust in, journalism. According to Fidji Simo, the director of the project, it rests on three pillars: collaborative projects, training for journalists and training for everyone. Education is thus one of FJP’s focal points, and it offers several courses directed at journalists who want to enhance their digital skills. In January 2018, Brown announced that FJP would offer scholarships to 100 aspiring journalists. Furthermore, FJP tries to advance media literacy through the “News Integrity Initiative” (NII), founded in collaboration with a number of other institutions to help people “make informed judgments about the news they read and share online”. NII has also pledged to support quality journalism with a $2.5 million grant to partners worldwide.

FJP has also started to reach out to local media with projects such as the Local News Subscriptions Accelerator, aimed at helping local media raise their subscription revenue, and apps such as Today In, which displays only local news and events. The results of these projects remain to be seen but, here again, publishers have questioned the effectiveness of the initiative, wondering whether FJP is not a mere PR stunt after all.

Reluctant to quit

While discontent with Facebook is constantly growing, media outlets have nonetheless been reluctant to quit the platform: they have become too dependent on the social network. News publishers are thus confronted with a difficult dilemma: should they continue to use Facebook as a distribution platform in order to reach larger audiences, at the expense of quality and control?

Or should they stop using Facebook, retain control over production and readership and produce high-quality journalism, but suffer a loss of audience as well as revenue? The dilemma involves Facebook users and news readers, too: should they consume free, low-quality news conveniently selected by a Facebook algorithm according to their interests and preferences, or should they turn to more cumbersome alternatives that involve fees but also higher quality standards? It seems that, without the willingness of readers to pay for journalism, news publishers will hardly have an incentive to quit Facebook.

Facebook, too, faces an important dilemma. Should it stress its identity as a tech company whose ultimate goal is to maximise its gain – in which case it can continue to monetize viral posts that are usually low-quality, sometimes false and often extremely divisive? Or should it identify more strongly as a news distributor, thereby assuming greater responsibility in its function as a primary news source for a growing number of its users worldwide? In the latter case, Facebook ought to promote better journalism, which entails sharing a greater part of its revenue with publishers – and the company may, as a consequence, also see reduced reach and engagement for many posts.

But even if Facebook opts for the latter, as it seemingly has over the last two years, it is still difficult to find a solution that satisfies all parties: advertisers, major publishers, local news outlets, governments, NGOs and, obviously, Facebook itself as well as its users. Changes to Facebook’s News Feed or algorithms may, for instance, have serious consequences for some publishers, as the fates of Slovakian and Guatemalan news outlets have shown. Furthermore, Facebook is caught between demands for more regulation of news content on the one hand, being accused of a laissez-faire attitude vis-à-vis fake news and low-quality garbage, and, on the other, accusations of using fake news as an excuse to monitor and censor content on its platform along the lines of ideological preferences.

Whatever decisions the different parties involved take, the repercussions for the press may be considerable. Facebook has fundamentally changed the news ecosystem and has, in fact, jeopardised press freedom and plurality – whether willingly or not.


Originally published by openDemocracy under a Creative Commons Attribution-NonCommercial 4.0 International licence.