
‘Facebook has swallowed journalism’

Critically evaluate the argument that news organisations have lost control of their distribution.

Before the Internet, the distribution of news was much more expensive than it is now, and it was much clearer what constituted news and media, making regulation or self-regulation easier. With the rise of social media, and due to the commercialisation and digitisation of the media industry, journalism has been reshaped in tangible ways. Online platforms such as Facebook, Twitter and Snapchat have brought about new means through which citizens can produce, share and gain access to information (Watson and Wadhwa, 2014). The growth of these platforms means that individuals are no longer merely consumers of information; they actively participate in the production, distribution, curation and verification of information (Harlow and Harp, 2013). Even though many studies stress the democratising effects of the supposedly increased user control over the news process, it can be argued that this view underestimates the rising power of the news distributor in the media landscape.

The locus of power in the delivery and distribution of news, which once was controlled by media outlets, has now shifted not towards the user but towards platforms whose priorities often compete with those of journalism. The fundamental essence of journalism has not changed; it is still about reporting on the world’s events and adding context to explain them, but it is now mixed with a system built for scale, speed and revenue, dictated by social media platforms (Bell and Owen, 2017). In this essay, I am going to argue that the loss of control of distribution by media outlets has ‘swallowed’ journalism, and that the news distributors of today – social media platforms – are the new gatekeepers in the sphere of media. I will look at the proliferation of fake news on Facebook during the 2016 U.S. presidential election and evaluate how the platform helped to exacerbate problems related to the distribution of misinformation. My stance will be that in Facebook’s algorithm, all information – including journalism – is reduced and distributed based on a set of unregulated, hidden, rapidly iterating and individualised rules (Bell and Owen, 2017). Hoaxes and misleading news have been associated with the Internet since its origin, but it is only recently that these systematised misinformation campaigns have become apparent, and their effect on democracy and society carefully examined.

The 2016 US presidential election seems to have provided a rich breeding ground for fake news. Headlines such as “Pope backs Trump”, “Hillary sold weapons to ISIS” and “FBI Agent Suspected in Hillary Email Leaks Found Dead” went viral on Facebook during the election campaign, reaching millions of people and obtaining thousands of shares (Ticomb and Carson, 2017). A Buzzfeed News investigation found that in the three months before the end of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news organisations such as the New York Times, Washington Post, Huffington Post, NBC News and others (Silverman, 2016). Mainstream news organisations no longer control society’s media and information gates, and they are therefore largely powerless to stop the virality of fake news and the fracturing of news audiences on the Internet (Madison and Dejarnette, 2017).


This shows that the structure and the economics of social platforms favour the distribution of low-quality content over high-quality material (Bell and Owen, 2017). “Journalism with high civic value – journalism that investigates power or reaches underserved and local communities – is discriminated against by a system that favours scale and shareability” (Bell and Owen, 2017). Different forms of propaganda and misinformation have been circulating through media platforms for centuries, but the profound media power residing in one monopolistic platform arguably represents a unique threat.

Even though Facebook has recently been under increased public pressure to be held accountable for its algorithm and the spreading of misinformation, the main problem is often overlooked: the proliferation of fake news is caused by an unregulated news monopoly, determined solely by profit imperatives (Pickard, 2016). The earliest intellectual definition of gatekeeping is traced to Kurt Lewin, who defined the concept as a study of people in “key positions” along “social channels” where “desire and the resistance to change are expressed” (Silviu, 2015). To apply gatekeeping theory, Lewin determined that “one must identify those in control of the entrance and exit of the channel and must study the psychology of the gatekeepers, their system of values and beliefs that causes the decisions related to the traffic through each channel” (Mehrotra, 2016).

The basic assumptions, based on this theory and applied to journalism, are that there are countless events in the world and that the media is not able to report on all of them. Therefore, it can be argued that the true source of power in the media industry is not news production but control over news selection and distribution (Silviu, 2015). Throughout history, the gatekeepers of news selection and distribution have changed: from the wire editor to the press associations to the mainstream media outlets.

With the rise of the internet and social media platforms, however, the entire notion of a gate, and the idea of news organisations limiting what passes through it, is arguably no longer relevant in today’s society, and many theorists have – mistakenly, I will argue – championed the idea that gatekeeping power is now located with the end user (Bruns, 2003). For example, “on Facebook people decide about what information to add, withhold and disregard, and how to shape, localise and manipulate the information they channel through their profile” (Delulis, 2015). It is therefore assumed that users are the new gatekeepers: in navigating the web, they take on some of the role of the traditional gatekeeper-journalist, passing through the gates in the search for information and the evaluation of what is found.

On this view, the spread of production capabilities renders communication uncontrolled and anti-hierarchical (Mehrotra, 2016). However, the assumption that the Internet is a tool of freedom, and that social media platforms are networks where the audience holds the power, is arguably a mere illusion. The new realm of communication is actually being defined and confined by market logic (Poell and Van Dijck, 2013). In his paper, Bruns ignores what controls Facebook’s distribution: the algorithms that disseminate the information users produce.

News consumption through social media actually limits users’ ability to find accurate information amid all the noise. Users are driven by the platform’s architecture to use humour, post unique or socially relevant content and become more revelatory in order to gain the attention of their network, and, since the site is also a space for general expression of identity, to share a curated version of themselves (Marichal, 2016). This means that the information is not accurate and that consumers are not active, as Bruns has claimed; instead they passively consume news and allow the platform to control how reality is presented to them (Mehrotra, 2016). Technology companies like Facebook, as well as taking on the powerful role of media distribution, also create a world where the hyperreal dominates the real and where a facsimile of reality is accepted as real.

Hyperreality is a condition in which we are unable to distinguish reality from fantasy; it has been described by the Italian essayist Umberto Eco as “a place of total passivity” whose users “must agree to behave like its robots” with no “individual initiative” (Miranti, 2017). Instead, users accept the “reconstructed truth” so that they no longer “feel any need for the original” (Mehrotra, 2016). If we decide to hold technology companies accountable for their software code and their decisions, we must take Lewin’s direction and analyse the new gatekeepers’ ‘psychology’. Today, one of the most significant pieces of software code controlling news selection and distribution is the algorithm that curates every individual’s Facebook newsfeed. The new gatekeepers are the employees who build this code and make the editorial decisions at this hybrid media-technology company. The reason why I have decided to use the 2016 US presidential election as my case study is that it is a clear example of how online platforms, rather than acting as sources of accurate information, frequently distribute misinformation in an attempt to drive traffic and social engagement. Moreover, it shows that social network companies like Facebook that distribute media content hold power over discourse and the capability to shape our realities – a phenomenon we have primarily attributed to news producers in the newsroom.

The proliferation of fake news during the election perfectly highlighted this shift in the control of news distribution and the role of Facebook as a commercial service rather than a “neutral technology that merely enables user activity” – a role which is difficult to reconcile with that of journalism as a social good (Poell and Van Dijck, 2014). As my primary source, I have analysed a Buzzfeed News analysis which found that top “fake election news stories” generated more total engagement on Facebook than top election stories from 19 major news outlets combined (Silverman, 2016). Moreover, I have also looked at various academic papers; the most relevant I found were “Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign”, “Media Failures in the Age of Trump” and “Don’t blame the election on fake news. Blame it on the media.”

The reason why I chose these sources is that they all presented diverging views on my case study, which allowed me to construct a more rounded argument. From these sources, I have extrapolated three key ideas: the news media’s extreme commercialism; Facebook’s environment being ripe for the spread of disinformation and networked propaganda; and the crisis of journalism related to the loss of control of distribution by mainstream media outlets. The 2016 elections in the United States and abroad proved the importance of Facebook in distributing and selecting news, and arguably in influencing voting behaviour and political views. Whilst we may never fully know whether the spread of fake news and misinformation across the platform actually affected political views and outcomes, it without a doubt raised concerns over news distribution becoming subject to the ‘algorithm logic’ of social media technologies rather than to the ‘editorial logic’ of professional news media (Gillespie, 2012).

A week after the widely unexpected election of Donald Trump in the United States, Buzzfeed media editor Craig Silverman published an analysis exposing how misleading news had spread across social media during the election campaign, primarily on Facebook. During the 10 months leading up to the election, “the top 20 fake news articles being shared on Facebook skyrocketed from 3 million shares, reactions, and comments to nearly 9 million, while mainstream media articles declined from 12 million shares, reactions, and comments in February to just 7.3 million by Election Day” (Romano, 2016). It was found that Facebook’s newsfeed algorithm not only promoted many false stories, but that nearly 40% of the content published by far-right Facebook pages and 19% of the content published by extreme left-leaning Facebook pages was false or misleading (Silverman, 2016). Moreover, Buzzfeed revealed that one of the biggest sources of fake Facebook stories was the small town of Veles, Macedonia, where a group of young men profited from fabricating stories about the American election. Their motivation was not promoting one of the candidates but financial gain: since the best way to generate shares and traffic on Facebook is “to publish sensationalist and often false content that caters to Trump supporters”, every time a story went viral they would make money on the advertisements that accompanied it (Silverman, 2016). For example, the false story from a “fantasy news website” claiming that Pope Francis had endorsed Donald Trump ended up in Facebook newsfeeds without any context, presented in the same format and in the same digital real estate as an article from The New York Times or CNN (Cheshire, 2017).

The number of likes and shares of this post was on average 19 times higher than for posts from a mainstream news outlet (Silverman, 2016). This shows clearly how good reporting is not currently privileged on many social media platforms, which emphasise spectacle over substantive policy issues (Patterson, 2016). Rather, platforms actively intervene in the distribution of news “through opaque algorithms that structure users’ personal news feed” (Franklin and Eldridge, 2017). In order to optimise the experience and maximise the commercial value of the newsfeed, readers are shown material that will generate more likes and shares, which are the currency of the new advertising market. In this algorithm logic, instead of being empowered, social media users are much rather “the products of algorithmic steering, pushed towards particular content and sold to advertisers” (Poell and Van Dijck, 2014).
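The ‘algorithm logic’ described above can be illustrated with a minimal sketch. The scoring weights, field names and example posts below are purely hypothetical assumptions for illustration – this is not Facebook’s actual ranking code – but the sketch shows the structural point: a feed scored on engagement alone contains no term for accuracy or civic value, so a fabricated high-engagement story outranks a sober, accurate one by construction.

```python
# Toy engagement-ranking sketch. Weights, fields and posts are
# illustrative assumptions, not Facebook's actual algorithm.

def engagement_score(post):
    # Shares weighted most heavily: they push content into new feeds.
    return 1.0 * post["likes"] + 2.0 * post["comments"] + 3.0 * post["shares"]

def rank_feed(posts):
    # Order a feed purely by engagement; truth never enters the function.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Senate passes budget resolution (accurate)",
     "likes": 120, "comments": 30, "shares": 15},
    {"title": "Pope backs Trump (fabricated)",
     "likes": 900, "comments": 400, "shares": 800},
]

for post in rank_feed(posts):
    print(post["title"])
# The fabricated story ranks first: no editorial judgement about
# accuracy was ever applied, only predicted engagement.
```

Whatever the real system’s complexity, the essay’s argument turns on exactly this property: as long as the objective function rewards scale and shareability, “discrimination” against high-civic-value journalism is an output of the optimisation, not an accident.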

Moreover, the director of advertising for the Republican National Committee, Gary Coby, said in an interview with Wired that “on any given day, the Trump campaign was testing 40,000 to 50,000 versions of its ads on Facebook”, calling it “A/B testing on steroids” (Lapowski, 2016). Furthermore, Facebook admitted to congressional investigators that it had sold $100,000 worth of ads to an ambiguous Russian company with a history of pro-Kremlin propaganda. The ads, which started running in the summer of 2015 and continued throughout the election, promoted issues like gun rights, immigration fears and racial strife, and also spread fake news (Kulp, 2017). These advertisements were not regulated by Facebook, which allowed the Trump campaign to address an almost infinite number of market subsets with custom, unregulated messages. This was made possible by mass data collection and the decline of user privacy – a result of the Internet’s business logic (Bell and Owen, 2017). Users now increasingly gravitate to the specific sources and stories that appeal to their opinions, interests and worldview.

This has given media creators the opportunity to thrive, both politically and financially, by distributing and filtering for people the specific news, commentary and advertisements they want to see (Samuel, 2017). There is certainly reasonable doubt that even millions of dollars of Facebook spending could change the outcome of even a single state in the U.S. presidential election, but this shows how, unlike previous methods of disseminating propaganda – in newspapers, on TV and radio – Facebook’s distribution is difficult to regulate or oversee. Facebook has therefore allowed President Trump to directly combat the hugely negative media coverage directed at him, simply by giving his campaign and supporters another platform on which to distribute news, propaganda and advertisements. Facebook, however, instead of taking the role of an open and democratising environment that liberates as it informs, proved to control the distribution of information on its network.

Facebook, its algorithm, editorial practices and business model are ultimately ‘swallowing’ the editorial values journalism is built on by taking the role of gatekeeper in this new media sphere. Even though, since the US presidential election, Facebook has begun new efforts to tackle the distribution of fake news and polarised adverts in its algorithmic news feed (Constine, 2017), it still calls itself a technology company and not a media company (Bell, 2017). There are many reasons behind this resistance to a change of definition. The most relevant is that, by doing so, Facebook avoids legal responsibility for the content it distributes. According to the company, its algorithms are proof that it is not a publisher but a distributor of other people’s information. However, behind the algorithms there are creators, and they are therefore responsible for the information they spread. In order to hide their subjectivity in a story, reporters use quotation marks as a ‘signalling practice’; moreover, journalists falsely distance themselves from their sources to protect themselves from the risks of their trade (Mehrotra, 2016).

In the same way that the press has only a “limited repertoire with which to define and defend their objectivity”, computer software engineers cannot hide under the veil of algorithmic objectivity to deflect the subjectivities in their products (Mehrotra, 2016). More than 40% of American adults get news on Facebook (Pew Research Centre, 2016), and this channel of connection is clearly the product of powerful actors. The Internet has the potential to advance society, but if the public cares about quality journalism and stamping out fake news, regardless of political affiliation, it must demand that these tech companies take responsibility for their distribution of content.