
'Facebook has swallowed journalism'. Critically evaluate the argument that news
organizations have lost control of their distribution.

Before the Internet, the distribution of news was far more expensive than it is now, and it was much clearer what constituted news and media, making regulation or self-regulation easier. With the rise of social media, and with the commercialisation and digitisation of the media industry, journalism has been reshaped in tangible ways. Online platforms such as Facebook, Twitter and Snapchat have created new means through which citizens can produce, share and gain access to information (Watson and Wadhwa, 2014). The growth of these platforms means that individuals are no longer merely consumers of information; they actively participate in its production, distribution, curation and verification (Harlow and Harp, 2013). Even though many studies stress the democratising effects of supposedly increased user control over the news process, this view arguably underestimates the rising power of the news distributor in the media landscape. The locus of power in the delivery and distribution of news, once controlled by media outlets, has shifted not towards the user but towards platforms whose priorities often compete with those of journalism. The fundamental essence of journalism has not changed; it is still about reporting on the world's events and adding context to explain them, but it is now entangled with a system built for scale, speed and revenue dictated by social media platforms (Bell and Owen, 2017). In this essay, I will argue that the loss of control over distribution by media outlets has 'swallowed' journalism, and that the news distributors of today, the social media platforms, are the new gatekeepers of the media sphere. I will examine the proliferation of fake news on Facebook during the 2016 U.S. presidential election and evaluate how the platform exacerbated problems related to the distribution of misinformation. My stance will be that in Facebook's algorithm all information, including journalism, is reduced and distributed according to a set of unregulated, hidden, rapidly iterating and individualised rules (Bell and Owen, 2017).


Hoaxes and misleading news have been associated with the Internet since its origin, but it is only recently that systematised misinformation campaigns have become apparent, and their effect on democracy and society carefully examined. The 2016 US presidential election provided a rich breeding ground for fake news. Headlines such as "Pope backs Trump", "Hillary sold weapons to ISIS" and "FBI Agent Suspected in Hillary Email Leaks Found Dead" went viral on Facebook during the election campaign, reaching millions of people and attracting thousands of shares (Ticomb and Carson, 2017). A BuzzFeed News investigation found that in the three months before the end of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news organisations such as the New York Times, the Washington Post, the Huffington Post and NBC News (Silverman, 2016). Mainstream news organisations no longer control society's media and information gates, and they are therefore largely powerless to stop the virality of fake news and the fracturing of news audiences on the Internet (Madison and Dejarnette, 2017). This suggests that the structure and economics of social platforms favour the distribution of low-quality content over high-quality material: "Journalism with high civic value – journalism that investigates power or reaches underserved and local communities – is discriminated against by a system that favours scale and shareability" (Bell and Owen, 2017). Different forms of propaganda and misinformation have circulated through media platforms for centuries, but the profound media power residing in one monopolistic platform arguably represents a unique threat. Even though Facebook has recently come under increased public pressure to be held accountable for its algorithm and the spread of misinformation, the main problem is often overlooked: the proliferation of fake news is caused by an unregulated news monopoly driven solely by profit imperatives (Pickard, 2016).

 

 

The earliest intellectual definition of gatekeeping is traced to Kurt Lewin, who defined the concept as the study of people in "key positions" along "social channels" where "desire and the resistance to change are expressed" (Silviu, 2015). To apply gatekeeping theory, Lewin determined that "one must identify those in control of the entrance and exit of the channel and must study the psychology of the gatekeepers, their system of values and beliefs that causes the decisions related to the traffic through each channel" (Mehrotra, 2016). The basic assumptions of this theory, applied to journalism, are that there are countless events in the world and that the media cannot report on all of them. Therefore, it can be argued that the true source of power in the media industry is not news production but control over news selection and distribution (Silviu, 2015). Throughout history, the gatekeepers of news selection and distribution have changed: from the wire editor to the press associations to the mainstream media outlets. With the rise of the internet and social media platforms, however, the entire notion of a gate, and the idea of news organisations limiting what passes through it, is arguably no longer relevant in today's society, and many theorists have mistakenly championed the idea that gatekeeping is now located with the end user (Bruns, 2003). For example, "on Facebook people decide about what information to add, withhold and disregard, and how to shape, localise and manipulate the information they channel through their profile" (Delulis, 2015). It is therefore assumed that users are the new gatekeepers who, in navigating the web, take on some of the role of the traditional gatekeeper-journalist: they pass through the gates in the search for information and in the evaluation of what is found. On this view, the spread of production capabilities produces uncontrolled, anti-hierarchical communication (Mehrotra, 2016). However, the assumption that the Internet is a tool of freedom and that social media platforms are networks where the audience holds the power is arguably an illusion. The new realm of communication is in fact being defined and confined by market logic (Poell and Van Dijck, 2013). In his paper, Bruns ignores what controls Facebook's distribution: the algorithms that disseminate the information users produce. News consumption through social media actually limits users' ability to find accurate information through all the noise. Users are driven by the platform's architecture to use humour, to post unique or socially relevant content and to become more revelatory in order to gain the attention of their network, and, treating it as a site for general expression of identity, to share a curated version of themselves (Marichal, 2016). This means that the information is not accurate and that consumers are not active, as Bruns claims; instead they passively consume news and allow the platform to control how reality is presented to them (Mehrotra, 2016). Technology companies like Facebook, as well as taking on the powerful role of media distributor, also create a world where the hyperreal dominates the real and a facsimile of reality is accepted as real. Hyperreality is a condition in which we are unable to distinguish reality from fantasy; the Italian essayist Umberto Eco defined it as "a place of total passivity" whose users "must agree to behave like its robots" with no "individual initiative" (Miranti, 2017). Instead, users accept the "reconstructed truth" so that they no longer "feel any need for the original" (Mehrotra, 2016). If we decide to hold technology companies accountable for their software code and their decisions, we must take Lewin's direction and analyse the new gatekeepers' 'psychology'. Today, one of the most significant pieces of software code controlling news selection and distribution is the algorithm that curates every individual's Facebook newsfeed. The new gatekeepers are the employees who build this code and make the editorial decisions at this hybrid media-technology company.

 

 

I have chosen the 2016 US presidential election as my case study because it is a clear example of how online platforms, rather than acting as sources of accurate information, frequently distribute misinformation in an attempt to drive traffic and social engagement. Moreover, it shows that social network companies like Facebook that distribute media content hold power over discourse and the capability to shape our realities – a phenomenon we have traditionally attributed to news producers in the newsroom. The proliferation of fake news during the election highlighted this shift in the control of news distribution and the role of Facebook as a commercial service rather than a "neutral technology that merely enables user activity"; a role which is difficult to reconcile with that of journalism as a social good (Poell and Van Dijck, 2014). As my primary source, I have analysed a BuzzFeed News analysis which found that the top "fake election news stories" generated more total engagement on Facebook than the top election stories from 19 major news outlets combined (Silverman, 2016). I have also consulted various academic papers, the most relevant being "Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign", "Media Failures in the Age of Trump" and "Don't blame the election on fake news. Blame it on the media." I chose these sources because they present diverging views on my case study, which allowed me to construct a more rounded argument. From them, I have extrapolated three key ideas: the news media's extreme commercialism; Facebook's environment being ripe for the spread of disinformation and networked propaganda; and the crisis of journalism related to mainstream media outlets' loss of control over distribution.

The 2016 elections in the United States and abroad proved the importance of Facebook in distributing and selecting news, and arguably in influencing voting behaviour and political views. Whilst we may never fully know whether the spread of fake news and misinformation across the platform actually affected political views and outcomes, it has without a doubt raised concerns over news distribution becoming subject to the 'algorithmic logic' of social media technologies rather than to the 'editorial logic' of professional news media (Gillespie, 2012).

A week after the largely unexpected election of Donald Trump in the United States, BuzzFeed media editor Craig Silverman published an analysis exposing how misleading news had spread across social media during the election campaign, primarily on Facebook. During the ten months leading up to the election, "the top 20 fake news articles being shared on Facebook skyrocketed from 3 million shares, reactions, and comments to nearly 9 million, while mainstream media articles declined from 12 million shares, reactions, and comments in February to just 7.3 million by Election Day" (Romano, 2016). It was also found that Facebook's newsfeed algorithm not only promoted numerous false stories, but that nearly 40% of the content published by far-right Facebook pages and 19% of the content published by extreme left-leaning Facebook pages was false or misleading (Silverman, 2016).

 

Moreover, BuzzFeed revealed that one of the biggest sources of fake Facebook stories was the small town of Veles, Macedonia, where a group of young men profited from fabricating stories about the American election. Their motive was not to promote one of the candidates but financial gain: the best way to generate shares and traffic on Facebook was "to publish sensationalist and often false content that caters to Trump supporters", and every time a story went viral they would make money on the advertisements that accompanied it (Silverman, 2016).

For example, the false story from a "fantasy news website" claiming that Pope Francis had endorsed Donald Trump ended up in Facebook newsfeeds without any context, presented in the same format and in the same digital real estate as an article from The New York Times or CNN (Cheshire, 2017). The number of likes and shares of this post was on average 19 times higher than for posts from a mainstream news outlet (Silverman, 2016).

This shows how good reporting is not currently privileged on many social media platforms, which emphasise spectacle over substantive policy issues (Patterson, 2016). Rather, they actively intervene in the distribution of news "through opaque algorithms that structure users' personal news feed" (Franklin and Eldridge, 2017). In order to optimise the experience and maximise the commercial value of the newsfeed, readers are shown material that will generate more likes and shares, which are the currency of the new advertising market. In this algorithmic logic, instead of being empowered, social media users are rather "the products of algorithmic steering, pushed towards particular content and sold to advertisers" (Poell and Van Dijck, 2014).

Moreover, the director of advertising for the Republican National Committee, Gary Coby, said in an interview with Wired that "on any given day, the Trump campaign was testing 40,000 to 50,000 versions of its ads on Facebook", calling it "A/B testing on steroids" (Lapowski, 2016). Furthermore, Facebook admitted to congressional investigators that it had sold $100,000 worth of ads to an obscure Russian company with a history of pro-Kremlin propaganda. The ads, which started running in the summer of 2015 and continued throughout the election, promoted issues like gun rights, immigration fears and racial strife, and also spread fake news (Kulp, 2017). These advertisements were not regulated by Facebook, which allowed the Trump campaign to address a near-infinite number of market subsets with custom, unregulated messages. This was made possible by mass data collection and the decline of user privacy, itself a result of the Internet's business logic (Bell and Owen, 2017). Users now increasingly gravitate to the specific sources and stories that appeal to their opinions, interests and worldview. This has given media creators the opportunity to thrive, both politically and financially, by distributing and filtering for people the specific news, commentary and advertisements they want to see (Samuel, 2017). There is reasonable doubt that even millions of dollars of Facebook spending could change the outcome of even a single state in the U.S. presidential election, but this shows how, unlike previous methods of disseminating propaganda in newspapers, on TV and on radio, Facebook's distribution is difficult to regulate or police. Facebook has therefore allowed President Trump to directly combat the hugely negative media coverage directed at him, simply by giving his campaign and supporters another platform on which to distribute news, propaganda and advertisements. Instead of acting as an open and democratising environment that liberates as it informs, Facebook has proved to control the distribution of information on its network. Facebook's algorithm, editorial practices and business model are ultimately 'swallowing' the editorial values journalism is built on by taking the role of gatekeeper in this new media sphere.

 

 

Even though, since the US presidential election, Facebook has made new efforts to remove fake news and polarising adverts from its algorithmic news feed (Constine, 2017), it still calls itself a technology company rather than a media company (Bell, 2017). There are many reasons behind this resistance to redefinition, the most relevant being that it allows Facebook to avoid legal responsibility for the content it distributes. According to the company, its algorithms are proof that it is not a publisher but a distributor of other people's information. However, behind the algorithms there are creators, and they are therefore responsible for the information they spread. To hide their subjectivity in a story, reporters use quotation marks as a 'signalling practice'; moreover, journalists falsely distance themselves from their sources to protect themselves from the risks of their trade (Mehrotra, 2016). Yet just as the press has only a "limited repertoire with which to define and defend their objectivity", computer software engineers "cannot hide under the veil of algorithmic objectivity to deflect subjectivities in their products" (Mehrotra, 2016). More than 40% of American adults get news on Facebook (Pew Research Centre, 2016), and this channel of connection is clearly the product of powerful actors. The Internet has the potential to advance society, but if the public cares about quality journalism and about stamping out fake news, regardless of political affiliation, it must demand that these tech companies take responsibility for their distribution of content.