Explainer: how Facebook has become the world’s largest echo chamber
February 5, 2018 4.21pm SAST
Author

    Megan Knight

    Associate Dean, University of Hertfordshire

Disclosure statement

Megan Knight does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Partners

University of Hertfordshire

University of Hertfordshire provides funding as a member of The Conversation UK.

The Conversation is funded by Barclays Africa and seven universities, including the Cape Peninsula University of Technology, Rhodes University and the Universities of Cape Town, Johannesburg, Kwa-Zulu Natal, Pretoria, and South Africa. It is hosted by the Universities of the Witwatersrand and Western Cape, the African Population and Health Research Centre and the Nigerian Academy of Science. The Bill & Melinda Gates Foundation is a Strategic Partner.
[Image caption: Is there an echo here? Reuters/Benoit Tessier]


I began my research career in the last century with an analysis of how news organisations were adapting to this strange new thing called “the Internet”. Five years later I signed up for Twitter and, a year after that, for Facebook.

Now, as it celebrates its 14th birthday, Facebook is becoming ubiquitous, and its usage and impact are central to my (and many others’) research.

In 2017 the social network had 2 billion members, by its own count. Facebook’s relationship with news content is an important part of this ubiquity. Since 2008 the company has courted news organisations with features like “Connect”, “Share” and “Instant Articles”. As of 2017, 48% of Americans rely primarily on Facebook for news and current affairs information.

Social networks present news content in a way that’s integrated into the flow of personal and other communication. Media scholar Alfred Hermida calls this “ambient news”. It’s a trend that has been considered promising for the development of civil society. Social media – like the Internet before it – has been hailed as the new “public sphere”: a place for civic discourse and political engagement among the citizenry.

But, unlike the Internet, Facebook is not a public space in which all content is equal. It is a private company. It controls what content you see, according to algorithms and commercial interests. The new public sphere is, in fact, privately owned, and this has far-reaching implications for civil society worldwide.

When a single company acts as the broker of news and current affairs content for a majority of the population, the potential for abuse is enormous. Facebook is not seen as a “news organisation”, so it falls outside whatever regulations countries apply to “the news”. And its content is provided by myriad third parties, often with little oversight or tracking by countries’ authorities. So civil society’s ability to address concerns about Facebook’s content becomes even more constrained.
Getting to know all about you

Facebook’s primary goal is to sell advertising. It does so by knowing as much as possible about its users, then selling that information to advertisers. The provision of content to entice consumers to look at advertising is not new: it’s the entire basis of the commercial media.

But where newspapers can only target broad demographic groups based on language, location and, to an extent, education level and income, Facebook can narrow its target market down to the individual level. How? Based on demographics – and everything your “likes”, posts and comments have told it.

This ability to fine-tune content to subsets of the audience is not limited to advertising. Everything on your Facebook feed is curated and presented to you by an algorithm seeking to maximise your engagement by only showing you things that it thinks you will like and respond to. The more you engage and respond, the better the algorithm gets at predicting what you will like.
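Facebook’s actual ranking system is proprietary, but the engagement-driven curation described above can be sketched as a toy model – every name, weight and number below is invented purely for illustration:

```python
# Purely illustrative sketch: rank candidate posts by a learned per-topic
# "affinity" score, then nudge that score up or down depending on whether
# the user engages with or ignores a post.

def rank_feed(posts, affinity):
    """Order posts by the user's learned affinity for each topic."""
    return sorted(posts, key=lambda p: affinity.get(p["topic"], 0.0), reverse=True)

def record_engagement(affinity, post, engaged, rate=0.2):
    """Move the topic score toward 1 on engagement, toward 0 when ignored."""
    score = affinity.get(post["topic"], 0.0)
    target = 1.0 if engaged else 0.0
    affinity[post["topic"]] = score + rate * (target - score)

# Two topics start out equally likely to appear.
affinity = {"politics": 0.5, "sport": 0.5}
posts = [{"topic": "sport"}, {"topic": "politics"}]

# The user engages with a politics post and ignores a sport post...
record_engagement(affinity, {"topic": "politics"}, engaged=True)
record_engagement(affinity, {"topic": "sport"}, engaged=False)

# ...so politics now ranks first in the feed.
print(rank_feed(posts, affinity)[0]["topic"])  # politics
```

The key point of the sketch is the loop: what you are shown depends on the scores, and the scores depend on what you were shown and did.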

When it comes to news content and discussion of the news, this means you will increasingly only see material that’s in line with your stated interests. More and more, too, news items, advertisements and posts by friends are blurred in the interface. This all merges into a single stream of information.

And because of the way your network is structured, the nature of that information becomes ever more narrow. It is inherent in the ideals of democracy that people be exposed to a plurality of ideas; that the public sphere should be open to all. The loss of this plurality creates a society made up of extremes, with little hope for consensus or bridging of ideas.
An echo chamber

Most people’s “friends” on Facebook tend to be people with whom they have some real-life connection – actual friends, classmates, neighbours and family members. Functionally, this means that most of your network will consist largely of people who share your broad demographic profile: education level, income, location, ethnic and cultural background and age.

The algorithm knows who in this network you are most likely to engage with, which further narrows the field to people whose worldview aligns with your own. You may be Facebook friends with your Uncle Fred, whose political outbursts threaten the tranquillity of every family get-together. But if you ignore his conspiracy-themed posts and don’t engage, they will start to disappear from your feed.

Over time this means that your feed gets narrower and narrower. It shows less and less content that you might disagree with or find distasteful.
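A toy simulation (with invented topics, rates and thresholds) of the engage-or-ignore feedback loop described above – ignored sources decay each round until they drop out of the feed entirely:

```python
# Illustrative only: one user who always engages with their own view and
# always ignores Uncle Fred. Scores for ignored topics decay each round,
# and topics below a visibility threshold stop appearing in the feed.

THRESHOLD = 0.1

def step(affinity, engaged_topic, rate=0.3):
    """One round: the engaged topic moves toward 1, all others toward 0."""
    for topic, score in affinity.items():
        target = 1.0 if topic == engaged_topic else 0.0
        affinity[topic] = score + rate * (target - score)

def visible_feed(affinity):
    """Only topics above the threshold are shown at all."""
    return [t for t, s in affinity.items() if s >= THRESHOLD]

affinity = {"my_view": 0.5, "uncle_fred": 0.5}
for _ in range(8):
    step(affinity, engaged_topic="my_view")

# After a handful of rounds the ignored topic has decayed out of the feed.
print(visible_feed(affinity))  # ['my_view']
```

No single round removes anything; the narrowing is the cumulative effect of many small, individually reasonable adjustments.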

These two responses – engaging and ignoring – are both driven by the invisible hand of the algorithm, and together they have created an echo chamber. This isn’t dissimilar to what news organisations have been trying to do for some time: gatekeeping is the expression of journalists’ ideas of what the audience wants to read.

Traditional journalists had to rely on their instinct for what people would be interested in. Technology now makes it possible to know exactly what people read, responded to, or shared.

For Facebook, this process is now run by a computer: an algorithm which reacts instantly to provide the content it thinks you want. But this fine-tuned and carefully managed algorithm is open to manipulation, especially by political and social interests.
Extreme views confirmed

In the last few years Facebook users have unwittingly become part of a massive social experiment – one which may have contributed to surprise outcomes like the election of Donald Trump as president of the US and the UK’s vote to leave the European Union. We can’t be sure of this, since Facebook’s content algorithm is secret and most of its content is shown only to specific users.

It’s physically impossible for a researcher to see all of the content distributed on Facebook; the company explicitly prevents that kind of access. Researchers and journalists need to construct model accounts (fake ones, violating Facebook’s terms of use) and attempt to trick the algorithm into showing what the social network’s most extreme political users see.

What they’ve found is that the more extreme the views a user had already agreed with, the more extreme the content they were shown. People who liked or expressed support for leaving the EU were shown content that reflected this desire, but in a more extreme form.

If they liked that, they’d be shown even more, and so on – the group getting smaller and smaller, and more and more insular. This is similar to how extremist groups identify and court potential members, enticing them with more and more radical ideas and watching their reactions. That sort of personal interaction was a slow process; Facebook’s algorithm now works at lightning speed, and the pace of radicalisation is exponentially increased.

Copyright © 2010–2018, The Conversation Africa, Inc.