The Conversation
Academic rigour, journalistic flair
How Facebook could really fix itself
February 1, 2018 1.38pm SAST
Under fire: Facebook founder and CEO Mark Zuckerberg. AP Photo/Jeff Roberson
Author
Bhaskar Chakravorti
Senior Associate Dean, International Business & Finance, Tufts University
Disclosure statement
Bhaskar Chakravorti directs the Institute for Business in the Global Context at Tufts Fletcher School. The Institute has received funding from Mastercard, Microsoft and the Gates Foundation.
Partners
Tufts University
Tufts University provides funding as a founding partner of The Conversation US.
The Conversation is funded by Barclays Africa and seven universities, including the Cape Peninsula University of Technology, Rhodes University and the Universities of Cape Town, Johannesburg, Kwa-Zulu Natal, Pretoria, and South Africa. It is hosted by the Universities of the Witwatersrand and Western Cape, the African Population and Health Research Centre and the Nigerian Academy of Science. The Bill & Melinda Gates Foundation is a Strategic Partner.
Facebook has a world of problems. Beyond charges of Russian manipulation and promoting fake news, the company’s signature social media platform is under fire for being addictive, causing anxiety and depression, and even instigating human rights abuses.
Company founder and CEO Mark Zuckerberg says he wants to win back users’ trust. But his company’s efforts so far have ignored the root causes of the problems they intend to fix, and even risk making matters worse. Specifically, they ignore the fact that personal interaction isn’t always meaningful or benign, leave out the needs of users in the developing world, and seem to compete with the company’s own business model.
Based on The Digital Planet, a multi-year global study of how digital technologies spread and how much people trust them, which I lead at Tufts University’s Fletcher School, I have some ideas about how to fix Facebook’s efforts to fix itself.
Face-saving changes?
Like many technology companies, Facebook must balance the convergence of digital dependence, digital dominance and digital distrust. Over 2 billion people worldwide check Facebook each month; 45 percent of American adults get their news from Facebook. Together with Google, it captures half of all digital advertising revenues worldwide. Yet more people say they greatly distrust Facebook than any other member of the big five – Amazon, Apple, Google or Microsoft.
In March 2017 Facebook started taking responsibility for quality control as a way to restore users’ trust. The company hired fact-checkers to verify information in posts. Two months later the company changed its algorithms to help users find diverse viewpoints on current issues and events. And in October 2017, it imposed new transparency requirements to force advertisers to identify themselves clearly.
But Zuckerberg led off 2018 in a different direction, committing to “working to fix our issues together.” That last word, “together,” suggests an inclusive approach, but in my view, it really says the company is shifting the burden back onto its users.
The company began by overhauling its crucial News Feed feature, giving less priority to third-party publishers, whether traditional media outlets like The New York Times and The Washington Post or newer online publications such as Buzzfeed and Vox. That will leave more room for posts from family and friends, which Zuckerberg has called “meaningful social interactions.”
However, Facebook will rely on users to rate how trustworthy groups, organizations and media outlets are. Those ratings will determine which third-party publishers do make it to users’ screens, if at all. Leaving trustworthiness ratings to users without addressing online political polarization risks making civic discourse even more divided and extreme.
Personal isn’t always ‘meaningful’
Unlike real-life interactions, online exchanges can exacerbate both passive and narcissistic tendencies. It’s easier to be invisible online, so people who want to avoid attention can do so without facing peer pressure to participate. By contrast, people who are active online can see their friends like, share and comment on their posts, motivating them to seek even more attention.
This creates two groups of online users, broadly speaking: disengaged observers and those who are competing for attention with ever more extreme efforts to catch users’ eyes. This environment has helped outrageous, untrue claims with clickbait headlines attract enormous amounts of attention.
This phenomenon is further complicated by two other elements of social interaction online. First, news of any kind – including fake news – gains credibility when it is forwarded by a personal connection.
And social media tends to group like-minded people together, creating an echo chamber effect that reinforces messages the group agrees with and resists outside views – including more accurate information and independent perspectives. It’s no coincidence that conservatives and liberals trust very different news sources.
Users of Facebook’s instant-messaging subsidiary WhatsApp have shown that even a technology focusing on individual connection isn’t always healthy or productive. WhatsApp has been identified as a primary carrier of fake news and divisive rumors in India, where its users’ messages have been described as a “mix of off-color jokes, doctored TV [clips], wild rumors and other people’s opinions, mostly vile.” Kenya has identified 21 hate-mongering WhatsApp groups. WhatsApp users in the U.K. have had to stay alert for scams in their personal messages.
Addressing the developing world
Facebook’s actions appear to be responding to public pressure from the U.S. and Europe. But Facebook is experiencing its fastest growth in Asia and Africa.
Research I have conducted with colleagues has found that users in the developing world are more trusting of online material, and therefore more vulnerable to manipulation by false information. In Myanmar, for instance, Facebook is the dominant internet site because of its Free Basics program, which lets mobile-phone users connect to a few selected internet sites, including Facebook, without paying extra or using up allotted data in their mobile plans. In 2014, Facebook had 2 million users in Myanmar; after Free Basics arrived in 2016, that number climbed to 30 million.
One of the effects has been devastating. Rumor campaigns against the Rohingya ethnic group in Myanmar were, in part, spread on Facebook, sparking violence. At least 6,700 Rohingya Muslims were killed by Myanmar’s security forces between August and September 2017; 630,000 more have fled the country. Facebook did not stop the rumors, and at one point actually shut down responding posts from a Rohingya activist group.
Facebook’s Free Basics program is in 63 developing countries and municipalities, each filled with people new to the digital economy and potentially vulnerable to manipulation.
Fighting against the business model
Facebook’s efforts to promote what might be called “corporate digital responsibility” run counter to the company’s business model. Zuckerberg himself declared that the upcoming changes would cause people to spend less time on Facebook.
But the company makes 98 percent of its revenues from advertising. That is only possible if users keep their attention focused on the platform, so the company can analyze their usage data to generate more targeted advertising.
Our research finds that companies working toward corporate social responsibility will only succeed if their efforts align with their core business models. Otherwise, the responsibility project will become unsustainable in the face of pressure from the stock market, competitors or government regulators, as happened to Facebook with European privacy rules.
Real solutions
What can Facebook do instead? I recommend the following to fix Facebook’s fix:
Own the reality of Facebook’s enormous role in society. It’s a primary source of news and communication that influences the beliefs and assumptions driving citizen behavior around the world. The company cannot rely on users to police the system. As a media company, Facebook needs to take responsibility for the content it publishes and republishes. It can combine both human and artificial intelligence to sort through the content, labeling news, opinions, hearsay, research and other types of information in ways ordinary users can understand.
Establish on-the-ground operations in every location where it has large numbers of users, to ensure the company understands local contexts. Rather than a virtual global entity operating from Silicon Valley, Facebook should engage with the nuances and complexities of cities, regions and countries, using local languages to customize content for users. Right now, Facebook passively publishes educational materials on digital safety and community standards, which are easily ignored. As Facebook adds users in developing nations, the company must pay close attention to the unintended consequences of explosive growth in connectivity.
Reduce the company’s dependence on advertising revenue. As long as Facebook is almost entirely dependent on ad sales, it will be forced to hold users’ attention as long as possible and gather their data to analyze for future ad opportunities. Its strategy for expansion should go beyond building and buying other apps, like WhatsApp, Instagram and Messenger, all of which still feed the core business model of monopolizing and data-mining users’ attention. Taking inspiration from Amazon and Netflix – and even Google parent company Alphabet – Facebook could use its huge trove of user data responsibly to identify, design and deliver new services that people would pay for.
Ultimately, Zuckerberg and Facebook’s leaders have created an enormously powerful, compelling and potentially addictive service. This unprecedented opportunity has developed at an unprecedented pace. Growth may be the easy part; being the responsible grown-up is much harder.