Australian Times News

Facebook’s problem is more complicated than fake news

OPINION: Do Facebook’s filtering algorithms explain why so many liberals had misplaced confidence in a Clinton victory? And is the fake news being circulated on Facebook the reason that so many Trump supporters have endorsed demonstrably false statements made by their candidate?

By R. Kelly Garrett
21 November 2016, in News

In the wake of Donald Trump’s unexpected victory, many questions have been raised about Facebook’s role in the promotion of inaccurate and highly partisan information during the presidential race and whether this fake news influenced the election’s outcome.

A few have downplayed Facebook’s impact, including CEO Mark Zuckerberg, who said that it is “extremely unlikely” that fake news could have swayed the election. But questions about the social network’s political significance merit more than passing attention.

Do Facebook’s filtering algorithms explain why so many liberals had misplaced confidence in a Clinton victory (echoing the error made by Romney supporters in 2012)? And is the fake news being circulated on Facebook the reason that so many Trump supporters have endorsed demonstrably false statements made by their candidate?

The popular claim that “filter bubbles” are why fake news thrives on Facebook is almost certainly wrong. If the network is encouraging people to believe untruths – and that’s a big if – the problem more likely lies in how the platform interacts with basic human social tendencies. That’s far more difficult to change.

A misinformed public

Facebook’s role in the dissemination of political news is undeniable. In May 2016, 44 percent of Americans said they got news from the social media site. And the prevalence of misinformation disseminated through Facebook is undeniable.

It’s plausible, then, that the amount of fake news on a platform where so many people get their news can help explain why so many Americans are misinformed about politics.

But it’s hard to say how likely this is. I began studying the internet’s role in promoting false beliefs during the 2008 election, turning my attention to social media in 2012. In ongoing research, I’ve found little consistent evidence that social media use promoted acceptance of false claims about the candidates, despite the prevalence of many untruths. Instead, it appears that in 2012, as in 2008, email continued to be a uniquely powerful conduit for lies and conspiracy theories. Social media had no reliably detectable effect on people’s beliefs.


For a moment, however, let’s suppose that 2016 was different from 2012 and 2008. (The election was certainly unique in many other regards.)

If Facebook is promoting a platform in which citizens are less able to discern truth from fiction, it would constitute a serious threat to American democracy. But naming the problem isn’t enough. To fight the flow of misinformation through social media, it’s important to understand why it happens.

Don’t blame filter bubbles

Facebook wants its users to be engaged, not overwhelmed, so it employs proprietary software that filters users’ news feeds and chooses the content that will appear. The risk lies in how this tailoring is done.

There’s ample evidence that people are drawn to news that affirms their political viewpoint. Facebook’s software learns from users’ past actions; it tries to guess which stories they are likely to click or share in the future. Taken to its extreme, this produces a filter bubble, in which users are exposed only to content that reaffirms their biases. The risk, then, is that filter bubbles promote misperceptions by hiding the truth.

The appeal of this explanation is obvious. It’s easy to understand, so maybe it’ll be easy to fix. Get rid of personalized news feeds, and filter bubbles are no more.

The problem with the filter bubble metaphor is that it assumes people are perfectly insulated from other perspectives. In fact, numerous studies have shown that individuals’ media diets almost always include information and sources that challenge their political attitudes. And a study of Facebook user data found that encounters with cross-cutting information are widespread. In other words, holding false beliefs is unlikely to be explained by people’s lack of contact with more accurate news.

Instead, people’s preexisting political identities profoundly shape their beliefs. So even when faced with the same information, whether it’s a news article or a fact check, people with different political orientations often extract dramatically different meaning.

A thought experiment may help: If you were a Clinton supporter, were you aware that the highly respected prediction site FiveThirtyEight gave Clinton only a 71 percent chance of winning? Those odds are better than a coin flip, but far from a sure thing. I suspect that many Democrats were shocked despite seeing this uncomfortable evidence. Indeed, many had been critical of this projection in the days before the election.

If you voted for Trump, have you ever encountered evidence disputing Trump’s assertion that voter fraud is commonplace in the U.S.? Fact checkers and news organizations have covered this issue extensively, offering robust evidence that the claim is untrue. However, a Trump supporter might be unmoved: in a September 2016 poll, 90 percent of Trump supporters said they didn’t trust fact checkers.

Facebook = angry partisans?

If isolation from the truth really is the main source of inaccurate information, the solution would be obvious: Make the truth more visible.

Unfortunately, the answer isn’t that simple. Which brings us back to the question of Facebook: Are there other aspects of the service that might distort users’ beliefs?

It will be some time before researchers can answer this question confidently, but as someone who has studied the various ways that other internet technologies can lead people to believe false information, I’m prepared to offer a few educated guesses.

We already know two things about Facebook that could encourage the spread of false information.

First, emotions are contagious, and they can spread on Facebook. One large-scale study has shown that small changes in Facebook users’ news feeds can shape the emotions they express in later posts. In that study, the emotional changes were small, but so were the changes in the news feed that caused them. Just imagine how Facebook users respond to widespread accusations of candidates’ corruption, criminal activity and lies. It isn’t surprising that nearly half (49 percent) of all users described political discussion on social media as “angry.”

When it comes to politics, anger is a powerful emotion. It’s been shown to make people more willing to accept partisan falsehoods and more likely to post and share political information, presumably including fake news articles that reinforce their beliefs. If Facebook use makes partisans angry while also exposing them to partisan falsehoods, ensuring the presence of accurate information may not matter much. Republican or Democrat, angry people put their trust in information that makes their side look good.

Second, Facebook seems to reinforce people’s political identity – furthering an already large partisan divide. While Facebook doesn’t shield people from information they disagree with, it certainly makes it easier to find like-minded others. Our social networks tend to include many people who share our values and beliefs. And this may be another way that Facebook is reinforcing politically motivated falsehoods. Beliefs often serve a social function, helping people to define who they are and how they fit in the world. The easier it is for people to see themselves in political terms, the more attached they are to the beliefs that affirm that identity.

These two factors – the way that anger can spread over Facebook’s social networks, and how those networks can make individuals’ political identity more central to who they are – likely explain Facebook users’ inaccurate beliefs more effectively than the so-called filter bubble.

If this is true, then we have a serious challenge ahead of us. Facebook will likely be convinced to change its filtering algorithm to prioritize more accurate information. Google has already undertaken a similar endeavor. And recent reports suggest that Facebook may be taking the problem more seriously than Zuckerberg’s comments suggest.

But this does nothing to address the underlying forces that propagate and reinforce false information: emotions and the people in your social networks. Nor is it obvious that these characteristics of Facebook can or should be “corrected.” A social network devoid of emotion seems like a contradiction, and policing who individuals interact with is not something that our society should embrace.

It may be that Facebook shares some of the blame for the lies that circulated this election year – and that they altered the course of the election.

If true, the challenge will be to figure out what we can do about it.

_____________________

By R. Kelly Garrett, Associate Professor of Communication, The Ohio State University

This article was originally published on The Conversation. Read the original article.

Tags: Facebook, media, opinion, politics, social media, technology



Copyright © Blue Sky Publications Ltd. All Rights Reserved.
australiantimes.co.uk is a division of Blue Sky Publications Ltd. Reproduction without permission prohibited.