4 Ways To Protect Yourself From Disinformation

You might have fallen for someone’s attempt to disinform you about current events. But it’s not your fault.

Even the most well-intentioned news consumers can find today’s avalanche of political information difficult to navigate. With so much news available, many people consume media in an automatic, unconscious state – similar to knowing you drove home but not being able to recall the trip.

And that makes you more susceptible to accepting false claims.

But, as the 2020 elections near, you can develop habits to exert more conscious control over your news intake. I teach these strategies to students in a course on media literacy, helping people become more savvy news consumers in four simple steps.

1. Seek out your own political news

Like most people, you probably get a fair amount of your news from apps, sites and social media such as Twitter, Facebook, Reddit, Apple News and Google. You should change that.

These are technology companies – not news outlets. Their goal is to maximize the time you spend on their sites and apps, generating advertising revenue. To that end, their algorithms use your browsing history to show you news you’ll agree with and like, keeping you engaged for as long as possible.

That means instead of presenting you with the most important news of the day, these platforms feed you whatever they think will hold your attention – often politically biased information, outright falsehoods or material you have already seen.

Instead, regularly visit trusted news apps and news websites directly. These organizations actually produce news, usually in the spirit of serving the public interest. There, you’ll see a more complete range of political information, not just content that’s been curated for you.

If there are numbers, check the math yourself.

2. Use basic math

Untrustworthy news outlets and political campaigns often use statistics to make bogus claims – rightly assuming most readers won’t take the time to fact-check them.

Simple mathematical calculations, which scholars call Fermi estimates or rough guesstimates, can help you better spot falsified data.

For instance, a widely circulated meme falsely claimed 10,150 Americans were “killed by illegal immigrants” in 2018. On the surface, it’s hard to know how to verify or debunk that, but one way to start is to think about finding out how many total murders there were in the U.S. in 2018.

Murder statistics can be found in, among other places, the FBI’s violent crime statistics. The FBI estimates that in 2018 there were 16,214 murders in the U.S. If the meme’s figure were accurate, it would mean that nearly two-thirds of U.S. murders were committed by the “illegal immigrants” the meme alleged.

Next, find out how many people were living in the U.S. illegally. That group, most news reports and estimates suggest, numbers about 11 million men, women and children – which is only 3% of the country’s 330 million people.

Could just 3% of people have committed more than 60% of U.S. murders? With a tiny bit of research and quick math, you can see these numbers just don’t add up.
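A Fermi check like this takes only a couple of lines to write down. Here is a minimal sketch in Python using the figures cited above; the variable names are just labels for this illustration:

```python
# Rough Fermi-estimate check of the meme's claim, using the article's figures.
claimed_murders_by_group = 10_150      # the meme's claim for 2018
total_us_murders_2018 = 16_214         # FBI estimate of U.S. murders in 2018
group_population = 11_000_000          # common estimate of unauthorized immigrants
us_population = 330_000_000            # approximate U.S. population

# If the meme were right, what share of all murders would that be?
share_of_murders = claimed_murders_by_group / total_us_murders_2018

# And what share of the population is the group being blamed?
share_of_population = group_population / us_population

print(f"Implied share of murders: {share_of_murders:.0%}")    # roughly two-thirds
print(f"Share of population:      {share_of_population:.1%}")  # roughly 3%
```

The mismatch between the two percentages is the tell: a group that small producing that large a share of murders should immediately trigger skepticism and a closer look at the source.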

3. Beware of nonpolitical biases

News media are often accused of catering to people’s political biases, favoring either liberal or conservative points of view. But disinformation campaigns exploit less obvious cognitive biases as well – for example, the human tendencies to underestimate costs and to seek out information that confirms what we already believe.

One important bias of news audiences is a preference for simple soundbites, which often fail to capture the complexity of important problems. Research has found that intentionally fake news stories are more likely to use short, nontechnical and redundant language than accurate journalistic stories.

Also beware of the human tendency to believe what’s in front of your eyes. Video content is perceived as more trustworthy – even though deepfake videos can be very deceiving. Think critically about how you determine something is accurate. Seeing – and hearing – should not necessarily be believing. Treat video content with just as much skepticism as news text and memes, verifying any facts with news from a trusted source.

You won’t – and shouldn’t – believe what Barack Obama says in this video.

4. Think beyond the presidency

A final bias of news consumers – and, as a result, of news organizations – is the tendency to prioritize national news at the expense of local and international issues. Leadership in the White House is certainly important, but national news is only one of four categories of information you need this election season.

Informed voters understand and connect issues across four levels: personal interests, like a local sports team or health care costs; news in their local communities; national politics; and international affairs. Knowing a little in each of these areas better equips you to evaluate claims about all the others.

For example, better understanding trade negotiations with China could provide insight into why workers at a nearby manufacturing plant are picketing, which could subsequently affect the prices you pay for local goods and services.

Big businesses and powerful disinformation campaigns heavily influence the information you see, creating personal and convincing false narratives. It’s not your fault for getting duped, but being conscious of these processes can put you back in control.

Elizabeth Stoycheff, Associate Professor of Communication, Wayne State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Can Hiding Likes Make Facebook Fairer And Rein In Fake News? The Science Says Maybe


On Facebook, we like what other people have already liked before us.

Marian-Andrei Rizoiu, University of Technology Sydney

This is the first article in a series looking at the attention economy and how online content gets in front of your eyeballs.


You may have read about – or already seen, depending on where you are – the latest tweak to Facebook’s interface: the disappearance of the likes counter.

Like Instagram (which it owns), Facebook is experimenting with hiding the number of likes that posts receive for users in some areas (Australia for Facebook, and Canada for Instagram).

In the new design, the number of likes is no longer shown. But with a simple click you can see who liked the post and even count them.

It seems like Facebook is going to a lot of trouble to hide a seemingly innocuous signal, especially when it is relatively easy to retrieve.

Facebook prototypes hiding like counts.

Facebook’s goal is reportedly to make people comfortable expressing themselves and to increase the quality of the content they share.

There are also claims that the change will ease users’ insecurity about posting, improve their perceived freedom of expression, and curb herd mentality.

But are there any scientific grounds for this change?

The MusicLab model

In 2006, US researchers Matthew Salganik, Peter Dodds and Duncan Watts set out to investigate the intriguing disconnect between quality and popularity observed in cultural markets.

They created the MusicLab experiments, in which users were presented with a choice of songs from unknown bands. Users would listen online and could choose to download songs they liked.

The users were divided into two groups: for one group, the songs were shown at random with no other information; for the other group, songs were ordered according to a social signal – the number of times each had already been downloaded – and this number was shown next to them.




A song’s number of downloads is a measure of its popularity, akin to the number of likes for Facebook posts.

The results were fascinating: when the number of downloads was shown, the song market would evolve to be highly unequal (with one song becoming vastly more popular than all the others) and unpredictable (the winning song would not be the same if the experiment were repeated).

Based on these results, Australian researchers proposed the first model (dubbed the MusicLab model) to explain how content becomes popular in cultural markets, why a few things get all the popularity and most get nothing, and (most important for us) why showing the number of downloads is so detrimental.

They theorised that the consumption of an online product (such as a song) is a two-step process: first the user clicks on it based on its appeal, then they download it based on its quality.

As it turns out, a song’s appeal is largely determined by its current popularity.
If other people like something, we tend to think it’s worth taking a look at.

So how often a song will be downloaded in future depends on its current appeal, which in turn depends on its current number of downloads.

This leads to the well-known result that future popularity of a product or idea is highly dependent on its past popularity. This is also known as the “rich get richer” effect.
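The dynamic described above is easy to see in a toy simulation. The sketch below is not the MusicLab authors’ actual model – just a minimal Pólya-urn-style illustration in which each new listener either sees the current download counts (and picks a song with probability proportional to them) or does not (and picks uniformly at random):

```python
import random

random.seed(42)

def simulate(n_songs=20, n_listeners=5000, show_counts=True):
    """Toy 'rich get richer' market: each listener downloads one song.
    If popularity is shown, choice is proportional to current downloads
    (plus 1, so no song is locked out forever); if hidden, choice is uniform."""
    counts = [0] * n_songs
    for _ in range(n_listeners):
        if show_counts:
            weights = [c + 1 for c in counts]   # visible popularity drives appeal
        else:
            weights = [1] * n_songs             # hidden popularity: uniform appeal
        song = random.choices(range(n_songs), weights=weights)[0]
        counts[song] += 1
    return counts

visible = simulate(show_counts=True)
hidden = simulate(show_counts=False)
print("top song's share, popularity shown: ", max(visible) / sum(visible))
print("top song's share, popularity hidden:", max(hidden) / sum(hidden))
```

Run repeatedly with different seeds, the shown-counts market tends to crown a dominant song whose identity changes from run to run – the inequality and unpredictability the MusicLab experiments reported – while the hidden-counts market stays close to an even split.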

What does this have to do with Facebook likes?

The parallel between Facebook and the MusicLab experiment is straightforward: the songs correspond to posts, whereas downloads correspond to likes.

For a market of products such as songs, the MusicLab model implies that showing popularity means fewer cultural products of varying quality are consumed overall, and some high-quality products may go unnoticed.

But the effects are even more severe for a market of ideas, such as Facebook.
The “rich get richer” effect compounds over time like interest on a mortgage.
The total popularity of one idea can increase exponentially and quickly dominate the entire market.

As a result, the first idea on the market has more time to grow and has increased chances of dominating regardless of its quality (a strong first-mover advantage).




This first-mover advantage partially explains why fake news items so often dominate their debunking, and why it is so hard to replace wrong and detrimental beliefs with correct or healthier alternatives that arrive later in the game.

Despite what is sometimes claimed, the “marketplace of ideas” is no guarantee that high-quality content will become popular.

Other lines of research suggest that while quality ideas do make it to the top, it is next to impossible to predict early which ones. In other words, quality appears disconnected from popularity.

Is there any way the game can be fixed?

This seems to paint a bleak picture of online society, in which misinformation, populist ideas, and unhealthy teen challenges can freely flow through online media and capture the public’s attention.

However, the other group in the MusicLab experiment – the group who were not shown a popularity indicator – can give us hope for a solution, or at least some improvement.

The researchers reported that hiding the number of downloads led to a much fairer and more predictable market, in which popularity is more evenly distributed among a greater number of competitors and more closely correlated to quality.

So it appears that Facebook’s decision to hide the number of likes on posts could be better for everyone.

In addition to limiting pressure on post creators and reducing their levels of anxiety and envy, it might also help to create a fairer information exchange environment.

And if posters spend less time on optimising post timing and other tricks for gaming the system, we might even notice an increase in content quality.

Marian-Andrei Rizoiu, Lecturer in Computer Science, University of Technology Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.