Exploring the impact of misinformation on the general public


It is everywhere, but is it working?

Professors at Duke University gathered for a panel on digital disinformation and so-called "fake news," discussing the challenges it poses to society and what might be done about it. Bill Adair, a professor of journalism at Duke’s Sanford School of Public Policy, said that digital misinformation has begun to spread throughout every facet of the world.

"We just see in every corner of the world, in every corner of our lives ... there is just so much misinformation," he said. "It pops up in such insidious ways. It’s really scary.”

It is scary. But possibly the scariest aspect of this issue is the trend toward people eventually disbelieving everything they read, regardless of the bona fides of the source. Sowing distrust is a major goal of many of the players (Russia in particular), and it will be hard as hell to trace responsibility for that back to the original sources of misinformation. But at least one Duke researcher doesn't believe it's having much impact on opinions:

But how detrimental disinformation can be is not perfectly clear, at least according to Sunshine Hillygus, a professor of political science who studies election behavior. Hillygus cautions that she is "often times more concerned about media coverage of misinformation than about the misinformation itself.”

“The thing I would emphasize is that widespread existence is not the same as widespread impact on the public," she said. “While it’s very popular to see a headline in the media about how dumb the American public is because they believe in something ... a lot of times the evidence backing up these beliefs is as much the fault of the pollster as it is the public.”

She said that politically, disinformation can be a symptom of polarization, not necessarily the sole cause of it.

“There’s no doubt that misinformation is becoming part of the warfare of political polarization," Hillygus said. "Often times though it is about the one side talking about how dumb the other side is because they believe in some type of misinformation.”

I won't dispute that much of the chatter on social media is an effort to point a finger at the other side for believing x or y. While that may be classified as negative reinforcement, it also works pretty well, depending on how you do it. You can't just say, "You're stupid." You also need to demonstrate why: show proof that the misinformed belief is wrong. It doesn't always work, but it's the only way it might work.

Professor Hillygus isn't just shooting from the hip; she and her colleagues dug into the Russian disinformation issue last year to see if it actually changed opinions:

Each person in the study was measured on six political attitudes and behaviours, including where they placed themselves on the political spectrum, and how they would feel if a family member married someone politically opposed to them. Despite media fear, people remained steadfast in their opinions, even after being targeted by IRA accounts.

“While there are all kinds of reasons it is troubling and worthy of concern by the US government and the American public, interaction with IRA accounts didn’t change people’s attitudes or behaviours,” says Hillygus.

Saif Shahin at American University in Washington DC isn’t surprised by the results. Tweets posted by IRA accounts represented 0.1 per cent of all the liking, retweeting and responding that the participants did on Twitter. “People who make the claim this affected their behaviour would be making the claim 0.1 per cent of their activities on Twitter are more significant than the other 99.9 per cent,” says Shahin.

Even if the IRA hired more people to pump out tweets, it would be unlikely to make a difference. People aren’t so easily manipulated by propaganda, says Hillygus.

Okay, first of all: if participants were aware they were being studied, the results are likely flawed from the outset. Nobody wants to admit they've been manipulated. But as I mentioned above, one of my biggest concerns is the cumulative macro effect. And it doesn't just come from Russian trolls; the click-baity trend on local media outlets is just as bad.

Case in point: Go to any of your local TV stations' web pages (especially network affiliates) and scroll down their "news" section. You'll see a few local stories, but then you will see "local" news from other states. Woman with baby carjacked in Des Moines, teacher arrested for sex with a student in Jacksonville (Florida), etc., etc. These things might be important where they occur, but here? The only purpose these stories serve (here) is to create drama and fear. It's not fake news, but it's not relevant news, either. I recently told a Facebook friend that an article she shared was actually about another state. Her answer? "I'm sure it's happening here too."

As information consumers, we all need to exercise discretion. But we also need media outlets and Internet platforms to do the same thing.



The image above is one of the many "local news" outlets created by shadowy conservative groups in an effort to sway opinions. Most of their stuff is legit news aggregated from other outlets, but the cumulative effect leans in a certain direction.

Scary sh ...

"I'm sure it's happening here too."

That's the holy grail for people like Trump and Tillis. They're betting they can scare people into submission, and they might be right.