JimWes Thinks

January 4, 2020

THE DISINFORMATION NATION

Filed under: credibility, government, Truth, Uncategorized — jimwes @ 6:34 pm

Here are two long extracts from an obscure article[1] titled The Disinformation Nation. In my opinion, this article could have been great had its author resisted the now-popular temptation to mix opinion with fact.[2] It contains much factual information about disinformation that needs to be read, especially by Netcolics like me. These extracts mostly appear to be quite factual and unbiased. Unfortunately, the central part of the article (not reproduced here) degenerates into intense Trumphobia for a good way before finally returning to mostly factual and helpful information about disinformation. Following its own advice, I hesitated to quote from it. But I really believe these extracts are worth reading and paying close attention to, so here goes. Use your own intuition about their content. I personally think it is a brilliant article even though I disagree with parts of it.

PART ONE

>> It may be getting harder and harder to figure out the truth, but at least this much is clear: It’s a good time to be a liar.

We’ve spent three years arguing about whether fake news swung the 2016 election — debating whether the hordes of Russian bots, hoax Facebook pages and inflammatory, dishonest tweets tipped the democratic balance to elect Donald Trump as president.

 

Yet in those same years we’ve learned that the stakes in the fight against truth, in a muddy world of social media platforms, go beyond politics.

In Brazil, public health workers were attacked after far-right activists lied on YouTube that they were spreading the Zika virus. In Myanmar, government soldiers used fake Facebook accounts to drive an ethnic cleansing, full of incendiary claims and false stories about Muslim minorities raping Buddhist women. Gunmen radicalized by false white-supremacist conspiracies on internet forums like 4chan and 8chan shot up a synagogue in California, a Walmart in Texas and mosques in New Zealand.

Elections have consequences. So do algorithms.

So now, heading into the 2020 election, experts are warning that trolls, hoaxers and dishonest politicians are arming themselves with a whole new arsenal of weapons of mass deception. New technology is making it easier to hoax audio and video, while advances in artificial intelligence are making it all the more difficult to weed out computer-automated “bot” accounts.

And there’s a deeper risk, beyond figuring out the inaccuracy of any one article.

The deluge of misinformation — full of Trump tweets, deepfakes, InfoWars videos, Russian bots, 4chan trolls, that Washington Post correction, those out-of-context memes and your great aunt’s latest questionable Facebook post — has become so overwhelming that some of us may simply give up trying to make sense of it all.

A lie doesn’t need to be believed. It just needs to create enough doubt that the truth becomes polluted. With enough pollution, it’s impossible to see what’s right in front of you.

“When you’re flooded with so much bullshit,” New York Times media columnist Charlie Warzel says, separating fact from fiction becomes so difficult that “the task of trying to do it becomes, you know, tiresome, so you just stop.”

It’s the sort of thing your college philosophy professor might call an “epistemic crisis.” We don’t know what to believe. Truth is hazy. Reality itself becomes irrelevant. It’s a phenomenon that has already happened in places like Russia and the Philippines — and experts say that in the past few years, the United States has suddenly found itself on the same path.

“And that, to me, is one of the scariest things to think about,” Warzel says. “It feels like we’ve come incredibly far since 2015.”

THE WEB OF CONSPIRACY 

History has a pattern.

An advancement in communications technology hands liars the means to lie louder and spread those lies further. Look at the 1830s, when the invention of the steam printing press and other paper-making technologies produced the rise of the “penny press.” Newspapers became cheaper, more independent, more widespread, more competitive, and eager publishers found the power of the 19th-century version of clickbait. The New York-based Sun put out a series of entirely fictional stories that purported that “man-bats” and other exotic creatures were scurrying around on the moon.

Soviet-born British TV producer Peter Pomerantsev, author of This is Not Propaganda: Adventures in the War Against Reality, argues that when tech rips open the floodgates of communication, the bad guys always find a way to exploit it. Dictators quickly harnessed the power of radio. Joseph McCarthy, as a U.S. senator in the ’50s, used television to spread his anti-Communist conspiracy theories.

Yet for decades, the internet was heralded as a new frontier that allowed “citizen journalists” to take on the stodgy media elite. In 1998, the Drudge Report, a right-wing news-aggregating website, broke the Monica Lewinsky scandal when Newsweek got cold feet. In 2004, when Dan Rather and 60 Minutes put out a 1973 memo purporting to show that President George W. Bush had received special treatment while in the Texas Air National Guard, Drudge elevated the conservative bloggers who persuasively argued the memo was a fake written in Microsoft Word.

An “Army of Davids” — as some bloggers dubbed themselves — swarmed to debunk flawed media accounts, trying to counter bias wherever they saw it. The gatekeepers were being overthrown, the drawbridge had been flung open and the villagers could storm the castle.

But the villagers had their own standards for newsworthiness. Drudge also sent his readers to darker corners, where sketchy websites claimed Barack Obama wasn’t an American citizen and Bill Clinton had a secret love child. Drudge even provided fuel for “Pizzagate,” the conspiracy that drove a man in 2016 to fire an AR-15 inside a pizzeria, because the internet told him the restaurant was harboring child sex slaves.

Conspiracy theorists used to spread their gospel through books, newsletters, public access television shows, and by standing on street corners and handing out fliers. But the web gave every community a niche — no matter how fringe — and allowed them to spread their message in only a few keystrokes. On the internet, the corkboard is infinite and the spool of yarn used to connect pictures of shadowy figures never runs out.

The internet, Warzel says, handed fringe figures like Alex Jones of InfoWars a powerful new megaphone.

“He was one of the early pioneers of internet radio and video,” Warzel says. “It was a way to get around the notion that it was hard to sell advertising around some of his kooky ideas.”

An audience of millions repeatedly tuned into Jones’ red-faced rants about 9/11 being an inside job, Obama chemtrails turning frogs gay, and the Sandy Hook shootings being faked. Drudge repeatedly linked to him.

Social media sites only accelerated the spread of misinformation. It’s easier than ever for a single comment, particularly an untrue one, to go viral. In ancient times, the opinions of quacks were largely quarantined to a newspaper’s page of letters to the editor.

New York Times reporter Maggie Haberman and racist Twitter randos with names like “@WhiteGenocideTM” are all simmering in the same stew together. Both, after all, get retweeted by the president.

A Massachusetts Institute of Technology study published last year took a look at over a decade of Twitter posts and found that tweets about false news went viral six times faster than tweets about true news. After all, lies are often more sensational, tapping into human emotions of shock, fear and disgust.

It wasn’t just that humans were more likely to share these kinds of stories. It was that Facebook, Twitter and YouTube developed algorithms to elevate certain types of content into your social media feed. It usually didn’t matter if they were true — social media sites didn’t want to become the truth police. It mattered that the stories drew people in.

“The way they keep people clicking and sharing and commenting is prioritizing things that get your heart pumping,” says Andrew Marantz, author of Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation. “It’s like stocking a huge grocery store, but all of the visible aisles are Oreos and rat poison.”

YouTube actually started rewarding conspiracy theories above popular content. YouTube used to have what they internally called the “Gangnam Style” problem, where YouTube’s autoplaying recommendation engine would eventually send every viewer to the 2012 South Korean pop hit. In response, YouTube changed their algorithm in 2015, turning down the recommendation dial for merely popular videos and cranking up the preference for videos that led people down rabbit holes. Conspiracy-theory videos flourished.

Simultaneously, the internet had handed brand-new weapons to pranksters, vandals and assholes — “trolls” who could use misinformation and harassment to make life hellish for chosen targets. Image boards like 4chan combined anonymity and a near-total absence of moderation to become a frothing hive of racists, trolls and trolls pretending to be racists.

The boards delighted in pulling hoaxes — creating fake Jewish Twitter accounts to sow discord in the Jewish community, publishing coupons claiming black people were getting free coffee at Starbucks, and attempting to trick journalists into identifying mass shooters as the wrong person.

Sometimes the hoaxes became reality. A 4chan scheme to trick mainstream media outlets into reporting that the “OK” hand gesture was a white-supremacist sign resulted in white supremacists actually adopting the signal.

One particularly pernicious trolling tactic was to call 911 from a spoofed number and report a horrific crime, in hopes an armed SWAT team would descend on that location.

Marantz says he spent three years embedded in this world.

“There are people who just want to watch the world burn,” Marantz says. “And that’s a phrase I returned to again and again.”

The motivations vary. In Macedonia, Warzel says, there are clickfarms filled with teenagers pumping out hoax news stories for fake publications, buying Facebook likes, all as a way to make money.

“It’s essentially just like a lemonade stand for them,” he says. But there are also foreign governments trying to influence global trends, politicians trying to game power and true believers who spread falsehoods because they think it’s the truth.

“To some degree, it doesn’t matter as long as there’s power to be gained and money to be made,” Warzel says. <<

PART TWO

>> In August, the Pentagon started talking to partners for their new Semantic Forensics program, intending to develop technologies “to help to identify, understand and deter adversary disinformation campaigns.”

The private sector’s pushing for similar measures.

Last month, Facebook, Microsoft and a slew of research institutions announced they were joining forces for the “Deepfake Detection Challenge,” a contest to better understand the little clues that give even sophisticated deepfakes away. Deepfakes rarely blink in the right way. The heads might have a strange tic. The eye color might be off.

Facebook chipped in $10 million to the effort. But those trying to create hoaxes are innovating, too, trying to think of ways to outthink the detection system.

“Networks of bots are behaving more and more like you and me,” Warzel says.

Ultimately, he says, it may come down to two different artificial-intelligence systems trying to outthink each other.

“You basically have two sets of computers playing war games with each other,” Warzel says.

The fight isn’t just about technology. It’s about corporate policies. In the last two years, tech companies have tried to change their policies, ditching their laissez-faire libertarian approach to try their hand at benevolent censorship.

White supremacists and conspiracy theorists like Alex Jones got banned from Facebook, Twitter and YouTube. YouTube shifted viewers away from straight-up conspiracy theory videos in its recommendation stream — although liberals may be unhappy to learn they often landed at Fox News instead. Twitter banned the #Resistance-tweeting Krassenstein brothers in June, citing rules that prohibit “operating multiple fake accounts and purchasing account interactions.”

Right now, both major political parties are calling for regulation, including raising the prospect of forcing Facebook to shrink in size. But Republicans and Democrats want different things. While liberals complain about lax regulation allowing “Nazis” to run wild on the site, conservatives fret about overregulation, worried that conservatives could be censored for their political opinions.

But the lack of censorship is dangerous, too, argue some experts. Whitney Phillips, author of the forthcoming book You Are Here: A Field Guide for Navigating Network Pollution, points to YouTube and Facebook’s recent announcement that since political statements were newsworthy, the sites would rarely take down posts from politicians, even if the posts broke the rules.

“At every turn, at every conceivable opportunity, despite how loud the chorus might get, these technology companies made a choice to protect their bottom line over protecting the democratic process,” Phillips says.

Facebook’s motto for its developers was “move fast and break things.” Phillips thinks they were successful.

“Yeah, they’ve broken democracy,” Phillips says. “There’s no more simple way to describe it.”

TOO MUCH INFORMATION

Phillips wants to make it clear that it’s not just Facebook or Twitter’s fault. It’s not just the fault of Alex Jones or Donald Trump or 4chan. It’s your fault, too.

“A lot of the misinformation being spread is not the result of bad actors,” Phillips says. “It’s everyday people doing everyday things.”

She thinks of it in terms of an ecological metaphor, where pollution is the accumulation of a billion little actions from individuals. All of the tweeting, retweeting and Facebook posting adds up.

“We’re sort of at the whims of everyday folks, disinformation agents, algorithms, white supremacists, all jockeying to win the attention economy,” Phillips says. “The result is an air that is so clogged that we can barely breathe.”

In that environment, with so many different competing and contradictory claims, people “don’t even necessarily trust there is such a thing as truth.”

But Phillips doesn’t necessarily agree that more information is the answer. Journalists like to say that sunlight is the best disinfectant. But Phillips argues that sometimes the sunlight simply heats up the petri dish and spreads the disease — especially when people are liable to believe a hoax is true because a journalist says it isn’t.

“The truth can contribute to pollution as much as falsehood can,” Phillips says. “It is easy to feel like you are pushing back against a story when you are saying, ‘This story’s terrible.’ But the algorithm doesn’t care about your righteous indignation. The algorithm cares that you’re engaging with content.”

She urges journalists and everyday people to shift the lens, focusing less on the liars and more on how lies and ideologies have impacted communities.

Warzel, meanwhile, also urges social media users to slow down. Be wary about clicking that retweet button. If a story seems too perfect, doubt it. If a crazy news story doesn’t come from an established media outlet, wait until at least one outlet covers it — ideally two.

Marantz, the expert on online trolls, says the long-term solution to the disinformation crisis is a deep and philosophical one that he’d explain at length with phrases like “reaffirming our commitment to epistemic depth.” But for now, the simpler way to react to disinformation is to rely a little bit more on the old gatekeepers.

“If you read the New York Times or the BBC or the alt-weekly in your town or the New Yorker, you’re going to be better informed than if you read Facebook,” Marantz says.

Not because they’re perfect — there’s a billion reasons to complain about mainstream journalists, he says — but because, for all their flaws, right now they’re the best we’ve got.

“It’s the best short-term solution,” he says, “as opposed to just living in a world where no one knows anything.” ♦

HOW TO AVOID SPREADING MISINFORMATION

Wait before reposting. You won’t need to apologize for forwarding untrue information if you never share it to begin with.

Don’t share something just because it comes from a friend. Double check the source to make sure the reporting is from a respectable publication and that they’re not just summarizing the reaction on social media. Better yet, wait until a second publication independently confirms the reports.

Read the actual story first. Follow links to make sure the links actually back up the news stories. Biased news sources are infamous for making sensational claims in their headlines that the underlying material doesn’t support.

Be cautious about sharing bogus stories just to point out how stupid or wrong they are. That’s an easy way to inadvertently spread a falsehood. <<

[1] I first read the article “The Disinformation Nation” excerpted here in the Boulder Weekly of Boulder, Colorado (https://www.boulderweekly.com/ne…/the-disinformation-nation/ ). It originated in The Inlander of Spokane, Washington (https://www.inlander.com/spokane/trolls-conspiracy-theorists-hoaxers-and-trump-have-twisted-facebook-youtube-and-the-news-to-toxic-levels-and-its-only-getting-worse/Content?oid=18415723&fbclid=IwAR2SLoUzigEW4daIrkiQWtObIRHtCtAlkduT_YvFOveXrY1c75FUF7NZdR4 ).

[2] The author, Daniel Walters, was born and raised in Spokane. He has been writing for the Inlander since 2008. In that time, he’s written about investor fraud, online bullying and how Facebook is destroying the news business.
