How to weed-whack through fake COVID-19 information

VIRAL VIRUS MISINFORMATION

First seen in the Colorado Springs Indy

By Elena Trapp with assets from Shutterstock

Colorado Springs company Lead Stories is fighting the spread of misinformation on the internet. A reporter, a tech entrepreneur and a lawyer joined forces in 2015 to create a platform that wrote about trending internet items. Alan Duke met Perry Sanders, a Colorado Springs-based lawyer, while reporting in Los Angeles on cases Sanders litigated for the families of slain hip-hop artist Biggie Smalls and of Michael Jackson. A few years later, after many conversations about media literacy and the threat of tabloid journalism, the two flew Belgium-based Maarten Schenk, who designed the technology Lead Stories uses to monitor trending items, out to Colorado Springs and persuaded him to join them. In 2016, the platform shifted to finding false trending stories and disproving them as quickly as possible.


Their motto is “Just Because It’s Trending Doesn’t Mean It’s True.” In February 2019, Lead Stories partnered with Facebook as a third-party fact-checker, labeling false information shared on the site. The company is also a member of The Poynter Institute’s #CoronavirusFacts Alliance, which partners with more than 100 fact-checkers internationally to track the spread of COVID-19 misinformation. The company is self-funded and relies on advertising revenue, license fees for use of the Trendolizer (its technology for monitoring trending items across the internet) and revenue from the partnership with Facebook.

The World Health Organization has called the misinformation and disinformation surrounding COVID-19 an “infodemic.” The Indy caught up with Alan Duke, the co-founder and editor-in-chief of Lead Stories, in California to ask him about the rapidly evolving world of misinformation, how to fight it and how to keep from getting distracted by shiny memes that spread lies.

Indy: Why is this such an important time to be thinking about misinformation and how it spreads?

Alan Duke: There’s a new angle to misinformation and that is profit. This has been going on now for several years. You started hearing of clickbait probably about eight years ago. And it started simply [with] people figuring out that if they write a story a certain way, they will get more clicks, and that translates into money for their website. But then others figured out that clickbait is even stickier and more effective if the story is not true. People are more likely to click and share something that’s not true. It’s just human nature, because they get pulled in by it, they think it’s so unbelievable, and it’s special knowledge for them that they want to share with their friends. And that’s how it goes viral. But misinformation isn’t new. We’ve had propaganda. It’s just that now, because of social platforms, it spreads faster, further and wider. It’s much more effective and cheaper to get out there because everybody’s got a smartphone.

Why is it harder now to stop this misinformation?

Up until January or February, the content that we were identifying and countering was made for profit by people with some degree of professionalism. By that I don’t necessarily mean they went to journalism school, but their profession is to create the content. The majority of that content was in the form of articles published on websites that attempted to look real and credible. We still go after that in certain parts of the world. But frankly, the largest percentage of the content that we’re now debunking is not made by people who are making a living doing it. It’s made by people who either sincerely believe what they’re putting out there, or they want other people to believe it.

What are the top ways that misinformation about COVID-19 is spreading right now?

The two most effective ways are videos on YouTube or TikTok or Facebook, and the most potent form they share it in is the meme. It is often just 20 to 40 words and maybe an image that conveys a message that is not supported by fact. Some of the most common themes are questions about social distancing, the wearing of face masks or the honesty of the people at the Centers for Disease Control [and Prevention]. People are undermining the known information about COVID-19 that authorities are sharing.

Why are videos and memes effective in spreading false information?

Through what we call inauthentic behavior, where someone creates a fake network of affiliated [fake] groups, sometimes even using fake [profile] pictures, in order to promote content. They’ll seed that content in various groups where they believe it will catch fire. The fake groups are almost like little embers in a wildfire, and they spread to other areas. The winds spread them out and that’s how they go viral.

What are the real-life consequences of the spread of COVID-19 misinformation?

People are dying. I think people are choosing not to protect themselves with proper strategies such as social distancing or the wearing of a mask. They’re dying, or they’re getting sick, or they’re getting their friends sick. There was the example of a relatively young kid in Texas whose last words to the nurse were [“I think I made a mistake”], meaning he went to a COVID-19 party. He wanted to get COVID-19 because he thought it wasn’t a big deal, that it was a hoax. People can die from misinformation. Misinformation can be deadly.

Who benefits from the spread of misinformation surrounding COVID-19?

I can’t imagine anybody actually benefiting from it. I do know that people use it to increase their social media ranking and get followers. Some popular producers of misinformation on YouTube make a lot of money doing it. But this is an interesting evolution that we cracked [early this year]. The people who are fighting the idea of masks and social distancing and the dangers of the novel coronavirus are a mixture. Groups have converged in the last several months: militia and white supremacists, QAnon and anti-vaxxers. So you’ve got all of those wrapped up together, pushing misinformation. It’s what you might call a perfect storm, and it’s creating a real problem. People don’t know what to trust. They don’t know what to believe. You put all that together and it’s quite a combination. It gives the misinformation a lot of power.

What are some of the most common mistakes you see with people sifting through misinformation?

The most common problem is that they don’t read it. They see a photo of a headline and the first few sentences, and they think they understand it. They don’t read it, and they share it. Some of the most viral pieces of content have actually been designed that way to fool them.


What are tips to avoid misinformation?

Consider the sources. … Is this a credible source? Does it seem too good to be true? Are the sources named? Does the content provide you a way to verify it for yourself? But the main thing is to actually read it and think about it. Look for contrary opinions, contrary information, if you really care. But the problem is, most people don’t care. Most people don’t want to know if it’s true or not. A lot of misinformation takes advantage of that. It tells you something you want to believe. The people who will go to the extent of researching whether something’s true are not the problem. The problem is the people who just say it’s true because [they] believe it’s true. And there’s no really good way to talk them out of that or to convince them otherwise.

How do we best confront people who do believe in this misinformation and are sharing it?

Ninety percent of the people who want to believe something, you’re not going to change their minds. And I say that just judging from the conversations I have with people who call my phone and email me. I go back and forth with them. What I find is that it’s beyond me to change their minds by telling them something is not true. What I can do, as a fact-checker, is debunk the content and work with my clients, the social media platforms, to put labels on the content, letting people know that it’s not true. And 95 percent of the time, when a label is put on it on Facebook, people choose to no longer share it. So when we debunk a piece of content, its future is limited. New versions of it will come up and we’ll have to go after those. In fact, we go after thousands of them a month, but technically we are able to help the social platforms limit the spread of it on social media.

Are there any unique lessons or takeaways from Lead Stories’ work on COVID-19 misinformation?

It baffles me how political a virus could become and how wearing a mask becomes a political issue. I don’t think I really realized it until one day I was walking down a sidewalk in Southern California wearing a mask, and someone in a van slowed down, rolled the window down and said, “Oh, you have your mind control mask on.” I don’t understand how your health, and your children’s and your family’s health, and your neighbor’s health can be political. But the answer is that people do not trust the source telling them to wear a mask because they think there’s some conspiracy. And I think that is very dangerous for us.