There has been tremendous concern recently over misinformation on social media. It was a pervasive topic during the 2020 U.S. presidential election, continues to be an issue during the COVID-19 pandemic and plays an important part in Russian propaganda efforts in the war on Ukraine. This concern is well justified, as the consequences of believing false information are arguably shaping the future of nations and greatly affecting our individual and collective health.

One popular theory about why some people fall for misinformation they encounter online is that they lack digital literacy skills, a nebulous term that describes how a person navigates digital spaces. Someone lacking digital literacy skills, the thinking goes, may be more susceptible to believing—and sharing—false information. As a result, less digitally literate people may play a significant role in the spread of misinformation.

This argument makes intuitive sense. Yet very little research has actually investigated the link between digital literacy and susceptibility to believing false information. There’s even less understanding of the potential link between digital literacy and what people share on social media. As researchers who study the psychology of online misinformation, we wanted to explore these potential associations.

To begin, we needed to establish clarity on what “digital literacy” means in this context. The term is used in many different ways, and the first step to studying it rigorously was to define it. We landed on two definitions. One treats digital literacy as the possession of the basic digital skills required to find information online effectively, such as using the Internet to answer questions like “What is the capital city of Malawi?” or “What is the only U.S. National Park that begins with the letter T?” The other focuses specifically on social media and asks whether people understand how platforms decide what to show in their newsfeeds.

With these measures in hand, we surveyed 1,341 Americans who matched the national distribution on age, gender, ethnicity and geographic region; in this way, they were representative of the U.S. population. We first showed them two dozen news headlines about politics or COVID—half of which were accurate, and half of which had been shown to be false by professional fact-checking Web sites. Then we measured their digital literacy by having them report their familiarity with various Internet-related terms and answer a question about how Facebook decides what to show in their newsfeeds. We examined the association between these digital literacy measures and two different outcomes: belief in, and willingness to share, accurate versus false news about these topics.
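
To make this kind of analysis concrete, here is a minimal illustrative sketch in Python. It is not the authors’ actual code; the data values, column names and choice of a simple linear regression are all assumptions. It shows how one might compute a truth-discernment score from accuracy ratings of true and false headlines and test whether a digital literacy measure predicts it:

```python
# Illustrative sketch only: hypothetical data, one row per survey participant.
import pandas as pd
import statsmodels.formula.api as smf

# acc_true / acc_false: average perceived accuracy of true vs. false headlines
# dig_lit: a digital literacy score (e.g., familiarity with Internet terms)
df = pd.DataFrame({
    "acc_true":  [3.1, 2.8, 3.4, 2.9, 3.0, 3.3],
    "acc_false": [1.9, 2.5, 1.4, 2.2, 2.0, 1.6],
    "dig_lit":   [0.8, 0.3, 0.9, 0.4, 0.6, 0.7],
})

# Discernment: how much more accurate true headlines are rated than false ones.
df["discernment"] = df["acc_true"] - df["acc_false"]

# A positive, significant coefficient on dig_lit would indicate that more
# digitally literate respondents are better at telling true from false news.
model = smf.ols("discernment ~ dig_lit", data=df).fit()
print(model.summary())
```

The same kind of model could be refit with willingness to share as the outcome, which is the comparison the study turns to next.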

Our study found that digital literacy is indeed a good predictor of one’s ability to discern accurate information from falsehoods. Both of our digital literacy measures were independently predictive of the tendency of study participants to rate factual news as more accurate than false news. The result was the same, regardless of the subjects’ political affiliation and regardless of whether the news headlines were about politics or COVID.

When we looked at the connection between digital literacy and the willingness to share false information with others through social media, however, the results were different. People who were more digitally literate were just as likely to say they’d share false articles as people who lacked digital literacy. As with the first finding, this lack of connection between digital literacy and sharing false news held regardless of political party affiliation and regardless of whether the topic was politics or the pandemic.

Most surprisingly, even people with high digital literacy were not immune to clicking “share” on false news. This sounds odd. If you are digitally literate and can better tell the difference between true and false news, why wouldn’t you be less likely to share falsehoods? A potential answer comes from our prior work on why people share misinformation. We found that although most people don’t want to spread misinformation, social media is distracting: people scroll quickly, and their attention is drawn to social validation and other feedback, such as how many likes their posts will get. This means we often forget even to ask ourselves whether a story is true or false when considering, however quickly, whether to share it.

Our latest study adds to these prior findings by suggesting that believing and sharing are not one and the same. Just because a piece of false information parading as “news” has been shared millions of times doesn’t necessarily mean that millions of people believed it to be true; it could just be that the sharers never considered whether the news was true or not. And just because someone is better at distinguishing fact from falsehood if they stop to think about it does not necessarily mean that they will share more accurate information.

The bottom line is that, surprisingly, digital literacy may not be a key factor in predicting who spreads misinformation on social media. No one is immune to spreading misinformation—so be sure to stop and ask yourself whether the news you see is accurate before you click “share.”

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.