During the 2016 election cycle, a number of fringe Web sites spread false information across the Internet. But a new study suggests they did not have as much impact as some have feared.

About 44 percent of voters, mostly right-leaning, visited at least one such untrustworthy site, the study found.* Yet those voters also saw plenty of legitimate news on the Web. “This content, while worrisome, is only a small fraction of most people’s information,” says Brendan Nyhan, a professor of government at Dartmouth College and one of the three authors of the study, which was published today in Nature Human Behaviour.

The research provided the most systematic examination to date of people’s exposure to these fringe Web sites. It showed that while these untrustworthy sources might have a small effect on public opinion, in 2016 they did not substantially shift individuals’ views of then presidential candidate Donald Trump or their decisions about whether to go to the polls.

Emily Thorson, an assistant professor of political science at Syracuse University, says she is not surprised that these sites did not have a huge effect. A single piece of information rarely changes anyone’s opinion, “whether it’s true or false,” says Thorson, who was not involved in the new study. “That’s a good thing.” The idea that a handful of unreliable outlets were going to substantially alter views or behaviors “is pretty far-fetched, given what we know about the stability of people’s political attitudes,” she says.

The research paired responses to an online survey with data about which Web sites participants visited. In 2016 the survey responses were collected from 3,251 volunteers between October 21 and 31, and the Web traffic was recorded between October 7 and November 14. The election was held that year on November 8.

The study “is consistent with, and adds to, prior research that suggests that while a fair number of people had some exposure to ‘fake news,’ that the exposure was highly concentrated among a small number of conservatives,” says David Lazer, University Distinguished Professor of Political Science and Computer and Information Science at Northeastern University.

Lazer, who provided feedback for the paper but was not involved in the work, notes that it examined an individual’s browsing behavior, whereas previous studies looked solely at sharing false information on Facebook or, in the case of his own research, on exposure to, and dissemination of, such content on Twitter.

In that study, Lazer and his colleagues showed that untrue material accounted for nearly 6 percent of all news consumed on Twitter. But only 1 percent of users were exposed to 80 percent of this misinformation, and 0.1 percent of users shared 80 percent of it.

Nyhan and his colleagues’ new research concludes that most people find these untrustworthy Web sites through social media, particularly Facebook. “It shows Facebook as a major conduit to fake news [and] misinformation,” Lazer says.

Thorson says that while the Nature Human Behaviour study was expensive and difficult to conduct, Facebook already has much of the same information readily available—and should provide more of it to researchers. “One of the big takeaways for me is how important it is to start being able to look inside of what Facebook is doing,” she says.

In 2018 voters were less exposed to misleading content than they had been in 2016, Nyhan says. But it is unclear whether that reduction occurred because social media platforms such as Facebook took measures to minimize the reach of these fringe sites or because there was simply less activity during a midterm election year.

Nyhan adds that he and his colleagues conducted the study because of shortcomings they saw in some other research and common misconceptions about the role of these Web sites. “I do worry that people’s often incorrect sense of the prevalence of this type of content is leading them to support more extreme responses,” he says. Measures to halt the transmission of such material can “raise important concerns about the free flow of information and the exercise of power by the platforms over the information that people see.”

The main problem with these sites, Nyhan says, is not what they post but the risk that someone in power will amplify their lies. “One implication of our study is that most of the misinformation that people get about politics doesn’t come from these fringe Web sites. It comes from the mainstream—it comes from the media and political figures who are the primary sources of political news and commentary,” he says.

A Web site might promote unscientific theories about the origins of the coronavirus without changing a lot of minds, Nyhan says. But when someone like conservative commentator Rush Limbaugh talks on air about those same theories, it has a bigger effect, he adds.

Will the 2020 election prove to be any different from the one in 2016 in terms of the power of these fringe sites? It is too soon to tell, Nyhan says. “The public is at least potentially more aware of the issue, though I don’t know of any systematic evidence helping them make better choices,” he says. The year “2020 will be the first real test.”

*Editor’s Note (3/3/2020): This sentence was edited after posting. It originally said 20 percent of voters, mostly right-leaning, saw such Web sites.