Something that has interested me for quite a while now is the effect the internet has on what we believe and how we learn. And quite frankly, for many people, that effect seems to be mixed at best.
Even in this very Science Forum we see people spreading tinfoil-hat-level theories, pseudo-scientific nonsense and things that are just plain wrong. And what's even more fascinating (and frustrating): they are sometimes resistant to correction.
There are many things at work here; I'll just try to name a few:
1) "It's published, so it must be true."
Many people seem to have huge trouble discerning the quality of the sources they use online. This is a typical internet problem. Back in the age of print publishing, people grew up with the idea that if something is published, it can't be utter bullshit. At least when it comes to science, we expected published material to be reasonably well-sourced or even peer-reviewed.
In the Internet Age, everyone and their depressed guinea pig can publish stuff online. And many people don't seem to grasp that most of what's online is just pretty fucking terrible in quality.
2) "Why do people live in Echo Chambers?"
Easy: Google logic. On the internet, you only find what you're looking for. So if you're looking for conspiracy theories, you will mostly find stuff that supports your personal nonsense theory. And from those sites you'll just get linked to more bullshit sites. You can live your entire online life in one of these Echo Chambers and never be exposed to any evidence that might debunk the conspiracy.
And conspiracies spread. Like crazy. And they're doing a much better job of it than actual science news. Want proof? Here's a scientific article dealing with this very problem: "The Spreading of Misinformation Online".
The gist of it:
We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., “echo chambers.”
Science news is usually assimilated, i.e., it reaches a higher level of diffusion, quickly, and a longer lifetime does not correspond to a higher level of interest. Conversely, conspiracy rumors are assimilated more slowly and show a positive relation between lifetime and size.
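If you want a feel for how selective exposure alone can produce those homogeneous clusters, here's a little toy simulation. To be clear, this is purely my own back-of-the-envelope sketch, not the model from the paper, and the agent count, tolerance and nudge values are made up for illustration: agents only read posts whose stance is close enough to their own belief, and every post they do read pulls them a bit closer to it.
[code]
# Toy bounded-confidence sketch (my own illustration, not the paper's model).
# Agents hold a belief in [0, 1]; they only "read" posts within TOLERANCE of
# their own belief (selective exposure), and reading nudges them toward the post.
import random

random.seed(1)

N_AGENTS = 200      # simulated users (made-up number)
N_STEPS = 200_000   # post exposures
TOLERANCE = 0.2     # only engage with content this close to your own belief
NUDGE = 0.3         # how far reading a post pulls you toward its stance

beliefs = [random.random() for _ in range(N_AGENTS)]

for _ in range(N_STEPS):
    reader = random.randrange(N_AGENTS)
    author = random.randrange(N_AGENTS)
    post = beliefs[author]                        # the post mirrors its author's belief
    if abs(post - beliefs[reader]) <= TOLERANCE:  # selective exposure filter
        beliefs[reader] += NUDGE * (post - beliefs[reader])
    # posts outside the tolerance are never read, so they never correct anyone

# Roughly count the distinct opinion clusters left over (beliefs rounded to one decimal).
print("surviving opinion clusters:", sorted({round(b, 1) for b in beliefs}))
[/code]
Run it a few times: the agents reliably collapse into a handful of tight clusters that never engage with each other's content again, purely because of the exposure filter. A cartoon version of what the paper actually measures, but it shows the mechanism.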
3) "Why can't people see that the sites are bullshit?"
From the same article I already quoted above:
Whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by social norms or by how much it coheres with the user’s system of beliefs (32, 33). Many mechanisms cause false information to gain acceptance, which in turn generate false beliefs that, once adopted by an individual, are highly resistant to correction (34–37).

I think there are several issues at work here. Firstly, the Echo Chamber logic means that it's much easier now to only consume news stories that fit your specific world view. It's comfortable, I suppose. Fox News uses that, too.
Secondly, the Dunning-Kruger effect means that the less you know about a specific topic, the more likely you are to overestimate your actual competence in it.
As Wikipedia puts it, Dunning and Kruger proposed that, for a given skill, incompetent people will:
- fail to recognize their own lack of skill
- fail to recognize the extent of their inadequacy
- fail to recognize genuine skill in others
- recognize and acknowledge their own lack of skill, after they are exposed to training for that skill

The problem is, many people don't want to engage with evidence that contradicts their world view. That's why they never get to the point of recognizing and acknowledging their own misconceptions.
The Dunning-Kruger effect combined with the tendency of people to only acknowledge information that supports their own ideas means, in essence: People believe stupid shit, and because they refuse to learn stuff that debunks the stupid shit... they continue to believe in stupid shit. They believe they're smarter than everybody else and have a strong dislike for what they call "mainstream science".
4) Debunking
Haven't we all noticed how fucking hard it is to debunk a nonsense theory?
Don't get me wrong, the content is easy enough to debunk. We're doing that all the time.
But the tinfoil hat people just won't stop. They refuse to accept that their theories have been thoroughly debunked. It's frustrating for us, and it basically means that, at times, this whole forum isn't about scientific discussion but merely about debunking the nonsense of people who refuse to accept that they're wrong.
Another scientific article deals with how ineffective debunking is: "Debunking in a World of Tribes".
Short summary:
We examine 47,780 debunking posts and find that attempts at debunking are largely ineffective. For one, only a small fraction of usual consumers of unsubstantiated information interact with the posts. Furthermore, we show that those few are often the most committed conspiracy users and rather than internalizing debunking information, they often react to it negatively. Indeed, after interacting with debunking posts, users retain, or even increase, their engagement within the conspiracy echo chamber.

Again: People refuse to even read debunking posts properly. And after they do... they just jump even deeper into their own pseudo-scientific nonsense.
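And to tie it back to the toy sketch from earlier, here's a second sketch (again entirely my own illustration, not the paper's analysis, and it only models the screening-out, not the negative reactions the authors report): broadcast "debunking" posts into a population that has already split into two chambers, and the same selective-exposure rule keeps the correction from ever reaching the people it's aimed at.
[code]
# Continuation of the earlier toy, made self-contained (my own illustration).
# Assume two echo chambers already exist: 100 users near the "mainstream" stance 0.1
# and 100 users near the "conspiracy" stance 0.9. Broadcast debunking posts with
# stance 0.0 and see who actually engages with them.
import random

random.seed(2)

TOLERANCE = 0.2       # same selective-exposure rule as before
NUDGE = 0.3
DEBUNK_STANCE = 0.0   # the "corrective" content
N_DEBUNKS = 1000

beliefs = ([random.gauss(0.1, 0.02) for _ in range(100)] +   # mainstream cluster
           [random.gauss(0.9, 0.02) for _ in range(100)])    # conspiracy cluster

engaged = set()
for _ in range(N_DEBUNKS):
    reader = random.randrange(len(beliefs))
    if abs(DEBUNK_STANCE - beliefs[reader]) <= TOLERANCE:    # exposure filter again
        engaged.add(reader)
        beliefs[reader] += NUDGE * (DEBUNK_STANCE - beliefs[reader])

still_believing = sum(1 for b in beliefs if b > 0.7)
print("users who ever engaged with a debunking post:", len(engaged))
print("users still in the conspiracy cluster afterwards:", still_believing)
[/code]
In every run, the debunking circulates only among the users who already more or less agreed with it; the conspiracy cluster simply never engages.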
So yeah, sorry about the rant but:
Solutions? Thoughts?

How should we deal with this here? Pay less attention to those who spread pseudo-scientific nonsense, since debunking doesn't work anyway and it just ruins our conversations?