The impressive growth in supply on the information market does not favor scientific analysis, or even rational thought. Quite the contrary: it entices individuals to search for the easiest or most convenient solutions, to make the least possible effort, latching on to whichever opinions best fit their own worldview. To make matters worse, on the web the battle between belief and knowledge is being won by the former. In the hypercompetitive media market, cognitive demagogy is carrying the day.
The notion that the borders of the empire of beliefs are defined by irrationality, by stupidity or by ignorance is an old idea in the history of thought. We find it in Montaigne, in Fontenelle and even in the Encyclopaedists, who argued that ignorance was the source of all credulity. This interpretation fuels the dream of a society freed from the bonds of belief by enlightenment, and by education in particular. We can allow, without endorsing the whole conceit, that a rising level of education, mass access to information and the development of science have all helped to root out all kinds of false ideas from the public sphere. Thus, however metaphorical our depiction of the birth of the universe, we find it easier to envision it as the product of a big bang than as the result of the separation of two gigantic entities, as narrated, for example, in the Babylonian tale of Enûma Eliš.
Yet even a cursory glance at our community life reveals a persistence, indeed an effervescence, in our collective credulity. Why have the predictions formulated by the thinkers of the Enlightenment, and by so many who have come after them, failed to materialize? It is worth distinguishing two separate questions here: Why do beliefs persist in general? And why are they so vibrant today in particular? Both questions are fascinating, but in this article I shall consider only the latter, presenting some of the changes in belief primarily in connection with the way our contemporaries access the information that feeds their conception of the world.
THE INFORMATION GLUT. The information market in contemporary Western societies has unquestionably been massively deregulated, particularly since the internet arrived on the scene. Consider that while humanity produced 150 exabytes of data in 2005, already a considerable amount, by 2010 it was producing fully eight times that figure. In a nutshell, we are disseminating more and more information, and on such a vast scale that the fact itself is a major event in the history of mankind. But, you may argue, if more and more information is available, then surely this is wonderful for democracy and for knowledge, which is bound to end up prevailing in everyone's mind! Sadly, that argument appears excessively optimistic. It rests on the assumption that in this open competition between belief and methodical knowledge, the latter will inevitably prevail. The truth of the matter is that, faced with this overwhelming supply, individuals may easily be tempted to build themselves a picture of the world that is mentally convenient rather than true. In other words, the wider the range of propositions on offer, the more easily they can dodge the mental effort that the products of methodical knowledge often demand. The explosion in supply increases the plurality of knowledge on offer while at the same time making it far easier to access.
The least obvious and yet most decisive consequence of this state of affairs is that all the conditions are now in place for the confirmation bias to display the full measure of its ability to distort the truth. Of all the cognitive temptations hampering our ordinary logic, the confirmation bias is unquestionably the most decisive in the processes that perpetuate beliefs. It allows people to reaffirm all kinds of beliefs, from the most harmless to the most spectacular. We can always find facts that are not incompatible with a questionable premise, but a demonstration of that kind proves nothing unless we also weigh the proportion of such facts and the existence of others that disprove the premise.
While this hankering after confirmation is anything but an expression of objective rationality, it makes life easier for us in many ways. Seeking disconfirmation is probably more effective if our aim is the truth (because it decreases the likelihood of our taking something false to be true), but it requires what may well be an exorbitant investment of time and mental energy. After all, people accept certain objectively questionable explanations because those explanations appear to be relevant, in the sense understood by Sperber and Wilson (see their 1989 La Pertinence. Communication et cognition). They explained that, when propositions compete, a person will opt for the one that produces the greatest possible cognitive effect in return for the least mental effort required to achieve it. Because beliefs often offer solutions that follow the mind's natural bent, and because they rest on the confirmation bias, they produce a highly advantageous cognitive effect relative to the mental effort required to take them on board. As Ross and Lepper have shown, once an idea has been accepted, individuals will persevere in their belief. And this becomes ever easier as the growing, non-selective dissemination of information increases the likelihood that they will encounter "data" confirming their belief.
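The effect-versus-effort trade-off described above can be rendered as a toy model. The numbers below are illustrative assumptions, not data from Sperber and Wilson: each candidate explanation is assigned a cognitive effect and a processing cost, and the agent picks whichever offers the best ratio.

```python
# Toy sketch (hypothetical values) of the relevance trade-off:
# an agent opts for the explanation yielding the greatest cognitive
# effect per unit of mental effort.

def relevance(effect: float, effort: float) -> float:
    """Relevance grows with cognitive effect, shrinks with effort."""
    return effect / effort

candidates = {
    # name: (cognitive effect, mental effort) -- illustrative only
    "easy belief":          (6.0, 1.0),   # intuitive, confirms priors
    "methodical knowledge": (9.0, 4.0),   # more accurate, but costlier
}

chosen = max(candidates, key=lambda k: relevance(*candidates[k]))
print(chosen)  # prints "easy belief": the low-effort option wins
```

The point of the sketch is that the belief prevails even though its absolute cognitive effect (6.0) is lower than that of methodical knowledge (9.0): what decides the competition is the ratio, not the quality of the content.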
A study conducted in 2006 examined readers of political blogs and unsurprisingly revealed that 94% of the 2,300 respondents consult only blogs that match their own outlook. By the same token, orders of political books on Amazon are based more and more on buyers' political preferences. It is now common knowledge that algorithms, especially on social media, contribute to the formation of pockets of cognitive insularity in this sea of information. All of this allows us to deduce a "theorem of information credulity," based on the fact that the selective search mechanism is facilitated by this massification of information. We might put it thus: the more unselected information there is in a given social space, the more credulity will spread.
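The mechanism behind this "theorem" can be given a minimal probabilistic sketch. The figures are assumptions chosen for illustration, not data from the studies cited above: if some small fixed fraction of circulating items happens to confirm a given belief, the chance that a selective searcher encounters at least one confirming item rises sharply with the total volume of unselected information.

```python
# Minimal sketch (assumed numbers) of the "theorem of information
# credulity": as the volume of unfiltered information grows, so does
# the probability that a selective search finds confirmation.

def p_confirmation(n_items: int, p_confirming: float = 0.01) -> float:
    """Probability that a search over n_items independent, unselected
    sources hits at least one item confirming the belief, assuming a
    fixed per-item confirmation rate p_confirming."""
    return 1.0 - (1.0 - p_confirming) ** n_items

for n in (10, 100, 1000):
    print(n, round(p_confirmation(n), 3))
# 10    -> 0.096
# 100   -> 0.634
# 1000  -> 1.0 (to three decimals)
```

Even with only 1% of items confirming the belief, confirmation becomes a near certainty once thousands of items are within reach: the massification of supply, not any change in the quality of the arguments, does the work.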
TRUTH OR DARE. But quite apart from the confirmation bias, we may ask what viewpoints a web user without any particular preconceived ideas on a subject is liable to encounter. If someone uses a search engine such as Google to form an opinion on a theme liable to convey beliefs, what will they find? I have attempted to simulate the way a surfer might access a given cognitive offer on the internet in relation to four different topics: astrology, the Loch Ness monster, crop circles and psychokinesis. I felt these topics would be interesting to test because scientific orthodoxy questions the reality of the beliefs they tend to fuel.
We are not concerned here with assessing the truth or falsehood of the premises, only with observing the competition between those responses that can lay claim to scientific orthodoxy and those that cannot do so (which is why, for the sake of convenience, I have called them “beliefs”). The topics I chose provide us with an interesting observation point for assessing the visibility enjoyed by questionable propositions.
The results are unequivocal. If we consider only those websites that promote favorable or unfavorable arguments, we find that, on average, over 80% of the first 30 Google hits on these topics are belief-based websites. How can we explain such a result? The fact is that the internet, as a cognitive market, is extremely sensitive to the way supply is structured, and every offer depends on the motivation of the person making it. It is also a fact that "believers" tend to be more highly motivated than "non-believers" in defending their viewpoint. They devote more time to it.
Belief is a major aspect of a believer's identity, so believers are highly likely to seek out new information capable of confirming it. Non-believers, on the other hand, will often be indifferent, rejecting the belief without feeling the need for any justification beyond the fragility of its premise. This state of affairs can be tangibly verified in internet forums where believers and non-believers occasionally duel.
In the 23 forums I studied (covering all four of my chosen topics), 211 viewpoints were expressed: 83 in defense of the belief, 45 opposing it, and 83 neutral. The striking thing when reading these forums is that skeptics are often content simply to write sarcastic messages mocking the belief rather than arguing against it, whereas the premise's devotees produce arguments of admittedly uneven quality (links, videos, copied-and-pasted texts and so forth) but invariably illustrate and defend their viewpoint. Some 36% of posts from those seeking to defend a belief are backed up by a document, a link or a properly expounded argument, whereas the same is true of only 10% of posts from "non-believers".
Scientists generally display little academic or personal interest in devoting time to such debates, with the somewhat paradoxical result that it is the believers who, in connection with a whole range of beliefs, have succeeded in establishing a cognitive oligopoly, not only online but also, on certain issues (in particular GMOs, low-frequency waves and so forth), in the mainstream media. The result is that those media have now become hypersensitive to unorthodox sources of information.
DUMB AND DUMBER. I do not think we can argue that the internet makes people dumber or smarter, but its very existence greases the slope of certain mental inclinations we already have. It orchestrates the presentation of information in a way that does not always favor orthodox knowledge. In other words, free competition among ideas does not always favor the most methodical and reasonable thought. This is especially true now that the mainstream media are prisoners of unbridled competition on the information market, a competition that imposes a pace on the dissemination of information not always in step with the pace of knowledge. The desire for speed leaves less time for cross-checking any given piece of information and thus perpetuates errors that end up being accepted as common sense.
This is particularly visible in the sphere of risk perception. Here, we are witnessing the dissemination on almost every front of an ideology of fear in connection with health and the environment that is not always grounded in scientific knowledge. A health alert issued by an association driven by the best of intentions may end up having a negative impact because it takes science far longer to downplay an alert (if it is groundless) than it took the media to disseminate it in the first place. A case in point is people’s mistrust of vaccines – a mistrust that is spreading like wildfire, though vaccination is likely one of modern medicine’s most remarkable contributions to public health.
In other words, the present situation tends to confer a viral advantage on credulity in connection with a given set of topics. On the now hypercompetitive information market, those who disseminate information professionally owe their survival to the attention they are capable of attracting. Under such circumstances, it is understandable that we should be witnessing the spread of cognitive demagogy: an offer of information pegged increasingly to the nature of the demand for it. Thus everyone is well aware of living in a post-truth society, and this fuels widespread mistrust: mistrust of politicians, of the media, of experts, of scientists…
The mistrust of authorities, in particular, is consubstantial with democracy, as Rosanvallon reminded us in 2006 (in his La Contre-démocratie: La politique à l’âge de la défiance), but in the trial of strength currently taking place between the democracy of the gullible and the democracy of knowledge, it works in favor of the former rather than of the latter.
Lee Ross and Mark Lepper, "The Perseverance of Beliefs: Empirical and Normative Considerations," in New Directions for Methodology of Behavioral Science: Fallible Judgment in Behavioral Research (Shweder and Fiske, eds.), Jossey-Bass, 1980.
* This article is taken from Aspenia International, n.80-81