The Age of Unreason? Social Media, Artificial Intelligence and the Retreat from the Enlightenment

Tuesday 24 February 2026


There was a time — not so very long ago — when the word ‘Enlightenment’ signified an aspiration rather than an irony. The eighteenth century’s philosophers did not imagine that human beings were perfectible. They understood too well the frailty of reason and the temptations of passion. But they insisted upon a discipline: that claims to truth should be tested, that authority should justify itself, that facts should be distinguished from superstition, and that argument should be conducted in public according to shared standards of logic and evidence.


Today we inhabit a world in which information is no longer scarce but superabundant — in which every citizen of a developed society carries in his or her pocket a device more powerful than the computers that guided Apollo to the moon — and yet we are beset by a suspicion that something has gone wrong. It is not merely that falsehoods circulate. Falsehoods have always circulated. It is that falsehoods circulate at industrial speed, clothed in the aesthetic forms of authority, and are consumed as identity rather than scrutinised as propositions.


The question therefore is stark: is idiocy becoming the intellectual normal? And if so, what — if anything — can be done about it?


The democratisation of publication — and its costs


The internet dissolved the traditional gatekeepers of knowledge. Newspapers, universities, broadcasters and learned journals once filtered information, imperfectly but consciously. Their standards could be criticised — and often were — but they existed. The editor, the peer reviewer and the fact-checker formed a modest but real barrier between rumour and publication.


Social media abolished that barrier. The platforms — whether Facebook, X or TikTok — are not publishers in the classical sense. They are distribution systems optimised for engagement. Engagement is measured in clicks, shares and emotional reactions. The content most likely to spread is not the content most likely to be true, but the content most likely to provoke outrage, affirmation or fear.


This has structural consequences. Enlightenment discourse depends upon delay — upon time for reflection, rebuttal and revision. Social media depends upon immediacy. A falsehood that travels the globe in minutes may be corrected hours later, but by then it has already shaped impressions. In an attention economy, first impressions are often decisive.


Nor is this merely a problem of “fake news”. It is a problem of epistemology — of how we know what we know. The constant stream of headlines, clips and opinions produces the illusion of omniscience without the labour of study. To scroll is to feel informed. To feel informed is not necessarily to be informed.


Artificial intelligence and the automation of plausibility


Into this already unstable ecosystem has entered artificial intelligence. Systems such as OpenAI’s ChatGPT and image generators such as Midjourney can produce fluent text and persuasive imagery at negligible cost. The technology is extraordinary — and often beneficial — but it further erodes the visible markers of authority.


A well-written paragraph once implied human effort. A realistic photograph once implied a physical event. Now both may be fabricated within seconds. The boundary between the authentic and the synthetic has become porous.


The danger is not that artificial intelligence is uniquely deceptive. It is that it produces plausibility at scale. When plausibility is cheap, discernment becomes expensive. The Enlightenment assumption that reasoned argument will eventually prevail presupposes a manageable quantity of claims. In a world where millions of arguments can be generated daily — many tailored to specific audiences — the cognitive burden placed upon citizens becomes immense.


The collapse of shared standards


Enlightenment ideals were not merely about reason in the abstract. They were about shared reason. Public discourse functioned because there were broadly agreed criteria for evidence and argument. One might disagree about policy, but not about whether two plus two equalled four.


Contemporary discourse often lacks that common foundation. Conspiracy theories flourish not only because they are sensational but because they provide coherence — a narrative that explains complexity in emotionally satisfying terms. Social media algorithms reinforce such narratives by surrounding individuals with like-minded content. The result is epistemic tribalism.


In such an environment, idiocy — defined not as low intelligence but as the refusal to submit one’s beliefs to rational scrutiny — can indeed become normal. It becomes socially rewarded within certain communities. It becomes a badge of authenticity to reject “elite” expertise. The Enlightenment’s faith in progress gives way to a theatre of perpetual indignation.


Yet it would be too simple — and itself a form of intellectual laziness — to conclude that we are living through a straightforward decline. Literacy rates are higher than ever. Access to primary sources is unprecedented. Scholarly materials once confined to libraries are freely available. The same networks that spread nonsense also enable rigorous inquiry.


The problem is not access to knowledge. It is the discipline required to evaluate it.


Can anything be done?


If the diagnosis is partly correct, the remedy cannot be merely technical. Regulation of platforms may address the most egregious abuses — incitement to violence, coordinated disinformation campaigns or foreign interference. But law cannot compel wisdom. Nor can it define truth without risking authoritarian overreach.


Three avenues merit serious consideration.


First, education — not merely in the accumulation of facts, but in the habits of critical reasoning. Media literacy should be treated as a civic necessity, akin to reading and arithmetic. Students must be taught how algorithms shape their information environment, how sources can be verified, and how arguments can be deconstructed. This is not glamorous reform. It is slow and generational. But Enlightenment culture was never built in a day.


Second, institutional reform. Traditional media must regain public trust not by nostalgia but by transparency — by explaining editorial processes, correcting errors visibly and resisting the temptation to chase viral content at the expense of depth. Universities, too, must defend standards without retreating into jargon or ideological conformity. Authority must be earned, not assumed.


Third, technological countermeasures. Artificial intelligence can be deployed defensively — to detect coordinated manipulation, to flag synthetic media and to assist users in tracing sources. The same ingenuity that enables deception can enable verification. However, such systems must be governed carefully lest they entrench new forms of unaccountable power.


None of these measures will eradicate idiocy. The Enlightenment never promised the abolition of folly. Human beings are not purely rational creatures. We are motivated by status, fear, belonging and emotion. The digital environment amplifies those impulses.


The more realistic aim is not to restore a mythical golden age of reason but to cultivate resilience — to create social norms that value intellectual humility and to reward those who change their minds in light of evidence rather than those who shout the loudest.


A cultural rather than merely technological crisis


What may be most troubling is not the quantity of misinformation but the erosion of patience. Enlightenment discourse requires time: time to read a long argument, time to compare sources, time to think. Social media fragments attention into seconds. Artificial intelligence accelerates production. The entire architecture of the contemporary information economy favours speed over reflection.


If idiocy is indeed gaining ground, it is not because human intelligence has diminished. It is because the environment in which intelligence operates has changed. We are asking minds evolved for small communities and slow communication to navigate a global torrent of stimuli.


The task, therefore, is cultural. It is to reassert — deliberately and consciously — the value of seriousness. To read long essays rather than summaries. To prefer primary documents to commentary. To reward expertise while remaining sceptical of arrogance.


There is no single lever that can be pulled to restore Enlightenment ideals. There is only a daily practice: of questioning, of verifying, of listening, of revising one’s own assumptions. These habits are unfashionable in an age of performative certainty. Yet they remain the only antidote to intellectual decline.


Idiocy may be loud. It may be algorithmically amplified. But it is not inevitable. The Enlightenment was never a completed project; it was an ongoing argument about how to live with reason. The question is not whether the gates of information can be closed — they cannot — but whether we are willing to cultivate the intellectual virtues required to walk through them without losing ourselves.


In that sense, the future of reason does not depend solely upon platforms or parliaments. It depends upon citizens — upon whether we choose the arduous discipline of thought over the intoxicating ease of outrage.

Note from Matthew Parish, Editor-in-Chief. The Lviv Herald is a unique and independent source of analytical journalism about the war in Ukraine and its aftermath, and all the geopolitical and diplomatic consequences of the war as well as the tremendous advances in military technology the war has yielded. To achieve this independence, we rely exclusively on donations. Please donate if you can, either with the buttons at the top of this page or become a subscriber via www.patreon.com/lvivherald.

Copyright (c) Lviv Herald 2024-25. All rights reserved. Accredited by the Armed Forces of Ukraine after approval by the State Security Service of Ukraine. To view our policy on the anonymity of authors, please see the "About" page.
