Over the last week two trains of thought merged. The first: I’m falling out of love with social media, despite having been extremely online for more than 20 years. The second: excessive phone use has definitely made me dumber (no sniggering at the back, there) and it’s time to take corrective action. These reflections are linked by a transition in tech and culture so radical that at present I can only grasp its edges: AI, or (to put it another way) the industrialisation of thought.
AI is central to the decline of social media platforms. It’s also, if misused, the greasy chute to cognitive atrophy. And I think the upshot of these twin phenomena is already visible: a bifurcation between those who enjoy the upsides of this industrialisation of thought without falling victim to its temptations, and those growing increasingly cognitively adrift under the algorithms. If you don’t want to be among the latter, the time to take evasive action is now.
Of course the digital revolution, including its less desirable effects on cognition, is not just about AI. That revolution began in earnest when social media swapped long-form media consumption (even movies are relatively long-form) for the distraction economy and limbic capitalism enabled first by the internet, and then - to a far more pervasive and revolutionary degree - by internet-enabled smartphones.
Covid lockdowns were the inflection point. We are now a digital-first culture. And there’s already plenty of dismayed commentary on the cognitive impact of this shift, especially as delivered via smartphones. Just recently, the pseudonymous American college professor “Hilarius Bookbinder” described the decline he has observed in his students over recent years:
Most of our students are functionally illiterate. This is not a joke. […] I’m not saying our students just prefer genre books or graphic novels or whatever. No, our average graduate literally could not read a serious adult novel cover-to-cover and understand what they read. They just couldn’t do it. They don’t have the desire to try, the vocabulary to grasp what they read, and most certainly not the attention span to finish.
Bookbinder reports that writing is just as bad:
Their writing skills are at the 8th-grade level. Spelling is atrocious, grammar is random, and the correct use of apostrophes is cause for celebration. Worse is the resistance to original thought. What I mean is the reflexive submission of the cheapest cliché as novel insight.
He connects this decline in concentration, vocabulary, reasoning, spelling, and capacity to think long-form squarely to smartphone use, reporting that his students can’t even get through a 50-minute seminar without leaving the room to check their phone, or scroll on a laptop while pretending to type notes.
It’s not just one college professor. Writing for the Financial Times recently, John Burn-Murdoch argued that it’s not a coincidence that global measures of IQ peaked in the early 2010s, as this is the point at which smartphones became pervasive. Since that point, a decisive switch away from reading to scrolling as the default mode of information consumption has begun to have the effects described by Professor Bookbinder: degraded concentration, verbal reasoning, vocabulary, and capacity to think full stop.
All this has been radically accelerated by the maturing of AI as an active force in postmodern culture. But before I get into this, I want to make two things clear.
Firstly, none of what follows is pure anti-tech polemic, or not just that. I am not proposing we deactivate the internet and all who sail in her, as if that were even possible. But if we are to stay human through the digital transition, we have to look clearly at those ways our digital tools are shaping us - the bad as well as the good.
Secondly, I take the view that “AI” is a misnomer. I think excited prophecies of “artificial general intelligence” radically misunderstand what consciousness is. AI is not “intelligent” as humans are intelligent, and never will be.
As I argued here, it would be more accurate (if less snappy) to describe AI as “powerful modelling and prediction tools based on pattern recognition across very large datasets”. It is, in other words, not a type of cognition in its own right, but - to borrow a term from Marshall McLuhan - one of the “extensions of man”: specifically a means of extending cognition itself.
In extending cognition, AI industrialises what it extends - that is, re-orders ever more of our thinking to the market. And as we’re beginning to see, the same dynamic attenuates self-reliance in whatever is industrialised. Someone who lives on takeaway has no reason to learn to cook from scratch, and someone who has handed off manual labour to a set of powerful machines may grow soft and weak compared to his physically labouring forebears. In the same way, someone who hands off cognitive tasks such as research, synthesis, or summary generation to a digital extension may grow cognitively more flabby as a result.
There are several facets to this, but I’ll name two principal ones.