The New Hatred of Technology

People have never been better, here in the Year of Our Simulation 2024, at hating the very forces underlying that simulation—at hating, in other words, digital technology itself. And good for them. These everywhere-active tech critics don’t just rely, for their on-trend position-taking, on vague, nostalgist, technophobic feelings anymore. Now they have research papers to back them up. They have bestsellers by the likes of Harari and Haidt. They have—picture their smugness—statistics. The kids, I don’t know if you’ve heard, are killing themselves by the classroomful.

None of this bothers me. Well, teen suicide obviously does (it's horrible), but it's not hard to debunk the arguments blaming technology for it. What is hard to debunk, and what does bother me, is the one exception, in my estimation, to this rule: the anti-tech argument offered by the modern-day philosopher.

By philosopher, I don’t mean some stats-spouting writer of glorified self-help. I mean a deepest-level, ridiculously learned overanalyzer, someone who breaks down problems into their relevant bits so that, when those bits are put back together, nothing looks quite the same. Descartes didn’t just blurt out “I think, therefore I am” off the top of his head. He had to go as far into his head as he humanly could, stripping away everything else, before he could arrive at his classic one-liner. (Plus God. People always seem to forget that Descartes, inventor of the so-called rational mind, couldn’t strip away God.)

For someone trying to marshal a case against technology, then, a Descartes-style line of attack might go something like this: When we go as far into the technology as we can, stripping everything else away and breaking the problem down into its constituent bits, where do we end up? Exactly there, of course: at the literal bits, the 1s and 0s of digital computation. And what do bits tell us about the world? I’m simplifying here, but pretty much: everything. Cat or dog. Harris or Trump. Black or white. Everyone thinks in binary terms these days. Because that’s what’s enforced and entrenched by the dominant machinery.
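
If "stripping everything away until you hit the bits" sounds abstract, here's a toy sketch of my own, in Python—nothing from Evens' book—of what the reduction looks like in practice: a phrase and a number, rendered as the 1s and 0s the machine actually stores.

```python
import struct

# A toy reduction (mine, not Evens'): a phrase and a number,
# stripped down to the bits they are actually stored as.
text = "cat or dog"
text_bits = "".join(format(byte, "08b") for byte in text.encode("utf-8"))
print(text_bits)  # a long run of 1s and 0s, nothing else

# Even a "continuous" quantity like pi lives in the machine as 64 fixed bits.
pi_bits = "".join(format(byte, "08b") for byte in struct.pack(">d", 3.141592653589793))
print(pi_bits)
```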

Or so goes, in brief, the snazziest argument against digital technology: “I binarize,” the computers teach us, “therefore I am.” Certain technoliterates have been venturing versions of this Theory of Everything for a while now; earlier this year, an English professor at Dartmouth, Aden Evens, published what is, as far as I can tell, its first properly philosophical codification, The Digital and Its Discontents. I’ve chatted a bit with Evens. Nice guy. Not a technophobe, he claims, but still: It’s clear he’s world-historically distressed by digital life, and he roots that distress in the fundaments of the technology.

I might’ve agreed, once. Now, as I say: I’m bothered. I’m unsatisfied. The more I think about the technophilosophy of Evens et al., the less I want to accept it. Two reasons for my dissatisfaction, I think. One: Since when do the base units of anything dictate the entirety of its higher-level expression? Genes, the base units of life, account for only some submajority percentage of how we develop and behave. Quantum-mechanical phenomena, the base units of physics, have no discernible bearing on my everyday physical actions. (Otherwise I’d be walking through walls—when I wasn’t, half the time, being dead.) So why must binary digits define, for all time, the limits of computation, and our experience of it? New behaviors always have a way, when complex systems interact, of mysteriously emerging. Nowhere in the individual bird can you find the flocking algorithm! Turing himself proved, with his halting problem, that you can't always look at computer code and know, completely, what'll happen.
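
And because the flocking example carries a lot of weight here, a minimal sketch of what I mean by emergence, in the spirit of Reynolds' boids rather than anything from Evens or Turing: each simulated bird follows two purely local rules, and no rule anywhere mentions a flock, yet the headings tend to line up all the same.

```python
import random

# A minimal, illustrative sketch (mine): each "bird" follows only local rules,
# nudge toward nearby neighbors and match their average heading.
# Nowhere below is there a rule that says "form a flock."

N, STEPS, NEIGHBOR_RADIUS = 30, 200, 15.0

birds = [
    {"x": random.uniform(0, 100), "y": random.uniform(0, 100),
     "vx": random.uniform(-1, 1), "vy": random.uniform(-1, 1)}
    for _ in range(N)
]

def neighbors(b):
    return [o for o in birds if o is not b
            and (o["x"] - b["x"]) ** 2 + (o["y"] - b["y"]) ** 2 < NEIGHBOR_RADIUS ** 2]

for _ in range(STEPS):
    for b in birds:
        near = neighbors(b)
        if not near:
            continue
        # Rule 1: steer slightly toward the local center of mass (cohesion).
        cx = sum(o["x"] for o in near) / len(near)
        cy = sum(o["y"] for o in near) / len(near)
        b["vx"] += 0.01 * (cx - b["x"])
        b["vy"] += 0.01 * (cy - b["y"])
        # Rule 2: nudge velocity toward the neighbors' average (alignment).
        b["vx"] += 0.05 * (sum(o["vx"] for o in near) / len(near) - b["vx"])
        b["vy"] += 0.05 * (sum(o["vy"] for o in near) / len(near) - b["vy"])
    for b in birds:
        b["x"] += b["vx"]
        b["y"] += b["vy"]

# After enough steps, nearby birds' headings tend to correlate:
# the "flock" emerges from rules that never mention one.
mean_vx = sum(b["vx"] for b in birds) / N
mean_vy = sum(b["vy"] for b in birds) / N
print("average heading:", round(mean_vx, 2), round(mean_vy, 2))
```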

And two: Blaming technology’s discontents on the 1s and 0s treats the digital as an endpoint, as some sort of logical conclusion to the history of human thought—as if humanity, as Evens suggests, had finally achieved the dreams of an Enlightened rationality. There’s no reason to believe such a thing. Computing was, for most of its history, not digital. And, if predictions about an analog comeback are right, it won’t stay purely digital for much longer. I’m not here to say whether computer scientists should or shouldn’t be pushing chips in an analog direction, only to say that, were it to happen, it’d be silly to claim that all the binarisms of modern existence, so thoroughly inculcated in us by our digitized machinery, would suddenly collapse into nuance and glorious analog complexity. We invent technology. Technology doesn’t invent us.
