Human as Somatic Supercomputer

I listened to Michael Pollan on the Ezra Klein podcast this morning and immediately thought of a conversation I had with my friend Mara Naselli about how our somatic senses, our ability to see, hear, touch, smell, and taste, are like a human supercomputer.

Pollan describes a moment when he was documenting his inner thoughts for a research experiment: "I was standing in line at a bakery deciding whether to buy a roll. When I was at that bakery standing in line, there was the smell of baked goods and cheese… there was the image of this woman in front of me wearing a loud, plaid skirt, hideous… there was my awareness of the other people there, 'did I recognize anyone'… my thoughts were so inter-infected by one another, one thought coloring the next…" (Michael Pollan, The Ezra Klein Show, 5:05–5:50, March 31, 2026)

The researcher's conclusion was that Pollan had "very little inner life." I thought the exact opposite. What Pollan was describing in that bakery is an incredibly fast processor of multi-sensory inputs, one that took a single unremarkable moment in a bread line and turned it into a smell, a sight, a social observation, an aesthetic judgment, a hunger pang, and a purchase decision. All automatically and simultaneously. What the researcher called "little inner life," I'd call computing.

Our body is our operating system. AI has no body. It doesn't breathe, get goosebumps, or feel the exhaustion of a long day or the exhilaration of jumping into ocean waves. It can describe these things — fluently, beautifully, in seventeen languages — but only if we experience them first and describe them for it.

Our bodies can do more than impress algorithms, though social media has been training us to optimize how we look on camera, how we sound in a clip, and how many seconds we hold someone's attention. We've been using our OS to serve the apps. Let's try to remember which one of us actually has a body.

April 2026
