There's an infamous post on a Reddit forum dedicated to retirement and financial advice. The author concludes that the currency of life is time, not money – not exactly a novel idea, but I found myself thinking of it today.
Technology in the 21st century, namely the mass adoption of personal computing, was a monumental shift for humanity in part because it marks a major milestone in our move away from manual labor and into cognitive labor. This technology promised what any great technology seeks: efficiency, the ability to do what couldn't be done otherwise, and a better quality of life for all.
But the feedback I get from many of my peers is that they feel overly connected to technology and seek joy in forms that are purposefully disconnected. Computing is interwoven into so many facets of our lives – work, socialization, utility – that even a momentary break from our phones feels like a reclamation of our humanity.
So it seems a logical leap, particularly given the prevalence of science-fiction themes in popular culture, to assume AI is bound to accelerate our sense of connection with technology – a prospect I can appreciate that not many people welcome.
I'd like to suggest that AI has a non-zero chance of actually improving our relationship with technology. In fact, AI may just help us reclaim lost ground.
Reclaiming Lost Ground
When I think of the parts of my own humanity I have lost to technology, even if temporarily, I think of things like: tech addiction, decreased face-to-face interaction, information overload, a sedentary lifestyle, and eroded privacy norms.
Tech Addiction & Information Overload
Anyone familiar with addiction can attest: breaking it is not a matter of willpower. More often, the people who succeed put themselves in situations where they don't have to rely on willpower alone. Each time I turn off notifications for a new application on my phone, I feel a small sense of victory – a reclamation.
It is easy to feel overloaded with information from apps built on attention-optimizing algorithms. These systems hijack your biology, capitalizing on your brain's reward center to keep you scrolling.
In reaction, there is growing interest in "digital detox," particularly among Gen Z and Millennials. "Dumbphones" are even on the rise.
We're Going on a Bear Hunt
I believe that if you were to zoom out on the timeline of humanity, there is a non-zero chance this moment in AI marks a shift from computing directly to computing in a way that feels so natural, it is indirect by comparison. Perhaps we grapple with the growing pains of tech not by going around it, but through it.
Consider a small example like Ray-Ban and Meta's new smart glasses. There's no display. They're just a really good pair of glasses that happen to have AI and bone-conduction headphones built in. With their built-in camera, I can leave my phone behind when I go on hikes and still enjoy sharing photos after the activity, allowing me to remain more in the moment.
It doesn't take a great leap of faith to imagine that kind of technology paired with other new advances in the AI space. Take Groq's invention of the LPU, for instance. When they demonstrated large language models running at near-instant speeds, a world of possibilities opened up. Imagine having a conversation with an AI assistant that actually understood cadence and could respond without perceptible delay or direct prompting.
Or take, for instance, the many AI hardware products of last year. One of them, the Rabbit R1, promises an entirely new kind of operating system (built on what they call a large action model, or "LAM") that completes actions on request by interacting with the apps you use every day. Imagine speaking your standup notes to your digital assistant and having them automatically reported through your team's proper channels. Or imagine never having to schedule another meeting or appointment again, as Google demonstrated with Duplex in 2018.
How long will it be before even more natural interfaces emerge? Microsoft famously contributed to this space through its early exploration of micro-interactions. Now, 14 years later, we're starting to see these in real products.
From Microsoft's 2010 paper titled Interactions:
"We have progressed from batch computing with punch cards, to interactive command line systems, to mouse-based graphical user interfaces, and more recently to mobile computing. Each of these paradigm shifts has drastically changed the way we use technology for work and life, often in unpredictable and profound ways."
What if you could compute just by thinking? There is, of course, Elon Musk's approach of putting a chip in your brain. But did you know Apple holds an AirPods design patent featuring EEG capabilities to monitor brain activity? And that AI can now passively translate words and sentences from real-time EEG data? What sounds creepy to some makes communication possible at all for others. A computing ecosystem you don't have to hold in front of your face allows us all to be more connected to the moment, to each other, and to our work.
The Future
Through more natural interactions and interfaces, might we ditch the small, bright rectangular screens we've all grown so accustomed to these last 16 years? Ironically, integrating technology more deeply into our world may have the unexpected effect of making it feel more out of the way. I believe there is a chance AI enables us all to disconnect from technology and dream of a new, better future.
The big questions I think we should be asking ourselves: if we are in the middle of a shift from manual labor to cognitive labor, what's next? What do we want to accomplish, individually and together, that we could not before? What kind of culture do we want to collectively build? Where is the joy in tomorrow that could not be found in the world we live in today?
© Elijah Kleinsmith • All Rights Reserved