Where does your comfort end and surrender begin? I've been wondering this lately, watching how seamlessly we've let technology slip into the driver's seat of our lives. We're not standing at some distant future crossroads; we're already well down the path, handing over the keys bit by bit. But at what point do we stop writing our own stories?
Each little decision we hand over feels almost meaningless on its own. Who cares if Spotify picks your next song or Netflix your next show? The aggregate, however, amounts to something profound. Thomas Aquinas, the Dominican friar and theologian, might have called this gradual surrender a weakening of our moral muscle. Like any muscle, our capacity for meaningful choice only strengthens through regular use. The whole concept of choice assumes someone is doing the choosing, weighing possibilities against values, desires, and intuitions born from lived experience. When we outsource these decisions to systems running on entirely different principles, we're wandering into uncharted territory.
Recently, the Vatican issued 'Antiqua et Nova', a note on the relationship between artificial intelligence and human intelligence. It speaks to something I've felt but couldn't quite name: this strange tension between timeless questions about what makes us human and the uncharted waters of technological partnership. Invisible threads of algorithms now weave through pretty much everything we do. They manage our money, orchestrate our work, delegate tasks to our teams, even choreograph our family calendars. We've welcomed these digital companions into the most intimate corners of our lives with barely a ceremony, letting them curate what we read, filter who gets interviewed, even suggest what might be wrong with our bodies when we're feeling off. At work, these tools have graduated from assistants to authorities: optimising strategies, distributing tasks, increasingly passing judgment on human performance. And honestly? I go along with it, because they add real value, making life smoother and less friction-filled. But something nags at me. In this frictionless glide toward optimisation, something vital is being polished away: the rough, beautiful texture of human choice.
The math of convenience versus agency gets complicated quickly. We save time, sure, but at what cost? The price is something less tangible but perhaps more precious: that almost sacred gift of determining our own paths. Take something seemingly trivial like our tastes in music, books, or art, supposedly the most personal and subjective parts of our identities. Recommendation engines now shape these preferences, filtering out incongruity, creating cosy echo chambers of artistic exposure. I sometimes wonder, in theological terms: can beauty, the glimpse of divine harmony that stops us in our tracks, really be encountered through such filtered experiences? What happens to those beautiful accidents, stumbling across something that challenges rather than confirms who we think we are? Are we creating feedback loops that narrow rather than expand our horizons? It reminds me of what Augustine of Hippo might recognise as a narrowing of the soul's capacity for wonder.
The path leads not only to automation but also to permission and proxy. The French philosopher Jean-Paul Sartre argued that we define ourselves through action. So what happens when those actions are increasingly curated, suggested, or carried out by something other than us? If AI picks our investments while we sleep, anticipates our illnesses, and even predicts whom we might love, aren't we becoming passengers rather than drivers of our own existence? Authentic choice assumes awareness: an understanding of our options, our decisions, and their consequences. As these processes disappear behind interfaces designed to eliminate friction, we may find ourselves with more time but less agency, more convenience but less authorship.
The political theorist Hannah Arendt located human agency in the spontaneous and the unpredictable, exactly what machine learning tries to pattern and predict. Wisdom earned through failure, unexpected connections that spark breakthroughs, the whole beautiful mess of human growth: are we sanding away these essential imperfections in our relentless pursuit of optimisation? The concept of metanoia, or genuine transformation, depends on the unexpected, those unplanned moments of grace that break through our careful planning. 'Antiqua et Nova' reminds us that if free will represents a divine gift, automated decision-making fundamentally challenges the spiritual dimension of being human.
Beyond all this theory, there are immediate boundaries to draw. AI isn't approaching; it has already arrived and made itself comfortable in the inner sanctums of our decision-making. Some companies have started experimenting with automated management, where employees report not to human bosses but to systems optimising performance with cool precision. The efficiency is obvious. The deeper implications, less so. Our task isn't deciding whether AI shapes our lives, but how much shaping we permit. The contours of this negotiation remain fuzzy, happening piecemeal across disciplines, cultures, and values, often without any explicit acknowledgment of the stakes.
'Antiqua et Nova' offers a guiding principle for drawing these boundaries: technology should serve authentic human connection rather than replace it, and enhance our capacity for self-gift rather than diminish it. Wisdom lies in discerning which aspects of choice must remain unmediated if they are to shape our humanity, and which hold potential for enhancement through technological partnership. Perhaps the ancient practice of discernment, weighing options in light of ultimate values, becomes more, not less, essential in this new landscape. In that moment of conscious boundary-setting, perhaps we will finally understand what we must preserve.
Chris Neff is Anomaly's global head of emerging experience and technology.