Love in the Experience Machine
A year ago, I wrote a post introducing Robert Nozick’s idea of an “experience machine”, a simulator you can plug into and experience a perfect world. There you need not endure pain, grief, or sorrow. You can visit places more beautiful than any that exist on earth, and even the small inconveniences of life are replaced by feelings of bliss and happiness. In my article, I defended the experience machine on utilitarian grounds:
[Arguments attacking the experience machine] all seem to make an implicit assumption that there is more to reality than what can be experienced but fail to give any reason on why this should be true. For me, the only thing that matters is the presence of pleasure and the absence of suffering and whether these are only “not real” in a cosmic sense is of no consequence as they certainly would feel real to the being experiencing them.
My commitment to this principle is now being tested. Recently, I learned about a company called Luka, which offers a “Replika”: a system that lets users create a simulated person according to their own specifications and interact with it through a chat interface. The AI can give you compliments, talk with you about your day, and even send you AI-generated images. The company claims to have had about 1 million users, of whom 35–40% are looking for a romantic relationship.
In a way, Replika is an experience machine. It offers some of the comforts of human companionship even if you are, for whatever reason, unable to have such a relationship in the physical world. Assuming we ignore the second-order effects such a system has on its users and on society (of course we can’t, and governments have taken notice), I should not have negative feelings about the bot; instead, I should cherish the idea. But I do not. To be clear, I do not want to make fun of the users or dismiss their experiences as “not real”. Still, the thought of forgoing human relationships in favor of an algorithm running on a nameless server in the basement of an Amazon warehouse deeply saddens me, whether it’s my own life or the life of another person.
On utilitarian grounds, I cannot defend my position, and this post does not attempt to resolve my cognitive dissonance. I just want to note that this is the first time I have become aware of a non-hypothetical experience machine, and what my reaction to it is. Given the recent advances in AI, I suspect it won’t be the last.