Would People Get in the Experience Machine?


Ilya Somin points to a recent discussion of what life would be like if we had virtual reality machines we could spend most of our lives in. He's right to note that this isn't a new debate arising from the technology we now have; it goes back (in its technological form) at least to the 1970s, with Robert Nozick's experience machine.

There may have been other forms of the discussion. In fact, I'd be surprised if it never came up with the Epicureans, although I know of no extant document raising a similar puzzle for them. It strikes me as odd that the Stoics or some other group wouldn't have raised the possibility of someone being misled about reality while experiencing pleasure, since that case cleanly separates the two views of what counts as a good life. Epicureans would have to find some contingent reason why it would be bad to get in the machine, e.g. it may break down, and then you'll miss it later, which constitutes pain (Epicurus' reason for never eating gourmet food), or someone might program in bad experiences while you're in it, and you'll never be able to get out to change the program back to what you wanted (which is similar to the Epicureans' response to the problem raised by an invisibility ring allowing you to get away with whatever you wanted). By contrast, most people's reason for not replacing real experiences with machine-generated ones (at least as a permanent lifestyle) is that it's not real. That's just a bad life.

Somin's post indicates that he's unsure whether people would turn their lives over to such a machine. His reason is that there are lots of people with lots of different preferences. I think he's right about there being variation in preferences, but I think we all have the same basic preferences based on what's really and truly good. We just make mistakes about what will get us those goods, and those mistakes might lead some people to get into the machine.

I'm a lot less sure than he is that there would be very high numbers of such people, though, at least if my students are any indication. I present this issue in pretty much every ethics class and every ancient philosophy class I teach. That's been somewhere between 30 and 60 students every semester for the last several years. Once in a while I get a student who says they'd get in the machine. It's never been more than two or three in any given class, and more often than not no one thinks they'd get in. Maybe this is skewed in a certain direction because they're college students or something, but I really have a hard time believing a large number of people would turn their whole lives over to a virtual reality just because it's possible to do so.


I seriously wonder if this is one of those things we like to think we would never do, but if confronted with the real opportunity, would we do what we say? For instance, what if you were put into such a machine without your knowledge and began living in this pleasant virtual reality? Then someone came along and made you aware that you were living in the machine, and you could choose between your smooth virtual life and a real, bumpy one. I would venture to guess more than a few people would stay in, or refuse to believe the whole thing as an excuse to stay in.

It's like TV. Would we say that we're willing to spend a huge percentage of our time zoning out in front of mediocre TV shows rather than getting out and living active, quality lives? Of course not, yet this is what a large portion of our population does voluntarily. So I just don't buy it. I think people are lying.

Ask instead how many people would simply try it out.

I think you would find that, in reality, the number of people who would 'choose' to spend their lives in these machines would be greatly influenced by addiction (fueled by 'innocent' escapism and experimentation).

The question you ask is like asking a group of people to say how many would prefer to be on heroin for the rest of their lives.

Alan, I'm asking the question for different reasons than Somin is. He's interested in whether it would be a social problem if we had this technology. I'm interested in whether people's preferences reveal their deeper philosophical commitments, such as a value theory that includes more goods than just pleasure and things you can be conscious of. So it's a perfectly good question in the context I use it in, even if what you're saying is right. But that would be an important consideration for Somin's question.

One worry I have about the comparison with heroin is that heroin involves an outright chemical addiction. This wouldn't be like that. So it couldn't be even as strong as the urge to keep smoking after trying it a number of times, never mind the addiction heroin forms after one try.

Fair enough; I guess I missed that slant of your question, although I think the psychological addictive pull of such a machine would be quite high, especially if you look at what are essentially poor man's versions of it (television, World of Warcraft, etc.).
