Tango1 pointed out a new gaming technology, and while I have no problem with Armand's posting, I do with something in the article, so I am ranting here instead of poisoning his thread.
The offending phrase: "the device's ability to immerse wearers in virtual worlds."
Technology does not immerse people in an artificial environment; people cognitively immerse themselves in an artificial environment. It bothers me that we waste so much effort (money, time) chasing technology (and I am an overeducated EE/CS guy) when it is not always, or even often, the most important factor.
What do I mean by saying the person immerses themselves rather than the technology doing it? Movies are another technology for immersing a person in an artificial environment. You will happily watch and enjoy a crappy movie about something that piques your interest, but you can't bear a well-made movie about something for which you don't give a tinker's dam. This factor in immersion is extrinsic to the technology.
Granted, a poor implementation of the technology can make the immersion bad. But people often extend that idea into a perverse converse and imply that adding more working technology can make the immersion better. Having a broken door on the outside of your house is bad for its security. Replacing it with a better-constructed door will fix the problem. But continuing on and adding a dozen more doors to the outside of your house doesn't make it any more secure. The fact that adding a new door improved security in one case doesn't mean that adding more doors keeps improving it. I see this type of logic applied all the time to gaming and simulation.
Visual stimulation is certainly an important extrinsic factor for video games (tabletop games, too!). But that doesn't necessarily mean more realism, especially since the desire for novelty is closely linked with the desire for visual stimulation. Eight-bit pixelated art has a charm of its own, one quite different from the visual attraction of something like Muramasa. The fact that two disparate approaches like these (neither of which is realistic) can be visually intriguing and entice players to relate to the characters and situations in the games should illustrate the lack of intrinsic importance of visual realism.
It bothers me when people (especially people investing money) talk about "digital natives" and how they can't relate to anything but (or relate best to) a high-fidelity digital environment. Those people are not of my generation. By which I mean they never stood in line outside a Barnes and Noble with a bunch of parents and school kids from 8pm, when it closed, until midnight the next day to get their reserved copy of the new Harry Potter book. Like eight times. That is immersion with no high-fidelity technology at all.
So what is technology's role in immersion? Simple. It can create stimuli that align with unmet extrinsic needs. My real issue is that we need to let those extrinsic needs drive our decisions, rather than picking a part of the technology and improving it in the hope that it makes things better. More isn't better. More of something relevant that isn't completely covered yet is better.