You step onto a Full Immersion VR device the size of a gym workout machine, and suddenly, you’re in another world.
A virtual world, where you can explore for miles, discovering ancient forests and fantasy cities. Where you can be a knight or sorceress or space pirate. Where the game world looks, sounds, smells, feels, and tastes just like the real world.
When you run, crouch, or jump on this device, your virtual avatar reacts in the same way. But not only that… this device moves you, too.
Riding through the countryside and fall off your horse? The device drops you in real life. Pick up a rock and heave it over a cliff? You’ll feel the weight of the stone in your arms.
In my novel, Path of Relics, the “Portal Rig” is just such a device (not to be confused with the Holotron pictured above… but more on that later, too). The Portal Rig “transports” players to the fantasy lands of North Näor and allows them to experience real-as-life virtual reality.
But just how realistic is the Portal Rig? And when can you buy one?
tl;dr – The tech of today is obviously not up to par with that of my fictional 2044, but we’re getting there. And someday, I believe we could have a full immersion VR device just like the Portal Rig, allowing us to live out our GameLit and LitRPG dreams. But the experience of Path of Relics is about more than the interface; it’s also about the game simulation.
So in today’s article, I’ll go over some of the modern tech I see as precursors to the fictional tech in my story. We’ll discuss the two major components of the experience: The Portal Rig, and the game itself.
Full Immersion VR vs Full Dive VR
Let’s start by defining what I mean when I say “full immersion.”
Full Immersion VR means actually moving your body in real life to manipulate your avatar in the virtual world. It means stimulating your body’s senses (mostly) in the way nature intended: a visual system for your eyes, haptics and force feedback for touch, and speakers for sound.
That’s different from Full Dive VR, like in Sword Art Online. In SAO, the Nerve Gear fits over the user’s head and manipulates brain waves, allowing them to have the entire experience in their mind. Ready Player Two featured a similar device (and The Matrix, for that matter).
This type of tech requires paralyzing the body in real life, so all those brain signals to run, jump, and kick don’t have you punting your cat from the living room.
In my opinion, we’re a LONG way off from Full Dive technology. Full Immersion VR, however, is much more doable. And when I imagine how much fun it’d be to exist in a fantasy world, swinging a sword and chasing after villains, I see the ultimate fitness machine in the making.
What Makes a Portal Rig?
When my main character, Terry, first sees the Portal Rig, he describes it as:
“a peculiar device [that] rocked and swayed like a raft on whitewater rapids.”
And it’s further detailed in this way:
“The Portal Rig, currently occupied by a man running at a full sprint, most closely resembled stair-stepper machines Terry had seen in commercials. Its base was large and flat, and tapered up into a wide pole ending chest height. From the pole, a mechanical arm craned out. The arm was jointed in multiple places and attached to a frame holding the man.”
Several components go into making the Portal Rig. Let’s go through each and consider the closest modern tech has come to reaching this ultimate gaming potential.
The Arm and Exoskeleton
In the Ready Player One movie, the main characters use an omnidirectional treadmill to move around in virtual reality. The issue is that something like that could never fully mirror reality.
What if you walk up a mountain in the game? Your treadmill stays flat. What if you mount a horse? Or suddenly develop the ability to fly?
While the RP1 book describes slightly more robust solutions to locomotion than the movie, I felt they were still too limited (all due respect to Ernest Cline!). So in my novel, I designed the Portal Rig with an arm and exoskeleton, giving it the ability to match any conceivable movements happening in-game to the occupant’s actual body.
A company called Holotron has developed prototypes for a full immersion rig that could one day become something like the Portal Rig. In the following video, you can see their latest efforts.
I’m not going to lie, this thing looks frightening! But notice how the movements of the device match those of the man’s avatar in the simulation. This kind of all-body manipulation will be necessary for full immersion.
The Headset

We already have virtual reality headsets, and each year, their visual capabilities improve. There are several consumer and professional headsets on the market, but for better or worse, one currently stands out above the rest…
The Meta Quest 2 is the current market leader for virtual reality. I have one, and it’s fun. But it is a far cry from “indistinguishable-from-reality” graphics.
To give you an idea, the field of view on the Meta—that’s how wide an area you’re able to see through the headset lenses—is about 90 degrees. The maximum width of human peripheral vision is just over 210 degrees, so wearing the Meta Quest is like peering at the virtual world through a pair of snorkel goggles.
Also, image resolution in the Meta is about 21 pixels per degree (PPD). Yet researchers estimate human eye resolution to be as high as 60 PPD.
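Pixels per degree is just resolution divided by field of view, so you can sanity-check those numbers yourself. A quick sketch, using the commonly cited Quest 2 figures (the exact specs vary by measurement method):

```python
def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Angular resolution: how many pixels cover one degree of view."""
    return horizontal_pixels / fov_degrees

# Commonly cited Quest 2 figures: 1832 horizontal pixels per eye,
# spread across roughly 90 degrees of horizontal field of view.
quest2_ppd = pixels_per_degree(1832, 90)
print(f"Quest 2: ~{quest2_ppd:.0f} PPD (human eye estimate: ~60 PPD)")
```

That works out to roughly 20 PPD, about a third of the upper estimates for human vision.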
Other headsets beat the Meta Quest on resolution, field of view, and other specs. Still, we’ve got a long way to go.
Force Feedback, Texture, and Temperature
This is another one of those “tricky-but-necessary” aspects of full-immersion. Even with the most realistic visuals, immersion breaks when your hands pass through the virtual walls. Or when you lift your game sword, but it has no weight (since it’s not actually there).
Force feedback aims to eliminate these limitations by simulating the physicality of the virtual world. It does this by providing a “counter force” to your movements.
The image above is a glove by a company called HaptX. The thin wires along the back of the fingers provide the force feedback. They pull back on your fingers when you grip something, preventing your hand from clipping through the virtual object. This gives the sense that the object is actually in your hands, since it feels tangible.
The gloves also have temperature elements to simulate heat and cold. And tiny “pressure dimples” line the fingers. An air pump manipulates these, and when inflated, they simulate texture so you can feel feathers or bricks or water. It’s a smart solution for immersion, which I’ve posted about before.
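The pull-back mechanism can be sketched as a simple spring model. This is purely illustrative (not HaptX’s actual control loop, and the stiffness value is invented): when the tracked finger would sink past a virtual surface, hold it at the surface and resist in proportion to the attempted penetration.

```python
def glove_counterforce(finger_pos: float, surface_pos: float,
                       stiffness: float = 500.0) -> tuple[float, float]:
    """Return (clamped finger position, restraining force in newtons).

    Toy spring model, illustrative only: positions are in meters along
    the finger's direction of travel toward the virtual surface.
    """
    penetration = finger_pos - surface_pos
    if penetration <= 0:
        return finger_pos, 0.0   # not touching: the glove offers no resistance
    # Hold the finger at the surface; resist proportionally to how far
    # it tried to push through.
    return surface_pos, stiffness * penetration

pos, force = glove_counterforce(finger_pos=0.012, surface_pos=0.010)
# The finger is held at the surface while the tendons apply a counterforce.
```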
Apply these concepts to your entire body (using something like the Rig exoskeleton and Exosuit in Path of Relics), and the virtual world becomes a very tangible place.
Haptics have been around a long time. They’re in your cell phone and your game controllers. Like force feedback, they’re another way to simulate the physicality of the virtual world: small motors vibrate, conveying a sense of impact, touch, and texture.
The Teslasuit (not affiliated with Elon) has taken haptics to the max, creating a full-body suit with vibration motors, electrical muscle stimulation, and more. This was certainly an inspiration behind the Exosuit in my story.
The Portal Helmet
In Path of Relics, I envisioned the Portal Helmet as a stepping-stone tech on the way to full-dive VR like in Sword Art Online. The Portal Helmet stimulates the brain to “fill in the gaps” of the Portal Rig.
It’s one of the more hand-wavy bits of sci-fi in the story, but it helped explain harder-to-mimic senses like taste and smell. Still, here’s some tech that could make it a reality.
You’ve likely seen EEG devices before in TV shows and movies: all those electrodes sitting on a person’s scalp, reading their brain waves. That tech continues to improve and get more compact.
These days, we already have consumer gaming devices using EEG, but the results are… not so spectacular. Most who test them report clunky controls and a novelty feel. Still, the fact these function at all is encouraging.
The Brain Lace referred to in Path of Relics was a not-so-subtle nod to Neuralink. The idea behind this tech is to insert tiny electrodes directly into the brain. These allow the device to “read” brain activity and use that information for beneficial tasks.
Think of someone with a spinal injury whose nerves can no longer communicate with their legs. Neuralink could read the brain signals of a kick, then wirelessly transfer those instructions to devices on the lower spine that tell the legs what to do.
But the crazy part is, it could also “write” to the brain. Meaning, if the same paralyzed person were to step on a hot coal near a campfire, those sensations of pain could travel in the opposite direction, from foot to brain. Among other things, that’s useful for avoiding roasted feet.
My gambit in Path of Relics was to say that, after years of using something like Neuralink, we’d learn a lot more about how the human brain functions. Then, employing machine learning, we could crunch all that data and achieve similar BMI (brain-machine interface) results using the over-the-scalp methods of EEG nodes.
This would eliminate the need for invasive surgery to experience the benefits of something like Neuralink. And that’s good, because undergoing open brain surgery to play another version of Skyrim is quite an ask. Then again, maybe not?
GVS (Galvanic Vestibular Stimulation)
Even with a state-of-the-art “Portal Rig” whipping you through the fantasy world, not all movements can be faithfully recreated. For example, what if you fall down a chasm? There’s no way something like a Portal Rig could continuously simulate that fall. And what about the sense of acceleration when driving a car or a spaceship?
To the rescue comes GVS, or Galvanic Vestibular Stimulation. I’ve included a video below of the technique in use. The concept is that an electrical current stimulates nerves in the inner ear that control balance. This has the effect of making you feel you’re moving, even if you’re standing still.
Now, if you’re just standing in a room, this can be problematic. If the GVS manipulates your sense of balance to think you’re moving left, your instinct is to correct by moving to the right. The problem is, you weren’t actually moving left, so you end up overcompensating and tipping over to the right.
But combined with something like the Portal Rig where you’re strapped in, the experience might be seamless.
The Game World
The ability to run forever in virtual reality is meaningless without the world itself. Just like with the Portal Rig, we’ve got a journey ahead before we can create a game as realistic as the one in Path of Relics. But we’re getting there.
The three main areas I see us needing to work on are graphics, procedural generation, and artificial intelligence.
Graphics

This is the furthest along of the three areas, which isn’t surprising considering computer graphics have been maturing for over thirty years now. But with advances like Unreal Engine 5, photorealistic video games seem closer than ever.
Check out the video below… yeah, that’s in a game engine; it’s not real.
Procedural Generation

If you’ve heard of No Man’s Sky, you know about procedural generation. Right now, it’s a mixed bag. The process relies on algorithms to automatically create parts of a game: plants, characters, worlds, galaxies.
Thing is, current tech doesn’t allow for a realistic sense of variety. This is improving, but to reach the depth required to simulate a real world, this technology will need to step it up.
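The core trick behind all of these systems is that a single seed deterministically expands into content, so an “infinite” world never has to be stored, only regenerated. A minimal sketch (the biome names and thresholds are invented for illustration):

```python
import hashlib

def cell_noise(seed: int, x: int, y: int) -> float:
    """Deterministic pseudo-random value in [0, 1) for a world cell."""
    digest = hashlib.sha256(f"{seed}:{x}:{y}".encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def biome(seed: int, x: int, y: int) -> str:
    """Derive a biome for a map cell from its noise value."""
    v = cell_noise(seed, x, y)
    if v < 0.4:
        return "forest"
    if v < 0.7:
        return "plains"
    return "mountains"

# The same seed and coordinates always regenerate the same terrain,
# so the game computes the world on demand instead of storing it.
assert biome(42, 10, 20) == biome(42, 10, 20)
```

Real engines layer smooth noise, erosion passes, and hand-authored rules on top of this idea, but the seed-to-content principle is the same.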
Fleshing out the details of just one fully simulated game character is too much work for any developer, or team of developers, to accomplish. Now, expand that to an entire world of characters, cities, meaningful quest-lines, an economy, a coherent history, etc.
Procedural generation would need to lend a BIG hand in all of it to reach the realism and scale of something like Path of Relics. Here are a couple of examples of where we are now.
I’ve followed Star Citizen for years now, even though I’ve never played it (mostly because I don’t have a powerful enough PC… or the time). Still, I’m fascinated by the game’s potential.
In Star Citizen, the dev team uses procedural generation to create entire planets of content, including colonial outposts, cave systems, rivers, flora, and more. No Man’s Sky does something similar, but the worlds of Star Citizen look more believable.
In the NPC department, the SC team is working on giving each game character random “idle” gestures to make them seem more real. Things like sneezing or scratching their face or stretching. And their “Quanta” system gives NPCs different demeanors and motivations to simulate rudimentary personalities. It’ll also keep track of past interactions with players, giving these game characters a kind of personal history.
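The combination of demeanor plus interaction history is easy to picture in code. Here’s a toy sketch (a much-simplified stand-in for ideas like the Quanta system, not its actual design; the names and dialogue are invented):

```python
class NPC:
    """Toy NPC with a demeanor and a memory of player interactions."""

    def __init__(self, name: str, demeanor: str):
        self.name = name
        self.demeanor = demeanor
        self.history: list[str] = []   # remembered player actions

    def interact(self, player_action: str) -> str:
        self.history.append(player_action)
        # Past kindness changes how even a grumpy NPC greets you.
        if self.demeanor == "grumpy" and "helped me" in self.history:
            return f"{self.name} softens: 'Ah, it's you again. Welcome.'"
        return f"{self.name} grunts noncommittally."

ferris = NPC("Old Ferris", "grumpy")
ferris.interact("browsed the stall")   # first visit: a grunt
print(ferris.interact("helped me"))    # remembered kindness changes the reply
```

Scale that memory up across thousands of NPCs and the world starts to feel like it remembers you.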
I’ve been experimenting with the beta version of Dall-E 2, and it’s interesting. You describe what artwork you want by typing it into the program. It then creates an image based on your text description. You can even specify what style of artwork you want, from impressionism to digital art to realistic.
Here are some of my experiments and the text prompts I used to achieve them…
“An illustration of backstreets in a fantasy-style city. Food stalls are on the side of the streets. Higher up, clothes hang on lines between buildings. In the foreground, four travelers in assorted armor walk down the center of the street, their backs to the viewer.”
“Santa Claus driving a green Jeep on the beach.”
It’s still quirky and often difficult to get exactly what you’re looking for, but this type of “human world” understanding is necessary to create believable virtual worlds.
Artificial Intelligence

This is the largest bottleneck to building a world like Path of Relics. Personally, I’d prefer AI like my fictional game developers intended… almost indistinguishable from real humans, but not actually sentient. And while even that is a long way off, researchers have made strides in this department.
Specifically, language models like GPT-3 allow AI to comprehend the information you type. For example, in the program AI Dungeon, the AI serves as a virtual Dungeon Master. It gives you the beginning of a story, and you type in what you want to do next.
Maybe you’re in a forest clearing, and you say you want to explore the mountains nearby. The AI fills in the blanks and writes out what happens, providing the next leg of your adventure. Then you decide something else to do.
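That turn-taking loop is simple to sketch. In the snippet below, `generate_continuation` is a placeholder standing in for a language-model call (e.g., a GPT-3-style completion API), not a real API; it just returns canned text so the loop is runnable:

```python
def generate_continuation(story_so_far: str) -> str:
    """Placeholder for a language-model call. A real system would send
    the story as a prompt and return the model's continuation."""
    return "You crest the foothills as storm clouds gather over the peaks."

def adventure_turn(story: str, player_action: str) -> str:
    """Append the player's action, then let the 'AI DM' continue the tale."""
    story += f"\n> {player_action}\n"
    return story + generate_continuation(story)

story = "You stand in a forest clearing. Mountains loom to the north."
story = adventure_turn(story, "I head for the mountains.")
print(story)
```

Each turn, the entire story so far becomes the prompt for the next continuation, which is why these AI DMs can (loosely) stay consistent with what came before.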
And below is a video of a similar process, except instead of typing, the developer is talking to the NPCs. The language model listens to what he says, then has the NPCs respond using voice. You can see there’s a lag as the algorithm processes the information, but, small steps.
We’re Getting There
There are many reasons I wrote Path of Relics as my first novel, but one of them was as an outlet to experience full immersion VR.
We just aren’t there yet, but fiction is one of the few ways we can glimpse what something like the Portal Rig might be like. Ready Player One is one such story (one I greatly enjoyed), but I wanted to add my imagination to the mix.
And by the way, if you haven’t read my novel yet, you can get it here – Get Path of Relics on Amazon.
In my story, they introduce the Portal Rig to the world in 2044, but I hope we achieve full-immersion long before that.
What about you? What experiences would you love to have in something like the Portal Rig? (PG-13 tops, please! ;)
VR Steampunk Woman by Ady Setiawan