In the last blog post we looked at the differences between augmented reality (AR) and virtual reality (VR), and how these technologies are poised to revolutionize not just how we entertain ourselves, but also the way we do everything else, from science and medicine to education to product design and manufacturing simulation. In this article, we want to look at a fast-evolving aspect of VR called “haptics.”

Up until recently, virtual reality technology focused mainly on two of our senses: sight and sound. VR users typically wear headsets composed of goggles that feed them artificial visual information, such as a fantasy location, and earphones that supply the audio that goes with it. Together, they create a fairly realistic sensation of being in another place.

Lately, however, VR has been taking on a third sense: touch. A technology known as “haptics” is making VR a far more powerful and useful tool, taking it to the next level of immersive experience.

The Origins of Haptics

The term “haptic” comes from the Greek words haptós, meaning “palpable,” and haptikós, meaning “suitable for touch.” It refers to the science of applying touch (tactile) sensation and control to interaction with computer applications. Haptic communication (also sometimes called “kinesthetic communication”) uses forces, vibrations, or motions applied to the user to create the sensation of feeling.

Haptics in some form or another has been around for a while, and its use in video games is nothing new. Simple vibration has been used to heighten the sense of reality in games for decades. Gaming vests, for example, use electromagnetic technology to transform acoustic signals into vibrations against a player’s chest (like wearing a subwoofer) whenever a virtual explosion or physical contact occurs in the game. These vests have since evolved and can now convert electrical, hydraulic, and pneumatic energy into vibration that makes the gaming experience feel more real.

Haptics: Taking VR to the Next Level

Now VR is beginning to incorporate haptics, too. In practical terms, building haptics into VR means that users will now be able to feel their artificial environment, in addition to seeing and hearing it, making it a more lifelike experience. (Imagine being able to reach out and feel like you’re touching a wall that isn’t really there.)

There are already haptic suits for VR games that use vibration, and even one that uses a lightweight exoskeleton to apply physical forces to the body. But in an exciting new development, scientists at the Hasso Plattner Institute in Germany have been exploring electrical muscle stimulation, which goes beyond vibration and actually allows users to feel virtual objects.

The technology works by electrically stimulating the muscles that oppose a desired movement, creating the sensation of weight and resistance, especially when combined with VR imagery. When a user “picks up” a virtual object, a counterforce is generated and they can feel the object; the harder they press against it, the more counterforce is generated.
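To make the idea concrete, here is a minimal, purely illustrative sketch (in Python) of how a system like this might decide how strongly to stimulate the opposing muscle. The function name, scaling factor, and stiffness values are our own assumptions for illustration; this is a toy model, not the researchers’ actual implementation.

```python
# Toy model of EMS-based force feedback (illustrative only).
# Idea: when the user presses on a virtual object, stimulate the opposing
# (antagonist) muscle with an intensity proportional to the simulated counterforce.

MAX_INTENSITY = 1.0  # normalized upper bound on stimulation in this toy model


def counterforce_intensity(press_force: float, object_stiffness: float) -> float:
    """Map the user's press force and the object's stiffness to a stimulation level.

    press_force      -- how hard the user is pushing on the virtual object (newtons)
    object_stiffness -- per-object scale factor: stiffer objects push back harder
    """
    counterforce = press_force * object_stiffness   # harder press -> stronger push-back
    intensity = counterforce / 50.0                  # arbitrary scaling into [0, 1]
    return min(max(intensity, 0.0), MAX_INTENSITY)   # clamp to the safe range


# Example: pressing on a stiff "crate" versus a soft "sponge"
print(counterforce_intensity(press_force=20.0, object_stiffness=2.0))   # ~0.8
print(counterforce_intensity(press_force=20.0, object_stiffness=0.25))  # ~0.1
```

The key design point is simply that the feedback scales with how hard the user pushes, which is what creates the illusion of a solid object pushing back.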

The technology is still young: users currently have to wear a backpack in addition to the headset, along with sensors stuck to their arms. But the day is not far off when these sensors will be built into clothing such as specialized suits and gloves, creating a seamlessly immersive experience for the user.

Other Applications for VR and Haptics

Of course, there’s more to life (and VR) than just gaming.

Take, for example, the work of a company called Virtalis. Virtalis has developed VR software systems that help businesses and organizations use their own data to visualize, simulate, and immerse themselves in VR experiences related to their own operations. Its incredible “Visionary Render” system provides a virtual environment in which a company’s engineers and designers can create, access, simulate, test, and refine their products in 3D, saving the organization huge sums compared with doing the same work in the real world: building physical prototypes, finding mistakes, and rebuilding them. (Watch the video here; it will blow your mind.)

Visionary Render doesn’t even incorporate haptics yet, but imagine what it will be like when it does. Those days may be just around the corner. According to Virtalis, haptics is the next step in the evolution of VR. “Haptic devices provide a tactile interface to radically change the way we use computers…Haptic technology is now maturing and coming out of research laboratories and into real products and applications.” In fact, Virtalis has developed some amazing VR systems that do use haptics to provide an incredibly immersive user experience.

Teaming up with a company called Freeform Studios, Virtalis came up with their revolutionary “Touch and Discover Systems.” Touch and Discover includes the Probos system, which allows visitors to museums to virtually “touch” priceless artefacts.

There are also the Haptic Cow and Haptic Horse veterinary training applications, which let students perform a virtual internal exam of a cow or horse and actually feel its internal organs, preparing them for real-life situations. “Your brain is telling you there is nothing there, but your fingers are telling you that they are exploring and feeling the virtual image your eyes are seeing. It’ll surely be not too many years before all virtual models are haptically enabled as a matter of course.”

The Future of Business and Manufacturing

Haptics technology is taking VR to the next level and will soon be an integral part of how businesses operate: designing and testing products; training employees in hyper-real but safe simulated environments; virtually assembling production lines and testing their efficiency before committing to expensive real-life installations; and more. The potential for huge cost savings alone will make haptics-enabled VR a permanent feature of the business and manufacturing landscape in the not-so-distant future.

Applications for Simutech’s Troubleshooting Skills Training System™ Software?

All of this has us wondering: is there an opportunity to use haptics in our TSTS™? We’ve had customers suggest that some kind of tactile “feedback” would be useful for trainees when they make a mistake, perhaps some “feedback” when they virtually electrocute themselves? Let us know your thoughts in the Comments section below!
