Illustration: Benjamin Currie/Gizmodo

In an ideal world, virtual reality would be just like regular reality, except without regular reality’s annoying restrictions on human flight, teleportation, endless/consequence-free consumption, etc. As it is, VR is inescapably VR: you can lose yourself, but only to a point. This has to do partly with the technology’s bulkiness, but also with its sensory limitations. Smell and taste, two fairly major components of regular reality (key to memory, present-time enjoyment, and more) do not factor into commercial VR equipment. Of course, this problem isn’t unknown to the scores of researchers presently working on VR, and over the last few years, in labs all over the world, these researchers have made real advances in the realm(s) of sensorial verisimilitude. For this week’s Giz Asks, we reached out to some of them for insight into these developments.


Krzysztof Pietroszek

Assistant Professor, Communication, American University, and Founding Director of the Institute for Immersive Designs, Experiences, Applications and Stories (Institute for IDEAS)

Technology that lets you simulate other senses is already available in various commercial forms. Teslasuit (no connection to Tesla) makes haptic suits that let you feel as if you were hit when you’re shot in a game, for instance. Not a very strong hit, but still, a hit. HaptX, meanwhile, makes a glove that serves as a kind of exoskeleton, providing force feedback and simulating what it feels like to close your hands around a steering wheel or an apple. There’s also the Digital Lollipop, made by Nimesha Ranasinghe at the National University of Singapore, a digital device you place on your tongue to simulate different tastes. And a company called FeelReal is developing a mask that fits under a VR headset and is programmed with different smells: flowers in spring, forest fires, etc.

Sarah Ostadabbas

Assistant Professor, Electrical and Computer Engineering, Northeastern University

You pull a strap, cinching your suit, and turn on the game. Immediately you’re transported to a warm castle in a fantasy version of medieval Europe. You feel the warmth from the fire against the back of your legs. As you walk to the window, you hear your footsteps echoing off the stone walls. Opening the window, you feel the cold breeze against your cheeks and smell spring flowers mixed with the smell of sheep—it’s shearing time. You notice a tray of pastries and taste the sweet, slightly salty custard as you bite into the tart.

Theme parks have been creating experiences like this with 3D glasses, hydraulic seats, and machines for blowing air, spraying water, and emitting scent. But an interactive, multi-sensory VR experience like this feels about as far-fetched as VR felt in the ‘80s. It seems possible with high-resolution direct brain stimulation and much greater computational power, but that technology is at least decades away. Meanwhile, there are technologies that can get us part of the way there. Sound is the most integrated, with every VR system including some means of approximating 3D sound. Sound fidelity still lags significantly behind the visual experience, but it’s easy to produce and essential for immersion, so for the most part we can consider that covered. For smell, there are products and research based on liquid scent cartridges that are rapidly vaporized and mixed to create immersive olfactory experiences. There has also been preliminary work on direct electrical stimulation of smell receptors. Given that taste centers on regionally distributed receptors on the tongue, most research there has focused on direct electrical and thermal stimulation.

The final sense, touch, is complicated. It actually encompasses pressure, temperature, shear, acceleration (linear and angular), and physical resistance. For years, flight simulators have used hydraulics to simulate feelings of motion and acceleration. Recently, electrical motion bases have become available for home use. They are expensive, but within range of a well-funded gamer. Haptic systems, which resist motion and provide tactile feedback such as vibration, are commercially available and widely used in industry for surgical robotics and CAD input, as well as in gaming. There has also been work on creating haptic textures for more nuanced experiences. There are even full haptic suits for training.
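For a concrete sense of how the force-feedback half of this works, here is a minimal sketch of penalty-based haptic rendering, the textbook technique behind much of this hardware (the stiffness and damping constants below are illustrative, not taken from any product):

```python
def haptic_force(penetration_depth: float, velocity_into_surface: float,
                 k: float = 500.0, b: float = 5.0) -> float:
    """Penalty-based haptic rendering: resist penetration of a virtual
    surface with a spring-damper force. k (N/m) and b (N*s/m) are
    illustrative tuning constants, not values from any real device."""
    if penetration_depth <= 0.0:
        return 0.0  # not touching the surface: no force
    # The spring term pushes the hand back out; the damping term
    # resists motion further into the surface.
    return k * penetration_depth + b * max(velocity_into_surface, 0.0)

# Example: a fingertip 2mm into a virtual apple, moving inward at 1cm/s,
# feels roughly 1.05 N of resistance.
print(haptic_force(0.002, 0.01))
```

Run at a high update rate (haptic devices typically target around 1 kHz), this tiny loop is what makes a virtual surface feel solid rather than spongy.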

As with most technologies, wide adoption would drop the price considerably (thanks to volume), resulting in even wider adoption. As “normal” sight-and-sound VR catches on, interest in further immersion will grow and start to drive those price drops. Right now, though, VR is commercially available but still relatively niche.

Robert Stone

Chair in Interactive Multimedia Systems and Director of the Human Interface Technologies Team at the University of Birmingham

My team and I have been evaluating commercial VR scent or smell systems (more formally, “olfactory displays”) for the last five or six years. There are, at the moment, a few commercial products on the market. A company called Olorama has invented a cartridge injection system that releases synthetic smells; the process is similar to the perfume dispersal sometimes used on a larger scale in stores and supermarkets. Other techniques embed synthesized smells in blocks of paraffin: you inject a high-speed jet of air into the paraffin, and that releases the smell. Some use a tray of small plastic wells that are heated up; as the contents vaporize, a fan blows the scent out. A new olfactory display company based in the States, OVR, has developed an innovative scent delivery system that clips to the bottom of most of the popular virtual reality headsets (HTC Vive, Oculus Quest, etc.); I believe their system can currently handle up to nine smells. We’re hoping to work with them this year to bring realistic smells to our 17th-century “Virtual Mayflower” demo, so that users can smell just how bad Plymouth was in the days when the Pilgrims departed for the New World!

To trigger the smells in VR, we place an invisible “force field”, a bounding box, around a given object, so that when your movement path intersects with that force field it triggers the relevant smell or smells. The recreation of the old harbor in Plymouth gives us plenty of objects to act as triggers, and believe me, it’s going to smell pretty bad: fish, stale seawater, human excrement, animal excrement, etc. When you walk close to a 17th-century latrine, a dung pile, or over a wooden plank spanning the human waste run-off channels into the sea, you’re going to get a whiff of something truly awful, but hopefully realistic for the period!
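For the curious, that trigger logic is simple to sketch. Assuming an axis-aligned bounding box and a hypothetical release_scent() hook standing in for the olfactory hardware, it might look something like this:

```python
from dataclasses import dataclass

@dataclass
class ScentTrigger:
    """An invisible axis-aligned box that fires a scent when entered."""
    min_corner: tuple  # (x, y, z)
    max_corner: tuple  # (x, y, z)
    scent: str
    inside: bool = False  # tracks whether the user was already inside

    def contains(self, p) -> bool:
        return all(lo <= c <= hi
                   for lo, c, hi in zip(self.min_corner, p, self.max_corner))

    def update(self, user_position, release_scent) -> None:
        now_inside = self.contains(user_position)
        if now_inside and not self.inside:
            release_scent(self.scent)  # hypothetical hardware hook
        self.inside = now_inside

# Example: a latrine trigger somewhere in the virtual harbor scene
# (coordinates are made up for illustration).
latrine = ScentTrigger((4, 0, 7), (6, 2, 9), "latrine")
latrine.update((5, 1, 8), release_scent=print)  # prints "latrine"
```

Firing only on entry, rather than on every frame the user stands inside the box, matters with real scent hardware, since released chemicals linger.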

But the one problem we’ve always had with olfactory displays involves the delivery mechanism. Take, for example, a project we undertook for the British Army. We wanted to recreate the smells of a Middle Eastern village, because we learned that when Army personnel were on patrol, certain smells, or even missing smells, could alert them that something was about to go down. We could synthesize the smells (cooking, tobacco, rotting hanging meat, and so on) and release them, but the noise of the electromechanical and pneumatic components of the hardware alerted the users long before the smells did, which rendered the simulation useless.

Having said that, at the moment, it’s smell I’m most excited about in a VR context, because I think a good smell delivery system will definitely enhance immersion and evoke emotions and memories. We found many, many years ago that adding sound to a VR experience made all the difference to how users engaged with and believed the visuals. Bringing sound and smell together will, I have no doubt, revolutionize virtual experiences.

As for taste, which, of course, is closely related to smell: there are electronic techniques that have been around since the 18th century (described as “galvanic tongue stimulation”). But, in all honesty, I can’t see an acceptable, non-intrusive means of combining and displaying tastes becoming available for decades. Indeed, I believe we’re not going to achieve a truly immersive, all-senses-covered VR system, with believable sight, sound, touch, smell, and taste experiences (sweetness, bitterness, saltiness, sourness, and umami/savoriness), until we’ve got something akin to a real Star Trek “Holodeck”: something that avoids the need to don encumbering wearable tech!

Murat Akcakaya

Associate Professor, Electrical and Computer Engineering, University of Pittsburgh

Right now, we’re working on National Science Foundation-funded research on haptic interaction in simulated environments, including virtual reality.

We start with human beings interacting with objects in a real environment—interfacing, say, with different textures (rough, smooth, etc.). These textures produce different brain patterns, as measured via EEG. Our work shows that, looking only at a person’s EEG, their real-time brain responses, we can identify what texture they’re touching.

So now our idea is: let’s go back into the VR environment and monitor people’s brain responses there. By applying certain haptic stimulations (electrical stimulation, really any vibrotactile stimulus) and controlling parameters like amplitude, frequency, phasing, and so on, can we generate or replicate responses in the brain similar to those generated by interacting with objects in a real environment?

We’re still in the phase of investigating different effects, but the overall objective is to integrate all of those real-time, brain-response-guided stimulation control capabilities into a simulated environment such as VR, in order to provide a more realistic simulation of touch.
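A rough sketch of the decoding half of that pipeline, using stand-in data: band-power features are extracted from each EEG epoch and fed to a classifier over texture labels. The channel count, frequency bands, and two-texture setup here are placeholders, not the lab’s actual protocol:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 256  # sampling rate in Hz (placeholder, not the lab's actual setup)

def band_power(epoch: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Mean spectral power per channel in [lo, hi) Hz for one EEG epoch
    of shape (channels, samples)."""
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1.0 / FS)
    spectrum = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[:, mask].mean(axis=1)

def features(epoch: np.ndarray) -> np.ndarray:
    # Stack alpha, beta, and gamma band power across all channels.
    return np.concatenate([band_power(epoch, lo, hi)
                           for lo, hi in [(8, 13), (13, 30), (30, 45)]])

# epochs: (trials, channels, samples); labels: which texture was touched.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 8, 2 * FS))  # stand-in EEG data
labels = np.tile([0, 1], 20)                   # 0 = rough, 1 = smooth
X = np.array([features(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X[:5]))  # decode texture from the brain response alone
```

In the closed-loop version Akcakaya describes, a decoder like this would sit inside a control loop, adjusting the stimulation parameters until the decoded response matches the one recorded for the real texture.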

Michael R.M. Jenkin

Professor, Electrical Engineering and Computer Science, York University, whose research interests include computer vision, virtual reality, and mobile robotics

I have a special interest in understanding the effects of long-duration microgravity on human perception. Let’s suppose you want to train someone to land a spaceship on the moon, and you train them on Earth. Well, when they actually go ahead and do it on or near the moon, they’re going to be subject to a much-reduced gravity effect. So the question is: is your simulator, which you learned on Earth, going to train you properly? Or is it going to get things wrong, and negatively impact your performance, because gravity is different on the moon? You can ask the question the other way around: if you’re an astronaut on a long-duration space mission in zero gravity and you’re about to land back on Earth, in a one-g environment, is there anything you learned in your microgravity training that is going to negatively impact your performance in that scenario? The real question, then, is whether we can build virtual reality that simulates the different loading forces on the body. And I think this can be done.
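A toy example makes Jenkin’s point concrete: the same naive control policy produces very different landings when only the simulator’s gravity constant changes (all the other numbers here are illustrative):

```python
def simulate_descent(g: float, thrust_accel: float = 8.0,
                     altitude: float = 100.0, dt: float = 0.1) -> float:
    """Vertical-descent lander with a naive policy: fire the engine
    whenever falling faster than 5 m/s. Returns touchdown speed (m/s).
    All constants are illustrative, not real spacecraft parameters."""
    v = 0.0
    while altitude > 0.0:
        a = -g + (thrust_accel if v < -5.0 else 0.0)
        v += a * dt
        altitude += v * dt
    return abs(v)

# A policy that yields a gentle ~5 m/s touchdown on the moon slams into
# the ground on Earth, where the same thrust can't overcome gravity.
print(f"Moon  (g=1.62): {simulate_descent(1.62):.1f} m/s")
print(f"Earth (g=9.81): {simulate_descent(9.81):.1f} m/s")
```

The simulator code is identical in both runs; only g differs. That is the training-transfer problem in miniature.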

Jurgen Schulze

Research Scientist, University of California San Diego

Already, we have VR for our ears via spatial audio, which pretty much every VR headset supports. Just hearing a noise louder in one ear than in the other, providing a directional sense, can lend a pretty immersive level of realism.

While sound is arguably the most obvious sense to simulate, I also find it the most important, and the most neglected. Programmers often don’t implement spatial audio very carefully. Often, they don’t factor in the way sound reflects off objects in the environment; they just assume the environment is empty, free of any objects, and simulate what a given act would sound like in empty space. You have directionality, but you don’t have the realism of sound bouncing off an object, or sounding different in different situations. When you enter a church made mostly of stone, it sounds very different from a space full of carpets and curtains. And when a soccer ball hits a stone floor, it makes a different sound than when it hits a wooden floor, or a carpeted floor. But these differences are rarely reflected, and are often not supported by current VR apps.
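For illustration, here is roughly what that bare-minimum implementation looks like: per-ear gain computed from direction and distance alone, with none of the reflection or material modeling Schulze says is missing (real engines use HRTFs and reverb; this is a deliberately naive sketch):

```python
import math

def stereo_gains(listener_pos, listener_forward, source_pos):
    """Naive spatial audio in 2D (x, z): per-ear gain from the source's
    direction and distance only. No reflections, no materials, no HRTF."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = max(math.hypot(dx, dz), 1e-6)
    # Signed angle from the listener's forward direction to the source.
    angle = (math.atan2(listener_forward[1], listener_forward[0])
             - math.atan2(dz, dx))
    pan = math.sin(angle)             # -1 = hard left, +1 = hard right
    attenuation = 1.0 / (1.0 + dist)  # simple distance falloff
    left = attenuation * (1.0 - pan) / 2.0
    right = attenuation * (1.0 + pan) / 2.0
    return left, right

# A source 3m to the right of a listener facing +z: the right ear
# gets all of the (distance-attenuated) signal.
print(stereo_gains((0, 0), (0, 1), (3, 0)))  # (0.0, 0.25)
```

Everything Schulze describes as missing (the stone church versus the carpeted room, the ball on wood versus carpet) lives in the reflection and material modeling this sketch deliberately omits.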

Kevin Curran

Professor, Cyber Security, Ulster University

Virtual reality (VR), of course, fools the primal parts of our brains to usher us into immersive worlds. Other senses, however, can be used to complement vision in VR.

The senses that can be used in VR are sight (vision), audition (hearing), olfaction (smell), somatosensation (touch), and thermoception (the ability to sense heat). When sights, smells, thermoception, sounds, and haptic sensations are all utilized together in a spatially representative environment, one is far more sensorially immersed.

The first VR attempt to assemble various senses was in 1957, with the “Sensorama.” It had a display enriched with physical-world stimuli: images, stereo sound, smell, wind, and seat vibration. Since that experiment, however, the most common senses stimulated in VR have been sight (through the use of 3D images) combined with hearing.

The Lost Foxfire game system, for example, consists of a virtual reality headset paired with a configurable multisensory suit that delivers thermal, wind, and olfactory stimuli to players, helping them track the game’s fox character.

The suit contains five heat modules, allowing players to sense heat on the front, the back, the sides of their necks, and on their faces. When players encounter a fox character, they will catch a whiff of the scent of apples, and as players approach fire, they can feel the heat it emits. This leads to a more immersive gaming experience.
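One plausible way a suit like this might map a virtual fire onto its heat modules is sketched below; the falloff curve and the ring layout of the modules are assumptions for illustration, not details from the Lost Foxfire system:

```python
import math

def heater_levels(fire_pos, player_pos, facing_deg, n_modules=5):
    """Map a virtual fire to intensities (0.0-1.0) for a ring of heat
    modules around the player's body. The inverse-square falloff and the
    even ring placement are illustrative assumptions."""
    dx = fire_pos[0] - player_pos[0]
    dy = fire_pos[1] - player_pos[1]
    dist = max(math.hypot(dx, dy), 0.5)
    base = min(1.0, 1.0 / dist ** 2)  # a closer fire feels hotter
    fire_bearing = math.degrees(math.atan2(dy, dx)) - facing_deg
    levels = []
    for i in range(n_modules):
        module_bearing = i * (360.0 / n_modules)
        # Shortest signed angle between the module direction and the fire.
        diff = (fire_bearing - module_bearing + 180.0) % 360.0 - 180.0
        # Modules facing the fire run hot; those facing away stay off.
        levels.append(round(base * max(math.cos(math.radians(diff)), 0.0), 2))
    return levels

# Fire 2m directly ahead: the front module runs hottest, the rear is off.
print(heater_levels(fire_pos=(2, 0), player_pos=(0, 0), facing_deg=0))
```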

Sound is crucial in many VR experiences, but truly realistic immersive sound requires spatial audio, which simulates sounds coming from various distances and directions. Most of the leading VR companies include spatial audio in their hardware to ensure sound is localized correctly, matching the way it naturally travels from one ear to the other.

The amygdala, the part of the brain that manages emotion and memory, is linked to smell. This is why smells can be so powerful. It is not trivial to incorporate smells in VR, as the chemicals required to create scents can linger or mingle with other scents. There have, however, been games where scents are released when the player is near the sea or in a forest.

It is not all about games, however. There have been studies that used tactile or olfactory stimuli with the specific purpose of treating phobias or post-traumatic stress disorder, and they found that adding other sensory stimuli increased the effectiveness of VR exposure therapy.

Haptic interfaces, devices which simulate physical sensations, rely on vibration to simulate sensory experiences. However, they have typically been bulky, needing large battery packs or wires to power them. Recently, Northwestern University developed a 15cm-square patch that can be stuck onto the body and uses actuators that vibrate against the skin to simulate tactile sensations. This ‘synthetic skin’ is controlled wirelessly by an app that transmits tactile patterns to the patch.
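A patch like this ultimately has to receive something like a frame of per-actuator intensities over its wireless link. The sketch below encodes one such frame; the 4x4 actuator grid and one-byte-per-actuator format are invented purely for illustration:

```python
import numpy as np

# Hypothetical 4x4 actuator grid for a stick-on haptic patch; the grid
# size and byte format are invented for illustration only.
GRID = (4, 4)

def encode_tactile_frame(pattern: np.ndarray) -> bytes:
    """Quantize per-actuator intensities (0.0-1.0) to one byte each,
    ready to transmit to the patch."""
    assert pattern.shape == GRID
    return (np.clip(pattern, 0.0, 1.0) * 255).astype(np.uint8).tobytes()

# Example: a 'swipe' whose vibration strength ramps from left to right.
frame = np.tile(np.linspace(0.0, 1.0, GRID[1]), (GRID[0], 1))
payload = encode_tactile_frame(frame)
print(len(payload), "bytes per frame")  # 16
```

Streaming a sequence of such frames is how a controlling app would produce moving, swiping sensations across the skin.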

Virtual reality taps into evolutionary biology to make our brains believe, with all five senses, that we are experiencing something more real than previously thought possible. The Covid pandemic has shown us how fragile the real world is and how much we miss human contact. Multi-sensory VR may be just the way out of the next pandemic. Remember, technology never gets worse; it only improves.

Source: Is There VR for Senses Other Than Sight?