Adrian David Cheok on virtual senses for the internet

posted in: Media | 0

30 March 2017


Name: Adrian David Cheok
Position: Director of the Imagineering Institute Malaysia, Chair Professor of Pervasive Computing at City University London, Founder and Director of the Mixed Reality Lab Singapore
Topic: Virtual senses for the internet


Nowadays, we mostly perceive our digital world by viewing text and images or hearing audio. But what about the rest of our senses?

Adrian Cheok, Director of the Imagineering Institute Malaysia, founded the Mixed Reality Lab in Singapore and is working to integrate the rest of our senses, such as smell and taste, into our digital experience. "What do we need digital smells and tastes for anyway?", one might ask.

These senses strongly influence emotion. Every day we experience how smells can change our mood, for example when eating a delicious meal.

The possible uses of digital senses are wide-ranging: games, films, messaging and communication over social networks or the telephone, but also commercial uses such as advertising.

Cheok's Mixed Reality Lab develops devices that stimulate our senses with electrical signals, "because we can't send the chemicals over the internet" that are normally needed for these bodily reactions, explains Prof. Cheok. As he puts it, we already live in our own "analog virtual reality", with our brain as the device through which we perceive the world around us. That is why he believes the step to a digital virtual reality will not be a very big one.

Might there be a danger in replacing our "real life" with digital virtual reality?
According to Cheok, the differences will become smaller, "but society will adapt". Kissing or marrying a robot may one day be as normal as human marriage is today.

Nonetheless, the focus of his research is not on replacing our lives with digital experiences, but on expanding analog life with digital impressions.


Adrian David Cheok Keynote Speaker, "Love and Sex with Robots", at IT Innovation Day 2017, Amersfoort, Netherlands – 28/09/2017

Professor Adrian David Cheok will give a keynote speech at IT Innovation Day on 28 September 2017, in Amersfoort, Netherlands.

Title: Love and Sex with Robots
Time: 14:33, 28 September 2017
Location: Prodentfabriek, Amersfoort, Netherlands

In his speech, Professor Cheok will look at the tangible aspect (touch) of technology, and the ways in which this will contribute to an overall experience for people, including sexual behaviour. Adrian Cheok will outline a more controversial view of the future, along with how tangible technology will enhance experiences at all levels of human behaviour. When is good, good enough, real enough, and how can quality be improved? Adrian will also share his latest tech inventions with the public.

Imagineering Institute launches Digital Food exhibition at Singapore Science Center


Digital Food is an exhibition that explores how taste and flavour have been evolving, from natural to artificial and synthetic flavours, and how our senses can be manipulated using digital technology in the future. It challenges us to think about how digital food could enhance quality of life and improve our health. The exhibition, jointly developed by Science Centre Singapore and the Imagineering Institute, is only available for a limited period, so do catch it while it lasts!

Exhibition Dates: 20 Sep 2017 – 20 Nov 2017

Venue: Hall A, Science Centre

Typical time required: 30 min


Exhibition Highlights

Digital Candy Shop

The Digital Candy Shop is the main highlight of this exhibition with two interactive stations, i.e. the Digital Cream Pot and the Digital Lollipop, which allow you to “taste” food using technology.

Can you smell and taste colours?

This is part of the story of building up an artificial food experience. Visual cues are part of our perception of flavour and food. From the moment we see the food, our brains begin to build expectations using memories of previous experiences linked with the food’s colour, smell or appearance. Come have a “taste” of the smell.


Checking your sensitivity to smell

This exhibit challenges visitors to find out whether they have a super nose, testing their ability to distinguish different smells and identify common ones.

Exhibition Partner

Adrian Cheok Keynote Speaker at FDG 2017, Cape Cod

Professor Adrian Cheok was invited to give a keynote speech at the International Conference on Foundations of Digital Games 2017, at Cape Cod, USA.

FDG 2017 is a major international event held in cooperation with ACM SIGAI, SIGCHI, and SIGGRAPH. It seeks to promote the exchange of information concerning the foundations of digital games, technology used to develop digital games, and the study of digital games and their design, broadly construed. The goal of the conference is the advancement of the study of digital games, including but not limited to new game technologies, critical analysis, innovative designs, theories on play, empirical studies, and data analysis.

Professor Cheok’s keynote speech will be covering the trending topic of “Love and Sex with Robots”.

Time: 15 Aug 2017, 9am

Venue: The Resort and Conference Center at Hyannis, Cape Cod, MA, USA

Title: Love and Sex with Robots

Abstract: “Love and Sex with Robots” has recently become a serious academic topic within the fields of Human Machine Interaction and Human Robot Interaction. This topic has also witnessed a strong upsurge of interest amongst the general public, print media, TV documentaries and feature films. This talk covers the personal aspects of human relationships and interactions with robots and artificial partners. New technologies and research prototypes have been developed to allow more intimate interactions with robot companions like sex robots, emotional robots, humanoid robots, and artificial intelligent systems that can simulate human emotions. Such technologies and systems also engage the users with all their senses, including touch, taste and smell, creating multisensory and immersive interactive experiences. In this talk, we will conclude that humans will marry robots by 2050.

For more information on the conference, visit

PRESS RELEASE: Electric Smell Machine for Internet & Virtual Smell


Date: August 7, 2017
Adrian David Cheok, Kasun Karunanayaka, Surina Hariri, Hanis Camelia, and Sharon Kalu Ufere, Imagineering Institute, Iskandar Puteri, Malaysia & City, University of London, UK
Phone: +607 509 6568
Fax: +607 509 6713

We are excited to introduce the world's first computer-controlled digital device developed to stimulate olfactory receptor neurons, with the aim of producing smell sensations purely using electrical pulses. Using this device, we can easily stimulate various areas of the nasal cavity with different kinds of electric pulses. During the initial user experiments, some participants experienced smell sensations including floral, fruity, chemical, and woody. In addition, we observed a difference in the ability to smell odorants before and after the electrical stimulation. These results suggest that this technology could be enhanced to artificially create and modify smell sensations. By conducting more experiments with human subjects, we expect to uncover the patterns of electrical stimulation that can effectively generate, modify, and recall smell sensations. This invention can lead to internet and virtual reality digital smell.

Figure 1: Concept of stimulating human olfactory receptor neurons using electric pulses.

To date, almost all smell regeneration methods used in both academia and industry are based on chemicals. These methods have several limitations: they are expensive for long-term use, complex, need routine maintenance and refilling, offer limited controllability, and distribute non-uniformly in the air. More importantly, chemical-based smells cannot be transmitted over digital networks and regenerated remotely, as we do with visual and auditory data. Therefore, discovering a method to produce smell sensations without using chemical odorants is a necessity for digitizing the sense of smell. Our concept, illustrated in Figure 1, is to electrically stimulate the olfactory receptor neurons (ORNs) and study whether this approach can produce or modify smell sensations. During a medical experiment in 1973, electrical stimulation of olfactory receptors produced some smell sensations including almond, bitter almond, and vanilla [1]. However, three other similar experiments that used electrical stimulation failed to reproduce any smell sensations [2, 3, 4]. A reliable method to electrically reproduce smell sensations therefore remained undiscovered.

Figure 2: The digital olfactory receptor stimulation device: It has a current controller circuit, endoscope camera, a pair of silver electrodes, a microcontroller, a power supply, a low current multimeter, and a laptop.

Our approach differs from the previous research mentioned above. Our main objective is to develop a controllable and repeatable digital technology: a device that connects to a computer and can easily be programmed and controlled. The device also needs to generate electric pulses of different frequencies, currents, pulse widths, and stimulation times. To provide more stimulation possibilities, we wanted it to be capable of stimulating diverse sites on the ventral surface of the inferior, middle, and superior nasal concha. Figure 2 shows the computer-controlled digital device we have developed to stimulate olfactory receptors. The current output by the circuit is selected using one of the five push buttons shown in Figure 2, and the respective LED near the push button lights up after selection. The frequency of the stimulation pulses and the stimulation time are controlled by the microcontroller program, which can vary the stimulation frequency from 0 Hz to 33 kHz as well as the pulse width. A pair of silver electrodes combined with an endoscopic camera is used to stimulate the olfactory receptor neurons; during stimulation, one electrode is configured as positive and the other as ground. Figures 3 and 4 show our device being tested with human subjects.
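As a rough illustration, a stimulation setting on such a device can be described by its current, frequency, pulse width, and duration. The firmware itself is not published; the data structure, validation, and timing arithmetic below are illustrative assumptions, using only the parameter ranges quoted in this press release (five current push buttons, 0 Hz to 33 kHz).

```python
# Hypothetical sketch of the stimulation parameters described in the text.
# The ranges (1-5 mA buttons, up to 33 kHz) come from the press release;
# the class itself and the timing maths are illustrative assumptions.

from dataclasses import dataclass

CURRENT_STEPS_MA = (1, 2, 3, 4, 5)   # one per push button on the device
MAX_FREQ_HZ = 33_000                 # upper frequency limit quoted above


@dataclass(frozen=True)
class PulseTrain:
    current_ma: int        # selected via a push button
    frequency_hz: float    # set by the microcontroller program
    pulse_width_us: float  # high time of each pulse
    duration_s: float      # total stimulation time

    def __post_init__(self):
        if self.current_ma not in CURRENT_STEPS_MA:
            raise ValueError("current must match one of the push buttons")
        if not 0 < self.frequency_hz <= MAX_FREQ_HZ:
            raise ValueError("frequency outside the device's range")
        if self.pulse_width_us >= 1e6 / self.frequency_hz:
            raise ValueError("pulse width must fit inside one period")

    def pulse_count(self) -> int:
        """Number of pulses emitted over the whole stimulation."""
        return int(self.duration_s * self.frequency_hz)


# One of the settings reported in the user study: 1 mA at 70 Hz.
train = PulseTrain(current_ma=1, frequency_hz=70, pulse_width_us=500, duration_s=10)
print(train.pulse_count())  # 700 pulses over a 10 s stimulation
```

The validation step mirrors what the push buttons and microcontroller program enforce in hardware: only the five preset currents are selectable, and the pulse width can never exceed one period of the chosen frequency.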

Figure 3: The user study setup, with the device stimulating the nasal cavity at the middle and superior concha regions.

During our first user study, we stimulated 30 subjects using currents in the 1 mA to 5 mA range at frequencies of 2 Hz, 10 Hz, 70 Hz, and 180 Hz. The combinations of 1 mA at 10 Hz and 1 mA at 70 Hz gave the most prominent smell-related responses, with 1 mA at 70 Hz inducing the highest odor perceptions: 27% of the participants reported fragrant and chemical sensations, 20% fruity, 20% sweet, 17% toasted and nutty, 13% woody, and 10% minty. For 1 mA at 10 Hz, participants reported fragrant (17%), sweet (27%), chemical (10%), and woody (10%) sensations. Meanwhile, at 4 mA and 70 Hz, 82% reported pain and 64% reported pressure sensations. We also probed the effect of the electrical stimulation on the nose afterwards, asking participants to sniff known odorants immediately after stimulation and rate their intensity. Most participants reported a higher intensity after stimulation, showing that the electrical stimulation increased the perceived intensity of the odorants.
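The percentages above are shares of the 30 subjects reporting each sensation. The raw per-subject responses are not published, so the counts in the sketch below are invented purely so that 30 subjects reproduce the reported 1 mA / 70 Hz figures; only the tallying arithmetic is the point.

```python
# Hypothetical tally showing how the study's percentages follow from
# per-subject reports. The counts are invented to match the published
# 1 mA / 70 Hz figures; the real raw data is not public.

from collections import Counter

N_SUBJECTS = 30

# Invented example data: sensations reported at 1 mA / 70 Hz.
reports = (["fragrant"] * 8 + ["chemical"] * 8 + ["fruity"] * 6 +
           ["sweet"] * 6 + ["toasted/nutty"] * 5 + ["woody"] * 4 +
           ["minty"] * 3)

def percentages(reported, n=N_SUBJECTS):
    """Share of subjects reporting each sensation, rounded as in the text."""
    return {s: round(100 * c / n) for s, c in Counter(reported).items()}

print(percentages(reports))
# e.g. 8 of 30 subjects -> 27% fragrant, 5 of 30 -> 17% toasted/nutty
```

With 30 subjects, each participant is worth roughly 3.3 percentage points, which is why the published figures cluster at values like 10%, 13%, 17%, 20%, and 27%.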

Figure 4: A person testing the Electric Smell Interface in the lab environment.

We are planning to extend this user experiment with more participants. The effects of the different electrical stimulation parameters, such as frequency, current, and stimulation period, will be studied more closely in future. By analyzing the results, we plan to identify stimulation patterns that can produce different smell sensations. If the electrical stimulation of olfactory receptors effectively produces smell sensations, it will revolutionize the field of communication. Multisensory communication is currently limited to text, audio, and video content. Digitizing the sense of touch has already been achieved experimentally at the research level and will be embedded in daily communication in the near future. If the digitization of smell becomes possible, it will pave the way for sensing, communicating, and reproducing flavor sensations over the internet. This will create more applications in fields such as human-computer interaction, virtual reality, telepresence, and internet shopping.


1. Uziel, A.: Stimulation of human olfactory neuro-epithelium by long-term continuous electrical currents. Journal de physiologie 66(4) (1973) 409–422

2. Weiss, T., Shushan, S., Ravia, A., Hahamy, A., Secundo, L., Weissbrod, A., Ben-Yakov, A., Holtzman, Y., Cohen-Atsmoni, S., Roth, Y., et al.: From nose to brain: Un-sensed electrical currents applied in the nose alter activity in deep brain structures. Cerebral Cortex (2016)

3. Straschill, M., Stahl, H., Gorkisch, K.: Effects of electrical stimulation of the human olfactory mucosa. Stereotactic and Functional Neurosurgery 46(5-6) (1984) 286–289

4. Ishimaru, T., Shimada, T., Sakumoto, M., Miwa, T., Kimura, Y., Furukawa, M.: Olfactory evoked potential produced by electrical stimulation of the human olfactory mucosa. Chemical senses 22(1) (1997) 77–81

Adrian Cheok Keynote Speaker at Visual SG 2017


Professor Adrian David Cheok will give a keynote speech at Visual SG in Singapore Science Centre on 28 July 2017.

Topic: Everysense Everywhere Human Communication

Time: 11:10am, 28 July 2017

Location: Singapore Science Centre

Visual SG is South East Asia's signature Visualisation Festival. The Festival celebrates beauty through its bold emphasis on the visual aesthetics, insights and narratives that reside in data and scientific visualisation. Envisaged as both a serious study and playful showcase, Visual SG presents a full-on visual spectacle of data through the lens of artistic and creative expression. Through its line-up of interactive displays, forums and workshops, Visual SG not only raises awareness of the burgeoning field of big data, it also aims to provoke conversations on the significant role of data analytics in today's business and societal context.

The theme for Visual SG this year is “Make Visual!”. This theme takes us back on a journey to discover our intrinsic roots; that of the intrepid explorer, creator and inventor. It encourages all of us to take that first step to discover the magic that is all around us through unbridled curiosity.

Visual SG’s 2017 line up is an eclectic collection of artists and scientists who are all pushing the boundaries to tell their stories of science through visually stunning and engaging media.

Sex with robots, because robots have feelings too


13 July 2017, by René Schoemaker

How feeling, touch, and smell can be digitized.

Sex with robots is not far off, says Adrian Cheok of the Imagineering Institute. But first we will get to know the robot teacher and the robot doctor. And that is already happening this year.


Can you tell us more about your work at the Imagineering Institute?

The Imagineering Institute is a place where we do multidisciplinary research. Our research team consists of experts from different backgrounds who work together on research related to multisensory communication, HCI, AI, and robotics. The work in the lab is called 'Imagineering', that is, the imaginative application of the engineering sciences. Imagineering involves three main strands. First, imaginative visualization: the projections and viewpoints of artists and designers. Second, future visualization: extrapolation of recent and current technological developments, creating imaginary but realistic (feasible) scenarios and simulations of the future. Third, creative engineering: new product design, prototyping, and demonstration work by engineers, computer scientists, and designers. The lab conducts research in the fields of Mixed Reality, Internet Digital Media, Pervasive Computing, Wearable Technology, and Multisensory Communication.


At IT Innovation Day you will be talking about 'tangible' technology. Many people find it hard to imagine an internet that can transmit taste, touch, and smell. How does that work?

We want to digitize touch, taste, and smell. We have developed proof-of-concept prototypes and are improving them. Once that technology is ready, it will be possible to digitize, communicate, and reproduce touch, taste, and smell, just as we already do with images and sound. Huggy Pajama, Poultry Internet, RingU, and Kissenger are examples of technology we have developed for touch communication between people and between humans and animals. These technologies are able to sense a touch, transmit it, and reproduce it. For taste and smell we mainly use electrical or thermal energy to stimulate the smell and taste receptors. With that stimulation we can activate the receptor cells, and they generate the same sensations that chemical taste or smell stimulation produces. We have scientifically proven that this is possible for taste, and we are now running a series of experiments for smell. If that also succeeds, then within the next ten years we will see people communicating with digital taste and smell via mobile devices.
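The pipeline described here (capture a sensation, transmit it like audio or video, reproduce it on a remote device) can be sketched as a simple message format. The field names and encoding below are illustrative assumptions, not the labs' actual protocol.

```python
# Illustrative sketch of a multisensory message: the field names and
# parameter encoding are assumptions, not the Mixed Reality Lab's protocol.

from dataclasses import dataclass, asdict
import json


@dataclass
class SenseSample:
    sense: str     # "touch", "taste", or "smell"
    actuator: str  # e.g. "vibration-motor" or "electrode" (hypothetical names)
    params: dict   # stimulation parameters for the receiving device


def encode(samples):
    """Serialize captured sense data for transmission, like audio/video."""
    return json.dumps([asdict(s) for s in samples])


def decode(payload):
    """Reconstruct the samples on the receiving device."""
    return [SenseSample(**d) for d in json.loads(payload)]


# A Kissenger-style touch sample alongside an electric-taste sample.
msg = encode([
    SenseSample("touch", "vibration-motor", {"intensity": 0.6, "ms": 120}),
    SenseSample("taste", "electrode", {"current_ma": 0.5, "freq_hz": 50}),
])
print([s.sense for s in decode(msg)])  # ['touch', 'taste']
```

The point of the sketch is that once a sensation is reduced to stimulation parameters rather than chemicals, it becomes ordinary data: it can be stored, routed, and replayed by any device that knows how to drive its actuators.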


In what way would that enrich how people interact with each other and with devices?

We are taking big steps toward a hyperconnected world in which all the machines, systems, and processes around us are digitized and connected to one another. That enables human-to-human, human-to-machine, and machine-to-machine interactions. Digital interfaces for touch, taste, and smell can be integrated directly and used in those scenarios. We believe this will enrich traditional text-, audio-, and video-based communication into true multisensory communication. That will transform many applications, such as internet shopping, messaging, video calling, e-mail, VR, and gaming.


What are genuinely practical applications in which touch and taste could become important?

In my view, mainly in communication: one-to-one messaging, video conferencing, and websites. Those are the applications we use daily in our interaction with others. We think these technologies will bring a revolutionary change to communication.


For many people, the (thought of) interaction with robots is somewhat scary. Could the use of touch help here?

Yes. I think new technologies such as touch, taste and smell, and AI can help reduce the gap between humans and robots. We will implement these technologies in robots, and that will make communication between humans and robots more enjoyable. For example, if your robot friend comes across a tasty snack while out of the house, it can share the taste. Through technologies like Kissenger we are able to share touch sensations with one another. Using AI we can also make robots friendlier, more sensitive, and more emotional. That is why we are now investigating whether we can use robots as teachers and doctors. We have also started a new research field called Love and Sex with Robots, in which we investigate whether people can form intimate relationships with robots.


Can you tell us about some concrete inventions that could actually be brought to market?

The first prototypes of a robot doctor and robot teacher will probably be introduced this year. In addition, we plan to publish a series of research reports on the topic of 'love and sex with robots'. We also plan to make Kissenger commercially available this year.


What are the benefits for industry and business?

IoT and smartphone technology have fundamentally changed the way industry and business look at R&D. In the past, companies like Kodak could successfully develop their own top products within their own labs because competition was limited (it was hard to predict that they would be blown away by digital technology). Today, R&D labs have to compete with millions of tech-savvy youngsters working from their parents' basements with the intention of overturning the status quo. The advantage of a lab like the Imagineering Institute is that we help companies understand the latest trends so they can close their technological gap and compete on a level playing field with the disrupters.


Do you have anything to add that we haven't covered yet?

One of the unique features of the Imagineering Institute is that it has a business incubator (The Hangout Malaysia) inside the research lab, which excels through the symbiosis between the founders of the participating startups and the researchers focused on new technologies and 'future casting'. The startups go through a rigorous training program to make sure people want their products and are willing to pay for them. We also emphasize the scalability of the business for local, regional, and worldwide growth, and its positioning toward investors.


Adrian Cheok, a renowned scientist, speaker, and researcher, focuses on the tactile side of the internet. How do you transmit touch over the internet? Adrian navigates the edge of what is possible in robotics and touch.

He will discuss technology in relation to touch and the way this will contribute to an overall experience for people, including sexuality. When is good, good enough, real enough, and how can the quality be technically improved? A must-see!

Fifth Sense: The next stage of VR is total sensory immersion

posted in: Media | 0
Wednesday, May 17, 2017, by Gareth May (@garethmay)

How will VR expand from audio and visual to incorporate the other senses?



Last year, the director of the Imagineering Institute in Malaysia, Dr. Adrian Cheok, the brain behind mixed reality wearable Huggy Pajamas, which consoles children with virtual hugs, and Scentee, the smartphone attachment that pings pongs over the data highway, claimed that three senses are pivotal in creating a future sense of presence in the virtual world.

He told Asian Scientist magazine that he is working on technology that allows for “virtual communication of touch, taste and smell by digitizing these senses.”

In the case of smell, it's a claim that now has scientific backing. In a paper published last October in the journal Virtual Reality, researchers from the University of Ottawa found that the addition of smell in a VR environment "increases the sense of presence."

The ‘unpleasant odour’ in this instance was piped into the room from an exterior accessory, a common method of simulating smell for VR users. Likewise, Valencia-based Olorama’s wireless aromatizing device does exactly that, fanning smells, such as ‘pastry shop’ and ‘wet ground’, around a VR play space.

At present this smell hack, if you will, is the easiest way to imitate odours. But simulating smell just isn't that simple; it requires imitating molecular chemistry, and ultimately replicating the specific molecules that trigger electrical pulses in the brain. As a result, Dr. Cheok's dream of digitised senses remains a long way off.

It’s in the area of the curated experience – in the form of perfumery, temperature, and haptics – where we’re seeing developments.


Premiered at Gamescom last year, Ubisoft’s Oculus Rift send-up, the fart-simulating Nosulus Rift, gives gamers the ability to smell the farts of characters from the second South Park game, appropriately named The Fractured but Whole (Don’t get it? Try reading it aloud).

“The Nosulus Rift is a fully functional mask using sensors activated through inaudible sound waves in the in-game fart sound, every time the player makes use of his nefarious [fart] powers,” an Ubisoft spokesperson told us. “Each time the sensors are activated, they trigger the odour’s puff. Meticulously and without mercy.”
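Ubisoft hasn't published how the mask decodes those inaudible cues, but a standard way to spot a single high-frequency tone buried in game audio is the Goertzel algorithm. The sketch below illustrates the idea; the 20 kHz trigger frequency, 48 kHz sample rate, and detection threshold are all assumptions, not Ubisoft's values.

```python
# Sketch of detecting an inaudible trigger tone in game audio, roughly how
# a device like the Nosulus Rift could be cued. The 20 kHz trigger tone,
# 48 kHz sample rate, and threshold are assumptions, not Ubisoft's values.

import math

def goertzel_power(samples, sample_rate, target_hz):
    """Signal power at one target frequency (Goertzel algorithm)."""
    n = len(samples)
    k = int(0.5 + n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2


def has_trigger(samples, sample_rate=48_000, trigger_hz=20_000, threshold=10_000):
    """True if the inaudible trigger tone is present in this audio frame."""
    return goertzel_power(samples, sample_rate, trigger_hz) > threshold


# A 10 ms frame containing the trigger tone fires the detector;
# a frame of ordinary audible audio (1 kHz) does not.
frame = [math.sin(2 * math.pi * 20_000 * i / 48_000) for i in range(480)]
speech = [math.sin(2 * math.pi * 1_000 * i / 48_000) for i in range(480)]
print(has_trigger(frame), has_trigger(speech))  # True False
```

Because the tone sits above the range most adults can hear, it can ride inside the ordinary game soundtrack without the player noticing, while a cheap microcontroller runs this single-frequency check on each incoming audio frame.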

Virtual flatulence not your bag? How about virtual body odour, breath, or even private parts? Earlier this year, adult entertainment webcam platform CamSoda announced a device called the OhRoma: a gasmask-style wearable with interchangeable odour canisters that releases smells matching any of the thirty scents 'broadcast' by a cam model via Bluetooth. The company is taking pre-orders now.

Neither of these nasal wearables allows smells to interact, however, and that's something Tokyo-born but Silicon Valley-based startup Vaqso VR demoed back in January with a Mars-bar-sized VR accessory that's able to emit multiple smells at once. The showcase revealed that players of a VR experience could smell not only the gunpowder of a gun but also the scent of a peach when it was pierced with a bullet.

“This device makes your VR experience richer,” says CEO Kentaro Kawaguchi, adding that he’s also working on simulating taste. “We want to perfectly reproduce the various senses of the five senses. Currently we can produce smells though taste may take a little while to develop.”

Compatible with PSVR, Oculus Rift, and HTC Vive, and with claims on the site that the team can make any scent on demand, the consumer version of Vaqso's VR scent device is scheduled for the first half of next year.


Warmer, warmer…

Prototyped at GDC in 2015, the multi-sensory Feelreal mask promised to simulate temperatures and imitate wet and warm environments using a sophisticated combo of misters, heaters, and coolers (plus an 'odour generator'). It never got off the ground after a failed Kickstarter campaign.

One company that is delivering on its multi-sensory promise is Sensiks. Its sensory reality pod, in which the user is seated, provides a totally immersive VR experience, augmenting the visuals from the headset with a set of exterior wind, light, and heat sense simulators, or, as founder Fred Galstaun puts it, "full sensory symphonies."

“Real life reality is always full sensory and 360. Even a small cool breeze on the skin sets off the brain in ways you cannot even imagine,” he says. “Within a closed controlled environment where all the senses, including audio-visual, are made 360, there is no difference for the brain anymore between real and fake. It has become reality for the senses.”

Galstaun calls his pods—which are currently used in medical institutions for PTSD trauma recovery and with mentally disabled and elderly patients—sensory reality or SR for short. “We place SR next to VR and AR, a brand new product category in the programmed reality scene.”

But, as pods, these stimuli are exterior. As we’re seeing with smell, could temperature be incorporated into a wearable experience down the line?

The sensation of temperature is something that Samsung’s C-Lab is exploring with their T.O.B headband. As we previously reported, all we know about Touch On The Brain so far is that it generates the sensation of heat using an acoustic impulse that stimulates the brain. We asked for an interview but were told that because T.O.B is still at the very beginning stage in terms of development, no developers were available to chat. We’ll be waiting patiently to find out more.

Taste Test

As we all know, much of our perception of a meal relies upon different sensory inputs, from smell to sight to sound. Building on this core principle, with the aid of a VR headset and specially-created technology, is LA-based Project Nourished, a gastronomic experience that’s attempting to simulate eating by tricking the brain into thinking it’s consuming food.

Not that the brain is easily duped. The tech Project Nourished uses includes a gyroscopic utensil and a virtual cocktail glass that allow the diner's movements to be translated into virtual reality, a diffuser to imitate the smell of various foods, and a 'bone conduction transducer' that "mimics the chewing sounds that are transmitted from the diner's mouth to ear drums via soft tissues and bones."

When combined with an edible gum the result is ultimate brain bamboozlement (Willy Wonka would be jealous) and a system the creators hope could be used to treat people with obesity and eating disorders, as well as help children to form positive eating habits from an early age.


Currently, haptics are the most popular way of incorporating the sensation of touch into VR, and it looks like touch will be the first sense to make the jump. This starts with something as simple as Go Touch VR's finger-cover accessory, which simulates the sensation of force you get when your finger encounters a real-life object. It's a VR glove without the actual glove part, and the French startup has a rough schedule of early 2019 for mass production.

At the other end in terms of both impact and expense, the Rez Infinite Synesthesia Suit, created by students at the Keio University Graduate School of Media and Design in Japan, is a full-body Velcro haptic VR suit kitted out with small motors that vibrate as you journey through the virtual world. It's been described as like "traveling through a psychedelic kaleidoscope".

Experiences like this hint that we're on the road to multi-sensory VR, but we're unlikely to see much of it brought to reality in 2017. Still, next time you're dazzled by the sound and picture of a VR experience but your body is crying out for something more immersive, just remember that it's a work in progress. Buckle up, it's going to get bumpy.
