http://www.wired.co.uk/news/archive/2014-03/31/touch-taste-and-smell-technology
31 MARCH 14 by KATIE COLLINS
Augmented reality is yesterday’s news. Wired.co.uk takes an in-depth look at some of the developing technology designed to virtually stimulate all five senses
“It’s very strong,” I’m warned, but it is already too late — the sour taste of lemons has hit and I’m reminded instantly of how I never could finish those Toxic Waste sweets. My tongue, which is wedged between two metal sensors, cannot bear it for more than a few seconds even now.
I pull off the electrodes and the sensation vanishes. By passing an electrical current through my tongue, they had temporarily tricked my taste receptors into experiencing a sour taste. Varying the frequency of the current allows the electrodes to simulate other tastes too: sweet, salty and bitter.
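As a rough illustration, and nothing more, the control logic for an actuator like this could amount to a lookup from a target taste to a pair of stimulation parameters. The values and names below are invented placeholders, not the calibration used in Cheok's device.

```python
# Illustrative sketch only: the mapping from taste to current and frequency
# is made up for this example, not taken from Cheok's taste actuator.
TASTE_PROFILES = {
    # taste: (current in microamps, frequency in hertz) -- placeholder values
    "sour":   (180, 50),
    "salty":  (100, 120),
    "sweet":  (60, 200),
    "bitter": (140, 80),
}

def stimulate(taste: str, duration_s: float = 2.0) -> None:
    """Drive the tongue electrodes with the profile for the requested taste."""
    current_ua, freq_hz = TASTE_PROFILES[taste]
    # On real hardware this would programme a signal generator or DAC;
    # here we simply report the parameters that would be applied.
    print(f"Applying {current_ua} uA at {freq_hz} Hz for {duration_s}s -> {taste}")

stimulate("sour")
```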
Wired.co.uk first saw this taste actuator device when it was part of a shortlisted proposal for Ferran Adrià’s Hacking Bullipedia competition. The proposal, put together by Professor Adrian Cheok, founder and director of Singapore’s Mixed Reality Lab, didn’t win the competition, but one of the other main elements of Cheok’s proposal has already been successfully made into a commercial product.
The Scentee is a module that can be attached to the bottom of a smartphone and emits puffs of scent using chemical cartridges. It simply slots into the phone’s headphone jack and works in conjunction with an app. The basic idea behind it is that you can send aromas over the internet, although the technology is increasingly being appropriated as a marketing tool.
MICHELIN-STAR ODOURS
While the Scentee may have failed to win favour with Ferran Adrià, that doesn’t mean there’s no place for it in the world of fine dining. In one commercial project the device, along with a dedicated app, is being used to create a pre-dinner treat for customers of the Michelin-starred Mugaritz in San Sebastian, Spain, which is currently listed as the fourth best restaurant in the world.
Mugaritz’s head chef, who trained under Adrià at El Bulli, is using the Scentee to connect with customers who might make bookings several months in advance, by giving them a small taste (or whiff) of what to expect from one of his dishes beforehand.
“It’s basically very simple, they give you different kinds of seeds in this bowl and you grind it and it’s multisensory — the sound, the smell, the vision — and the taste of course — so finally you drink it,” explains Cheok.
The Mugaritz team created an app that diners will be told to download when they are sent the Scentee device. The app virtually recreates the experience of crushing the seeds in a pestle and mortar, allowing the user to experience the sound and smell of making the dish. Holding the phone horizontally, Cheok demos the app, which at first allows only a partial view inside a mortar. As the phone, which acts as the pestle, is gently moved with a stirring action, however, more of the mortar is revealed and seeds drop into the bowl with a tinkle.
“That’s the actual sound they recorded from the mortar,” he says, as it rings gently. After rotating the phone for a while, the Scentee device emits puffs of pepper, sesame and saffron, all of which were also created in the restaurant’s kitchen. “What he’s saying is that he wants to share the experience of the restaurant even before you go there.”
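The Mugaritz app itself is not public, but the interaction it describes, detecting a stirring gesture and responding with sound and a puff of scent, is easy to sketch. Everything below is hypothetical: the threshold, the helper functions and the cartridge names are stand-ins, not the real app's code.

```python
import math

# Hypothetical sketch of the grinding interaction described above; the real
# Mugaritz/Scentee app is not public and these helpers are invented.
STIR_THRESHOLD_RAD_S = 1.5   # angular speed treated as "stirring" (placeholder)
PUFF_EVERY_N_STIRS = 20      # release scent after this many stir samples

def play_mortar_sound() -> None:
    pass  # stand-in for playing the recorded "tinkle" of seeds in the mortar

def release_scent(cartridges: list[str]) -> None:
    print(f"Scentee puff: {', '.join(cartridges)}")

def on_gyro_sample(angular_speed_rad_s: float, state: dict) -> None:
    """Called for each gyroscope reading while the phone is held as a pestle."""
    if angular_speed_rad_s > STIR_THRESHOLD_RAD_S:
        state["stir_count"] += 1
        play_mortar_sound()
        if state["stir_count"] % PUFF_EVERY_N_STIRS == 0:
            release_scent(["pepper", "sesame", "saffron"])

# Simulated burst of gyroscope readings standing in for a stirring gesture.
state = {"stir_count": 0}
for i in range(60):
    on_gyro_sample(2.0 + math.sin(i / 5), state)
```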
WAKE UP AND SMELL THE BACON
The Scentee is also being used more widely in advertising campaigns, including on television. One particular campaign in the US has seen bacon company Oscar Mayer make a thousand or so bacon capsules, which it is sending out with Scentees to competition winners, with the promise that they can wake up to the smell of fresh bacon as their phone alarm goes off in the morning.
While there is nothing else quite like the Scentee available on the market, the chemical stimulation of smell is not in itself a particularly new idea. At the moment, however, Cheok is planning to use his previous work on the Scentee and the taste actuator to help him try to build a device that will be able to stimulate smell electronically.
“If we can do it, I think it would be the first time anyone makes an artificial smell sensation,” says Cheok. It’s a tricky task because of where the olfactory bulb — the neural structure that perceives odours — sits in the brain. It’s tucked right at the back of the nasal cavity and is very soft and spongy, which would make it hard to attach electrodes to even if it could be reached. Instead, Cheok has something completely non-invasive in mind.
“We will put a magnetic coil at the back of the mouth — so maybe something like a dental guard you can wear — and because the olfactory bulb is quite close to the palatine bone, we can use time-varying magnetic fields to produce electrical currents in the olfactory bulbs. That will then produce artificial smell simulation, similar to the taste,” he explains.
“Using this technique, we can also produce the more complicated smells so because you know our tongue is only the basic five tastes — sour, salty, sweet, bitter and umami — everything else which is actually flavour is from our nasal cavity, so when we develop this it’ll be much more complicated.”
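In plain terms, what Cheok is describing is electromagnetic induction: a coil driven with a changing current produces a changing magnetic field, which in turn induces an electric field, and hence small currents, in nearby conductive tissue such as the olfactory bulb. Faraday's law captures the relationship.

```latex
% Faraday's law of induction: a magnetic field B that varies in time
% induces an electric field E, which can drive currents in nearby tissue.
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
```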
This is the research Cheok is currently undertaking at his lab at City University in London, where he is professor of pervasive computing. While he’s not actually sure what kind of product it might become a part of yet, he says he is confident that “something will come along”. “Sometimes you have to push the barriers of science and then people will come up with their own application.”
Cheok has been based at City University for under a year, before which he worked at Keio University in Japan and the National University of Singapore. He first became interested in simulating touch, smell and taste when he was working on a project to build an augmented reality toolkit about ten years ago. While it was groundbreaking at the time, he quickly realised it was stimulating only one of our senses, despite all the available data and the potential for communication across multiple channels. “So that’s basically the motivation,” he says. “Can we go from the age of information to the age of experience?”
THE AGE OF VIRTUAL INTIMACY
“You can have a virtual animal there — a 3D dog or something — but people always wanted to touch it, it’s just a natural reaction. When you see objects, touching is a very important part of how we explore the world, so I realised we had to go beyond just augmenting our vision and we should also try to augment all of the five senses. More and more of our communication now is done online, but online we still can’t get the sense of presence we have in the physical world,” said Cheok.
This started him working on replicating touch. He kicked off his research by developing ways for people to overcome the difficulties of interacting with their pets from a distance (“it’s very hard to make a telephone call, because we can’t speak to animals yet”), by creating a squeezable doll that would trigger pressure actuators on the animal’s body.
This research morphed into a project called the Huggy Pajama, which was designed to allow similar communication between parents and toddlers wearing haptic jackets. The concept has now been turned into a commercial product that’s specifically designed to help give comfort to children with autism who have difficulty with human-to-human contact.
Cheok, along with his colleagues at the National University of Singapore, conducted research into the emotional impact of this kind of touch, publishing a paper in 2008 entitled “Squeeze me, but don’t tease me”, which concluded that “touch seems to be a special sensory signal that influences recipients in the absence of conscious reflection and that promotes prosocial behaviour”.
It’s already a well-known phenomenon that if you’re being touched by another human while watching a horror film, you have a decreased fear response, but the team’s research showed that touch using virtual devices provoked an almost identical decrease in fear response.
“Similar to taste and smell, touch has a different part of the brain processing the haptic sensation than audiovisual. So it’s not like if you just write ‘hug’ in your email — it’s definitely different actually really hugging, because it’s a different part of your brain which is actuated when you’re doing the hug. Having this actual touch sensation does produce a different response.”
Of course there are many different potential applications here for consumer products too. Cheok and his small team at City are currently working hard to meet their deadline on the RingU, a piece of jewellery that transmits a haptic vibration to the wearer of a paired ring when the silicone gem on top is pressed. Three tiny LEDs, smaller than pin heads, will also cause the ring to glow gently when the ‘hug’ is transmitted. The finished product is due to launch in Japan and Korea first, and will eventually be available in the UK as well.
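The paired-ring interaction can be pictured as a very small message protocol: a squeeze on one ring produces a "hug" message, and the partner ring reacts by vibrating and lighting its LEDs. The sketch below is purely hypothetical; the message fields, durations and function names are assumptions, not the actual RingU protocol.

```python
import json

# Hypothetical sketch of the paired-ring 'hug' exchange described above;
# the real RingU protocol and hardware interface are not public.

def on_gem_pressed(partner_id: str) -> str:
    """Build the message sent when the silicone gem is squeezed."""
    return json.dumps({"type": "hug", "to": partner_id,
                       "vibe_ms": 400, "glow_ms": 800})

def on_message_received(raw: str) -> None:
    """On the partner's ring: vibrate briefly and light the three LEDs."""
    msg = json.loads(raw)
    if msg["type"] == "hug":
        print(f"vibrate for {msg['vibe_ms']} ms")
        print(f"glow LEDs for {msg['glow_ms']} ms")

# One ring is squeezed; the paired ring reacts.
on_message_received(on_gem_pressed(partner_id="ring-42"))
```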
As well as being a standalone consumer product used for long-distance personal communication, Cheok envisages that the RingU will be branded by bands, who will then be able to communicate with teenage fans at concerts in a similar way to Xylobands. “Of course one pop star can’t hug a million fans, but this way we can have a virtual hug and so fans will like that — a different kind of connection.
“With vision and sound it is omni-directional, so you can go to a concert and hundreds of thousands of people can hear the music, but with touch it’s very limited, it’s very intimate,” he adds.
Speaking of intimate, Cheok is also currently working on the latest version of the Kissenger, a device that allows people to send kiss messages over the internet. Wired.co.uk first reported on the product when it involved kissing spherical robotic pigs, but the new iteration will look very different (and, thankfully, not at all like an animal). Users will attach a module to the bottom of their smartphones, a little bit like the Scentee, and will then be able to kiss face-to-face while video calling. The resolution on the module will be much higher, says Cheok, working like a pin image captor in order to provide a very detailed and precise level of feedback.
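The "pin image captor" description suggests a grid of pressure-sensing pins on one module whose readings are streamed and mirrored by a matching grid of actuator pins on the other. The Kissenger's actual design is not public, so the sketch below is only an assumed illustration of that idea, with made-up grid sizes and helper names.

```python
# Hypothetical illustration of mirroring a grid of lip-pressure sensors onto
# a matching grid of actuator pins; the real Kissenger design is not public.
Grid = list[list[float]]   # pressure values in the range 0.0-1.0

def capture_pressure_grid() -> Grid:
    """Stand-in for reading the sender's pin sensors (here: a fixed sample)."""
    return [
        [0.0, 0.2, 0.3, 0.2, 0.0],
        [0.1, 0.6, 0.9, 0.6, 0.1],
        [0.0, 0.3, 0.5, 0.3, 0.0],
    ]

def actuate(grid: Grid) -> None:
    """Stand-in for driving the receiver's actuator pins to matching depths."""
    for row in grid:
        print(" ".join(f"{p:.1f}" for p in row))

# Each video-call frame, the sender's grid would be streamed to the receiver.
actuate(capture_pressure_grid())
```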
It’s not only whimsical consumer products being developed thanks to the evolving haptic touch technology though. As Cheok points out, there are also a lot of potential applications in related areas like robotics. “Robots need to be able to sense the physical world, especially humanoid robots, and also for example, home robots. It will be carrying grandma to bed or something and so it needs to be able to have very realistic touch.”
Collaboration is not only a vital part of finding applications for the technology, but for understanding the potential scientific benefits of the research. While attempting to create the electrical smell simulator, Cheok will be working alongside a neuroscientist from the University of Marseille to discover more about the actual areas of the brain that are being stimulated when people experience electrical tastes and smells, compared to those being stimulated when they experience the real deal.
“For example, this,” says Cheok, gesturing to the taste actuator, “allows taste perception of sour, but we’re not really sure yet whether it stimulates the same parts of the brain as the real sour, so we’re going to do these experiments and compare.” They will study the brain signals observed from people using the smell and taste technology inside an fMRI machine, as well as brain signals from people who have had drops of liquid put on their tongues.
PROGRAMMING DREAMS WITH SMELL
Many of Cheok’s biggest hopes and dreams for future research will require closer work with neuroscientists, particularly to test the technology as widely as possible across potential areas of application, including mental health. Even though he could potentially pursue the entrepreneurship opportunities his research opens up, his ambitions lie in pushing the boundaries of science.
“I’ve been thinking for a few years now, can we interface with people when they’re sleeping? So much of computer technology is focused on the conscious communication, but a lot of communication is subconscious.”
Cheok has been inspired to pursue this idea further by a recent piece of research which found that people could be taught to remember aromas they had smelt for the first time while asleep. After being exposed to smells in their sleep, test subjects were put in a magnetic imaging machine, which showed that when they encountered the same smell again, the area of the brain relating to memory was activated.
“Because smell is connected to emotion, we want to see if we can programme people’s dreams,” states Cheok boldly. “We want to see if we can use these kind of smell devices, for example, to make a happy dream or a fearful dream.” His hope is that this could potentially be used to help treat those who suffer from bad dreams due to post-traumatic stress.
Once they have built the technology, the first step would be to test its effect on the emotions of people who are awake, and then to repeat the experiment on people who are asleep. From there, they could begin to work out how people would use it — although, says Cheok, given that most people take their phones to bed with them these days, that should be fairly easy. “You’ve already got a device that’s a computer and can emit a smell… so then you could use this to affect people’s sleep and maybe even new kinds of learning.”
Cheok’s big dreams do not stop there though. His current work relies on finding ways to stimulate the sensors that activate the olfactory bulb and to simulate the effects of touch, but ultimately he would like to find a way to bypass the sensors and go directly to the brain itself.
“This might seem a little bit science fiction now, but already there’s been some work where they can connect the optical fibre to the neuron of an animal,” he says. “That means we can already send some electrical signal from a computer to a neuron and it really won’t take long until we can do this for hundreds and thousands of neurons. Eventually I think we’re going to see in our lifetime some direct brain interface and that will be probably the next stage of this research.”
This moonshot strategy might seem overly ambitious, but it has worked for Cheok before and he believes it will again. “What I always say to students is do research that is a quantum step, not just incremental,” he says. “We’re not always successful, because sometimes you can’t get a thing to work… but that’s what we’re aiming for.”