Sensory hacking: perfume-infused dreams and virtual intimacy

posted in: Media


http://www.wired.co.uk/news/archive/2014-03/31/touch-taste-and-smell-technology

31 MARCH 14 by KATIE COLLINS

Augmented reality is yesterday’s news. Wired.co.uk takes an in-depth look at some of the developing technology designed to virtually stimulate all five senses


“It’s very strong,” I’m warned, but it is already too late — the sour taste of lemons has hit and I’m reminded instantly of how I never could finish those Toxic Waste sweets. My tongue, which is wedged between two metal sensors, cannot bear it for more than a few seconds even now.

I pull off the electrodes and the sensation vanishes. By passing an electrical current through my tongue, they had temporarily tricked my taste receptors into experiencing a sour taste. Varying the frequency of the current allows the electrodes to also simulate other tastes — sweet, salty and bitter.
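To give a feel for how such a device might be driven in software, here is a minimal sketch. Everything in it is an assumption made for the example: the serial command format, the parameter values and the port name are invented, not taken from Cheok's hardware.

```python
# Illustrative only: a hypothetical driver for a tongue-electrode taste actuator.
# The command format, parameter values and port name are invented for this
# sketch; they are not taken from Cheok's device.
import serial  # pyserial

# Hypothetical mapping from taste sensation to stimulation settings
TASTE_PROFILES = {
    "sour":   {"current_uA": 180, "frequency_hz": 50},
    "salty":  {"current_uA": 40,  "frequency_hz": 250},
    "sweet":  {"current_uA": 80,  "frequency_hz": 600},
    "bitter": {"current_uA": 120, "frequency_hz": 900},
}

def stimulate(port: str, taste: str, duration_s: float = 2.0) -> None:
    """Send a single stimulation command to the electrode controller."""
    profile = TASTE_PROFILES[taste]
    command = "STIM {current_uA} {frequency_hz} {dur}\n".format(dur=duration_s, **profile)
    with serial.Serial(port, baudrate=115200, timeout=1) as link:
        link.write(command.encode("ascii"))

if __name__ == "__main__":
    stimulate("/dev/ttyUSB0", "sour")  # the lemon hit described above
```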

Wired.co.uk first saw this taste actuator device when it was part of a shortlisted proposal for Ferran Adrià’s Hacking Bullipedia competition. The proposal, put together by Professor Adrian Cheok, founder and director of Singapore’s Mixed Reality Lab, didn’t win the competition, but one of the other main elements of Cheok’s proposal has already successfully been made into a commercial product.


The Scentee is a module that can be attached to the bottom of a smartphone and emits puffs of scent using chemical cartridges. It simply slots into the phone’s headphone jack and works in conjunction with an app. The basic idea behind it is that you can send aromas over the internet, although the technology is increasingly being appropriated as a marketing tool.

MICHELIN-STAR ODOURS

While the Scentee may have failed to win favour with Ferran Adrià, that doesn’t mean there’s no place for it in the world of fine dining. In one commercial project the device, along with a dedicated app, is being used to create a pre-dinner treat for customers of the Michelin-starred Mugaritz in San Sebastian, Spain, which is currently listed as the fourth best restaurant in the world.

Mugaritz’s head chef, who trained under Adrià at El Bulli, is using the Scentee to connect with customers who might make bookings several months in advance, by giving them a small taste (or whiff) of what to expect from one of his dishes beforehand.


“It’s basically very simple, they give you different kinds of seeds in this bowl and you grind it and it’s multisensory — the sound, the smell, the vision — and the taste of course — so finally you drink it,” explains Cheok.

The Mugaritz team created an app that diners are asked to download when they are sent the Scentee device. The app virtually recreates the experience of crushing the seeds in a pestle and mortar, letting the user experience the sound and smell of making the dish. Holding the phone horizontal, Cheok demos the app, which shows a partial view inside a mortar. As the phone — which acts as the pestle — is gently moved in a stirring motion, more of the mortar is revealed and seeds drop into the bowl with a tinkle.

“That’s the actual sound they recorded from the mortar,” he says, as it rings gently. After rotating the phone for a while, the Scentee device emits puffs of pepper, sesame and saffron, all of which were also created in the restaurant’s kitchen. “What he’s saying is that he wants to share the experience of the restaurant even before you go there.”
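Stripped of the visuals, the grinding interaction is simple enough to sketch. In the toy version below, the phone's motion sensor, the recorded mortar sound and the Scentee puff are replaced by hypothetical stand-ins; nothing here comes from the actual Mugaritz app.

```python
# Sketch of the grinding interaction described above, reduced to its logic.
# The sensor stream, audio call and scent trigger are stand-ins invented for
# this example, since the real app runs on a phone with its own SDKs.
import math
import random
from typing import Iterator, Tuple

def gyro_stream() -> Iterator[Tuple[float, float]]:
    """Yield (angular_velocity_rad_s, dt_s) samples; faked here for the sketch."""
    while True:
        yield random.uniform(0.5, 3.0), 0.02

def play_mortar_tinkle() -> None:
    print("tinkle")                      # placeholder for the recorded mortar sound

def emit_scent(cartridge: str) -> None:
    print(f"puff of {cartridge}")        # placeholder for the scent-module trigger

def run_grinding_session(target_rotations: float = 5.0) -> None:
    """Accumulate stirring motion; drop a seed (with its tinkle) every half
    rotation, then release the pepper, sesame and saffron puffs at the end."""
    rotations = 0.0
    seeds_dropped = 0
    for angular_velocity, dt in gyro_stream():
        rotations += angular_velocity * dt / (2 * math.pi)
        if rotations >= (seeds_dropped + 1) * 0.5:
            seeds_dropped += 1
            play_mortar_tinkle()
        if rotations >= target_rotations:
            break
    for cartridge in ("pepper", "sesame", "saffron"):
        emit_scent(cartridge)

if __name__ == "__main__":
    run_grinding_session()
```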

WAKE UP AND SMELL THE BACON

The Scentee is also being used more widely in advertising campaigns, including on television. One campaign in the US has seen bacon company Oscar Mayer make a thousand or so bacon-scented capsules, which it is sending out with Scentee devices to competition winners, with the promise that they can wake up to the smell of fresh bacon as their phone alarm goes off in the morning.

While there is nothing else quite like the Scentee available on the market, the chemical stimulation of smell is not in itself a particularly new idea. At the moment, however, Cheok is planning to build on his previous work with the Scentee and the taste actuator to create a device that can stimulate smell electronically.

“If we can do it, I think it would be the first time anyone makes an artificial smell sensation,” says Cheok. It’s a tricky task because of where the olfactory bulb — the neural structure that perceives odours — sits in the brain. It’s tucked right at the back of the nasal cavity and is very soft and spongy, which would make it hard to attach electrodes to even if it could be reached. Instead, Cheok has something completely non-invasive in mind.

“We will put a magnetic coil at the back of the mouth — so maybe something like a dental guard you can wear — and because the olfactory bulb is quite close to the palatine bone, we can use time-varying magnetic fields to produce electrical currents in the olfactory bulbs. That will then produce artificial smell simulation, similar to the taste,” he explains.

“Using this technique, we can also produce the more complicated smells so because you know our tongue is only the basic five tastes — sour, salty, sweet, bitter and umami — everything else which is actually flavour is from our nasal cavity, so when we develop this it’ll be much more complicated.”
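The mechanism Cheok describes is ordinary electromagnetic induction, the same principle behind transcranial magnetic stimulation. As a textbook relation (not a specification of his coil), the induced electromotive force is

\[
\mathcal{E} = -\frac{d\Phi_B}{dt},
\]

where \(\Phi_B\) is the magnetic flux through the tissue. Only a time-varying field gives a changing flux, which is why a static magnet held against the palate would do nothing: it is the rate of change that drives the small currents intended to excite neurons in the olfactory bulb.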

This is the research Cheok is currently undertaking at his lab at City University in London, where he is professor of pervasive computing. While he’s not actually sure what kind of product it might become a part of yet, he says he is confident that “something will come along”. “Sometimes you have to push the barriers of science and then people will come up with their own application.”

Cheok has been based at City University for under a year, before which he was based at Keio University in Japan and the National University of Singapore. He first became interested in simulating touch, smell and taste when he was working on a project to build an augmented reality toolkit about ten years ago. While it was groundbreaking at the time, he quickly realised it was stimulating only one of our senses, despite all the available data and potential for communication across multiple channels. “So that’s basically the motivation,” he says. “Can we go from the age of information to the age of experience?”

THE AGE OF VIRTUAL INTIMACY

“You can have a virtual animal there — a 3D dog or something — but people always wanted to touch it, it’s just a natural reaction. When you see objects, touching is a very important part of how we explore the world, so I realised we had to go beyond just augmenting our vision and we should also try to augment all of the five senses. More and more of our communication now is done online, but online we still can’t get the sense of presence we have in the physical world,” said Cheok.

This started him working on replicating touch. He kicked off his research by developing ways for people to overcome the difficulties of interacting with their pets from a distance (“it’s very hard to make a telephone call, because we can’t speak to animals yet”), by creating a squeezable doll that would trigger pressure actuators on the animal’s body.

This research morphed into a project called the Huggy Pajama, which was designed to allow similar communication between parents and toddlers wearing haptic jackets. The concept has now been turned into a commercial product that’s specifically designed to help give comfort to children with autism who have difficulty with human-to-human contact.

Cheok, along with his colleagues at the National University of Singapore, conducted research into the emotional impact of this kind of touch, publishing a paper in 2008 entitled Squeeze me, but don’t tease me, which concluded that “touch seems to be a special sensory signal that influences recipients in the absence of conscious reflection and that promotes prosocial behaviour”.

It’s already a well-known phenomenon that if you’re being touched by another human while watching a horror film, you have a decreased fear response, but the team’s research showed that touch using virtual devices provoked an almost identical decrease in fear response.

“Similar to taste and smell, touch has a different part of the brain processing the haptic sensation than audiovisual. So it’s not like if you just write ‘hug’ in your email — it’s definitely different actually really hugging, because it’s a different part of your brain which is actuated when you’re doing the hug. Having this actual touch sensation does produce a different response.”


Of course there are many different potential applications here for consumer products too. Cheok and his small team at City are currently working hard to meet their deadline on the RingU, a piece of jewellery that transmits a haptic vibration to the wearer of a paired ring when the silicone gem on top is pressed. Three tiny LEDs, smaller than pinheads, will also cause the ring to glow gently when the ‘hug’ is transmitted. The finished product is due to launch in Japan and Korea first, and will eventually be available in the UK as well.
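Functionally, a pair of RingUs comes down to a small messaging loop: squeeze one ring and a short 'hug' event reaches its partner, which vibrates and glows. The sketch below models that loop with an invented UDP message and stand-in hardware calls; the real rings presumably relay through a companion phone app rather than talking to each other directly.

```python
# A toy model of the RingU pairing described above: a squeeze on one ring
# becomes a short vibration and a soft LED glow on its partner. The message
# format, port and hardware calls are invented for this sketch.
import json
import socket

HUG_PORT = 9999  # hypothetical port for ring-to-ring messages

def send_hug(partner_ip: str, duration_ms: int = 400) -> None:
    """Called when the silicone gem is pressed; notify the paired ring."""
    payload = json.dumps({"event": "hug", "duration_ms": duration_ms}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (partner_ip, HUG_PORT))

def vibrate(duration_ms: int) -> None:
    print(f"vibrating for {duration_ms} ms")     # stand-in for the haptic motor

def glow_leds(duration_ms: int) -> None:
    print(f"glowing LEDs for {duration_ms} ms")  # stand-in for the LED driver

def listen_for_hugs() -> None:
    """On the receiving ring: vibrate and light the three LEDs on each hug."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", HUG_PORT))
        while True:
            data, _addr = sock.recvfrom(1024)
            event = json.loads(data)
            if event.get("event") == "hug":
                vibrate(event["duration_ms"])
                glow_leds(event["duration_ms"])
```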


As well as being a standalone consumer product used for long-distance personal communication, Cheok envisages that the RingU will be branded by bands, who will then be able to communicate with teenage fans at concerts in a similar way to Xylobands. “Of course one pop star can’t hug a million fans, but this way we can have a virtual hug and so fans will like that — a different kind of connection.

“With vision and sound it is omni-directional, so you can go to a concert and hundreds of thousands of people can hear the music, but with touch it’s very limited, it’s very intimate,” he adds.

Speaking of intimate, Cheok is also currently working on the latest version of the Kissenger, a device that allows people to send kiss messages over the internet. Wired.co.uk first reported on the product when it involved kissing spherical robotic pigs, but the new iteration will look very different (and, thankfully, not at all like an animal). Users will attach a module to the bottom of their smartphones, a little bit like the Scentee, and will then be able to kiss face-to-face while video calling. The resolution on the module will be much higher, says Cheok, working like a pin image captor in order to provide a very detailed and precise level of feedback.


It’s not only whimsical consumer products being developed thanks to the evolving haptic touch technology though. As Cheok points out, there are also a lot of potential applications in related areas like robotics. “Robots need to be able to sense the physical world, especially humanoid robots, and also for example, home robots. It will be carrying grandma to bed or something and so it needs to be able to have very realistic touch.”

Collaboration is not only a vital part of finding applications for the technology, but for understanding the potential scientific benefits of the research. While attempting to create the electrical smell simulator, Cheok will be working alongside a neuroscientist from the University of Marseille to discover more about the actual areas of the brain that are being stimulated when people experience electrical tastes and smells, compared to those being stimulated when they experience the real deal.


“For example, this,” says Cheok, gesturing to the taste actuator, “allows taste perception of sour, but we’re not really sure yet whether it stimulates the same parts of the brain as the real sour, so we’re going to do these experiments and compare.” They will study the brain signals they observe from people using the smell and taste technology inside an fMRI machine, as well as brain signals from people who have had drops of liquid put on their tongues.

PROGRAMMING DREAMS WITH SMELL

Many of Cheok’s biggest hopes and dreams for future research will require closer work with neuroscientists, particularly to test the technology as widely as possible across potential areas for application, including mental health. Even though he could potentially pursue the entrepreneurship opportunities his research opens up, his ambitions lie in pushing the boundaries of science.

“I’ve been thinking for a few years now, can we interface with people when they’re sleeping? So much of computer technology is focused on the conscious communication, but a lot of communication is subconscious.”


Cheok has been inspired to pursue this idea further by a piece of recent research which discovered that people could be taught to remember aromas they had smelt for the first time while asleep. Test subjects were exposed to smells while sleeping and later placed in a magnetic imaging machine, which showed that when they encountered the same smell again, the area of the brain relating to memory was activated.

“Because smell is connected to emotion, we want to see if we can programme people’s dreams,” states Cheok boldly. “We want to see if we can use these kind of smell devices, for example, to make a happy dream or a fearful dream.” His hope is that this could potentially be used to help treat those who suffer from bad dreams due to post-traumatic stress.

Once they have built the technology, the first step would be to test its effect on the emotions of people who are awake, and then repeat the experiment on people who are asleep. From there, they could begin to work out how people would use it — although, says Cheok, given that most people take their phones to bed with them these days, that should be fairly easy. “You’ve already got a device that’s a computer and can emit a smell… so then you could use this to affect people’s sleep and maybe even new kinds of learning.”

Cheok’s big dreams do not stop there though. His current work relies on finding ways to stimulate the sensors that activate the olfactory bulb and simulate the effects of touch, but ultimately he would like to find a way to bypass the sensors and go directly to the brain itself.

“This might seem a little bit science fiction now, but already there’s been some work where they can connect the optical fibre to the neuron of an animal,” he says. “That means we can already send some electrical signal from a computer to a neuron and it really won’t take long until we can do this for hundreds and thousands of neurons. Eventually I think we’re going to see in our lifetime some direct brain interface and that will be probably the next stage of this research.”

This moonshot strategy might seem overly ambitious, but it has worked for Cheok before and he believes it will again. “What I always say to students is do research that is a quantum step, not just incremental,” he says. “We’re not always successful, because sometimes you can’t get a thing to work… but that’s what we’re aiming for.”

 

Interview on CNN: Forget text messaging, the ‘oPhone’ lets you send smells

posted in: Media


http://edition.cnn.com/2014/03/17/tech/innovation/the-ophone-phone-lets-you-send-smells/

By Kieron Monks for CNN

March 17, 2014

Already on the market is the Scentee plug-in. It allows a smartphone user to attach a small device to their phone and receive “smell notifications” when a message arrives.
Currently, each device can only emit one smell at a time: “Right now it’s the equivalent of music before MP3s,” says augmented reality professor Adrian Cheok. “You had to record a song on a tape and physically give it someone.”
Dr Cheok (right) is hoping to change this. Alongside chef Andoni Luis Aduriz, he presented the “world’s first digital smell app” at the Madrid Fusion 2014 food festival.
The device contains magnetic coils that send electric signals into the brain’s olfactory bulb to simulate the effect of smell. Cheok hopes to have a prototype available within two years.

(CNN) — Holiday albums could be less forgettable when pictures of a Mediterranean meal carry the scent of olives; a selfie on the beach contains a trace of salt spray or a rainy London scene conveys the distinctive aroma of freshly wet concrete.

If the digital age has increased the volume of communication, it may not have improved the quality. Reversing that trend is the goal of a new generation of sensory engineers who are going beyond sight and sound to produce devices that use our untapped faculties. Perhaps the most exciting breakthroughs right now are arriving in the form of smell-centered communication.

“Our motto is ‘aroma tells a thousand pictures’,” says Dr. David Edwards, biomedical engineer at Harvard and founder of Le Laboratoire, known for producing radical sensory devices such as calorie-free chocolate spray. Every human has thousands of distinct smell sensors, Edwards explains, a resource he taps with his newest invention, the oPhone.

Set for a beta launch in July, this phone offers the most sophisticated smell messaging yet created. In collaboration with Paris perfumers Givaudan and baristas Café Coutume, Edwards has created a menu of scents, contained in ‘Ochips’. MIT electrical engineer Eyal Shahar designed containers for them that release their aromas when heated at the touch of a button, but cool quickly to keep smells distinct and localized, a historic difficulty with the much-mocked smell-o-vision experiments in cinema.

Mix and match

The oPhone user can mix and match aromas and then send their composition as a message, which will be recreated on a fellow user’s device. Up to 356 combinations will be possible in the first wave, rising to several thousand in the next year, and the dream is an exhaustive base — the ‘universal chip’.
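One way to picture a ‘scent message’ is as a small, ordered list of aroma-chip notes that the recipient’s device replays. The data structure below is a guess for illustration only; the oPhone’s actual message format is not described in the article, and the chip names and fields are invented.

```python
# A guess at what a "scent message" might look like as data: an ordered list of
# aroma-chip notes that the receiving device replays. All names and fields are
# illustrative; nothing here is taken from the oPhone's real format.
from dataclasses import dataclass, asdict
from typing import List
import json

@dataclass
class ScentNote:
    chip: str          # which Ochip to heat, e.g. "espresso_crema" (made up)
    intensity: float   # 0.0 - 1.0, how strongly to release it
    duration_s: float  # how long to keep the chip warm

@dataclass
class ScentMessage:
    sender: str
    notes: List[ScentNote]

    def to_json(self) -> str:
        return json.dumps(asdict(self))

def heat_chip(chip: str, intensity: float, duration_s: float) -> None:
    print(f"releasing {chip} at {intensity:.0%} for {duration_s}s")  # hardware stand-in

def replay(message_json: str) -> None:
    """On the recipient's device: heat each chip in turn."""
    message = json.loads(message_json)
    for note in message["notes"]:
        heat_chip(note["chip"], note["intensity"], note["duration_s"])

if __name__ == "__main__":
    msg = ScentMessage("edwards", [ScentNote("espresso_crema", 0.8, 3.0),
                                   ScentNote("toasted_hazelnut", 0.4, 2.0)])
    replay(msg.to_json())
```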

“Biologically we respond powerfully to aroma, so if we become familiar with the design of aromatic communication we might be able to say things we couldn’t before,” says Edwards. He sees the limited aromas of the oPhone as the first letters of a rich new language that may be used as a basis for novels and symphonies. The faith is grounded in the acknowledged influence of smell on the subconscious, and the potential to learn its secrets.

The first oPhones will be limited to a select community of coffee enthusiasts. But the launch on July 10 will be accompanied by a more inclusive product: the first olfactory social network.


A free app will allow anyone to compose and send a smell note by text or email, based on a set menu of aromas and variations. The message can be received by any normal phone as a text. The recipient can then download the composition from hotspots which will be set up in the launch city of Boston.

“We’re expecting an interest in self-expression and we’re ready to learn with the public”, says Edwards. “We would like to be reactive as new ideas for aromatic vocabularies arise, and to continue providing them for new interests.”

He is betting the public around Boston’s famous technology centers are early adopters, and will take the concept forward. Beyond the city, the network will include a public interface for people to trade tips and recipes, and store them in cloud software. Edwards plans to feature ‘smell emoticons’ and viral stunts, and may offer a mixing deck that allows overlap with music production software.

The concept can benefit from saturation of the current communication market, says trend analyst and editor of ‘Green Futures’ Anna Simpson. “We’re reaching a limit with what we can do with text data, and there is the potential to connect more deeply and personally through smell.”

Simpson also believes a consumer shift toward experience could drive adoption. “There is growing interest from brands in resources for creating richer experiences.”

Smelly start-ups

Giants such as Olympus are publishing research, but for now start-ups are taking the initiative. Singapore’s Mixed Reality Lab has been prolific in this space, engineering the Japanese device Scentee, which allows users to send a single fragrance between them. The company released an app worldwide in February, and has formed lucrative partnerships, such as one with Mugaritz restaurant in Spain that allows online cooking tutorials with leading chefs to give students a whiff of the smell they are aiming for.


“Right now it’s the equivalent of music before mp3s, when you had to record a song on a tape and physically give it someone”, says Dr. Adrian Cheok, founder of the Mixed Reality Lab and professor of pervasive technology at City University, London. “We can send a basic scent through a device like Scentee, but we need the framework to make millions of them available through digitization.”

Cheok is testing a device that would connect us directly to the Internet, inspired by the successful connection of optical fibres to neurons of mice. His lab experiments involve subjects wearing a mouthguard-like device containing magnetic coils, from which electric signals are directed into the olfactory bulb to simulate the effect of smell. The wearers’ brains are scanned before and after to pinpoint the effect, and the results have encouraged Cheok enough to believe a prototype could be available in two years.

A related technique has already borne fruit with a similar design simulating the effects of taste. But taste has just four primary forms — bitter, sweet, salty, sour — whereas smell involves identifying individual molecules with no primary form.

“The most basic smell still has hundreds of molecules and you need analytical chemistry to see what’s there”, says Dr. Joel Mainland of the Monell Chemical Senses Center. “Perhaps only 5% would have an impact on smell, so it’s difficult to pick them out. It’s more trial and error than quantitative science.”

Healthy aroma

Monell is also pursuing the goal of digitizing olfaction, with healthcare applications high on the agenda. One of its research areas is seeking smell biomarkers in cancer patients, using an ‘e-nose’ to hunt for chemicals in the blood and deliver early diagnosis. The process was inspired by the ability of dogs to sense sickness, although dogs’ smelling ability remains many times greater.

Although this research is still young in the lab, similar technology is already being smartphone-enabled. A NASA-developed chemical sensor has been released to a commercial partner as the basis for mobile applications that could breath-test users. UK nanotechnology company Owlstone is raising several million dollars in venture capital for a handheld sensor that could detect a wider range of diseases.

Medical uses are high on the agenda for the burgeoning Digital Olfaction Society, whose upcoming conference will discuss olfaction technology for identifying dangerous gases, guidance for the blind and cognitive aid for Alzheimer’s sufferers. But industries as varied as military, travel, jewellery, food and entertainment will also be represented.

Dr. Cheok believes the ultimate goal is a multi-sensory device unifying all five senses to create an immersive virtual reality, which could be usable within five years. The neglected senses are making up for lost time.

BBC TV Feature: Can an ‘electronic lollipop’ simulate taste?

posted in: Media


http://www.live.bbc.co.uk/news/technology-26487218

8 March 2014 Last updated at 00:09 GMT

Scientists at City University, London have developed a machine that they say is able to simulate taste.

The experts claim that the so-called “electronic lollipop”, when touched on the tongue, is able to trick taste receptors using an electronic signal.

BBC Click’s Spencer Kelly reports.

Watch more clips on the Click website. If you are in the UK you can watch the whole programme on BBC iPlayer.

BBC TV Feature: How to turn your smartphone into a ‘smell phone’

posted in: Media

http://www.live.bbc.co.uk/news/technology-26526916

12 March 2014 Last updated at 08:26 GMT

Award-winning chef Andoni Luis Aduriz is developing an app to bring the full sensory experience of his cooking to smartphones.

The app allows the user to virtually recreate one of his signature dishes, which can then be smelt through a device that plugs into the phone.

BBC Click’s Lara Lewington reports.

Watch more clips on the Click website. If you are in the UK you can watch the whole programme on BBC iPlayer.

Adrian David Cheok, Mixed Reality Lab, City University London, features on BBC Click | BBC News Channel

posted in: Media


http://www.bbc.co.uk/programmes/b03y9z19/broadcasts/upcoming

DURATION: 30 MINUTES

Click investigates the latest ‘smell’ tech. Includes tech news and Webscape.

Upcoming Broadcast Schedule:
Also available on BBC iPlayer:

http://www.bbc.co.uk/iplayer/episode/b03y9z19/Click_08_03_2014/

Broadcast Times Outside UK on BBC World News:

http://www.bbc.co.uk/programmes/n13xtmd5/broadcasts/upcoming
