This year you’ll receive an SMS with a difference as technology is introduced to transmit scents through your smartphone, says Josh McNorton
Imagine the next selfie you see posted is accompanied by the scent of perfume. The Instagram photo of your gourmet steak dinner comes with a whiff of buttery mashed potatoes. The olfactory overload of a Sunday afternoon visit to your local flower market can be texted to a friend a thousand miles away. In 2015, I predict that the ability to digitally transmit smells will hit the mainstream.
Digitising messages, images and sounds is so last century. In 2014, scientists in the UK, US and Japan unveiled devices which can electronically simulate smells, providing a direct route to the limbic system of the brain, the part responsible for memory and for provoking emotion.
The current leading device for digital smell transmission is a smartphone attachment called Scentee, developed in Japan and available there and in the US. Scentee can release a puff of coffee or bacon-scented mist to wake you up in the morning (unsurprisingly, this technology was used in a promotional campaign by the Oscar Mayer meat company called Wake Up and Smell the Bacon).
Scentee uses alcohol-based aroma cartridges which come in specific scents and are housed inside a small plastic device that attaches to the headphone jack of a smartphone. The signal is transmitted digitally to the device’s ultrasonic transducer, which then releases the scent as a puff of vapour.
Mugaritz, one of the world’s top-ranked restaurants, has paired Scentee with its mobile app to virtually evoke the aromas of some of its signature dishes. The technology behind Scentee opens the door to a new form of digital escapism. In the case of Mugaritz, users can experience the bouquets of a Michelin-star meal from a restaurant in northern Spain without leaving the UK (or spending the money to eat there).
Adrian Cheok, Professor of Pervasive Computing at City University London, developed the technology behind Scentee and is currently working on a device that doesn’t rely on chemicals or pre-set cartridges. Instead, the latest technology sends a magnetic signal to a mouthguard which sits in the back of the throat and stimulates the olfactory bulb.
If an electronic mouthguard isn’t to your taste, scientists at Harvard have developed the oPhone, a pipe-shaped device made for receiving scent messages (called oNotes) triggered by an iPhone app called oSnap. The app allows you to take a photo and choose one of thousands of aromas to tag it with before sending. In the very near future, we will use devices like the oPhone to take a virtual tour of Marrakech, absorbing all the sounds, sights and smells of the souks and market square.
Professor Cheok and a team of City University researchers have also been studying the effect of synthetic smells, sent via the Internet, on emotions. The implications for marketing are huge. Could the digital scent of salt water and sea breeze on a travel website increase your likelihood of booking a beach holiday?
It’s been half a century since the concept was first introduced to unimpressed cinema audiences and we’ve since voted it one of the worst inventions of all time. But while we’ve turned our noses up at past attempts, I believe 2015 is the year smellovision will finally lose its stink.
Adrian Cheok will be presenting his latest prototypes and projects at FutureFest, Nesta’s two-day festival of innovation on 14-15 March 2015 in London.
During this year’s Royal Institution Christmas Lectures, the distinguished annual event, schoolchildren were treated to a robot orchestra performance and a taste of the electric lollipop developed by City’s Professor Adrian Cheok.
City’s Department of Computer Science has played a prominent role in the 2014 Royal Institution (RI) Christmas Lectures, which were presented by Professor Danielle George, with the theme, ‘Sparks Will Fly’. The lectures will be broadcast on BBC Four at 8pm on December 29th, 30th and 31st.
The RI Christmas Lectures, regarded as an annual highlight among science events for young people, are a series of talks on a single topic. The lectures have been held at London’s Royal Institution each year since 1825, except for the period 1939-1942 during the Second World War.
Michael Faraday initiated the first RI Christmas Lecture Series in 1825 at a time when organised education for young people was scarce. Since then the lectures have followed a tradition of presenting scientific subjects to a general audience in an informative and entertaining manner.
During this year’s RI lectures, children in the audience had the opportunity to test out the world’s first electric lollipop and the Scentee smartphone smell device, developed by Professor Adrian Cheok’s pervasive computing research team. PhD students Emma Yann Zhang, Gilang Pradana and visiting researcher Shogo Nishiguchi helped to demonstrate the taste and smell devices at the Royal Institution. The lecture will be broadcast on BBC Four on 30th December at 8pm.
Young volunteer Zara Rashid, 11, of Henrietta Barnett School in Hampstead, said:
“I thought the electronic lollipop was really cool, it was hard to work out exactly what the flavour was with just the lollipop but when there was a smell as well that made the taste much sharper. I really enjoyed the Christmas Lectures!”
Professor Stephen Hawking has become the most recent high-profile expert to speak out about the dangers of artificial intelligence (AI), telling the BBC that developing it fully “could spell the end of the human race”.
Despite acknowledging that certain forms of AI that have been created so far have been useful – including technology the motor neurone disease sufferer himself uses to help him speak – Hawking, theoretical physicist and author of A Brief History of Time, warned that future developments could be dangerous.
“It would take off on its own, and re-design itself at an ever increasing rate,” he said. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
Hawking is not alone in his concerns. Elon Musk, the billionaire technology entrepreneur, declared in October, during an interview at the Massachusetts Institute of Technology (MIT), that AI was the biggest threat to human survival.
“I think we should be very careful about artificial intelligence,” he said. “If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful.
“I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.”
Musk, who has himself invested in the AI company DeepMind, explained to the American news channel CNBC in June that he did so “not from the standpoint of actually trying to make any investment return… I like to just keep an eye on what’s going on with artificial intelligence. I think there is potentially a dangerous outcome there.”
“There have been movies about this, you know, like Terminator. There are some scary outcomes. And we should try to make sure the outcomes are good, not bad,” he added.
However, some experts are more positive about the possibilities of artificial intelligence.
David Levy is a chess master and AI expert who has twice won the Loebner prize for the most human-like chatbot – once in 1997 and again in 2009.
In 2007 he published a book called Love and Sex with Robots, which claims that sex between humans and robots will be common practice by 2050. In a recent interview with Newsweek, Levy explained: “I believe that loving sex robots will be a great boon to society. There are millions of people out there who, for one reason or another, cannot establish good relationships.”
Another believer in AI is Professor Adrian Cheok from City University London. He and Levy are working together on new ‘chat agent’ software. They hope that the project, called I-Friend, will produce software that can understand and respond to natural human language and speech.
Cheok also believes that AI will allow people to share “digital intimacy” in the future – he is currently developing the ‘Kissinger’ device, which will transfer kisses between two paired devices that mirror the mouth movements of a human.
“When I started out,” says David Levy, international chess champion and expert in artificial intelligence, “I didn’t know anything about artificial vaginas. It is quite extraordinary how much interest there is in that subject.”
Levy’s book, Love and Sex with Robots, is perhaps the fullest exploration of the future of humans and robots, especially their interaction in the bedroom. It explores the details of internet-linked devices that transmit real physical contact.
And Levy is no fantasist. He is the only person to win the Loebner prize – an annual competition to determine which chat software is the most realistic – in two separate decades, first in 1997 and again in 2009.
It was while researching his 2003 book, Robots Unlimited, that he first became interested in the subject. Specifically, he read a quote from a 1984 book by Sherry Turkle, a professor at the Massachusetts Institute of Technology. An interviewee, ‘Anthony’, told Turkle that he had tried having girlfriends but preferred his relationship with his computer.
“That quotation hit me like a brick wall,” says Levy. “I thought – if a smart guy could think like that in 1984, I wonder how much the concept of human-computer emotional relationships has developed since then.”
A great deal is the answer. Adrian David Cheok, Professor of Pervasive Computing at London’s City University, has been refining a device called a Kissinger: a set of pressure-sensitive artificial lips that can transmit a kiss from a real mouth to a similar device owned by a partner who might be thousands of miles away.
The Kissinger system has been in development for about eight years, with the latest model designed to plug into a smartphone. By kissing the screen, the movements of a person’s lips can be mirrored in the other machine and that kiss will be given to whoever has his or her mouth against a corresponding machine.
Several companies have shown an interest in the device and Cheok expects to see it hit the market in mid-2015.
Eventually, Cheok believes, “almost every physical thing, every being, every body, will be connected to the internet in some way.”
The future, he says, will involve the subconscious part of the brain. “We already have intimate data on the internet, but we still don’t feel that we can really know somebody online. There’s something missing between the experience of making a Skype call and meeting someone. And this is where transmitting the other senses is so important.”
Levy, 69, and Cheok, 42, have teamed up to work on a new “chat agent” – software that can understand and respond to natural human language and speech. The project, named I-Friend, will be based on artificial intelligence software that won Levy and his team the Loebner prize for a second time in 2009.
“It will be one of the most realistic artificial chat agents when the project is finished,” says Cheok.
Levy is keen to stress the versatility of the software they’re developing. The I-Friend, he says, can be configured for any embodiment and persona that the market requires. “It could, for example, be an upmarket toy such as a furry animal or a creature from another planet; or a web avatar that repeatedly turns the conversation to discuss a company and its products; or a mobile app such as a virtual girlfriend or boyfriend.”
Cheok adds: “In the first instance, it could probably replace all the phone sex for which people for some reason pay very high rates.” Ultimately, however, the aim would be for it to be “used in robots for artificial love and sex chat”.
And this is where the artificial vaginas come in.
“I believe it is going to be perfectly normal that people will be friends with robots, and that people will have sex with robots,” says Cheok. “All media will touch humanity.”
There is already a market for realistic-looking life-sized dolls made from a durable silicone elastomer. Female dolls have either fixed or removable vaginas and cost anything from $5,000 to $8,000. But they don’t do anything. They are unresponsive.
In time, Levy predicts, it will be quite normal for people to buy robots as companions and lovers. “I believe that loving sex robots will be a great boon to society,” he says. “There are millions of people out there who, for one reason or another, cannot establish good relationships.”
And when does he think this might come about? “I think we’re talking about the middle of the century, if you are referring to a robot that many people would find appealing as a companion, lover, or possible spouse.”
Levy, a former chess master who represented Scotland, developed his interest in computing while studying at the University of St Andrews and later as a computer science postgraduate at the University of Glasgow, where he taught students to program. During this time, he began looking into chess programming, which ultimately led to an interest in human-computer conversation.
The ‘I-Friends’ that he and Cheok are developing will have a sophisticated module endowing the software with emotions, personality and moods. They aim to tailor the software to any required persona – for example, a girlfriend or boyfriend able to take part in continual and varied sexually charged conversations.
I-Friends is a range of conversational software companions based on Artificial Intelligence. Its working name is “Do-Much-More”. Levy and Cheok are currently trying to commercialise this chatbot [a program designed to simulate intelligent conversation] by adding significantly to its conversational capabilities.
It will serve as a software core that can be configured for anything the market requires. It could, for example, be a web avatar that discusses a company and its products; a mobile app such as a virtual girlfriend or boyfriend; or a server-based application with which mobile phone users can interact via SMS messaging. The same core software can be used as the basis for any desired character, simply by changing the data that defines the persona.
“The very first chatbot was the famous ELIZA program written at MIT in the 1960s, named after Eliza Doolittle in George Bernard Shaw’s Pygmalion,” says Levy. “ELIZA did very little but caused a stir at the time and is well documented in the Artificial Intelligence literature. Our first chatbot program had the name Do-A-Lot because it did more than ELIZA. Our second-generation chatbot does even more, and was therefore given the working name Do-Much-More.”
Levy says consumers eventually will be able to experience “appropriately designed artificial genitalia” that feel and behave like the real thing.
“There will be body warmth, synthesised speech, moving limbs. The first sex robots will be primitive in quality but with time more sophisticated ones will be available.”
Do-Much-More delivers a significant leap in performance relative to the original Do-A-Lot software. That leap has been achieved by retaining the original strengths of Do-A-Lot, enhancing its power by extending its system of “variables” (word types) and its morphology (for example, by the inclusion of phrasal verbs), and increasing the sophistication of its response generation system through the use of two important lexical resources developed within the academic computational linguistics community: WordNet and ConceptNet.
WordNet is a semantic lexicon for the English language. It groups English words into sets of synonyms called synsets, provides short, general definitions, and records the various semantic relations between these synonym sets.
The purpose is twofold: to produce a combination of a dictionary and thesaurus that is more intuitively usable, and to support automatic text analysis and artificial intelligence applications. The database and software tools have been released under a formal license and can be downloaded and used freely.
ConceptNet is a knowledge base created as part of the Open Mind Common Sense project, an artificial intelligence initiative based at the Massachusetts Institute of Technology Media Lab. The goal is to build what’s known as a large “common sense knowledge base” from the contributions of many thousands of people across the web.
“We employ WordNet to provide Do-Much-More with certain useful linguistic data about words, helping us to generate responses that generally appear to be natural in terms of word association,” says Levy. “And we employ ConceptNet to provide Do-Much-More with real-world commonsense information so that Do-Much-More sometimes appears not only to understand what the user is saying but also to know something about the subject.”
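Levy’s description of WordNet-driven word association can be pictured with a small sketch. The synonym table below is a tiny hand-coded stand-in for WordNet’s synsets, and the response template is invented for illustration – it is not the Do-Much-More code, which has never been published.

```python
import random

# Tiny stand-in for WordNet synsets: each word maps to a list of synonyms.
# The real WordNet groups English words into thousands of such "synsets".
SYNSETS = {
    "happy": ["glad", "pleased", "delighted"],
    "talk": ["chat", "converse", "speak"],
}

def vary(word, rng=random):
    """Swap a word for a random synonym, if any are known, so that
    repeated responses do not use identical wording."""
    options = SYNSETS.get(word)
    return rng.choice(options) if options else word

def respond(topic):
    """Fill a fixed template with varied word choices."""
    return f"I'm {vary('happy')} to {vary('talk')} about {topic}."
```

Swapping the hand-coded table for a full lexical resource such as WordNet would give the same mechanism access to the whole English lexicon.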
Cheok likens this development to the early days of mobile telephones.
“There were these businessmen with these bricks and you thought it so geeky and who’d ever want to use that?” he says. “Initially, some technologies are a niche market. But once enough people use it you have a kind of bandwagon effect. Now, sure you can choose not to have a mobile phone, but because everyone else has got one, it’s become the new social norm. So I think a lot of these technologies will become like that – including robotics and mixed reality and all these things that people initially might find a little bit scary.”
Correction: An earlier version of this article stated that David Levy was the only person to win the Loebner prize twice. He is in fact the only person to win it in two separate decades.
Gadget Man host Richard Ayoade and celebrity foodie Adrian Edmondson try out Professor Adrian Cheok’s research – the Digital Taste Machine and Scentee from the Mixed Reality Lab – in the latest Gadget Man episode, Cooking and Dining Out.
Human communication often encompasses a mixture of senses. People connect with one another through a combination of sight, sound, smell, taste and touch. The virtual world has become a popular mode of communication as individuals form bonds with people across the globe, yet the technology currently engages only two of the senses – sight and sound. Professor Adrian David Cheok, director of the Mixed Reality Lab, believes that virtual communication may one day embrace all the human senses, making it a truly physical experience. He is at the forefront of this field, integrating touch, taste and smell into current technology. ‘Telepresent’ technology may help form and maintain relationships at a distance in an increasingly globalised world.

Combining pre-existing mobile technology with a plug-in device, the Scentee provides smell-based notifications to the user. Designed by Professor Cheok, the small bulb-like device releases scents from cartridges. A user may, for example, set their alarm to wake them with the smell of coffee, or receive a particular smell depending on who contacts them. The Scentee has proved commercially popular in Japan and has more recently become available worldwide.

While digitising this chemical sense is challenging, Professor Cheok aims to take the technology further by developing a magnetic coil that sits near the olfactory bulb (the part of the brain responsible for interpreting smell) and stimulates an artificial perception of smell. “It is actually true that a smell can subconsciously change your mood, so they are very important senses that you can bring to the internet.”
Taste is another sense which Professor Cheok aims to bring into the virtual world. He has developed a device which stimulates the tongue through electrical impulses, recreating sweet, sour, salty or bitter sensations. Using different combinations of heat and current, Professor Cheok and his team are experimenting to develop a host of different tastes through the device. The team envisions a future where family members may be able to experience eating together at the dinner table from opposite sides of the planet.
Touch is the final sense in the physical jigsaw. Through behaviour such as hugging, touch has the ability to comfort and create a sense of safety. Professor Cheok created the ‘Huggy Pajama’, designed primarily for parents who may want to send hugs to their children while away at work. Connected through the internet, the wearable jacket is filled with air pockets and heating components that inflate and warm in areas that help recreate the sensation of a hug. The virtual sensation of touch may also be more subtle. Professor Cheok helped design the RingU, described as the first ‘tele-hug’ ring. The device aims to bring friends, partners or family members closer together by providing a gentle hugging sensation on the finger. Through the internet, the user may send a signal to their companion’s RingU. The receiving ring then squeezes, providing a simple, effective message that the person is thinking of them. Users of the RingU may also change the intensity of the sensation and the colour that the ring emits, depending on the emotion they want to convey.
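The RingU exchange described above can be sketched as a small network message carrying a squeeze intensity and an LED colour. The payload format, field names and clamping below are assumptions made for illustration; the article does not document RingU’s actual protocol.

```python
import json

def make_hug_message(sender, intensity, colour):
    """Encode a 'tele-hug' as a JSON payload.
    Squeeze intensity is clamped to the range 0.0-1.0."""
    intensity = max(0.0, min(1.0, intensity))
    return json.dumps({"from": sender, "squeeze": intensity, "colour": colour})

def receive_hug(payload):
    """Decode a hug payload into settings for the ring's actuator and LED."""
    msg = json.loads(payload)
    return msg["squeeze"], msg["colour"]
```

In practice such a message would travel over the internet between paired rings; here the encode/decode round trip stands in for that journey.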
Professor Cheok believes people may move from the age of information into the “age of experience”. He believes that, as this technology develops, virtual communication may become a fully immersive physical experience – important in the future of online communication. His goal is to “go beyond the chemicals” and create a fully integrated, immersive virtual experience. Individuals may therefore socialise and communicate with all of their senses through the internet. Rather than receiving a descriptive text of a trip to the pub, individuals may one day virtually experience the atmosphere through online communication. The Scentee, RingU and taste technology all mark the beginning of this complete, physical digitalisation of the senses. Professor Cheok and his colleagues are fast developing the technology to find novel ways to bring telepresence to the public.
How else might telepresent technology help bring people closer together?
A recent competition run by an American bacon manufacturer to win an iPhone-connecting device that emits the smell of rashers with the wake-up alarm could be viewed as a cruel trick on the senses. But it proved popular with thousands of entrants and marked a breakthrough for the London-based academic behind the technology.
Adrian Cheok, professor of pervasive computing at City University London, designed the gadget, which attached to the iPhone via its headphone jack and released a puff of bacon-scented mist, as well as the sound of frying, in a promotion for the Oscar Mayer meat company dubbed Wake Up and Smell the Bacon.
Cheok’s device may have been harnessed to an advertising gimmick, but the Australian scientist has an ambitious plan: to transmit taste and smell electronically so that when someone looks at a picture of a rose on a phone they are able to smell it, or experience the aroma of a Spanish paella when looking at snaps from their holiday on the Costa Brava.
Cheok’s technology is also behind a device available in Japan called Scentee, which is a small balloon-shaped smartphone attachment that emits a smell when programmed to do so.
The £12 device has a motor that vibrates and emits a mist containing the concentrated fluid essence of rose, coffee, lavender or rosemary, among others. It can be programmed to give out a hazy floral puff when someone’s partner contacts them. Smells are contained in separate attachments, which are sold individually and contain between 200 and 300 bursts.
This first step to bringing more senses to the digital world follows a career in which 42-year-old Cheok has worked on augmented reality systems, where computers enhance a user’s experience.
One of his creations was a virtual reality Pac-Man challenge, where the user would put on a specially designed suit and headset to roam the streets looking for “cookies”.
Cheok said: “In the real world we don’t just sense the world with our eyes, we have five senses.
“So with virtual reality, we can see a virtual rose on the table but we can’t touch it, we can’t pick it up, we can’t smell it, we can’t taste it. That is when I started to think that we needed to develop a new type of augmented reality for all of the five senses.”
During early experiments he worked with elderly people, developing applications to combat loneliness, such as allowing separated relatives to cook together via the internet.
Cooking utensils were augmented with haptic technology – devices that recreate the sense of touch – so that users could “feel” each other performing their tasks, while 3D food printers replicated the meals.
“Right now when we think about computing, we don’t think that we can have taste and smell experiences. But actually taste and smell are the only two senses which are connected to the limbic system of the brain, which is also responsible for the emotion and memory,” said Cheok.
“It is actually true that a smell can subconsciously change your mood, so they are very important senses that you can bring to the internet, especially when you are talking about emotional communication [between] for example the elderly and grandchildren,” he added.
The main problem with transmitting taste and smell was clear. While light and sound can be digitised, taste and smell are chemically based and “you can’t send chemicals through wires,” he said.
The initial development was the Scentee, which was launched in Japan and emits only single smells. This is just a first step, however, and other more ambitious projects are in progress.
A prototype device has been made that stimulates sweet, sour, salty and bitter tastes when the prongs are pressed on a person’s tongue.
Low-level electrical currents stimulate the taste neuron, which produces artificial taste sensations.
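In control terms, the scheme described above amounts to a mapping from a basic taste to a set of stimulation parameters. The sketch below illustrates only that idea; the current and temperature values are placeholders, not measurements from Cheok’s prototype.

```python
# Placeholder stimulation profiles: (current in microamps, electrode
# temperature in degrees Celsius). These numbers are illustrative only,
# standing in for the "combinations of heat and amperage" the team tunes.
TASTE_PROFILES = {
    "sour":   (180, 25),
    "salty":  (40, 25),
    "bitter": (80, 35),
    "sweet":  (60, 35),
}

def stimulation_settings(taste):
    """Look up the (current, temperature) pair for a basic taste."""
    if taste not in TASTE_PROFILES:
        raise ValueError(f"unsupported taste: {taste}")
    return TASTE_PROFILES[taste]
```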
Cheok is also working with neuroscientists to make a magnetic coil to go in the back of the mouth to indirectly stimulate the olfactory bulb, a key part of the brain responsible for smell. Using a device like a mouthguard, he said, this could produce artificial smell perception in the brain.
“Once you have digitised this signal, you can transmit through the internet. You can transmit to your mobile phone. It becomes a signal which does not require the transmission of chemicals,” he said.
“My research goal is to go beyond the chemicals. We want to make versions of this where you don’t need this digital device so you basically just have the electronics and then the electronics could be just embedded into your mobile phone or the back of your mouth and then making infinite kinds of smell combinations once you can control the signals.”
The advent of wearable technologies that interact with the body, such as health monitoring devices, will result in people accepting such devices in coming years, he said.
“[Now] you still can’t feel the experience of what it is like to be in this place in London or if we share a meal together or go to the pub together.
“In the future we will be able to communicate our experience – not just communicate information but experience.
“I believe we will go from the information age, which is where we are now, to the age of experience.”
Also in development is a ring to be worn on the finger, which, using haptic (tactile) technologies, can be squeezed to send a sensation to a corresponding ring – effectively creating a greeting between partners.
“Once we make these devices, the virtual world will become almost as physical as our reality,” said Cheok.
While Heston Blumenthal’s Fat Duck restaurant is known for hiding an iPod inside a seashell so that diners can hear the sound of the sea during one of the fish courses, Spain’s Mugaritz – also ranked among the best eateries in the world – went one step further. The head chef, Andoni Luis Aduriz, used a Scentee to simulate one of his dishes: when someone virtually crushes herbs and spices using his restaurant’s smartphone app, aromas of black pepper, sesame and saffron are emitted from the device.