Sending a scent via text messaging is poised to be the next big thing in 2015, says British innovation charity Nesta.
According to the group’s top 10 prediction list for 2015, “smell-o-grams” or smelltexts via a smartphone will be the craze in the coming months.
Imagine that instead of buying roses for your loved one, you can send him or her a smell-o-gram via your smartphone. Sounds far-fetched? Not at all.
In 1960, Smell-O-Vision, a similar system that diffused odors, was used during the screening of the film Scent of Mystery. The idea was to make cinemagoers associate each smell with the on-screen action. However, the invention did not go down well with viewers, and Time magazine later listed it among the worst inventions of all time.
The 2015 concept may yet have the sweet smell of success, thanks to the advent of more advanced technology. Earlier in 2014, researchers at Harvard used an iPhone app to transmit the smell of macarons and champagne from Paris to New York.
“Imagine the next selfie you see posted is accompanied by the scent of perfume. The Instagram photo of your gourmet steak dinner comes with a whiff of buttery mashed potatoes,” said Josh McNorton, Nesta’s project manager. “In 2015, I predict that the ability to digitally transmit smells will hit the mainstream.”
Sending smells along with a picture over a smartphone is already possible thanks to Scentee, a device built on technology developed by Adrian Cheok that plugs into the headphone socket of a smartphone. Scentee uses alcohol-based aroma cartridges to diffuse wisps of vapor when triggered.
Another device that could help make smelltexts mainstream is the pipe-shaped oPhone Duo, which is capable of producing over 300,000 fragrances by combining aromas from its eight vapor cartridges. oPhone enables users to send smelltexts, or oNotes, via its oSnap app.
Whether smelltexts will indeed be the next big thing or fail to strike a note with consumers like Smell-O-Vision remains to be seen.
PUBLISHED: 05:58, 19 December 2014 | UPDATED: 15:47, 19 December 2014
Scientists sent smell of champagne from Paris to US with iPhone in June
oPhone Duo device can recreate smell out of palette of 32 available scents
And the Scentee iPhone attachment can release aroma upon receiving text
Emerging technologies forecast by innovation charity Nesta to make it big
Others include life-saving apps and food waste feeding millions of people
Smartphones will be sending and receiving scented messages by the end of next year, experts say.
The concept is one of ten emerging technologies forecast by innovation charity Nesta to make it big in 2015, with others including life-saving apps and food waste feeding millions of people.
It comes six months after scientists managed to send the smell of champagne and macarons from Paris to New York with an iPhone app using a device called the oPhone Duo.
The system consists of the oSnap app, which allows users to create an oNote carrying a smell built from a palette of 32 available scents that can be combined in more than 300,000 ways.
The oNote can then be sent to the oPhone hardware – a device which is able to recreate the smell.
Other technology in this field is the Scentee device, which can release a favourite aroma at the same time as a phone clock alarm or when an individual receives a text message.
It also claims to be able to change the taste of food with its mini air-freshener-like alcohol-based aroma cartridges. A user can select to emit a puff of scent at will using the small plastic device.
City University computing professor Adrian Cheok developed the technology behind Scentee and is now working on a device that will send a magnetic signal to a mouthguard in the back of the throat.
Nesta project manager Josh McNorton said: ‘Imagine the next selfie you see posted is accompanied by the scent of perfume. The Instagram photo of your gourmet steak dinner comes with a whiff of buttery mashed potatoes.
‘The olfactory overload of a Sunday afternoon visit to your local flower market can be texted to a friend a thousand miles away. In 2015, I predict that the ability to digitally transmit smells will hit the mainstream.’
It has been more than half a century since the concept of ‘Smell-O-Vision’ was introduced to cinema audiences, making its first widespread appearance in the 1960 film Scent Of Mystery.
The film opened in three specially-equipped theatres in New York, Chicago and Los Angeles – with the idea being that certain odours would be timed to specific points in the narrative.
But the mechanism did not work properly and audience members complained of a hissing noise accompanying the scents – as well as a delay between the actions and their corresponding smells.
Mr McNorton added: ‘While we’ve turned our noses up at past attempts, I believe 2015 is the year “smell-o-vision” will finally lose its stink.’
Another prediction is of a huge innovation in first aid that will see ambulance trusts use smartphone technology to locate trained first aiders nearby, who can respond before paramedics arrive.
It is also claimed that in 2015 enough fruit and vegetables will be diverted from food waste to feed millions of people through ‘gleaning’: harvesting food that would otherwise be left to rot on farms.
Of Nesta’s ten 2014 predictions, one of the most interesting is that there would be an ‘introduction of services that help us improve our lives based on the data that we give away every day’.
SMELLS ON SCREENS: A HISTORY
Smell-O-Vision was a system created by Hans Laube and used in cinemas in 1960 during the film Scent of Mystery.
The system was fitted to cinema seats and released 30 smells at different points during the film, triggered by the film’s soundtrack. Smells included pipe tobacco.
In 2013, researchers in Tokyo developed a prototype smelling screen. The smelling screen combines a digital display with four small fans, one at each corner of the display.
Smells are stored in gel packets and are released at set times. The smells are blown parallel to the screen. By varying the speed and strength of each fan, the different smells are moved to specific areas of the screen.
2015’S TRENDS, SOCIAL MOVEMENTS AND TECH BREAKTHROUGHS (via Nesta)
Democracy makes itself at home online: 2015 will see the creation of new political parties organised in radically different ways
Smell-O-Vision loses its stink: This year you’ll receive an SMS with a difference as technology is introduced to transmit scents through your smartphone
Internet of everything, coming to a neighbourhood near you: 2015 will bring new network technologies that connect constellations of low-powered sensors across entire districts, creating widespread smart civic infrastructure
Digital art gets up close and personal: This year digital art will become entrenched in daily life as cultural producers exploit digital technologies to create more accessible experiences
Killer apps for life savers: This year smartphone tech will fuel the biggest innovation in first aid for over 100 years
Crafts get a 21st century makeover: Shared access to digital fabrication tools such as laser cutters and 3D printers will create a new breed of digital artisan manufacturers
Gleaning will change our attitude to food waste: In 2015, enough fruit and veg will be diverted from food waste to offer millions one of their five a day
A bust funded by the crowd: 2015 will see a high-profile blow-up in the world of crowdfunding and peer-to-peer lending. But this is a good sign, not a bad one
Programming a new generation of digital makers: From apps to films, in 2015 every young person across the UK will make and share something digital
Crowd-aware billboards: This year cities will play host to Minority Report style billboards broadcasting tailored content based on data from your GPS-enabled phone
This year you’ll receive an SMS with a difference as technology is introduced to transmit scents through your smartphone, says Josh McNorton
Imagine the next selfie you see posted is accompanied by the scent of perfume. The Instagram photo of your gourmet steak dinner comes with a whiff of buttery mashed potatoes. The olfactory overload of a Sunday afternoon visit to your local flower market can be texted to a friend a thousand miles away. In 2015, I predict that the ability to digitally transmit smells will hit the mainstream.
Digitising messages, images and sounds is so last century. In 2014, scientists in the UK, US and Japan unveiled devices which can electronically simulate smells, providing a direct route to the limbic system of the brain, the part responsible for memory and emotion.
The current leading device for digital smell transmission is a smartphone attachment called Scentee, developed in Japan and available there and in the US. Scentee can release a puff of coffee or bacon-scented mist to wake you up in the morning (unsurprisingly, this technology was used in a promotional campaign by the Oscar Mayer meat company called Wake Up and Smell the Bacon).
Scentee uses alcohol-based aroma cartridges which come in specific flavours and are housed inside a small plastic device that attaches to the headphone input of a smartphone. The signal is transmitted digitally to the device’s ultrasonic transducer, which then releases the scent as a puff of vapour.
Mugaritz, one of the world’s top-ranked restaurants, has paired Scentee with its mobile app to virtually evoke the aromas of some of its signature dishes. The technology behind Scentee opens the door to a new form of digital escapism. In the case of Mugaritz, users can experience the bouquets of a Michelin-star meal from a restaurant in northern Spain without leaving the UK (or spending the money to eat there).
Adrian Cheok, Professor of Pervasive Computing at City University London, developed the technology behind Scentee and is currently working on a device that doesn’t rely on chemicals or pre-set cartridges. Instead, the latest technology sends a magnetic signal to a mouthguard which sits in the back of the throat and stimulates the olfactory bulb.
If an electronic mouthguard isn’t to your taste, scientists at Harvard have developed the oPhone, a pipe-shaped device made for receiving scent messages (called oNotes) triggered by an iPhone app called oSnap. The app allows you to take a photo and choose one of thousands of aromas to tag it with before sending. In the very near future, we will use devices like the oPhone to take a virtual tour of Marrakech, absorbing all the sounds, sights and smells of the souks and market square.
Professor Cheok and a team of City University researchers have also been studying the effect of synthetic smells, sent via the Internet, on emotions. The implications for marketing are huge. Could the digital scent of salt water and sea breeze on a travel website increase your likelihood of booking a beach holiday?
It’s been half a century since the concept was first introduced to unimpressed cinema audiences and we’ve since voted it one of the worst inventions of all time. But while we’ve turned our noses up at past attempts, I believe 2015 is the year smellovision will finally lose its stink.
Adrian Cheok will be presenting his latest prototypes and projects at FutureFest, Nesta’s two-day festival of innovation on 14-15 March 2015 in London.
During this year’s event, schoolchildren were treated to a robot orchestra performance and a taste of the electronic lollipop developed by City’s Professor Adrian Cheok.
City’s Department of Computer Science has played a prominent role in the 2014 Royal Institution (RI) Christmas Lectures, which were presented by Professor Danielle George, with the theme, ‘Sparks Will Fly’. The lectures will be broadcast on BBC Four at 8pm on December 29th, 30th and 31st.
The RI Christmas Lecture Series, regarded as an annual highlight of science communication for young people, is a series of talks on a single topic. The lectures have been held at London’s Royal Institution each year since 1825, except for the period 1939-1942 during the Second World War.
Michael Faraday initiated the first RI Christmas Lecture Series in 1825 at a time when organised education for young people was scarce. Since then the lectures have followed a tradition of presenting scientific subjects to a general audience in an informative and entertaining manner.
During this year’s RI lectures, children in the audience had the opportunity to test out the world’s first electronic lollipop and the Scentee smartphone smell device, developed by Professor Adrian Cheok’s pervasive computing research team. PhD students Emma Yann Zhang, Gilang Pradana and visiting researcher Shogo Nishiguchi helped to demonstrate the taste and smell devices at the Royal Institution. The lecture will be broadcast on BBC Four on 30th December at 8pm.
Young volunteer Zara Rashid, 11, of Henrietta Barnett School in Hampstead, said:
“I thought the electronic lollipop was really cool, it was hard to work out exactly what the flavour was with just the lollipop but when there was a smell as well that made the taste much sharper. I really enjoyed the Christmas Lectures!”
Professor Stephen Hawking has become the most recent high-profile expert to speak out about the dangers of artificial intelligence (AI), telling the BBC that developing it fully “could spell the end of the human race”.
Despite acknowledging that certain forms of AI that have been created so far have been useful – including technology the motor neurone disease sufferer himself uses to help him speak – Hawking, theoretical physicist and author of A Brief History of Time, warned that future developments could be dangerous.
“It would take off on its own, and re-design itself at an ever increasing rate,” he said. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
Hawking is not alone in his concerns. Elon Musk, the billionaire technology entrepreneur, declared in October that AI was the biggest threat to human survival, during an interview at the Massachusetts Institute of Technology (MIT).
“I think we should be very careful about artificial intelligence,” he said. “If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful.
“I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.”
Despite having invested in AI company DeepMind himself, Musk explained to American news channel CNBC in June that he made this decision “not from the standpoint of actually trying to make any investment return… I like to just keep an eye on what’s going on with artificial intelligence. I think there is potentially a dangerous outcome there.”
“There have been movies about this, you know, like Terminator. There are some scary outcomes. And we should try to make sure the outcomes are good, not bad,” he added.
However, some experts are more positive about the possibilities of artificial intelligence.
David Levy is a chess Master and an AI expert who has won the Loebner prize for the most human-like chatbot twice – once in 1997 and again in 2009.
In 2007 he published a book called Love and Sex with Robots, which claims that human and robot sex will be common practice by 2050. In a recent interview with Newsweek, Levy explained: “I believe that loving sex robots will be a great boon to society. There are millions of people out there who, for one reason or another, cannot establish good relationships.”
Another believer in AI is Professor Adrian Cheok from City University London. He and Levy are working together to create new ‘chat agent’ software. They hope that the project, called I-Friend, will produce software that can respond to natural human language and speech.
Cheok also believes that AI will allow people to share “digital intimacy” in the future – he is currently developing the ‘Kissinger’, a pair of connected devices that track the movement of a user’s lips and transfer kisses between them.
“When I started out,” says David Levy, international chess champion and expert in artificial intelligence, “I didn’t know anything about artificial vaginas. It is quite extraordinary how much interest there is in that subject.”
Levy’s book, Love and Sex with Robots, is perhaps the fullest exploration of the future of humans and robots, especially their interaction in the bedroom. It explores the details of internet-linked devices that transmit real physical contact.
And Levy is no fantasist. He is the only person to win the Loebner prize – an annual competition to determine which chat software is the most realistic – in two separate decades, first in 1997 and again in 2009.
It was while researching his 2003 book, Robots Unlimited, that he first became interested in the subject. Specifically, he read a quote from a 1984 book by Sherry Turkle, a professor at the Massachusetts Institute of Technology. An interviewee, ‘Anthony’, told Turkle that he had tried having girlfriends but preferred his relationship with his computer.
“That quotation hit me like a brick wall,” says Levy. “I thought – if a smart guy could think like that in 1984, I wonder how much the concept of human-computer emotional relationships has developed since then.”
A great deal is the answer. Adrian David Cheok, Professor of Pervasive Computing at London’s City University, has been refining a device called a Kissinger: a set of pressure-sensitive artificial lips that can transmit a kiss from a real mouth to a similar device owned by a partner who might be thousands of miles away.
The Kissinger system has been in development for about eight years, with the latest model designed to plug into a smartphone. By kissing the screen, the movements of a person’s lips can be mirrored in the other machine and that kiss will be given to whoever has his or her mouth against a corresponding machine.
Several companies have shown an interest in the device and Cheok expects to see it hit the market in mid-2015.
Eventually, Cheok believes, “almost every physical thing, every being, every body, will be connected to the internet in some way.’’
The future, he says, will involve the subconscious part of the brain. “We already have intimate data on the internet, but we still don’t feel that we can really know somebody online. There’s something missing between the experience of making a Skype call and meeting someone. And this is where transmitting the other senses is so important.”
Levy, 69, and Cheok, 42, have teamed up to work on a new “chat agent” – software that can understand and respond to natural human language and speech. The project, named I-Friend, will be based on artificial intelligence software that won Levy and his team the Loebner prize for a second time in 2009.
“It will be one of the most realistic artificial chat agents when the project is finished,” says Cheok.
Levy is keen to stress the versatility of the software they’re developing. The I-Friend, he says, can be configured for any embodiment and persona that the market requires. “It could, for example, be an upmarket toy such as a furry animal or a creature from another planet; or a web avatar that repeatedly turns the conversation to discuss a company and its products; or a mobile app such as a virtual girlfriend or boyfriend.”
Cheok adds: “In the first instance, it could probably replace all the phone sex for which people for some reason pay very high rates.” Ultimately, however, the aim would be for it to be “used in robots for artificial love and sex chat”.
And this is where the artificial vaginas come in.
“I believe it is going to be perfectly normal that people will be friends with robots, and that people will have sex with robots,” says Cheok. “All media will touch humanity.”
There is already a market for realistic-looking life-sized dolls made from a durable silicone elastomer. Female dolls either have fixed or removable vaginas and cost anything from $5,000-$8,000. But they don’t do anything. They are unresponsive.
In time, Levy predicts, it will be quite normal for people to buy robots as companions and lovers. “I believe that loving sex robots will be a great boon to society,” he says. “There are millions of people out there who, for one reason or another, cannot establish good relationships.”
And when does he think this might come about? “I think we’re talking about the middle of the century, if you are referring to a robot that many people would find appealing as a companion, lover, or possible spouse.”
Levy, a former chess Master who represented Scotland, developed his interest in computing while studying at the University of St Andrews and later as a computer science postgraduate at the University of Glasgow, where he taught his students to program. During this time, he began looking into chess programming, which ultimately led to an interest in human-computer conversation.
The “I-Friends” that he and Cheok are developing will have a sophisticated module which will endow the software with emotions, personality and moods. They aim to tailor the software to any required persona, for example a girlfriend or boyfriend who will be able to take part in continual and varied sexually-charged conversations.
I-Friends is a range of conversational software companions based on Artificial Intelligence. Its working name is “Do-Much-More”. Levy and Cheok are currently trying to commercialise this chatbot [a program designed to simulate intelligent conversation] by adding significantly to its conversational capabilities.
It will serve as a software core that can be configured for anything the market requires. It could, for example, power a web avatar that discusses a company and its products; a mobile app such as a virtual girlfriend or boyfriend; or a server-based application with which cell phone users can interact via SMS messaging. The same core software can be used as the basis for any desired character, simply by changing the data that defines the persona.
“The very first chatbot was the famous ELIZA program written at MIT in the 1960s, named after Eliza Doolittle in George Bernard Shaw’s Pygmalion,” says Levy. “ELIZA did very little but caused a stir at the time and is well documented in the Artificial Intelligence literature. Our first chatbot program had the name Do-A-Lot because it did more than ELIZA. Our second generation chatbot does even more, and was therefore given the working name Do-Much-More.”
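The pattern-matching approach ELIZA pioneered can be sketched in a few lines of Python. This is a simplified illustration of the general technique, not the original MIT program or Levy's software: each rule pairs a pattern with a response template, and captured text is "reflected" from first person into second person before being echoed back.

```python
import re

# Minimal ELIZA-style sketch: hand-made rules, purely illustrative.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),  # catch-all
]

def reflect(fragment):
    # Swap first-person words for second-person equivalents.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(sentence):
    # Try each rule in order; the catch-all guarantees a reply.
    for pattern, template in RULES:
        match = pattern.match(sentence)
        if match:
            return template.format(*[reflect(g) for g in match.groups()])

print(respond("I need a holiday"))            # Why do you need a holiday?
print(respond("I am tired of my computer"))   # How long have you been tired of your computer?
```

The trick, as the article notes, is that such a program "does very little": it has no model of meaning, only surface patterns, which is exactly the limitation Do-Much-More's lexical resources were meant to address.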
Levy says consumers eventually will be able to experience “appropriately designed artificial genitalia’’ that feel and behave like the real thing.
“There will be body warmth, synthesised speech, moving limbs. The first sex robots will be primitive in quality but with time more sophisticated ones will be available.’’
Do-Much-More delivers a significant leap in performance over the original Do-A-Lot software. That leap has been achieved by retaining the original strengths of Do-A-Lot; extending its system of “variables” (word types) and its morphology (for example, by including phrasal verbs); and increasing the sophistication of its response-generation system through two important lexical resources developed within the academic Computational Linguistics community: WordNet and ConceptNet.
WordNet is a semantic lexicon for the English language. It groups English words into sets of synonyms called synsets, provides short, general definitions, and records the various semantic relations between these synonym sets.
The purpose is twofold: to produce a combination of a dictionary and thesaurus that is more intuitively usable, and to support automatic text analysis and artificial intelligence applications. The database and software tools have been released under a formal license and can be downloaded and used freely.
ConceptNet is knowledge-based, created as part of the Open Mind Common Sense project, which is an artificial intelligence scheme based at the Massachusetts Institute of Technology Media Lab. The goal is to build what’s known as a large “common sense knowledge base’’ developed from the contributions of many thousands of people across the web.
“We employ WordNet to provide Do-Much-More with certain useful linguistic data about words, helping us to generate responses that generally appear to be natural in terms of word association,’’ says Levy. “And we employ ConceptNet to provide Do-Much-More with real-world commonsense information so that Do-Much-More sometimes appears not only to understand what the user is saying but also to know something about the subject.’’
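The synset idea described above can be made concrete with a toy example. The data below is a hypothetical hand-made sample, not the real WordNet database (which is far larger and is typically accessed through libraries such as NLTK), but it shows the structure Levy describes: words grouped into synonym sets, each with a short gloss, which a chatbot can query for word associations.

```python
# Toy stand-in for WordNet-style data: synsets with synonyms and a gloss.
# Entries are illustrative samples, not taken from the actual database.
SYNSETS = {
    "dog": {"synonyms": {"dog", "domestic dog"},
            "gloss": "a domesticated carnivorous mammal"},
    "talk": {"synonyms": {"talk", "speak", "converse"},
             "gloss": "exchange thoughts using spoken words"},
}

def are_synonyms(a, b):
    """Return True if the two words share a synset in the sample data."""
    return any(a in s["synonyms"] and b in s["synonyms"]
               for s in SYNSETS.values())

print(are_synonyms("speak", "converse"))  # True
print(are_synonyms("dog", "speak"))       # False
```

A chatbot with access to such groupings can recognise that a user saying "converse" and a rule mentioning "talk" refer to the same concept, which is the kind of natural-seeming word association Levy attributes to WordNet.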
Cheok likens this development to the early days of mobile telephones.
“There were these businessmen with these bricks and you thought it so geeky and who’d ever want to use that?’’ he says. “Initially, some technologies are a niche market. But once enough people use it you have a kind of bandwagon effect. Now, sure you can choose not to have a mobile phone, but because everyone else has got one, it’s become the new social norm. So I think a lot of these technologies will become like that – including robotics and mixed reality and all these things that people initially might find a little bit scary.’’
Correction: An earlier version of this article stated that David Levy was the only person to win the Loebner prize twice. He is in fact the only person to win it in two separate decades.
Gadget Man host Richard Ayoade and celebrity foodie Adrian Edmondson try out Professor Adrian Cheok’s research – the Digital Taste Machine and Scentee from the Mixed Reality Lab – in the latest Gadget Man episode, Cooking and Dining Out.
Human communication often encompasses a mixture of senses: people connect with one another through a combination of sight, sound, smell, taste and touch. Virtual communication has become a popular way for individuals to form bonds with people across the world, but the technology currently engages only two of the senses – sight and sound. Professor Adrian David Cheok, director of the Mixed Reality Lab, believes that virtual communication may one day embrace all the human senses, making it a truly physical experience. He is at the forefront of this ‘telepresent technology’, integrating touch, taste and smell into current devices to help form and maintain relationships at a distance in an increasingly globalised world.
Combining pre-existing mobile technology with a plug-in device, the Scentee provides smell-based notifications to the user. Designed by Professor Cheok, the small bulb-like device releases scents from cartridges: a user may choose to set their alarm to wake up to the smell of coffee, or to receive a certain smell depending on who contacts them. Scentee has proved commercially popular in Japan and has more recently become available worldwide. While digitising this chemical sensation is challenging, Professor Cheok aims to take the technology further by manufacturing a magnetic coil that sits near the olfactory bulb (the part of the brain responsible for interpreting smell) and stimulates the artificial perception of smell. “It is actually true that a smell can subconsciously change your mood, so they are very important senses that you can bring to the internet.”
Taste is another sense which Professor Cheok aims to bring into the virtual world. He has developed a device which stimulates the tongue through electrical impulses. It may recreate sweet, sour, salty or bitter sensations. Using different combinations of heat and amperage Professor Cheok and his team are experimenting to develop a host of different tastes through the device. The team envisions a future where family members may be able to experience eating together at the dinner table from the other side of the planet.
Touch is the final sense in the physical jigsaw. Through behaviours such as hugging, touch has the ability to comfort and create a sense of safety. Professor Cheok created the ‘Huggy Pajama’, designed primarily for parents who may want to send hugs to their children when away at work. Connected through the internet, the wearable jacket is filled with air pockets and heating components that inflate and warm in areas that help recreate the sensation of a hug. However, the virtual sensation of touch may be more subtle. Professor Cheok also helped design the RingU, described as the first ‘tele-hug’ ring. The device aims to bring friends, partners or family members closer together by providing a subtle hugging sensation on the finger. Through the internet, the user may send a signal to their companion’s RingU. The receiving ring then squeezes, delivering a simple, effective message that the sender is thinking of them. Users of the RingU may also change the intensity of the sensation and the colour that the ring emits, depending on the emotion that they want to convey.
Professor Cheok believes people may move from the age of information into the “age of experience”. He believes that, as this technology develops, virtual communication may become a fully immersive physical experience – important in the future of online communication. His goal is to “go beyond the chemicals” and create a fully integrated, immersive virtual experience. Individuals may therefore socialise and communicate with all of their senses through the internet. Rather than receiving a descriptive text of a trip to the pub, individuals may one day virtually experience the atmosphere through online communication. The Scentee, RingU and taste technology all mark the beginning of this complete, physical digitalisation of the senses. Professor Cheok and his colleagues are fast developing the technology to find novel ways to bring telepresence to the public.
How else might telepresent technology help bring people closer together?