Apps that can communicate touch, taste and smell: A taste of what’s to come

posted in: Media


Interview article from The Independent:
By Rhodri Marsden

This device uses electrodes to convince the brain that it is ‘tasting’ something

Websites and apps are frequently described by their creators as offering a “rich experience”. The beautiful designs, intuitive layouts and compelling interactivity may well be engaging and satisfying to use, but when they’re hailed as being a “feast for the senses”, it’s evident that they’re a feast for merely two.

Online entertainment is about sight and sound; everything is mediated through a glass panel and a speaker, leaving us well short of being immersed in an alternative reality. But with studies having demonstrated that more than half of human communication is non-verbal, scientists have been working on ways of communicating touch, taste and smell via the internet, and many of those experiments have been gathering pace.

“What do you smell?” asks Adrian Cheok, professor of pervasive computing at City University London. The whiff of melon is unmistakable; it emerged from a tiny device clipped to an iPhone and was triggered by Cheok standing on the other side of the room. “Right,” he says. “These devices have been commercialised in Japan – they’re selling 10,000 units a month – and they’re bringing smells into a social interface.” It’s still early days with this technology; the device I’m holding is similar to an inkjet printer in that it contains a melon “smell sachet”, and when it’s empty you have to buy another one. Nor is it a particularly new concept; in 1999, Wired magazine ran a front cover story about a company called Digiscents that had produced a USB “personal scent synthesiser” for your computer called the iSmell. Digiscents folded two years later. But the technology that failed to excite us back then now looks slightly less gimmicky in the context of modern smartphone usage, with its super-connectivity and emoticons galore.

On the surface, Cheok’s projects are fun, almost throwaway. “I’ve worked on hugging pyjamas,” he says. “They consist of a suit you can put on your body to virtually hug someone, remotely. Then we have these small haptic rings; if I squeeze my ring someone else will feel a squeeze on theirs through the internet – like a remote sensation of hand-holding.” He’s also been working on a device with electrodes that excites taste receptors on the tongue, producing an artificial sensation of taste in the brain. Similar work is also under way at the National University of Singapore, where a team of researchers is constructing a “digital lollipop” that fools the tongue into experiencing sweet, salt, sour or bitter tastes.
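Published work on electrical taste stimulation (including the digital lollipop mentioned above) reports that varying the magnitude and frequency of a small current on the tongue evokes different taste sensations. As a purely illustrative sketch – the parameter values and the `stimulation_settings` helper below are hypothetical, not taken from Cheok’s device – such a mapping might look like:

```python
# Illustrative sketch: choosing electrode parameters for a target taste.
# The (current, frequency) pairs are hypothetical placeholders, not
# measured values -- real systems calibrate per user and per electrode.

TASTE_PARAMS = {
    # taste: (current in microamps, frequency in hertz) -- illustrative only
    "sour":   (120, 100),
    "salty":  (60, 50),
    "bitter": (140, 10),
    "sweet":  (40, 200),
}

def stimulation_settings(taste: str) -> dict:
    """Return electrode settings for a requested taste, or raise if unknown."""
    if taste not in TASTE_PARAMS:
        raise ValueError(f"no stimulation profile for taste: {taste!r}")
    current_ua, freq_hz = TASTE_PARAMS[taste]
    return {"current_ua": current_ua, "freq_hz": freq_hz}

print(stimulation_settings("sour"))  # {'current_ua': 120, 'freq_hz': 100}
```

A real device would additionally calibrate these values per user, since the taste perceived from electrical stimulation varies widely between people.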

Adrian Cheok demonstrates one of his creations

 

In the shorter term, the applications of these devices seem slightly frivolous; Cheok’s rings, for example, are being turned into a product that the music industry plans to sell to fans. “You go to the concert,” he says, “the pop star would send a special message, and if you’re wearing the ring you’d get a squeeze on your finger.” I grimace slightly, and he laughs.

“Fortunately or unfortunately,” he says, “that’s where they’ve decided that the money is – but we need to explore the boundaries of how these things can be used, because scientists and inventors can’t think of all the possibilities. For example, Thomson Reuters has been in touch to ask about using the rings to send tactile information about stock prices or currency movements.”
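To make that last idea concrete: one simple encoding would map the relative size of a price movement to a squeeze intensity between 0 and 1. This is a hypothetical sketch of such a scheme, not the actual design mentioned in the article:

```python
def squeeze_intensity(old_price: float, new_price: float,
                      full_scale: float = 0.05) -> float:
    """Map a relative price change to a squeeze intensity in [0.0, 1.0].

    A move of `full_scale` (here 5%) or more produces a maximum squeeze;
    smaller moves scale linearly. Direction (up vs down) could be encoded
    separately, e.g. as a different squeeze pattern.
    """
    change = abs(new_price - old_price) / old_price
    return min(change / full_scale, 1.0)

print(squeeze_intensity(100.0, 102.5))  # 0.5 -- a 2.5% move, half of full scale
print(squeeze_intensity(100.0, 110.0))  # 1.0 -- a 10% move, capped at maximum
```

The interesting design question such a mapping raises is exactly the one Cheok points to: there is as yet no agreed “language” of touch, so the scale factor and patterns would have to be learned by users.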

Our transition to an internet of all the senses is evidently dependent on the breadth of information that can be conveyed from one person to another as a series of zeroes and ones. “You have to find a way of, say, transmitting smell digitally, without using a sachet,” says Cheok.

“So I’m working with a French neuroscientist, Olivier Oullier, on a device which can produce an artificial sensation of smell through magnetic actuation. The olfactory bulb in our nasal cavity that’s responsible for smell can be stimulated by pulsing magnetic fields. So this is about directly exciting the brain’s neural path by bypassing the external sensor – in this case the human body.”

This immediately plunges us into what seems like incredibly futuristic territory, where brains are communicating sensory information directly with other brains across digital networks. But it’s already been demonstrated by the synthetic neurobiology group at MIT (Massachusetts Institute of Technology) that optical fibre can be connected to neurons, and Cheok is excited about where this may lead in the relatively short term. “We will have direct connection to the brain within our lifetime,” he says, “although what level that will be I’m not sure. Physical stimulation of neurons may not produce the effects that we would hope for and predict.”

Few of us can conceive of the pace with which technological power is developing. Ray Kurzweil (author, futurist, and a director of engineering at Google) predicts that by 2025 we’ll have a computer which has the processing power of the human brain, and by 2045 it’ll have the processing power of six billion brains – i.e. everyone on the planet. Cheok sees these as hugely important tipping points for society. “If you’re able to download your brain to a computer, there are major philosophical questions that we’ll have to deal with in the next 30 years, such as whether we’re human, or whether we’re computers.”

Society will also have to work out how it’s going to handle the hyper-connectivity of a multisensory internet – bearing in mind that we can already become deeply frustrated by the few kilobytes of information contained within the average overloaded email inbox. Text messages that are not replied to already provoke consternation – what about unreciprocated touches, provocative odours or unwanted tastes?

“Our brains haven’t changed to cope with infinite communication,” says Cheok. “We don’t have a mechanism for knowing when there’s too much, in the way that we do when we’ve eaten too much food. Communication is not just a desire, it’s a basic need – but we’ve gone from being hunter-gatherers in groups of 20 or 30 to being in a world of infinite data. We could literally gorge on communication and be unable to stop. We’ll have to find new norms and new mechanisms, but it’s difficult to predict what they will be.”

Marshall McLuhan, the Canadian philosopher of communication theory, famously used the term “global village” to describe the effect of connected media upon the world’s population; it has become overused, but Cheok believes that new sensory-communication channels will demonstrate how prescient that prediction was. “For most of human history, we didn’t have privacy,” he says. “Everyone knew who was doing what. And these developments will mean that we become more and more open – the end of secrecy, almost bringing us back to the way that life used to be in hunter-gatherer times. Except, of course, it’s now global. A lot more people will know.”

The implications of the work of Cheok and his contemporaries seem to sit midway between exciting and terrifying, but in the shorter term it’s about focusing on relatively mundane objectives, such as emitting multiple odours from a smartphone. “People will get used to this new mode of communication,” says Cheok, “and develop new languages. We don’t yet have a language of smell, or of touch; exactly the same pressure in terms of a touch can have a completely different response in the brain, depending on context. But combined with emotion and the subconscious, it’ll bring a heightened sense of presence. I want us to be able to eat together across the internet. I’ve no idea what that will feel like,” he adds, smiling, “but I’ve always believed that human communication goes far beyond the logical.”

Catching the whiff of success

posted in: Media, Research

A team led by City University London’s Mixed Reality Lab, together with academics from other universities, is among the finalists in the HackingBullipedia Global Challenge, which aims to discover the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

A combined team comprising academics from City University London’s Mixed Reality Lab, the University of Aix-Marseille (France) and Sogang University (South Korea) has made the final of this year’s HackingBullipedia Global Challenge, aimed at discovering the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

Led by Professor Adrian Cheok, Professor of Pervasive Computing in the School of Informatics, their competition entry is titled “Digital Olfaction and Gustation: A Novel Input and Output Method for Bullipedia”.

The team proposes novel methods of digital olfaction and gustation as input and output for internet interaction, specifically for creating and experiencing the digital representation of food, cooking and recipes on the Bullipedia. Other team members include Jordan Tewell, Olivier Oullier and Yongsoon Choi.

No stranger to digital olfaction applications in the culinary space, Professor Cheok recently gave a Digital Taste and Smell presentation to the third top chef in the world, Chef Andoni Luis Aduriz, at Mugaritz restaurant in San Sebastian, Spain.

The HackingBullipedia Global Challenge was created by the renowned, world-leading culinary expert Chef Ferran Adrià i Acosta.

The jury, comprising some of the best culinary and digital technology experts in the world, arrived at a shortlist of four teams after carefully sifting through 30 proposals from three continents, drawn from a mix of independent and university teams.

The other teams in the final are from Universitat Pompeu Fabra (Barcelona); the Technical University of Catalonia; and an independent (non-university) team from Madrid.

On the 27th of November, two representatives from each of the four finalist teams will pitch their proposal and give a demonstration to the competition’s judges, after which the winner will be decided.

Professor Cheok is very pleased that City will be in the competition final:

“I am quite delighted that we were able to make the final of this very challenging and prestigious competition. There were entries from various parts of the world covering a broad spectrum of expertise including a multidisciplinary field of scientists, chefs, designers, culinary professionals, data visualisation experts and artists. We are confident that our team has prepared an equally challenging and creative proposal which will be a game-changer in the gastronomic arena.”

[http://hackingbullipedia.org/thechallenge/overview]

The Multi-Sensory Internet Brings Smell, Taste, and Touch to the Web

posted in: Media


The Multi-Sensory Internet Brings Smell, Taste, and Touch to the Web

By Gian Volpicelli

Interview article from Motherboard:

Adrian Cheok with his taste-transmitting device. Photos by Jonathan Shkurko

Adrian Cheok, professor of pervasive computing at City University London and director of the Mixed Reality Lab at the National University of Singapore, is on a mission to transform cyberspace into a multi-sensory world. He wants to tear through the audiovisual paradigm of the internet by developing devices able to transmit smells, tastes, and tactile sensations over the web.

Lying on the desk in Cheok’s lab is one of his inventions: a device that connects to a smartphone and shoots out a given person’s scent when they send you a message or post on your Facebook wall. Then there’s a plexiglass cubic box you can stick your tongue in to taste internet-delivered flavours. Finally, a small plastic and silicone gadget with a pressure sensor and a moveable peg in the middle. It’s a long-distance-kissing machine: You make out with it, and your tongue and lip movements travel over the internet to your partner’s identical device—and vice versa.

“It’s still a prototype but we’ll be able to tweak it and make it transmit a person’s odour, and create the feeling of human body temperature coming from it,” Cheok says, grinning as he points at the twin make-out machines. Just about the only thing Cheok’s device can’t do is ooze digital saliva.
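In principle, the two make-out devices simply mirror each other’s sensor state over the network. A minimal sketch of what one exchanged message might look like, in Python – the field names and the 0–1 normalisation are assumptions for illustration, not the device’s actual protocol:

```python
import json

def encode_state(lip_pressure: float, peg_position: float) -> str:
    """Serialise one device's sensor readings (both normalised to 0..1)
    as a JSON message to be sent to the partner device."""
    if not (0.0 <= lip_pressure <= 1.0 and 0.0 <= peg_position <= 1.0):
        raise ValueError("sensor readings must be normalised to [0, 1]")
    return json.dumps({"lip_pressure": lip_pressure, "peg_position": peg_position})

def decode_state(message: str) -> tuple:
    """Recover the partner's readings, to drive the local actuators."""
    state = json.loads(message)
    return state["lip_pressure"], state["peg_position"]

# Round trip: what device A senses is what device B's actuators reproduce.
msg = encode_state(0.7, 0.3)
print(decode_state(msg))  # (0.7, 0.3)
```

In a real pairing, each device would continuously stream its encoded state to the partner (for example over a socket) while driving its own peg and pressure actuators with whatever `decode_state` returns – giving the symmetric, two-way sensation the article describes.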

I caught up with Cheok to find out more about his work toward a “multi-sensory internet.”

The make-out device, plugged into an iPhone

 

Motherboard: Can you tell us a bit more about what you’re doing here, and what this multi-sensory internet is all about?

There is a problem with the current internet technology. The problem is that, online, everything is audiovisual and behind a screen. Even when you interact with your touchscreen, you’re still touching a piece of glass. It’s like being behind a window all the time. Also, on the internet you can’t use all your senses—touch, smell and taste—like you do in the physical world.

Here we are working on new technologies that will allow people to use all their senses while communicating through the Internet. You’ve already seen the kissing machine, and the device that sends smell-messages to your smartphone. We’ve also created devices to hug people via the web: You squeeze a doll and somebody wearing a particular bodysuit feels your hug on their body.

What about tastes and smells? How complex are the scents you can convey through your devices?

We’re still at an early stage, so right now each device can just spray one simple aroma contained in a cartridge. But our long-term goal is to act directly on the brain to produce more elaborate perceptions.

What do you mean?

We want to transmit smells without using any chemicals, so what we’re going to do is use magnetic coils to stimulate the olfactory bulb [the part of the brain associated with smell]. At first, our plan was to insert them through the skull, but unfortunately the olfactory part of the brain is at the bottom, and doing deep-brain stimulation is very difficult.

And having that stuff going on in your brain is quite dangerous, I suppose. 

Not much—magnetic fields are very safe. Anyway, our present idea is to place the coils at the back of your mouth. There is a bone there called the palatine bone, which is very close to the region of your brain that makes you perceive smells and tastes. In that way we’ll be able to make you feel them just by means of magnetic actuation.

Cheok demonstrates the taste-transmitter

 

But why should we send smells and tastes to each other in the first place?

For example, somebody may want to send you a sweet or a bitter message to tell you how they’re feeling. Smell and taste are strongly linked with emotions and memories, so a certain smell can affect your mood; that’s a totally new way of communicating. Another use is commercial. We are working with the fourth best restaurant in the world, in Spain, to make a device people can use to smell the menu through their phones.

Can you do the same thing also when it comes to tactile sensations? I mean, can you put something in my brain to make me feel hugged? 

It is possible, and there are scientists in Japan who are trying to do that. But the problem with that is that, for the brain, the boundary between touch and pain is very thin. So, if you perform such stimulation you may very easily trigger pain.

It looks like you’re particularly interested in cuddling distant people. When I used to live in Rome, I once had a relationship with a girl living in Turin and it sucked because, well, you can’t make out online. Did you start your research because of a similar episode?

Well, I have always been away from my loved ones. I was born in Australia, but I moved to Japan when I was very young, and I have relatives living in Greece and Malaysia. So maybe my motivation has been my desire to feel closer to my family, rather than to a girl. But of course I know that the internet has globalized our personal networks, so more and more people have long-distance relationships. And, even if we have internet communications, the issue of physical presence is very relevant for distant lovers. That’s why we need to change the internet itself.

The scent device in action

 

So far you have worked on a long-distance-hugging device and a long-distance-kissing machine. There are also gadgets that can transmit a person’s body odour. If I connect the dots, the next step will be a device for long-distance sex.

Actually, I am currently doing some research about that. You see, the internet has produced a lot of lonely people, who only interact with each other online. Therefore, we need to create technologies that bring people physically—and sexually—together again. Then, there’s another aspect of the issue…

What’s that?

As you noticed, if you put all my devices together, what you’re going to have soon are sorts of “multi-sensory robots”. And I think that, within our lifetime, humans will be able to fall in love with robots and, yeah, even have sex with them.

It seems to me all the work you’re doing here may be very attractive for the internet pornography business.

Of course, one of the big industries that could be interested in our prototypes is the internet sex industry. And, frankly speaking, that being a way of bringing happiness, I think there’s nothing wrong with that. Sex is part of people’s lives. In addition, very often the sex industry has helped to spur technology.

But so far I haven’t been contacted by anybody from that sector. Apparently, there’s quite a big gap between people working in porn and academia.


Seminar: Multisensory Internet Communication and Virtual Love. Chaired by Sir Peter Williams CBE; speakers Adrian David Cheok and David Levy

Love and sex with robots seminar

 

Seminar details:

26 November 2013

Event time: 6:00 – 7:20pm

Drinks reception: 7:20pm – 8:00pm

Daiwa Foundation Japan House, 13/14 Cornwall Terrace, Outer Circle, London NW1 4QP

Organised by The Daiwa Anglo-Japanese Foundation


 

 

Seminar

Multisensory Internet Communication and Virtual Love

The era of the hyperconnected internet allows for new embodied interaction between humans, animals and computers, leading to new forms of social and physical expression. The technologies being developed will in the future augment or mix the real world together with the virtual world. Humans will be able to experience new types of communication environments using all of the senses, where we can see virtual objects in the real environment, virtually touch someone from a distance, and smell and taste virtual food. Our physical world will be augmented with sensors connected to the internet – in buildings and physical spaces, cars, clothes and even our bodies. During the seminar, we will discuss several research prototype systems for interactive communication, culture, and play.

This merging of computing with the physical world may lead to us developing personal feelings for computers, machines and robots, which we will discuss in the second part of the seminar, where we will invite the audience to join us in an exploration of the limits of artificial intelligence. What will it mean for society when artificial intelligence researchers succeed in creating sophisticated artificial personalities, artificial emotions and artificial consciousness? When robots are also endowed with the ability to recognize what we say and what we mean, will they be able to carry on interesting, amusing, intelligent and friendly, even loving conversations with us? How will humans react to this new breed of “person” that can say “I love you” and mean it? These are some of the questions that will touch on the possibility of love, sex and marriage with robots.

 

About the contributors

Professor Adrian David Cheok Professor Adrian David Cheok is Professor of Pervasive Computing at City University London and Founder and Director of the Mixed Reality Lab. His background is in Engineering, and he gained his PhD at the University of Adelaide in 1999. After working at the National University of Singapore and Mitsubishi Electric in Japan, he became Professor at Keio University in the Graduate School of Media Design. His research is concerned with mixed reality, human-computer interfaces, wearable computers, pervasive and ubiquitous computing. He is a recipient of many awards and prizes, including the Hitachi Fellowship, the Microsoft Research Award in Gaming and Graphics and the SIP Distinguished Fellow Award, and was designated as a Young Global Leader by the World Economic Forum in 2008. Professor Cheok often discusses his work on media outlets such as the BBC, CNN and the Discovery Channel, and also works as Editor in Chief of three academic journals, one of which is Lovotics: Academic Studies of Love and Friendship with Robots.

 

Dr David Levy Dr David Levy is President of the International Computer Games Association, and CEO of the London-based company Intelligent Toys Ltd. He graduated from the University of St. Andrews in 1967, and moved into the world of business, professional chess playing and writing. He has written more than thirty books on chess, and was awarded the International Master title by FIDE, the World Chess Federation, in 1969. In 1968, David made a bet with four Artificial Intelligence professors that he would not lose a chess match against a computer program within ten years. He won that bet. Since 1977 David has been involved in the development of many chess-playing and other programs for consumer electronic products. David’s interest in Artificial Intelligence has expanded beyond computer games into other areas of AI, including human-computer conversation, and in 1997 he led the team that won the Loebner Prize competition in New York, which he won again in 2009. His fiftieth book, Love and Sex with Robots, was published in November 2007, shortly after he was awarded a PhD by the University of Maastricht for his thesis entitled Intimate Relationships with Artificial Partners.

 

Sir Peter Williams CBE (chair)
Sir Peter Williams CBE is the Chairman of the Daiwa Anglo-Japanese Foundation, and has a PhD in Engineering from the University of Cambridge. He has previously served as Honorary Treasurer and Vice President of the Royal Society, Chairman of the National Physical Laboratory, Chancellor of the University of Leicester, Chairman and Chief Executive of Oxford Instruments plc, Deputy Chief Executive of VG Instruments Ltd., Master of St. Catherine’s College Oxford, Chairman of Trustees of the Science Museum and Chairman of the Engineering & Technology Board. He has advised Government on issues of science and education, including the ‘Williams Review’ of primary mathematics in 2008 and in 2010 was a member of an international review of the Intergovernmental Panel on Climate Change (IPCC) for the UN Secretary General. He was knighted in 1998 and is a Fellow of the Royal Society and of the Royal Academy of Engineering.

Keynote Speech at VS-Games 2013


 

Keynote speech at VS-Games 2013

September 11-13, 2013. Bournemouth University, UK.

Keynote Title: Multisensory Feeling Communication in the Hyperconnected Era

Abstract: This talk outlines new facilities that are arising in the hyperconnected internet era within human media spaces. This allows new embodied interaction between humans, species, and computation both socially and physically, with the aim of novel interactive communication and entertainment. Humans can develop new types of communication environments using all the senses, including touch, taste, and smell, which can increase support for multi-person multi-modal interaction and remote presence. In this talk, we present an alternative ubiquitous computing environment and space based on an integrated design of real and virtual worlds. We discuss some different research prototype systems for interactive communication, culture, and play.

 

http://www.vsgames2013.org/p/keynotes.html

Adrian David Cheok Keynote Speaker at International Conference On Informatics and Creative Multimedia 2013 (ICICM’13)


Keynote speech at International Conference On Informatics and Creative Multimedia 2013 (ICICM’13).
Universiti Teknologi Malaysia, Kuala Lumpur. September 3-6, 2013.

Keynote Title: Multisensory Human Communication in the Hyperconnected Era

Summary:
This talk outlines new facilities that are arising in the hyperconnected internet era within human media spaces. This allows new embodied interaction between humans, species, and computation both socially and physically, with the aim of novel interactive communication and entertainment. Humans can develop new types of communication environments using all the senses, including touch, taste, and smell, which can increase support for multi-person multi-modal interaction and remote presence. In this talk, we present an alternative ubiquitous computing environment and space based on an integrated design of real and virtual worlds. We discuss some different research prototype systems for interactive communication, culture, and play.

http://icicm.mmu.edu.my/speakers/
