Adrian David Cheok is featured in the TV documentary series Tegenlicht, aired in the Netherlands.
The Future of Our Digital Senses
Adrian David Cheok is currently Professor of Pervasive Computing at City University London and the Founder and Director of the Mixed Reality Lab, Singapore. A gifted inventor, academic and speaker with an impressive research pedigree, his work ranges across wearable computers, ubiquitous computing, and pervasive and virtual realities.
For Cheok, nothing less than “the next level of the Internet” will suffice. He wants to create a sensing symbiosis between humans and machines, and between the analog and digital worlds. He is striving to form a new sensory vocabulary that redefines what we experience. If he is successful, the way we perceive our world and the way we sense our reality may be altered forever.
© 2016 Hypernetec Ltd
Recorded at the Wearable Technology Show 2015.
You wake to a loving hug from your partner a thousand miles away, provided by haptic sensors in your pyjamas. The scent of your breakfast wafts towards you from your smartphone and, before you leave for your morning appointment, you share a goodbye kiss with your absent lover, using a pressure-sensitive, bi-directional kissing device. Welcome to the weird and wonderful future of the multi-sensory internet, and the visionary, pioneering work of Professor Adrian David Cheok.
Cheok is currently Professor of Pervasive Computing at City University London and the Founder and Director of the Mixed Reality Lab, Singapore. A gifted inventor, academic and speaker with an impressive research pedigree, his work ranges across wearable computers, ubiquitous computing, and pervasive and virtual realities.
In person, he is affable and animated, throwing out ideas with a rapidity that makes you wonder how he never runs out of them. After spending a few minutes with him, it’s easy to see why: many of his ideas are a quantum step ahead of everyone else’s.
“We live in the information age and can share almost limitless data,” he says, “but it’s still very difficult to share experiences, because an experience is about all of the five senses.”
The Next Level of Internet
Cheok wants to create the next stage of the internet, a multi-sensory platform enabling entirely new types of communication. From touching at a distance to smelling and tasting in virtual environments, Cheok’s vision of the future will see us connecting and augmenting the physical and virtual in ways that will change our perceptions of both.
Developing effective interfaces to enable human sensory communication over networks is no small challenge. Cheok collaborates with researchers and engineers across the globe, working to push the envelope of the possible and to develop new tools and interfaces.
“It’s still a very big research issue,” he says. “How do we sense and how do we replicate the sense of touch, taste and smell? The fundamental difficulty is that audio-visual signals, such as light and sound are waves with different frequencies and [while] you can easily turn a frequency into a number and send it over the internet, smell and taste are molecular-based.”
Since molecules cannot be transmitted through the web, Cheok’s approach is to build devices that can create sense perceptions and send the output of the devices as messages over the internet.
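The principle can be pictured with a toy sketch (the message format and parameter names here are invented for illustration, not the lab’s actual protocol): instead of sending molecules, the sender transmits the numeric actuator settings a receiving device would use to recreate the sensation.

```python
import json

# Hypothetical sense-as-data messaging. The parameters (current, pulse
# frequency, temperature) are illustrative stand-ins for whatever a real
# taste-stimulation device would expose.
def encode_taste(current_ua, frequency_hz, temperature_c):
    """Pack actuator settings for a taste stimulus into a JSON message."""
    return json.dumps({
        "sense": "taste",
        "current_ua": current_ua,        # electrical stimulation strength
        "frequency_hz": frequency_hz,    # pulse frequency on the tongue
        "temperature_c": temperature_c,  # thermal component of the sensation
    })

def decode_taste(message):
    """Unpack a received message into settings a device could replay."""
    return json.loads(message)

# A "sour" stimulus travels over the internet as ordinary text.
msg = encode_taste(current_ua=150, frequency_hz=50, temperature_c=25)
settings = decode_taste(msg)
```

The molecules never leave the sender; only the numbers do, and the receiver’s hardware turns them back into a perception.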
Licking Digital Lollipops
Nimesha Ranasinghe, a former student of Cheok’s, recently demonstrated a ‘digital lollipop’ device which uses electrical and thermal stimulation to create artificial taste sensations. Combining temperature variance with electrical currents has (so far) yielded impressive results. Still, the complexity of the human taste response requires that it be paired with our other senses to create a full flavour response.
Cheok is involved in a product called ‘Scentee,’ a mobile messaging system that uses chemical aromas paired to a smartphone app to send smell messages over networks. The scents are released by an accessory plugged into the phone’s dock connector. While this approach has merits, Cheok acknowledges its limitations. He is currently researching the use of magnetic fields and talks of wanting to stimulate the senses directly.
“If you have a real taste, for example, a drop of lemon juice on your tongue, there is some kind of chemical ionisation. But the next level is that it causes some sort of electrical signal. What we are doing is directly stimulating that signal with the electrical current. You can use similar techniques for the touch receptors [and] you can simulate touch using electrical signals. With the olfactory or smell sensor, the principle is the same”.
Ultimately Cheok’s vision is not tied to a single device or fixed approach; it involves using cutting-edge neuroscience and engineering disciplines to push past the limits of what is currently possible. He is confident that the ever-rising curve of technological advancement will see his ideas come to fruition in the next five to ten years.
It is hard to overestimate how revolutionary a sensory internet could be. Cheok believes that initially people will attempt to reproduce what is familiar to them, but over time new kinds of creative expression will develop.
What Will You Program for Dinner?
For Cheok, the future is a place where we will program food in the same way we now program our music. Instead of hugging one person, we will embrace thousands. These will be new kinds of sensing and communication experiences that will alter the way we feel and interact with each other on a very deep experiential level.
For this particular professor, nothing less than “the next level of the Internet” will suffice. He wants to create a sensing symbiosis between humans and machines and between the analog and digital. In doing so, he is striving to form a new sensory vocabulary; one that will revolutionise the way we experience the world.
“The most important thing is to keep pushing the barrier,” he says. “Do quantum step innovation, not incremental work. So that’s what we’re aiming for.”
Tech innovators are adding a fourth dimension to gadgets and devices: the sense of smell
Smell remains the most mysterious of the human senses – scientists are still trying to explain why one scent is pleasant to some people and offensive to others, how fragrances conjure memories from years past, and how aromas influence behavior.
“The relationship between individual aromas and emotions can vary considerably from one person to another,” says Beverley Hawkins of the West Coast Institute of Aromatherapy. “There is no guarantee that two people smelling the same aroma will trigger the same memories or emotions. In fact, more often than not, they will not.”
A study published earlier this year in the Proceedings of the National Academy of Sciences (PNAS) supports Hawkins’ view. Researchers found that the genes the body uses to detect scents vary by up to 30% between any two individuals. They concluded that each person has an “olfactory fingerprint” that triggers a unique reaction to the same odor molecule.
On average, a person experiences about 10,000 scents in a day. “Accordingly, it only makes sense that some of these are more pleasing than others to your senses,” says Elizabeth Musmanno, president of the Fragrance Foundation. “And this in turn absolutely affects your mood.”
Making smell digital
Scientists have long known that the sense of smell serves as a type of bodyguard, warning people about dangers such as spoiled food or a fire. And there is a clear connection between the sense of smell and the sense of taste. Yet despite their strong impact on our bodies, those two senses are often not at the forefront of our minds as we go about our daily routines – mealtimes being the exception, of course.
“All nutrients that enter our body are monitored by the senses of taste and smell, so these senses are very important in general,” says Dr Richard Doty, director of the Smell and Taste Center at the University of Pennsylvania. “Unfortunately they are taken for granted until they become injured or otherwise disabled.”
That could change as product developers move closer toward creating digital experiences that better mimic the real world. For example, Oscar Mayer collaborated with computer scientist Adrian Cheok to design a phone attachment that releases the scent of bacon – and plays the sound of frying – at a preset time. The Wake Up and Smell the Bacon project won the Most Creative Use of Technology prize at the 2015 Shorty Awards.
Another recent invention is the oPhone, a device invented by Harvard University biomedical engineers that allows users to send “smell messages” in a way akin to texting. Also, the Japanese company Scentee has built odor cartridges that attach to a phone’s earbud jack. One intended use is to trick a user’s tastebuds into believing he’s eating, say, a delicious steak instead of a bland salad – a nice way to make dieting more enjoyable.
Musmanno notes another emerging trend: scenting environments. A store can try to create an inviting place for shopping, a hotel may want to convey the scent of luxury, or a 4D movie might use aromas to tell a story. Glade explored the connection between scent, emotion, and interactive and sensory experiences at its Museum of Feelings exhibit in New York City during the holiday season. Visitors walked through a variety of galleries that were inspired by fragrances and learned about how scent impacts emotions.
Advances in scent technology could also stretch to the workplace. Doty imagines a future in which businesses use smells to boost employee performance. “I can foresee the use of odors in public places such as lobbies of buildings to energize workers,” he says. “This has to be done carefully, however, as some people are allergic to certain odors.”
And then there’s virtual reality. For now, VR headsets are able to produce a fairly realistic replication of scenery and human interactions via two senses: sight and hearing. However, for a true real-world experience, the other senses will have to be stimulated, too. “Most likely, smells will be included in virtual reality scenarios just to enhance the experience,” says Doty.
There are challenges in turning scents digital, as they’re not nearly as adaptable to mass electronic distribution as images and sound. However, “as we continue to learn more about our sense of smell and what it can do, there will most likely be more applications in the future”, Musmanno says.
“Scent will definitely be part of the evolution of technology. The more the sense of smell is studied, the more amazing it is discovered to be.”
This content is paid for by SC Johnson
by Adrian Peter Tse on 2 Dec 2015
From a multi-sensory internet to smell coding and smart fabrics, through to applying theatrical principles to branding, the realm of the senses represents a brave new world for experiential marketers.
Last week at AdAsia in Taipei, Adrian Tse caught up with five individuals featured by TEDx Taipei, to explore the future of the five senses in experiential marketing—and beyond.
In this video you will meet:
- Adrian David Cheok, professor of pervasive computing at City University of London and director of the Imagineering Institute (be sure to watch until the end of the video)
- Stefen Chow, photographer, mountaineer and creator of the Poverty Line
- Sissel Tolaas, chemist, researcher and artist
- Jesko Von Den Steinen, principal artist and actor at The House of Dancing Water and ex-creative strategist at Sid Lee; and
- Elaine Yan Ling Ng, founder and designer director at The Fabrick Lab.
Dara O’Briain tries out our digital taste machine in the new BBC One show, Tomorrow’s Food.
Dara O’Briain reveals the awe-inspiring future of our food. To bring us the amazing innovations that will soon be on our dinner plates, he’s joined by a team of experts.
14 July 2014 By Kate Nightingale
We live in an increasingly digital world. We work, shop and play digitally most of the time, or at least a digital device is involved at some point in these activities. More and more countries around the world have internet access, whether via computer or mobile device.
Basically, most of our daily activities are facilitated, shared by or experienced with some type of digital device. The crucial word here is ‘EXPERIENCE’. We all search for meaningful, intriguing or shocking experiences every day of our lives. Whether it’s sipping café au lait in a romantic café in Paris, watching a chick flick with your girlfriends and running out of tissues, or meeting your new love for the first time. All these experiences have one thing in common: they are multi-sensory. The smell of that freshly brewed coffee, the warmth and complexity of that first taste, the view of the Eiffel Tower, the passion and musicality of the French language…
Feeling like jumping on the Eurostar for a quick Paris experience? Now imagine that you could have all that in the comfort of your home. I know, it probably won’t feel as romantic and extraordinary as the real thing, but it will certainly be possible in the not-too-distant future.
Scientists are hard at work developing technologies that will allow you to transfer smells, tastes and textures digitally, or even, at some point, create an augmented/virtual reality of a Paris café with all those sensations available to you. They are also teaching computers how to see, smell and develop nutritious, healthy tastes, with the goal of improving our lives.
One of the better-developed areas of research is sight. There are already plenty of programmes available that can, for example, read our emotions while we watch an advert, so advertising executives know whether the ad they have produced will have the desired effect. One of these programmes is FaceReader, developed by VicarVision, which has also recently been introduced for online use. Another exciting VicarVision project is ‘Empathic Products’, which uses emotion recognition to, for example, personalise digital signage and adverts in shopping centres.
How about social media analytics and consumer insight? As we share more and more visual content and less text, the need to analyse our likes and dislikes based on the photos we share has become urgent. Fortunately, companies like Curalate have developed software to help companies gain useful insight from visual content, or to send personalised offers based on the photos people share via Instagram.
But these are not the most exciting developments. Much more intriguing and perhaps slightly shocking technologies are being developed to help us touch, sniff and taste digitally.
We already have various vibrations on mobile devices to let us know when we perform certain functions. Notice the difference in vibration between pressing the keyboard and receiving a text or tweet? This is nothing! Soon we will be able to feel the textures of fabrics and other materials via ‘microscopic’ vibrations sent to our mobile devices.
Imagine shopping online for a dress and being able to feel the texture of the fabric it is made of. Or looking at an advert for a jumper at a train station and being able to touch it, and obviously buy it instantly. Or think about the possibilities for the B2B market: buyers able to check the texture and quality of a product virtually before ordering thousands of items to sell in their stores. And how about feeling the temperature or the climate via your phone? This would add a whole new dimension to booking travel and, who knows, maybe even virtual travel. Virgin Holidays opened a real-life version of such an experience, a ‘sensory holiday laboratory’ as they called it, last year in Bluewater, where you can stand on a sandy beach, smell the sea and take photographs to share on your social media. Now imagine the same experience in your living room…
The area of research working on making this possible is called HAPTICS, as in haptic (touch) perception. One of the experts in the field is Katherine Kuchenbecker, who runs the Haptics Group at the University of Pennsylvania. In this short video she explains some of the research the group is working on and introduces the term Haptography: photography with haptic qualities. How about Instagramming or tweeting a picture of a cat that you can actually stroke?! Oooh!
IBM Research is yet another institution working on developing such technology. They explain that at first it will take the form of a dictionary, with, for example, silk having a specific vibration definition that a company will be able to use to represent the fabric it has used. Eventually, however, we will be able to touch digitally in real time.
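The dictionary idea can be pictured as a simple lookup table (the pattern format below is invented for this sketch, not IBM’s actual scheme): each material name maps to a vibration definition that a receiving device could play back.

```python
# Toy haptic "dictionary": material name -> vibration pattern, where each
# pulse is (amplitude 0.0-1.0, duration in milliseconds). The values are
# made up for illustration only.
HAPTIC_DICTIONARY = {
    "silk":  [(0.10, 20), (0.05, 40)],  # light, smooth pulses
    "denim": [(0.60, 15), (0.40, 15)],  # coarser, stronger texture
    "wool":  [(0.30, 30), (0.20, 30)],  # soft, slower pulses
}

def vibration_for(material):
    """Return the vibration pattern a device would play for a material."""
    try:
        return HAPTIC_DICTIONARY[material.lower()]
    except KeyError:
        raise KeyError(f"no haptic definition for {material!r}") from None
```

A retailer’s product page would then need to ship only the word “silk”; the shopper’s device supplies the feel.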
Immersion Corporation, founded in 1993, is a pioneer in using haptics to enhance digital experiences. They are developing some really interesting technologies for mobile, gaming and even film and sport. They have, for example, created an engine that automatically translates a game’s audio into haptic feedback. They are also working on applying this to video content such as adverts, action movies and sports broadcasts. How would you like to feel like you’re on the pitch during the World Cup final?! Soon it will be possible.
It all sounds ‘haptastic’, but why would companies invest in it? Immersion Corporation actually did some research on this and found that content with haptics increased viewers’ level of arousal by 25%. From consumer psychology we know that arousal and pleasure are key motivators to purchase, so imagine the effect of haptic content on your sales figures.
They have also tested a metric commonly used in streaming video called quality of experience. Participants watched five minutes of content under one of three conditions: no haptics, haptics reflecting the subwoofer experience, and haptics adding to the storytelling. Quality of experience was 10-15% higher in the subwoofer-haptics condition and 25-30% higher in the narrative-haptics condition, compared with the no-haptics content. See more of their research here.
So soon we will be able to touch the dress before we buy it but how about buying perfume or other cosmetics online? Not to worry! Digital scent messaging is already here.
A new invention called the oPhone has just been introduced to the market. It allows you to send scent messages and even create your own scent impressions. There is also an iPhone app called oSnap, which lets you create sensory oNotes to share with your friends. However, to actually smell your creations, your friends will either have to have an oPhone or go to one of the HotSpots, currently available only in Paris and New York. One of the founders, Dr. David Edwards, says the scent vocabulary is at the moment limited to some food-related smells, but it’s only a matter of time before we will be able to watch a movie and smell the beach we see.
Another inventor in the field is Dr. Adrian Cheok, founder of the Mixed Reality Lab in Singapore and professor of pervasive computing at City University London. He and his team invented a small device called Scentee, which you can attach to your smartphone to send various smells to your friends and family. However, you need a separate cartridge for each smell, and the scent vocabulary is currently limited.
Dr. Cheok also works on digital taste: the ability to send tastes via the internet and mobile devices. He presented his work last month at an event called the Circus for the Senses, held at the Natural History Museum during Universities Week, and it certainly had a great reception. Who wouldn’t want to watch their favourite chef preparing a delicious raspberry Pavlova and be able to taste it immediately? Today you might get up right this second and run to buy or make one; by then, you will be able to press a button on your TV or mobile and it will jump out of the screen onto your table! I know, maybe slightly far-fetched, but entirely possible within, I’d guess, about ten years.
Right now we are impressed that we can download movies and music on our mobiles or purchase our groceries online. In five years we will have all this amazing gear, allowing us to sniff, taste and touch what we see on our screens.
However, people will still want an experience and social connections. This is where augmented reality, virtual shops and other venues will come into play. Brands will be able to have virtual shops that people can visit from the comfort of their homes. I’m not talking about using an avatar, but about being truly immersed in a multi-sensory virtual brand experience. You will be able to walk through the virtual shop, touch the merchandise, smell it and even try it on. Imagine the possibilities for the company to personalise this experience for each individual at the touch of a button! Oh, sorry! This will be automated with state-of-the-art software!
And how about combining technologies such as FaceReader, which can read our emotions, with programmes that react to biological signals like heart rate and level of arousal, to adjust this virtual experience? For example, a computer could see disgust or another unpleasant emotion on your face and attribute it to a smell you perceived, allowing a retailer to change that olfactory experience into a positive one instantly.
And how about online dating? We will be able to sniff pheromones, adding a completely different dimension to the idea of love at first sniff.
Do you know about Secret Cinema? These are very secretive events where you can truly experience certain movies by being inserted into a specially created set. Now imagine that you can do it from the comfort of your couch. It’s going to be like 3D with added touch, temperature, scent and taste sensations. It will make you feel like you’re part of the action and, who knows, maybe even let you insert yourself into the plot. That’s true co-creation!
Dr. Cheok certainly shares that view. As he commented for a CNN article, the ultimate goal is a multi-sensory device unifying all five senses to create an immersive virtual reality, which could be usable within five years.
Of course, before this technology becomes widely available and affordable, companies need to create immersive, co-creative, multi-sensory consumer experiences in real life. As research in consumer psychology and marketing shows, this can have incredible effects on the consumer-brand relationship and, obviously, the bottom line. Look out for our Sense Reports (coming soon) explaining some of these effects.
See more at: http://stylepsychology.co.uk/digitalmultisensoryconsumerexperience/
Purely audiovisual communication will soon be so passé. In the near future we will communicate and use social media with all five of our senses. Soon you will be able to smell and taste cooking shows, kisses will be delivered with the help of robots, and hugs transmitted via smart pyjamas.
“We are living in the information age. But we are moving from transmitting information to sharing experiences, and soon we will be able to transmit touch, taste and smell over the network as well. It will be an entirely new kind of augmented reality,” explains Adrian Cheok, professor of information technology.
Adrian Cheok dreams that we will soon be able to taste TV cooking shows, for example. A first step in that direction is an already-developed simulator that attaches to the tongue and electrically tricks the sense of taste into tasting, for example, something sour:
Adrian Cheok has also helped develop a phone accessory that already lets you send scent messages over the network, or wake up to a new morning with your favourite scent in your nostrils. How about a rose-scented birthday greeting? Or a delicious-smelling, calorie-free meal? Here is a little foretaste, or fore-smell, of the latter:
Future technologies will thus make it possible for us to cook and/or eat together even when we are far apart, because we can share our experiences – the smells and tastes we sense – over the network.
New devices are constantly being developed to shorten physical distance and ease longing. At Osaka University, researchers have created a human-shaped, huggable pillow robot; you can slip a phone inside it and imagine that you are not talking on the phone but locked in a tight hug:
The Mixed Reality Lab led by Adrian Cheok, in turn, has developed a hugging pyjama that transmits, say, a travelling parent’s hugs to a child, and the Kissenger robot, which lets you kiss over the internet:
Nor have researchers forgotten pets. Adrian Cheok has helped develop a system that lets an owner stroke their pet – even a pet rooster – over the network:
Robot marriages are coming. How can a mobile phone transmit scent and touch? Technology is invading our bodies! More on these visions in Prisma Studio on Wednesday 23 September, TV1 at 8 pm.
Probing these bold claims are futurist Elina Hiltunen, biotechnology researcher Lauri Reuter and psychologist Jukka Häkkinen. The programme is hosted by Marjo Harju.
It’s Saturday night, 2050. You switch on some music, turn down the lights and flick the switch to ON. No need for dinner or even a clean shirt because tonight, you’re romancing a robot.
That’s the scenario envisaged by David Levy, author of “Love and Sex and Robots,” who predicts it won’t be long before we’re all doing it — with machines.
“It just takes one famous person to say I had fantastic sex with a robot and you’ll have people queuing up from New York to California,” the CEO of Intelligent Toys Limited told News.com.au. “If you’ve got a robot that looks like a human, feels like a human, behaves like a human, talks like a human, why shouldn’t people find it appealing?”
This November, Levy along with Professor Adrian Cheok will chair the second international congress on “Love and Sex with Robots” in Malaysia. The event will bring together academics from around the world to discuss the legal, ethical and moral questions on everything from “teledildonics” to “humanoids”.
Levy said the subject has spawned a huge amount of interest since his 2007 book and it’s only a matter of time before the currently “crude” versions available become more sophisticated and go mainstream.
“If there was a sophisticated sex robot around now, then I would be very curious to try it,” he said.
“It can’t be long before we get to the point that there are robots looking very lifelike and with appealing designs that people find appealing to look at and then it’s a question of how long it will take before the artificial intelligence is developed to the point where they can carry on interesting and entertaining conversations?”
Whether you find it horrifying or appealing, there’s no doubt the idea has taken root in popular culture with films like “Her,” “Lars and the Real Girl” and “Ex-Machina” dedicated to the relationship between humans and machines.
IF THERE WAS A SOPHISTICATED SEX ROBOT AROUND NOW, THEN I WOULD BE VERY CURIOUS TO TRY IT.
This week the makers of Japanese robot Pepper issued a warning, saying using it for “sexual purposes” breaks the rental agreement after people hacked its software to give it “virtual breasts.”
Meanwhile, real-life technological advances like David Hanson’s humanoid robots or Hiroshi Ishiguro’s versions have been making robots look more lifelike by the year. Several robotic sex dolls already exist, including RealDoll, made by Californian company Abyss, whose owner David Mills once told Vanity Fair he loves women but “doesn’t really like to be around people.”
But along with advances in artificial intelligence, ethical debate is raging around the use of robots whether in the military, medicine or at home, with many questioning what the rapid advances are doing to our relationships with others and ourselves.
Levy is “absolutely convinced” sex with robots is a positive thing for the “millions and millions” of people around the world who don’t have satisfactory relationships. He thinks they could be the cure for everything from loneliness to pedophilia by helping to “wean” pedophiles off having sex with the children they’re attracted to.
“For whatever reason there are huge numbers of people who just don’t have a relationship with someone they can love and someone who can love them,” he said. “For people like that, I think that sex robots will be a real boon. It will get rid of a problem they’ve got, fill a big void in their lives and make them much happier.”
It’s a view that has been described as a “terrifying nightmare” by robotics ethicist Dr. Kathleen Richardson. The senior research fellow at De Montfort University recently launched a Campaign Against Sex Robots with fellow researcher Dr. Erik Billing and wants to highlight the kind of inequalities sex robots can perpetuate in real life.
“We’re not for a ban of sex robots, what we’re giving people is information about are the arguments for sex robots justified, and we’re asking them to examine their own conscience and whether they want to contribute to this development,” she told News.com.au.
“Everyone thinks because it’s a robot prostitute then real women and children in the industry won’t be harmed. But that’s not happened because if you don’t address the core idea that it’s not OK to reduce some human beings to things then all you do is add a new layer of complexity and complication and distortion to an already distorted relationship.”
While the emerging nature of the technology means long-term effects have not been documented, Dr. Richardson fears widespread use of robots for sex will destroy human capacity for empathy and entrench notions of sex and gender already prevalent in the sex industry.
“Sex can never not be relational. You need another person. If it’s not relational you’re really masturbating,” she said.
These complexities are the kind of moral, ethical and legal quandaries Professor Cheok expects to air at the conference.
The Australian-born digital expert specializes in human-computer interfaces and thinks robots will be integrated into our lives in the short term as friends, sex objects and caregivers before the relationships develop, and could even include different levels of compliance for the types of relationships people want to have.
“We really don’t know how human society will react. The worst-case scenario is that people begin to have a robot partner rather than a human partner,” he said, adding that this could happen to a “small percentage of the population” similar to the way people have died after being gripped by the reality of video games.
“There will be some people … that prefer robots over humans but I think that won’t be the majority. I think most people will prefer to have real human relationship.”
Professor Adrian David Cheok envisions a world of “mixed reality”, where computing is experienced with all five of our senses. “We are moving from the age of information towards the age of experience and the multi-sensory Internet,” he says.
Professor Cheok develops ways to incorporate touch, sight, sound, smell and taste into computing. In his world of “mixed reality”, people can give each other remote hugs and transmit smells and tastes over the Internet.
“The future Internet integrates all our five senses and leads to new forms of communication. Instead of looking behind a screen, we’ll be able to jump in to smell and taste the world,” he says.
In addition to a hugging ring and pyjama, Cheok has developed a long-distance kissing robot called Kissenger, which looks like a rabbit. The sensors in Kissenger’s lips detect the pressure of a kiss and transmit it to the other user in real time.
“Touch is a fundamental human need. Previous research has demonstrated that when infant rhesus monkeys are given the choice between an artificial-looking mother made of wire that has milk and a realistic-looking, furry mother that has no food, the infant monkeys will almost invariably prefer the realistic-looking mother.”
Cheok is also working on systems that enable people to smell and taste virtual food. 3D printers can create edible messages, and co-cooking devices allow people to cook together over a distance.
“Certain smells and tastes can subconsciously trigger a memory and affect our mood, so it’s important for computer scientists to also consider the emotional part of communications.”
Adrian David Cheok is a professor of pervasive computing at City University London and director of the Mixed Reality Lab at the National University of Singapore. Cheok’s research interests include mixed reality, human-computer interfaces, wearable computers, ubiquitous computing, fuzzy systems, embedded systems and power electronics.
Cheok gave an inspirational talk titled “Everysense Everywhere Internet Connection” at Tampere University of Technology (TUT) on Friday, 25 September. The talk was organised by TUT’s UBINET doctoral training network. He was also one of the keynote speakers at the 2015 Academic MindTrek Conference held in Tampere, Finland, on 22-24 September.
By Anna Naukkarinen
By Daniela Hernandez
First, scientists called for a ban on autonomous killing machines. Now, their sights are set on sex robots.
This week, ethicist Kathleen Richardson and roboticist Erik Billing launched the Campaign Against Sex Robots, saying sexbots would perpetuate the “immense horrors still present in the world of prostitution” which “justifies [the use of women and children] as sex objects.” The campaign is the offshoot of a paper they wrote earlier this year on the ethics of sex robots. Robotic sex dolls, they say, are a menace to society because they “reinforce power relations of inequality and violence.”
OK, let’s unpack that a little, because those are some pretty strong claims. The science on the social effects of sex robots is still embryonic. When I looked to Google Scholar for research on the societal effects of sex robots, many of the hits that came up were ethics-related papers warning how sexualized machines might change the way we view sex and relationships. I couldn’t readily find published articles that delved into how sex robots have specifically altered our sexual worldview. Even the Journal of Sexual Medicine didn’t have much to offer. But that might soon change, with sex robots becoming more and more mainstream.
Last year, David Levy and Adrian Cheok, two robo-sexperts, organized the first congress on love and sex with robots. The second one, which will include topics like intelligent electronic sex hardware, gender, and psychological and sociological approaches to sex robots, is scheduled for mid-November in Malaysia.
More life-like sex robots are in the works. Companies like Real Doll are rushing to add artificial intelligence to their offerings. That means we might one day have personalized sex machines built to fulfill our every whim and programmed to learn our likes and dislikes.
A person can say no to your weird fetishes. A machine, at least at first, won’t. So you’ll be able to teach it and tweak it to your specifications. There’s research that shows people tend to value things they build more than objects they buy off-the-shelf. And those are just inanimate things like furniture.
What happens when the thing you’re building has glimmers of intelligence and personality? Will we become addicted to these machines? Will diagnoses of agalmatophilia—a condition in which a person develops sexual desire for objects like statues, dolls or mannequins—become more common, and even acceptable?
We do already have sex with machines after all. Research shows that women turn to technology just as much as men to spice up their sex lives. A 2009 study found that nearly 53% of women used a vibrator, compared to almost 45% of men.
A sex robot’s brain—assuming it’s connected to the internet—will have access to all the sexual activities of the world’s sex robots. Even the most sexually active people won’t have as much data with which to improve their performance in bed. It will be able to please you in ways that may be impossible for a human. Robots could become a mirror of our best and darkest wishes, and in that context, maybe we’ll fall in love—or lust—with our robots faster and more deeply than with other fleshy beings. We often complain we can’t find a real date because of Tinder bots; imagine what it’d be like if we had to compete with physical robots as well.
Enter the Campaign Against Sex Robots.
“I think if a person is lonely they will draw on what is available to help them. It’s not a failing of a person if they feel lonely,” anti-sex robot campaigner Richardson told me in an email. “Women also feel lonely.”
Part of Richardson’s objection is that sex robots are likely to be designed solely to please men.
“The fact [is] you’ll be hard pressed to find any company that has put real efforts into creating a male robot for women,” Richardson said. “This is because in the ‘real world’ of human relationships there is a gender division. Men are mainly the buyers of sex; their sexual desire is validated and seen as ok, and women are seen as the suppliers of sex. Female sexual desire and sexuality is very underdeveloped.”
Put that way, it sounds less like the Campaign Against Sex Robots and more like the Campaign for Male Sex Robots.
Richardson’s critique is borne out in an Amazon search for the term “sex doll,” which yielded more than 10,000 hits. The vast majority look female.
When I searched “male sex doll,” the options dwindled to around 3,000, and the majority were actually sex dolls most likely meant for heterosexual males, rather than a male-looking sex buddy.
In a world with sexbots, Richardson thinks these biases will be magnified. Tech already has a gender problem. Our apps and devices are designed by men, usually white men, because they dominate that industry. We’ve seen how this has played out poorly for women already. Until recently, Apple’s HealthKit app didn’t offer menstruation tracking, for instance.
The sex robot industry may well reduce the female body to automated, on-demand vaginas, body parts that can be accessed any time, any way. Richardson and Billing aren’t the first to caution the world against the societal effects of sexual fembots. At the WeRobot conference in April, J.D. candidate Sinziana Gutiu, who specializes in the legal implications of human-robot interactions, presented a paper titled “Sex Robots and the Robotization of Consent.” In it, she argues:
The sex robot is an ever-consenting sexual partner and the user has full control of the robot and the sexual interaction. By circumventing any need for consent, sex robots eliminate the need for communication, mutual respect and compromise in the sexual relationship. The use of sex robots results in the dehumanization of sex and intimacy by allowing users to physically act out rape fantasies and confirm rape myths. Of greatest concern is how sex robots will affect men’s ability to identify and understand consent in sexual interactions with women. Widespread use of sex robots will promote user’s antisocial practices and impair the dignity of women.
Robots embody social stereotypes of what is attractive, which could promote or exacerbate new or already existing biases toward women, like the idea that all women are “delicate, passive, obedient, and physically attractive,” Gutiu adds. The sexbots in production today resemble young, mostly white or Asian, women.
Richardson and Billing also express concern that “the development of sex robots will further reduce human empathy that can only be developed by an experience of mutual relationship.”
“We are not against sex. Sex between consenting adults is wonderful, [adults] who can meet each other as free subjects without coercion,” Richardson told me in an email. “We are against sexual objectification of any person, be it women, or men.”
But it may be that people having sex with robots won’t think of them just as objects. We tend to see robots as our friends. We interact with them in ways that we don’t with inanimate appliances. (I named my Roomba Wall-e. My refrigerator doesn’t get the privilege of a nickname.) We do this because robots have what experts call “social valence,” meaning we ascribe to them human qualities, like personality. That makes “robots feel different. So society will treat them differently,” says University of Washington cyberlaw expert Ryan Calo.
But if people do want to sexually assault robots, should we stop them? Some researchers think robot-directed violence and aggression could be used as a proxy to understand what makes humans tick, and could possibly help sex offenders work through their emotions and psychological issues.
The question is how society will react to human-like robots being used in this way. Some companies are already thinking ahead. The manufacturer of the cute Pepper robot, for instance, spells out that sexual acts are against its ownership contract.
Without a legal framework through which to deal with robosex, it’s unclear what a call for a ban will do, other than generate headlines. That’s been one of the criticisms of the call for a ban on autonomous weapons. Without a way to enforce a call to action, it’s very likely to fail, especially when it comes to sex-related things. No one likes killer robots. Everyone likes sexy Cylons.
And if they scare you, take solace in this: “I think it will be some time before these machines will get to the point where they are a dangerous influence,” said John Sullins, a Sonoma State University philosopher who studies robotics and AI. “It would take breakthroughs in robotics and artificial intelligence that I do not see forthcoming. The robots that exist today are far less appealing than the static silicon love dolls. The human imagination is very strong and the way that a sex robot…talks and moves is just not that attractive.”
This story has been updated to include comment from John Sullins.
Published time: 24 Sep, 2015 09:43
Japan-based company SoftBank, which created Pepper the robot, requires customers to sign a document forbidding owners from using the humanoid for sexual purposes or from creating sexy apps for it.
Even after paying nearly US$2,000 for the robot, users may have to return Pepper to its makers should they get too personal with the emotional artificial being.
The clause reads that Pepper must not be used “for sexual activity and actions for the purpose of indecent acts, or acts for the purpose of meeting and dating and making acquaintance of the opposite sex.”
Incidentally, the child-sized robot has already fallen victim to a hacker prank, receiving a pair of virtual breasts on its touch screen. The female developer who reprogrammed the robot to shake its hips and moan when its “breasts” are touched called it Peppai – a mix of the brand name and the Japanese word for breasts, “oppai” – and said she had done it “for the purpose of testing sexual harassment”.
Pepper was created by Aldebaran Robotics and SoftBank Mobile, one of the largest mobile phone operators in Japan. It is already greeting and interacting with customers in stores.
Pepper is now available for use at home, though people have found that communication is really her only asset, as her domestic skills, such as cleaning or cooking, are severely lacking.
“Pepper is a social robot able to converse with you, recognize and react to your emotions, move and live autonomously,” the developer’s website states.
Experts warn that with creations that are more advanced than Pepper, humanity will enter an entirely new territory regarding ethics and legal issues.
“Soon there will be realistic humanoid robots with AI [artificial intelligence],” Professor Adrian David Cheok, who teaches robotics at City University London, told the Daily Mail.
“For example, does sex with a robot, when you are married, mean you are cheating?” he stated.
Cheok believes it is pointless to try and stop humans from falling in love with robots and claims that 60 per cent of people could in fact love an artificial being.
He told the Daily Mail that “We can fall in love with robots and we will think it is alive because we have that empathy that is often extended to non-human things like animals and even teddy bears.”
While SoftBank has banned its customers from any kind of sexual interaction with Pepper, it is believed a whole new generation of robots will be designed specifically for sexual purposes.
“Sex robots seem to be a growing focus in the robotics industry and the models that they draw on, how they will look, what roles they would play, are very disturbing indeed,” robot ethicist Dr. Kathleen Richardson told the BBC.
Currently Pepper is available for purchase for Japanese residents only and they must be older than 20.
- Pepper the robot costs around £1,300 to buy, another £250-a-month to rent
- Creators SoftBank say using it for ‘sexual purposes’ breaks this agreement
- Computer pranksters reprogrammed an iPad to give Pepper virtual breasts
- Four-foot-tall robot reads human emotions and even offers its user advice
- Throws spotlight on the growing concern over rights for robots
The creators of Pepper the ’emotional robot’ have forbidden users from using it for sexual purposes, creating ‘sexy apps’ for it or reprogramming it to stalk people.
One thousand people paid £1,300 to buy the ‘companion bot’ within one minute of it going on sale in Japan this June, plus £250 a month in rent.
Japan-based SoftBank included a clause in the ownership contract which said using the robot for ‘the purpose of sexual or indecent behavior’ breaks this agreement.
Disturbingly, computer pranksters have already reprogrammed the touchscreen hanging from its neck to give Pepper ‘virtual breasts’ which makes it shake its hips and moan when touched.
It has reignited the debate around so-called ‘sexbots’, with one roboticist telling MailOnline that machines which humans can realistically fall in love with are only ‘years away’.
The revolutionary Pepper is designed to live alongside humans. It reads emotions, gives its owners advice and makes small talk.
The super-advanced machine is so human-like that it can mimic human behaviour such as empathy, and even love.
Its creators SoftBank have urged customers ‘not to develop any sexy, obscene, or violent apps or actions for Pepper’.
The clause reads that Pepper must not be used ‘for sexual activity and actions for the purpose of indecent acts, or acts for the purpose of meeting and dating and making acquaintance of the opposite sex.’
In a further prohibitive clause Softbank simply inserts ‘no stalking’ into the contract.
Unfortunately the innocent Pepper has fallen into the hands of Japanese computer programmers who rewrote its software and created virtual breasts for the asexual robot.
The female developer who created Peppai – a play on the bot’s name and the Japanese word ‘oopai’ for breasts – said it was ‘for the purposes of testing sexual harassment’.
Some experts now say sex robots far more sensitive, attractive and ‘empathetic’ than Pepper – which humans could seemingly fall in love with – are just a few years away.
‘Soon there will be realistic humanoid robots with AI [artificial intelligence],’ said Professor Adrian David Cheok, a roboticist at City University London. ‘Some of us will fall in love and have sex with robots.’
PEPPER THE ATTENTIVE ROBOT
Within a minute of going on sale last month, the first 1,000 Pepper robots sold out in Japan.
The robot, which can read human emotions, comes with a set of comprehensive instructions and guidelines, preparing owners for life with him.
According to reviews, the four-foot-tall machine-on-wheels is charming, considerate, offers advice and will ‘prattle on and on’.
A Japanese journalist who spent half a day with the robot said that the most striking feature is the ‘absolutely ardent attention [Pepper] gives you.’
It told him he looked thin and even asked him about his day.
‘We are entering a completely new territory of ethics and legal issues. We haven’t worked out the ethics yet for robots.’
Professor Cheok suggests the time has come to decide whether robots have rights or not. He added: ‘For example – does sex with a robot, when you are married, mean you are cheating?’.
Leading robot ethicist Dr Kathleen Richardson said sex robots could ‘seriously damage’ human relationships.
‘Sex robots seem to be a growing focus in the robotics industry and the models that they draw on – how they will look, what roles they would play – are very disturbing indeed,’ she told the BBC.
She believes that they reinforce traditional stereotypes of women and the view that a relationship need be nothing more than physical.
Dr Richardson added: ‘We think that the creation of such robots will contribute to detrimental relationships between men and women, adults and children, men and men and women and women.’
Professor Cheok agrees we need to draw some boundaries when it comes to robot ethics but he believes trying to stop humans falling in love with robots is ‘pointless’.
His research claims that 60 per cent of people could love a robot. To prove this theory, he is developing a kiss simulator called ‘Kissenger’ which is being adapted for use in some of the world’s most realistic robots.
He said: ‘So we can fall in love with a robot and we will think it is alive because we have that empathy that is often extended to non-human things like animals and even teddy bears.
‘Whether it’s alive or just electronics, a robot will be “alive” for all intents and purposes because they emulate life.
‘So just as we give rights to animals I’m sure we will give rights to robots.’
Such rights could not come soon enough for robots in Japan: it was recently revealed that Japanese children bullied a robot in public, while a Japanese man is being charged after attacking a Pepper robot in a SoftBank shop.
KUALA LUMPUR: In the world envisioned by Prof Adrian David Cheok, computing wouldn’t just be something you see or hear. It would be experienced with all the senses.
Cheok, a professor of pervasive computing at the City University London, has been researching ways to incorporate touch, sight, sound, smell and taste into computing for many years.
He calls it the world of “mixed reality”, where one will be able to use special pyjamas to give each other hugs even when miles apart.
Such pyjamas may seem trivial, but they have a real-world application.
“Some autistic children can only be calmed by hugs,” Cheok said at the KL Converge! 2015 annual conference and exhibition held here from Thursday to Saturday.
The event was organised to advance the digital lifestyles of all Malaysians through converged communications. This year’s theme was Convergence and Digital Inclusion.
Cheok showed a pair of couple rings – when one user activates the ring, the other receives a “squeeze”.
Also in the works is a kissing robot that allows two users in different locations to “kiss” each other.
The robot, shaped like a rabbit, has lips with sensors to detect and measure the pressure of a kiss. When two users put their lips to the robots, they will transmit the “kiss” to each other in real time.
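The pressure-mirroring loop described above can be sketched in a few lines of code. This is purely an illustration of the idea, not the actual Kissenger software: the class and method names are invented, and a real device would sample hardware sensors and send readings over a network rather than call its peer directly.

```python
class KissDevice:
    """One end of a paired kissing device (hypothetical sketch)."""

    def __init__(self):
        self.peer = None           # the remote device this one mirrors
        self.actuator_level = 0.0  # last pressure reproduced by the lip actuators

    def pair(self, other):
        """Link two devices so that each transmits to the other."""
        self.peer = other
        other.peer = self

    def sense(self, pressure):
        """Read a local lip-pressure sample (0.0-1.0) and send it to the peer."""
        clamped = max(0.0, min(1.0, pressure))  # keep readings in sensor range
        if self.peer is not None:
            self.peer.receive(clamped)

    def receive(self, pressure):
        """Reproduce the remote kiss by driving the local actuators."""
        self.actuator_level = pressure


# Two paired devices: a kiss sensed on one is reproduced on the other.
a, b = KissDevice(), KissDevice()
a.pair(b)
a.sense(0.6)             # a light kiss on device A...
print(b.actuator_level)  # ...drives device B's actuators to 0.6
```

Because the link is symmetric, a kiss on either device is mirrored on the other, which is what makes the exchange bi-directional.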
“You’re probably wondering why we made the robot look like a cute rabbit. We originally made it look like a human head and everyone said it looked and felt creepy,” said the Australian-born Cheok.
Research in pervasive computing doesn’t stop at that – scientists are also working on making glasses that can produce scent.
This can help improve the mood between two users by releasing pleasant scents as they chat, Cheok revealed.
He showed a demo with several funny scenarios where the product could be used – for instance, a poor student could make it release the aroma of a steak dinner to augment a plate of rice.
The device has already been used for marketing purposes in some countries, he added.
To introduce the sense of taste into the world of computing, scientists are working on a device that is placed on the tongue and produces a small electric current to stimulate the taste receptors, letting the user experience different flavours.
Cheok showed a demo at KL Converge! where he placed the gadget on a user’s tongue and made it produce a lemon-like flavour.
Cheok said the product could be used by children, who could virtually cook a dish and then taste and smell it without having to use fire, which could be dangerous.
“In the future, two friends can have dinner together even if they are worlds apart,” he quipped.
All our communication over the internet today happens over two senses – audio and visual – but what if we could also smell, touch and taste through our devices? That is the future Professor of Pervasive Computing Adrian David Cheok is working towards in his research. We ask Adrian about the science behind his research as well as the social, cultural and psychological implications of having virtual experiences as good as the real ones.
Professor Adrian David Cheok will be delivering a talk titled “Everywhere, Everysense Communication” at the TEDxKL conference this Saturday, August 8th, at the Indoor Putra Stadium, Bukit Jalil.