Seminar Multisensory Internet Communication and Virtual Love Chaired by Sir Peter Williams CBE, Speakers Adrian David Cheok and David Levy

Seminar details:
26 November 2013
Event time: 6:00 – 7:20pm
Drinks reception: 7:20pm – 8:00pm
Daiwa Foundation Japan House, 13/14 Cornwall Terrace, Outer Circle, London NW1 4QP
Organised by The Daiwa Anglo-Japanese Foundation
Seminar
Multisensory Internet Communication and Virtual Love
The era of the hyperconnected internet allows for new embodied interaction between humans, animals and computers, leading to new forms of social and physical expression. The technologies now being developed will augment or mix the real world with the virtual world. Humans will be able to experience new types of communication environments using all of the senses: we will see virtual objects in the real environment, virtually touch someone from a distance, and smell and taste virtual food. Our physical world will be augmented with internet-connected sensors embedded in buildings and physical spaces, cars, clothes and even our bodies. During the seminar, we will discuss several research prototype systems for interactive communication, culture and play. This merging of computing with the physical world may lead us to develop personal feelings for computers, machines and robots, which we will discuss in the second part of the seminar.

In the second part, we will invite the audience to join us in an exploration of the limits of artificial intelligence. What will it mean for society when artificial intelligence researchers succeed in creating sophisticated artificial personalities, artificial emotions and artificial consciousness? When robots are also endowed with the ability to recognize what we say and what we mean, will they be able to carry on interesting, amusing, intelligent, friendly and even loving conversations with us? How will humans react to this new breed of “person” that can say “I love you” and mean it? These questions touch on the possibility of love, sex and marriage with robots.
About the contributors
Keynote Speech at Immersive Media Experiences 2013: “Multisensory Mixed Reality with Smell and Taste”, Adrian David Cheok. http://immersiveme2013.di.fc.ul.pt/keynote.html
Call for Papers: Symposium “Love and Sex with Robots” at AISB 50, Goldsmiths, London, 1-4 April 2014
AISB is the pre-eminent society in the UK for Artificial Intelligence and Simulation of Behaviour (www.aisb.org.uk). In 2014 AISB celebrates its 50th anniversary.
The AISB 50 Annual Convention 2014 will be held at Goldsmiths, University of London, from April 1st – 4th.
This is a Call for Papers for a one day symposium, “Love and Sex with Robots”, which will take place during AISB 50. The exact date within the April 1st-4th timeframe will be announced shortly.
Symposium Overview
Within the fields of Human-Computer Interaction and Human-Robot Interaction, the past few years have witnessed a strong upsurge of interest in the more personal aspects of human relationships with artificial partners. This upsurge has been apparent not only amongst the general public, as evidenced by increased coverage in the print media, TV documentaries and feature films, but also within the academic community.
The symposium welcomes submissions on the following topics, inter alia:
Robot Emotions
Humanoid Robots
Clone Robots
Entertainment Robots
Robot Personalities
Teledildonics
Intelligent electronic sex hardware
Gender Approaches
Affective Approaches
Psychological Approaches
Sociological Approaches
Roboethics
Philosophical Approaches
Submission and Publication Details
Submissions must be extended abstracts of approximately 400-500 words, and should be sent via email to both:
Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Adrian.Cheok@city.ac.uk
Dr. David Levy, Intelligent Toys Ltd., London:
davidlevylondon@yahoo.com
For the final submission of accepted papers, text editor templates from previous conventions can be found at:
http://www.aisb.org.uk/convention/aisb08/download.html
Each extended abstract will receive at least two reviews.
For papers accepted on the basis of their extended abstracts, we request that the final submitted papers be limited to 8 pages. Selected papers will be published in the general proceedings of the AISB Convention, with the proviso that at least one author attends the symposium to present the paper and participate in general symposium activities.
Important Dates
i. 21st January 2014 – Deadline for submission of extended abstracts
ii. 3rd February 2014 – Notification of acceptance/rejection decisions
iii. 24th February 2014 – Final versions of accepted papers (camera-ready copy)
iv. 1st – 4th April 2014 – AISB 50
Additional Information
Please note that there will be separate proceedings for each symposium, produced before the convention. Each delegate will receive a memory stick containing the proceedings of all the symposia. In previous years there have been awards for the best student paper, and limited student bursaries. These details will be circulated as and when they become available. Authors of a selection of the best papers will be invited to submit an extended version of the work to a journal special issue.
Program committee:
Joanna Bryson, University of Bath
Adrian Cheok, City University, London
David Levy, Intelligent Toys Ltd
Anton Nijholt, University of Twente
Dennis Reidsma, University of Twente
Yorick Wilks, Florida Institute for Human and Machine Cognition
Organizing committee:
Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Adrian.Cheok@city.ac.uk
Dr. David Levy, Intelligent Toys Ltd., London:
davidlevylondon@yahoo.com
Jordan Tewell, City University, London
mastaegg@gmail.com
Chinwag Psych: PsychUp at City University London, The Hangout. Speech on multisensory communication: Adrian David Cheok
Live Interview on BBC World Service, Click program, Adrian David Cheok, “Smelling your phone”
You can touch the screen of your PC or mobile phone and interact with that inanimate object, but can you smell it? And if you can smell it, how about tasting it? It may sound fanciful, but Professor Adrian Cheok believes it is near and achievable. He has been working on a device that allows users to smell the person they are talking to on the phone. He joins Click to demonstrate ChatPerf and the ability to smell and taste our technology.
Live Broadcast:
Tue 8 Oct 2013
18:32 GMT
Repeat Broadcasts:
Wed 9 Oct 2013
01:32 GMT
Wed 9 Oct 2013
08:32 GMT
Listen Online: http://www.bbc.co.uk/programmes/p01jf1ht

Radio Interview #TechTalkfest – The technology of taste and touch with Pervasive Computing with @AdrianCheok – @z1radio

http://www.zoneoneradio.com/2013/09/techtalkfest-technology-of-taste-and.html
ZoneOneRadio
Tech Talkfest is your weekly podcast about the UK technology scene.
This week we talk to Professor and inventor Adrian Cheok about bringing technology from behind the screen and into our senses. He also gives us an insight into why Japan wants another Apple rather than another Sony…
www.twitter.com/TechTalkfest and www.twitter.com/z1radio
www.ZoneOneRadio.com
www.facebook.com/ZoneOneRadio
Interview with Adrian David Cheok in the Sydney Morning Herald: “Robots starting to feel the love”
Adrian David Cheok was interviewed for an article in the Sydney Morning Herald, one of Australia’s leading newspapers. Read the article at http://www.smh.com.au/technology/sci-tech/robots-starting-to-feel-the-love-20130918-2tzey.html#ixzz2gUbt0Itz; an extract is given below.
Researchers believe we will become emotionally attached to robots, even falling in love with them. People already love inanimate objects like cars and smartphones. Is it too far a step to think they will fall deeper for something that interacts back?
“Fantastic!” says Adrian Cheok, of Japan’s Keio University mixed reality lab, when told of the Paro study. Professor Cheok, from Adelaide, is at the forefront of the emerging academic field of Lovotics, or love and robotics.
Cheok believes the increasing complexity of robots means they will have to understand emotion. With social robots that may be with you 24 hours a day, he says it is “very natural” people will want to feel affection for the machine. A care-giver robot will need to understand emotion to do its job, and he says it would be a simple step for the robot to express emotion. “Within a matter of years we’re going to have robots which will effectively be able to detect emotion and display it, and also learn from their environment,” he says.
The rather spooky breakthrough came when artificial intelligence researchers realised they did not need to create artificial life. All they needed to do was mimic life, which makes mirror neurons – the basis of empathy – fire in the brain. “If you have a robot cat or robot human and it looks happy or sad, mirror neurons will be triggered at the subconscious level, and at that level we don’t know if the object is alive or not, we can still feel empathy,” Cheok says. “We can’t really tell the difference if the robot is really feeling the emotion or not and ultimately it doesn’t matter. Even for humans we don’t know whether a person’s happy or sad.” He argues if a robot emulates life, for all intents and purposes it is alive.
Psychologist Amanda Gordon, an adjunct associate professor at the University of Canberra, is sceptical. “It’s not emotional, it’s evoking the emotion in the receiver,” she says. “That seal isn’t feeling anything. It’s not happy or sad or pleased to see you.”
She says the risk is that people fall for computer programs instead of a real relationship. “Then you’re limiting yourself. You’re not really interacting with another. Real-life relationships are growth-ful, you develop in response to them. They challenge you to do things differently.”
Cheok’s research shows 60 per cent of people could love a robot. “I think people fundamentally have a desire, a need to be loved, or at least cared for,” he says. “I think it’s so strong that we can probably suspend belief to have a loving relationship with a robot.”
Probably the most advanced android in the world is the Geminoid robot clone of its creator Hiroshi Ishiguro, director of the Intelligent Robotics lab at Osaka University. Professor Ishiguro says our bodies are always moving, so he programmed that realistic motion into his creation along with natural facial expressions.
The one thing it does not do is age, which means 49-year-old Ishiguro is constantly confronted with his 41-year-old face. “I’m getting old and the android doesn’t,” he says. “People are always watching the android and that means the android has my identity.” So he has had plastic surgery – at $10,000, he says, it is cheaper than the $30,000 it would cost to build a new head.
Robots can help kids with autism who do not relate to humans. Ishiguro is working with the Danish government to see how his Telenoid robots can aid the elderly.
Moyle says she has had inquiries from throughout Australia about Paro. A New Zealand study showed dementia victims interacted with a Paro more than a living dog.
“There are a lot of possible negative things [that artificial intelligence and robots could lead to],” Cheok says, “and we should be wary as we move along. We have to make sure we try to adjust. But in general I think the virtual love for the characters in your phone or screen or soon robots is ultimately increasing human happiness, and that’s a good thing for humanity.”
Read more: http://www.smh.com.au/technology/sci-tech/robots-starting-to-feel-the-love-20130918-2tzey.html#ixzz2gUbt0Itz