Matt McMullen, CEO of Realbotix, to deliver keynote speech at Love and Sex with Robots conference

posted in: Media

FOR IMMEDIATE RELEASE

29 May 2018

Following its success in London, the International Congress on Love and Sex with Robots will hold its fourth conference at the University of Montana in the United States on 13 – 14 December 2018.

Matt McMullen, CEO, artist and design director of Realbotix, will be the keynote speaker at the conference. He will deliver a speech on the topic of “Artificial Intelligence, Robotics, and Intimacy: A New Alternative Form of Relationship”.

For the past 20 years, McMullen has been creating “The World’s Finest Love Dolls”, as well as undertaking multiple custom projects along the way. He started Abyss Creations, the manufacturer of RealDoll, out of his garage in 1997. McMullen’s dolls have popped up on more than 20 television shows and co-starred in 10 films. Now he focuses on integrating cutting-edge technology with silicone doll artistry, and he believes that AI-driven robots can become companions and partners to human beings, connecting with us in ways that are often overlooked when considering these technologies.

The Love and Sex with Robots congress is the only annual academic event on the topic of human-robot intimate relationships. The congress was founded and chaired by David Levy, author of the eponymous New York Times bestseller “Love and Sex with Robots”, and Adrian David Cheok, Professor of Pervasive Computing at City, University of London and Director of Imagineering Institute, Malaysia.

ACE 2018 is Now Calling for Submissions

posted in: Media

ACE 2018 is calling for submissions to its 15th conference to be held at the University of Montana, USA, on 10-12 December 2018.

We welcome submissions on Affective Computing; Augmented, Mixed and Virtual Reality; Educational and Fitness Games; Game Design; Interactive Storytelling; Mobile and Ubiquitous Entertainment; Sound and Music Design; Tangible Interfaces; and more. A full list of topics is available on our website: http://ace2018.info/topics/

Papers may range from 2 to 20 pages, in the Springer CS Proceedings format. The deadline for submission is 15 July 2018 (GMT). Detailed submission instructions can be found at http://ace2018.info/call-for-submissions/.

This year, our keynote speaker is Professor Peter Gray, a research professor of psychology at Boston College who has conducted and published research in neuroendocrinology, developmental psychology, anthropology, and education. He is the author of an internationally acclaimed introductory psychology textbook (Psychology, Worth Publishers, now in its 8th edition), which views all of psychology from an evolutionary perspective. His recent research focuses on the role of play in human evolution and on how children educate themselves, through play and exploration, when they are free to do so. He has expanded on these ideas in his book Free to Learn: Why Unleashing the Instinct to Play Will Make Our Children Happier, More Self-Reliant, and Better Students for Life (Basic Books). He also writes a regular blog called Freedom to Learn for Psychology Today magazine. He earned his undergraduate degree at Columbia College and his Ph.D. in biological sciences, many years ago, at Rockefeller University. His own current play includes kayaking, long-distance bicycling, backwoods skiing, and vegetable gardening. http://ace2018.info/keynote/

Adrian David Cheok on virtual senses for the internet

posted in: Media

30. March 2017

https://it-gipfelblog.hpi-web.de/interview_post/adrian-david-cheok-on-virtual-senses-for-the-internet/

 

Name: Adrian David Cheok
Position: Director of the Imagineering Institute Malaysia, Chair Professor of Pervasive Computing at City University London, Founder and Director of the Mixed Reality Lab Singapore
Topic: Virtual senses for the internet

 

Nowadays, we mostly perceive our digital world by viewing text and images or hearing audio. But what about the rest of our senses?

Adrian Cheok, Director of the Imagineering Institute Malaysia, founded the Mixed Reality Lab in Singapore and is working to integrate the rest of our senses, such as smell and taste, into our digital experience. “What do we need digital smells and tastes for anyway?”, one might ask.

Those senses are of great importance in influencing emotion. Every day we experience how smells can change our mood, for example when eating a delicious meal.

The possible uses of digital senses are wide-ranging: games, films, messaging and communication over social networks or the telephone, but also commercial uses such as advertising.

Cheok’s Mixed Reality Lab develops devices that stimulate our senses with electrical signals, “because we can’t send the chemicals over the internet” that are normally needed for these bodily reactions, explains Prof. Cheok. As he puts it, we already live in our own “analog virtual reality”, with our brain as the device through which we perceive the world around us. That is why he believes the step to a digital virtual reality will not be a very big one.

Might there be a danger in replacing our “real life” with digital virtual reality?
According to Cheok, the differences will become smaller, “but society will adapt”. Kissing or marrying a robot in the future may be as normal as human marriage today.

Nonetheless, the focus of his research is not in replacing our life with digital experiences, but in expanding the analog life by adding digital impressions.

 

Adrian David Cheok Keynote Speaker at IT Innovation Day 2017, Amersfoort, Netherlands – 28/09/2017

Professor Adrian David Cheok will give a keynote speech at IT Innovation Day on 28 September 2017, in Amersfoort, Netherlands.

Title: Love and Sex with Robots
Time: 14:33, 28 September 2017
Location: Prodentfabriek, Amersfoort, Netherlands

In his speech, Professor Cheok will look at the tangible aspect (touch) of technology, and the ways in which this will contribute to an overall experience for people, including sexual behaviour. Adrian Cheok will outline a more controversial view of the future, along with how tangible technology will enhance experiences at all levels of human behaviour. When is good, good enough, real enough, and how can quality be improved? Adrian will also share his latest tech inventions with the public.

https://itinnovationday.nl/spreker/6/adrian-cheok

Imagineering Institute launches Digital Food exhibition at Singapore Science Center

posted in: Media

Digital Food is an exhibition that focuses on the futuristic idea of how taste and flavour have been evolving, i.e. from natural to artificial or synthetic flavours. The exhibition also explores how the senses can be manipulated using digital technology in the future. It challenges us to think about how digital food could enhance quality of life and improve our health. This exhibition is only available for a limited period, so do catch it while it lasts! It is jointly developed by Science Centre Singapore and the Imagineering Institute.

Exhibition Dates:

20 Sep 2017 – 20 Nov 2017

Location:

Hall A, Science Centre

Typical time required:

30 min

 

Exhibition Highlights

Digital Candy Shop

The Digital Candy Shop is the main highlight of this exhibition with two interactive stations, i.e. the Digital Cream Pot and the Digital Lollipop, which allow you to “taste” food using technology.

Can you smell and taste colours?

This is part of the story of building up an artificial food experience. Visual cues are part of our perception of flavour and food. From the moment we see the food, our brains begin to build expectations using memories of previous experiences linked with the food’s colour, smell or appearance. Come have a “taste” of the smell.

 

Checking your sensitivity to smell

This exhibit challenges visitors to find out whether they have a super nose that can distinguish different smells and identify the common ones.

Exhibition Partner

http://www.science.edu.sg/exhibitions/Pages/digitalfood.aspx

Adrian Cheok Keynote Speaker at FDG 2017, Cape Cod

Professor Adrian Cheok was invited to give a keynote speech at the International Conference on Foundations of Digital Games 2017, at Cape Cod, USA.

FDG 2017 is a major international event held in-cooperation with ACM SIGAI, SIGCHI, and SIGGRAPH. It seeks to promote the exchange of information concerning the foundations of digital games, technology used to develop digital games, and the study of digital games and their design, broadly construed. The goal of the conference is the advancement of the study of digital games, including but not limited to new game technologies, critical analysis, innovative designs, theories on play, empirical studies, and data analysis.

Professor Cheok’s keynote speech will cover the trending topic of “Love and Sex with Robots”.

Time: 15 Aug 2017, 9am

Venue: The Resort and Conference Center at Hyannis, Cape Cod, MA, USA

Title: Love and Sex with Robots

Abstract: “Love and Sex with Robots” has recently become a serious academic topic within the fields of Human Machine Interaction and Human Robot Interaction. This topic has also witnessed a strong upsurge of interest amongst the general public, print media, TV documentaries and feature films. This talk covers the personal aspects of human relationships and interactions with robots and artificial partners. New technologies and research prototypes have been developed to allow more intimate interactions with robot companions such as sex robots, emotional robots, humanoid robots, and artificially intelligent systems that can simulate human emotions. Such technologies and systems also engage users with all their senses, including touch, taste and smell, creating multisensory and immersive interactive experiences. In this talk, we will conclude that humans will marry robots by 2050.

For more information on the conference, visit http://fdg2017.org/.

PRESS RELEASE: Electric Smell Machine for Internet & Virtual Smell

posted in: Research

Date: August 7, 2017
Adrian David Cheok, Kasun Karunanayaka, Surina Hariri, Hanis Camelia, and Sharon Kalu Ufere
Imagineering Institute, Iskandar Puteri, Malaysia & City, University of London, UK.
Email: contact@imagineeringinstitute.org
Phone: +607 509 6568
Fax: +607 509 6713

We are excited to introduce the world’s first computer-controlled digital device developed to stimulate olfactory receptor neurons with the aim of producing smell sensations purely through electrical pulses. Using this device, we can easily stimulate various areas of the nasal cavity with different kinds of electric pulses. During the initial user experiments, some participants experienced smell sensations including floral, fruity, chemical, and woody. In addition, we observed a difference in the ability to smell odorants before and after the electrical stimulation. These results suggest that this technology could be enhanced to artificially create and modify smell sensations. By conducting more experiments with human subjects, we expect to uncover the patterns of electrical stimulation that can effectively generate, modify, and recall smell sensations. This invention could lead to internet and virtual reality digital smell.

Figure 1: Concept of stimulating human olfactory receptor neurons using electric pulses.

To date, almost all smell regeneration methods used in both academia and industry are based on chemicals. These methods have several limitations: they are expensive for long-term use, complex, need routine maintenance, require refilling, offer limited controllability, and distribute non-uniformly in the air. More importantly, chemically generated smells cannot be transmitted over digital networks and regenerated remotely, as we do with visual and auditory data. Therefore, discovering a method to produce smell sensations without using chemical odorants is a necessity for digitizing the sense of smell. Our concept, illustrated in Figure 1, is to electrically stimulate the olfactory receptor neurons (ORN) and study whether this approach can produce or modify smell sensations. In a medical experiment in 1973, electrical stimulation of olfactory receptors was reported to produce some smell sensations, including almond, bitter almond, and vanilla [1]. However, three other similar experiments that used electrical stimulation failed to reproduce any smell sensations [2, 3, 4]. Therefore, a reliable method to electrically reproduce smell sensations remained undiscovered.

Figure 2: The digital olfactory receptor stimulation device: It has a current controller circuit, endoscope camera, a pair of silver electrodes, a microcontroller, a power supply, a low current multimeter, and a laptop.

Our approach is different from the previous research mentioned above. Our main objective is to develop a controllable and repeatable digital technology: a device that connects to a computer and can easily be programmed and controlled. The device also needs to generate electric pulses of different frequencies, currents, pulse widths, and stimulation times. To provide more stimulation possibilities, we wanted this device to be capable of stimulating diverse sites on the ventral surface of the inferior, middle, and superior nasal conchae. Figure 2 shows the computer-controlled digital device we have developed to stimulate olfactory receptors. The amount of current output by the circuit is selected using one of the five push buttons shown in Figure 2, and the respective LED near the push button lights up after the selection. The frequency of the stimulation pulses and the stimulation time are controlled by the microcontroller program; the stimulation frequency can be varied from 0 Hz to 33 kHz, and the pulse width is likewise set in the program. A pair of silver electrodes combined with the endoscopic camera is used to stimulate the olfactory receptor neurons; during stimulation, one electrode is configured as the positive and the other as the ground. Figures 3 and 4 show our device being tested with human subjects.
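As a rough illustration of how such a stimulation controller might be parameterized, here is a minimal sketch. It is not the actual firmware: apart from the five selectable current levels and the 0 Hz to 33 kHz frequency range quoted above, all names and values are assumptions made for illustration.

```python
# Illustrative sketch only: computes the pulse timing a microcontroller
# program might use for one stimulation setting. The five current levels,
# the 0 Hz - 33 kHz frequency range, and pulse-width control come from the
# text; the concrete function names and defaults here are hypothetical.

CURRENT_LEVELS_MA = [1, 2, 3, 4, 5]  # selected via the five push buttons

def pulse_schedule(current_ma, frequency_hz, pulse_width_us, duration_s):
    """Return basic timing figures for a square-pulse stimulation train."""
    if current_ma not in CURRENT_LEVELS_MA:
        raise ValueError("current must be one of the five selectable levels")
    if not 0 < frequency_hz <= 33_000:
        raise ValueError("frequency must lie within 0-33 kHz")
    period_us = 1_000_000 / frequency_hz          # one pulse period in microseconds
    if pulse_width_us >= period_us:
        raise ValueError("pulse width must be shorter than the pulse period")
    return {
        "current_mA": current_ma,
        "period_us": period_us,
        "on_us": pulse_width_us,
        "off_us": period_us - pulse_width_us,
        "pulses": int(duration_s * frequency_hz),  # pulses delivered in the session
    }

# Example: 1 mA at 70 Hz (one of the settings reported below),
# with a hypothetical 1 ms pulse width for 10 seconds.
print(pulse_schedule(1, 70, 1000, 10))
```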

Figure 3: This image shows the user study setup, with the device stimulating the nasal cavity while targeting the middle and superior concha regions.

During our first user study, we stimulated 30 subjects using currents in the 1 mA to 5 mA range with frequencies of 2 Hz, 10 Hz, 70 Hz, and 180 Hz. The stimulation parameters 1 mA at 10 Hz and 1 mA at 70 Hz gave the most prominent smell-related responses, with 1 mA at 70 Hz inducing the highest odour perceptions: 27% of the participants reported fragrant and chemical sensations, while other reported smell sensations included 20% fruity, 20% sweet, 17% toasted and nutty, 13% woody, and 10% minty. For 1 mA at 10 Hz, participants reported 17% fragrant, 27% sweet, 10% chemical, and 10% woody. Meanwhile, at 4 mA and 70 Hz, 82% of participants reported pain and 64% reported pressure sensations. We also probed the effect of electrical stimulation on the nose after stimulation, asking participants to sniff known odorants again immediately after stimulation and rate the intensity. Most participants reported a higher intensity after stimulation, which indicates that the electrical stimulation increased the perceived intensity of the odorants in the nose.
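For quick reference, the perception rates quoted above can be gathered in one place. The short sketch below simply restates the figures reported in the text; nothing is recomputed or added.

```python
# Re-tabulation of the perception rates quoted in the press release above.
# The percentages are exactly those reported in the text.
reported = {
    "1 mA / 70 Hz": {"fragrant & chemical": 27, "fruity": 20, "sweet": 20,
                     "toasted & nutty": 17, "woody": 13, "minty": 10},
    "1 mA / 10 Hz": {"fragrant": 17, "sweet": 27, "chemical": 10, "woody": 10},
    "4 mA / 70 Hz": {"pain": 82, "pressure": 64},
}

for setting, sensations in reported.items():
    print(setting)
    for name, pct in sorted(sensations.items(), key=lambda kv: -kv[1]):
        print(f"  {name:<22}{pct:>3}% of participants")
```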

Figure 4: This image shows a person testing the Electric Smell Interface in the lab environment.

We are planning to extend this user experiment with a larger number of participants. The effects of the different electrical stimulation parameters, such as frequency, current, and stimulation period, will be studied more closely in future work. By analyzing the results, we plan to identify stimulation patterns that can produce different smell sensations. If electrical stimulation of olfactory receptors can effectively produce smell sensations, it will revolutionize the field of communication. Multisensory communication is currently limited to text, audio and video content. Digitizing the sense of touch has already been achieved experimentally at the research level and will be embedded in daily communication in the near future. If the digitization of smell becomes possible, it will pave the way for sensing, communicating and reproducing flavour sensations over the internet. This will create more applications in fields such as human-computer interaction, virtual reality, telepresence, and internet shopping.
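To make the idea of sending smell over the internet slightly more concrete, here is a purely speculative sketch of how a set of stimulation parameters could be serialized on one device and replayed on another. The message format, field names and transport are assumptions for illustration only and are not part of the system described above.

```python
# Speculative sketch: encoding stimulation parameters as a small JSON message
# so that they could travel over a network and be replayed by a remote device.
# All field names are hypothetical.
import json

def encode_smell_message(current_ma, frequency_hz, pulse_width_us, duration_s):
    """Serialize one stimulation setting as a JSON payload."""
    return json.dumps({
        "type": "olfactory_stimulation",
        "current_mA": current_ma,
        "frequency_Hz": frequency_hz,
        "pulse_width_us": pulse_width_us,
        "duration_s": duration_s,
    })

def decode_smell_message(payload):
    """Parse the payload; a real receiver would pass the values to its stimulator."""
    msg = json.loads(payload)
    assert msg["type"] == "olfactory_stimulation"
    return msg

payload = encode_smell_message(1, 70, 1000, 10)
print(payload)
print(decode_smell_message(payload))
```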

References

1. Uziel, A.: Stimulation of human olfactory neuro-epithelium by long-term continuous electrical currents. Journal de Physiologie 66(4) (1973) 409–422

2. Weiss, T., Shushan, S., Ravia, A., Hahamy, A., Secundo, L., Weissbrod, A., Ben-Yakov, A., Holtzman, Y., Cohen-Atsmoni, S., Roth, Y., et al.: From nose to brain: Un-sensed electrical currents applied in the nose alter activity in deep brain structures. Cerebral Cortex (2016)

3. Straschill, M., Stahl, H., Gorkisch, K.: Effects of electrical stimulation of the human olfactory mucosa. Stereotactic and Functional Neurosurgery 46(5-6) (1984) 286–289

4. Ishimaru, T., Shimada, T., Sakumoto, M., Miwa, T., Kimura, Y., Furukawa, M.: Olfactory evoked potential produced by electrical stimulation of the human olfactory mucosa. Chemical Senses 22(1) (1997) 77–81

Adrian Cheok Keynote Speaker at Visual SG 2017


Professor Adrian David Cheok will give a keynote speech at Visual SG in Singapore Science Centre on 28 July 2017.

Topic: Everysense Everywhere Human Communication

Time: 11:10am, 28 July 2017

Location: Singapore Science Centre

Visual SG is South East Asia’s signature Visualisation Festival. The Festival celebrates beauty through its bold emphasis on the visual aesthetics, insights and narratives that reside in data and scientific visualisation. Envisaged as both a serious study and playful showcase, VisualSG presents a full-on visual spectacle of data through the lens of artistic and creative expression. Through its line-up of interactive displays, forums and workshops, Visual SG not only raises awareness of the burgeoning field of big data, it also aims to provoke conversations on the significant role of data analytics in today’s business and societal context.

The theme for Visual SG this year is “Make Visual!”. This theme takes us back on a journey to discover our intrinsic roots; that of the intrepid explorer, creator and inventor. It encourages all of us to take that first step to discover the magic that is all around us through unbridled curiosity.

Visual SG’s 2017 line up is an eclectic collection of artists and scientists who are all pushing the boundaries to tell their stories of science through visually stunning and engaging media.

https://www.visualsg.com/

Seks met robots, omdat robots ook gevoelens hebben (Sex with robots, because robots also have feelings)

posted in: Media

13 July 2017, by René Schoemaker

http://cio.nl/algemeen/99879-seks-met-robots–omdat-robots-ook-gevoelens-hebben

How feeling, touch and smell can be digitized.

Sex with robots is not far off, says Adrian Cheok of the Imagineering Institute. But first we will get to know the robot teacher and the robot doctor. And that is already going to happen this year.

 

Can you tell us more about your work at the Imagineering Institute?

The Imagineering Institute is a place where we do multidisciplinary research. Our research team consists of experts with different backgrounds who work together on research related to multisensory communication, HCI, AI and robotics. The work in the lab is called ‘Imagineering’, that is, the imaginative application of the engineering sciences. Imagineering involves three main strands. First, imaginative visioning: the projections and viewpoints of artists and designers. Second, future visioning: extrapolation of recent and current technological developments, creating imaginary but realistic (feasible) scenarios and simulations of the future. Third, creative engineering: new product design, prototyping, and demonstration work by engineers, computer scientists and designers. The lab conducts research in the fields of Mixed Reality, Internet Digital Media, Pervasive Computing, Wearable Technology and Multisensory Communication.

 

At IT Innovation Day you will be talking about ‘tangible’ technology. Many people find it hard to imagine an internet that can transmit taste, touch and smell. How does that work?

We want to digitize touch, taste and smell. We have developed proof-of-concept prototypes and are improving them. Once that technology is ready, it will be possible to digitize, communicate and reproduce touch, taste and smell, just as we already do with images and sound. Huggy Pajama, Poultry Internet, RingU and Kissenger are examples of technologies we have developed for touch communication between people and between humans and animals. These technologies are able to sense touch, transmit it and reproduce it. For taste and smell we mainly use electrical or thermal energy to stimulate the taste and smell receptors. With that stimulation we can activate the receptor cells, and they generate the same sensations that chemical taste or smell stimulation produces. We have scientifically demonstrated that this is possible for taste, and we are now running a series of experiments for smell. If that also succeeds, then within the next ten years we will see people communicating with digital taste and smell via mobile devices.

 

In what way would that enrich the way people interact with each other and with devices?

We are taking big steps towards a hyperconnected world in which all the machines, systems and processes around us are digitized and connected to each other. That enables interactions from human to human, human to machine and machine to machine. Digital interfaces for touch, taste and smell can be integrated directly and used in those scenarios. We believe this will enrich traditional text-, audio- and video-based communication into true multisensory communication. As a result, many applications will change, such as internet shopping, messaging, video calling, e-mail, VR and gaming.

 

What are truly practical applications in which touch and taste could become important?

In my view, mainly in communication such as one-to-one messaging, video conferencing and websites. Those are the applications we use every day in our interaction with others. We think these technologies will bring a revolutionary change to communication.

 

For many people, (the thought of) interacting with robots is somewhat frightening. Could the use of touch help with this?

Yes. I think new technologies such as touch, taste and smell, and AI, can help reduce the gap between humans and robots. We will implement these technologies in robots, and that will make communication between humans and robots more enjoyable. For example, if your robot friend comes across a tasty snack while it is out and about, it can share the taste. Through technologies like Kissenger we are able to share touch sensations with each other. Using AI we can also make robots friendlier, more sensitive and more emotional. That is why we are now investigating whether we can use robots as teachers and doctors. We have also started a new field of research called Love and Sex with Robots, in which we investigate whether people can enter into intimate relationships with robots.

 

Can you tell us about some concrete inventions that could actually be brought to market?

The first prototypes of a robot doctor and a robot teacher will probably be introduced this year. We also plan to publish a series of research papers on the topic of ‘love and sex with robots’. In addition, we intend to make Kissenger commercially available this year.

 

What are the benefits for industry and business?

IoT and smartphone technology have fundamentally changed the way industry and business look at R&D. In the past, companies like Kodak could successfully develop their own top products within their own labs because the competition was limited (it was hard to foresee that they would be blown away by digital technology). Today, R&D labs have to compete with millions of tech-savvy young people working from their parents’ basements with the intention of overturning the status quo. The advantage of a lab like the Imagineering Institute is that we help companies understand the latest trends so they can close their technological gap and compete on a level playing field with the disrupters.

 

Do you have anything to add that we have not yet covered?

One of the unique features of the Imagineering Institute is that it has a business incubator (The Hangout Malaysia) inside the research lab, which excels through the symbiosis between the founders of the participating startups and the researchers who focus on new technologies and ‘future casting’. The startups go through a rigorous training programme to make sure that people want their products and are willing to pay for them. We also emphasize the scalability of the company for local, regional and worldwide growth, and its positioning towards investors.

 

Adrian Cheok, a renowned scientist, speaker and researcher, focuses on the tactile side of the internet. How do you convey touch over the internet? Adrian navigates the edge of what is possible in robotics and touch.

He will discuss technology in relation to touch and the way in which it will contribute to an overall experience for people, including sexuality. When is good, good enough, real enough, and how can the quality be improved technically? A must-see!
