Adrian David Cheok, Professor of UoL, Invited to Exhibit at Ars Electronica Festival 2017


Professor Adrian David Cheok, Chair Professor at the University of London, has been invited to exhibit at the Ars Electronica Festival 2017. His work, Kissenger, has been selected by the Ars Electronica Festival committee for a five-day showcase at one of the world’s most prestigious media arts events, to be held on 7-11 September 2017 in POSTCITY Linz, Austria.

The Ars Electronica Festival is an international festival for art, technology and society, offering a distinctive platform. Since 1979 it has provided an extraordinary meeting point, welcoming artists, scientists, engineers, researchers and developers from all over the world to Linz to confront a specific interdisciplinary theme through exhibitions, conferences, workshops and interventions.

The theme of the 2017 festival is “AI – The Other I”. The ideas circulating here are innovative, radical and eccentric in the best sense of the term; they influence our everyday lives, become integrated into our lifestyles and shape our future way of life. One part of the exhibition will be dedicated to Artificial Intimacy, a special strand presenting futuristic technical visions of intimacy between humans and machines. Questions such as “Can a human love a robot?” and “Can a robot love a human?” will provoke your thoughts while you explore some of the latest technology in this area. https://www.aec.at/ai/en/artificial-intimacy/

The five-day event is expected to welcome an audience of over 85,000. The Ars Electronica Festival is supported by a prestigious list of 382 associates, including Intel, mobility partner Daimler, Animation Festival sponsor Maxon, scientific mentor MIT Media Lab and BioAustria, who make it possible to stage a festival of huge scale and superb quality.

More information about the festival can be found here: https://www.aec.at/festival/en/

PRESS RELEASE: Electric Smell Machine for Internet & Virtual Smell


Date: August 7, 2017
Adrian David Cheok, Kasun Karunanayaka, Surina Hariri, Hanis Camelia, and Sharon Kalu Ufere
Imagineering Institute, Iskandar Puteri, Malaysia & City, University of London, UK.
Email: contact@imagineeringinstitute.org
Phone: +607 509 6568
Fax: +607 509 6713

We are excited to introduce the world’s first computer-controlled digital device developed to stimulate olfactory receptor neurons with the aim of producing smell sensations purely using electrical pulses. Using this device, we can now easily stimulate various areas of the nasal cavity with different kinds of electric pulses. During the initial user experiments, some participants experienced smell sensations including floral, fruity, chemical, and woody. In addition, we observed a difference in participants’ ability to smell odorants before and after the electrical stimulation. These results suggest that this technology could be enhanced to artificially create and modify smell sensations. By conducting more experiments with human subjects, we expect to uncover the patterns of electrical stimulation that can effectively generate, modify, and recall smell sensations. This invention can lead to digital smell for the internet and virtual reality.

Figure 1: Concept of stimulating human olfactory receptor neurons using electric pulses.

To date, almost all smell regeneration methods used in both academia and industry are based on chemicals. These methods have several limitations: they are expensive for long-term use, complex, need routine maintenance, require refilling, offer limited controllability, and distribute non-uniformly in the air. More importantly, chemical-based smells cannot be transmitted over digital networks and regenerated remotely, as we do with visual and auditory data. Therefore, discovering a method to produce smell sensations without using chemical odorants is a necessity for digitizing the sense of smell. Our concept, illustrated in Figure 1, is to electrically stimulate the olfactory receptor neurons (ORNs) and study whether this approach can produce or modify smell sensations. In a medical experiment in 1973, electrical stimulation of olfactory receptors was reported to produce smell sensations including almond, bitter almond, and vanilla [1]. However, three other similar experiments using electrical stimulation failed to reproduce any smell sensations [2, 3, 4]. A reliable method to electrically reproduce smell sensations therefore remained undiscovered.

Figure 2: The digital olfactory receptor stimulation device: it comprises a current controller circuit, an endoscope camera, a pair of silver electrodes, a microcontroller, a power supply, a low-current multimeter, and a laptop.

Our approach is different from the previous research mentioned above. Our main objective is to develop a controllable and repeatable digital technology: a device that connects to a computer and can easily be programmed and controlled. The device also needs to generate electric pulses of different frequencies, currents, pulse widths, and stimulation times. To provide more stimulation possibilities, we wanted it to be capable of stimulating diverse sites on the ventral surface of the inferior, middle, and superior nasal conchae. Figure 2 shows the computer-controlled digital device we have developed to stimulate olfactory receptors. The current output by the circuit is selected using one of the five push buttons shown in Figure 2, and the corresponding LED next to the button lights up after selection. The frequency of the stimulation pulses and the stimulation time are controlled by the microcontroller program; the stimulation frequency can be varied from 0 Hz to 33 kHz, and the pulse width is likewise set in software. A pair of silver electrodes combined with the endoscopic camera is used to stimulate the olfactory receptor neurons; during stimulation, one electrode is configured as positive and the other as ground. Figures 3 and 4 show our device being tested with human subjects.
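To make these parameters concrete, the following minimal Python sketch models how a stimulation setting (current, frequency, pulse width, stimulation time) might be validated and summarised before being sent to the microcontroller. It is illustrative only, not the device firmware: the class and field names are our own, and the only figures taken from the text are the 0 Hz to 33 kHz frequency range and the 1-5 mA push-button current levels mentioned in the user study below.

  # Illustrative sketch only (not the actual device firmware). Class and field names
  # are hypothetical; the 0 Hz-33 kHz frequency range and the 1-5 mA current levels
  # are taken from the description in the text.
  from dataclasses import dataclass

  @dataclass
  class StimulationSetting:
      current_ma: int        # one of the five push-button current levels (1-5 mA)
      frequency_hz: float    # stimulation frequency, reported range 0 Hz to 33 kHz
      pulse_width_us: float  # pulse width, set in the microcontroller program
      duration_s: float      # total stimulation time

      def validate(self) -> None:
          if self.current_ma not in (1, 2, 3, 4, 5):
              raise ValueError("current must match one of the five push-button levels")
          if not 0 <= self.frequency_hz <= 33_000:
              raise ValueError("frequency outside the reported 0 Hz to 33 kHz range")
          if self.frequency_hz > 0 and self.pulse_width_us >= 1e6 / self.frequency_hz:
              raise ValueError("pulse width must be shorter than one pulse period")

      def pulse_count(self) -> int:
          # Total number of pulses delivered over the stimulation window.
          return int(self.frequency_hz * self.duration_s)

  if __name__ == "__main__":
      setting = StimulationSetting(current_ma=1, frequency_hz=70, pulse_width_us=500, duration_s=10)
      setting.validate()
      print(f"{setting.pulse_count()} pulses at {setting.current_ma} mA over {setting.duration_s} s")

Running the example prints the 700 pulses that a ten-second stimulation at 70 Hz would deliver.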

Figure 3: The user study setup, showing the device stimulating the nasal cavity in the middle and superior concha regions.

In our first user study, we stimulated 30 subjects with currents in the 1 mA to 5 mA range at frequencies of 2 Hz, 10 Hz, 70 Hz, and 180 Hz. The stimulation parameters that gave the most prominent smell-related responses were 1 mA at 10 Hz and 1 mA at 70 Hz. Electrical stimulation at 1 mA and 70 Hz induced the highest odor perceptions: 27% of participants reported fragrant and chemical sensations, while other reported sensations included fruity (20%), sweet (20%), toasted and nutty (17%), woody (13%), and minty (10%). For 1 mA at 10 Hz, participants reported fragrant (17%), sweet (27%), chemical (10%), and woody (10%) sensations. Meanwhile, at 4 mA and 70 Hz, 82% of participants reported pain and 64% reported pressure sensations. We also probed the effect of electrical stimulation on the nose after stimulation: we asked participants to sniff known odorants again immediately after stimulation and rate their intensity. Most participants reported higher intensity after stimulation, which indicates that the electrical stimulation increased the perceived intensity of odorants in the nose.
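As a rough aid to interpreting these figures, the short illustrative snippet below (ours, not part of the study materials) converts the reported percentages for the 1 mA / 70 Hz condition into approximate headcounts out of the 30 participants.

  # Illustrative only: converts the reported 1 mA / 70 Hz response rates into
  # approximate headcounts. Percentages are copied from the text above; the
  # conversion to counts out of 30 participants is our own rough arithmetic.
  N_PARTICIPANTS = 30
  responses_1ma_70hz = {
      "fragrant/chemical": 27,
      "fruity": 20,
      "sweet": 20,
      "toasted/nutty": 17,
      "woody": 13,
      "minty": 10,
  }
  for sensation, percent in responses_1ma_70hz.items():
      approx = round(percent / 100 * N_PARTICIPANTS)
      print(f"{sensation:<18} {percent:>3}%  (~{approx} of {N_PARTICIPANTS})")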

Figure 4: A person testing the Electric Smell Interface in the lab environment.

We are planning to extend this user experiment with a larger number of participants. The effects of different electrical stimulation parameters, such as frequency, current, and stimulation period, will be studied more closely in future. By analyzing the results, we plan to identify stimulation patterns that can produce different smell sensations. If electrical stimulation of olfactory receptors can effectively produce smell sensations, it will revolutionize the field of communication. Multisensory communication is currently limited to text, audio, and video content. The digitization of the sense of touch has already been achieved experimentally at the research level and will be embedded in everyday communication in the near future. If the digitization of smell becomes possible, it will pave the way for sensing, communicating, and reproducing flavor sensations over the internet. This will create new applications in fields such as human-computer interaction, virtual reality, telepresence, and internet shopping.

References

1. Uziel, A.: Stimulation of human olfactory neuro-epithelium by long-term continuous electrical currents. Journal de Physiologie 66(4) (1973) 409-422

2. Weiss, T., Shushan, S., Ravia, A., Hahamy, A., Secundo, L., Weissbrod, A., Ben-Yakov, A., Holtzman, Y., Cohen-Atsmoni, S., Roth, Y., et al.: From nose to brain: Un-sensed electrical currents applied in the nose alter activity in deep brain structures. Cerebral Cortex (2016)

3. Straschill, M., Stahl, H., Gorkisch, K.: Effects of electrical stimulation of the human olfactory mucosa. Stereotactic and Functional Neurosurgery 46(5-6) (1984) 286-289

4. Ishimaru, T., Shimada, T., Sakumoto, M., Miwa, T., Kimura, Y., Furukawa, M.: Olfactory evoked potential produced by electrical stimulation of the human olfactory mucosa. Chemical Senses 22(1) (1997) 77-81

Invitation to Contribute to Special Issue “Love and Sex with Robots”



I recently accepted an invitation to serve as the Guest Editor for a Special Issue of the journal Multimodal Technologies and Interaction on the subject of “Love and Sex with Robots”. It is my pleasure to invite all researchers to submit an article on this topic.

The article may be either a full paper or a communication based on your own research in this area, or may be a focused review article on some aspect of the subject. MTI is an open access, peer-reviewed journal, edited by Professor Adrian David Cheok. You will not be required to pay the usual publication fee (Article Processing Charge) in the first issue of this journal.

All submissions will be subject to peer review. If you plan to submit a review article please provide me with a title and brief description at your earliest convenience, in order to avoid multiple reviews covering the same material.

For more information about the Special Issue, please see: http://www.mdpi.com/journal/mti/special_issues/robots

For information on manuscript preparation and related matters, please see the instructions for authors: http://www.mdpi.com/journal/mti/instructions

Although the deadline for submission of manuscripts to the Special Issue is 1 October 2016, I would appreciate hearing from you in the next few weeks whether you would be willing to submit a contribution.

I saw the weird and it was at the MRS 2016 National Conference



Posted by: Annie Pettit on 21 March 2016

I go to enough market research conferences to have seen pretty much every technology for running questionnaires. I’ve seen virtual reality and Google Glasses and all things cool. But the MRSlive conference in London was my very first introduction to the weird and wonderful world of virtual taste, smell, hugs, and kisses. Yes, you read that right.

I stopped by a booth managed by Emma Yann Zhang, a PhD student in the Department of Computer Science at City University London. She had some pretty awesome stuff to showcase.

Anyone who is a fan of The Big Bang Theory television show will know about the kissing machine that Raj and Howard so weirdly tried out on each other.

But this device is indeed available. Simply press your lips to the white section of the device and your lip motions will be transferred to the person on the other end. The most obvious use for this technology is, of course, as a kissing machine for long distance relationships. Kissing Gramma and Grampa good night will bring warm fuzzies to anyone but what about more commercial opportunities? Imagine being able to shop online and feel the fabric of the shirt or the smoothness of the flooring you’re thinking of buying.

Need more weird? How about a device that lets you digitally transmit smells? This device is currently available for sale on Amazon, and it lets you choose a predetermined scent from your smartphone and have that scent released from someone else’s smartphone. Chemicals are contained within the white ‘balloon’ and the cartridge has to be periodically replaced.

Aside from hilariously sending your friends every bad and gross smell you can think of, companies could test new perfumes and colognes, scents of cleaning products, scents of food and beverages, and more to determine which scents are most consumer friendly. And they could test these scents with anyone anywhere in the world without bringing them together in a central location.

Are you feeling blue? Maybe you could have a little hug sent from this hugging ring. This device is still a prototype, but it currently uses haptic technology to give your finger a little buzz any time your significant other sends one from their smartphone, similar to how your fitness device buzzes on your wrist. Right now, it’s a ring, but imagine a future where it’s a bracelet or a necklace or a belt.

And once again you can imagine all that could come from it. Perfectly, individually designed massage clothing. I am so in for that!

And lastly, but not necessarily most weirdly, is a digital tasting device. Simply clip the silver metal section to the end of your tongue and it will deliver electrical currents that replicate certain tastes. Once again, the implications are impressive. Imagine creating flavours for innumerable new foods and beverages without actually making the recipes thirty or forty times. Make one recipe of lasagna and then digitally manipulate the variables: add a little more salt, less salt, more pepper, more oregano, more basil, more celery. Try out every possible minute flavour difference until you find the one that your target group of consumers loves the most. And once again, your target group could be anyone, anywhere in the world.

This technology fascinates me. Today, it is weird and wonderful and cutting edge. It doesn’t always seem relevant to the market research industry until you take the time to brainstorm the potential applications. Ten years from now, just like we do with mobile phones, we will chuckle at how old-fashioned and clunky it is.

For now, I’ll continue to be really impressed. How cool is this stuff!

Source: http://web.peanutlabs.com/i-saw-the-weird-and-it-was-at-the-mrs-2016-national-conference-mrslive-mrx-newmr/

Adrian David Cheok Editor-in-Chief of Multimodal Technologies and Interaction Journal

Adrian David Cheok has been invited to be the Editor-in-Chief of the new journal Multimodal Technologies and Interaction (MTI). 

About MTI

Multimodal Technologies and Interaction (ISSN 2414-4088) is an international, multi/interdisciplinary, open access, peer-reviewed journal which publishes original articles, critical reviews, research notes, and short communications on multimodal technologies and interaction. MTI focuses on fundamental and applied research dealing with all kinds of technologies that can acquire and/or reproduce unimodal and multimodal digital content that supports interaction (e.g. human–computer, human–robot and animal–computer). Such technologies may produce visual, tactile, sonic, taste, smell, flavor or any other kind of content that can enrich the consumer/user experience.

Our aim is to encourage scientists to publish experimental, theoretical and computational results in as much detail as possible, so that results can be easily reproduced. There is, therefore, no restriction on the length of the papers.

Scope
  • displays/sensors: visual, tactile/haptic, sonic, taste, smell
  • multimodal interaction, interfaces, and communication
  • human–computer and human–robot relations and interaction
  • animal–computer interaction
  • human factors, cognition
  • multimodal perception
  • smart wearable technology
  • psychology and neuroscience
  • digital and sensory marketing
  • enabling, disruptive technologies
  • multimodal science, technology and interfaces
  • theoretical, social and cultural issues
  • design and evaluation
  • content creation, environments processes and methods
  • application domains

For more information or to submit your manuscript to this journal, visit http://www.mdpi.com/journal/mti.

Science Museum Exhibition – Cravings: Can your food control you?



Find out how the food you eat affects your body, brain and eating habits. See our electric taste interface exhibited in the Cravings exhibition at London’s Science Museum! Free entry.


What drives your desires for the foods you love? Is it the colour of your spoon, the food your mum ate while pregnant, the trillions of bacteria that dine with you, or the little known ‘second brain’ in your gut?

From the flavours you learned to love in the womb, to the very next bite you take, your appetite has been shaped by food. Through personal stories, fascinating objects and cutting-edge science and technology, explore how food affects your body, brain and eating habits.

Visit Cravings in our Antenna gallery to:

  • See an artificial gut whirring away.
  • Take part in a real experiment on flavour perception.
  • Touch some 3D-printed mice, sniff a scientific smell kit, and ‘chew’ some ‘bread’ in our interactive displays.
  • Play Craving Commander and express your opinion on how we can get raging cravings under control. Should we ban cake except on birthdays? Use smart refrigerators that police what we eat? You decide in this fast-paced game.
  • Discover unconventional dining utensils designed by scientists and chefs to trick our sense of taste.
  • Find out if scientists think we ‘eat with our eyes’ and if we can be ‘addicted’ to food.

 

http://www.sciencemuseum.org.uk/visitmuseum/Plan_your_visit/exhibitions/cravings.aspx

Malaysian delegation visits the City University Hangout


http://www.city.ac.uk/news/2014/feb/malaysian-delegation-visits-the-hangout

13 February 2014

Managing Director Tan Sri Dato’ Azman bin Hj Mokhtar of the Government of Malaysia’s strategic investment fund, the Khazanah Nasional Berhad, is impressed with City talent in the heart of London’s Silicon Roundabout.


A delegation from the Government of Malaysia’s strategic investment fund, Khazanah Nasional Berhad, visited the Hangout on 30 January.

Located at the epicentre of London’s Tech City, the Hangout provides a unique working environment and incubation space for City University London academics, students and start-ups to foster relationships with investors in the area in order to get their businesses off the ground.

Khazanah Nasional Berhad promotes economic growth and makes strategic investments on behalf of the Government of Malaysia and is keenly interested in partnering with technology companies and products originating at City. With an investment portfolio comprising over 50 major companies in Malaysia and abroad worth £30bn, Khazanah is involved in a broad spectrum of industries.

Led by managing director Tan Sri Dato’ Azman bin Hj Mokhtar, the Malaysian delegation listened attentively to a variety of presentations and investment opportunities. The Khazanah managing director was impressed with the “high level of creativity” being nurtured at City.

These included a presentation on taste and smell actuation via mobile phone from Professor Adrian Cheok, Professor of Pervasive Computing, and PhD students from his Mixed Reality Lab; BarPassOfficial (for payment and collection of drink orders via smartphone); Mashmachines (a new media player bringing together sound, lighting and video into a single user interface); Popcord (an innovative lightweight mobile phone charger); TechCityNews (London’s leading tech sector news and analysis resource); Modafirma (a social commerce platform allowing emerging and independent fashion designers to reach and sell directly to a global audience); and AtomicDataLabs (a software and data management company building applications for large datasets).

Also in attendance were Pro Vice Chancellor for Research & Enterprise, Professor John Fothergill; Director of Enterprise, Dr Sue O’Hare; Dean of the School of Engineering & Mathematical Sciences and the School of Informatics, Professor Roger Crouch; Professor of Dependability and Security, Professor Kevin Jones; Manager of the London City Incubator and Hangout founder, Leo Castellanos; and, Andrew Humphries, co-founder of The Bakery.


Catching the whiff of success


A team led by City University London’s Mixed Reality Lab, together with academics from other universities, is a finalist in the HackingBullipedia Global Challenge, aimed at discovering the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

A combined team comprising academics from City University London’s Mixed Reality Lab, the University of Aix-Marseille (France) and Sogang University (South Korea) has made the final of this year’s HackingBullipedia Global Challenge, aimed at discovering the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

Led by Professor Adrian Cheok, Professor of Pervasive Computing in the School of Informatics, their competition entry is titled “Digital Olfaction and Gustation: A Novel Input and Output Method for Bullipedia”.

The team proposes novel methods of digital olfaction and gustation as input and output for internet interaction, specifically for creating and experiencing the digital representation of food, cooking and recipes on the Bullipedia. Other team members include Jordan Tewell, Olivier Oullier and Yongsoon Choi.

No stranger to digital olfaction applications in the culinary space, Professor Cheok recently gave a Digital Taste and Smell presentation to Chef Andoni Luis Aduriz, ranked the third top chef in the world, at Mugaritz restaurant in San Sebastián, Spain.

The HackingBullipedia Global Challenge was created by the renowned, world-leading culinary expert Chef Ferran Adrià i Acosta.

The jury, comprising some of the best culinary and digital technology experts in the world, arrived at a shortlist of four teams after carefully sifting through 30 proposals from three continents, drawn from a mix of independent and university teams.

The other teams in the final are from Universitat Pompeu Fabra (Barcelona); the Technical University of Catalonia; and an independent (non-university) team from Madrid.

On 27 November, two representatives from each of the four finalist teams will pitch their proposal and give a demonstration to the competition’s judges, after which the winner will be decided.

Professor Cheok is very pleased that City will be in the competition final:

“I am quite delighted that we were able to make the final of this very challenging and prestigious competition. There were entries from various parts of the world covering a broad spectrum of expertise including a multidisciplinary field of scientists, chefs, designers, culinary professionals, data visualisation experts and artists. We are confident that our team has prepared an equally challenging and creative proposal which will be a game-changer in the gastronomic arena.”

[http://hackingbullipedia.org/thechallenge/overview]

Call for Papers: Symposium “Love and Sex with Robots” at AISB 50, Goldsmiths, London, 1-4 April 2014

Symposium “Love and Sex with Robots” at AISB 50
Goldsmiths, London, 1-4 April 2014

AISB is the pre-eminent society in the UK for Artificial Intelligence and Simulation of Behaviour (www.aisb.org.uk). In 2014 AISB celebrates its 50th anniversary.

The AISB 50 Annual Convention 2014 will be held at Goldsmiths, University of London, from April 1st – 4th.

This is a Call for Papers for a one day symposium, “Love and Sex with Robots”, which will take place during AISB 50. The exact date within the April 1st-4th timeframe will be announced shortly.

Symposium Overview

Within the fields of Human-Computer Interaction and Human-Robot Interaction, the past few years have witnessed a strong upsurge of interest in the more personal aspects of human relationships with artificial partners. This upsurge has been apparent not only amongst the general public, as evidenced by increased coverage in the print media, TV documentaries and feature films, but also within the academic community.

The symposium welcomes submissions on the following topics, inter alia:

Robot Emotions
Humanoid Robots
Clone Robots
Entertainment Robots
Robot Personalities
Teledildonics
Intelligent electronic sex hardware
Gender Approaches
Affective Approaches
Psychological Approaches
Sociological Approaches
Roboethics
Philosophical Approaches

Submission and Publication Details

Submissions must be extended abstracts of approximately 400-500 words, and should be sent via email to both:

Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Adrian.Cheok@city.ac.uk

Dr. David Levy, Intelligent Toys Ltd., London:
davidlevylondon@yahoo.com

For the final submission of accepted papers, text editor templates from previous conventions can be found at:

http://www.aisb.org.uk/convention/aisb08/download.html

Each extended abstract will receive at least two reviews.

We request that for those papers that are accepted on the basis of their extended abstracts, the final submitted papers be limited to 8 pages. Selected papers will be published in the general proceedings of the AISB Convention, with the proviso that at least one author attends the symposium in order to present the paper and participate in general symposium activities.

Important Dates

i. 21st January 2014 – Deadline for submission of extended abstracts.

ii. 3rd February 2014 – Notification of acceptance/rejection decisions

iii. 24th February 2014 – Final versions of accepted papers (camera ready copy)

iv. 1st – 4th April 2014 – AISB 50

Additional Information

Please note that there will be separate proceedings for each symposium, produced before the convention. Each delegate will receive a memory stick containing the proceedings of all the symposia. In previous years there have been awards for the best student paper, and limited student bursaries. These details will be circulated as and when they become available. Authors of a selection of the best papers will be invited to submit an extended version of the work to a journal special issue.

Program committee:

Joanna Bryson, University of Bath
Adrian Cheok, City University, London
David Levy, Intelligent Toys Ltd
Anton Nijholt, University of Twente
Dennis Reidsma, University of Twente
Yorick Wilks, Florida Institute for Human and Machine Cognition

Organizing committee:

Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Adrian.Cheok@city.ac.uk

Dr. David Levy, Intelligent Toys Ltd., London:
davidlevylondon@yahoo.com

Jordan Tewell, City University, London
mastaegg@gmail.com
