“MIHAJLO PUPIN” TECHNICAL FACULTY IN ZRENJANIN. http://www.tfzr.uns.ac.rs/en/
Professor Adrian David Cheok, Chair Professor at City, University of London, has been invited to exhibit at the Ars Electronica Festival 2017. His work, Kissenger, has been selected by the Ars Electronica Festival committee for a five-day showcase at one of the most prestigious media arts events, to be held on 7-11 September 2017 in POSTCITY Linz, Austria.
The Ars Electronica Festival is an international festival for Art, Technology & Society, and a distinct platform in its field. Since 1979 it has provided an extraordinary meeting point, welcoming artists, scientists, engineers, researchers and developers from all over the world to Linz to confront a specific, interdisciplinary theme through exhibitions, conferences, workshops and interventions.
The theme of the 2017 Festival is AI – The Other I. The ideas circulating here are innovative, radical, and eccentric in the best sense of that term; they influence our everyday lives, become integrated into our lifestyles, and point to our future way of life. One part of the exhibition will be dedicated to Artificial Intimacy, a special branch presenting futuristic technical visions of intimacy between humans and machines. Questions such as “Can a human love a robot?” and “Can a robot love a human?” will provoke your thoughts while you explore some of the latest technology in this area. https://www.aec.at/ai/en/artificial-intimacy/
The 5-day event is expected to welcome audiences of over 85,000. Ars Electronica Festival is supported by a prestigious list of 382 associates, including Intel, mobility partner Daimler, Animation Festival sponsor Maxon, scientific mentor MIT Media Lab and BioAustria. They make it possible for Ars Electronica to stage a festival characterized by huge dimensions and superb quality.
More information about the festival can be found here: https://www.aec.at/festival/
Date: August 7, 2017
Adrian David Cheok, Kasun Karunanayaka, Surina Hariri, Hanis Camelia, and Sharon Kalu Ufere, Imagineering Institute, Iskandar Puteri, Malaysia & City, University of London, UK.
Phone: +607 509 6568
Fax: +607 509 6713
Here we are excited to introduce the world’s first computer-controlled digital device developed to stimulate olfactory receptor neurons, with the aim of producing smell sensations purely through electrical pulses. Using this device, we can now easily stimulate various areas of the nasal cavity with different kinds of electric pulses. During the initial user experiments, some participants experienced smell sensations including floral, fruity, chemical, and woody. In addition, we observed a difference in participants’ ability to smell odorants before and after the electrical stimulation. These results suggest that this technology could be enhanced to artificially create and modify smell sensations. By conducting more experiments with human subjects, we expect to uncover the patterns of electrical stimulation that can effectively generate, modify, and recall smell sensations. This invention could lead to digital smell over the internet and in virtual reality.
To date, almost all smell-regeneration methods used in both academia and industry are based on chemicals. These methods have several limitations: they are expensive for long-term use, complex, need routine maintenance, require refilling, offer limited controllability, and distribute non-uniformly in the air. More importantly, chemical-based smells cannot be transmitted over digital networks and regenerated remotely, as we do for visual and auditory data. Therefore, discovering a method to produce smell sensations without using chemical odorants is a necessity for digitizing the sense of smell. Our concept, illustrated in Figure 1, is to electrically stimulate the olfactory receptor neurons (ORNs) and study whether this approach can produce or modify smell sensations. During a medical experiment in 1973, electrical stimulation of olfactory receptors reportedly produced some smell sensations, including almond, bitter almond, and vanilla [1]. However, three other similar experiments that used electrical stimulation failed to reproduce any smell sensations [2, 3, 4]. Therefore, a proper method to electrically reproduce smell sensations remained undiscovered.
Our approach differs from the previous research mentioned above. Our main objective is to develop a controllable and repeatable digital technology: a device that connects to computers and can be easily programmed and controlled. This device also needs to generate electric pulses of different frequencies, currents, pulse widths and stimulation times. To provide more stimulation possibilities, we wanted the device to be capable of stimulating diverse sites on the ventral surface of the inferior, middle, and superior nasal conchae. Figure 2 shows the computer-controlled digital device we have developed to stimulate olfactory receptors. The amount of current output by the circuit is selected using one of the five push buttons shown in Figure 2, and the respective LED near each push button lights up after the selection. The frequency of the stimulation pulses and the stimulation time are controlled by the microcontroller program, which can vary the stimulation frequency from 0 Hz to 33 kHz as well as the pulse width. A pair of silver electrodes combined with an endoscopic camera was used to stimulate the olfactory receptor neurons; during stimulation, one electrode is configured as the positive terminal and the other as the ground. Figures 3 and 4 show our device being tested with human subjects.
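As a rough illustration of the control scheme described above, the stimulation parameters might be modeled as follows. This is a minimal sketch, not the device’s actual firmware: only the stated facts (five push-button current levels of 1-5 mA, a 0 Hz-33 kHz frequency range, and a programmable pulse width and stimulation time) come from the text; all names and the validation logic are assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical model of the stimulation parameters described in the text.
CURRENT_LEVELS_MA = [1, 2, 3, 4, 5]   # one level per push button
MAX_FREQUENCY_HZ = 33_000             # stated upper bound of the device

@dataclass
class StimulationSetting:
    current_ma: int       # selected via a push button
    frequency_hz: float   # set in the microcontroller program
    pulse_width_us: float # programmable pulse width
    duration_s: float     # stimulation time

    def validate(self) -> None:
        """Reject parameter combinations outside the device's stated ranges."""
        if self.current_ma not in CURRENT_LEVELS_MA:
            raise ValueError("current must be one of the five button levels")
        if not (0 < self.frequency_hz <= MAX_FREQUENCY_HZ):
            raise ValueError("frequency outside the 0-33 kHz range")
        period_us = 1e6 / self.frequency_hz
        if self.pulse_width_us >= period_us:
            raise ValueError("pulse width must fit within one period")

    def pulse_count(self) -> int:
        """Number of pulses delivered over the stimulation time."""
        return int(self.duration_s * self.frequency_hz)

# Example: 1 mA at 70 Hz, one of the settings reported in the first user
# study as giving prominent smell-related responses (pulse width and
# duration here are illustrative values, not from the study).
setting = StimulationSetting(current_ma=1, frequency_hz=70,
                             pulse_width_us=100, duration_s=10)
setting.validate()
print(setting.pulse_count())  # 700 pulses over a 10 s stimulation
```

A firmware implementation would realize `pulse_count` as a timer-driven output loop, but the parameter bookkeeping would look much the same.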
During our first user study, we stimulated 30 subjects using currents in the 1 mA to 5 mA range at frequencies of 2 Hz, 10 Hz, 70 Hz, and 180 Hz. The parameters 1 mA at 10 Hz and 1 mA at 70 Hz gave the most prominent smell-related responses. Electrical stimulation at 1 mA and 70 Hz induced the highest odor perceptions: 27% of the participants reported fragrant and chemical sensations, and other reported sensations included fruity (20%), sweet (20%), toasted and nutty (17%), minty (10%), and woody (13%). For the 1 mA/10 Hz parameters, participants reported fragrant (17%), sweet (27%), chemical (10%), and woody (10%) sensations. Meanwhile, at 4 mA/70 Hz, 82% of participants reported pain and 64% reported pressure sensations. We also probed the after-effect of electrical stimulation on the nose: we asked participants to sniff known odorants again immediately after stimulation and rate their intensity. Most participants reported higher intensity after stimulation, which suggests that the electrical stimulation increased the perceived intensity of the odorants in the nose.
We are planning to extend this user experiment with more participants. The effects of the different electrical stimulation parameters, such as frequency, current, and stimulation period, will be studied more closely in future work. By analyzing the results, we plan to identify stimulation patterns that can produce different smell sensations. If the electrical stimulation of olfactory receptors effectively produces smell sensations, it will revolutionize the field of communication. Multisensory communication is currently limited to text, audio and video content. Digitization of the touch sense has already been achieved experimentally at the research level and will be embedded in daily communication in the near future. If the digitization of smell becomes possible, it will pave the way for sensing, communicating and reproducing flavor sensations over the internet. This will create more applications in fields such as human-computer interaction, virtual reality, telepresence, and internet shopping.
1. Uziel, A.: Stimulation of human olfactory neuro-epithelium by long-term continuous electrical currents. Journal de Physiologie 66(4) (1973) 409-422
2. Weiss, T., Shushan, S., Ravia, A., Hahamy, A., Secundo, L., Weissbrod, A., Ben-Yakov, A., Holtzman, Y., Cohen-Atsmoni, S., Roth, Y., et al.: From nose to brain: Un-sensed electrical currents applied in the nose alter activity in deep brain structures. Cerebral Cortex (2016)
3. Straschill, M., Stahl, H., Gorkisch, K.: Effects of electrical stimulation of the human olfactory mucosa. Stereotactic and Functional Neurosurgery 46(5-6) (1984) 286-289
4. Ishimaru, T., Shimada, T., Sakumoto, M., Miwa, T., Kimura, Y., Furukawa, M.: Olfactory evoked potential produced by electrical stimulation of the human olfactory mucosa. Chemical Senses 22(1) (1997) 77-81
At its 245th Meeting on 24 May 2017, the Universiti Sains Malaysia (USM) Senate appointed Professor Adrian David Cheok as a committee member of the Board of Studies for the Cognitive Neuroscience Postgraduate Degree Programme at the USM School of Medical Sciences.
I recently accepted an invitation to serve as the Guest Editor for a Special Issue of the journal Multimodal Technologies and Interaction on the subject of “Love and Sex with Robots”. It is my pleasure to invite all researchers to submit an article on this topic.
The article may be either a full paper or a communication based on your own research in this area, or may be a focused review article on some aspect of the subject. MTI is an open access, peer-reviewed journal, edited by Professor Adrian David Cheok. You will not be required to pay the usual publication fee (Article Processing Charge) in the first issue of this journal.
All submissions will be subject to peer review. If you plan to submit a review article please provide me with a title and brief description at your earliest convenience, in order to avoid multiple reviews covering the same material.
For more information about the Special Issue, please see: http://www.mdpi.com/journal/mti/special_issues/robots
For information on manuscript preparation and related matters, please see the instructions for authors: http://www.mdpi.com/journal/mti/instructions
Although the deadline for submission of manuscripts to the Special Issue is 1 October 2016, I would appreciate hearing from you in the next few weeks whether you would be willing to submit a contribution.
Posted by: Annie Pettit on 21 March 2016
I go to enough market research conferences to have seen pretty much every technology for running questionnaires. I’ve seen virtual reality and Google Glass and all things cool. But the MRSlive conference in London was my very first introduction to the weird and wonderful world of virtual taste, smell, hugs, and kisses. Yes, you read that right.
I stopped by a booth managed by Emma Yann Zhang, a PhD student at the Department of Computer Science at the City University London. She had some pretty awesome stuff to showcase.
Anyone who is a fan of The Big Bang Theory television show will know about the kissing machine that Raj and Howard so weirdly tried out on each other.
But this device is indeed available. Simply press your lips to the white section of the device and your lip motions will be transferred to the person on the other end. The most obvious use for this technology is, of course, as a kissing machine for long distance relationships. Kissing Gramma and Grampa good night will bring warm fuzzies to anyone but what about more commercial opportunities? Imagine being able to shop online and feel the fabric of the shirt or the smoothness of the flooring you’re thinking of buying.
Need more weird? How about a device that lets you digitally transmit smells? This device is currently available for sale on Amazon, and it lets you choose a predetermined scent from your smartphone and have that scent be released from someone else’s smartphone. Chemicals are contained within the white ‘balloon’ and the cartridge would have to be periodically replaced.
Aside from hilariously sending your friends every bad and gross smell you can think of, companies could test new perfumes and colognes, scents of cleaning products, scents of food and beverages, and more to determine which scents are most consumer friendly. And they could test these scents with anyone anywhere in the world without bringing them together in a central location.
Are you feeling blue? Maybe you could have a little hug sent from this hugging ring. The device is still a prototype, but it uses haptic technology to give your finger a little buzz anytime your significant other sends one from their smartphone, similar to how your fitness device buzzes on your wrist. Right now it’s a ring, but imagine a future where it’s a bracelet or a necklace or a belt.
And once again you can imagine all that could come from it. Perfectly, individually designed massage clothing. I am so in for that!
And lastly, but not necessarily least weirdly, is a digital tasting device. Simply clip the silver metal section to the end of your tongue and it will deliver electrical currents that replicate certain tastes. Once again, the implications are impressive. Imagine creating flavours for innumerable new foods and beverages without actually making the recipes thirty or forty times. Make one recipe of lasagna and then digitally manipulate the variables: add a little more salt, less salt, more pepper, more oregano, more basil, more celery. Try out every possible minute flavour difference until you find the one that your target group of consumers loves the most. And once again, your target group could be anyone, anywhere in the world.
This technology fascinates me. Today, it is weird and wonderful and cutting edge. It doesn’t always seem relevant to the market research industry until you take the time to brainstorm the potential applications. Ten years from now, just like we do with mobile phones, we will chuckle at how old-fashioned and clunky it is.
For now, I’ll continue to be really impressed. How cool is this stuff!
Adrian David Cheok has been invited to be the Editor-in-Chief of the new journal Multimodal Technologies and Interaction (MTI).
Multimodal Technologies and Interaction (ISSN 2414-4088) is an international, multi/interdisciplinary, open access, peer-reviewed journal which publishes original articles, critical reviews, research notes, and short communications on this subject. MTI focuses on fundamental and applied research dealing with all kinds of technologies that can acquire and/or reproduce unimodal and multimodal digital content that supports interaction (e.g. human–computer, human–robot and animal–computer). Such technologies may produce visual, tactile, sonic, taste, smell, flavor or any other kind of content that can enrich consumer/user experience.
Our aim is to encourage scientists to publish experimental, theoretical and computational results in as much detail as possible, so that results can be easily reproduced. There is, therefore, no restriction on the length of the papers.
For more information or to submit your manuscript to this journal, visit this link http://www.mdpi.com/journal/mti.
Check out PCBWeb – a very useful and free CAD application for designing and manufacturing electronics hardware.
Find out how the food you eat affects your body, brain and eating-habits. See our electric taste interface exhibited in the Cravings exhibition at London’s Science Museum! Free Entry.
What drives your desires for the foods you love? Is it the colour of your spoon, the food your mum ate while pregnant, the trillions of bacteria that dine with you, or the little known ‘second brain’ in your gut?
From the flavours you learned to love in the womb, to the very next bite you take, your appetite has been shaped by food. Through personal stories, fascinating objects and cutting-edge science and technology, explore how food affects your body, brain and eating habits.
Visit Cravings in our Antenna gallery to:
13 February 2014
A delegation from the Government of Malaysia’s strategic investment fund, Khazanah Nasional Berhad, visited the Hangout on January 30th.
Located at the epicentre of London’s Tech City, the Hangout provides a unique working environment and incubation space for City University London academics, students and start-ups to foster relationships with investors in the area in order to get their businesses off the ground.
Khazanah Nasional Berhad promotes economic growth and makes strategic investments on behalf of the Government of Malaysia and is keenly interested in partnering with technology companies and products originating at City. With an investment portfolio comprising over 50 major companies in Malaysia and abroad worth £30bn, Khazanah is involved in a broad spectrum of industries.
Led by managing director Tan Sri Dato’ Azman bin Hj Mokhtar, the Malaysian delegation listened attentively to a variety of presentations and investment opportunities. The Khazanah managing director was impressed with the “high level of creativity” being nurtured at City.
These included a presentation on taste and smell actuation via mobile phone from Professor of Pervasive Computing, Professor Adrian Cheok and PhD students from his Mixed Reality Lab; BarPassOfficial (for payment and collection of drink orders via smartphone); Mashmachines (a new media player bringing together sound, lighting, and video into a single user interface); Popcord (an innovative lightweight mobile phone charger); TechCityNews (London’s leading tech sector news and analysis resource); Modafirma (a social commerce platform allowing emerging and independent fashion designers to reach and sell directly to a global audience); and AtomicDataLabs (a software and data management company building applications in large datasets).
Also in attendance were Pro Vice Chancellor for Research & Enterprise, Professor John Fothergill; Director of Enterprise, Dr Sue O’Hare; Dean of the School of Engineering & Mathematical Sciences and the School of Informatics, Professor Roger Crouch; Professor of Dependability and Security, Professor Kevin Jones; Manager of the London City Incubator and Hangout founder, Leo Castellanos; and, Andrew Humphries, co-founder of The Bakery.
A team led by City University London’s Mixed Reality Lab, together with other university academics, is a finalist in the HackingBullipedia Global Challenge, aimed at discovering the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.
A combined team comprising academics from City University London’s Mixed Reality Lab, University of Aix-Marseille (France) and Sogang University (South Korea) has made the final of this year’s HackingBullipedia Global Challenge aimed at discovering the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.
Led by Professor Adrian Cheok, Professor of Pervasive Computing in the School of Informatics, their competition entry is titled “Digital Olfaction and Gustation: A Novel Input and Output Method for Bullipedia”.
The team proposes novel methods of digital olfaction and gustation as input and output for internet interaction, specifically for creating and experiencing the digital representation of food, cooking and recipes on the Bullipedia. Other team members include Jordan Tewell, Olivier Oullier and Yongsoon Choi.
No stranger to digital olfaction applications in the culinary space, Professor Cheok recently gave a Digital Taste and Smell presentation to the third-ranked chef in the world, Chef Andoni Luis Aduriz, at Mugaritz restaurant in San Sebastián, Spain.
The HackingBullipedia Global Challenge was created by the renowned, world-leading culinary expert Chef Ferran Adrià i Acosta.
The jury, comprising some of the best culinary and digital technology experts in the world, arrived at a shortlist of four teams after carefully sifting through 30 proposals from three continents, drawn from a mix of independent and university teams.
The other teams in the final are from Universitat Pompeu Fabra (Barcelona); the Technical University of Catalonia; and an independent (non university) team from Madrid.
On the 27th of November, two representatives from each of the four finalist teams will pitch their proposal and give a demonstration to the competition’s judges after which the winner will be decided.
Professor Cheok is very pleased that City will be in the final of the competition:
“I am quite delighted that we were able to make the final of this very challenging and prestigious competition. There were entries from various parts of the world covering a broad spectrum of expertise including a multidisciplinary field of scientists, chefs, designers, culinary professionals, data visualisation experts and artists. We are confident that our team has prepared an equally challenging and creative proposal which will be a game-changer in the gastronomic arena.”
AISB is the pre-eminent society in the UK for Artificial Intelligence and Simulation of Behaviour (www.aisb.org.uk). In 2014 AISB celebrates its 50th anniversary.
The AISB 50 Annual Convention 2014 will be held at Goldsmiths, University of London, from April 1st – 4th.
This is a Call for Papers for a one day symposium, “Love and Sex with Robots”, which will take place during AISB 50. The exact date within the April 1st-4th timeframe will be announced shortly.
Within the fields of Human-Computer Interaction and Human-Robot Interaction, the past few years have witnessed a strong upsurge of interest in the more personal aspects of human relationships with these artificial partners. This upsurge has not only been apparent amongst the general public, as evidenced by an increase in coverage in the print media, TV documentaries and feature films, but also within the academic community.
The symposium welcomes submissions on the following topics, inter alia:
Intelligent electronic sex hardware
Submission and Publication Details
Submissions must be extended abstracts of approximately 400-500 words, and should be sent via email to both:
Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Dr. David Levy, Intelligent Toys Ltd., London:
For the final submission of accepted papers text editor templates from previous conventions can be found at:
Each extended abstract will receive at least two reviews.
We request that for those papers that are accepted on the basis of their extended abstracts, the final submitted papers be limited to 8 pages. Selected papers will be published in the general proceedings of the AISB Convention, with the proviso that at least one author attends the symposium in order to present the paper and participate in general symposium activities.
i. 21st January 2014 – Deadline for submission of extended abstracts.
ii. 3rd February 2014 – Notification of acceptance/rejection decisions
iii. 24th February 2014 – Final versions of accepted papers (camera ready copy)
iv. 1st – 4th April 2014 – AISB 50
Please note that there will be separate proceedings for each symposium, produced before the convention. Each delegate will receive a memory stick containing the proceedings of all the symposia. In previous years there have been awards for the best student paper, and limited student bursaries. These details will be circulated as and when they become available. Authors of a selection of the best papers will be invited to submit an extended version of the work to a journal special issue.
Joanna Bryson, University of Bath
Adrian Cheok, City University, London
David Levy, Intelligent Toys Ltd
Anton Nijholt, University of Twente
Dennis Reidsma, University of Twente
Yorick Wilks, Florida Institute for Human and Machine Cognition
Professor Adrian Cheok, Professor of Pervasive Computing, City University, London:
Dr. David Levy, Intelligent Toys Ltd., London:
Jordan Tewell, City University, London
Adrian David Cheok was interviewed for an article in Sydney Morning Herald, one of Australia’s leading newspapers. Read article on http://www.smh.com.au/technology/sci-tech/robots-starting-to-feel-the-love-20130918-2tzey.html#ixzz2gUbt0Itz and an extract is given below.
Researchers believe we will become emotionally attached to robots, even falling in love with them. People already love inanimate objects like cars and smartphones. Is it too far a step to think they will fall deeper for something that interacts back?
“Fantastic!” says Adrian Cheok, of Japan’s Keio University’s mixed reality lab, when told of the Paro study. Professor Cheok, from Adelaide, is at the forefront of the emerging academic field of Lovotics, or love and robotics.
Cheok believes the increasing complexity of robots means they will have to understand emotion. With social robots that may be with you 24 hours a day, he says it is “very natural” people will want to feel affection for the machine. A care-giver robot will need to understand emotion to do its job, and he says it would be a simple step for the robot to express emotion. “Within a matter of years we’re going to have robots which will effectively be able to detect emotion and display it, and also learn from their environment,” he says.
The rather spooky breakthrough came when artificial intelligence researchers realised they did not need to create artificial life. All they needed to do was mimic life, which makes mirror neurons – the basis of empathy – fire in the brain. “If you have a robot cat or robot human and it looks happy or sad, mirror neurons will be triggered at the subconscious level, and at that level we don’t know if the object is alive or not, we can still feel empathy,” Cheok says. “We can’t really tell the difference if the robot is really feeling the emotion or not and ultimately it doesn’t matter. Even for humans we don’t know whether a person’s happy or sad.” He argues if a robot emulates life, for all intents and purposes it is alive.
Psychologist Amanda Gordon, an adjunct associate professor at the University of Canberra, is sceptical. “It’s not emotional, it’s evoking the emotion in the receiver,” she says. “That seal isn’t feeling anything. It’s not happy or sad or pleased to see you.”
She says the risk is that people fall for computer programs instead of a real relationship. “Then you’re limiting yourself. You’re not really interacting with another. Real-life relationships are growth-ful, you develop in response to them. They challenge you to do things differently.”
Cheok’s research shows 60 per cent of people could love a robot. “I think people fundamentally have a desire, a need to be loved, or at least cared for,” he says. “I think it’s so strong that we can probably suspend belief to have a loving relationship with a robot.”
Probably the most advanced android in the world is the Geminoid robot clone of its creator Hiroshi Ishiguro, director of the Intelligent Robotics lab at Osaka University. Professor Ishiguro says our bodies are always moving, so he programmed that realistic motion into his creation along with natural facial expressions.
The one thing it does not do is age, which means 49-year-old Ishiguro is constantly confronted with his 41-year-old face. “I’m getting old and the android doesn’t,” he says. “People are always watching the android and that means the android has my identity.” So he has had plastic surgery – at $10,000, he says it is cheaper than $30,000 to build a new head.
Robots can help kids with autism who do not relate to humans. Ishiguro is working with the Danish government to see how his Telenoid robots can aid the elderly.
Moyle says she has had inquiries from throughout Australia about Paro. A New Zealand study showed dementia victims interacted with a Paro more than a living dog.
“There are a lot of possible negative things [that artificial intelligence and robots could lead to],” Cheok says, “and we should be wary as we move along. We have to make sure we try to adjust. But in general I think the virtual love for the characters in your phone or screen or soon robots is ultimately increasing human happiness, and that’s a good thing for humanity.”
Read more: http://www.smh.com.au/technology/sci-tech/robots-starting-to-feel-the-love-20130918-2tzey.html#ixzz2gUbt0Itz