Opening Speech and Launch of Digital Food Exhibition at Science Centre Singapore – 20/09/2017

posted in: Media

Digital Food is an exhibition that focuses on the futuristic idea of how taste and flavour have been evolving, from natural to artificial or synthetic flavours. The exhibition also explores how the senses can be manipulated using digital technology in the future. It challenges us to think about how digital food could enhance our quality of life and improve our health. The exhibition is only available for a limited period, so do catch it while it lasts! It is jointly developed by Science Centre Singapore and the Imagineering Institute.

Exhibition Dates:

20 Sep 2017 – 20 Nov 2017

Location:

Hall A, Science Centre

Typical time required:

30 min

 

Imagineering Institute launches Digital Food exhibition at Science Centre Singapore

posted in: Media


Exhibition Highlights

Digital Candy Shop

The Digital Candy Shop is the main highlight of this exhibition, with two interactive stations, the Digital Cream Pot and the Digital Lollipop, which allow you to “taste” food using technology.

Can you smell and taste colours?

This is part of the story of building up an artificial food experience. Visual cues are part of our perception of flavour and food. From the moment we see the food, our brains begin to build expectations using memories of previous experiences linked with the food’s colour, smell or appearance. Come have a “taste” of the smell.

 

Checking your sensitivity to smell

This exhibit challenges visitors to find out whether they have a super nose that can distinguish different smells and identify the common ones.

Exhibition Partner

http://www.science.edu.sg/exhibitions/Pages/digitalfood.aspx

Sex with robots, because robots have feelings too

posted in: Media

13 July 2017, by René Schoemaker

http://cio.nl/algemeen/99879-seks-met-robots–omdat-robots-ook-gevoelens-hebben

How feeling, touch and smell can be digitised.

Sex with robots is not far off, says Adrian Cheok of the Imagineering Institute. But first we will get to know the robot teacher and the robot doctor. And that is already going to happen this year.

 

Can you tell us more about your work at the Imagineering Institute?

The Imagineering Institute is a place where we do multidisciplinary research. Our research team consists of experts from different backgrounds who work together on research related to multisensory communication, HCI, AI and robotics. The work in the lab is called ‘Imagineering’, that is, the imaginative application of the engineering sciences. Imagineering involves three main strands. First, imaginative vision: the projections and viewpoints of artists and designers. Second, future vision: extrapolating recent and current technological developments, and creating imaginary but realistic (feasible) scenarios and simulations of the future. Third, creative engineering: new product design, prototyping and demonstration work by engineers, computer scientists and designers. The lab carries out research in the fields of Mixed Reality, Internet Digital Media, Pervasive Computing, Wearable Technology and Multisensory Communication.

 

You will be speaking at IT Innovation Day about ‘tangible’ technology. Many people find it hard to imagine an internet that can transmit taste, touch and smell. How does that work?

We want to digitise touch, taste and smell. We have developed proof-of-concept prototypes and are improving them. Once that technology is ready, it will be possible to digitise, communicate and replay touch, taste and smell, just as we already do with images and sound. Huggy Pajama, Poultry Internet, RingU and Kissenger are examples of technology we have developed for touch communication between people, and between people and animals. These technologies are able to sense, transmit and reproduce touch. For taste and smell we mainly use electrical or thermal energy to stimulate the smell and taste receptors. With that stimulation we can activate the receptor cells, and they generate the same sensations that chemical taste or smell stimulation produces. We have scientifically proven that this is possible for taste, and we are now running a series of experiments for smell. If that also succeeds, then within the next ten years we will see people communicating with digital taste and smell via mobile devices.
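The answer above describes encoding a taste as electrical or thermal stimulation parameters that can be sent over a network and replayed on a remote device, much like audio or video. Purely as an illustration, here is a minimal sketch of what such an encoding and transmission step might look like; the parameter names, value ranges and the UDP transport are assumptions for this example, not the institute’s actual protocol:

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class TasteStimulus:
    """Hypothetical parameters for an electrically/thermally evoked taste."""
    current_ma: float      # electrode current in milliamperes (illustrative)
    frequency_hz: float    # stimulation frequency
    temp_celsius: float    # thermal component applied to the tongue
    duration_ms: int       # how long to stimulate

def send_stimulus(stim: TasteStimulus, host: str, port: int = 9000) -> None:
    """Serialise the stimulus and send it to the receiving device over UDP."""
    payload = json.dumps(asdict(stim)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

# Example: share a "sour" sensation with a remote device (all values made up).
send_stimulus(
    TasteStimulus(current_ma=0.18, frequency_hz=50.0,
                  temp_celsius=35.0, duration_ms=800),
    host="192.0.2.10",
)
```

The receiving device would decode the same structure and drive its tongue interface accordingly; the point of the sketch is only that, once a sensation is reduced to a small set of numbers, it can travel over the Internet like any other message.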

 

In what way would that enrich the way people interact with each other and with devices?

We are taking big steps towards a hyperconnected world in which all the machines, systems and processes around us are digitised and connected to each other. That enables human-to-human, human-to-machine and machine-to-machine interaction. Digital interfaces for touch, taste and smell can be integrated directly and used in those scenarios. We believe this will enrich traditional text-, audio- and video-based communication into true multisensory communication. As a result, many applications will change, such as online shopping, messaging, video calling, email, VR and gaming.

 

What are truly practical applications in which touch and taste could become important?

In my view, mainly in communication such as one-to-one messaging, video conferencing and websites. Those are the applications we use daily in our interactions with others. We think these technologies will bring a revolutionary change to communication.

 

For many people, (the thought of) interacting with robots is somewhat scary. Could the use of touch help with that?

Yes. I think new technologies such as touch, taste and smell, and AI, can help reduce the gap between humans and robots. We will implement these technologies in robots, and that will make communication between humans and robots more enjoyable. If your robot friend comes across a tasty snack while it is out of the house, for example, it can share the taste with you. Through technologies such as Kissenger we are able to share touch sensations with each other. Using AI we can also make robots friendlier, more sensitive and more emotional. That is why we are now investigating whether we can use robots as teachers and doctors. We have also started a new field of research called Love and Sex with Robots, in which we investigate whether people can form intimate relationships with robots.

 

Can you tell us about some concrete inventions that can actually be brought to market?

The first prototypes of a robot doctor and a robot teacher will probably be introduced this year. In addition, we plan to publish a series of research papers on the topic of ‘love and sex with robots’. We also plan to make Kissenger commercially available this year.

 

What are the benefits for industry and business?

IoT and smartphone technology have fundamentally changed the way industry and business look at R&D. In the past, companies like Kodak could successfully develop their own flagship products in their own labs because competition was limited (it was hard to foresee that they would be blown away by digital technology). Nowadays, R&D labs have to compete with millions of tech-savvy youngsters working from their parents’ basements with the intention of overturning the status quo. The advantage of a lab such as the Imagineering Institute is that we help companies understand the latest trends so they can close their technological gap and compete on a level playing field with the disrupters.

 

Is there anything you would like to add that we have not yet covered?

One of the unique features of the Imagineering Institute is that it has a business incubator (The Hangout Malaysia) inside the research lab, which excels through the symbiosis between the founders of the participating startups and the researchers focused on new technologies and ‘future casting’. The startups go through a rigorous training programme to make sure that people want their products and are willing to pay for them. We also emphasise the scalability of the business for local, regional and worldwide growth, and its positioning towards investors.

 

Adrian Cheok, renowned scientist, speaker and researcher, focuses on the tactile side of the internet. How do you transmit touch over the internet? Adrian operates at the edge of what is possible in robotics and touch.

He will discuss technology in relation to touch and how it will contribute to a complete experience for people, including sexuality. When is good good enough, real enough, and how can the quality be improved technically? A must see!

Fifth Sense: The next stage of VR is total sensory immersion

posted in: Media
Wednesday, May 17, 2017, By @garethmay
https://www.wareable.com/vr/senses-touch-taste-smell-immersion-7776

How will VR expand from audio and visual to incorporate the other senses?


 

Last year, the director of the Imagineering Institute in Malaysia, Dr. Adrian Cheok, the brain behind the mixed reality wearable Huggy Pajama, which consoles children with virtual hugs, and Scentee, the smartphone attachment that pings pongs over the data highway, claimed that three senses are pivotal in creating a future sense of presence in the virtual world.

He told Asian Scientist magazine that he is working on technology that allows for “virtual communication of touch, taste and smell by digitizing these senses.”

In the instance of smell it’s a claim that now has scientific backing. In a paper published last October in the journal Virtual Reality, researchers from the University of Ottawa found that the addition of smell when in a VR environment “increases the sense of presence.”

The ‘unpleasant odour’ in this instance was piped into the room from an exterior accessory, a common method of simulating smell for VR users. Likewise, Valencia-based Olorama’s wireless aromatizing device does exactly that, fanning smells, such as ‘pastry shop’ and ‘wet ground’, around a VR play space.

At present this smell hack, if you will, is the easiest way to imitate odours. But simulating smell just isn’t that simple; it requires the imitation of molecular science, and ultimately the replication of certain molecules that trigger electrical pulses in the brain. As a result, Dr. Cheok’s dream of digitised senses remains a long way off.

It’s in the area of the curated experience – in the form of perfumery, temperature, and haptics – where we’re seeing developments.

Smellovision

Premiered at Gamescom last year, Ubisoft’s Oculus Rift send-up, the fart-simulating Nosulus Rift, gives gamers the ability to smell the farts of characters from the second South Park game, appropriately named The Fractured but Whole (Don’t get it? Try reading it aloud).

“The Nosulus Rift is a fully functional mask using sensors activated through inaudible sound waves in the in-game fart sound, every time the player makes use of his nefarious [fart] powers,” an Ubisoft spokesperson told us. “Each time the sensors are activated, they trigger the odour’s puff. Meticulously and without mercy.”
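Ubisoft says the mask’s sensors are “activated through inaudible sound waves in the in-game fart sound”. One plausible way such an audio trigger could work is to embed a near-ultrasonic marker tone in the game audio and have the device listen for it. The sketch below assumes a 19 kHz marker, a 44.1 kHz sample rate and an ad-hoc magnitude threshold; none of this is Ubisoft’s published design, it is only an illustration of the idea:

```python
import numpy as np

SAMPLE_RATE = 44_100   # Hz, typical game-audio rate
MARKER_FREQ = 19_000   # assumed near-ultrasonic trigger tone
CHUNK = 2048           # samples analysed per block
THRESHOLD = 0.05       # assumed share of spectral energy needed to fire

def marker_present(chunk: np.ndarray) -> bool:
    """Return True if the hidden trigger tone dominates this audio chunk."""
    spectrum = np.abs(np.fft.rfft(chunk * np.hanning(len(chunk))))
    freqs = np.fft.rfftfreq(len(chunk), d=1.0 / SAMPLE_RATE)
    marker_bin = np.argmin(np.abs(freqs - MARKER_FREQ))
    return spectrum[marker_bin] / (spectrum.sum() + 1e-9) > THRESHOLD

# Example: a synthetic chunk containing the marker tone triggers the puff.
t = np.arange(CHUNK) / SAMPLE_RATE
audio = 0.3 * np.sin(2 * np.pi * MARKER_FREQ * t) + 0.05 * np.random.randn(CHUNK)
if marker_present(audio):
    print("trigger detected -> release odour puff")
```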

Virtual flatulence not your bag? How about virtual body odour, breath, or even private parts? Earlier this year, adult entertainment webcam platform CamSoda announced a device called the OhRoma: a gasmask-style wearable with interchangeable odour canisters that releases smells matching any of the thirty scents ‘broadcast’ by a cam model via Bluetooth. The company is taking pre-orders now.

Neither of these nasal wearables allows for the interaction of smells, however, and that’s something Tokyo-born but Silicon Valley-based startup Vaqso VR demoed back in January with a Mars-bar sized VR accessory that’s able to emit multiple smells at once. The showcase revealed that players of a VR experience could smell not only the gunpowder of a gun but also the scent of a peach when it was pierced with a bullet.

“This device makes your VR experience richer,” says CEO Kentaro Kawaguchi, adding that he’s also working on simulating taste. “We want to perfectly reproduce the various senses of the five senses. Currently we can produce smells though taste may take a little while to develop.”

Compatible with PSVR, Oculus Rift, and HTC Vive, with claims on the site that the team can make any scent on demand, the consumer version of Vaqso’s VR scent device is scheduled for the first half of next year.


Warmer, warmer…

Prototyped at GDC in 2015, the multi sensory Feelreal mask promised to simulate temperatures and imitate wet and warm environments using a sophisticated combo of misters, heaters, and coolers (plus an ‘odour generator’). It didn’t get off the ground after a failed Kickstarter campaign.

One company that is delivering on their multi sensory promise is Sensiks. Its sensory reality pod, in which the user is seated, provides a totally immersive VR experience, augmenting the visuals from the headset with a set of exterior wind, light, and heat sense simulators – or, as founder Fred Galstaun puts it, “full sensory symphonies.”

“Real life reality is always full sensory and 360. Even a small cool breeze on the skin sets off the brain in ways you cannot even imagine,” he says. “Within a closed controlled environment where all the senses, including audio-visual, are made 360, there is no difference for the brain anymore between real and fake. It has become reality for the senses.”

Galstaun calls his pods—which are currently used in medical institutions for PTSD trauma recovery and with mentally disabled and elderly patients—sensory reality or SR for short. “We place SR next to VR and AR, a brand new product category in the programmed reality scene.”

But, as pods, these stimuli are exterior. As we’re seeing with smell, could temperature be incorporated into a wearable experience down the line?

The sensation of temperature is something that Samsung’s C-Lab is exploring with their T.O.B headband. As we previously reported, all we know about Touch On The Brain so far is that it generates the sensation of heat using an acoustic impulse that stimulates the brain. We asked for an interview but were told that because T.O.B is still at the very beginning stage in terms of development, no developers were available to chat. We’ll be waiting patiently to find out more.

Taste Test

As we all know, much of our perception of a meal relies upon different sensory inputs, from smell to sight to sound. Building on this core principle, with the aid of a VR headset and specially-created technology, is LA-based Project Nourished, a gastronomic experience that’s attempting to simulate eating by tricking the brain into thinking it’s consuming food.

Not that the brain is easily duped. The tech Project Nourished uses includes a gyroscopic utensil and a virtual cocktail glass that allow the diner’s movements to be translated into virtual reality, a diffuser to imitate the smell of various foods, and a ‘bone conduction transducer’ that “mimics the chewing sounds that are transmitted from the diner’s mouth to ear drums via soft tissues and bones.”

When combined with an edible gum the result is ultimate brain bamboozlement (Willy Wonka would be jealous) and a system the creators hope could be used to treat people with obesity and eating disorders, as well as help children to form positive eating habits from an early age.


Currently haptics are the most popular way of incorporating the sensation of touch into VR, and it looks like this will be the first sense to make the jump. This starts with something as simple as Go Touch VR’s finger cover accessory, which simulates the sensation of force you get when your finger encounters a real life object. It’s a VR glove without the actual glove part, and the French startup is working to a rough schedule of early 2019 for mass production.

At the other end in terms of both impact and expense, the Rez Infinite Synesthesia Suit, created by students at the Keio University Graduate School of Media and Design in Japan, is a full-body Velcro haptic VR suit that’s kitted out with small motors that vibrate as you journey through the virtual world. It’s been described as like “traveling through a psychedelic kaleidoscope”.

Experiences like this hint that we’re on the road to multi sensory VR but we’re unlikely to see much of it brought to reality in 2017. Still, next time you’re dazzled by the sound and picture of a VR experience but your body is crying out for something more immersive, just remember that it’s a work in progress. Buckle up, it’s going to get bumpy.

Exclusive interview for SPINOFF on Kissenger

posted in: Media


Article on Spinoff.com by ANASTASIYA SOVLEVICH

https://spinoff.com/kissenger

 


Exclusive interview for SPINOFF with Prof. Cheok & Ms. Zhang (PhD) on Kissenger, the gadget which allows remote kissing via the Internet.

 

Kissing allows people to share intimacy and emotions. Long-distance families, couples and friends can maintain a close relationship even without being physically together. Kissenger is an innovative gadget, invented by Professor Adrian David Cheok, that plugs into a smartphone and transmits realistic kissing sensations through a force controller from the sender to the recipient in real time over the Internet. With this breakthrough device in the sphere of mixed reality, it becomes possible to share intimate moments with friends and family while chatting with them on the phone or over video chat, keeping the social connection with each other. Incorporating touch, smell and physical interaction can make Internet communication a much richer, more intimate and more meaningful experience in the near future.

Photo provided by David Cheok

 

S.O.C.: Dear Professor Cheok and Emma, our team is so grateful that you agreed to spend this hour speaking with us and sharing information about your unique, one-of-a-kind Kissenger, the remote kissing machine.

So, potential investors would like to learn more about your extensive academic experience and your professional and scientific background. I know that your professional life is connected with more than one university. You are a keynote speaker worldwide, and we would like to hear about your professional honours and awards, too.

Professor Cheok: Currently, I hold two positions: Chair Professor of Pervasive Computing at City, University of London, and Director of the Imagineering Institute in Malaysia, a joint project between the Malaysian Government, City, University of London, some universities in Japan and some Malaysian universities. As the Founder and Director of the Imagineering Institute in Malaysia and the Founder and Director of the Mixed Reality Lab in Singapore, my main work is to run my lab of 50 researchers, mainly in electrical and mechanical engineering and computer science. I was formerly a full-time professor at Keio University, Graduate School of Media Design, and Associate Professor at the National University of Singapore. Over the previous 15 years I worked on research covering mixed reality, human-computer interfaces, wearable computers and ubiquitous computing, fuzzy systems, embedded systems and power electronics.

My research output includes numerous high-quality academic journal papers, research awards, keynote speeches, international exhibitions and numerous government demonstrations, including to Presidents and Prime Ministers. I am Editor-in-Chief of several academic journals, including Transactions on Edutainment (Springer), ACM Computers in Entertainment, Lovotics: Academic Studies of Love and Friendship with Robots, and Multimodal Technologies and Interaction. I am also an Associate Editor of around ten computer science, electrical engineering and virtual reality journals.

Photos provided by Adrian David Cheok

 

For several years, I was invited to exhibit in the Ars Electronica Museum of the Future. My works “Human Pacman”, “Magic Land” and “Metazoa Ludens” were each selected as one of the world’s top inventions by Wired and invited to be exhibited at Wired NextFest 2005 and 2007. I was awarded the Hitachi Fellowship, the A-STAR Young Scientist of the Year Award, the SCS Singapore Young Professional of the Year Award, an Associate of the Arts award by the Minister for Information, Communications and the Arts, Singapore, a Microsoft Research Award for Gaming and Graphics, the C4C Children Competition Prize for best interactive media for children, the Integrated Art Competition Prize by the Singapore Land Transport Authority, the Creativity in Action Award, a First Prize Nokia Mindtrek Award, and a Gold Award for the best Creative Showcase at ACE. I am the winner of the Keio University Gijyuju-Sho Award and a SIP Distinguished Fellow Award, which honours legendary leaders around the globe; I was named a Young Global Leader by the World Economic Forum and an “Honorary Expert” by Telefonica and El Bulli, and I am a Fellow of the Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA). My research on smell interfaces was selected by NESTA as one of the Top 10 Technologies of 2015. In 2016, I received the Distinguished Alumni Award from the University of Adelaide, in recognition of my achievements and contribution in the fields of computing, engineering and multisensory communication. Lately, I entered the elite list of the h-Index for Computer Science, a list that contains only the top 0.06% of all computer scientists in the world.

 

 

From this video, we see that things in Virtual Reality have changed a lot over the last 20 years. Considering your tremendous experience in this area, we would like to know whether you had other projects before. Could you please share the story of their creation and success? Have you received any grants or state funding for your projects?

After graduating from the University of Adelaide, Australia, in 1998 with my PhD in Engineering, I started working at the Mitsubishi Electric Research Lab in Japan, in the field of industrial electronics and systems. Then, at the age of 26, I joined the National University of Singapore, where I was the youngest Assistant Professor. There I was given a grant to work on mixed reality. Two examples of my previous successful research projects in this area are Human Pacman and Huggy Pajama.

Human Pacman, a novel mixed reality interactive entertainment system that allows human players to immerse themselves in role-playing the characters Pacman and Ghost by physically enacting the roles, was initially invented based on technologies we developed for an Augmented Reality location-based information system for soldiers in the urban battlefield. Human Pacman pioneered a new form of gaming anchored in physicality, mobility, social interaction and ubiquitous computing. “Human Pacman” received the honour of being selected as one of the top 100 visionary and high-impact technology works in the world by the USA-based WIRED magazine and was invited to be demonstrated at Wired NextFest 2005 in Chicago. The project was also heavily featured in the media worldwide, including the Discovery Channel, National Geographic, BBC, CNN etc. I saw that it would be a revolutionary new kind of computer gaming in which you play the game in the physical world. And now, 20 years on, we know that such gaming has become a huge commercial success, as Pokemon Go shows.

I can count a considerable number of externally funded projects in the area of wearable computers and mixed reality, for which we obtained approximately $20 million in funding from the Media Development Authority, Nike, National Oilwell Varco, the Defense Science Technology Agency, the Ministry of Defense, the Ministry of Communications and Arts, the National Arts Council, the Singapore Science Centre, and Hougang Primary School.

I also started to look for different ways in which my knowledge could be applied to new kinds of communication and interaction, not just vision, but also touch, smell and even taste. Another successful project was Huggy Pajama, a wearable system aimed at promoting physical interaction in remote communication between parent and child. The system enables parent and child to hug one another through a novel hugging interface device and a wearable, hug-reproducing jacket connected through the Internet. This project is a predecessor of Kissenger, as both projects focus on transmitting touch through the Internet. The system was later commercialised for autistic children, to relieve their stress by receiving a hug from their parents while they are away.

Photo provided by Adrian David Cheok

 

Some of my commercial projects are RingU and Scentee. RingU also makes use of remote haptic technology, with the haptic device in the form of a ring. Couples wearing the haptic ring can squeeze their rings to send a mini-hug to each other through the Internet. This project successfully received funding from a Korean investor and was launched in Seoul in 2014. Scentee is a Japanese company we collaborate with on digital smell technologies. The device is a smartphone attachment that can emit different types of smells using a smartphone app. It is available for $50 on the Scentee website.

Dear Professor Cheok, let’s get back to your recent project, Kissenger. It is so interesting to learn more about the process of Kissenger’s creation. Please tell us which stage of commercialisation it is currently at.

The initial idea of Kissenger started about 10 years ago. Besides the Huggy Pajama that allows people to hug each other over the Internet, I also wanted to build a device that people could use to kiss each other through the Internet. The early version of Kissenger was a simple head-shaped device with only a pair of realistic lips on it. We did some studies and people felt that it looked creepy. We then built a cuter version of the device in the form of a toy pig with rubber lips. This device had a simple force sensing and vibrotactile haptic feedback mechanism and it was connected to a computer.

 

Photo provided by Adrian David Cheok

 

I wanted to build a new version of the device for mobile phones, so that people can kiss their loved ones while talking or having a video chat on the phone. The current version of the device has a higher resolution force sensing interface to measure the lip pressure from a user, and the linear actuators generate varying pressure on the partner’s lips to reproduce the kiss in real time. The current prototype device can be connected to the mobile phone, and users can have a video chat with their friends and families and share a kiss at the same time using the device.
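As a rough illustration of the sensing and actuation described above, the sketch below maps an array of normalised lip-pressure readings onto linear-actuator displacements. The sensor count, actuator travel, update rate and the two hardware I/O placeholders are assumptions for this example, not the actual Kissenger firmware:

```python
import time

NUM_SENSORS = 4          # assumed number of force-sensing points across the lip surface
ACTUATOR_MAX_MM = 3.0    # assumed linear-actuator travel in millimetres

def read_force_sensors() -> list[float]:
    """Placeholder for the ADC read; returns one normalised pressure [0..1] per point."""
    return [0.0] * NUM_SENSORS

def set_actuator_positions(positions_mm: list[float]) -> None:
    """Placeholder for driving the linear actuators to the given extensions (mm)."""
    pass

def reproduce_kiss(remote_pressures: list[float]) -> None:
    """Map the partner's per-point lip pressure onto local actuator displacement."""
    set_actuator_positions([p * ACTUATOR_MAX_MM for p in remote_pressures])

# Local sampling loop: in the real device these readings would be streamed to the
# partner's phone, which calls reproduce_kiss() with them on arrival.
for _ in range(3):
    local = read_force_sensors()
    print("sampled lip pressures:", local)
    time.sleep(0.02)   # ~50 Hz haptic update rate (assumed)
```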

We made a special website for the project, through which we have been receiving over 400 unsolicited enquiries monthly from customers who would like to buy Kissenger, even though it is still at the prototype stage and we have not commercialised or publicised it yet. One man has recently been contacting us about commercialising Kissenger, but we felt he is too young and not experienced enough. We are still looking for suitable and experienced investors who are keen to help us commercialise the product.

 

Photo provided by Emma Yann Zhang

 

TEAM – such a meaningful word. Could you please share some information about the team members who supported you and the project? What key additions to the team are needed in the short term?

During the project’s development, I was supported by my PhD student, Emma Yann Zhang, who has been developing the Kissenger prototypes for her PhD research. We met when she was an undergraduate student in Singapore, and she has now been studying as my PhD student for more than two years at City, University of London and in my lab, the Imagineering Institute in Malaysia. She is a very strong electrical engineer and the main team member working with me on this project. She knows Kissenger as well as I do. She will introduce herself and say a few words about her work and the project.

Emma Yann Zhang: Hello. As Professor Cheok said, I have an engineering background. I studied for two years at the National University of Singapore from 2010 to 2012. I joined the Mixed Reality Lab for one semester and worked on several projects, including the Digital Taste Interface and Huggy Pajama, with PhD students and researchers. I received a B.Eng. in Electronic Engineering (First Class Honours) from the Hong Kong University of Science and Technology (HKUST) in 2013. I worked as a Project Assistant in the Social Media Lab at HKUST, developing a gesture-controlled Bluetooth wearable device that controls smartphone and desktop applications. I was placed on the Dean’s List for two consecutive semesters and received first prize in the HK IEEE Student Paper Contest the same year. I won a place on the UK Trade and Investment Sirius Programme, which supports and sponsors international graduate entrepreneurs to start a business in the UK. I joined a startup company based in London as a website developer in 2014.

My research interests are multisensory communication, haptic technologies, pervasive and wearable computing. My previous work involved low-power wearable devices, gesture control, microprocessors and embedded systems, mobile app development and web development. Together with Professor Cheok, I am mainly focused on how we can communicate with touch and transmit those senses via the Internet.

When I went to London to work on my PhD degree, I used my previous experience with smartphone apps, Bluetooth devices and microcontrollers in the development of Kissenger for my PhD research. The demo of the Kissenger prototype was presented during an international academic conference held in London and drew huge attention from the participants as well as from media channels such as the BBC, Huffington Post, The Guardian, The Times, Discovery Channel etc. Currently, I am responsible for the development of the project’s hardware prototype and mobile app. The prototype needs to be upgraded to make the commercial version a more suitable option for smartphones.

Photos provided by Adrian David Cheok

 

A new technology or product, and its subsequent commercialisation, presupposes a problem and addresses unmet needs. Respectively, what problem did you intend to solve by creating Kissenger? What results did you plan to achieve?

Current digital communication technologies focus heavily on visual and audio information, lacking the ability to transmit physical touch. Many people criticise digital technologies for encouraging social isolation and diminishing human abilities to empathise and form emotional bonds. Kissenger was developed to provide an intimate communication channel for families and friends to physically interact with each other remotely, in order to effectively convey deep emotions and intimacy through a multisensory Internet communication experience. With our device, long-distance families, couples and friends can maintain a close relationship even without being physically together.

 

After this video material, we have a clear vision of how the device works, but what is the USP of Kissenger and its fundamental difference from other technologies and products? Did the problem you are targeting exist before? Has anyone tried to solve it?

As far as we know, there is no similar product available on the market yet. From the research perspective, we are developing a novel multisensory haptic device which transmits realistic lip pressure through the Internet. Most of the haptic devices available nowadays make use of vibrotactile stimulation to reproduce the sensation of touch. In real life, being touched or kissed is the sensation of pressure applied to the skin. We use linear actuators instead of vibration motors to generate force feedback on the user’s lips and produce kissing haptic sensations. Furthermore, our system transmits the haptic kissing sensations in real time over the Internet bidirectionally, whereas many haptic systems are unidirectional. Lastly, our device works with mobile phones, which can connect to all the social networks such as Facebook Messenger, WhatsApp and Skype.
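The bidirectional, real-time aspect described here simply means each device streams its own lip pressure while replaying the partner’s at the same time. A minimal sketch of such a symmetric link follows; the UDP transport, packet format and update rate are illustrative assumptions rather than the actual Kissenger implementation:

```python
import socket
import struct
import time

LOCAL_PORT = 7007
PARTNER = ("partner.example.org", 7007)   # assumed address of the partner device
RATE_HZ = 50                              # assumed haptic update rate

def read_lip_pressure() -> float:
    """Placeholder for the local force-sensor read, normalised to [0, 1]."""
    return 0.0

def drive_actuators(pressure: float) -> None:
    """Placeholder: apply the partner's pressure with the linear actuators."""
    pass

def bidirectional_loop() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LOCAL_PORT))
    sock.setblocking(False)
    while True:
        # Each endpoint runs this same loop, so kisses flow both ways at once.
        sock.sendto(struct.pack("!f", read_lip_pressure()), PARTNER)
        try:
            data, _ = sock.recvfrom(4)
            drive_actuators(struct.unpack("!f", data)[0])
        except BlockingIOError:
            pass
        time.sleep(1.0 / RATE_HZ)

# bidirectional_loop()  # run the same loop on both devices
```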

Investors certainly need to understand the investment structure of the company. Do you fully own your spinoff?

The IP of Kissenger is jointly owned by me (Adrian David Cheok), my PhD student Emma Yann Zhang, and the Imagineering Institute, of which I am Director. Although we have not yet set up an official company for Kissenger, we are in the process of establishing one, because we have more brilliant ideas for the commercialisation of the product and that will be the next step.

We wonder what the actual addressable market for your invention is and who the current competitors are. Could you please share with us the results of market studies, if there are any? What might be the barriers to entry?

We think this could be a commercial success in the USA, Japan, South Korea, China and European markets. In Asian countries such as Japan and South Korea, people are more reserved about physical contact. Having a kissing device might help them experience intimate interactions through a mediated channel. Most of the requests to purchase Kissenger that we have received are from long-distance couples who would like to stay intimate with their partners during their long-distance relationship. However, we think this device could also be targeted at families, especially grandparents and grandchildren who often live apart, or at parents working overseas.

Some similar products on the market might be remote-controlled vibrators or teledildonics. These sex products are only suitable for couples and they are not commonly found in mainstream stores, which limits the range of potential customers. Kissenger is designed in such a way that it is not overly sexualised; we did not include tongue interaction, in order to make it suitable for families and friends. We think this could appeal to a larger audience.

We always need to paint a clear picture for potential investors of a market opportunity for the spinoff that is meaningfully large and growing. Why, in your opinion, might your company have high growth potential? In what industries and spheres of application do you think your product may be used?

We expect an exponentially growing market, because more and more people are connecting through the Internet as families and different generations live further apart. Kissing and physical touch will remain a fundamental need for people to maintain close relationships. The Kissenger device will provide a way for them to build physical intimacy even when physically apart. There will be a growing market that includes families, friends and couples. As mentioned above, this device could also be used by grandparents who would love to kiss their grandchildren living in other countries, or by parents to kiss their children while they are away for work.

I know how you love your daughter Kotoko and how you miss her while travelling, and I clearly remember that feeling during my last business trip to London and how I would have liked to kiss my 11-year-old daughter, so Kissenger certainly has huge potential among customers and I will be one of your first.

Moving forward, potential investors will be curious whether you already have your first clients and signed contracts.

No, we have not signed any contracts or acquired our first clients yet.

Dear Professor Cheok, we both know that for you and the investor it is crucial to reach positive cash flow as soon as possible. Certainly, market scaling cannot be achieved without a proper network of distributors and clients. Please tell us about your criteria for selecting partners and which markets are open for spinoff activity.

We want to collaborate with partners who are specialised in high-tech products and innovative gadgets. We think this could be a commercial success in the USA, Japan, South Korea, China and European markets, which we want to open up for spinoff activity.

It is very important to understand your particular vision of the unique features of your company. Why do you think the major market players might be interested in investing in the promotion of Kissenger in the addressable market?

The current technologies of digital communication are heavily focused on visual and audio information. We believe that touch communication is essential for establishing effective social relationships. Incorporating touch and physical interaction can make Internet communication a much richer, more intimate and more meaningful experience. Future communication technologies should be multisensory and more immersive, placing more emphasis on users’ emotional needs and social connections.

Now we would like to turn to the next very crucial, we would even say essential, aspect for spinoff companies: the strategy for R&D, production, distribution and marketing. Do you have your own unique strategy? Which of these processes do you consider your spinoff to be strong at?

As I may have mentioned before, I am an academic and my PhD student has also mainly been working as a researcher, hence we are most familiar with the research side. But we also have very good contacts in Malaysia and China who can produce and manufacture technology products.

As a rule, the majority of spinoffs grow into exits. How do you define the market for Kissenger and estimate its volume and dynamics? What is your potential share of the market? What market cap do you think your company can reach at the peak of its development, and why? How long might this process take?

With the Kissenger product, we think we can be a $100 million company as we are the first one to introduce such a product into the market. In the future, we also have plans to expand the company to make more products in haptic and communication technology.

For spinoff companies, intellectual property is key to success, and investors pay particular attention to it. What key intellectual property does your company have (patents, patents pending, copyrights, trade secrets, trademarks, domain names)?

We have applied for a design patent and trademark for Kissenger. As academics, we have mainly been concentrating on the academic paper publications that are required of us. We can share with you some recent papers and book chapters that we have published.

To continue on IP: we know that with rapid technological breakthroughs, the effective validity period of a patent becomes shorter. We would be interested to hear your plans for protecting your technological advancement and leadership in the medium and long term.

We see the current Kissenger prototype as only the beginning of touch communication. With our technologies, not only could we bring remote touch to human users connecting through the Internet, but we also see greater potential for such technologies in human-robot communication. In recent years, we have seen a strong trend of research groups and companies creating robots with which people can establish intimate and personal relationships. Intelligent robots will ultimately become humans’ companions, and there will be robot lovers and sex robots. The topic of Love and Sex with Robots has been widely discussed in the media and has also become an area of research among academics. We have started an international academic conference on Love and Sex with Robots, and it is in its third year.

Sex robots are definitely coming in the future. Source: News.com.au

 

Haptic technologies and devices are an essential part of creating robot partners with which people can interact. One obvious application is sex robots, which require advanced touch sensors and a haptic actuation mechanism to perform activities such as kissing, hugging and intimate sexual acts. It is also important for other robots to be able to communicate with humans through touch, as research shows that touch is an essential element in building social relationships. We see the future of our company in the area of advanced robotics with touch interaction. For example, we are already in the process of making a new humanoid robotic head that can share a kiss with humans.

Kissenger is now almost ready for commercialisation and a patent is pending. Are you seeking investment at the moment? What is the volume and what are the time limits? What milestones will the financing get you to? What do you plan to use the invested funds for?

We require the invested funds to design and manufacture a commercial version of the Kissenger device. The funds will also be used for the sales, distribution and marketing of the commercial device. The exact amount should be discussed after evaluation, but to bring the product to market we are talking about roughly $3 million.

The ideal investor – who is that for you? What aspects are important to you; for instance, is it experience, country, the amount of private capital, or perhaps personal qualities?

The most important aspects we look for in an investor are the amount of capital and experience with technological products.

What is the most convenient way to receive inquiries from potential investors?

For me, as the inventor, and for my co-founder Emma, the top priority is to bring our Kissenger product to market, so we can both be reached by any means of communication, such as email, phone call or Facebook Messenger.

Adrian David Cheok:
Email: adrian@imagineeringinstitute.org
Phone: +60197788914
Facebook: adriancheok@gmail.com

 

Emma Yann Zhang
Email:emma@imagineeringinstitute.org
Phone: +60182694730
Facebook: yannc2021@gmail.com

Dear Professor Cheok and Emma, the founder of SPINOFF.COM, our team and I would like to express our gratitude for the interview and all the provided materials, which will allow us to present the Kissenger idea and technology to potential investors. We will attach all the provided materials at the bottom of the page. SPINOFF.COM is honoured to support the development of your spinoff. I just want you to know that I am a big fan and, for sure, I will be one of the first Kissenger customers, so I can thank you remotely!

MASS MEDIA

Kissenger on Discovery Channel Daily Planet, January 16th, 2017

“Experts believe AR technology will revolutionize the gaming experience, creating an arena where people move about, socialising and interacting with each other instead of being glued to a computer screen. ‘These games symbolize the dawn of an era where real and virtual interactive experience will form part of the routine of our daily lives, allowing users to indulge in the seamless links across different domains, be it for entertainment or socialising,’ says Mr Cheok.” – BBC NEWS on Human Pacman

SPINOFF ANALYST: ANASTASIYA SOVLEVICH

More on: https://spinoff.com/kissenger

City, University of London Professor Enters the Elite List of The h-Index for Computer Science, the top 0.06% of Computer Scientists

posted in: Media

Professor Adrian David Cheok, Chair Professor of Pervasive Computing at City, University of London and Director of the Imagineering Institute, Malaysia, has entered the elite list of the h-Index for Computer Science, placing him among the top 0.06% of computer science researchers.

The list contains the authors of computer science papers who have an h-index of 40 or above, computed from various sources including Google Scholar and DBLP. Of the 1.7 million computer science authors listed on DBLP, only about 1,000 authors in the entire world meet this requirement, making them the top 0.06% of computer science researchers. The list is maintained by Professor Jens Palsberg, Professor of Computer Science at UCLA. Other prominent researchers on the list include Nobel Laureate Dr Herbert Simon, Turing Award winners, members of the National Academy of Engineering, members of the National Academy of Sciences, and Fellows of the IEEE and ACM.
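For reference, an author’s h-index is the largest h such that they have published h papers with at least h citations each. A short illustration of how the threshold for this list (an h-index of 40 or above) can be checked from a citation list; the sample citation counts below are made up:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that the author has h papers with >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# An author with these (made-up) citation counts would qualify for the list,
# which requires an h-index of 40 or more.
sample = [120] * 30 + [60] * 15 + [5] * 200
print(h_index(sample))   # -> 45
```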

Professor Adrian David Cheok is highly recognised for his research on mixed reality, multisensory internet communication, human-computer interfaces, wearable computers, and pervasive and ubiquitous computing. He is the winner of numerous prestigious awards, including the Hitachi Research Fellowship, Young Global Leader by the World Economic Forum, Fellowship of the Royal Society of Arts, Manufactures and Commerce (RSA), and the Distinguished Alumni Award of the University of Adelaide.

The h-Index for Computer Science list can be found at http://web.cs.ucla.edu/~palsberg/h-number.html
