A New Electric Spoon Could Make Vegetables Taste ‘Like Chocolate’

posted in: Media


by Chris Crowley, October 13, 2016 10:30 a.m.

Of course someone decided to disrupt cutlery. Photo: Leisa Tyler/LightRocket via Getty Images
In the near future, humans will laugh about their poor ancestors who had no choice but to eat with basic, outdated cutlery that couldn’t manipulate the flavor of their food. That will be thanks to a group of scientists at the University of London, who are developing a device that makes low-sugar food taste sweeter or saltier, if for some reason you actually want your sautéed spinach to taste like birthday cake.

For at least one of the scientists, professor Adrian Cheok, this is a dream come true. He tells The Telegraph he got into engineering with just one noble goal: not to make kids like vegetables, but to allow them to still hate veggies and eat them anyway because technology can make them “taste like chocolate.”

Dubbed the Taste Buddy, the deceptive device will trick people, through a low-level electrical current that stimulates taste buds, into tasting flavors that aren’t actually present. The scientists’ hope for the device, which was revealed this week at England’s Big Bang U.K. Young Scientists & Engineers Fair, is to engineer it to fit within everyday utensils or beverage cans. Currently, the team is working on a prototype spoon, but if it all works out, there will be a whole line of microchip-like Bluetooth devices that allow users to “choose the levels of taste you’d like.”

http://www.grubstreet.com/2016/10/electric-spoon-could-make-vegetables-taste-like-chocolate.html

Speech at Design & Emotion Conference in Amsterdam

Professor Adrian David Cheok was invited to be a thought leader at the 10th Conference on Design & Emotion held in Amsterdam this year from September 27 to 30.

The International Conference on Design & Emotion is a biennial forum where practitioners, researchers and industry leaders meet to exchange knowledge and insights on the cross-disciplinary field of design and emotion, which draws on social science, the humanities, engineering, computer science, HCI, psychology, cognitive science, health sciences, marketing and business.

Design & Emotion went from being the rookie in the field to (and this sounds scary) belonging to the establishment. This 10th edition is an excellent moment to discuss the next 15 years. The Design & Emotion community has proven to be dedicated and committed, the ideal group for a discussion like this. The conference therefore invited eight thought leaders to host theme-oriented sessions, exploring and discussing future directions, with the aim of defining a new framework for the mission statement for the next 15 years.

The theme of the first thought leader session is “Enhancing everyday life”, hosted by thought leaders Adrian Cheok and Jodi Forlizzi. They explored how design can cultivate, enrich, or even enhance the way we live our lives. During the session, Adrian gave a talk about his research on multisensory communication and mixed reality.

http://www.de2016.org/

Adrian Cheok awarded Distinguished Alumni award by University of Adelaide



Professor Adrian David Cheok was awarded a Distinguished Alumni Award by the University of Adelaide in recognition of his achievements and contributions in the fields of computing, engineering and multisensory communication. The Distinguished Alumni Awards recognise the outstanding contribution and significant impact made by alumni of the University.

Professor Adrian Cheok obtained a Bachelor of Engineering (Electrical and Electronic) with First Class Honours in 1994 and a PhD in Engineering in 1999. Professor Cheok is a pioneer in mixed reality and multisensory communication; his innovation and leadership have been recognised internationally through multiple awards.

Some of his pioneering works in mixed reality include innovative and interactive games such as ‘3dlive’, ‘Human Pacman’ and ‘Huggy Pajama’. Professor Cheok is also the inventor of the world’s first electric and thermal taste machine, which produces virtual tastes with electric current and thermal energy.

Past winners include The Honourable Julia Gillard MP, former Prime Minister of Australia; The Right Honourable Chief Minister of Sarawak, YAB Pehin Sri Dr Haji Abdul Taib Mahmud AO; Professor Oliver Mayo; Mr Ong Teng Cheong, first elected President of Singapore; and Dr Tony Tan Keng Yam, the current President of Singapore. A full list of past winners can be found here.

https://www.adelaide.edu.au/alumni/recognised/distinguished-awards/



Wake Up and Smell The Roses—Virtually | AsianScientist



You can now send tastes, smells and even kisses virtually. A new age of virtual reality involving all five senses is here, with Adrian Cheok of the Imagineering Institute at its forefront.

Daniel Soo | August 1, 2016 | Editorials


AsianScientist (Aug. 1, 2016) – Once the stuff of science fiction, virtual reality (VR) technologies are becoming ever more present in our daily lives. From documentaries that bring you to the Great Barrier Reef, to games that let you soar through the air as an eagle, this technology looks set to change the world as we know it. As director of the Imagineering Institute, Malaysia, and founder and director of the Mixed Reality Lab, Singapore, engineer and inventor Dr. Adrian Cheok believes that the future of mixed reality—the integration of the virtual and physical world—belongs to smell, taste and touch. Cheok has played a key role in the innovation of various mixed reality devices and applications, some of which have already been released commercially. His Huggy Pajamas comforts anxious children by allowing them to receive virtual hugs sent from their parents. More intriguingly, his Scentee smart phone attachment allows one to send and receive various scents on command. Asian Scientist Magazine recently chatted with Cheok, who was in Singapore to deliver the keynote speech for the Singapore Science Festival’s Visual SG event, to find out what has been keeping him busy.

In your own words, what is mixed reality?

Mixed reality is the merging of our physical reality with virtual reality, which can be done at the level of all five senses. Science has shown that we communicate with all of our five senses. In fact, non-verbal communication is more than half of human communication, and that’s why it’s still very different to have a meeting with someone with a video call than to meet them in person. Something is missing when you just communicate with video or through the internet. I believe that in the future, we will be able to communicate with every one of our senses through the internet, and move from the age of information that we are in today to the age of experience.

What are some applications of mixed reality?

Telepresence will of course be a big example of how mixed reality can be applied. If we can transmit all five senses, telepresence would allow people to really feel like they are together. They could touch each other, or even share a dinner together, even though they may be on totally different sides of the world. Mixed reality devices can also create new kinds of communication. If you can digitally taste and smell, then you can have an app on your smartphone and virtually taste and smell a dish at a famous restaurant. Collaboration is another big benefit: for example, you could collaborate with one or many people, like cooking a dish together through the internet. Another application that mixed reality will lead to is new kinds of learning. For example, instead of reading a book or watching a movie about ancient Rome, you could feel what it’s like to be there and even taste and smell what it’s like to live in an ancient city. This would create a totally new kind of learning, because we humans learn very much experientially.

What first drew you to work on virtual reality?

I first began by looking at augmented reality systems, which allow people to see virtual 3D objects in the physical world. I noticed that the first thing people did was to try to touch the objects. That’s when I realized that we have to extend augmented reality beyond the 3D graphics that we see on our video games and movies. We need to use touch, taste and smell to really create a sense of presence in the virtual world. That’s what I call experience communication: not just sharing information, but sharing your experience.

 

The Kissenger device attaches to your smartphone and allows you to send virtual kisses. Credit: Imagineering Institute

 

What are some of the limitations to engaging our five senses using virtual reality?

Right now, we are concentrating on making the technology that allows for the virtual communication of touch, taste and smell by digitizing these senses. It’s a very difficult problem. Fundamentally, sound and light are frequencies: they are wave-based, and can be transmitted as digital bits over the internet. The fundamental problem with touch, taste and smell is that they are a different kind of sense. For smell, we sense molecules, which trigger electrical pulses to the brain. It’s still very much in the early stages of research, but we have successfully been able to produce virtual taste, using electrical signals only and without any chemicals. We are now working on smell, which is an even more difficult problem. Unlike taste, where we only have five different taste receptors, most scientists estimate that we have a few thousand different kinds of individual smell receptors. How do we stimulate those individual receptors? It’s going to be a very big problem, but so far we have had some success by using small electrodes on the inside of the nose to stimulate the olfactory sense.

How do you think virtual reality technology will evolve in the next 20 years?

Technology is increasing at such an exponential rate that we can’t imagine exactly what the world will be like, but we can imagine that it’ll be incredibly different. I think that anything that doesn’t break the laws of physics can be invented by humans. We can’t change the fact that we’re born with five different types of taste receptors, or that there’s a specific range of frequencies we can see and hear, but I believe that we can somehow alter our perceptions with technology. For example, the spectrum of light is much wider than what we can see. We can’t see infrared with our naked eyes, but we can now visualize it with infrared glasses. In some way, we already live in our own virtual reality: an analog biological virtual reality, because we are just seeing the world we are naturally designed to see. We think that this is the reality but it’s not—that’s why it’ll be so different when we have virtual reality, because we already don’t see any kind of ‘objective reality.’

Source: http://www.asianscientist.com/2016/08/features/adrian-cheok-imagineering-institute-mixed-reality-lab/

The Multi-Sensory Internet Brings Smell, Taste, and Touch to the Web | Motherboard



November 10, 2013 // 02:02 PM EST

By GIAN VOLPICELLI

Adrian Cheok, professor of pervasive computing at City University London and director of the Mixed Reality Lab at the National University of Singapore, is on a mission to transform cyberspace into a multi-sensory world. He wants to tear through the audiovisual paradigm of the internet by developing devices able to transmit smells, tastes, and tactile sensations over the web.

Lying on the desk in Cheok’s lab is one of his inventions: a device that connects to a smartphone and shoots out a given person’s scent when they send you a message or post on your Facebook wall. Then there’s a plexiglass cubic box you can stick your tongue in to taste internet-delivered flavours. Finally, a small plastic and silicone gadget with a pressure sensor and a moveable peg in the middle. It’s a long-distance-kissing machine: You make out with it, and your tongue and lip movements travel over the internet to your partner’s identical device—and vice versa.

“It’s still a prototype but we’ll be able to tweak it and make it transmit a person’s odour, and create the feeling of human body temperature coming from it,” Cheok says, grinning as he points at the twin make-out machines. Just about the only thing Cheok’s device can’t do is ooze digital saliva.

I caught up with Cheok to find out more about his work toward a “multi-sensory internet.”

The make-out device, plugged into an iPhone

 

Motherboard: Can you tell us a bit more about what you’re doing here, and what this multi-sensory internet is all about?

There is a problem with the current internet technology. The problem is that, online, everything is audiovisual and behind a screen. Even when you interact with your touchscreen, you’re still touching a piece of glass. It’s like being behind a window all the time. Also, on the internet you can’t use all your senses—touch, smell and taste—like you do in the physical world.

Here we are working on new technologies that will allow people to use all their senses while communicating through the Internet. You’ve already seen the kissing machine, and the device that sends smell-messages to your smartphone. We’ve also created devices to hug people via the web: You squeeze a doll and somebody wearing a particular bodysuit feels your hug on their body.

What about tastes and smells? How complex are the scents you can convey through your devices?

We’re still at an early stage, so right now each device can just spray one simple aroma contained in a cartridge. But our long-term goal is acting directly on the brain to produce more elaborate perceptions.

We want to transmit smells without using any chemical, so what we’re going to do is use magnetic coils to stimulate the olfactory bulb [part of the brain associated with smell]. At first, our plan was to insert them through the skull, but unfortunately the olfactory part of the brain is at the bottom, and doing deep-brain stimulation is very difficult.

And having that stuff going on in your brain is quite dangerous, I suppose. 

Not really—magnetic fields are very safe. Anyway, our present idea is to place the coils at the back of your mouth. There is a bone there called the palatine bone, which is very close to the region of your brain that makes you perceive smells and tastes. That way we’ll be able to make you feel them just by means of magnetic actuation.

Cheok demonstrates the taste-transmitter

 

But why should we send smells and tastes to each other in the first place?

For example, somebody may want to send you a sweet or a bitter message to tell you how they’re feeling. Smell and taste are strongly linked with emotions and memories, so a certain smell can affect your mood; that’s a totally new way of communicating. Another use is commercial. We are working with the fourth-best restaurant in the world, in Spain, to make a device people can use to smell the menu through their phones.

Can you do the same thing also when it comes to tactile sensations? I mean, can you put something in my brain to make me feel hugged? 

It is possible, and there are scientists in Japan who are trying to do that. But the problem with that is that, for the brain, the boundary between touch and pain is very thin. So, if you perform such stimulation you may very easily trigger pain.

It looks like you’re particularly interested in cuddling distant people. When I used to live in Rome, I once had a relationship with a girl living in Turin and it sucked because, well, you can’t make out online. Did you start your research because of a similar episode?

Well, I have always been away from my loved ones. I was born in Australia, but I moved to Japan when I was very young, and I have relatives living in Greece and Malaysia. So maybe my motivation has been my desire to feel closer to my family, rather than to a girl. But of course I know that the internet has globalized our personal networks, so more and more people have long-distance relationships. And, even if we have internet communications, the issue of physical presence is very relevant for distant lovers. That’s why we need to change the internet itself.

The scent device in action

 

So far you have worked on a long-distance-hugging device and a long-distance-kissing machine. You also have gadgets that can transmit a person’s body odour. If I connect the dots, the next step will be a device for long-distance sex.

Actually, I am currently doing some research about that. You see, the internet has produced a lot of lonely people, who only interact with each other online. Therefore, we need to create technologies that bring people physically—and sexually—together again. Then, there’s another aspect of the issue…

What’s that?

As you noticed, if you put all my devices together, what you’re going to have soon are sorts of “multi-sensory robots”. And I think that, within our lifetime, humans will be able to fall in love with robots and, yeah, even have sex with them.

It seems to me all the work you’re doing here may be very attractive for the internet pornography business.

Of course, one of the big industries that could be interested in our prototypes is the internet sex industry. And, frankly speaking, that being a way of bringing happiness, I think there’s nothing wrong with that. Sex is part of people’s lives. In addition, very often the sex industry has helped to spur technology.

But so far I haven’t been contacted by anybody from that sector. Apparently, there’s quite a big gap between people working in porn and academia.

Source: http://motherboard.vice.com/blog/the-multi-sensory-internet-brings-smell-taste-and-touch-to-the-web?utm_source=mbfb

On a mission to send smells, tastes virtually | TODAY



BY REGINA MARIE LEE
JULY 8, 2016

Visitors can unwrap an Egyptian mummy virtually and even explore amulets buried with the body. Photo: Interspectral

 

In the third of a series of reports on the highlights of the Singapore Science Festival this year, we look at how scientists are making it possible to create virtual worlds real enough to smell and taste.

SINGAPORE — See animals in a zoo that are so real you can smell them, while you are merely standing in a room with a video feed. Or send a kiss virtually, one that can be felt.

These are the “realities” Professor Adrian Cheok wants to create, in his mission to make communicating digitally more realistic — by making it possible to send touch, smells and tastes over the Internet.

Prof Cheok, who is from the Imagineering Institute, is among the scientists working in the field of simulation technology, to change the way people experience not just communications, but also science.

For example, children in Spain with certain diseases, who typically spend many hours in the hospital, can now “visit” the zoo virtually through a live 3D video feed, as part of an ongoing study by the Malaysia-based Institute and the University of Valencia to see if virtual zoos have a positive effect on the children.

Unlike sound and light, which are frequency-based and can be sent digitally, taste and smell are chemical-based, so the challenge is in using electrical signals to stimulate these senses, said Prof Cheok, who is working on a device that sends kisses virtually — with the help of a silicone device attached to a mobile phone — and another which artificially produces taste sensations.

Taste and smell are the “most difficult senses” to digitise, said Prof Cheok, and he is working on a project that has electrodes implanted inside the nose to stimulate olfactory receptor neurons to produce a smell.

Another device connects to the tongue to change its temperature, producing a sweet taste through thermal energy.

“In the future it may not be silver electrodes, it could be cutlery which has electrical signals … (with) a tiny wireless electrode in your nose connected to your smartphone,” he said.

Currently, simulation technology in the market is focused on augmented reality, and more widely seen in advertising and entertainment, he said. But researchers focus on inventing new technology and then work with businesses to figure out how it can be made into useful commercial products.

Simulation technology can recreate novel experiences without causing harm, such as allowing people to “unwrap” a mummy — via a visualisation table at the Science Centre Singapore.

The model was created using X-rays, laser scans and photos of a real mummy, Neswaiu, a wealthy Egyptian priest who lived in the third century BC. Visitors can zoom in on the touchscreen to examine Neswaiu’s sarcophagus, and then peel off layers of the body to study the anatomy, internal organs and even the amulets buried together with him.

The technology behind the Mummy Explorer was originally created for visual medical images so doctors could perform virtual autopsies, but the team “quickly realised there were wider opportunities for the use of the technology”, said Prof Anders Ynnerman, the scientist behind the technology.

The explorer “is a very nice way of being able to engage people … (and let) the general public have a feeling for what scientists are doing and what the scientific exploration process is like”, since they can actively participate and freely explore the mummies themselves.

The first interactive touch table was completed in 2009, and has been used in other museums, like the British Museum and the Museum of Mediterranean and Near Eastern Antiquities in Stockholm.

Prof Ynnerman said the British Museum was initially worried that the dazzle of a digital artefact would leave the real mummy in the cold. But they found that people spent three times as long at the exhibit, looking at the mummy, his digital counterpart, and then at the mummy again to study it closely.

He hopes that such interactive visualisations can shorten the distance between scientific research and the general public, so that through exploring, people can feel “part of the scientific discovery”.

Source: http://www.todayonline.com/singapore/mission-sendsmells-tastes-virtually

How long until we can touch, smell and kiss online? | Xataka



How long until we can touch, smell and kiss over the internet?


Transmitting all five senses over a remote connection. That is the obsession of the Australian Adrian David Cheok, a professor at the National University of Singapore, who tours the world demonstrating his inventions for tasting flavours through chips, smelling and even exchanging kisses. His thesis is simple: turn the internet into a space that is connected… emotionally. Digitise the five senses (smell, sight, hearing, touch and taste), replicating the complex real world through hardware.

To that end, his laboratory has been working on multisensory projects for more than two years. Since our communication is largely based on using the internet and the smartphone through social networks, Cheok believes that is where we should direct our progress.

You will send kisses through your phone


Digitising the senses would mean a radical change in how we communicate. Cheok’s first progress in this field revolves around touch. Following in the footsteps of Kissenger, the Lovotics robot through which we could send kisses, Cheok’s device is a pink silicone rectangle with a plug that fixes it to the base of a smartphone. Over a Wi-Fi connection it can communicate with an identical device.

The device has pressure sensors: an algorithm calculates the pressure and direction readings in real time as the kiss happens. But we can also use it, plainly and simply, to transmit positive feelings, to wish someone good luck, to say hello, or to wish our grandmother a happy birthday. If we don’t want to close our eyes, the phone’s screen shows the forehead or the face of the person being kissed. The gadget in question is washable and shaped like a standard pair of lips. They are not your ideal partner’s lips and they have no warm scent, but neither does the texture resemble a plain PVC pipe.
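The article does not describe the implementation, but the loop it implies (sample the pressure sensors, estimate pressure and direction, send the estimate to the paired device) can be sketched as follows. This is a speculative illustration, not Kissenger’s actual code, and every name in it is hypothetical:

```python
import struct

# Hypothetical sketch of a kiss-mirroring loop: read an array of local
# pressure sensors, derive a (pressure, direction) estimate, and encode
# it compactly for transmission to the paired device.

def estimate(readings: list[float]) -> tuple[float, float]:
    """Mean pressure, plus a crude left/right direction estimate taken
    from the imbalance between the two halves of the sensor array."""
    mid = len(readings) // 2
    pressure = sum(readings) / len(readings)
    direction = sum(readings[mid:]) - sum(readings[:mid])
    return pressure, direction

def encode(pressure: float, direction: float) -> bytes:
    # Two little-endian 32-bit floats per sample: 8 bytes on the wire.
    return struct.pack("<ff", pressure, direction)

def decode(payload: bytes) -> tuple[float, float]:
    return struct.unpack("<ff", payload)

# One cycle of the loop: sense, estimate, encode, (transmit), decode.
pressure, direction = estimate([0.2, 0.4, 0.6, 0.8])
received = decode(encode(pressure, direction))
```

In a real device the 8-byte payload would be streamed over Wi-Fi at the sensor sampling rate, and the receiving side would drive actuators rather than merely decode the numbers.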


Cheok draws an interesting parallel: in the old days we copied our favourite songs for two or three friends and passed them around on tape. That was the birth of digitisation. But now anyone can download a song in a couple of seconds; we can tip off an emerging discovery and watch it become globally popular within days. Cheok is after that kind of communication, that way of transmitting information more directly, the old-fashioned way.

There is something psychological about contact. Long-distance relationships break down over this gap. You can spend hours on Skype and, although it greatly improves on the misinterpretation and distortion of text chats, it is still somewhat cold. When you hang up the call, you feel a silence take hold of you. Emotions are difficult to grasp. And we have been trying for a great many years.


From 100 years to 10: the race to digitise the senses

As Cheok himself points out, digitising audio took us a century, and doing the same with video took 50 years (and we have already reached realistic reproduction of it, through HDR). «Digitising smell or taste will not take us more than 10 or 15 years, given the acceleration of technology.»

The digitisation of smells and flavours has, in one way or another, always been with us. Think of leaving a dried rose between the pages of a favourite passage: from scented letters to Geronimo Stilton books. In the film industry, the digitisation of smells was fairly explicit and chaotic. The Swiss inventor Hans Laube presented ‘Scentovision’ at the 1939 New York World’s Fair, a system that added an aroma track to a film: by releasing small doses through little tubes hidden behind each spectator’s seat, it would amplify the narrative power of the work. It never came to much.


The Minneapolis producer Michael Todd did likewise, following his father’s investigations, and presented ‘Smell-O-Vision’ under the slogan «First they moved (1895)! Then they talked (1927)! Now they smell!». It was a failure that never took off. The years that followed brought only lukewarm research. Terrence Malick backed a similar idea for ‘The New World’ (2005), his retelling of the Pocahontas myth, with aromas of tilled earth and freshly cut wood. In 2013, several cinemas in Nagoya (Japan) screened ‘Iron Man 3’ using 4DX technology, that is, 3D video supported by physical effects (strobe lights, fog, hot air) together with smells dispersed through the ventilation ducts.

Outside the commercial circuit, the technique for digitising smells (chromatography) has evolved along disparate paths. A few years ago the Californian company DigiScents presented iSmell, an application for adding smells to different technology platforms, such as a simple email: one receiving device synthesises the smell through a chip that captures and analyses the fingerprint smells leave behind, while another prints and reproduces the captured smell. The Japanese company Chaku did something similar in 2012 with its “electronic noses” and developed Chat Perf, a way of sending smells at a distance via an atomiser connected to the iPhone port, a sort of smell-o-phone for ordinary consumers.


Audio and vision are easy to digitise: they are just numbers

Audio and images are measured in frequencies: the visible spectrum, the audible spectrum, and bits of resolution. As is well known, 8-bit colour refers to the 256 shades of each primary colour: 256 of red, 256 of green and 256 of blue. Something similar happens with audio. We measure pitch within the audible range (from 20 Hz to 20,000 Hz), we measure the timbre of the sound, and we measure the intensity, in decibels. With these three values we know whether the sound of a bassoon is “low”, “metallic” and “soft”, based on the power of the sound source.
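To make the point concrete, here is a minimal Python sketch of both ideas: a colour as three 8-bit numbers packed into 24 bits, and a tone as a stream of 8-bit samples. The 440 Hz frequency and 8 kHz sample rate are arbitrary choices for illustration, not figures from the article:

```python
import math

# An 8-bit RGB colour is three integers in the range 0..255, often
# packed into a single 24-bit number (0xRRGGBB).
def pack_rgb(r: int, g: int, b: int) -> int:
    return (r << 16) | (g << 8) | b

# Digital audio works the same way: sample a sine wave and quantise
# each sample to an unsigned 8-bit value in 0..255.
def sample_tone(freq_hz: float, rate_hz: int, n_samples: int) -> list[int]:
    return [
        round(127.5 + 127.5 * math.sin(2 * math.pi * freq_hz * t / rate_hz))
        for t in range(n_samples)
    ]

orange = pack_rgb(255, 165, 0)        # 0xFFA500
tone = sample_tone(440.0, 8000, 16)   # 16 bytes of a 440 Hz tone
```

Because everything here is already an integer, copying, storing or transmitting the colour and the tone is trivial; that is precisely the property taste and smell lack.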

Taste and smell, by contrast, are based on sensing molecules. They are like a more hormonal, less cerebral version. Numbers are not enough to interpret them; we need sensory stimulation, which in the brain translates into electricity. It is like sending small packets of encoded data: the receiver checks the postmark and determines that «this citrus aroma is pleasant and fresh». So, as Cheok points out in the video, if we want to convey that aromatic-olfactory combination, we need to condense that fingerprint into bits of information and transmit it electrically to our brain. We need to speak its language.


The problem of not being able to digitise a molecule

The usual route is to turn to chemistry. As we said, images and sounds are signals of vibration and light, whereas aromas and flavours are chemical signals. How can we stimulate these receptors without using chemicals? We cannot transmit molecules or particles over the internet. Or can we? We can detect diabetes with glucose readers: sweetness becomes a digital signal that the glucose reader interprets, and we can even export that information to a memory card.

The problem with digitising a molecule arises not so much from the molecule’s own information as from how we interpret it. Let me explain: from the study of the cochlea we conclude that we can detect a limited range of frequencies, between 20 Hz and 20 kHz, as we said. From the study of the eye we know how to fool it and adapt to its characteristics, with its 7-8 million cones and 120 million rods (receptors for colour and luminance, respectively).

Smell is another story. A healthy human can detect around 1.2 billion distinct smells, and they are remembered better than any other memory generated by a sensation. The combinatorial possibilities are practically infinite. How do we know which molecules to discard and which to select?

We could not even build a standard model: not everyone has the same taste buds. Although everyone has similar receptors (sweet, salty, bitter, sour and umami), the way we interpret those molecules is unique, like a fingerprint.


The solution: stimulate the brain with electricity

Just as the synaptic connections of our neurons do, the only effective way to correctly interpret these molecules across thousands of kilometres is through electrical stimulation. You may have heard of “brain hacking”, a form of transcranial direct-current stimulation. It consists of applying small, controlled discharges in order to treat pathologies, stimulate memory in people with amnesia, or order a certain part of the body to perform a certain function (such as triggering hair growth).

A professor of biomedical engineering at New York University made it very clear, in this article in Nature, that the home-made, do-it-yourself version of this hacking is genuinely dangerous. In fact, the long-term side effects of these practices are still unknown. In this case we are not talking about doping; we are talking about reinterpretation, about reading smells the way we might read a book. And current devices do not yet achieve this goal effectively.


You will change the way you relate to people

Cheok’s idea is eminently social. In the real world you can only hug one person at a time; if we manage to share the senses effectively, we will be able to hug thousands of people simultaneously. His goal is to replicate the behaviour of the real world on a global scale.

One example: the culture of emoji language. What began as a mere aside, a reinforcement for an essentially textual message, has become the core of the message itself. We can now send an emoji as loaded with meaning as any letter. Cheok's theory just needs some extra time. As emotional and social beings, our communication will carry on beyond any form of digitization, beyond cybernetic implants or the humanization of the Internet.


Your friend of the future won't exist: it will be a robot

Time is a centrifuge. We have gone from proposing robots as communication tools, normalizing their use in everyday acts, to involving them in household chores and, of course, making them party to our infidelities by contracting them out to dispense hugs. Are we heading, as in Masamune Shirow's clever film 'Ghost in the Shell' (1995), toward an emotional dependence on robots? Or is this merely a way of measuring our possibilities as creators? Consider the myth of Pinocchio: his father-creator Geppetto is not trying to breathe life into wood, but to substitute for a set of affective longings.

'Her' (Spike Jonze, 2013) is popularly known for widening a certain debate about Artificial Intelligence. Can a person fall in love with a mere voice, an intangible Artificial Intelligence? This is not just fiction but a logical debate between what we can touch and what we can feel. And from there we would leap to another debate: if that artificial brain were poured into a humanoid body, as in the popular 'Ex Machina' (Alex Garland, 2015), would it become fully conscious of its capabilities? Would it feel the desire to satisfy its owner, or to walk away free, with all the emotional implications that entails?


"Why should I fly on a plane? Why should I even leave the house to go to work?" Cheok himself, in the video at the top of the article, anticipates certain behaviors: given that human beings seem slowly doomed to a certain immobility, to guided tours of virtual museums instead of visits to the real ones, will we prefer a robot over a person simply because it is more convenient for us?

Progress in Artificial Intelligence points to a hybrid future: just look at self-driving vehicles. On one side we have machine-learning algorithms, linked to procedural generation, that will process human language far more intuitively than today's assistants (Cortana, Siri, and so on). Thanks to permanent connectivity, to the "Internet of Things", any gadget that does not include some computing and AI technology will be considered stupid, obsolete.

Visual data analysis has improved exponentially thanks to bionic eyes, and with projects such as Google's Tango, sensors already analyze and interpret the volume of bodies in space.

The industry most eager for this change is not suitable for under-18s

Multisensory rooms, whose Dutch name "snoezelen" blends the words for "to sniff around" and "to doze", appeared in the 1970s as a tool for cognitive development and have gradually adapted to the sexual marketplace. Teledildonics is a hardcore version of the Kissenger, in which tactile sensations are transmitted over a data link between the participants. And yes, it refers to robotic sex or, at the very least, to remote erotic stimulation.


A change in remote sensory communication would turn the adult market upside down. The sites devoted to it would find a gold mine for their premium customers and users. The best precedent is Virtual Reality: channels, apps and sales lines dedicated exclusively to it. That industry has always been the pioneer, the first to adopt new formats, from DVD to Blu-ray, from the use of certain apparel to establishing social behaviors and language.

So, while robots still communicate in a very aseptic way, there will be those who exploit that immersion, cementing the connection between people in more literal ways. Even if some brands try to forbid it by contract.

http://www.xataka.com/n/cuanto-nos-falta-para-tocar-oler-y-besar-por-internet

Adrian David Cheok, Keynote Speaker of Brainy Tongue Conference

Brainy Tongue

Adrian David Cheok was a keynote speaker at the Brainy Tongue conference, "Sensory Logic of the Gastronomic Brain", a workshop exploring the interface between neuroscience and cooking: an exceptional hands-on meeting where the world's leading scientists and chefs investigate the mysteries of perception through interactive seminars and sensory experiments. The event was co-organised by three Spanish entities: the Centre for Genomic Regulation (Barcelona), Mugaritz (San Sebastián) and the Basque Culinary Center (San Sebastián). The meeting will take place at the Basque Culinary Center in autumn of 2016. The workshop aims to bring together prominent scientists and chefs for two and a half days, in order to generate hypotheses and new principles to be exploited at the dining table.
