The next step in augmented reality: Electrify your taste buds
News article in SD Japan:
This week I had a chance to visit Dr. Adrian Cheok and his students at the Mixed Reality Lab at Keio University. The research they’re conducting is based on the notion that, in the future, technology will shift from today’s ‘Information Age’ to an ‘Experience Age’. Dr. Cheok predicts that we will experience the realities of other people, as opposed to just reading about them, listening to them, or watching a video on a glass screen.
Visiting the Mixed Reality Lab was a refreshing experience. I’ve come to associate terms like ‘Augmented Reality’ with things like Sekai Camera, or the fascinating Human Pacman game that his lab created a few years back [1]. But Dr. Cheok points out quite rightly, and perhaps surprisingly, that one of the earliest examples of AR was Sony’s Walkman, the first device that allowed people to have their own personal sounds with them all the time.
Beyond Sound and Vision
Once we accept the idea that augmented/mixed reality is not limited to vision, a whole world of possibilities opens up. And these are the possibilities that Dr. Cheok and his students are researching. He explains:
I became interested to see if we could extend augmented reality to other senses. To touch. At first I made a system for human-to-pet communication. We made a jacket for a chicken that allowed a person to convey touch to a chicken remotely. Then we made Huggy Pajama, which could be used to hug a child remotely [2].
While projects like this might strike us as a little strange, or even wacky, it’s important to note that they can be far more practical than you might think at first glance. A version of Huggy Pajama called T Jacket has subsequently been developed for therapeutic purposes. So, for example, a child with autism could be comforted remotely with hugs sent over the internet from a smartphone.
Readers may recall that we previously featured another remarkable haptic communication project from the Mixed Reality Lab called Ring-u. The idea is that vibrating messages can be sent back and forth over the internet between a pair of rings, and there is now a smartphone interface for the rings as well. This project has perhaps far larger potential in the consumer electronics space, and the team is speaking with toy companies and high-end jewelers about possible future developments.
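To give a rough sense of what ‘sending a vibration over the internet’ involves, here is a tiny illustrative sketch of two paired devices swapping vibration patterns through a relay. To be clear, this is my own simplified mock-up rather than Ring-u’s actual protocol; the message fields and the in-memory relay are assumptions made purely for illustration.

```python
# A minimal, hypothetical sketch of how a pair of networked haptic rings
# might exchange "vibration messages". This is NOT Ring-u's actual protocol;
# the message fields and the relay below are illustrative assumptions only.

import json
import time
from dataclasses import dataclass, asdict


@dataclass
class TouchMessage:
    sender: str      # which ring sent the squeeze or tap
    pattern: list    # vibration pattern as (duration_ms, intensity 0-1) pairs
    timestamp: float # when the touch happened


class Relay:
    """A toy in-memory stand-in for the internet relay between two rings."""

    def __init__(self):
        self.inbox = {}  # ring_id -> list of serialized messages

    def register(self, ring_id):
        self.inbox[ring_id] = []

    def send(self, to_ring, message: TouchMessage):
        # In a real system this would travel over the network, e.g. via a phone.
        self.inbox[to_ring].append(json.dumps(asdict(message)))

    def receive(self, ring_id):
        messages = [json.loads(m) for m in self.inbox[ring_id]]
        self.inbox[ring_id] = []
        return messages


if __name__ == "__main__":
    relay = Relay()
    relay.register("ring_a")
    relay.register("ring_b")

    # Ring A's wearer squeezes it: send a short double-buzz to Ring B.
    squeeze = TouchMessage(
        sender="ring_a",
        pattern=[(120, 0.8), (80, 0.0), (120, 0.8)],
        timestamp=time.time(),
    )
    relay.send("ring_b", squeeze)

    # Ring B polls the relay and would drive its vibration motor with the pattern.
    for msg in relay.receive("ring_b"):
        print("Ring B plays pattern:", msg["pattern"])
```

In practice the relay would be a real server and the pattern would drive a small vibration motor in the ring, but the overall flow (capture a touch, serialize it, deliver it to the paired device) is the same idea.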
Taste the Future
But perhaps the biggest challenge for Dr. Cheok and his team is figuring out how to digitize the two remaining senses:
Smell and taste are the least explored areas because they usually require chemicals. [But] we think they are important because they can directly affect emotion, mood, and memory, even in a subconscious way. But currently it’s difficult because things are still analog. This is like it was for music before the CD came along.
Amazingly, the team has developed a prototype electric taste machine, and I was lucky enough to try it out during my visit. The device in its current form is a small box with two protruding metal strips, between which you insert your tongue to experience a variety of tastes. It works by using electric current and temperature to communicate taste. For me, some tastes came through more strongly than others, with lemon and spicy being the strongest; I experienced what felt like only a fraction of the intended tastes, but it was very impressive nonetheless. I’m told that in the future this system could even take a lollipop-like form, which would certainly be very interesting.
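To make the concept a little more concrete, here is a rough sketch of how a controller for such a device might map taste labels to electrical and thermal settings. The parameter names and numbers below are purely illustrative guesses on my part, not the lab’s actual calibration, which would presumably be tuned per user.

```python
# A hypothetical sketch of how an electric-taste controller might map taste
# labels to stimulation settings. The fields and values are illustrative
# assumptions, not the Mixed Reality Lab's actual calibration.

from dataclasses import dataclass


@dataclass
class Stimulus:
    current_ua: int      # current through the electrodes, in microamps (assumed)
    frequency_hz: int    # pulse frequency of the stimulation (assumed)
    plate_temp_c: float  # temperature of the metal strips (assumed)


# Assumed example presets; real values would come from per-user calibration.
TASTE_PRESETS = {
    "sour":   Stimulus(current_ua=120, frequency_hz=250, plate_temp_c=25.0),
    "salty":  Stimulus(current_ua=80,  frequency_hz=100, plate_temp_c=25.0),
    "bitter": Stimulus(current_ua=100, frequency_hz=60,  plate_temp_c=22.0),
    "spicy":  Stimulus(current_ua=60,  frequency_hz=40,  plate_temp_c=38.0),
}


def stimulus_for(taste: str, intensity: float = 1.0) -> Stimulus:
    """Scale a preset by an intensity in [0, 1], clamping to an assumed safe cap."""
    base = TASTE_PRESETS[taste]
    intensity = max(0.0, min(1.0, intensity))
    return Stimulus(
        current_ua=min(int(base.current_ua * intensity), 150),  # assumed ceiling
        frequency_hz=base.frequency_hz,
        plate_temp_c=base.plate_temp_c,
    )


if __name__ == "__main__":
    # "Lemon" in the article reads like a strong sour preset at full intensity.
    print(stimulus_for("sour", intensity=1.0))
    print(stimulus_for("spicy", intensity=0.5))
```

Presets like ‘lemon’ or ‘spicy’ would simply be particular combinations of these parameters, which fits with my experience of some tastes coming through more strongly than others.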
The lab is also collaborating with Japanese startup ChatPerf, which you may recognize as the company that developed a smell-producing attachment for smartphones. Together they will conduct a formal academic study to see to what extent smell can affect communication between individuals. But even with ChatPerf, the creation of smells is still analog, using cartridges of liquid to emit odors. Further down the road, Dr. Cheok hopes to simulate smells in a non-chemical, digital way, noting that this could be done via magnetic stimulation of the olfactory bulb.
So while experiments like these sometimes draw plenty of laughs and raised eyebrows, the work is quite important in expanding how we see technology’s role in our lives.
These are just a few of the great projects that the Mixed Reality Lab is working on, and we hope to tell you about others in the future.
[1] It’s pretty amazing that they made this way back in 2009.
[2] For more information on this fun huggable chicken project, check out Adrian Cheok: Making a Huggable Internet over on IEEE Spectrum. A demo of Huggy Pajama can be found here.