This week I had a chance to visit Dr. Adrian Cheok and his students at the Mixed Reality Lab at Keio University. The research they’re conducting is based on the notion that in the future technology will shift from today’s ‘Information Age’ to an ‘Experience Age’. Dr. Cheok predicts that we will experience the realities of other people, as opposed to just reading about them, listening to them, or watching a video on a glass screen.
Visiting the Mixed Reality Lab was a refreshing experience. I’ve come to associate terms like ‘Augmented Reality’ with things like Sekai Camera, or the fascinating human Pac-man game that his lab created a few years back. But Dr. Cheok points out quite rightly – and perhaps surprisingly – that one of the earliest examples of AR was Sony’s Walkman, the first device that allowed people to have their own personal sounds with them all the time.
Beyond Sound and Vision
Once we accept the idea that augmented/mixed-reality is not just limited to vision, then it opens up a whole world of possibilities. And these are the possibilities that Dr. Cheok and his students are researching. He explains:
I became interested to see if we could extend augmented reality to other senses. To touch. At first I made a system for human-to-pet communication. We made a jacket for a chicken that allowed a person to convey touch to a chicken remotely. Then we made Huggy Pajama, which could be used to hug a child remotely.
While projects like this might strike us as a little strange — or even wacky — it’s important to note that such projects can be far more practical than you might think at first glance. A version of Huggy Pajama called T Jacket has subsequently been developed for therapeutic purposes. So, for example, a child with autism could be comforted remotely with hugs sent over the internet via smartphone.
Readers may recall that we previously featured another remarkable haptic communication project from the Mixed Reality Lab called Ring-u. The idea here is that vibrating messages can be sent over the internet, back and forth between a pair of rings, and there is now a smartphone interface for the ring as well. This project has perhaps far larger potential in the consumer electronics space, and they’re speaking with toy companies and high-end jewelers about possible future developments.
Taste the Future
But perhaps the biggest challenge for Dr. Cheok and his team is figuring out how to digitize the other two remaining senses:
Smell and taste are the least explored areas because they usually require chemicals. [But] we think they are important because they can directly affect emotion, mood, and memory, even in a subconscious way. But currently it’s difficult because things are still analog. This is like it was for music before the CD came along.
Amazingly, the team has developed a prototype electric taste machine, and I was lucky to be able to try it out during my visit. The device in its current form is a small box with two protruding metal strips, between which you insert your tongue to experience a variety of tastes. For me some were stronger than others, with lemon and spicy being the strongest. It works by using electric current and temperature to communicate taste. I experienced what felt like only a fraction of the intended tastes, but it was very impressive nonetheless. I’m told that in the future this system could even assume a lollipop-like form, which would certainly be very interesting.
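One way to picture how such a device could be driven in software is as a lookup from a target taste to a pair of stimulation parameters, current and temperature. The lab's actual control scheme and values are not public; the names and numbers below are purely illustrative assumptions:

```python
# Hypothetical sketch: mapping target tastes to electrode parameters.
# The real device's parameters and interface are not public; these
# values are invented for illustration only.

TASTE_PARAMS = {
    # taste: (current in microamps, temperature in degrees C)
    "lemon": (180, 25),
    "spicy": (150, 35),
    "sweet": (80, 30),
    "salty": (120, 22),
}

def stimulus_for(taste: str) -> tuple[int, int]:
    """Look up the (current, temperature) pair for a requested taste."""
    if taste not in TASTE_PARAMS:
        raise ValueError(f"no parameters defined for taste: {taste!r}")
    return TASTE_PARAMS[taste]

current_ua, temp_c = stimulus_for("lemon")
print(current_ua, temp_c)  # 180 25
```

The point of the sketch is simply that once taste is parameterized this way, it becomes digital data that can be stored and transmitted like any other.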
Electric taste machine
The lab is also collaborating with Japanese startup ChatPerf, which you may recognize as the company that developed a smell-producing attachment for smartphones. They will also conduct a formal academic study to see to what level smell can affect communication between individuals. But even with ChatPerf, the creation of smells is still analog, using cartridges of liquid to emit odors. Later on, Dr. Cheok hopes to simulate smells in a non-chemical, digital way, noting that it can be done via magnetic stimulation of the olfactory bulb.
So while experiments like these may draw laughs and raised eyebrows, the work is quite important in expanding how we see technology’s role in our lives.
These are just a few of the great projects that the Mixed Reality Lab is working on, and we hope to tell you about others in the future.
Professor Adrian Cheok of Keio University’s Mixed Reality Lab has been a pioneer in blending the internet and the physical world, producing creations like Petimo, which allows kids to send hugs to each other over the internet; and Huggy Pajama, a similar solution for kids whose parent might be away. Projects from Mixed Reality Lab emphasize the importance of physical touch in a world where communication is drifting away from that particular sense.
Professor Cheok now has a new project in the works that iterates on this philosophy of blending physical touch with the internet. The RingU is another device that transmits these internet hugs, but it does so in a far more compact form. Using a ring connected to your smartphone by Bluetooth, signals can be sent to a paired ring over the internet. You can see a quick overview of RingU in the video below.
When you want to communicate a sort of virtual hug to your partner — and it could be a family member, a lover, or just a good friend — you squeeze the ring, and your partner will receive this ‘tele-hug’ in real time. So even when separated by huge distances, you know that a person far away is thinking of you at that very moment. There are even different types of hugs which you can send — mini, intense, and urgent — depending on the situation.
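The flow described above — a squeeze on one ring relayed over the internet and played back as a vibration on the paired ring — can be sketched as a simple message exchange. RingU's real protocol is not public, so the message format, function names, and transport here are all assumptions for illustration; only the three hug types come from the description above:

```python
import json
import time

# Hypothetical message format for a squeeze event relayed between
# paired rings. Illustrative only; not RingU's actual protocol.
HUG_TYPES = {"mini", "intense", "urgent"}

def make_hug_message(sender: str, receiver: str, hug_type: str) -> str:
    """On the sender's phone: serialize a 'tele-hug' event for relay over the internet."""
    if hug_type not in HUG_TYPES:
        raise ValueError(f"unknown hug type: {hug_type!r}")
    event = {
        "from": sender,
        "to": receiver,
        "type": hug_type,
        "timestamp": time.time(),  # lets the receiver see it arrived in real time
    }
    return json.dumps(event)

def handle_hug_message(raw: str) -> str:
    """On the receiver's phone: decode the event into a vibration command for the ring."""
    event = json.loads(raw)
    # In a real system this command would be written to the paired ring
    # over Bluetooth, triggering the matching vibration pattern.
    return f"vibrate:{event['type']}"

msg = make_hug_message("alice", "bob", "intense")
print(handle_hug_message(msg))  # vibrate:intense
```

Keeping the hug type in the message is what allows the receiving ring to distinguish a mini hug from an intense or urgent one.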
You can also control the color of your partner’s ring according to whatever emotion you’re feeling at the time. An accompanying mobile app lets partners use a private social network to share messages, photos, and videos, complementing your physical hugs with the other types of communication you’ve become used to in this mobile age.
In the near future there will be no place for average professors and average universities. The internet will mean everyone from Boston to Botswana can directly get the world’s best education from Harvard, MIT, Yale, and the like. On the other hand, there will be new leaders: internet-age universities that embrace the internet (an Amazon of universities). The academics who are trying to fight this new reality are like the record companies that tried to fight music going online.
Although we are now in the age of the Internet, our schools are still stuck in the industrial age. As a result, the gap between our schools and reality is widening and could end in total disruption.
There is a clear link between our schools and the factories of the industrial age. In the production line system developed in the 19th and 20th centuries, each individual had to work at the pace of the industrial process, completing repetitive tasks, and was often banned from speaking.
The current school system is eerily similar. Students move along a linear progression of years, semesters and subjects. Every student studies at the same pace, receives grades and takes exams at the same time. If you excel at maths, you are likely to get bored. If you are bad at maths, you are likely to receive bad grades. No matter, everyone must move straight along the production line and repeat the same task over and over again to pass the exam. In class, you are not allowed to talk but must sit passively and let the teacher transfer information at a set speed.
It is not surprising that schools are modelled on the production line. Society, government and businesses needed manpower for the factories and companies of the industrial age. They set up systems that moulded workers into such manpower.
This model is archaic and unsuited for the Internet age, the age of knowledge. Firstly, we do not need factory workers – we need entrepreneurs, inventors, creative business people and designers. It is difficult to compete in global manufacturing. We can compete only in high value-added sectors such as new products, new services and creative industries.
Secondly, the Internet age allows us to discard the linear model. We have the tools and the ability to learn at our own pace. In fact, we can revive some educational practices of the pre-industrial age, such as the apprentice system. Each person keeps working on something until he or she masters it. A maths exam need not be set for the whole class on a specific day. Instead, students can be given continuous online mini tests. When they have mastered one topic, they move on to the next at their own pace.
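The mastery-based progression described above can be sketched as a simple loop: a student keeps taking online mini tests on the current topic and advances only once a mastery criterion is met. The streak threshold and topic list below are illustrative assumptions, not a real curriculum:

```python
# Sketch of mastery-based progression: a student moves to the next
# topic only after passing enough consecutive mini tests. The threshold
# and topics are illustrative assumptions.

MASTERY_STREAK = 3  # consecutive passes required to "master" a topic

def next_topic_index(current: int, streak: int, passed: bool) -> tuple[int, int]:
    """Update (topic index, pass streak) after one mini-test result."""
    streak = streak + 1 if passed else 0
    if streak >= MASTERY_STREAK:
        return current + 1, 0  # mastered: advance at the student's own pace
    return current, streak

topics = ["fractions", "decimals", "percentages"]
idx, streak = 0, 0
# A student passes three mini tests in a row on fractions...
for result in [True, True, True]:
    idx, streak = next_topic_index(idx, streak, result)
print(topics[idx])  # decimals
```

The key contrast with the production-line model is that the loop has no calendar in it: progression depends only on demonstrated mastery, not on a date shared by the whole class.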
The main obstacles to implementing such a new model are the inertia and conservatism of the education sector. However, just like every other industry, education is being disrupted and revolutionized by the Internet. Classes and lectures will go online. Students can view them at their own pace and be evaluated interactively.
Students will be much happier because they can study independently and test their limits (this is how video games work, and games are a good model for learning). Homework, on the other hand, will be done in classrooms and lecture halls. Being physically together will be all about solving problems, doing projects, learning through practical tasks, and working in teams with other students and teachers.
Learning and knowledge production will be done simultaneously. This is much more suited to the great technological and social changes of the 21st century. We need to learn more about tacit knowledge rather than explicit knowledge. Explicit knowledge becomes rapidly out of date when technology is changing so quickly. Tacit knowledge helps us to deal with such change. So does learning by doing and working in teams.
KOLLABORATE.IO 93% of all human communication is visual but most online collaboration solutions are text-based. Until now. Kollaborate introduces real-time visual collaboration without the hassle.
PRESENTATION.IO Present in real time to anyone on any device. No downloads, no installations: you simply move through your slides, and they change on all connected devices at the same time.
REAKTIFY A realtime feedback analytics tool. Google Analytics tells you what happened on your site, Kissmetrics tells you who did it, Reaktify tells you why.
Assemblage was founded on one simple quest: to make it easy for people and companies to collaborate online with multiple people at the same time. Since that first spark of an idea in 2011, Assemblage products have gone on to help companies and people in over 140 countries around the world to work together in real time on the web.
Adrian Cheok upon appointment as Advisor said: “My interest is in the future of internet where we will have multisensory communication with all the five senses. Assemblage is helping to increase experience communication.”
Ars Futura: The Art and Design of Our Digital Futures
Moderator: Scott Fisher Panelists: Adrian David Cheok, Dooeun Choi, Alex McDowell
Please join the University of Southern California at the 2013 USC Global Conference in Seoul, South Korea, taking place May 23-25, 2013. The conference will take place at the Grand Hyatt Seoul and will reflect on the interrelated themes of science, technology and health; global business, international stability and the rule of law; and education, the arts and cultural institutions.
Online chat with only voice or video is outdated! Now you’re able to hug, kiss, smell, or even taste your buddies remotely! Following Adrian, we experienced a fascinating multisensory tour.
The tour started with his early study, the Real World Pacman. With a pair of glasses, real humans become Pac-man, picking up cookies on real-world streets. But this is just a warm-up. Buckle up, more surprises are coming!
Following the first sense, sight, one of Adrian’s students used sound to transform a regular umbrella into a katana! Different sound effects are produced by swinging the umbrella in different ways, and certain movement combinations trigger special effects, such as an explosion. With only sound added, an umbrella becomes much more fun!
The presentation gradually reached its peak when the sense of touch was brought up. With a jacket, or just a ring, you can give your loved one a remote hug. As reported by IEEE Spectrum, Adrian is making a huggable internet! In fact, it’s not just huggable, but also kissable! Kissenger blew everybody’s mind last night.
The tour didn’t stop at the sense of touch. It continued to amaze everyone with digital smell and electrical and thermal taste.
Now with these possibilities, how can we incorporate all five senses into online teaching and learning? Maybe an online “better kisser” course is a good start? The session also sparked some inspiring and funny chats with the audience. This is definitely this year’s “must attend” session! The public can access a recording of the session here: http://squirrel.adobeconnect.com/p86lgb32brf/