Magnetic Table

posted in: Media | 0

By Nur Ellyza Binti Abd Rahman*, Azhri Azhar*, Murtadha Bazli, Kevin Bielawski, Kasun Karunanayaka, Adrian David Cheok

 

 

 

In our daily life, we use the five basic senses to see, touch, hear, taste and smell. By engaging several of these senses concurrently, multisensory interfaces create immersive and playful experiences, and as a result they have become a popular topic in academic research. Virtual Food Court (Kjartan Nordbo et al., 2015), Meta Cookie (Takuji Narumi et al., 2010) and Co-dining (Jun Wei et al., 2012) are a few interesting prior works in the field. Michel et al. (2015) revealed that dynamic changes in the weight of cutlery influence the user's perception and enjoyment of food: the heavier the utensils, the more the flavour is enhanced. In line with this, we present a new multisensory dining interface called ‘Magnetic Dining Table and Magnetic Foods’.

‘Magnetic Dining Table and Magnetic Foods’ introduces new human-food interaction experiences by controlling utensils and food on the table: modifying their weight, levitating, moving and rotating them, and dynamically changing their shapes (food only). The proposed system is divided into two parts: a controlling part and a controlled part. The controlling part consists of three components: 1) the dining table, 2) an array of electromagnets and 3) a controller circuit. The controlled part consists of two components: 1) magnetic utensils and 2) magnetic foods. The array of electromagnets is placed underneath the table, and the controller circuit controls the field produced by each electromagnet, thereby indirectly controlling the utensils and food on the table. To make edible magnetic food, ferromagnetic materials such as iron and iron oxides (Alexis Little, 2016) will be used. We expect that this interface will positively modify taste and smell sensations, food consumption behaviours, and human-food interaction experiences.
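As a rough illustration of the controlling part, the sketch below maps a desired position on the tabletop to the nearest coil in the electromagnet array and a PWM duty cycle for the controller circuit. The coil pitch, grid size, and duty-cycle model are our assumptions for illustration only; the actual controller design is not described in detail above.

```python
# Hypothetical sketch of the electromagnet-array control logic.
# COIL_PITCH_MM, GRID_COLS and GRID_ROWS are assumed values, not measurements
# from the actual Magnetic Dining Table.

COIL_PITCH_MM = 30.0   # assumed distance between adjacent coil centres
GRID_COLS = 8          # assumed grid dimensions under the tabletop
GRID_ROWS = 8

def nearest_coil(x_mm: float, y_mm: float) -> tuple:
    """Return (row, col) of the coil closest to a point on the table surface."""
    col = min(GRID_COLS - 1, max(0, round(x_mm / COIL_PITCH_MM)))
    row = min(GRID_ROWS - 1, max(0, round(y_mm / COIL_PITCH_MM)))
    return row, col

def duty_for_force(force_fraction: float) -> float:
    """Clamp a requested force (0 = off, 1 = maximum pull) to a PWM duty cycle."""
    return min(1.0, max(0.0, force_fraction))

def field_map(x_mm: float, y_mm: float, force_fraction: float):
    """Duty-cycle map for the whole array: energise only the nearest coil."""
    duties = [[0.0] * GRID_COLS for _ in range(GRID_ROWS)]
    row, col = nearest_coil(x_mm, y_mm)
    duties[row][col] = duty_for_force(force_fraction)
    return duties
```

In a real system the controller would sweep such a map over time to move or rotate a utensil, energising neighbouring coils in sequence.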

 

Magnetic Dining Table and Magnetic Foods

Using mobiles to smell: How technology is giving us our senses | The New Economy Videos



By The New Economy – February 11th, 2014

http://www.theneweconomy.com/videos/using-mobiles-to-smell-how-technology-is-giving-us-our-senses-video

The New Economy interviews Professor Adrian Cheok of City University London to find out about a new technology that will allow people to taste and smell through their mobile phones

Scientists are coming increasingly closer to developing technology that will allow us to use all our senses on the internet. Professor Adrian Cheok, City University London, explains how mobile phones will soon allow people to taste and smell, what the commercial benefits of this technology might be, and just how much it’s going to cost

The online world: it’s all about the visual and sound experience, but with the three other senses missing, it can leave us short. Studies have demonstrated that more than half of human communication is non-verbal, so scientists are working on ways to communicate taste, touch, and smell over the internet. I’ve come to City University London to meet Professor Adrian Cheok, who’s at the forefront of augmented reality, with new technology that will allow you to taste and smell through a mobile phone.

The New Economy: Adrian, this sounds completely unbelievable. Have you really found a way to transmit taste and smells via a mobile device?

Adrian Cheok: Yes, in our laboratory research we’ve been making devices which can connect electrical and digital signals to your tongue, as well as your nose. So for example, for taste we’ve created a device which you put on your tongue, and it has electrodes. What those do is artificially excite your taste receptors. So certain electrical signals will excite the receptors, and that will produce artificial taste sensations in your brain. So you will be able to experience, for example, salty, sweet, sour, bitter – the basic tastes on your tongue – without any chemicals.

And with smell we’re going in a couple of tracks. One is using chemicals, it’s a device that you can attach to your mobile phone, and these devices will emit chemicals. So that means that with apps and software on your phone, you can send someone a smell message. For example, you might get a message on Facebook, and it can send the smell of a flower. Or if your friend’s not in a very good mood it might be a bitter smell.

So the next stage of that is, we’re making devices which will have electrical and magnetic signals being transmitted to your olfactory bulb, which is behind your nose. It’ll be a device which you put in the back of your mouth, it will have magnetic coils, and similar to the electrical taste actuation, it will excite the olfactory bulb using electrical currents. And then this will produce an artificial smell sensation in your brain.

Already scientists have been able to connect optical fibre to neurons of mice, and that means that we can connect electrical signals to neurons.

With the rate of change, for example with Moore’s Law, you get exponential increase in technology. I think within our lifetimes we’re going to see direct brain interface. So in fact what you will get is essentially, you can connect all these signals directly to your brain, and then you will be able to experience a virtual reality without any of these external devices. But essentially connecting to the neural sensors of your brain. And of course that also connects to the internet. So essentially what we will have is direct internet connection to our brain. And I think that will be something we will see in our lifetime.

The New Economy: So direct brain interface – that sounds kind of dangerous. I mean, could there be any side-effects?

Adrian Cheok: Well we’re still at the very early stages now. So scientists could connect, for example, one optical fibre to the neuron of a mouse. And so what it has shown is that we can actually connect the biological world of brains to the digital world, which is computers.

Of course, this is still at an extremely early stage now. You know, the bio-engineers can connect one single neuron, so, we’re not anywhere near that level where we can actually connect to humans. You would have to deal with a lot of ethical and also privacy, social issues, risk issues.

Now if you have a virus on your computer, the worst it can do is cause your computer to crash. But you know, you could imagine a worst case: someone could reprogram your brain. So we’d have to think very carefully.

The New Economy: Well why is it important to offer smell over the internet?

Adrian Cheok: Fundamentally, smell and taste are the only two senses which are directly connected to the limbic system of the brain. And the limbic system of the brain is the part of the brain responsible for emotion and memory. So it is true that smell and taste can directly and subconsciously trigger your emotion, trigger your memory.

Now that we’re in the internet age, where more and more of our communication is through the internet, through the digital world, we must bring those different senses – touch, taste and smell – to the internet, so that we can have a much more emotional sense of presence.

The New Economy: What will this be used for?

Adrian Cheok: Like all media, people want to recreate the real world. When cinema came out, people were filming, you know, scenes of city streets. To be able to capture that on film was quite amazing. But as the media developed, then it became a new kind of expression. And I believe it will be the same for the taste and smell media. Now that it’s introduced, at first people will just want to recreate smell at a distance. So for example, you want to send someone the smell of flowers, so Valentine’s Day for example, maybe you can’t meet your lover or your friend, but you can send the virtual roses, and the virtual smell of the roses to his or her mobile phone.

At the next stage it will lead to, I think, new kinds of creation. For example, music before; if you wanted to play music, you needed to play with an instrument, like a violin or a guitar. But now the young people can compose music completely digitally. Even there’s applications on your mobile phone, you can compose music with your finger, and it’s really professional. Similarly, that will be for smell and taste. We’ll go beyond just recreating the real world to making new kinds of creation.

The New Economy: So will it also have a commercial use?

Adrian Cheok: For advertising, because smell is a way to trigger emotions and memory subconsciously. Now, you can shut your eyes, and you can block your ears, but it’s very rare that you ever block your nose, because you can’t breathe properly! So people don’t block their nose, and that means advertising can always be channelled to your nose. And also we can directly trigger memory or an emotion. That’s very powerful.

We received interest from one of the major food manufacturers, and we’re having a meeting again soon. They make frozen food, and the difficulty to sell frozen food is, you can’t smell it. You just see these boxes in the freezer, but because it’s frozen, there’s no smell. But they want to have our devices so that when you pick up the frozen food maybe it’s like a lasagne, well you can have a really nice smell of what it would be.

The New Economy: How expensive will this be?

Adrian Cheok: We’re aiming to make devices which are going to be cheap, because I think only by being very cheap can you make mass-market devices. So our current device actually costs only a few dollars to manufacture.

The New Economy: Adrian, thank you.

Adrian Cheok: Thank you very much.

 

Vegetables Can Taste Like Chocolate by Adrian Cheok, Director of Imagineering Institute


 

Adrian David Cheok is Director of the Imagineering Institute, Malaysia, and Chair Professor of Pervasive Computing at City University London.

He is the Founder and Director of the Mixed Reality Lab, Singapore. His research focuses on multi-sensory internet communication, mixed reality, pervasive and ubiquitous computing, human-computer interfaces, and wearable computing. Today he talks about how the internet connects us and what we can do by blending reality, our senses, and the internet.

Adrian Cheok is a 2016 Distinguished Alumni Award recipient in recognition of his achievements and contribution in the field of computing, engineering and multisensory communication.


Adrian is a pioneer in mixed reality and multisensory communication; his innovation and leadership have been recognised internationally through multiple awards.
Some of his pioneering works in mixed reality include innovative and interactive games such as ‘3dlive’, ‘Human Pacman’ and ‘Huggy Pajama’. He is also the inventor of the world’s first electric and thermal taste machine, which produces virtual tastes with electric current and thermal energy.


Olfactometer


By Adrian David Cheok, Kasun Karunanayaka, Halimahtuss Saadiah, Hamizah Sharoom

 

 

Many of the olfactometer implementations available today are expensive and complex to use. This project aims to develop a simple, low-cost, and easily movable laboratory olfactometer that can be used as a support tool for a wider range of experiments in smell, taste, psychology, neuroscience, and fMRI. Generally, olfactometers use two types of olfactants: solid or liquid odors. Our laboratory olfactometer (as shown in Figure 1) will support liquid-based odors, and we may later extend it to handle solid odors. We are also considering developing the system into a combined olfactometer and gustometer.

 

In this olfactometer design, we utilize continuous flow (the Lorig design) for good temporal control. The Lorig design is simpler and cheaper than other designs because it uses a minimal number of parts (Lundstrom et al., 2010). Our olfactometer has 8 output channels that produce aromas in a precise and controlled manner. It also produces a constant humidified flow of pure air. The laboratory olfactometer consists of an oil-less piston air compressor, a filter regulator and mist separator, 2-color display digital flow switches, check valves, solenoid valves, a manifold, TRIVOT glass tubes, connectors, gas hose clips, and PU tubing. The control system consists of an Arduino Pro Mini, a UART converter, a USB cable, and a solenoid driver circuit.

 

The air supply from the oil-less piston air compressor plays an important part in delivering the odor to the subject's nose at a constant air pressure. The filter regulator combined with the mist separator then ensures that the air is clean and free of contaminants. After the filter, the airflow is metered through a 2-color display digital flow switch. Check valves are connected after the flowmeter to ensure that the air flows in only one direction. After the check valve, the tubing is connected through 9 male fitting connectors that fit directly into the 8 outputs of the manifold. Then, 8 normally closed solenoid valves are connected to the top of the manifold. The olfactometer can be controlled manually from a computer to send an odour to the nose. A second 2-color display digital flow switch is connected after the solenoids to make sure the air flows at around 3 to 5 LPM (to avoid any discomfort to the subject's nose). When a solenoid valve is switched on by the computer, air passes through to the glass bottles, which can be verified by the air bubbling inside the bottles. Finally, the airflow picks up the odor from the liquid or solid and passes through a check valve before reaching the subject's nose.
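The valve-sequencing step above can be sketched as simple control logic. Only the 8-channel layout and the 3-5 LPM comfort range come from the description; the function names and channel numbering are our own assumptions, and a real implementation would drive the solenoids from the Arduino Pro Mini.

```python
# Hypothetical sketch of the olfactometer's valve-control logic.
# 8 normally closed solenoid valves sit on the manifold; the flow switch
# reading should stay within 3-5 LPM for the subject's comfort.

NUM_CHANNELS = 8
FLOW_MIN_LPM = 3.0
FLOW_MAX_LPM = 5.0

def valve_states(channel: int) -> list:
    """Open exactly one odor channel; every other valve stays closed."""
    if not 0 <= channel < NUM_CHANNELS:
        raise ValueError("channel must be in 0..%d" % (NUM_CHANNELS - 1))
    return [i == channel for i in range(NUM_CHANNELS)]

def flow_ok(measured_lpm: float) -> bool:
    """Check the digital flow switch reading against the comfort range."""
    return FLOW_MIN_LPM <= measured_lpm <= FLOW_MAX_LPM
```

For example, `valve_states(2)` opens only the third bottle's valve; if `flow_ok` returns False, the controller should close the valve rather than deliver an uncomfortable flow.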

 

The Laboratory Olfactometer


Bench of Multi-sensory Memories


By Stefania Sini, Nur Ain Mustafa, Hamizah Anuar, Adrian David Cheok

 

 

What if cities had dedicated urban interfaces in public spaces that invite people to share stories and memories of public interest, and facilitate the creation of a public narration? What if people could share and access these stories and memories while chatting with a public bench? Would the interaction with the bench provide a meaningful, memorable and playful experience of a place?

 

The Bench of Multi-sensory Memories is an urban interface whose objective is to investigate the role of urban media in placemaking. It mediates the creation of a public narration, and affords citizens a playful and engaging interface to access and generate the stories and memories that form this narration.

 

The bench was designed and fabricated in collaboration with the Malaysian artist Alvin Tan, who has experience with bamboo installations in public spaces. Its structure is robust, and it allows all the hardware components to be housed easily and safely. The hardware and software system consists of: a) input devices, a USB microphone and Force Sensitive Resistor (FSR) sensors; b) an Analog-Digital/Digital-Analog (AD/DA) converter module board; c) a microcontroller, a Raspberry Pi 3; d) an output device, a speaker; and e) the software, the Google Speech API. The components operate as follows: the FSR sensors detect the presence of a person on the bench through physical pressure and weight; the AD/DA converter module board reads the analogue values of the FSR sensors and converts them into digital values readable by the microcontroller; and the microcontroller, which has advanced features such as Wi-Fi, Ethernet, Bluetooth, USB, HDMI and an audio jack, easily connects the input and output devices. Currently, the system software implements speech applications such as text-to-speech and speech-to-text: the Google Speech API generates a voice from text, records the visitor's voice, and translates the speech into text through the output and input devices. Therefore, at the moment, the system performs a scripted sequence that includes text-to-speech and speech-to-text translations. In the short term, we plan to employ a custom chatbot that can conduct interactive and meaningful conversations.
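The presence-detection step described above might look something like the following sketch. The ADC threshold and the consecutive-sample debounce are our assumptions for illustration; the actual values read from the AD/DA converter board are not documented here.

```python
# Hypothetical sketch of the bench's presence-detection logic.
# PRESSURE_THRESHOLD is an assumed digital ADC value; real FSR readings
# depend on the converter board's resolution and the sensor placement.

PRESSURE_THRESHOLD = 600   # assumed ADC value indicating someone sat down
DEBOUNCE_SAMPLES = 3       # require several consecutive readings to avoid flicker

def person_present(samples: list) -> bool:
    """True once the last DEBOUNCE_SAMPLES readings all exceed the threshold."""
    if len(samples) < DEBOUNCE_SAMPLES:
        return False
    return all(s > PRESSURE_THRESHOLD for s in samples[-DEBOUNCE_SAMPLES:])
```

When `person_present` fires, the Raspberry Pi would start the scripted text-to-speech greeting and begin recording the visitor's reply for speech-to-text translation.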

 

Bench of Multi-sensory Memories

Multi-sensory Story Book For Visually Impaired Children


By Edirisinghe Chamari, Kasun Karunanayaka, Norhidayati Podari, Adrian David Cheok

The experience of reading for children is enriched by visual displays. Researchers suggest that through picture-book experiences, children develop socially, intellectually, and culturally. However, the beauty of reading is an experience that sighted children naturally indulge in, and that visually impaired children struggle with. Our multi-sensory book is an attempt to create a novel reading experience specifically for visually impaired children. While a sighted person's mental imagery is constructed through visual experiences, a visually impaired person's mental images are a product of haptics and sounds. Our book introduces multi-sensory interactions through touch, smell, and sound. The concept also aims to address a certain lack of appropriately designed technologies for visually impaired children.

 

Our book, titled “Alice and her Friend”, folds out to reveal a story about a cat whose activities are presented with multi-sensory interactions. There are six pages in this book, with different sensors and actuators integrated into each page. The pages were designed with textures, braille, large-font text, sounds, and smell. With this book, we believe we have contributed a new reading experience to the efforts of visually impaired children to understand the beauty of the world.

 

A Picture Book for Visually Impaired Children

French woman wants to marry a robot as expert predicts sex robots to become preferable to humans



By NZ Herald – December 24, 2016.

http://www.nzherald.co.nz/lifestyle/news/article.cfm?c_id=6&objectid=11772407

Human-robot marriages may become commonplace by 2050 if not before. Photo / 123RF

 

On the surface, Lilly seems like a blushing young woman ready to marry the man of her dreams who makes her “totally happy.”

Only her partner is a 3D-printed robot named InMoovator, whom she designed herself after realising she was attracted to “humanoid robots generally” rather than other people.

“I’m really and totally happy,” she told news.com.au over email in her tentative English. “Our relationship will get better and better as technology evolves.”

The “proud robosexual” said she always loved the voices of robots as a child but realised at 19 she was sexually attracted to them as well. Physical relationships with other men confirmed the matter.

“I’m really and only attracted by the robots,” she said. “My only two relationships with men have confirmed my love orientation, because I dislike really physical contact with human flesh.”

She has since built her own dream man with open-source technology from a French company, and has lived with him for one year. They are ‘engaged’ and plan to marry when robot-human marriage is legalised in France.

The unconventional relationship has been accepted by family and friends but she said “some understand better than others.”

She won’t reveal whether they have a sexual relationship and is currently in training to become a roboticist in order to take her passion into her everyday life.

While Lilly’s views will strike many as odd, it’s just a sign of things to come according to David Levy.

The chess whiz and authority on Love and Sex with Robots said he expects human-robot marriages to become commonplace by 2050 if not before.

Speaking at the second conference on the issue held in London this week, Mr Levy told a room filled with academics and interested people that advances in artificial intelligence mean robots could become “enormously appealing” partners within the next few decades.

“The future has a habit of laughing at you. If you think love and sex with robots is not going to happen in your lifetime, I think you’re wrong.”

“The first human robot marriages will take place around the year 2050 or sooner but not longer,” he said.

The conference explored a host of issues on the subject including everything from what robots should look like to whether they should be able to “learn” about sexual preferences and feed back information to companies behind them.

University of London Computing Professor Adrian David Cheok said he believes robots will not only become common, but preferable for many people.

“It’s going to be so much easier, so much more convenient to have sex with a robot. You can have exactly what kind of sex you want. That’s going to be the future. That we will have more sex with robots and the next stage is love … we’re already seeing it.”

“Actual sex with humans may be like going to a concert. When you’re at home you can listen to Beethoven’s ninth symphony, it’s good enough and once or twice a year you’ll want to go the Royal Albert Hall and hear it in a concert hall.

“That may be the way sex with humans is going to be. It’s going to be much more easier, much more convenient to have sex with a robot, and maybe much better because that’s how you want it.”

When bitter tastes sweet, seeming is believing



By David Mitchell, October 16, 2016 – The Guardian

https://www.theguardian.com/commentisfree/2016/oct/16/seeming-is-believing-taste-buddy-foreign-students-theresa-may

A man experiments with the Taste Buddy, which emits thermal and electric signals to stimulate the taste buds. Photograph: Professor Adrian Cheok/PA

 

The time has come to loosen our grip on reality. All the signs are there. Millennia of booze and drug abuse, hundreds of conflicting religions and cults and superstitions and alternative medicines and conspiracy theories, the premise of the Matrix franchise, the internet, sunglasses, video games and the powerfully convincing anti-intellectualism of Michael Gove. They’re all saying the same thing: ignore what’s really happening and you’ll feel a lot better. It’s been staring us in the face: we need to close our eyes to what’s staring us in the face.

And there’s been a huge breakthrough in this direction. They’re calling it the “Taste Buddy”, but that’s because they’re awful and cheesy and the less we have to perceive their existence, the happier we’ll be. And the Taste Buddy will help separate our perceptions from that sour reality. Particularly our perception of cheesiness, which we should soon be able to precisely regulate using a computer.

The Taste Buddy, which was unveiled last week, is a new invention, still in its prototype stage, that changes our sense of what things taste like by emitting thermal and electric signals that stimulate, or rather delude, the taste buds. Currently it can only make things seem saltier or sweeter than they are, but the team behind it, led by Adrian Cheok of London University, believes that, with development, it could go much further. If built into pieces of cutlery, it “could allow children to eat vegetables that taste like chocolate”; it could make tofu taste like steak; basically, it could make healthy things taste like delicious things.

“But healthy things are delicious!” you may be saying. And therein lies the problem. Not that healthy things actually are delicious – that’s patently not true. Sometimes it might seem like they are – nuts, for example, often give this impression – and then you discover the deliciousness is all because of some salt or sugar or duck fat that’s been added in cardiovascularly hazardous quantities. Healthy things are delicious if either a) they’re deep fried, or b) there’s nothing else to eat. Couscous salad is much better than no food at all but, on the modern culinary battlefield, it’s a mere flint-headed arrow to the state-of-the-art cruise missile that is a fried egg sandwich.

No, the challenge for the Taste Buddy is not that lentils actually are tastier than chips, but that some people say they are and, in some cases, come to believe it. Their own mental powers of self-delusion rival Taste Buddy’s thermal and electric trickery. And that’s because many people define their identities by their eating choices.

Whether consciously or not, some healthy eaters’ healthy eating is primarily an expression of control, cleanliness and virtue. It doesn’t just make them feel better, it makes them feel better than other people. If eating steamed broccoli is suddenly no hardship, because it can be made to taste like baked Alaska, they’re going to be deeply offended. It would be like offering a devout order of self-flagellating monks an inexhaustible supply of local anaesthetic.

Frankly, Taste Buddy will be seen as cheating. These penitents won’t like it that those of us with coarse, lifespan-reducing palates will get the benefit of nutrients we haven’t earned, now that gruel is no longer gruelling. A market will immediately open up for some scientists to discover that it’s actually tasting the lettuce rather than swallowing it that matters most.

The most rabid salad eaters and the haute cuisine sector will combine to incentivise anyone who’ll claim “there are still no shortcuts” when it comes to eating well, that the brain needs the taste of roughage, or just that Taste Buddy might give you tongue cancer. Which, I suppose, it might. As might a sexist joke on a lolly stick.

Illustration by David Foldvari

And maybe they’d have a point. A spoonful of sugar may help the medicine go down, but it probably screws up the placebo effect. Who knows how crucial those feelings of sacrifice, self-denial and moral superiority (lost for ever if Taste Buddy turned everything delicious) actually are to the health-enhancing powers of a balanced diet. In a carefully conducted study, it could probably be measured. But that sounds rather elitist, doesn’t it? Measuring things with cold objectivity, as if that can ever matter as much as a sincere conviction of the heart.

If you think that’s all a bit touchy-feely, or tasty-thinky, you may be surprised to learn it’s an approach Theresa May is very keen on. Last week the Times reported that the Home Office was concealing a report it had commissioned into the number of foreign students who break the terms of their visas and remain in Britain illicitly after their courses have finished. The number the report had come up with was about 1,500 annually, rather than the tens of thousands that had previously been estimated and generally bandied about. That was not what the Home Office, or the prime minister, wanted to hear.

Why not? It’s good news, isn’t it? Well not if you’ve just cracked down on the admission of foreigners to British universities, with potentially disastrous consequences for the latter’s funding. The notion that this drastic policy might have almost no effect on reducing net immigration was extremely unwelcome and, the government clearly felt, best kept quiet.

Particularly as, among likely Tory voters, there’s a broad perception that foreign students stay here and scrounge. Many people feel that feckless young foreigners are dragging us down and the government has come up with a harsh little policy to address that. Why let the fact that it’s not true get in the way?

Surely, Theresa May must think, it’s not the business of government to start telling the public it’s wrong. In an increasingly virtual world, feelings are as valid as facts. Let’s focus on what people perceive to be the case and concentrate on adding to that a perception that something is being done about it. That’s efficient democratic accountability for post-truth Britain.

No need to contradict people about what they reckon is going on, denying problems they believe exist and citing others they were previously untroubled by. Policy doesn’t need to reflect reality any more than the currency needs to be backed by gold. Just listen to their fears, confirm them and then use them to make the government seem vital. People will swallow anything if you control how it tastes.
