You Might Be Able To Marry A Robot By 2050



By Shannon Ullman – February 21, 2017 – YourTango.

http://www.yourtango.com/2017300158/you-might-be-able-marry-sex-robot-by-2050


 

Robots are the future… in more than one way.

Single people: everything is going to be OK. If you’ve given up on ever finding someone and have vowed to be alone with your cats until the end of time, you may want to reconsider.

With technology taking over pretty much everything these days, it really shouldn’t be a surprise that it may take over our love lives, too. I mean, sex robots are kind of a thing and virtual porn is heading towards a market takeover. It won’t be long before the robots who fill our sexual desires start to satisfy our emotional desires, too.

At a recent “Love And Sex With Robots” conference (it’s a real thing… seriously) at the University of London, David Levy, expert and author of a book on love between humans and robots, predicted that sex robot marriages would be legal by 2050. The director of the Mixed Reality Lab in Singapore, Adrian Cheok, said that the prediction seems pretty valid.

During his speech at the conference, Cheok was quoted as saying, “That might seem outrageous because it’s only 35 years away. But 35 years ago people thought homosexual marriage was outrageous… Until the 1970s, some states didn’t allow white and black people to marry each other. Society does progress and change very rapidly.”

I’ve got to say, the man makes a pretty good point. In fact, Cheok is full of logical wisdom and just listening to his point of view makes the whole robot-human marriage sound rather lovely.

He goes on to argue that marrying robots could hugely improve society, as many human-to-human marriages are unhappy or end in divorce. He also argues that while society assumes that everyone has the ability to meet someone, fall in love, get married and live happily ever after, this is simply not the case for a huge part of the population.

 



 

Comparing a bad marriage to a human with a marriage to a robot, Cheok believes the robot relationship is a much better option.

While there are still massive yet doable improvements to be made to sex robots, the biggest challenge in making them lovable is the skill of conversation. If robots look like they love you and, more importantly, make you FEEL like they love you, you are going to feel the same kind of connection you would get with human love.

Humans are capable of loving pets and fictional characters, which is why Cheok believes that falling in love with robots is not a big stretch. Cheok offers up some pretty convincing arguments, but there are two sides to every story and others are not so convinced that human and robot marriages will be successful.

Oliver Bendel, a professor at the University of Applied Sciences and Arts in Switzerland, argues that love and sex robots have no moral standing. He believes that the contract of marriage involves duties that robots can’t actually perform, such as taking care of children. However, he still believes that this marriage option could become legal by 2050, simply because of public pressure.

With the possibility of the government taking action against this kind of marriage, the professor thinks that anything could happen and that we should prepare ourselves for any outcome.

Falling in love and marrying a robot — the pro and con lists for this topic are both long. What do you think? Would marrying a robot ever seem appealing to you?

 

Kissenger


By Emma Yann Zhang, Adrian David Cheok

 

 

This research looks into sharing intimacy and emotion in digital communication through mediated physical interactions, kissing in particular. It aims to extend our sense of touch by creating an interactive kissing machine that produces dynamic haptic force feedback and other sensory stimuli to transmit realistic kissing sensations over a telecommunication network. The research takes a novel approach to affective communication by constructing a mathematical model of kissing, including the forces, dynamics and bilateral control of the forces to enable remote kissing in a communication system.

 

A multimodal interactive system for remote kissing interaction is developed. The kissing device is designed as an attachment for mobile phones, allowing users to kiss each other remotely while having a video chat on their mobile phones. It measures and transmits real-time lip pressure over the Internet. The device consists of force sensors that measure the contact force between the lip surface of the device and the user’s lips, as well as a linear actuator array to produce accurate force feedback and lip movements during user interaction. A bilateral force controller is used such that both users feel the reflections of their own lip forces as well as the forces from each other. The system also engages the sense of smell by emitting body scents or pheromones from the device.
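To make the bilateral force control concrete, here is a minimal single-axis simulation sketch in Python. The control law, gains, and the crude model of how each user's lips respond are illustrative assumptions; the text above does not specify Kissenger's actual controller.

```python
# Minimal sketch of force reflection in a bilateral controller: each side
# feels the partner's measured lip force plus a scaled reflection of its own.
# Gains and the lip-response model are assumptions, not the Kissenger firmware.

def actuator_force(own_force: float, remote_force: float,
                   k_remote: float = 1.0, k_reflect: float = 0.3) -> float:
    """Force commanded to the local linear actuator array."""
    return k_remote * remote_force + k_reflect * own_force

def simulate(steps: int = 5) -> None:
    force_a, force_b = 2.0, 0.5  # initial lip pressures (arbitrary units)
    for t in range(steps):
        felt_a = actuator_force(force_a, force_b)
        felt_b = actuator_force(force_b, force_a)
        print(f"t={t}: A feels {felt_a:.2f}, B feels {felt_b:.2f}")
        # Each user's lip force relaxes toward what they feel -- a crude
        # stand-in for the human-in-the-loop dynamics.
        force_a += 0.5 * (felt_a - force_a)
        force_b += 0.5 * (felt_b - force_b)

if __name__ == "__main__":
    simulate()
```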

 

The system provides a new communication channel for people to share intimacy and affection through remote physical interaction. It engages a wide spectrum of our sensory modalities, including touch and smell, thereby increasing the sense of telepresence and allowing for deep emotional exchanges. As face-to-face interaction is not always available in this increasingly globalised society, the system also offers an effective and intuitive way for parents and grandparents to communicate with young children who have limited language abilities, as well as with people with physical disabilities or communication disorders. The outcome of this research could also make a great impact in the area of robotics and artificial intelligence. The digitisation of kissing provides exciting opportunities for human-robot relationships or even robot-robot relationships. Robots will be able to possess emotional intelligence and learn to understand the emotional meaning and pleasure of kissing, hence establishing intimate and humanistic relationships.

 

Kissenger

Moody Hoody


By Nurafiqah Johari, Halimahtuss Saadiah, Kasun Karunanayaka, Adrian David Cheok

 

 

Moody Hoody is a hoody that can emit fragrance from a device inside it: a combination of fashion and technology. We mounted a Scentee module inside the vest section of the hoody to emit a scented vapor on demand at the press of a button located in the hoody’s pocket. When you press the button, you can see vapor emitting from the vest of the hoody through the Scentee module. Scentee is the world’s first phone attachment that produces scent on demand via a smartphone app.

The main objective of this project is to lift the mood of the person who wears the hoody, and of the people around them, as the scented vapor comes out of the hoody. Previous studies show that certain fragrances can influence people’s moods and emotions. On the applications side, this hoody can be used for aromatherapy. Further, when you want to attract someone, you can wear the hoody with a nice scent that makes people feel good when they are near you. The colors of the hoody and its inner lining were specifically chosen to create a certain mood for the person who wears it.

In the future, we plan to make Moody Hoody a whole lot slimmer by attaching an array of much flatter Scentees. All the wiring will be replaced with conductive threads. We also want to connect the hoody to a smartphone through Bluetooth, so the wearer can always activate the release of the scented vapor from the phone. We hope it will take the clothing industry to a totally different level and become a very profitable, marketable product in the “not so distant future”.
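As a rough illustration of the planned phone-triggered release, here is a minimal host-side sketch in Python. It assumes a hypothetical serial-attached scent module with a one-byte “emit” command; the real Scentee attachment is driven through its smartphone app, so the protocol below is purely illustrative.

```python
# Hypothetical trigger for a serial-attached scent module (illustrative only).
import serial  # pyserial

EMIT_COMMAND = b"\x01"  # assumed opcode for "release one burst of vapor"

def emit_scent(port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Send one emit command, as pressing the pocket button would."""
    with serial.Serial(port, baud, timeout=1) as module:
        module.write(EMIT_COMMAND)

if __name__ == "__main__":
    input("Press Enter to simulate the pocket button... ")
    emit_scent()
```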

 

Moody Hoody

Teacherbot


 

 

Schools are meant to prepare learners for the future outside of school. Current developments in AI, machine learning and robotics suggest a future of fully shared social spaces, including learning environments (LEs), with robotic personalities. Today’s learners (as well as teachers) should be prepared for such a future. AI in Education (AIED) has focused on the implementation of online and screen-based Pedagogical Agents (PAs); however, research findings support richer learning experiences with embodied PAs, hence recent studies in AIED have focused on robots as peers, teaching assistants or instructional materials. Their classroom uses employ gamification approaches and are mostly based on a one-robot-one-student interaction style, whereas current educational demands support collaborative approaches to learning. Robots as instructors are novel and considered a major challenge due to the requirements of good teaching, including the demands for agency, affective capabilities and classroom control, which machines are believed to be incapable of. Current technological capabilities suggest a future with full-fledged robot teachers teaching actual classroom subjects; hence, we implement a robot teacher with capabilities for agency, social interaction and classroom control within a collaborative learning scenario involving multiple human learners and the teaching of basic Chemistry, in line with the current focus on STEM areas. We consider the PI pedagogical approach an adequate technique for implementing robotic teaching based on its design, with inherent support for instructional scaffolding, learner control, conceptual understanding and learning by teaching. We are exploring these features in addition to the agentic capabilities of the robot and the effects on learner agency, as well as improved learning in terms of engagement, learner control and social learning. In the future, we will focus on other key concepts in learning (e.g. assessment), other types of learners (e.g. learners with cognitive/physical disabilities), interaction styles and LEs. We will also explore cross-community approaches that leverage the integration of sibling communities.

 

Teacherbot

Magnetic Table


By Nur Ellyza Binti Abd Rahman*, Azhri Azhar*, Murtadha Bazli, Kevin Bielawski, Kasun Karunanayaka, Adrian David Cheok

 

 

 

In our daily life, we use the basic five senses to see, touch, hear, taste and smell. By utilizing some of these senses concurrently, multisensory interfaces create immersive and playful experiences and, as a result, have become a popular topic in academic research. Virtual Food Court (Kjartan Nordbo et al., 2015), Meta Cookie (Takuji Narumi et al., 2010) and Co-dining (Jun Wei et al., 2012) represent a few interesting prior works in the field. Michel et al. (2015) revealed that dynamic changes in the weight of cutlery influence the user’s perception and enjoyment of food: the heavier the utensils, the more enhanced the flavour. In line with that, we present a new multisensory dining interface called ‘Magnetic Dining Table and Magnetic Foods’.

 

‘Magnetic Dining Table and Magnetic Foods’ introduces new human-food interaction experiences by controlling utensils and food on the table: modifying their weight, levitating, moving and rotating them, and dynamically changing their shapes (food only). The proposed system is divided into two parts: a controlling part and a controlled part. The controlling part consists of three components: 1) the dining table, 2) an array of electromagnets and 3) a controller circuit. The controlled part consists of two components: 1) magnetic utensils and 2) magnetic foods. The array of electromagnets is placed underneath the table, and the controller circuit controls the field produced by each electromagnet, indirectly controlling the utensils and food on the table. To make edible magnetic food, ferromagnetic materials such as iron and iron oxides (Alexis Little, 2016) will be used. We expect that this interface will positively modify taste and smell sensations, food consumption behaviours, and human-food interaction experiences.
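As a sketch of how the controller circuit might energise the electromagnet array, here is a short Python example that computes a PWM duty cycle for each coil so the field peaks at a target position along a one-dimensional array. The coil pitch, Gaussian weighting, and normalisation are assumptions for illustration; the text does not describe the actual control scheme.

```python
# Illustrative duty-cycle profile for a 1-D electromagnet array: coils nearest
# the target position are driven hardest, drawing a magnetic utensil toward it.
# Coil pitch, spread, and the Gaussian profile are assumptions for this sketch.
import math

NUM_COILS = 8
COIL_PITCH_CM = 5.0  # distance between adjacent coil centres

def coil_duty_cycles(target_cm: float, spread_cm: float = 4.0) -> list[float]:
    duties = []
    for i in range(NUM_COILS):
        centre = i * COIL_PITCH_CM
        duties.append(math.exp(-((centre - target_cm) / spread_cm) ** 2))
    peak = max(duties)
    return [d / peak for d in duties]  # normalise: nearest coil fully on

if __name__ == "__main__":
    for i, duty in enumerate(coil_duty_cycles(target_cm=12.5)):
        print(f"coil {i}: duty {duty:.2f}")
```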

 

Magnetic Dining Table and Magnetic Foods

Using mobiles to smell: How technology is giving us our senses | The New Economy Videos



By The New Economy – February 11th, 2014

http://www.theneweconomy.com/videos/using-mobiles-to-smell-how-technology-is-giving-us-our-senses-video

The New Economy interviews Professor Adrian Cheok of City University London to find out about a new technology that will allow people to taste and smell through their mobile phones.

Scientists are coming increasingly closer to developing technology that will allow us to use all our senses on the internet. Professor Adrian Cheok, City University London, explains how mobile phones will soon allow people to taste and smell, what the commercial benefits of this technology might be, and just how much it’s going to cost.

The online world: it’s all about the visual and sound experience, but with the three other senses, it can leave us short. Studies have demonstrated that more than half of human communication is non-verbal, so scientists are working on ways to communicate taste, touch, and smell over the internet. I’ve come to City University London to meet Professor Adrian Cheok, who’s at the forefront of augmented reality, with new technology that will allow you to taste and smell through a mobile phone.

The New Economy: Adrian, this sounds completely unbelievable. Have you really found a way to transmit taste and smells via a mobile device?

Adrian Cheok: Yes, in our laboratory research we’ve been making devices which can connect electrical and digital signals to your tongue, as well as your nose. So for example, for taste we’ve created a device which you put on your tongue, and it has electrodes. What those do is artificially excite your taste receptors. So certain electrical signals will excite the receptors, and that will produce artificial taste sensations in your brain. So you will be able to experience, for example, salty, sweet, sour, bitter – the basic tastes on your tongue – without any chemicals.

And with smell we’re going in a couple of tracks. One is using chemicals, it’s a device that you can attach to your mobile phone, and these devices will emit chemicals. So that means that with apps and software on your phone, you can send someone a smell message. For example, you might get a message on Facebook, and it can send the smell of a flower. Or if your friend’s not in a very good mood it might be a bitter smell.

So the next stage of that is, we’re making devices which will have electrical and magnetic signals being transmitted to your olfactory bulb, which is behind your nose. It’ll be a device which you put in the back of your mouth, it will have magnetic coils, and similar to the electrical taste actuation, it will excite the olfactory bulb using electrical currents. And then this will produce an artificial smell sensation in your brain.

Already scientists have been able to connect optical fibre to neurons of mice, and that means that we can connect electrical signals to neurons.

With the rate of change, for example with Moore’s Law, you get exponential increase in technology. I think within our lifetimes we’re going to see direct brain interface. So in fact what you will get is essentially, you can connect all these signals directly to your brain, and then you will be able to experience a virtual reality without any of these external devices. But essentially connecting to the neural sensors of your brain. And of course that also connects to the internet. So essentially what we will have is direct internet connection to our brain. And I think that will be something we will see in our lifetime.

The New Economy: So direct brain interface – that sounds kind of dangerous. I mean, could there be any side-effects?

Adrian Cheok: Well we’re still at the very early stages now. So scientists could connect, for example, one optical fibre to the neuron of a mouse. And so what it has shown is that we can actually connect the biological world of brains to the digital world, which is computers.

Of course, this is still at an extremely early stage now. You know, the bio-engineers can connect one single neuron, so, we’re not anywhere near that level where we can actually connect to humans. You would have to deal with a lot of ethical and also privacy, social issues, risk issues.

Now if you have a virus on your computer, the worst it can do is cause your computer to crash. But you know, you could imagine a worst case: someone could reprogram your brain. So we’d have to think very carefully.

The New Economy: Well why is it important to offer smell over the internet?

Adrian Cheok: Fundamentally, smell and taste are the only two senses which are directly connected to the limbic system of the brain. And the limbic system of the brain is the part of the brain responsible for emotion and memory. So it is true that smell and taste can directly and subconsciously trigger your emotion, trigger your memory.

Now that we’re in the internet age, where more and more of our communication is through the internet, through the digital world, that we must bring those different senses – touch, taste and smell – to the internet; so that we can have a much more emotional sense of presence.

The New Economy: What will this be used for?

Adrian Cheok: Like all media, people want to recreate the real world. When cinema came out, people were filming, you know, scenes of city streets. To be able to capture that on film was quite amazing. But as the media developed, then it became a new kind of expression. And I believe it will be the same for the taste and smell media. Now that it’s introduced, at first people will just want to recreate smell at a distance. So for example, you want to send someone the smell of flowers, so Valentine’s Day for example, maybe you can’t meet your lover or your friend, but you can send the virtual roses, and the virtual smell of the roses to his or her mobile phone.

At the next stage it will lead to, I think, new kinds of creation. For example, music before; if you wanted to play music, you needed to play with an instrument, like a violin or a guitar. But now the young people can compose music completely digitally. Even there’s applications on your mobile phone, you can compose music with your finger, and it’s really professional. Similarly, that will be for smell and taste. We’ll go beyond just recreating the real world to making new kinds of creation.

The New Economy: So will it also have a commercial use?

Adrian Cheok: For advertising, because smell is a way to trigger emotions and memory subconsciously. Now, you can shut your eyes, and you can block your ears, but it’s very rare that you ever block your nose, because you can’t breathe properly! So people don’t block their nose, and that means advertising can always be channelled to your nose. And also we can directly trigger memory or an emotion. That’s very powerful.

We received interest from one of the major food manufacturers, and we’re having a meeting again soon. They make frozen food, and the difficulty to sell frozen food is, you can’t smell it. You just see these boxes in the freezer, but because it’s frozen, there’s no smell. But they want to have our devices so that when you pick up the frozen food maybe it’s like a lasagne, well you can have a really nice smell of what it would be.

The New Economy: How expensive will this be?

Adrian Cheok: We’re aiming to make devices which are going to be cheap. Because I think only by being very cheap can you make mass-market devices. So our current device, actually to manufacture it, it’s only a few dollars.

The New Economy: Adrian, thank you.

Adrian Cheok: Thank you very much.

 

Vegetables Can Taste Like Chocolate by Adrian Cheok, Director of Imagineering Institute


 

Adrian David Cheok is Director of the Imagineering Institute, Malaysia, and Chair Professor of Pervasive Computing at City University London.

He is the Founder and Director of the Mixed Reality Lab, Singapore. His research focuses on multi-sensory internet communication, mixed reality, pervasive and ubiquitous computing, human-computer interfaces, and wearable computing. Today he talks about how the internet connects us and what we can do by blending reality, our senses, and the internet.

Adrian Cheok is a 2016 Distinguished Alumni Award recipient in recognition of his achievements and contribution in the field of computing, engineering and multisensory communication.


Adrian is a pioneer in mixed reality and multisensory communication; his innovation and leadership have been recognised internationally through multiple awards.
Some of his pioneering works in mixed reality include innovative and interactive games such as ‘3dlive’, ‘Human Pacman’ and ‘Huggy Pajama’. He is also the inventor of the world’s first electric and thermal taste machine, which produces virtual tastes with electric current and thermal energy.


Olfactometer


By Adrian David Cheok, Kasun Karunanayaka, Halimahtuss Saadiah, Hamizah Sharoom

 

 

Many of the olfactometer implementations we find today come at a high price and are complex to use. This project aims to develop a simple, low-cost, and easily movable laboratory olfactometer that can be used as a support tool for a wider range of experiments related to smell, taste, psychology, neuroscience, and fMRI. Generally, olfactometers use two types of olfactants: solid or liquid odors. Our laboratory olfactometer (as shown in Figure 1) will support liquid-based odors, and we may later extend it to handle solid odors. We are also considering developing the system into a combined olfactometer and gustometer.

 

In this olfactometer design, we utilize continuous flow (the Lorig design) for good temporal control. The Lorig design is simpler and cheaper than other designs due to its minimal use of parts (Lundstrom et al., 2010). Our olfactometer contains 8 output channels that produce aromas in a precise and controlled manner. It also produces a constant humidified flow of pure air. The laboratory olfactometer’s components consist of an oil-less piston air compressor, a filter regulator and mist separator, a 2-color display digital flow switch, check valves, solenoid valves, a manifold, TRIVOT glass tubes, connectors, gas hose clips, and PU tubing. The control system of the olfactometer consists of an Arduino Pro Mini, a UART converter, a USB cable, and a solenoid driver circuit.

 

The air supply from the oil-less piston air compressor plays an important part in delivering the odor at constant pressure to the subject’s nose. The filter regulator combined with the mist separator ensures the air is clean and free of contaminants. After the filter, the air flow is metered through the 2-color display digital flow switch. Check valves are connected after the flowmeter to ensure that the air flows in one direction only. After the check valve, the tube is connected to 9 male fitting connectors which fit directly into the 8 outputs of the manifold. Then, 8 normally closed solenoid valves are connected to the top of the manifold. The olfactometer can be controlled manually by computer to send an odour to the nose. A 2-color display digital flow switch is connected after the solenoids to make sure the air flows at around 3 to 5 LPM (to avoid any discomfort to the subject’s nose). When a solenoid valve is switched on by the computer, air passes through to the corresponding glass bottle, which can be verified by the air bubbling inside the bottle. Finally, the air flow carries the odor from the liquid or solid and passes through a check valve before going straight to the subject’s nose.
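To illustrate the computer control of the valves, here is a minimal host-side Python sketch. It assumes a hypothetical two-character serial command protocol to the Arduino Pro Mini (e.g. “3O” to open channel 3, “3C” to close it); the text does not specify the real command set.

```python
# Host-side odor presentation via a hypothetical serial protocol (illustrative).
import time
import serial  # pyserial

def present_odor(channel: int, duration_s: float,
                 port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Open one normally closed solenoid valve so air bubbles through the
    corresponding odor bottle, then close it after duration_s seconds."""
    if not 0 <= channel <= 7:
        raise ValueError("the olfactometer has 8 output channels (0-7)")
    with serial.Serial(port, baud, timeout=1) as arduino:
        arduino.write(f"{channel}O".encode())  # open valve: odorised air flows
        time.sleep(duration_s)
        arduino.write(f"{channel}C".encode())  # close valve: back to pure air

if __name__ == "__main__":
    present_odor(channel=3, duration_s=2.0)
```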

 

The Laboratory Olfactometer
