Nadine, developed by scientists in Singapore, is a true virtual human: a remarkably human-like robot.
Scientists at Nanyang Technological University, Singapore have unleashed their latest creation on the university’s campus: the world’s most human-like robot.
Nadine, as the robot is called, has soft skin and medium brunette hair just like her creator, Professor Nadia Thalmann. But more than her looks, the most impressive thing about Nadine is what she does: make eye contact, smile, meet and greet guests, shake hands, and even recognise past visitors and engage in conversation with them based on previous exchanges.
Unlike previous generations of robots, Nadine has an individual personality, with moods that change depending on the topic.
Conversational technology is more than a voice, face and search.
A recent blog post by a tech writer compared the Natural Language Processing (NLP) technology behind Intelligent Virtual Assistants (IVAs) like Ann at Aetna or Jenn at Alaska to the hologram technology used to resurrect Tupac for Coachella. The comparison reveals that people – tech writers included – still do not understand that conversational NLP technologies in the form of IVAs are more than a voice, face and search. Could anyone at Coachella have held a conversation with Holo-pac? No. Would the same comparison be made of Siri? Siri doesn’t have a face, so probably not.
The new, future-oriented field of artimetrics, i.e. artificial biometrics, focuses on the recognition, verification and authentication of virtual agents, robots and other non-biological entities. As a sub-field of cybersecurity, artimetrics is being developed at the Cyber-Security Lab, which prides itself on being the world’s first to conduct this kind of research.
Telesar V Robot Avatar: a real-life remote experience by transmitting sight, sound & touch data to its operator
Experience a foreign or distant world without actually being there: Japanese researchers led by Professor Susumu Tachi at Tachi Laboratory have developed the Telesar V Robot Avatar, which delivers a remote experience straight to its operator by transmitting sight, sound and touch data. Operators can actually feel the shape and surface unevenness of objects, and even their temperature.
Robots and virtual worlds assist autistic children with skill development, via multitouch screens or the robot KASPAR
Children with autism can develop skills they normally find difficult by interacting with virtual worlds. By using multitouch, these children activate a virtual character on a screen and experiment with different social scenarios. This way, researchers can compare their responses to those displayed in real-life situations.
Interaction between Human and Chatbot through tactile touch screens
Researchers from Disney Research and Carnegie Mellon University have been investigating haptic interfaces that allow users to feel virtual elements through touch. Such tactile touch screens provide users with a wide variety of tactile sensations such as textures, friction and vibration.
Meet Holly and Graham: 2D Holograms transformed into virtual boarding agents
Various airports (such as Paris, London, Birmingham and Manchester) are experimenting with so-called virtual boarding agents: virtual agents that, for example, kindly greet passengers at boarding or explain security measures before they go through customs.
Blue Mars Lite: Meet and greet avatars in real-world locations on Google Street View
Blue Mars is a 3D virtual world platform enabling users to meet, chat and share common digital experiences. The Blue Mars Lite application goes one step further: meet other avatars in real-world locations on Google Street View.