

Where negotiations and conversations meet
 
 

It’s quite hard to imagine that this is all AI:

http://www.youtube.com/v/NqZM8gDD8mY

 

 
  [ # 1 ]

I watched the video, and some other videos from the same project, and here are a few thoughts that popped up in my mind:

1. Although it seems impressively sophisticated at its level of speech-based interaction, it still strikes me as a chatbot that uses canned responses. I get this impression from the other videos, where long lists of possible responses are shown on a computer screen.

2. The avatars themselves (the animation) are not autonomous; they move based on a repertoire of predefined poses and movements (this is explained in some of the videos). They do not have free motor control over their limbs, so they cannot make a ‘new’ move or gesture based on learning. This is one of the things I’m looking at when I mention ‘virtual robotics’. I think using a game engine that includes a physics model (like the one in Blender, for example; there are more of course) is the way to go here — see the rough sketch at the end of this post.

All in all it is pretty impressive, and I think we will see more applications of NLP emerge in the near future. But I also have the strong feeling that these are all short-term solutions to the idea of building AI. It is still lacking the ‘big one’: real autonomous consciousness, where the AI ‘understands’ the concepts that are discussed and decides, based on its own experiences, how to react to the input.
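To make the ‘virtual robotics’ point a bit more concrete, here is a very rough sketch in plain Python (not Blender’s engine or any real physics library; all the numbers and names are made up) of what I mean by giving the agent raw motor control inside a physics step, instead of a library of predefined poses:

# Toy sketch of 'free motor control' inside a physics loop. This is NOT the
# Blender game engine, just an illustration: the agent outputs raw joint
# torques and the physics step decides what actually happens.

import math
import random

DT = 1.0 / 60.0      # physics time step (60 Hz)
GRAVITY = -9.81

class Joint:
    """A single 1-DOF joint (say, an elbow), stepped with naive Euler integration."""
    def __init__(self):
        self.angle = 0.0      # radians
        self.velocity = 0.0   # radians per second

    def step(self, torque):
        # gravity pulls the limb down, the agent's torque works against it
        accel = torque + GRAVITY * math.cos(self.angle)
        self.velocity += accel * DT
        self.angle += self.velocity * DT
        # crude joint limits
        self.angle = max(-math.pi / 2, min(math.pi / 2, self.angle))

def agent_policy(joint):
    """Placeholder 'brain': ignores the state and outputs random torques.
    Learning would replace this with something that improves over time."""
    return random.uniform(-15.0, 15.0)

joint = Joint()
for tick in range(600):               # simulate 10 seconds
    joint.step(agent_policy(joint))
print(f"final elbow angle: {joint.angle:.2f} rad")

The point is only that whatever ‘moves’ come out of this are produced by torques plus physics, not picked from a list of animations.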

 

 
  [ # 2 ]
Hans Peter Willems - Feb 22, 2011:

The avatars themselves (the animation) are not autonomous; they move based on a repertoire of predefined poses and movements (this is explained in some of the videos). They do not have free motor control over their limbs, so they cannot make a ‘new’ move or gesture based on learning.

That’s a whole new decade of learning: a robot with just sensors (motion sensors, a location sensor, a vision sensor) to observe where its limbs are, so it can learn to move. Starting from scratch, with a reason to live (‘feel grass’), it would be rewarded internally when it manages to reach out a limb to grass in its neighbourhood. In that case all you would need is an objective, sensors and a ‘body control system’, and a little bit of AI. grin
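Roughly something like this, to illustrate the ‘objective + sensors + body control + internal reward’ idea (plain Python, everything here is invented for the example, no real robot involved):

# A one-dimensional arm that learns, purely by trial and error, how far to
# extend itself so its 'hand' touches a patch of grass. The sensor names
# and the reward rule are made up for illustration.

import random

GRASS_POSITION = 0.8          # where the grass is, as seen by a 'vision sensor'

def touch_sensor(hand_position):
    """Internal reward: 1.0 when the hand is on the grass, less as it misses."""
    return max(0.0, 1.0 - abs(hand_position - GRASS_POSITION))

def body_control(extension):
    """'Body control system': maps a motor command (0..1) to a hand position."""
    return max(0.0, min(1.0, extension))

# Trial-and-error learning: keep a guess, nudge it randomly, and keep the
# nudge only when the internal reward goes up.
extension = 0.0
best_reward = touch_sensor(body_control(extension))
for trial in range(200):
    candidate = extension + random.uniform(-0.1, 0.1)
    reward = touch_sensor(body_control(candidate))
    if reward > best_reward:
        extension, best_reward = candidate, reward

print(f"learned extension: {extension:.2f}, reward: {best_reward:.2f}")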

 

 
  [ # 3 ]

I agree, but I also see it the other way around: giving AI a (virtual) body that it can interact with itself might be an important step towards self-awareness, and ultimately ‘consciousness’.

I envision a system where I can teach the AI what a ‘knee’ is by touching the ‘knee-sensor’ that is linked to the virtual bot. Next, when I ask ‘touch your knee with your hand’, the bot will simply do so as feedback to me, but it also receives feedback from its own ‘knee-sensor’, that way making it a reality for the bot.

We already know from neuroscience that ‘feedback loops’ are an intrinsic part of how humans learn and operate.
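A toy sketch of how I picture that teach-and-feedback loop (all the names are invented, nothing of this is implemented yet):

# A word gets associated with whatever body sensor fires while the teacher
# says it; later the bot can act on that word and confirm the action through
# the same sensor. SENSORS, teach and ask_to_touch are made up for this example.

SENSORS = {"knee": False, "elbow": False, "head": False}   # sensor -> currently touched?

word_to_sensor = {}   # learned associations: spoken word -> body sensor

def teach(word, touched_sensor):
    """Teacher says a word while physically touching one of the bot's sensors."""
    word_to_sensor[word] = touched_sensor

def ask_to_touch(word):
    """Ask the bot to touch the named body part with its hand."""
    sensor = word_to_sensor.get(word)
    if sensor is None:
        return f"I don't know what a '{word}' is yet."
    SENSORS[sensor] = True                    # the hand presses the part...
    felt = SENSORS[sensor]                    # ...and the bot feels its own sensor fire
    SENSORS[sensor] = False
    return f"I touched my {word} (own sensor feedback: {felt})."

teach("knee", "knee")                         # touch the knee-sensor while saying 'knee'
print(ask_to_touch("knee"))                   # closes the loop: action plus own feedback
print(ask_to_touch("shoulder"))               # never taught, so no grounding yet

The interesting part is the last step: the bot’s reaction is confirmed by its own sensor, not just by my approval.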

 

 
  [ # 4 ]

Touching, interesting.

This one will require some computing power: think of your hair (in general, not ours wink ) waving in the wind and touching your shoulders, your hands, or even someone else. Virtual robots should actually be aware of every small input you can imagine.

And then: shock waves, like right now in Christchurch; virtual robots should be able to feel those as well (physical robots too, btw).

But eventually, we’ll get there.

 

 
  [ # 5 ]

I totally agree with you there… and most of that is already technologically possible. We have sensor-grids or sensor-nets for monitoring large areas with high-density measurements. And as for the shock waves, the physics model in Blender’s game engine already handles that, as it does gravity, acceleration/deceleration, aero- and hydrodynamics, collisions…
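For the sensor-grid part, a tiny illustration of the idea (grid size and values are arbitrary; this is not any existing sensor-net software):

# A 2-D grid of touch cells over a patch of virtual skin, each reporting a
# pressure value, so lots of small simultaneous inputs (hair brushing past,
# a shock wave rolling through) show up as a pattern rather than one on/off event.

GRID_W, GRID_H = 8, 8
skin = [[0.0] * GRID_W for _ in range(GRID_H)]   # pressure per cell

def apply_touch(x, y, pressure):
    """A single small contact somewhere on the skin patch."""
    skin[y][x] += pressure

def apply_shock_wave(strength):
    """A shock wave hits every cell at once, fading towards one edge."""
    for y in range(GRID_H):
        for x in range(GRID_W):
            skin[y][x] += strength * (1.0 - x / GRID_W)

# hair brushing the 'shoulder' corner, then a shock wave passing through
for x in range(3):
    apply_touch(x, 0, 0.1)
apply_shock_wave(0.5)

total = sum(sum(row) for row in skin)
print(f"total stimulation across {GRID_W * GRID_H} cells: {total:.2f}")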

 

 
  [ # 6 ]

@HP: I once heard, at the beginning of 2010 as far as I remember, about an (academic) program (not a software program) with the objective of bringing all robot knowledge back together. I only heard about it through word of mouth, but the source was reliable. Unfortunately, I have never been able to track down this project. Have you ever heard of it?

 

 
  [ # 7 ]

Doesn’t sound like something I’ve come across. But then again, I’ve not looked into robotics for the last few years, other than what comes along in internet news from the big technology community sites like Slashdot and the like.

I’m trying to stay away from true robotics, as that is yet another technological domain that you can get totally submerged in. For now I’ll stick with virtual emulations of robotics, without having to deal with real hardware sensors, electronics (although I do actually have an electronics diploma as well), and things like pattern recognition.

 

 
  [ # 8 ]

Well, I’m not Hans Peter, and this may not be exactly what you’re looking for, but I found this to be interesting:

http://robots.net/article/3100.html

smile

 

 
  [ # 9 ]

Have you seen this?
http://www.erwinvanlun.com/ww/C151/

I haven’t updated it for quite a while, but it’s still interesting.

 

 