
Uncanny valley: When robots look real
 
 

One of the most realistic-looking robots so far: Geminoid.

 

 
  [ # 1 ]

Yes, it really looks real, doesn’t it? Only the movement gives it away; it’s a bit creepy, I think.

 

 
  [ # 2 ]

So does http://www.youtube.com/watch?v=4sjV_lxSVQo&feature=related

All she needs is Turing-Test-passable NLP, visual recognition, and mobility, and we’re there.

In order of complexity, though, I believe it is

(from easiest to most difficult)

Facial Expressions
NLP
Visual rec / mobility

 

 
  [ # 3 ]

You’ll love the examples mentioned here:

http://www.erwinvanlun.com/ww/C151/

I haven’t worked on that website for quite a while (just doing Chatbots.org stuff at the moment), but my plan is to continue my trend research in Q3.

from easiest to most difficult

What is visual rec?

Expression understanding goes further than ‘facial expressions’. Most expressions are not recognized by humans at all. There are just a few people in the world who have specialised in body language.

But still, even taking that into consideration, language is the higher abstraction; it’s more complex by definition, which is also the reason why other species don’t have language.

So we agree on that.

Once robots understand the world, they need to have a purpose, a reason to survive, something to optimize for, in order to survive, to repair/heal themselves, and… to reproduce themselves. So we wouldn’t need to look after them any longer.

I’m not convinced that this is necessarily more complex than NLP.

 

 
  [ # 4 ]

Visual rec, sorry, being lazy: visual recognition.

Your comment on READING facial expressions: agreed.

I want to put my CLUES engine into Nexi (the second-to-last robot at the URL posted above: http://www.youtube.com/watch?v=XrmrU7P-ysA&feature=player_embedded#at=36).

Hope I can buy a Nexi some day to put the code into it!

 

 
  [ # 5 ]

In that case I would say:

In order of complexity, though, I believe it is

(from easiest to most difficult)

Facial Expressions (recognize expressions in humans)
Visual recognition (recognize everything in everything)
NLP (being able to talk about what has been recognized earlier in a sensible way)

 

 
  [ # 6 ]

Add a 4th to your list, Erwin…

learning via NLP

 

 
  [ # 7 ]

NLP is about understanding. When you understand what has been said, you’ve learned, haven’t you?

 

 
  [ # 8 ]

NLP is about understanding. When you understand what has been said, you’ve learned, haven’t you?


I guess for humans (well, most of us at least) that would/should be true, but I wouldn’t say that of AI. Take Watson: it’s doing NLP, but I’m not certain it is learning from the questions.

 

 
  [ # 9 ]

NLP is about ‘processing’ (hence the ‘P’). Understanding or comprehension is another part of the problem altogether. Watson is indeed a great example; it uses NLP as the ‘user interface’, but besides that it’s more or less a search engine on steroids. As I’ve mentioned somewhere else before, even some Watson team members don’t see it as AI themselves.

 

 
  [ # 10 ]
Erwin Van Lun - Mar 10, 2011:

NLP is about understanding. When you understand what has been said, you’ve learned, haven’t you?

I believe this is correct.

Right now, Grace is very very young and only has a small mastery of language.

But she can ‘understand’ (ok, whatever word you want to use) things like: “Sam took some money out of the bank because he was going to the Casino”

and respond when asked where Sam went. She will actually know that she assumed he made it to the casino (that is, his intention was to go to the casino).

Later she will handle if you said “no, he didn’t make it to the casino” and adjust her belief.

This weekend she will learn the difference between connecting a main clause to a subordinate clause not with “because” but instead with “if”.

“Jane will be angry if Jack went to the casino”

Note that the subordinate clause, if taken in isolation (“Jack went to the casino”), does NOT mean he actually DID go to the casino (because of the “if” subordinate conjunction). Not like “Jane is angry because Jack went to the casino”.

So right now she handles the subordinate conjunction “because” (and yes, any other term like “since”, “cuz”, even “bc” (short form), and later even the misspelling “becuase”).
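To make the distinction concrete, here is a toy sketch (my own illustration, not Grace’s actual code; all names are made up) of the idea that “because” is factive, so its subordinate clause can be added to the bot’s beliefs, while “if” is not:

```python
# Hypothetical table of subordinate conjunctions and whether they commit
# the speaker to the truth of the subordinate clause (factive or not).
FACTIVE = {"because": True, "since": True, "cuz": True, "bc": True,
           "if": False, "unless": False}

beliefs = set()

def clause_is_asserted(conjunction: str) -> bool:
    """True if the clause introduced by this conjunction is asserted as fact."""
    return FACTIVE.get(conjunction.lower(), False)

def process(main: str, conjunction: str, subordinate: str) -> None:
    """Add asserted clauses of a complex sentence to the belief store."""
    beliefs.add(main)                       # the main clause is always asserted
    if clause_is_asserted(conjunction):     # the subordinate one only if factive
        beliefs.add(subordinate)

process("Jane is angry", "because", "Jack went to the casino")
process("Jane will be angry", "if", "Jack goes to the casino")
print("Jack went to the casino" in beliefs)   # True: "because" asserts it
print("Jack goes to the casino" in beliefs)   # False: "if" does not
```

A lookup table like this is just the simplest way to show the point; a real engine would of course work off the parse tree rather than a keyword match.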

Later, an abductive reasoning module will be added to allow her to handle things like: given the fact above, and the user stating “Jane is angry.”, she may hypothesize, “Why, did Jack go to the casino?” That should prove most interesting!
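The abductive step described here can also be sketched in a few lines (again my own toy version, not the planned module; rule format and names are assumptions): given a rule “consequent if antecedent” and an observation matching the consequent, the antecedent becomes a hypothesis to check with the user rather than a fact to assert.

```python
# Rules stored as (consequent, antecedent) pairs: "consequent if antecedent".
rules = [("Jane is angry", "Jack went to the casino")]

def abduce(observation: str) -> list:
    """Return candidate explanations: antecedents of rules whose
    consequent matches the observation."""
    return [ante for cons, ante in rules if cons == observation]

for hypothesis in abduce("Jane is angry"):
    # The bot does not believe the hypothesis yet; it asks about it.
    print("Hypothesis to check with the user:", hypothesis)
```

The key design point is that abduction only proposes; the hypothesis enters the belief store only after the user confirms it.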

I don’t think Watson is doing NLP like Grace. I could be wrong, but in many videos the developers of Watson outright stated, “Watson itself can’t read a page of text and understand it.”

Hans, agreed: Watson = very powerful keyword, statistics-based search engine.

 

 
  [ # 11 ]

Merlin, I hijacked, didn’t I? Sorry! Dave, please place the above in my CLUES thread, thanks! (+remove this post)

 

 
  [ # 12 ]

Actually, Victor, I see no need to make any changes. Comparing the functionality of your project with someone else’s doesn’t necessarily constitute “hijacking”. So unless Erwin feels differently, I’ll leave it as it sits. smile

 

 
  [ # 13 ]

No worries Victor. Many of the topics meander around.
Using your own development as examples/contrast is helpful in understanding.

Some day I will have a bot that will automagically scan the topic heading “Uncanny valley: When robots look real” and be able to identify posts on NLP and copy them into the NLP thread. wink

 

 
  [ # 14 ]

Thanks guys!

I also don’t mind your bot progress reports, etc., in the CLUES thread!

 

 