
In Search of Intelligence
 
 
  [ # 16 ]

“That opens up a whole new topic: volition. Would a machine have any inherent goals regarding survival if we didn’t program such goals into it in some way, even indirectly such as via artificial pain and artificial pleasure? As open-minded as I am, I don’t believe so.”

My hypothesis is: the machine wants to answer questions because that’s the way I programmed it from the start. In trying to answer user questions, it eventually has to ask questions itself. (For example, my logicagent, when asked who is X’s grandfather, asks itself “who is X’s father?” and then uses that response, Y, to ask itself “who is Y’s father?”) Then perhaps I set it to ask questions of itself when I’m not interacting with it, and it goes out on the web to find answers, reads Wikipedia, etc. As a result of its own explorations, knowledge becomes its goal, without any need for feeling pain. (Knowledge is also the ultimate goal in Jainism, which has been around at least since the beginning of recorded history…)
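
Roughly, the decomposition works like this (a toy sketch only; the FATHERS fact base, the names, and the ask() helper are illustrations, not the real logicagent):

    # Toy sketch of question decomposition: a "grandfather" question
    # is answered by recursively asking two "father" sub-questions.
    # FATHERS and ask() are hypothetical, for illustration only.
    FATHERS = {"Alice": "Bob", "Bob": "Carl"}  # child -> father facts

    def ask(question, subject):
        """Answer a question, decomposing it into sub-questions if needed."""
        if question == "father":
            return FATHERS.get(subject)      # direct fact lookup
        if question == "grandfather":
            father = ask("father", subject)  # "who is X's father?" -> Y
            if father is None:
                return None                  # can't answer the sub-question
            return ask("father", father)     # "who is Y's father?"
        return None                          # question it can't decompose

    print(ask("grandfather", "Alice"))       # -> Carl

The same pattern extends naturally: any question the agent cannot look up directly becomes a chain of sub-questions it asks itself, which is where the self-driven questioning comes from.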

So let’s build machines and see whether our beliefs are realized.

 
