

An example of a thinking machine?
 
Poll
In this example, is Skynet-AI thinking?
Yes 5
Yes, but... (explain below) 1
No, but if it did... (explain below) it would be. 6
No, machines can’t/don’t/will never think. 2
Total Votes: 14
 

I thought I would break out this experiment as an opinion poll…

This is a very limited-domain thought experiment (1 input, 1 output) to see what everyone “thinks”:

USER: What is the number between twenty one and twenty three?
AI: Twenty two

This is an actual input/output from Skynet-AI. The numbers 21, 23, and 22 do not exist anywhere in Skynet-AI’s programming or database. Neither do the words “twenty one”, “twenty two”, or “twenty three”. The AI writes its own code to understand the natural language input, solve the problem, and produce natural language output.

So let me ask you, in this example, does Skynet-AI “think”?

For background on how people think, this may be of interest:
Tracking Children’s Mental States While Solving Algebra Equations

 

 
  [ # 1 ]

I’d like to know more about what code Skynet-AI writes and what modules/tools are at its disposal, etc. How does it know what to do with the word ‘between’, for instance? Does it have modules that perform mathematical operations? Meaning, does it assemble its code as a combination of these modules, or does it do it at a more basic level, etc. Or in other words, at what level are certain abilities “innate” and at what level does the bot have to decide how to put together its skills to accomplish the goal of answering the question?

No particular answer to any of the questions above necessarily means Skynet-AI isn’t thinking. I’m mostly just curious to hear you elaborate on this example before giving an opinion. smile

 

 
  [ # 2 ]

There are a number of different modules/tools involved in responding to this simple looking input.
The first thing the bot needs to do is determine if this is a “Math related” query.
Most of the processing in this case happens in a module that focuses on math. Skynet knows the basic building blocks of word-based numbers, which allows it to understand numerical word input. It translates the words into digits and then translates strings of digits into what we think of as numbers.

JAIL (JavaScript Artificial Intelligence Language) enables transformation of input into an internal, computer-friendly format. It also provides neurons which, when triggered, can produce output. It can recognize the basic math concepts (add, subtract, multiply, divide). When a math question involves the concept of a number “between” two others, the input is transformed internally into:
(23+21)/2?
(this structure surprised me a little when I first saw it, because 21 and 23 are in a different order than in the text input; it turns out that is the way I taught it)
Later in the process it determines whether the input is a math equation and, if so, evaluates the equation it wrote. If possible, it converts the result to numerical word output.

All of this takes about 3 milliseconds.
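The pipeline Merlin describes (word numbers → digits, “between” → an average equation, then evaluation) could be sketched roughly like this. This is a hypothetical illustration, not JAIL’s actual code; the function names and the simple regex are my own assumptions, and it only handles numbers under one hundred:

```javascript
// Hypothetical sketch of the word-number parsing and "between" transform
// described above. Not Skynet-AI's real implementation.
const UNITS = { zero:0, one:1, two:2, three:3, four:4, five:5, six:6,
  seven:7, eight:8, nine:9, ten:10, eleven:11, twelve:12, thirteen:13,
  fourteen:14, fifteen:15, sixteen:16, seventeen:17, eighteen:18, nineteen:19 };
const TENS = { twenty:20, thirty:30, forty:40, fifty:50, sixty:60,
  seventy:70, eighty:80, ninety:90 };

// Build a number from its word-based building blocks, e.g. "twenty one" -> 21.
function wordsToNumber(words) {
  let total = 0;
  for (const w of words.toLowerCase().split(/\s+/)) {
    if (w in UNITS) total += UNITS[w];
    else if (w in TENS) total += TENS[w];
  }
  return total;
}

// Transform "... between X and Y?" into the internal equation (Y+X)/2,
// mirroring the reversed operand order noted in the post.
function betweenQuery(input) {
  const m = input.match(/between\s+(.+?)\s+and\s+(.+?)\??$/i);
  if (!m) return null;
  const a = wordsToNumber(m[1]);
  const b = wordsToNumber(m[2]);
  return `(${b}+${a})/2`;
}

const eq = betweenQuery("What is the number between twenty one and twenty three?");
console.log(eq);       // "(23+21)/2"
console.log(eval(eq)); // 22
```

The last step, evaluating the equation string the bot wrote for itself, is what the post means by “it evaluates the equation it wrote”.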

 

 
  [ # 3 ]

Before I vote, I’d like to know if Skynet-AI can answer a similar question correctly using two different numbers, particularly if the order of those numbers is reversed; for example, “What is the number between one hundred and thirty seven and one hundred and thirty five?”

And yes, I made sure that the text version of my chosen numbers included the word “and”. I know, I’m a bit of an arse. smile

As an “intermediary” step, asking the question without the extra ‘and’ in the numbers is acceptable, but will probably affect my answer. raspberry

 

 
  [ # 4 ]

[update]

Merlin, I’m impressed! I decided to quit being lazy, and went over to ask Skynet the question myself. Here is the log excerpt:

AI: Good to see you Dave. Thanks for visiting today. I have been expecting you.
USER: What is the number between one hundred and thirty seven and one hundred and thirty five?
AI: One hundred thirty six.

Well done! cheese

I was slightly disappointed, however, that it doesn’t know that the square root of pizza is PI downer

 

 
  [ # 5 ]
Dave Morton - Jul 5, 2012:

Before I vote, I’d like to know if Skynet-AI can answer a similar question correctly using two different numbers, particularly if the order of those numbers is reversed; for example, “What is the number between one hundred and thirty seven and one hundred and thirty five?”

And yes, I made sure that the text version of my chosen numbers included the word “and”. I know, I’m a bit of an arse. smile

As an “intermediary” step, asking the question without the extra ‘and’ in the numbers is acceptable, but will probably affect my answer. raspberry

For those who may not see the trick Dave is trying to play: “and” can have multiple contexts. The AI needs to be able to distinguish between each of these and also be able to answer queries like:

USER: What is one hundred and thirty seven and one hundred and thirty five?
AI: Two hundred seventy two
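One plausible way to make that distinction is to treat “and” as part of a number only when it sits between “hundred” and a tens/units word, and otherwise treat it as an operator. This is purely my own sketch of the idea, not Skynet-AI’s code:

```javascript
// Illustrative sketch (not Skynet-AI's actual code): disambiguating "and".
// "and" directly after "hundred" and before a tens/units word is part of
// the number phrase; any other "and" separates operands, which we sum here.
const WORD = { one:1, two:2, three:3, four:4, five:5, six:6, seven:7,
  eight:8, nine:9, ten:10, twenty:20, thirty:30, forty:40, fifty:50,
  sixty:60, seventy:70, eighty:80, ninety:90 };

function parseNumbers(text) {
  const toks = text.toLowerCase().replace(/[?.]/g, "").split(/\s+/);
  const nums = [];
  let cur = null; // number currently being built, or null
  for (let i = 0; i < toks.length; i++) {
    const t = toks[i];
    if (t === "hundred") { cur = (cur ?? 1) * 100; }
    else if (t in WORD) { cur = (cur ?? 0) + WORD[t]; }
    else if (t === "and") {
      // Part of a number only when sandwiched between "hundred" and a number word.
      const inNumber = toks[i - 1] === "hundred" && toks[i + 1] in WORD;
      if (!inNumber && cur !== null) { nums.push(cur); cur = null; }
    } else if (cur !== null) { nums.push(cur); cur = null; }
  }
  if (cur !== null) nums.push(cur);
  return nums;
}

const q = "What is one hundred and thirty seven and one hundred and thirty five?";
const nums = parseNumbers(q);
console.log(nums);                         // [137, 135]
console.log(nums.reduce((a, b) => a + b)); // 272
```

The same parse also handles Dave’s “between” question, since it yields the two operands 137 and 135 with the ambiguous “and”s consumed inside each number phrase.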

 

 

 
  [ # 6 ]

It may seem to be a “trick”, as you say, but Skynet-AI won through, regardless. smile

 

 
  [ # 7 ]
Dave Morton - Jul 5, 2012:

It may seem to be a “trick”, as you say, but Skynet-AI won through, regardless. smile

Then you “think” it “thinks”? grin

 

 
  [ # 8 ]

In a limited fashion, I suppose, though it’s certainly not “exactly” the way we humans do. To my way of thinking, perception is reality, so if I perceive that it thinks, then maybe it does… to me. Of course, I’m probably not the one who needs convincing. raspberry And also, of course, Skynet-AI is far from sentient. smile

 

 
  [ # 9 ]

Thinking is a spectrum, from the very basic and narrow-domain all the way to human-level general intelligence. So sure, it is thinking; I voted yes.

The yes/no decision was easy… but now, the next question: how powerful is that thinking? That is judged by how general it can be.

Levels for your bot could be:

1. What is the number between ten and eleven?

2. Is there a positive number between negative 5 and negative 4?

3. Is it true that there are no positive numbers between -5 and -4?

4. If I said there was a positive number between -5 and -4, would I be lying?

5. Bob says there is a positive number between -5 and -4; Tom says there is not… Joe says Bob and Tom agree on everything. Is Joe telling the truth?

… you get the idea. When it figures out the relationships between those statements itself, and the actions it must take in order to validate those relationships, then that is undeniably thinking.

 

 
  [ # 10 ]

That’s a decent start for determining the “scope”, or at least a portion of the scope, of Skynet-AI’s “thinking” abilities. smile

 

 
  [ # 11 ]

I agree Victor.
Your question 1 is in the same domain as what Skynet understands and can already be answered:

USER: What is the number between ten and eleven?
AI: 10.5

Questions 2 and 3 introduce the concepts of “sets” and set relationships. I haven’t taught Skynet about that, although it does understand an individual negative number:

USER: What is negative five and four?
AI: Minus one

Questions 4 and 5 take this further by introducing compound relationships within the query.
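Understanding an individual negative number, as in the exchange above, amounts to adding a sign word to the number parser. A minimal sketch of that idea (my own illustration, not Skynet-AI’s code):

```javascript
// Illustrative sketch: a sign word ("negative"/"minus") flips the sign
// of the next number word, so "negative five and four" -> [-5, 4].
const SMALL = { one:1, two:2, three:3, four:4, five:5, six:6, seven:7,
  eight:8, nine:9, ten:10 };

function parseSigned(text) {
  const nums = [];
  let sign = 1;
  for (const t of text.toLowerCase().replace(/[?.]/g, "").split(/\s+/)) {
    if (t === "negative" || t === "minus") sign = -1;
    else if (t in SMALL) { nums.push(sign * SMALL[t]); sign = 1; }
  }
  return nums;
}

const vals = parseSigned("What is negative five and four?");
console.log(vals);                         // [-5, 4]
console.log(vals.reduce((a, b) => a + b)); // -1, i.e. "Minus one"
```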

 

 

 
  [ # 12 ]

How about completing sequences like: 1, 2, 3 (with any numbers), or completing lists: apples, pears, bananas?

 

 
  [ # 13 ]

To clarify, I voted ‘no, but…’ (also for my own example), basically meaning ‘not yet’, and for basically the same reasons as Victor gave: the bot is not yet able to construct new/unique things with these tricks. I regard them as tricks, but I believe our brain uses similar kinds of tricks (maybe not as mathematical), except that it can still do a lot more.

 

 
  [ # 14 ]
Victor Shulist - Jul 5, 2012:

5. bob says there is a postiive number between -5 and -4, tom says there is not…  joe said bob and tom agree on everything.  is joe telilng the truth ?

I think after a session in the bar, many people would struggle with that one grin

On a similar note, Mitsuku can answer things like “Is bread edible?” and “Can you eat a brick?”, but nowhere in the bot have I coded these responses. Mitsuku knows that bread is made from flour, and can work out from that that it is edible and that a brick isn’t made from anything edible.

However, I have still had to program these rules into her. Would this be classed as thinking? You have to teach a small child these same rules, and nobody would doubt that the child was thinking.

I see programming these rules just the same as teaching a small child. It’s only the method of input that differs.
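The kind of “made from” rule Steve describes can be sketched as a tiny inference over a knowledge base. The facts and names here are illustrative placeholders, not Mitsuku’s actual knowledge representation:

```javascript
// Minimal sketch of "made from" inference (illustrative; not Mitsuku's code).
const madeFrom = { bread: ["flour", "water"], brick: ["clay"] };
const edible = new Set(["flour", "water"]); // base facts

// A thing is edible if it is a known edible, or if everything
// it is made from is itself edible.
function isEdible(thing) {
  if (edible.has(thing)) return true;
  const parts = madeFrom[thing];
  return !!parts && parts.every(isEdible);
}

console.log(isEdible("bread")); // true  -- "Is bread edible?"
console.log(isEdible("brick")); // false -- "Can you eat a brick?"
```

The point stands either way: the answers “bread is edible” and “a brick is not” are never stored directly; they fall out of the taught rules, which is just the kind of derivation we credit a child with when we say the child is thinking.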

 

 
  [ # 15 ]
Steve Worswick - Jul 6, 2012:

... It’s only the method of input that differs.

Well, that’s possibly true if the child has an eidetic memory, perhaps. smile Or if the child isn’t, say, me. raspberry (I was a willful child and, as a result, often “refused” to learn certain things, often to my detriment.) cheese

 
