
Using ChatScript on my mobile robot
 
 

Hello Everyone.

I’ve been building and programming mobile robots most of my life, and I have some questions about integrating ChatScript on board my current robot to give her some conversational abilities.

My current robot is Anna. She was developed on an electric wheelchair platform, and I’ve just recently moved her to a custom-built chassis. She’s somewhat autonomous, using speech recognition and TTS, plus a Kinect camera for object recognition and navigation. I’ve also taught her to harass my wife’s cat. lol. She’s also remote-controllable through internet-connected laptops and smartphones. She’s about three feet tall, weighs around 150 pounds at the moment, and has a runtime of about 8-10 hours (or 35 miles) between recharges.

The next phase of my work is going to be giving her conversational abilities. I have two young nieces who really want to be able to converse with her. I’m hoping that integrating some chatbot technology will make this possible.

The three most important goals for me (as far as the chatbot software goes) are: 1. Eventually I’d like the interactions to be via voice, but I can handle that in C++. 2. I’d like to be able to read/write external system commands (e.g. “What are your ping sensors showing?”) through the chat interface instead of WSRM. 3. I’d like Anna to be able to “learn” facts or do lookups from local documents and external websites.

I’ve been doing serious research on Bruce’s ChatScript, and I really like it. Can anyone offer feedback or experiences on whether ChatScript will do what I’m hoping for?

If you’ve done it, I’d also love to see some sample syntax for accessing the internet (like a Google lookup). My motor control and sensor systems work through a USB COM port on a Windows-based laptop. I’d also like some ideas on how CS might be able to read/write to it. Do I need to work from a file (export/import), or can I read the port as if it were a file?

And one last question… I see from the CS documentation that it can read/process text files, e.g. mobydick.txt. I issue the command to read the document, and it seems to do so. But what does it do with what it reads? I can’t get anything back out of it after reading the document. Do I need to set up a topic file for the document it reads? The simple question is: what exactly does CS do with the text file it reads? Does it retain anything, and if so, how do I access it?

Any feedback would be most appreciated. I’m still very much in the research phase before deciding which direction to go with this project, so please accept my apologies in advance if these questions are simple or mundane. I promise my questions will get more difficult as I learn more.

John

PS: I’m currently working on building her Bio, as Bruce suggests. Any advice on that would be welcome, as well. I have a very firm idea of her personality (based on the computer character in Society of the Mind; not the Minsky book, but the novel by Eric L. Harry).

 

 
  [ # 1 ]

You should read the ChatScript External Communications document, which tells you how to make calls to System, TCPOpen, and Popen, and how to embed CS in a local app, although it does not currently support https protocols.
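
Roughly, the script side looks something like this (an untested sketch; the topic, rule labels, and command strings are just placeholders, so check the External Communications manual for the exact argument forms):

topic: ~EXTERNAL_DEMO keep repeat ()

# run an OS command and ignore its output
u: RUNBAT ( run the batch file ) ^system( "mybatch.bat" ) Done.

# run an OS command and hand each line of its output to a script function
u: DIRLIST ( list the directory ) ^popen( "dir" '^ShowLine )

outputmacro: ^ShowLine ( ^line )
    ^line \n

# ^tcpopen is the web-facing equivalent (see the manual for its argument order); remember it cannot do https.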

Document reading mode acts like a bot that reads the whole document as a single volley. That is meaningful only if you write a bot to do something specific with the information as it passes by. The NLTK bot that you can build in ChatScript supports a couple of commands; you can read its documentation for how to use them. Normally it just prints out stuff, but one could modify it to use ^log to write stuff to a file other than the default user log file.
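
As a crude illustration (the topic, label, and variable names here are mine, not from the shipped bots), a keep/repeat rule can harvest something as the document streams past, and a second rule can report it in normal chat afterwards:

topic: ~DOCUMENT_NOTES keep repeat ()

# in document mode every sentence of mobydick.txt arrives as input,
# so this rule can fire once per sentence that mentions a whale
u: SEENWHALE ( whale )
    if ( !$whalecount ) { $whalecount = 0 }
    $whalecount += 1

# later, in ordinary conversation
u: ASKWHALE ( how many * whale ) I counted $whalecount sentences mentioning whales.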

 

 
  [ # 2 ]

Bruce,

First and foremost I’d like to thank you for making this program open source. I love it. I truly appreciate your sharing your work this way. I also see how active you are in various places in offering support. Kudos to you! Thank you for that.

I have printed ALL the docs and I’m currently muddling my way through them as I have time in my insane schedule. Your explanation of document mode certainly helps point me in the right direction as to how I need to write the rules regarding that and a few other things.

I’ve been playing around with the System calls, etc., and have that portion able to call a directory listing or a batch file. I’m still trying to get my head around the syntax… I just need more practice time, I guess.

I was also going down the bunny trail of embedding ChatScript within my system and just issuing a call to it when I need it. The fact that you mentioned that as well tells me I’m going down the right trail.

I appreciate your time and your response, Bruce. It’s an honor to correspond directly with you. Thank you.

John

 

 
  [ # 3 ]

OK. So if you embed ChatScript in your app, then you can use whatever voice-to-text library you like to create text for CS. And if you use the out-of-band (OOB) syntax for passing commands from CS to your bot, then you could voice in “what are your ping sensors showing”, have CS put something like [ping-status] Checking… as output; your system then tells the user “Checking…” and does its ping-status thing, presumably displaying the result to the user as well. Of course you can use OOB syntax to send back any command, so NOTHING needs to be done from CS in talking to the net or OS if your calling app can do those things.
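
For example, just as a sketch (the [ping-status] tag is whatever you and your C++ side agree on; the brackets are escaped so CS doesn’t read them as a choice list):

topic: ~ROBOT_COMMANDS keep repeat ()

u: PINGSTATUS ( what are your ping sensors showing )
    \[ ping-status \] Checking...

Your embedding app then peels the leading [ ping-status ] off the output, does the sensor read, and hands the remaining “Checking...” to your TTS.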

 

 
  [ # 4 ]

Beautiful!!! Thanks for the follow-up post. I’m loving the potential more and more.

This is exactly the kind of feedback/direction I was hoping for when I asked my questions.

 

 
  [ # 5 ]

I can sympathize with the fact that there is a lot of documentation to read.  When you find something you want to try and have questions about it, feel free to ask away.

 

 
  [ # 6 ]

Once I get up to speed on ChatScript, I’d be happy to help you with it wherever I can. Documentation, testing, etc. That will be my way of saying thank you for releasing it as open source.

Geographically, I’m very close to OSU. What I read of their implementation for a Virtual Patient was very interesting. They are very well respected here (in many aspects). If they’ve changed to using your system, that says quite a bit to me.

 

 
  [ # 7 ]

Their problem with AIML was that as their system got bigger, it became more and more unmanageable. It slowed their server down tremendously, and it was getting nearly impossible to add patients and questions without breaking something that was already working. AIML is easier to get started with, so it makes a great intro system. But ChatScript is targeted at things beyond chatbots, where speed and maintainability are more critical. Hence the :verify system, for example, for automatic testing.
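
For example (a made-up fragment, not from their system): you put a sample input on a #! line above each rule, and :verify replays those samples to confirm every rule still matches after edits:

topic: ~PATIENT_INTAKE keep repeat ()

#! what symptoms do you have
u: SYMPTOMS ( what symptoms ) I have a headache and a mild fever.

#! when did the pain start
u: ONSET ( when did * pain start ) It started two days ago.

Running :verify ~PATIENT_INTAKE flags any #! input that no longer reaches its rule.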

 

 
  [ # 8 ]

Re: targeted to things beyond chatbots…I couldn’t agree more. Although my mobile robot project is certainly not mainstream usage, I had an interesting thing happen last week.

When I was talking to a local hospital administrator (not OSU, lol) about what I’m trying to use ChatScript for, they asked about using it as a training system to help administrative employees navigate their various info and telephone systems. They want to talk more about that once I have some more depth and can show some simple examples.

 

 
  [ # 9 ]

There is a British robotics company that is interested in us building them a “receptionist” bot as an application for their bots. Also, you could point your hospital administrator at http://www.figaroavatars.com/, which does ChatScript work for the health care industry.

 

 
  [ # 10 ]

Receptionist bot is a great idea. Patient self-registration was a topic mentioned in my discussions last week. Thank you very much for the link. I will pass it along.

 

 
  [ # 11 ]

Also, AISoyRobotics in Spain has been embedding ChatScript in their bots, so you can voice chat with a bot, ask it status questions, and command it. The fuller list of things ChatScript has been involved in is on the projects page of brilligunderstanding.com.

 

 
  [ # 12 ]

I’ve been to your website many times. It’s one of the things that made me consider ChatScript.

I’ve never come across AISoyRobotics…thank you! I’ll look them up.

 

 
  [ # 13 ]

The Healthcare Common Procedure Coding System has a code, K0011, for “standard weight frame motorized/power wheelchair with programmable control parameters for speed adjustment, tremor dampening, acceleration control and braking.” John, has it occurred to you that your electric wheelchair robot may be considered a motorized wheelchair, like the iBOT, which might then be covered under the standard benefit?

 

 
  [ # 14 ]

Pla Net,

I’m in the USA, but oddly enough, that’s how this version of Anna first got started.

Someone contracted me to develop a way to stop an electric wheelchair user from inadvertently backing down stairways and other drop-offs.

Anna became my testbed for reverse-engineering the chair electronics. A few ping sensors, an Arduino, and many man-hours later, I was able to give them a reliable system of cliff sensors (and object avoidance) that would detect a drop-off and sound an increasingly urgent alarm to alert the user before going over it. If the user got critically close, it could stop the motors, apply the brakes, and put the chair into fault mode until they pressed an override button.

Once I got their cliff sensors installed and working for them, just for fun I decided to hack a Kinect onto my testbed to add some vision and use its microphone array for voice recognition, and Anna was born.

I’m currently in the slow process of getting this system prototype to be plug-in compatible with various wheelchair manufacturers’ systems; they all use their own proprietary communication protocols. Once it’s ready for market, however, my cost should only be about $100 or a little less per end-user unit.

 

 
  [ # 15 ]

One thing to consider with AIML, RiveScript, or ChatScript is that they usually have giant knowledgebases of thousands of records. Perhaps a smaller knowledgebase with a few hundred records, like Elizabot, would give the patient more control?

AIML:

Patient: Wheelchair
Wheelchair: Yes, Patient.

Patient: Come here.
Wheelchair: Coming to you now.

Patient: Rotate to face me.
Wheelchair: Rotating to face you now.

Patient: Apply safety brakes
Wheelchair: Applying safety brakes now.

Patient: What room are we in?
Wheelchair: Your bedroom, Patient.

Patient: Go downstairs.
Wheelchair: No. I am not programmed to do that.

Patient: Let’s go to the kitchen then.
Wheelchair: Of course, Patient.

 

RiveScript:

Patient: Wheelchair
Wheelchair: Hello, Patient.

Patient: Come here.
Wheelchair: Coming to you now.

Patient: Rotate to face me.
Wheelchair: Rotating to you now.

Patient: Apply safety brakes
Wheelchair: Safety brakes are applied.

Patient: What room are we in?
Wheelchair: The bedroom, Patient.

Patient: Go downstairs.
Wheelchair: Sorry, I can’t do that.

Patient: Let’s go to the kitchen then.
Wheelchair: I can do that, Patient.

 

ChatScript:

Wheelchair: May I help you, Patient?
Patient: Yes, Wheelchair.

Wheelchair: OK, may I come to you now and rotate to face you?
Patient: Yes, Wheelchair.

Wheelchair: I’ve applied the safety brakes, Do you know what room we are in?
Patient: Bedroom

Wheelchair: Yes, Patient. Which room would you like to go to now?
Patient: Downstairs.

Wheelchair: No, I am not programmed to go downstairs. Please choose somewhere else.
Patient: Let’s go to the kitchen then.

Wheelchair: Yes, Patient.

 

★ Elizabot:

Patient: Wheelchair
Wheelchair: (beep) Ready.

Patient: Come here.
Wheelchair: (beep) Coming.

Patient: Rotate to face me.
Wheelchair: (beep) Rotating.

Patient: Apply safety brakes
Wheelchair: (beep) Braking.

Patient: What room are we in?
Wheelchair: (beep) bedroom.

Patient: Go downstairs.
Wheelchair: (alarm) OK
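
For what it’s worth, a few of the wheelchair commands above might be scripted in ChatScript roughly like this (a sketch only; the OOB tag names and rule labels are invented, with the brackets escaped so they pass through as literal text for the robot controller to act on):

topic: ~WHEELCHAIR keep repeat ()

u: COMEHERE ( come here ) \[ drive-to-user \] Coming to you now.

u: ROTATE ( rotate to face me ) \[ rotate-to-user \] Rotating to face you now.

u: BRAKES ( apply * brakes ) \[ set-brakes on \] Applying safety brakes now.

u: STAIRS ( go downstairs ) No, I am not programmed to do that.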

 

 

 
