I’ve always wanted to build a bigger robot. Something that could run around the house, interact in a human-sized environment, and, who knows, maybe even do something useful someday, although that is not a requirement. Mostly, I wanted to get away from robots like my firefighters, where every move was preprogrammed. I wanted a robot that decided for itself what to do and maybe even had some personality and learning capability.
About a year ago, Bruce Weimer, a member of our robot club (RSSC.org), gave a presentation and demonstration of some artificial-life software he had written, inspired by the computer game Creatures. He had a simulated critter called “Leaf” which could speak, do voice recognition, and do some simple visual recognition using a webcam. As he explained his plans for the program, I realized that this was the kind of functionality I wanted to build into my new robot.
I proposed to build a hardware mobile robot to host his software, and he was enthusiastic about the idea. At about the same time, Robin Hewitt expressed an interest in working on the project, specializing in vision software. Well, there is more than enough work for three people (or maybe even 30), so we became a team. Over the last year, we have built one of these robots for each of us. Rocky was the prototype and is mine. Robin has an almost identical model called Mabel, and Bruce is building a larger model with more of a Star Wars look, which will carry the original name of Leaf.
We have found that our software interests fit well together, with interfaces simple enough that we can develop software independently; yet, amazingly, it almost always works right away when we integrate our programs. Bruce continues to build the artificial-intelligence part, including the speech software; Robin is specializing in all the vision aspects, including object recognition and navigation guidance based on visual cues; and I am providing the microcontroller code for motor control and sensor I/O, as well as the basic navigation and control software on the laptop.
We have a separate website for the Leaf Project which you may link to here. Our goal is to provide open source software and hardware design so that anyone can just copy what we have to get started.
Come check it out. Build one! So far, the Leaf robots are proving to be a capable hardware platform and the software functionality is coming along well and is great fun.
Here is Rocky’s face. It is just a simple prototype, but it is capable of changing expression as directed by emotion data from Bruce’s AI software. The face is made up entirely of bitmaps, so someday I just have to create some better-looking ones! Below the face is a scrolling text display for status output and troubleshooting.
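The expression switching amounts to a lookup from the AI’s emotion output to a bitmap. A minimal sketch of the idea follows; the emotion labels, file names, and function name are hypothetical illustrations, not the actual interface to Bruce’s software.

```python
# Hypothetical sketch: map an emotion label from the AI software to a
# face bitmap. The labels and file names here are illustrative only.
FACE_BITMAPS = {
    "happy": "face_happy.bmp",
    "sad": "face_sad.bmp",
    "surprised": "face_surprised.bmp",
    "neutral": "face_neutral.bmp",
}

def select_face(emotion: str) -> str:
    """Return the bitmap file for an emotion, defaulting to neutral."""
    return FACE_BITMAPS.get(emotion, FACE_BITMAPS["neutral"])
```

Keeping the mapping in a table means new expressions can be added just by drawing another bitmap and adding one entry.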
This shelf holds components associated with the laptop. The laptop has two USB ports: one is connected directly to the webcam, and the other goes to the hub on the left of the picture. The hub is shown with a wireless adapter connected. The black box is a Devantech CMPS03 compass.
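Per its datasheet, the CMPS03 reports heading over I2C, with registers 2 and 3 holding the bearing as a 16-bit value in tenths of a degree (0–3599). The bus read itself is hardware-specific and omitted here; this sketch shows only the byte-to-degrees conversion.

```python
def cmps03_bearing_degrees(high_byte: int, low_byte: int) -> float:
    """Convert the CMPS03's two bearing registers (a 16-bit value in
    tenths of a degree, 0-3599) into degrees, 0.0-359.9."""
    raw = (high_byte << 8) | low_byte
    return (raw % 3600) / 10.0
```

For coarser work, the module also provides the bearing as a single byte (0–255) in register 1, which avoids the two-register read entirely.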
Just below is the microcontroller shelf, using the same Motorola HC9S12DP256 processor I used on my last firefighting robots. The processor card is mounted on a motherboard that brings out all the I/O to modular telephone connectors, which are proving to be very reliable and easy to use. The motherboard also contains a dual-axis accelerometer, a rate gyro, and a parallel-to-USB adapter.
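A common way to combine an accelerometer and a rate gyro is a complementary filter: integrate the gyro rate for a smooth short-term angle, and blend in the accelerometer’s tilt estimate to cancel gyro drift. The sketch below shows that standard technique only; it is an assumption for illustration, not necessarily what Rocky’s firmware does, and the `alpha` value is a typical placeholder.

```python
def complementary_filter(angle: float, gyro_rate: float,
                         accel_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One update step of a standard complementary filter.

    angle       -- previous tilt estimate, degrees
    gyro_rate   -- rate gyro reading, degrees/second
    accel_angle -- tilt derived from the accelerometer, degrees
    dt          -- time since last update, seconds
    alpha       -- blend factor; higher trusts the gyro more
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Called at a fixed rate on the microcontroller, this gives a tilt estimate that follows fast motion (gyro) without wandering over time (accelerometer correction).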