Wednesday, September 29, 2004

Very interesting custom sensor package in the Odessey robot

One robot builder I spent a lot of time talking to was Ted Larson, who entered a robot named Odessey in Robo-Magellan. I was very interested in Ted's work on his sensor systems, especially his gyroscope- and accelerometer-based inertial navigation sensor. Ted also has a couple of good pictures of the contest.

Monday, September 27, 2004

A very cool bump sensor

I'm really intrigued by the bump sensor created for the Merriwether robot in the Robo-Magellan contest. These guys came up with quite an innovative solution for getting a good 180° of collision coverage.

A bit of sample code

One of the reasons I have been pursuing Windows XP Embedded and the .NET Framework 2.0 as the brain of Cylon is to be able to build on top of a high-level programming language that gives me much more power to write detailed algorithms. Here are two example methods from the CylonControl class that should give you an idea what it looks like to write robot control code in C#:

private void _Track()
{
    if (!_Tracking) return;

    Double delta = 3 * _HeadingDelta(_TrackHeading, _Heading);

    // Govern down the turns so they aren't too severe
    if (delta < -120) delta = -120;
    if (delta > 120) delta = 120;

    Steering = (SByte)delta;

    if (_StopWhenTurnComplete)
    {
        if (Math.Abs(delta) < 15)
        {
            _Tracking = false;
            _Status = "Turn complete.";
        }
    }
}

private void _UpdateTelemetry()
{
    while (!_Disposed)
    {
        lock (this)
        {
            lock (brainstem)
            {
                _Analog0 = (Int16)brainstem.Analog_ReadInt(0);
                _Analog1 = (Int16)brainstem.Analog_ReadInt(1);
                _Analog2 = (Int16)brainstem.Analog_ReadInt(2);
                _Heading = brainstem.CMPS03_GetHeading(Degrees);
            }
        }
    }
}
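The _HeadingDelta helper that the tracking loop calls isn't shown here. The sketch below is a guess at its shape (the name comes from the code above, but the body is purely illustrative): it returns the shortest signed turn, in degrees, from the current heading to the target.

```csharp
using System;

class HeadingMath
{
    // Illustrative sketch of a _HeadingDelta-style helper (the real
    // implementation isn't shown in the post). Returns the shortest
    // signed turn, in degrees, from `current` to `target`, in the
    // range (-180, 180].
    public static Double HeadingDelta(Double target, Double current)
    {
        Double delta = (target - current) % 360;
        if (delta > 180) delta -= 360;
        if (delta <= -180) delta += 360;
        return delta;
    }
}
```

With a gain of 3 and a clamp at ±120 like _Track uses, a heading error of 40° or more already commands a full-rate turn.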



Now what may not be evident here is just how much simpler you can make your robot logic once you have a powerful CPU & operating system that can use techniques such as multiple threads, locking, etc. One of the things I hope to find out over the next few years is just how much easier it is to write complex behavior mechanisms now that I don't have to rely upon state machines and complex time-sharing logic due to the limitations of a small CPU.

Operating-system-provided services such as networking, threads, and asynchronous I/O should bring to robotics the ability to develop much more sophisticated programs. For instance, my UI, telemetry, and tracking methods all run on different threads, simplifying what each method has to do and allowing me to get much more efficient use of resources such as the slower RS-232 link between the Brainstem and the Mini-ITX PC motherboard. A simple locking mechanism keeps the threads from trying to access the same resource simultaneously.
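As a toy illustration of that pattern (not Cylon's actual classes), a background telemetry thread can update shared state under a lock while other threads read it safely:

```csharp
using System;
using System.Threading;

class TelemetrySketch
{
    // Hypothetical shared state, protected by _lock.
    private readonly object _lock = new object();
    private int _heading;
    private volatile bool _disposed;

    public void Start()
    {
        // Telemetry runs on its own thread, the same shape as Cylon's
        // _UpdateTelemetry loop; all names here are illustrative.
        var t = new Thread(() =>
        {
            while (!_disposed)
            {
                lock (_lock)
                {
                    _heading = ReadCompass();   // stand-in for the real sensor read
                }
                Thread.Sleep(10);
            }
        });
        t.IsBackground = true;
        t.Start();
    }

    // Readers take the same lock, so they never see a torn update.
    public int Heading { get { lock (_lock) return _heading; } }

    public void Dispose() { _disposed = true; }

    private int ReadCompass() { return 42; }    // placeholder sensor value
}
```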

In a future article I'll go into some detail about the threading model of Cylon and how the various components interact between each other and the hardware.

Screenshots of Cylon's User Interface

Cylon has two major UI components.

GpsViewer, for finding, averaging, and plotting course waypoints (click to enlarge):

CylonUI, for viewing the current state and telemetry of the robot while it is in operation:

Uh, obviously I'm an API developer, not a UI guy. My design skills are a bit, ah, primitive.

One cool thing I did with GpsViewer is use the Tablet PC SDK to ink-enable the app. You can draw on the course image with the waypoints to point out obstacles or debate course strategies with your team members, and I use an ink-enabled textbox so you can label the GPS waypoints in your own handwriting as you walk about the course, rather than trying to type descriptions on a keyboard while walking.

Robo-Magellan Waypoints at the Seattle Center

On the SRS mailing list, the topic came up of what the GPS lat/long coordinates of the contest were. I saved my coordinates so I could come back and re-run the race later, so here they are:

The three midpoints are each in an area that has a clear, straight-line shot from the previous waypoint. Each midpoint is also out in the open; my intention was to use dead reckoning between waypoints, then pause at each waypoint and recalibrate my position estimate using GPS before continuing on to the next.

The path from Mid 3 to End is very cluttered due to the layout of the course. Mid 3 was about as close as I could get to the end point without having to deal with tight path constraints due to obstacles. Navigating from Mid 3 to End could theoretically be done via dead reckoning, but serious obstacle avoidance would be needed in all but the luckiest of runs.

Start 47.620593, -118.315947
Mid 1 47.620706, -118.315652
Mid 2 47.620701, -118.315385
Mid 3 47.620664, -118.315090
End 47.620537, -118.314920
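For dead reckoning between waypoints like these, the robot needs each leg's length and bearing. At this scale a flat-earth (equirectangular) approximation is plenty accurate; the sketch below is my own illustration, not Cylon's code:

```csharp
using System;

class WaypointMath
{
    // One degree of latitude is roughly 111,320 m everywhere; a degree of
    // longitude shrinks by cos(latitude).
    const Double MetersPerDegreeLat = 111320.0;

    public static Double DistanceMeters(Double lat1, Double lon1, Double lat2, Double lon2)
    {
        Double north = (lat2 - lat1) * MetersPerDegreeLat;
        Double east = (lon2 - lon1) * MetersPerDegreeLat * Math.Cos(lat1 * Math.PI / 180);
        return Math.Sqrt(north * north + east * east);
    }

    public static Double BearingDegrees(Double lat1, Double lon1, Double lat2, Double lon2)
    {
        Double north = (lat2 - lat1) * MetersPerDegreeLat;
        Double east = (lon2 - lon1) * MetersPerDegreeLat * Math.Cos(lat1 * Math.PI / 180);
        // Atan2(east, north) gives 0° = north, 90° = east, like a compass.
        Double bearing = Math.Atan2(east, north) * 180 / Math.PI;
        return (bearing + 360) % 360;
    }
}
```

Run against the Start and Mid 1 points above, this gives a leg of roughly 25 m at a bearing of about 60°.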

Sunday, September 26, 2004

"So you said this was a software project but all you've posted on to date is hardware?"

OK, let's look at the software components of Cylon:


Brainstem.Net: a Managed C++ recompilation of the library for the Brainstem GP board, which is used to do all the sensor & motor I/O.


GpsReader: a library that talks to a GPS unit over RS-232 and converts the GPS's output into managed objects (GpsPoint objects, GpsFix events, and so on). It understands the NMEA-0183 protocol, which is commonly used by consumer GPSes. It has been tested with the DeLorme Earthmate GPS.
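NMEA-0183 sentences are plain comma-separated ASCII, which makes them easy to pick apart. As a rough sketch (not GpsReader's actual code, and with checksum validation omitted), extracting a position from a $GPGGA sentence looks like this:

```csharp
using System;
using System.Globalization;

class NmeaSketch
{
    // Parse latitude/longitude from a $GPGGA sentence. NMEA encodes
    // position as ddmm.mmmm (degrees and decimal minutes), so 4737.236,N
    // means 47° 37.236' north. Checksum validation is omitted here.
    public static void ParseGga(String sentence, out Double lat, out Double lon)
    {
        String[] f = sentence.Split(',');
        lat = ToDegrees(f[2]);              // field 2/3: latitude, N/S
        if (f[3] == "S") lat = -lat;
        lon = ToDegrees(f[4]);              // field 4/5: longitude, E/W
        if (f[5] == "W") lon = -lon;
    }

    private static Double ToDegrees(String ddmm)
    {
        Double value = Double.Parse(ddmm, CultureInfo.InvariantCulture);
        Double degrees = Math.Floor(value / 100);
        return degrees + (value - degrees * 100) / 60;
    }
}
```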


VideoCapture: a Managed C++ library built on top of DirectShow that exposes a standard webcam as high-level objects. This allows you to get RGB24 bitmaps out of the web camera so you can process the images in any way you want from C# or other .NET languages. It has been tested with the Logitech QuickCam 3000 Pro & 4000 Pro.


A C# library that takes the bitmaps from VideoCapture and performs a color pattern recognition search. The output of this library is along the lines of "there is a cone at 10 degrees to the left, about five feet out." This library was written by my wife, Tracy Beavers, who is into the AI and recognition aspects of robotics.
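Tracy's recognition code isn't shown here, but the simple color-detection approach most entrants use can be sketched like this (thresholds, field of view, and all names are purely illustrative):

```csharp
using System;

class ConeFinder
{
    // Very simplified sketch of color-blob cone detection: scan an RGB24
    // buffer for "orange-ish" pixels and convert the centroid column into a
    // bearing, given the camera's horizontal field of view.
    public static Boolean FindCone(Byte[] rgb24, Int32 width, Int32 height,
                                   Double fovDegrees, out Double bearingDegrees)
    {
        Int64 sumX = 0, count = 0;
        for (Int32 y = 0; y < height; y++)
        {
            for (Int32 x = 0; x < width; x++)
            {
                Int32 i = (y * width + x) * 3;               // RGB24: 3 bytes per pixel
                Byte r = rgb24[i], g = rgb24[i + 1], b = rgb24[i + 2];
                if (r > 180 && g > 60 && g < 160 && b < 80)  // crude "orange" test
                {
                    sumX += x;
                    count++;
                }
            }
        }
        if (count < 20) { bearingDegrees = 0; return false; } // too few pixels: no cone
        Double centroid = (Double)sumX / count;
        // Map column 0..width onto -fov/2..+fov/2 (negative = left of center).
        bearingDegrees = (centroid / width - 0.5) * fovDegrees;
        return true;
    }
}
```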


CylonControl: the "central nervous system" of the robot. This runs on the robot and pulls together all the libraries above. It runs the robot control loop ("follow this heading," "turn towards this waypoint") and exposes TelemetryData, which is used to display the state of the robot, including what actions are being taken, what all the sensor and motor settings are, etc.


CylonHost: a lightweight application that runs on the robot. It creates the CylonControl object, shares it out to the network via TcpRemoting, and runs a simple UI on the robot to display a subset of the TelemetryData.


CylonUI: the central control UI for the robot, which runs on a laptop. It connects over the network to CylonControl via TcpRemoting & CylonHost and is used to view TelemetryData and issue commands like "Start" and "Stop." It doesn't actually tell the robot when to turn or move; it is just the UI a person uses to monitor the progress of the robot.


GpsViewer: a waypoint-calculating application that runs on the laptop. Before the robot is started, I attach the GPS to my laptop and walk the course. This application reads the GPS, averages the readings using a Kalman filter to get more precise waypoints, and then stores an ordered list of waypoints. These waypoints are downloaded to CylonControl, which uses them to navigate the course. Eventually I will integrate CylonUI and GpsViewer into one application for simplicity.
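For a receiver that's standing still, a Kalman filter with independent, equal-variance fixes reduces to a running average: the gain on the n-th fix is 1/n. Here's a minimal sketch of that degenerate case (not GpsViewer's actual filter):

```csharp
using System;

class WaypointAverager
{
    // For a stationary GPS antenna with roughly independent, equal-variance
    // fixes, the Kalman update collapses to a running mean: the gain on the
    // n-th measurement is 1/n.
    private Double _lat, _lon;
    private Int32 _n;

    public void AddFix(Double lat, Double lon)
    {
        _n++;
        Double gain = 1.0 / _n;          // Kalman gain for the static case
        _lat += gain * (lat - _lat);     // estimate += gain * innovation
        _lon += gain * (lon - _lon);
    }

    public Double Latitude { get { return _lat; } }
    public Double Longitude { get { return _lon; } }
}
```

Standing at a waypoint for a minute or two and feeding every fix through AddFix smooths out much of the meter-scale jitter in consumer GPS readings.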

As you can see, there was quite a bit of programming involved in the development of Cylon. I'm planning on making binaries of some of these libraries available (GpsReader, VideoCapture, Brainstem.Net) in the near future.

A few pictures of Cylon

Cylon's Electronics (Click to enlarge):

Cylon's Chassis (Click to enlarge):

Saturday, September 25, 2004

Today Cylon competed in the Seattle Robotics Society's Robo-Magellan contest

After many months of ramp-up today was the day. We charged up the batteries, packaged up the critical triage equipment in case of failure, and hiked the whole family out to the Seattle Center for the Seattle Robotics Society's Robothon conference. Once there, Cylon joined the lineup for the Robo-Magellan contest.

The goal was to cross about 100 meters of varied city park-like terrain (grass, rolling hills, sidewalks, benches, trees, trash-cans) and find and touch an orange traffic cone. The coolness of this contest is in the number of challenges you have to simultaneously solve in order to win. You have to:

Navigate 100+ m of distance via multiple waypoints and arrive within 3 m of your destination. This, for all intents and purposes, requires GPS because it is highly unlikely that you could get 3 m accuracy after 100 m of crossing this varied terrain with just dead-reckoning.

Because GPS is good for finding your approximate location over long distances but horrible at telling you your exact location and heading (it's good to ±5 m or so, ±5 degrees or so), you really should have some form of inertial navigation (compass/gyroscope plus dead reckoning) to allow you to move with reasonable accuracy 3-5 m between waypoints. You need enough accuracy to be able to dodge trees, benches, etc., and you just can't get that from a GPS unit.

The traffic cone is in a particularly cluttered area of the park -- trees, benches, concrete pillars, trash cans, etc. surround the cone. In some places, there is less than a meter of clearance. To avoid these obstacles, you need collision avoidance sensors like whiskers or IR/sonar range finders.

To find and touch the traffic cone in the middle of the clutter, you need to be able to use some form of vision recognition system. Most people go with a simple system that relies on color detection -- if you see a large number of orange pixels to the left, you assume that that is the direction of the traffic cone.

To win this challenge a robot not only needs to be able to handle varied terrain and outdoor conditions, it needs to integrate all these sensors together. Getting it working properly on one robot is a heck of a challenge.
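The dead-reckoning piece of this, advancing a position estimate from a compass heading and a distance traveled, is simple enough to sketch (illustrative code, not any contestant's):

```csharp
using System;

class DeadReckoning
{
    // Position estimate in meters relative to the start point.
    public Double East, North;

    // Given a compass heading (degrees, 0 = north, 90 = east) and a distance
    // traveled (meters, e.g. from wheel odometry), advance the estimate.
    public void Advance(Double headingDegrees, Double meters)
    {
        Double rad = headingDegrees * Math.PI / 180;
        East += meters * Math.Sin(rad);
        North += meters * Math.Cos(rad);
    }
}
```

The catch, of course, is that wheel slip on grass and compass error both accumulate, which is why recalibrating against GPS at each waypoint matters.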

So, how did Cylon do? Well, poorly. I was, ah, a bit rushed assembling the final configuration. Bugs, a hardware failure here and there, you know the drill. I ended up not getting any time to test the robot at the actual contest site before the race. This proved to be a fatal problem.

I chose to use 802.11b wireless as my fail-safe switch. Every robot in the contest must have a fail-safe mechanism so it can be quickly turned off in case it goes wild and starts chasing a six-year-old around the park. This mechanism should stop working if there is a "lack" of something, similar to the lever you have to hold down on a lawn mower to keep the engine running. The theory is that for a potentially dangerous machine, if the person or thing controlling it stops doing something, it automatically shuts off, or "fails safely." In my case, if my laptop computer stopped sending a "heartbeat" signal every 300 ms, the robot would shut itself off.
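The heartbeat logic itself is tiny; here's a sketch of the idea (illustrative, with the current time passed in explicitly so the logic is easy to test -- the real robot would use the clock):

```csharp
using System;

class HeartbeatWatchdog
{
    // The robot only keeps running while heartbeats keep arriving within
    // the timeout (300 ms in Cylon's case).
    private readonly TimeSpan _timeout;
    private DateTime _lastBeat;

    public HeartbeatWatchdog(TimeSpan timeout, DateTime now)
    {
        _timeout = timeout;
        _lastBeat = now;
    }

    // Called each time a heartbeat packet arrives from the laptop.
    public void Beat(DateTime now) { _lastBeat = now; }

    // If this ever returns false, the motors should be shut off.
    public Boolean IsAlive(DateTime now) { return now - _lastBeat <= _timeout; }
}
```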

Something strange was in the air this morning. For some reason my robot and my laptop, even though they were only 2 m from each other, kept losing the 802.11b signal between them a couple of times per minute. At the starting line Cylon started up, drove a meter forward, lost the 802.11b signal, and shut off -- kind of anti-climactic for a passionate project composed of months of late nights. But heck, in the last robotics competition I entered, I was soundly beaten by a 10-year-old boy with a wicked Mindstorms creation. :-)

Let me introduce myself & my robot

I'm Jay Beavers. I'm owned by my robot, Cylon. Cylon has been working on me for the past year or so.

Before my life was enslaved by Cylon, I played around with smaller, less costly, less possessive robots. I started down this dark path with Lego Mindstorms -- I purchased one of the first Mindstorms kits from a FAO Schwarz store in Seattle the first day they were out. From there, I fell into harder stuff. I moved to a CMU Palm Pilot Robotic Kit from Acroname. Then up to the Brainstem (again from Acroname). At that point it became a compulsive habit. A little dabble with the Handyboard, Cricket, and OOPic here and there. But I was still in control. I could stop any time I wanted. I just didn't want to. Then, hard times. Yes, I happened on the Via Mini-ITX x86 motherboards and Windows XP Embedded. And now my robot Cylon owns me.

I'm Jay Beavers and I'm a roboholic.