
Friday, May 04, 2007

DARPA Urban Challenge: Unveiling our algorithm, Part Deux, Sensor and Actuator basics.


In a previous entry, I mentioned that we would be unveiling our algorithm. Before we do this, I will talk about the different sensors we have and use.
  • GPS: Since the race is mostly a mapping between the vehicle's knowledge of its surroundings and a set of maps given by DARPA with GPS coordinates (see Route Network Definition File (RNDF) and Mission Data File (MDF)), we need a GPS. We do not have a DGPS, just a normal GPS chip that we interface with over RS-232 at a rate of 1 Hz. To be specific, it is a Garmin GPS 18 PC, OEM system, with WAAS.

  • IMU: We also have a Microstrain MEMS IMU (3DM-GX1) that provides acceleration, turning rate, and heading. It provides information at 100 Hz.
  • Vision system: We use a series of webcams and a set of Unibrain FireWire cameras. Right now our frame rate is about 15 Hz.

All these sensors interface with the Python program through an RS-232 channel. In terms of sophistication, it just does not get any better for us. One of the underlying reasons is the realization that gathering data is one thing; using it efficiently is a totally different one. In particular, there are instances where reading and processing information from the IMU is simply not of interest.
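
To give an idea of how simple that serial path is, here is a minimal sketch of what polling the GPS for a position fix could look like, assuming pyserial and the default 4800 baud NMEA output. The port name and the parsing details are illustrative assumptions, not our actual code.

import serial

def read_gps_fix(port="/dev/ttyS0", baud=4800):
    """Block until a valid $GPRMC sentence arrives; return (lat, lon) in degrees."""
    with serial.Serial(port, baud, timeout=2) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if not line.startswith("$GPRMC"):
                continue
            fields = line.split(",")
            if len(fields) < 7 or fields[2] != "A":  # 'A' = valid fix, 'V' = void
                continue
            lat = nmea_to_degrees(fields[3], fields[4])
            lon = nmea_to_degrees(fields[5], fields[6])
            return lat, lon

def nmea_to_degrees(value, hemisphere):
    """Convert NMEA ddmm.mmmm (or dddmm.mmmm) to signed decimal degrees."""
    dot = value.index(".")
    degrees = float(value[:dot - 2])
    minutes = float(value[dot - 2:])
    deg = degrees + minutes / 60.0
    return -deg if hemisphere in ("S", "W") else deg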

With regards to the actuators, we currently have two large stepper motors connected to the Python program through the serial port. The first stepper motor rotates the steering wheel and the second one activates either the brake or the accelerator pedal.
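
A rough sketch of driving those two motors from Python is below. The ASCII command format, port names, and step sizes are purely illustrative, since the real protocol depends on the stepper controllers.

import serial

class Actuators:
    """Two stepper controllers: one for steering, one for the brake/accelerator."""
    def __init__(self, steer_port="/dev/ttyS1", pedal_port="/dev/ttyS2"):
        self.steer = serial.Serial(steer_port, 9600, timeout=1)
        self.pedal = serial.Serial(pedal_port, 9600, timeout=1)

    def set_steering(self, steps):
        # Signed step count turns the steering-wheel stepper left or right.
        # "MOVE" is a placeholder command, not the controller's actual protocol.
        self.steer.write(("MOVE %+d\r" % steps).encode("ascii"))

    def set_pedal(self, steps):
        # Positive steps push the accelerator, negative steps push the brake.
        self.pedal.write(("MOVE %+d\r" % steps).encode("ascii"))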

One of the ways to do supervised learning is to run the Python program from a laptop that connects to both the sensors and the stepper motors. One can then drive the car from the laptop keyboard. It works well as long as Skype is not running on the laptop at the time (yes, we tried :-); it's a little bit like talking on your cell phone while driving....
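
Conceptually, the teleoperation loop for collecting training data looks something like the sketch below: each keypress becomes an actuator command and is logged together with the current sensor readings. The key bindings, step sizes, and logging format are illustrative assumptions (and the terminal would need to be in raw mode for single-keystroke reads).

import sys, time, csv

def teleop(actuators, read_sensors, logfile="drive_log.csv"):
    """Map keystrokes to actuator commands; log (time, command, sensors) rows."""
    keymap = {"a": ("steer", -50), "d": ("steer", +50),   # nudge steering left/right
              "w": ("pedal", +20), "s": ("pedal", -20)}   # accelerator / brake
    with open(logfile, "w", newline="") as f:
        log = csv.writer(f)
        while True:
            key = sys.stdin.read(1).lower()   # assumes the terminal is in raw mode
            if key == "q":                    # quit
                break
            if key not in keymap:
                continue
            channel, steps = keymap[key]
            if channel == "steer":
                actuators.set_steering(steps)
            else:
                actuators.set_pedal(steps)
            # Record what the sensors saw when the human issued this command,
            # so the pairs can later be used for supervised learning.
            log.writerow([time.time(), channel, steps] + list(read_sensors()))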

In my next entry, I will discuss the modifications to the webcams and FireWire cameras so that they provide meaningful information. In particular, I will talk about the algorithm for the stereo system as well as the hardware and software implementation of the compressed sensing element of our algorithm (a random lens imager using a webcam). Both approaches are in competition, and we believe that the stereo system will eventually not be needed.
