SCRC:Indoor Robot-Magellan: Navigation

The navigation group is responsible for keeping track of where our robot is and where it needs to go next. This presents several challenges which the Robot Operating System (ROS) helps us address.

Course

The course will be indoors and will roughly approximate a portion of a living space, with a family room area, hallway, and dining area. There will be obstacles within the course consistent with an indoor living space. For at least the first iteration of this course, all of the objects, sizes, and locations will be clearly defined well in advance of the competition. A proposed layout of the course and objects, along with a dimensioned drawing, is shown above and in the appendices.

The “walls” of the course will be a minimum of 11” tall. The “floor” of the course will be whatever floor is in the contest area. It is anticipated that this course would be placed on the parquet floor in the Seattle Center Armory building.

Robots will be placed at the designated starting point prior to each run (the green square in the above illustration). The ending point is shown in the above illustration as a red square. These locations are unlikely to be marked in the actual arena, but they will occupy the 2’x2’ spaces shown above.

For this first iteration of the contest, there will be an athletic shoe placed as shown in the above illustration and the appendices, as an obstacle to avoid while running the course. Moving the shoe as part of running the course is permitted. Similarly, there will be a dog dish in the dining area, and moving the dish with the robot is also permitted.

There will be a single can of pop placed in a precise and secure pre-defined location on the bottom-most horizontal surface within the mini-fridge, as close to the door as possible while still allowing the door to close completely. The exact model of mini-fridge to be used and the specific location of the can of pop will be published a minimum of 4 months prior to the competition.

Course arena photo: http://robothon.files.wordpress.com/2016/08/popcanarena2016.jpg

2017 Strategy

The direction set this year is to implement as much as possible in the Robot Operating System, which provides a large amount of general-purpose, working robotics code.

Robot Operating System (ROS)

Moving to ROS not only takes care of many of the communication problems faced in robotics, but also allows us to pull pre-built and tested software into our project easily and efficiently. The ROS Navigation stack incorporates robot localization, mapping, and route planning, all ready to use provided the robot is sending and receiving sensor data on the appropriate ROS topics.
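
As a sketch of what that looks like in practice, the node below sends a single navigation goal to the stack's move_base node (its standard action interface); the goal coordinates here are placeholders, not actual course positions:

    #!/usr/bin/env python
    # Sketch: send one navigation goal to the ROS Navigation stack.
    # Assumes move_base is running with a map; coordinates are placeholders.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node('nav_goal_example')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0      # placeholder: 1 m ahead on the map
    goal.target_pose.pose.orientation.w = 1.0   # face along the map x axis

    client.send_goal(goal)
    client.wait_for_result()

Everything else (costmaps, path planning, recovery behaviors) happens inside the stack.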

Sensors

Two primary sensors will be incorporated into the Sno_Bot; depending on functionality and time, a third may be added:

1. 4-bit encoder on each motor
2. Microsoft's Kinect 3D sensor
3. 6- or 9-DOF IMU (possible addition)

Sensor data fusion and localization

Sensor fusion and robot localization are handled by the robot_pose_ekf package, which uses an extended Kalman filter with a 6D model (3D position and 3D orientation) to combine measurements from wheel odometry, the IMU, and visual odometry. Localization will be further aided by the depthimage_to_laserscan package, which uses the depth information provided by the Kinect to produce a simulated laser-scan line for the mapping and localization systems.
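
As an example of feeding the filter, the sketch below publishes IMU readings on the imu_data topic that robot_pose_ekf listens to by default; the read_imu() function is a hypothetical stand-in for whatever driver our actual IMU ends up using:

    #!/usr/bin/env python
    # Sketch: publish IMU readings on the 'imu_data' topic that
    # robot_pose_ekf subscribes to by default. read_imu() is a
    # hypothetical stand-in for a real IMU driver.
    import rospy
    from sensor_msgs.msg import Imu

    def read_imu():
        # Placeholder: returns orientation (x, y, z, w quaternion),
        # angular velocity (rad/s) and linear acceleration (m/s^2).
        return (0.0, 0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 9.81)

    rospy.init_node('imu_publisher')
    pub = rospy.Publisher('imu_data', Imu, queue_size=10)
    rate = rospy.Rate(50)  # 50 Hz

    while not rospy.is_shutdown():
        quat, gyro, accel = read_imu()
        msg = Imu()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'imu_link'
        (msg.orientation.x, msg.orientation.y,
         msg.orientation.z, msg.orientation.w) = quat
        (msg.angular_velocity.x, msg.angular_velocity.y,
         msg.angular_velocity.z) = gyro
        (msg.linear_acceleration.x, msg.linear_acceleration.y,
         msg.linear_acceleration.z) = accel
        # The EKF weights each input by its covariance; a rough
        # diagonal guess is enough for a first test.
        msg.orientation_covariance = [0.01, 0, 0, 0, 0.01, 0, 0, 0, 0.01]
        pub.publish(msg)
        rate.sleep()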

2016 Strategy

First, we can utilize dead reckoning to keep track of where we are based on a known location (the start point) and how far and in which direction we've traveled. Second, we can use sensors to help compensate for any errors in our dead reckoning and also to help facilitate obstacle avoidance.

We know where we start, where we need to finish and what we need to accomplish in between. Using the dimensions given to us, we should be able to get from point A to point B just by calculating how far we've traveled. In order to determine how far we've traveled, we'll need to keep track of our motor direction and speed. How we go about this will depend on the propulsion system used. If we're using stepper motors, this will be a simple matter of keeping count of how many "steps" each of our motors has taken and calculating distance from this count. If we decide to use another type of motor, DC geared motors for example, we can still keep track of distance by attaching a sensor such as a rotary encoder to the drive shaft of the wheels and keeping a count of the number of "pulses" these encoders provide.
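
A minimal sketch of that calculation for a differential-drive base is below; the wheel radius, wheel base, and pulses-per-revolution values are placeholders, since they depend on the motors and encoders we end up choosing:

    import math

    # Dead reckoning for a differential-drive base from encoder pulse
    # counts. Wheel radius, wheel base and pulses-per-rev are placeholders;
    # the real values depend on the motors and encoders we choose.
    WHEEL_RADIUS = 0.035      # meters (placeholder)
    WHEEL_BASE = 0.20         # distance between the wheels, meters (placeholder)
    PULSES_PER_REV = 360      # encoder pulses per wheel revolution (placeholder)
    METERS_PER_PULSE = 2 * math.pi * WHEEL_RADIUS / PULSES_PER_REV

    def update_pose(x, y, theta, left_pulses, right_pulses):
        """Advance the pose estimate by one batch of encoder pulses."""
        d_left = left_pulses * METERS_PER_PULSE
        d_right = right_pulses * METERS_PER_PULSE
        d_center = (d_left + d_right) / 2.0        # distance traveled
        d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
        # Assume the motion happened along the average heading.
        x += d_center * math.cos(theta + d_theta / 2.0)
        y += d_center * math.sin(theta + d_theta / 2.0)
        return x, y, theta + d_theta

    # Starting pose is the known start square: x = 0, y = 0, facing +x.
    pose = (0.0, 0.0, 0.0)
    pose = update_pose(*pose, left_pulses=100, right_pulses=100)  # ~6.1 cm straight ahead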

Sensors

Dead reckoning is great for getting us started and in an ideal world, would probably be all we needed for this particular course. Unfortunately, this is the real world. We may have wheel slippage or something else that throws our distance calculations off. We may also run into obstacles along the way that need to be avoided. To address this, we'll need some sort of sensor (or array of sensors) both to correct for any error in our dead reckoning calculations and to detect any obstacles in our path.

In recent club meetings, we've discussed several possible types of sensors that could be used to aid in navigation:

  • Sonic "ping" sensors - testing of these inexpensive sensors has shown that they can be relatively accurate under ideal conditions (flat objects perpendicular to the sensor), but run into issues when the "ping" strikes objects at an angle.
  • Optical laser type sensors - we've briefly discussed several options, which will need further investigation before we can determine if they are viable:
    • "Time of Flight" sensors - these operate similar to the sonic sensors in that they send out a pulse and determine distance based on how long it takes the sensor to pick up the return pulse.
    • One suggestion was a combination "vertical line" laser and a camera. The laser would be mounted at an angle to the camera and turned on while a still picture is taken. The laser is then turned off and another picture is taken. The difference of the two pictures is then computed, which should result in a vertical line showing where the laser intersected objects in our path. The range of the object(s) in our path would then be calculated from the angle of the laser to the camera and how far (left or right) the laser image appears in the difference picture. See [1] and the range-calculation sketch after this list.
    • Similar to the above idea (and probably cheaper / easier to implement) is a laser pointer mounted on a servo, which could sweep back and forth until its reflection is picked up by a phototransistor. The angle of the servo at that point could be used to calculate the distance of the object the laser hit. We're not sure how fast this method would be, since we'll be limited by the speed of the servo, but it should be relatively simple to set up and test.
  • "Whisker" sensors - while these won't help us determine the range to an object, they will be helpful in determining if we missed sensing an object with our range sensor(s) or if we've drifted off course and are too close to a wall.

Discussion

I propose we use the following format, transmitted serially at 9600 8N1 over RS-232. A USB-to-serial adapter can be used for input to the robot's main computer:

    // NAV FORMAT
    // AA    header start
    // 0000  compass heading as an integer (1800 = 180.0 degrees)
    // 0000  bow:       3 digits distance (0-1024, 0x400 max), 1 digit whisker/bump
    // 0000  aft:       3 digits distance (0-1024, 0x400 max), 1 digit whisker/bump
    // 0000  port:      3 digits distance (0-1024, 0x400 max), 1 digit whisker/bump
    // 0000  starboard: 3 digits distance (0-1024, 0x400 max), 1 digit whisker/bump
    // 0000  reserved
    // 0000  reserved
    //
    // Example frame: AA0000000000000000000000000000

All data are transmitted as eight-bit ASCII characters; each "A" in the header is transmitted as ASCII decimal 65.
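
A minimal sketch of a parser for this frame on the main-computer side, using pyserial, might look like the following (whether the distance digits are decimal or hex is still open, so decimal is assumed here):

    # Sketch of a parser for the proposed NAV frame (pyserial). A real
    # reader should resynchronize on the "AA" header; this just assumes
    # frames arrive aligned. Distance digits are assumed decimal.
    import serial

    FRAME_LEN = 30  # "AA" header + 7 four-character fields

    def parse_nav_frame(frame):
        """Split one 30-character NAV frame into heading and range fields."""
        if len(frame) != FRAME_LEN or not frame.startswith('AA'):
            raise ValueError('malformed NAV frame: %r' % frame)
        heading = int(frame[2:6]) / 10.0  # 1800 -> 180.0 degrees
        ranges = {}
        for i, name in enumerate(('bow', 'aft', 'port', 'starboard')):
            field = frame[6 + 4 * i : 10 + 4 * i]
            ranges[name] = {'distance': int(field[:3]),   # assumed decimal
                            'bump': field[3] != '0'}
        return heading, ranges

    port = serial.Serial('/dev/ttyUSB0', 9600)  # 8N1 is pyserial's default framing
    while True:
        frame = port.read(FRAME_LEN).decode('ascii')
        print(parse_nav_frame(frame))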

Feedback welcomed: Chas

Feedback

At present, all of our discussions are in the theory and planning stage. Please, please, please... If you have any feedback or suggestions, feel free to make a post here on the wiki, post a comment in the robotics forum on our club forum page (http://www.snocomakers.club) or bring your ideas and recommendations to the next club meeting. We'd love new ideas (no matter how far-fetched you think they might be). Now is the best time to present ideas, as we can act on them early and implement them in our build!