Monday, May 31, 2010

Vision for Robots 1 of n: Setup

This is going to be a mini-series of posts that helps you build a practical computer vision application for your robot. We will be using the ROS robotics framework and OpenCV computer vision library.

To complete this you will need:
  • A computer
  • Some programming experience
  • Optional USB Webcam

The first step is going to be to install ROS. On Ubuntu, if you don't like reading instructions, you can try copying and pasting this and it might work. It may also cause your computer to catch on fire, but these are the risks you take when you refuse to read the instructions.

sudo apt-get install build-essential python-yaml cmake subversion wget
wget --no-check-certificate -O ~/rosinstall
chmod 755 ~/rosinstall
~/rosinstall ~/ros
echo "source ~/ros/" >> ~/.bashrc
. ~/.bashrc
rosdep install ros
rosmake install ros
rosdep install image_view
rosmake install image_view

Once you manage to get ROS installed you may want to spend some time going through the tutorials. They are well worth your time.

If you have a USB webcam that supports the uvc video class you can install one of the drivers.

cd ~/ros/stacks
svn co
cd ~/ros/stacks/bosch-ros-pkg/trunk/bosch_drivers/usb_cam
rosdep install usb_cam
rosmake usb_cam

To test the usb_cam driver, create a launch file directory.

mkdir ~/ros/launch

Next, create a launch file called ~/ros/launch/usb_cam.launch that contains the following.
  <launch>
    <node name="usb_cam" pkg="usb_cam" type="usb_cam_node" output="screen" >
      <param name="video_device" value="/dev/video0" />
      <param name="image_width" value="320" />
      <param name="image_height" value="240" />
      <param name="pixel_format" value="mjpeg" />
      <param name="camera_frame_id" value="usb_cam" />
      <param name="io_method" value="mmap"/>
    </node>
    <node name="image_view" pkg="image_view" type="image_view" respawn="false" output="screen">
      <remap from="image" to="/usb_cam/image_raw"/>
    </node>
  </launch>

Launching the file should display images from the camera.

roslaunch ~/ros/launch/usb_cam.launch

After that you can install the I Heart Robotics ROS packages.

cd ~/ros/stacks
git clone git://
cd ~/ros/stacks/iheart-ros-pkg/ihr_demos/ihr_opencv
rosdep install ihr_opencv
rosmake ihr_opencv

The prerecorded video is stored in .bag files and can be viewed by running the launch file.

roslaunch ~/ros/stacks/iheart-ros-pkg/ihr_demos/ihr_demo_bags/launch/demo.launch

Next we will be showing how to convert the video from the blue, green, red (BGR) to the hue, saturation, value (HSV) color space.
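As a preview of what that conversion does, here is a sketch using Python's standard-library colorsys module rather than OpenCV (the upcoming posts will use OpenCV itself; note that OpenCV stores hue as 0-179 for 8-bit images, i.e. degrees divided by two):

```python
import colorsys

def bgr_to_hsv_opencv_scale(b, g, r):
    """Convert one 8-bit BGR pixel to OpenCV-style HSV.

    colorsys works in RGB order with components in [0, 1];
    OpenCV stores H in [0, 179] and S, V in [0, 255].
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return int(round(h * 360 / 2)), int(round(s * 255)), int(round(v * 255))

# Pure blue in BGR is (255, 0, 0); its hue is 240 degrees,
# which OpenCV stores as 120.
print(bgr_to_hsv_opencv_scale(255, 0, 0))  # (120, 255, 255)
```

The payoff of HSV is that hue is mostly invariant to brightness, which makes color-based object tracking far more robust than thresholding raw BGR values.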

Please post any problems with these instructions in the comments.

Sunday, May 30, 2010

MicroMouse Kansai Competition 2010

More great work from Kato-san

Friday, May 28, 2010

iRobot 710 Warrior with Anti-Personnel Obstacle Breaching System

Road clear, proceed forward.


This is a map of the University of Freiburg campus generated with GMapping. Conveniently enough, a wrapper has been made for GMapping to work with ROS, and someone has written a tutorial on it.

This image shows the results of the TreeMap Algorithm. This has not been ported to ROS, yet.

Code for these and many more simultaneous localization and mapping (SLAM) algorithms is available at

Thursday, May 27, 2010

New Robo-One and ZMP Videos

There are some new Robo-One videos over at Ando Blog's youtube channel.

This site may also be interesting for fans of Robo-One,

Next, ZMP has a new video of their IMU based haptic interface.

Tuesday, May 25, 2010

CV Dazzle

It's good to see that research into camouflage against face detection has begun. Hopefully it will be enough to fool our future robot overlords. Additionally, I believe that monocles may also be effective at throwing off face detection.

[From: jwz]

Monday, May 24, 2010

Testing: ROS USB Camera drivers

So evidently, as a sign of desperation, the developers of the robotics framework that shall not be named have decided to reduce its price to $0. They might have better luck finding developers if they tried using an open source license, but that kind of defeats the purpose of locking developers into using your code, tool chain, OS, and so on. As they say, I wouldn't use it if you paid me, because it would just cost me too much in the long run. Also, they have not provided me with an Xbox or a Project Natal controller, so really it's their own fault that I have nothing good to say.

Meanwhile, ROS appears to have the opposite problem, with multiple developers writing drivers that perform the same basic tasks. I suppose it's better to have too many options than too few, but figuring out which driver to use can take some time. So to save you time and energy, we will be testing and rating various ROS drivers.

This week we are testing USB camera drivers for performance and functionality.
Here are the options we are testing.

The probe driver, from the Brown Robotics Group, uses GStreamer to broadcast data to an image topic. The upside of this is that it is very flexible and it is easy to connect to just about any camera. It can also be used to connect to more than just cameras, for example with the right incantations it can be used to connect to things like mjpeg streams sent over http.

On the downside, figuring out the right incantations to get anything to work can take a while. One option is to launch the driver from a terminal, which may make it easier to get GStreamer setup.
$ export PROBE_CONFIG="v4l2src device=/dev/video1 ! video/x-raw-rgb, width=320, height=240, framerate=30/1 ! ffmpegcolorspace ! identity name=ros ! fakesink"
$ rosrun probe probe

probe launch file
  <launch>
    <env name="PROBE_CONFIG" value="v4l2src device=/dev/video1 ! video/x-raw-rgb, width=320, height=240, framerate=30/1 ! ffmpegcolorspace ! identity name=ros ! fakesink" />
    <node name="probe" pkg="probe" type="probe" respawn="false" output="screen"/>
  </launch>

Developed by the Bosch Research and Technology Center for their Segway RMP based robot, the usb_cam driver offers good performance and reasonable configuration options. It is also the only driver considered that currently supports the camera_info topic. Unfortunately, it doesn't support saving the calibration to a file via set_camera_info, so tricking the camera calibration process into working in the boxturtle release requires editing the source.


# Excerpt from the calibration node source. The original service-name
# list is commented out and replaced so the calibrator no longer waits
# for the monocular camera's set_camera_info service, which usb_cam
# does not implement.
class CalibrationNode:

    def __init__(self, chess_size, dim):
        # assume any non-default service names have been set.  Wait for the service to become ready
#        for svcname in ["camera", "left_camera", "right_camera"]:
        for svcname in ["left_camera", "right_camera"]:
            remapped = rospy.remap_name(svcname)
            if remapped != svcname:
                fullservicename = "%s/set_camera_info" % remapped


usb_cam launch file
    <launch>
      <node name="usb_cam" pkg="usb_cam" type="usb_cam_node" output="screen" >
        <param name="video_device" value="/dev/video1" />
        <param name="image_width" value="640" />
        <param name="image_height" value="480" />
        <param name="pixel_format" value="yuyv" />
        <param name="camera_frame_id" value="usb_cam" />
        <param name="io_method" value="mmap"/>
        <rosparam param="D">[0.032095, -0.238155, 0.000104, -0.002138, 0.0000]</rosparam>
        <rosparam param="K">[723.715912, 0.000000, 318.180121, 0.000000, 719.919423, 280.697379, 0.000000, 0.000000, 1.000000]</rosparam>
        <rosparam param="R">[1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]</rosparam>
        <rosparam param="P">[723.715912, 0.000000, 318.180121, 0.000000, 0.000000, 719.919423, 280.697379, 0.000000, 0.000000, 0.000000, 1.000000, 0.000000]</rosparam>
      </node>
    </launch>
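The K parameter above is the standard pinhole intrinsic matrix: focal lengths fx, fy on the diagonal and principal point cx, cy in the last column. As a sanity check on what those numbers mean, projecting a 3D point in the camera frame onto the image plane is just:

```python
# Intrinsics copied from the K matrix in the launch file above.
FX, FY = 723.715912, 719.919423
CX, CY = 318.180121, 280.697379

def project(x, y, z):
    """Pinhole projection: pixel = (fx * x/z + cx, fy * y/z + cy)."""
    return FX * x / z + CX, FY * y / z + CY

# A point 10 cm right of and 5 cm below the optical axis, 1 m away,
# lands a bit right of and below the principal point (318.2, 280.7).
u, v = project(0.10, 0.05, 1.0)
print(round(u, 1), round(v, 1))  # 390.6 316.7
```

The D, R, and P parameters handle lens distortion, rectification, and the full projection matrix respectively; image_proc uses all four to produce rectified images.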

Developed at the Stanford AI Lab, the uvc_cam driver may not have the highest performance, but it seems to have more of the camera controls implemented. It also ships with some useful support tools, including a program that enumerates the available camera devices.

uvc_cam launch file
    <launch>
      <node name="uvc_cam" pkg="uvc_cam" type="senderIT" output="screen" >
        <param name="device" value="/dev/video1" />
        <param name="topic" value="/camera/image" />
        <param name="width" value="640" />
        <param name="height" value="480" />
        <param name="fps" value="30" />
      </node>
    </launch>

Over at the University of Colorado they have developed the gencam_cu driver, which uses OpenCV to access the camera. Its performance is pretty good; however, there appears to be no way to configure the resolution. It does support compressed_image_transport, which is convenient.

gencam_cu launch file
    <launch>
      <node name="gencam_cu" pkg="gencam_cu" type="gencam_cu" output="screen" >
        <param name="camera_index" value="1" />
      </node>
    </launch>

Actually, this last entry isn't another driver; it's just some launch files that work with usb_cam.

USB Ports

While it may seem like all USB ports are the same, depending on the bus topology and USB version, each USB port may provide vastly different performance. In some configurations, testing showed a significant difference, with one USB port capable of providing 15 frames per second while another port provided 30 frames per second. You would think that two ports adjacent to each other would be on the same bus, but apparently not.
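A back-of-the-envelope bandwidth calculation suggests why: uncompressed 640x480 YUYV at 30 fps eats a large share of what a USB 2.0 bus can actually deliver for isochronous transfers, so a camera sharing a bus (or sitting behind a hub) can get throttled to a lower rate. A quick check of the arithmetic:

```python
def video_bandwidth_mbit(width, height, bytes_per_pixel, fps):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# YUYV packs 2 bytes per pixel.
bw = video_bandwidth_mbit(640, 480, 2, 30)
print(round(bw, 1))  # 147.5 Mbit/s

# USB 2.0 signals at 480 Mbit/s, but protocol overhead leaves only a
# fraction of that for payload, so two such streams on one bus (or one
# stream plus other devices) are already pushing the practical limit.
```

Halving the frame rate or dropping to 320x240 cuts this figure proportionally, which matches the 15 fps vs. 30 fps behavior seen on different ports.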

High resolution images at high frame rates tend to pin the CPU at 100% or more; top reports usage per core, so a multi-threaded driver can legitimately exceed 100%. There are probably some low-hanging fruit in terms of optimizations, but they may be deep within the V4L2 stack. Profiling and optimization are left for those whose projects depend on it.

1234 user      20   0  117m  34m 8992 R  110  1.7   0:23.92 probe

Camera Calibration
Out of all of these drivers, only the usb_cam driver currently supports the camera_info topic. Without this information, image rectification using image_proc will not work unless you write your own camera_info publisher. Fortunately the camera_info_manager support class appears to be shipping with the next stable release of ROS. This should help camera driver developers implement the camera_info topic consistently and improve the camera calibration process.

Cameras Tested
The following cameras were tested at 640x480 and 320x240 with 24 bit color. Most modern USB cameras that support the USB Video Class (UVC) interface should work.
  • Logitech C600
  • Logitech Quickcam 9000 Pro
  • Built-in Laptop OmniVision OV2640 Webcam

For all of these tests, image_view is run at the same time as the camera driver to produce a simulated load and to ensure that the video is actually being transmitted.

Combo launch file
  <node name="usb_cam" pkg="usb_cam" type="usb_cam_node" output="screen" >
    <param name="video_device" value="/dev/video1" />
    <param name="image_width" value="320" />
    <param name="image_height" value="240" />
    <param name="pixel_format" value="mjpeg" />
    <param name="camera_frame_id" value="usb_cam" />
    <param name="io_method" value="mmap"/>
  <node name="image_view" pkg="image_view" type="image_view" respawn="false" output="screen">
    <remap from="image" to="/usb_cam/image_raw"/>

The rostopic command is used to measure the update frequency of the camera image messages.
$ rostopic hz --window=1000 /image_raw
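The --window option tells rostopic to average over the last 1000 message arrivals. The computation amounts to the following (a sketch of the idea, not rostopic's actual source):

```python
from collections import deque

def window_rate(timestamps, window=1000):
    """Mean message rate (Hz) over the last `window` arrival times."""
    recent = deque(timestamps, maxlen=window)
    if len(recent) < 2:
        return 0.0
    span = recent[-1] - recent[0]
    # n arrivals bound n-1 inter-arrival intervals.
    return (len(recent) - 1) / span

# Ten messages arriving every 1/30 s measure as 30 Hz.
stamps = [i / 30.0 for i in range(10)]
print(round(window_rate(stamps), 3))  # 30.0
```

A large window smooths out jitter from scheduling and transport, which is why the measured rates below are reported to three decimal places.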

Frame rate (fps) at 320x240:
           C600    9000 Pro  OV2640
Probe      29.942  29.945    15.006
usb_cam    30.035  30.036    7.503
uvc_cam    25.082  24.077    15.006

Frame rate (fps) at 640x480:
           C600    9000 Pro  OV2640
Probe      11.914  12.373    12.807
usb_cam    15.068  15.026    10.005
uvc_cam    12.651  12.851    8.267
gencam_cu  14.711  14.974    7.503

Overall, each driver contains features and ideas not found in the others. Moving forward, it would probably be in everyone's interest to focus on improving the performance of one driver and to incorporate the best ideas from the rest. The usb_cam driver looks to be the best supported and may be an ideal starting point, though its BSD license may be a minor issue for some.

If you are trying to get a USB camera working with ROS, it is suggested that you start with the usb_cam driver and if that doesn't work for you, try one of the other drivers.

If you don't have ROS installed, instructions are here and if you have time I strongly suggest trying the tutorials. Now would be a great time to get ROS setup because later this week we will be showing you how to connect OpenCV to ROS to perform actual practical robotics tasks.

Friday, May 21, 2010

Call for Papers: Semantic Mapping and Autonomous Knowledge Acquisition

Just a quick heads up to let you know that Willow Garage has a call for papers for an IROS 2010 Workshop on Semantic Mapping and Autonomous Knowledge Acquisition. They seem particularly interested in experimental results with live or video demonstrations.

Thursday, May 20, 2010

Reader Survey

If you have a moment please fill out the survey at the top of the sidebar. It should help us at I Heart Robotics determine what sort of coverage to prioritize.


Wednesday, May 19, 2010

Cute Arduino Based Robot with Ultrasonic Sensor

I am looking forward to seeing the next step for these Arduino based robots. I think it is only a matter of time before techniques like SLAM and particle filters start being implemented at the hobbyist level.
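For a taste of what that would look like, here is a toy one-dimensional particle filter localizing a robot from a single noisy range reading; all numbers are invented for illustration, and a real hobbyist implementation would iterate this over motion and measurement updates:

```python
import math
import random

def particle_filter_1d(measurement, n=2000, world=10.0, sigma=0.5, seed=42):
    """Estimate a 1-D position from one noisy range measurement.

    Particles are spread uniformly over the world, weighted by a
    Gaussian measurement model, resampled, and averaged.
    """
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, world) for _ in range(n)]
    # Weight each particle by how well it explains the measurement.
    weights = [math.exp(-((p - measurement) ** 2) / (2 * sigma ** 2))
               for p in particles]
    # Resample in proportion to weight (simple roulette-wheel).
    resampled = rng.choices(particles, weights=weights, k=n)
    return sum(resampled) / n

estimate = particle_filter_1d(measurement=5.0)
print(round(estimate, 1))  # close to 5.0
```

Nothing here is beyond an Arduino paired with a slightly beefier host processor, which is why the hobbyist crossover seems inevitable.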

Tuesday, May 18, 2010

More on the ZMP RoboCar

Previously we mentioned the RoboCar being sold by ZMP. This new video shows many more of the features that make this an interesting platform for autonomous vehicle research,

Wednesday, May 12, 2010

Object Modeling

One might like to believe that thermodynamics, specifically the heat equation, can be ignored when working in robotics. Thermodynamics seems inescapable, but the results look pretty amazing.

Tuesday, May 11, 2010

Project Natal Update

I am looking forward to this.


Ryan O'Horo has written up great instructions for the Make:NYC casting and molding lab.

Also as a note, posting will be intermittent for the next week.

Thursday, May 6, 2010

Thing-a-Week #9: Make:NYC Badge

This week's thing is a positive master of a wearable badge designed for the Make:NYC Casting and Molding Lab.

The badge is designed for maximum success for beginners learning plastic resin casting. Designed for a one-part mold, the included 5 degree draft angle on the surfaces ensures that the part can be removed easily from the mold.

A pin back can be glued to the back so it can be worn to your local Maker Faire.

Designs and a positive master can be provided for other Make:City Groups looking at running a casting lab.

The STL file is downloadable here and previous things can be found here.

Here are some of the test parts for tonight's lab.

Tuesday, May 4, 2010

Playback: Is it Friday yet?

While the weekend is still days away, Playback brings you a fine selection of robotics videos to help distract you from whatever it is that you are supposed to be doing.

One day I am going to have the luxury of worrying about setting up a ground control station.

A recycling machine developed by the CMU Robotics Club for an ASME student design competition.

Cellbots continues driving a good idea around.

A nice relaxing day at the lake.

I want a PR2, now. I don't care if I don't have any spare time. If I had a PR2 I would have it tuck me into bed whenever it is that I actually get a chance to sleep.

Just keep watching this video from the UW RSE-lab, there will be a quiz later.

Monday, May 3, 2010

Testing: Minoru 3D Webcam

The Minoru 3D webcam is supposedly the first consumer-grade 3D webcam, and while it has some rough edges, it does look like an interesting sensor for robotics.

The concept behind the camera seems mostly like a gimmick based on the current popularity of 3D movies. The camera itself retails for about $90, which seems a little expensive compared to what you can make yourself, but it might be worth it depending on what tools you have available. In terms of practical performance, on Linux I can get 30 frames per second from each camera at 320x240 resolution, and a little under 15 fps at 640x480. The quality of the cameras is about what you would expect for two $45 cameras. Also worth noting is that on my slower netbook, high frame rates tend to crash the system. I have not yet figured out why.

Bob Mottram has been working on developing software for supporting the camera on Linux and with ROS and he has provided extensive documentation here.

Here are some videos of his results, which are fairly similar to the results I have obtained.

If your budget is a little more flexible and your robot supports FireWire, you may be interested in the Videre Stereo Cameras as a higher performance alternative.

Do not buy this camera if you are expecting an out-of-the-box 3D range finder. This camera is mostly interesting for people who want to write computer vision algorithms but don't want to deal with the mechanical issues of mounting the cameras together. The camera also provides acceptable results if your interest is producing anaglyphs. I am still waiting for the RGB-D cameras like the ones that should be shipping with Project Natal. Seriously, if you have not seen the RGB-D videos from Dr. Dieter Fox's lab, go here now. Hizook!!
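If you do write your own stereo code for it, the core relationship is the classic disparity-to-depth equation, Z = f * B / d. The numbers below are purely illustrative; the 60 mm baseline and 700-pixel focal length are assumptions for the sketch, not measured specs of the Minoru:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth (mm) of a point from stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Assumed: 700 px focal length, 60 mm baseline.
# A 42-pixel disparity then corresponds to 1 m of depth.
print(depth_from_disparity(700.0, 60.0, 42.0))  # 1000.0
```

The inverse relationship is why a short-baseline camera like this is only useful at close range: at several meters the disparity shrinks to a pixel or two, and depth resolution collapses.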

If you are still interested you can buy the Minoru webcam here.