Monday, November 28, 2011

The Mirror Test

Here the Qbo robot recognizes itself in a mirror. This is quite a nice looking robot, and I like the way that the head moves in a semi-anthropomorphic kind of way.

This is probably just simple 2D feature-based recognition (SIFT, SURF, etc.), but the problem of recognizing yourself is a deeper issue related to self-awareness and theory of mind. If any creature wants to survive it needs to be able to do things such as detect damage to itself, and this involves having a self model which is learned through early interactions with the world, such as play behaviors, together with lifelong habituation. In humans, and presumably also other animals, this can lead to curious phenomena such as phantom limb syndrome. Currently in systems such as ROS the model of the robot is hand-coded by traditional engineering, but in a hypothetical AGI system the model would be learned via self observation and interaction with the environment.

Qbo also uses stereo vision rather than an RGBD sensor. Stereo vision has been a classic problem in computer vision, due to the high ambiguity of feature matching, but dense stereo algorithms have improved greatly in recent years, to the point where the results are similar to what the Kinect can produce, especially at close range.

Example depth maps from stereo vision are shown in these videos - one of me, and another of a cup on a table.  I don't know whether Qbo uses this type of algorithm, but it's one of the open source implementations which are available for use in robotics projects.
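
To give a feel for how dense stereo works, here is a toy sum-of-absolute-differences block matcher in NumPy. This is purely illustrative and almost certainly not what Qbo runs; production implementations (e.g. OpenCV's semi-global matcher) add smoothness constraints, sub-pixel refinement, and occlusion handling.

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, window=5):
    """Toy dense stereo: for each pixel, pick the horizontal shift of the
    right image that minimises the sum of absolute differences (SAD)
    over a small square window."""
    h, w = left.shape
    half = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    cost = np.full((h, w), np.inf)
    left = left.astype(np.float64)
    for d in range(max_disp):
        # shift the right image by d pixels and compare with the left
        diff = np.abs(left - np.roll(right, d, axis=1).astype(np.float64))
        # sum the per-pixel differences over the window (slow but clear;
        # real code would use an integral image / box filter)
        sad = np.zeros_like(diff)
        for dy in range(-half, half + 1):
            for dx in range(-half, half + 1):
                sad += np.roll(np.roll(diff, dy, axis=0), dx, axis=1)
        better = sad < cost
        disp[better] = d
        cost[better] = sad[better]
    return disp
```

On a synthetic pair where the right image is just the left shifted sideways, the matcher recovers the shift everywhere; on real images the ambiguity the post mentions shows up as noisy patches in low-texture regions.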

More Experimental Party Add-Ons

Get ready to assume the party submission position, because here is a TurtleBot that is equipped with a 6 Unit Beverage Delivery Interface and is ready to initiate party mode.

While more work and testing needs to be done, the upward facing center mounted sonar sensor should provide a means of reliably signaling to the robot that a beverage is needed by waving a hand over the robot. This should help prevent unnecessary collisions between robots and humans who have reached their beverage capacity.
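
The triggering logic could be as simple as requiring the sonar range to dip below a threshold for a couple of consecutive readings. This is a hypothetical sketch, not the actual Beverage Delivery Interface code; the threshold and sample count are made-up values that would need tuning on the real sensor.

```python
def hand_wave_detected(ranges, threshold=0.3, min_samples=2):
    """Flag a beverage request when the upward-facing sonar sees something
    close (a hand) for several consecutive readings, rejecting one-off
    noise spikes. `ranges` is a sequence of distances in metres."""
    run = 0
    for r in ranges:
        run = run + 1 if r < threshold else 0
        if run >= min_samples:
            return True
    return False
```

Requiring consecutive close readings is what keeps a single noisy echo from dispensing an unrequested beverage.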

The Beverage Delivery Interface will likely undergo productification after further development and debugging during holiday testing season. Impatient TurtleBot owners interested in beta testing should send an email.

In Stock: Smoke Daemon Fume Extractor

We will have a limited run of Smoke Daemon Fume Extractors available before the holidays.

There are some minor changes from the above photo and two or three parts that are coming in later this week but orders will ship out at the beginning of next week.

If there is demand we will look at another run next year.

Cyber Madness

Over at I Heart Engineering the Cybernetic Sales Systems are coming on-line and will be offering a selection of deals throughout the day. You can find the currently featured deal by clicking the Cyber Madness animated GIF. The ad banner may be garish but the store is guaranteed to be 100% music free, so at least one of your sensor subsystems should remain intact.

Sunday, November 27, 2011

In Stock: More Tools

Some customers thought that the TG-01 diameter was a little small (75mm/2.95"), so we now have the larger (100mm/3.9") and wider (25mm) TG-02 grinder.

If you deal with installing or removing e-rings, the PZ-01 ring pliers are for you.

If you use more than two different types of tweezers on a regular basis, you may be interested in the PTZ-51 Ceramic Tweezers.

Monday, November 21, 2011

Navigating with buttons

I've added a button and audio user interface to the TurtleBot, similar to the one on my larger robot. This uses an Arduino to sense button presses and communicate them to the netbook.

If you have a small number of locations that you wish the robot to visit, this provides an easy way to tell the robot where to go; once trained, it doesn't require running RViz or any separate computer. If two places are defined on the map then merely pressing the start button will cycle between them, implementing a sort of fetch and carry.
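
The cycling behavior can be sketched as below. `send_goal` stands in for whatever actually dispatches a navigation goal (on a ROS robot, typically an actionlib client talking to move_base), and the waypoint names are hypothetical; only the cycling logic is shown.

```python
import itertools

class GoalCycler:
    """Cycle through trained map locations each time the start button fires.

    waypoints : list of location names (or poses) defined on the map
    send_goal : callback that dispatches a navigation goal for a waypoint
    """
    def __init__(self, waypoints, send_goal):
        self._cycle = itertools.cycle(waypoints)
        self._send_goal = send_goal

    def on_button_press(self):
        # advance to the next waypoint and hand it to the navigation stack
        goal = next(self._cycle)
        self._send_goal(goal)
        return goal
```

With two waypoints defined, repeated button presses alternate between them, which is exactly the fetch-and-carry pattern described above.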

Transporting things around within a building at slow speed is pretty much a business model in some scenarios, such as warehouses, hospitals and supermarkets, and could form the basis for more sophisticated domestic applications such as watering plants or dusting/cleaning surfaces like shelves or tabletops.

The button user interface is completely generic and doesn't necessarily require a Turtlebot. It could be used with any PC based robot which runs the ROS navigation system.

Saturday, November 19, 2011

Nao for the future of robot friends

Jerome Laplace from Generation Robots sent us some information about their excellent work getting a Nao humanoid robot to play 'Connect 4' versus a human opponent.

There are several interesting things about this video and the technical approach taken for getting everything working. The video production quality is excellent and helps explain to a non-technical audience how humans need to perform roughly the same functions as the robot without getting distracted by mathematics or jargon.

The 'Connect 4' game has some interesting properties in terms of robotics, the game pieces are geometrically uniform cylinders with easy to distinguish colors and the playing field is a grid of circles with a known and fixed spacing. If you were to design a game for a robot to play using computer vision you would be hard pressed to make something better, given that the playing field is basically a camera calibration pattern.
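
Because the hole spacing is fixed and known, mapping detected piece centres back to board cells is just a quantization step. Here is a hedged sketch of that mapping; the colour-blob detection itself is assumed to have already happened, and the origin/spacing values would come from the camera calibration step (this is an illustration, not the Generation Robots code).

```python
def board_from_detections(detections, origin, spacing, rows=6, cols=7):
    """Fill a Connect 4 board from (x, y, colour) piece detections in
    pixel coordinates, quantising each centre into a grid cell using the
    known hole spacing. Cells with no detected piece stay None."""
    board = [[None] * cols for _ in range(rows)]
    ox, oy = origin
    for x, y, colour in detections:
        col = round((x - ox) / spacing)
        row = round((y - oy) / spacing)
        if 0 <= row < rows and 0 <= col < cols:
            board[row][col] = colour
    return board
```

The rounding is what makes the fixed grid so forgiving: a detection can be off by nearly half a hole spacing and still land in the right cell, which is part of why this game is such a good fit for robot vision.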

One other nice thing is how the robot is designed to use its weaknesses as strengths. If there is a problem with the robot's game play it can detect the failure and ask the human player to correct the placement of the game piece. This robustness also gives it the ability to detect cheating on the part of the human player, and the robot can react as needed.

While for now this is a single game, it is pretty clear that this approach could be extended to playing checkers, Othello, or perhaps even a game of Go. It is easy to imagine an accretion of such hacks to the point where a generalized solution becomes possible. At the very least, it seems reasonable to build a state machine that can guess which game the human wants to play and then load the correct computer vision hacks and AI game play code.

I was initially uncertain about the business model behind the development of an emotive game engine for robots to play with children. After contemplation I now believe that this idea holds some promise as a de-virtualization of the Tamagotchi.

Tuesday, November 15, 2011

Level UP!!

Thanks to your continued support, I Heart Engineering has leveled up! Now with a bigger and better unified shipping and manufacturing facility in Brooklyn, NYC.

Check out the views from the manufacturing area.

Monday, November 14, 2011


A recent example of navigation with the Turtlebot. Tuning the navigation parameters is quite tricky and the default values didn't work very well. The parameters I used can be found here.

The next step is to have a way of defining map locations and interactively teaching them in a way similar to my larger robot. Once points of interest are defined within the map then an executive planner can be devised to organize the robot's behavior in terms of high level semantics.
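
As a first cut, the executive planner could be nothing more than a table mapping task names to sequences of map waypoints. The task names and the `goto` navigation callback below are placeholders; this is a minimal sketch of the idea, not working robot code.

```python
class Executive:
    """Minimal executive planner: a named high-level task is a sequence
    of map waypoints, dispatched one by one via a navigation callback.

    tasks : dict mapping task name -> list of waypoint names
    goto  : callback that sends the robot to a named waypoint and
            blocks until it arrives (or fails)
    """
    def __init__(self, tasks, goto):
        self._tasks = tasks
        self._goto = goto

    def run(self, task_name):
        visited = []
        for waypoint in self._tasks[task_name]:
            self._goto(waypoint)
            visited.append(waypoint)
        return visited
```

A real version would need failure handling and preemption, but even this table-driven form gives the "high level semantics" layer somewhere to live.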

Monday, November 7, 2011


We will see you there!

Continue Testing

Sunday, November 6, 2011


An example of straight legged walking on the HRP-4C robot. Most humanoids have constantly bent knees, and this gives them some leeway to handle vertical forces throughout the stride. Real humans do this by having a curved spine which is flexible, and this robot presumably has something comparable at the waist section.

If you pause the video as the leg is about to be lifted you can see that - unusually for most humanoids - there is a separate toe section which facilitates the push off.

Humans are the only primates who routinely walk upright, and it turns out that doing this efficiently requires quite accurate sensing and motion control, which is why we are only now beginning to see really human-like walking in robots.

I'm not expecting practical uses of large humanoids like this in the near future, simply because of their complexity and cost. In the longer term, though, assuming current technology trends continue, a humanoid form seems to be considered desirable and would fit best with existing infrastructure.

Tuesday, November 1, 2011

ADIS16488 "Tactical Grade" Inertial Sensor

Analog Devices' newest inertial sensor is now available, with a new 'Tactical Grade' rating on the data sheet. The specs seem ideal for flying robotics: ±450°/sec gyros, ±18 g accelerometers, a ±2.5 gauss magnetometer, and an integrated 300-1100 mbar pressure sensor. There is also an embedded temperature sensor, but for some reason they chose to limit themselves to 'Ten Degrees of Freedom'.

The inertial sensor has a low profile package measuring 44mm x 47mm x 14mm, and does away with the ribbon cable used by Analog's other integrated inertial sensors.

The communication interface is via SPI and operates at 3.0-3.3VDC at temperatures from −40°C to +85°C. Interestingly there are four integrated FIR filter banks which could be very useful for removing internal noise sources such as motor vibrations. Alarms and digital I/O round out the feature set.

If you are buying a 'Tactical Grade' gyro, you would expect the best in terms of accuracy, precision and sample rates. The sensor has an internal factory calibration and optional support to compensate for the effect of linear acceleration on gyroscope bias. Gyros and accelerometers sample at 2.46 kHz after averaging and decimation. The barometric pressure sensor runs at 51.25 Hz. The IMU uses a 32-bit two's-complement register to store most measurements. The gyro, for example, uses the upper 16 bits in two's-complement format with an LSB of 0.02°/sec. For the lower 16 bits, "[t]he MSB has a weight of 0.01°/sec, and each subsequent bit has 1⁄2 the weight of the previous one". The datasheet could use some more statistical details on the A/D conversion process, but it is only Rev. 0.
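
Taken together, those two weights mean the whole 32-bit word can be read as a single two's-complement value with an LSB of 0.02/65536 °/sec. A small conversion helper based on that reading of the datasheet (an interpretation, so check it against the real part before trusting it in flight code):

```python
def gyro_dps(reg32):
    """Convert a raw 32-bit two's-complement ADIS16488 gyro register to
    degrees/second. Upper 16 bits: LSB = 0.02 deg/s. Lower 16 bits extend
    the resolution: the MSB weighs 0.01 deg/s and each subsequent bit half
    that, so the full word scales as 0.02 / 2**16 deg/s per count."""
    if reg32 & 0x80000000:      # sign-extend the two's-complement value
        reg32 -= 1 << 32
    return reg32 * (0.02 / 65536)
```

Sanity checks: a raw value of 1 in the upper word (bit 16 set) gives exactly 0.02°/sec, and the lower word's MSB alone gives 0.01°/sec, matching the two datasheet statements.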

Integrate the ADIS16488 iSensor® MEMS IMU into your sUAS for only $1,619.00 in quantities of 100 to 499.

I look forward to seeing the 'Ultra Mega Tactical Grade' IMU from Analog Devices in the future. This sensor should include an on-board GPS with external antenna connector, additional pressure sensors for airspeed measurement, a humidity sensor, and an integrated real-time quantum clock. The on-board 64-bit ARM core would provide maximum bandwidth to the sensors.