Comment of the Day

February 18, 2015

Commentary by David Fuller

Robots Mean Business: A Conversation With Rodney Brooks

Here is a brief sample from a conversation with Rodney Brooks, CTO of Rethink Robotics, also forwarded by David Brown prior to his presentation at Markets Now next Monday:

Sensors and common sense

Over the last 50 years, robots have been identified with the industrial robots in factories, mostly automobile factories, where the robots go through a series of motions again and again, very accurately. But they’re really operating the same way they did when they were first introduced in 1961, when computation was incredibly expensive and sensors were very expensive.

In the last 10 years, we’ve had 50 years of Moore’s law, getting the amount of computation we can have in an embedded, low-cost system so that we can do real-time sensing and real-time 3-D sensing. We also have enough understanding of human–computer interaction and human–robot interaction to start building robots that can really interact with people so that an ordinary person—who doesn’t have to know how to program—can interact with a robot and get it to do something useful.

The first industrial robots had to be shown a trajectory and follow a trajectory, and that’s how the vast majority of industrial robots are still programmed: follow a trajectory accurately. That’s your hammer. So now you have to restructure all your nails so that hammer can hit those nails.

In the new robots, in the new style of robots, there’s a lot of built-in software, a lot of commonsense knowledge that’s just built in. The robot knows—if it doesn’t have anything in its hand, it can’t put it down. So now you don’t have to have error-recovery code put in by the user. We let an ordinary factory worker get the robot to do complex tasks. They don’t have to know anything about programming, they don’t have to know about quaternions or six-dimensional vectors; doesn’t come into it. They’re showing it the objects and what to do with the objects, and the robot is making the inferences.
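
As a purely illustrative aside, here is a minimal Python sketch of the kind of built-in common sense Brooks describes: the put-down action checks its own precondition, so the user never has to supply error-recovery code. This is not Rethink Robotics' actual software, and every name in it is hypothetical.

class Gripper:
    # Toy gripper used only to illustrate built-in commonsense checks.

    def __init__(self):
        self.holding = None  # the object currently grasped, if any

    def pick_up(self, obj):
        self.holding = obj

    def put_down(self):
        # Built-in common sense: if there is nothing in the hand, there is
        # nothing to put down, so skip the step rather than fail the task.
        if self.holding is None:
            print("Nothing in hand; skipping the put-down step.")
            return
        print(f"Placing {self.holding}.")
        self.holding = None


gripper = Gripper()
gripper.put_down()           # handled gracefully; the user wrote no error-handling
gripper.pick_up("widget")
gripper.put_down()           # places the widget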

When a factory worker wants the robot to do something new, they come up to the robot, they grab its arm. They move its arm over an object. There’s a camera there. They show it the object. The robot learns what the object looks like by moving around. They may bring it down, they may press the button to close the fingers. The robot infers, “Oh, I’m supposed to pick something up here.” And if, by the way, the object was moving when it went down, the robot would infer, “Oh, this is a conveyor belt. I know about conveyor belts. I need to match my arm speed to that.”

So they just show the robot the things that it’s got to deal with, where to pick them up, where to put them, when to hold them in front of a scanner, say. And the robot patches everything together and then is able to do the task by itself. 
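
Again purely as an illustration, and not Rethink's actual software, the sketch below (all names hypothetical) shows how a recorded sequence of demonstrations might be patched together into task steps, including the conveyor inference Brooks mentions: an object that is moving at grasp time implies the arm speed should be matched to the belt.

from dataclasses import dataclass

@dataclass
class Demonstration:
    action: str             # e.g. "close_gripper", "open_gripper", "hold_at_scanner"
    object_seen: str        # what the camera saw at that moment
    object_velocity: float  # mm/s; non-zero suggests a conveyor belt

def infer_task(demos):
    # Turn raw demonstrations into a list of task steps the robot can replay.
    steps = []
    for d in demos:
        if d.action == "close_gripper":
            step = f"pick up {d.object_seen}"
            if d.object_velocity > 0:
                # Inference: the object was moving when grasped, so this is a
                # conveyor; match the arm speed to the belt before grasping.
                step += f" from conveyor, matching arm speed to {d.object_velocity:.0f} mm/s"
            steps.append(step)
        elif d.action == "open_gripper":
            steps.append(f"place {d.object_seen}")
        elif d.action == "hold_at_scanner":
            steps.append(f"hold {d.object_seen} in front of scanner")
    return steps

demonstrations = [
    Demonstration("close_gripper", "widget", object_velocity=120.0),
    Demonstration("hold_at_scanner", "widget", object_velocity=0.0),
    Demonstration("open_gripper", "widget", object_velocity=0.0),
]
print(infer_task(demonstrations))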

David Fuller's view

Here is the article: Robots Mean Business.

Note the opening of the second paragraph above:

In the last 10 years, we’ve had 50 years of Moore’s law, getting the amount of computation we can have in an embedded, low-cost system so that we can do real-time sensing and real-time 3-D sensing.

This is the accelerating rate of technological innovation that I have been talking about for a number of years. We have seen a previously unimaginable acceleration in technology over the last ten years, and the rate of innovation should be much faster still over the next decade.

Up until now, I have felt that one of the best ways to benefit as an investor has been to buy shares in the big companies that have benefitted most from the technologies available. Global Autonomies provide many examples, and this service has discussed them at length, not least in Eoin’s reviews.

As a long-term investor, I now think it is time to diversify into the robotics field and will say more about this in the next few days.      
