No, it’s not the opening line of a joke. Or, if it is, the joke is in poor taste, and the person telling it will find himself asking, “Too soon?” About a month ago, at the Consumer Electronics Show in Las Vegas, a Tesla Model S self-driving car ran over and “killed” a robot that was walking to its job at the show.
Think about that…
Are you more disturbed by the incident, the fact that they used the word “killed,” or the fact that we are now reading stories about one autonomous thing colliding with some other autonomous thing?
I, and perhaps some of you, formed thoughts about robots and autonomous things beginning in the ’60s, when they started to be featured in TV shows as handy things people would someday have. Of course, I’m talking about Rosie, the robot maid on The Jetsons, and Robot on Lost in Space. If you’re thinking, “what a slacker, he couldn’t even google the name” – that was the name of the Class M-3 Model B-9 General Utility Non-Theorizing Environmental Control Robot famous for saying “Danger, Will Robinson!” Aren’t you glad they didn’t use an acronym?
I’m a little disturbed by the fact that we have to read stories like this. Not because the robots didn’t end up like Rosie, not because the robot that was destroyed is/was cute, but because of what it says about control. I’m not sure there was much control in place. The story says that the police came to investigate but that no charges had been filed at the time of the story. Who would you charge? With what crime? The robot, a Promobot, rents for $2,000 a day. According to the news story:
“Of course we are vexed,” said Oleg Kivokurtsev, Promobot’s development director. “We brought this robot here from Philadelphia to participate at CES. Now it neither cannot participate in the event or be recovered.”
Apparently, their grammar-bot is every bit as effective as the Promobot. I’m sorry, too soon?
If you’re thinking, “that’s interesting, Dan, but I don’t have any plans to buy a robot or a self-driving car,” you might still have reasons to be worried. Consider the following headline from a story on the travel news website Boarding Area:
“ANA Boeing 787 dual engine shutdown upon landing
All Nippon Airways is currently investigating an incident that occurred on the 17th of January that saw both engines of one of their Boeing 787-8 Dreamliners shutdown simultaneously.”
The good news: the engines shut down after landing. The bad news: they shut down before the thrust reversers had been deployed to slow the plane. “The crew let the aircraft roll until it came to a standstill 2450 meters (8030 feet) down the runway.” Which is great, but if they had landed at LaGuardia, the last 3,030 feet of that process would have been in Flushing Bay.
The article goes on to say:
“It’s worth noting that a bulletin was released by Boeing not so long ago to pilots and maintenance crews about the Thrust Control Malfunction Accommodation (TCMA) system, which prevents risk in an uncommanded high-thrust situation, stating that errors in the landing sequence could cause the system to activate.”
Again, does anyone else find it disturbing that our travel lexicon includes a reference to a “Thrust Control Malfunction Accommodation (TCMA) system?” I was hoping that ‘thrust control’ would be the responsibility of a well-trained pilot. I would hope that a ‘thrust control malfunction’ might be reported to a maintenance person and repaired before the plane ever flew again, instead of being accommodated by a system.
P.S., you don’t need to own the robot or the car to be involved in an accident. Last March, a self-driving car being tested by Uber Technologies Inc. struck and killed a pedestrian in Arizona. In this case:
“The NTSB said the Uber vehicle’s sensors detected the pedestrian walking across the road with a bicycle six seconds before impact. At first, the self-driving system’s software classified the pedestrian as an unknown object, then as a vehicle and finally as a bicycle with varying expectations of where the bike was headed.”
However, the vehicle wasn’t programmed to stop for obstructions in its path, and the human ‘safety operator’ didn’t apply the brakes. Yes, yes, also according to the NTSB, “the pedestrian had drugs in her system and didn’t look for traffic,” and the investigation continues.
Whether or not any of this causes you to worry, you should know that “Robots will be ready when the next recession hits.” A recent article in the Washington Post talks about how technology advances that are often slow to be adopted get a boost during a recession. I just hope Wal*Mart doesn’t start arming the robot security guards.