DWD (Driving While Distracted): Google skips auto-pilot, goes for fully self-driving vehicles

At any given daylight moment across America, approximately 660,000 drivers are using cell phones or manipulating electronic devices while driving, a number that has held steady since 2010. — Distraction.gov Key Facts and Statistics

“I think it’s wonderful that Tesla has gone out there with this technology, but they might have hyped Autopilot a little bit too much. It doesn’t work in all circumstances. Drivers don’t necessarily know when the car goes from tracking fine to a gray area when the car is confused, and then to a situation when the car doesn’t know where it’s going. These things aren’t well-defined.” —  Alain Kornhauser, director of the transportation program at Princeton University

I’ve been puzzling over the question of how Tesla-type auto-pilot systems are really going to work in the real world: the world where drivers are likely to turn over way too much responsibility to the auto-pilot. Drivers are already frequently distracted while they are theoretically in control. Google’s test data makes it very clear that drivers will over-trust the auto-pilot brain, and therefore fail to take control fast enough, or with enough context awareness, to become the pilot. I’ve found the monthly reports from the Google Self-Driving Car Project to be a superb research source on what is working, what isn’t working, and what probably won’t ever work. Read their latest October 2015 report, “Why we’re aiming for fully self-driving vehicles”:

As we see more cars with semi-autonomous features on the roads, we’re often asked why we’re aiming for fully autonomous vehicles. To be honest, we didn’t always have this as our plan.

In the fall of 2012, our software had gotten good enough that we wanted to have people who weren’t on our team test it, so we could learn how they felt about it and if it’d be useful for them. We found volunteers, all Google employees, to use our Lexus vehicles on the freeway portion of their commute. They’d have to drive the Lexus to the freeway and merge on their own, and then they could settle into a single lane and turn on the self-driving feature. We told them this was early stage technology and that they should pay attention 100% of the time — they needed to be ready to take over driving at any moment. They signed forms promising to do this, and they knew they’d be on camera.

We were surprised by what happened over the ensuing weeks. On the upside, everyone told us that our technology made their commute less stressful and tiring. One woman told us she suddenly had the energy to exercise and cook dinner for her family, because she wasn’t exhausted from fighting traffic. One guy originally scoffed at us because he loved driving his sports car — but he also enjoyed handing the commute tedium to the car.

But we saw some worrying things too. People didn’t pay attention like they should have. We saw some silly behavior, including someone who turned around and searched the back seat for his laptop to charge his phone — while travelling 65mph down the freeway! We saw human nature at work: people trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax.

We did spend some time thinking about ways we could build features to address what is often referred to as “The Handoff Problem”: keeping drivers engaged enough that they can take control of driving as needed. The industry knows this is a big challenge, and they’re spending lots of time and effort trying to solve this. One study by the Virginia Tech Transportation Institute found that drivers required somewhere between five and eight seconds to safely regain control of a semi-autonomous system. In a NHTSA study published in August 2015, some participants took up to 17 seconds to respond to alerts and take control of the vehicle — in that time they’d have covered more than a quarter of a mile at highway speeds. There’s also the challenge of context — once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?

In the end, our tests led us to our decision to develop vehicles that could drive themselves from point A to B, with no human intervention. (We were also persuaded by the opportunity to help everyone get around, not just people who can drive.) Everyone thinks getting a car to drive itself is hard. It is. But we suspect it’s probably just as hard to get people to pay attention when they’re bored or tired and the technology is saying “don’t worry, I’ve got this…for now.”
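The report’s handoff arithmetic is easy to verify. As a quick back-of-the-envelope sketch (the helper function name here is mine, not the report’s), here is the distance a car covers at 65 mph during the takeover delays the studies measured:

```python
# Sanity-check the takeover-distance figures quoted in the report:
# how far does a car travel while the driver regains control?

def takeover_distance_miles(speed_mph: float, seconds: float) -> float:
    """Distance traveled (in miles) during a takeover delay."""
    return speed_mph * seconds / 3600.0

for t in (5, 8, 17):
    d = takeover_distance_miles(65, t)
    print(f"{t:2d} s at 65 mph -> {d:.2f} mi ({d * 5280:.0f} ft)")
```

At 65 mph, the 17-second worst case works out to roughly 0.31 miles (about 1,600 feet), which matches the report’s “more than a quarter of a mile”; even the 5-to-8-second range from the Virginia Tech study covers several hundred feet, blind.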

Regulators: please let us have our robocars soon. They won’t be flawless, but they will be much safer than the meatware currently driving lethal steel weapons around our cities.

3 Weak Arguments Against Self-Driving Cars Totally Miss the Point. Let’s not let ourselves get distracted from the main goal here. In the hour that you spend debating the “moral dilemma” at a dinner party, almost 300 people are killed somewhere by driver error [WHO Road Traffic Deaths].