Aug 27, 2018

Link to the report from Uber's fatal accident:

To summarize what happened: Uber's self-driving system decides when emergency braking is necessary. But the software was braking so often on public roads that its overall driving was erratic.

So Uber knowingly disabled emergency braking in all cases.

This essentially turned the car into a missile on our public roads.

At the time of the accident, the system detected the pedestrian, calculated an imminent collision, and initiated emergency braking. But instead of actually braking, the braking command simply went to the logs.
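Here's a minimal sketch of what that failure mode looks like in code. All names here are hypothetical; the point is just that the decision logic still runs, but with actuation disabled the command only ever reaches the log.

```python
# Sketch of the failure mode: the planner computes an emergency-brake
# decision, but the actuation path is disabled, so the command is logged
# instead of acted on. All names are hypothetical, not Uber's code.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("selfdrive")

EMERGENCY_BRAKING_ENABLED = False  # effectively what Uber configured

def on_imminent_collision(time_to_impact_s: float) -> str:
    """Decide on an emergency-brake command and (maybe) act on it."""
    command = f"EMERGENCY_BRAKE (impact in {time_to_impact_s:.1f}s)"
    if EMERGENCY_BRAKING_ENABLED:
        # a real system would command the brake actuators here
        return "braked"
    # The decision is recorded, but never reaches the brakes.
    log.info("suppressed: %s", command)
    return "logged only"

print(on_imminent_collision(1.3))  # → logged only
```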

Combine that with a safety driver who was looking down at the console rather than up at the road, and the fatality was the result.


It is completely beyond me why there isn't a battery of government safety tests that vehicles with autopilot must pass before they are approved for use on our public roads.

How many times have you seen footage of a crash test dummy safely landing in an air bag? I want to see footage of crash test dummies getting pushed out in front of autopilot cars - and hopefully, never getting run down.

May 24, 2018

Direct link to NTSB preliminary report:

According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
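It's worth putting those numbers in perspective. A back-of-the-envelope sketch, using only the NTSB figures above (43 mph, detection 6 seconds out, braking decision 1.3 seconds out) plus an assumed hard-braking deceleration of 7 m/s² (typical for dry pavement, not a figure from the report):

```python
# Rough distances implied by the NTSB numbers. The 7 m/s^2 deceleration
# is an assumed hard-braking figure, not something stated in the report.
MPH_TO_MS = 0.44704

speed = 43 * MPH_TO_MS            # ~19.2 m/s
detect_dist = speed * 6.0         # distance covered in the 6 s after detection
decide_dist = speed * 1.3         # distance covered in the final 1.3 s
stop_dist = speed**2 / (2 * 7.0)  # idealized stopping distance at 7 m/s^2

print(f"speed:           {speed:.1f} m/s")
print(f"6 s of travel:   {detect_dist:.0f} m")
print(f"1.3 s of travel: {decide_dist:.0f} m")
print(f"stopping dist:   {stop_dist:.0f} m")
```

Under those assumptions the car had on the order of 115 meters of road between first detection and impact, while an idealized full stop would have needed roughly 26. Even at the 1.3-second mark, hard braking would have shed a large share of the impact speed. The command just never reached the brakes.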