Category: Self Driving Roadkill

I, For One, Welcome Our New Self-Driving Overlords

Car & Driver;

The National Highway Traffic Safety Administration (NHTSA) has opened a preliminary evaluation into Tesla Autopilot systems and the ways this driver assistance technology works to “monitor, assist, and enforce the driver’s engagement with driving while Autopilot is in use.”

The reason for the preliminary evaluation is so that the agency can better understand the causes of 11 Tesla crashes that have happened since the start of 2018, an NHTSA spokesperson told Car and Driver in a statement. These 11 incidents resulted in 17 injuries and one death. More Tesla crashes in which Autopilot was said to be a factor have happened, of course, but these are the 11 that NHTSA will look into to determine what the agency’s next steps should be regarding Tesla’s technology.

This subset of Tesla Autopilot crashes is important to NHTSA because they all involved cases where first responders were active, the agency said, “including some that crashed directly into the vehicles of first responders.” NHTSA told C/D that it confirmed that in all of these cases, the Tesla vehicles in question had either Autopilot or Traffic-Aware Cruise Control engaged just prior to the crashes. Most of the incidents also happened after dark, the agency said, and “the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones.”

Related.

h/t Raymond

We Don’t Need No Flaming Sparky Cars

The Tesla had been traveling from a cul-de-sac on Hammock Dunes Place when it failed to negotiate a curve and crashed into a tree, bursting into flames.
 
According to the station, the Tesla’s batteries continued to reignite despite the efforts of firefighters to extinguish the blaze. It took them four hours, and 23,000 gallons of water, to finally douse the flames completely.

Both of the dead were in passenger seats.

I, For One, Welcome Our New Self-Driving Overlords

It’s a Tesla two-fer;

Police are investigating the fatal conflagration of a Tesla Model X that killed its owner and injured two others in an underground parking garage in Seoul.
 
On Wednesday, the Model X crashed into the wall of the parking garage of an apartment complex in Yongsan District, central Seoul, then caught fire. This led to the burning death of the car’s owner, who was in the passenger seat.
 
The driver, who escaped with injuries, claimed “the car suddenly got out of control,” raising the possibility of sudden unintended acceleration as the cause of the accident, according to police.
 
It is unusual for a car slowly winding its way through a parking garage to suddenly smash into a wall and result in the death of a passenger. The police will ask the National Forensic Service to try to determine the cause of the accident.

h/t Raymond

I, For One, Welcome Our New Self-Driving Overlords

Humans are simply not good at passing off 80-plus percent of a task and then staying alert to monitor what’s going on, which is what Autopilot demands. Since Level 2 systems offer no failover capability and need a human to be ready to take over at any moment, if you’re not paying constant attention, the wreck we see here is precisely the kind of worst-case nightmare that can happen.
 
It’s not just me saying this; experts have known for decades how shitty people are at “vigilance tasks” like these. Imagine having a chauffeur who has been driving you for hours and then sees something on the road he doesn’t feel like dealing with, so he jumps into the back seat and tells you it’s your job to handle it now.
 
You’d fire that chauffeur. And yet that’s exactly what Autopilot is doing here.

I, For One, Welcome Our New Self-Driving Overlords

McKenna Approved

The shuttles can seat up to six people at a time and can travel at speeds of up to 25 km/h. Typically, the shuttles travel along a pre-programmed route and use sensors and software to detect their surroundings and avoid obstacles.
 
Those sensors proved to be very sensitive during Monday’s test. Light snow, blowing leaves, and even geese that can frequently be found around Tunney’s Pasture brought the shuttle to a halt.

I, For One, Welcome Our New Self-Driving Overlords

Engineer…

The National Transportation Safety Board has found that the man killed in a crash involving a Tesla SUV was playing a game on his phone at the time.
 
The man was identified as Apple engineer Walter Huang.
 
The NTSB said Huang was on his phone playing a game while the vehicle’s Autopilot system was engaged. The SUV swerved and hit a concrete barrier on the freeway in Mountain View, California, in March 2018.

I, For One, Welcome Our New Self-Driving Overlords

MIT Technology Review;

Hackers have manipulated multiple Tesla cars into speeding up by 50 miles per hour. The researchers fooled the car’s Mobileye EyeQ3 camera system by subtly altering a speed limit sign on the side of a road in a way that a person driving by would almost never notice.
 
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
 
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla’s automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee’s Advanced Threat Research team.
 
The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year’s Model S sped up by 50 miles per hour.
 
The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well.

h/t KP
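
For the curious: the McAfee attack used a physical sticker, but the underlying idea, adversarial machine learning, is easiest to see in its digital form. Below is a minimal, hypothetical Python sketch of the classic fast gradient sign method (FGSM), using an untrained toy classifier as a stand-in. This is not McAfee’s method or Mobileye’s model, just an illustration of how a tiny, nearly invisible perturbation can flip a classifier’s output.

```python
# Minimal sketch of a digital adversarial perturbation (FGSM).
# The toy CNN, class indices, and epsilon are hypothetical stand-ins,
# not Mobileye's model or McAfee's physical sticker attack.
import torch
import torch.nn as nn

# Hypothetical, untrained classifier: 32x32 sign image -> 10 speed classes.
classifier = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(8 * 32 * 32, 10),
)

def fgsm(image, true_label, epsilon=0.03):
    """Nudge each pixel by +/- epsilon in the direction that most
    increases the classifier's loss on the true label."""
    image = image.clone().requires_grad_(True)
    loss = nn.functional.cross_entropy(classifier(image), true_label)
    loss.backward()
    # A perturbation this small is nearly invisible to a human,
    # which is what made the sticker version so hard to notice.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()

sign = torch.rand(1, 3, 32, 32)          # stand-in for a photo of a "35" sign
label = torch.tensor([3])                # hypothetical class index for 35 mph
fooled = fgsm(sign, label)
print(classifier(fooled).argmax(dim=1))  # may now land on a different class
```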

I, For One, Welcome Our New Self-Driving Overlords

The cat and the laser-pointer;

A team of researchers in Israel has successfully tricked Tesla’s and Mobileye’s driver assistance features into stopping for pedestrians that weren’t there, and into driving at speed limits posted on signs that don’t exist, exposing a possible flaw in modern automotive tech that could prove dangerous if exploited.
 
Researcher Ben Nassi from Ben-Gurion University recently fooled a Tesla Model X running Version 2.5 hardware using a simple video projector. Nassi had previously fooled a Mobileye 630 Pro driver assist system with the same device. Nassi calls the system “Phantom of the ADAS.”

I, For One, Welcome Our New Self-Driving Overlords

Daily Mail;

New footage has emerged showing the moment a Tesla electric vehicle driving on Autopilot crashed into the back of a truck in California earlier this month.
 
The driver posted the video online, describing in a summary of the footage the moment when the Tesla, of unknown model, experienced the Autopilot failure on January 13 while driving in Milpitas, in the San Francisco Bay Area. […]

Drivers have reported several similar problems with Autopilot. In December, a federal probe was opened into a 12th Tesla crash believed to be tied to the Autopilot system, which had already been blamed for four fatal accidents.
