Category: Self Driving Roadkill

I, For One, Welcome Our New Self-Driving Overlords

It’s a Tesla two-fer;

Police are investigating the fatal conflagration of a Tesla Model X that killed its owner and injured two others in an underground parking garage in Seoul.
The Model X crashed into the wall of the parking garage of an apartment complex in Yongsan District, central Seoul, on Wednesday, then caught fire, killing the car’s owner, who was in the passenger seat.
The driver, who escaped with injuries, claimed “the car suddenly got out of control,” raising the possibility of sudden unintended acceleration as the cause of the accident, according to police.
It is unusual for a car slowly winding its way through a parking garage to suddenly smash into a wall and result in the death of a passenger. The police will ask the National Forensic Service to try to determine the cause of the accident.

h/t Raymond

I, For One, Welcome Our New Self-Driving Overlords

Humans are simply not good at passing off 80-plus percent of a task and then staying alert to monitor what’s going on, which is what Autopilot demands. Since Level 2 systems offer no failover capability and need a human to be ready to take over at any moment, if you’re not paying constant attention, the wreck we see here is precisely the kind of worst-case nightmare that can happen.
It’s not just me saying this; experts have known for decades how shitty people are at “vigilance tasks” like these. Imagine having a chauffeur who had been driving you for hours, and then sees something on the road he doesn’t feel like dealing with, so he jumps into the back seat and tells you it’s your job to handle it now.
You’d fire that chauffeur. And yet that’s exactly what Autopilot is doing here.

I, For One, Welcome Our New Self-Driving Overlords

McKenna Approved

The shuttles can seat up to six people at a time and can travel at speeds of up to 25 km/h. Typically, the shuttles travel along a pre-programmed route and use sensors and software to detect their surroundings and avoid obstacles.
Those sensors proved to be very sensitive during Monday’s test. Light snow, blowing leaves, and even geese that can frequently be found around Tunney’s Pasture brought the shuttle to a halt.

I, For One, Welcome Our New Self-Driving Overlords


The National Transportation Safety Board has discovered that the man who was killed in a crash involving a Tesla SUV was playing a game on his phone at the time of the crash.
The man was identified as Apple engineer Walter Huang.
The NTSB said Huang was on his phone playing a game while the vehicle’s autopilot system was engaged. The SUV swerved and hit a concrete barrier on the freeway in Mountain View, California, in March 2018.

I, For One, Welcome Our New Self-Driving Overlords

MIT Technology Review;

Hackers have manipulated multiple Tesla cars into speeding up by 50 miles per hour. The researchers fooled the car’s Mobileye EyeQ3 camera system by subtly altering a speed limit sign on the side of a road in a way that a person driving by would almost never notice.
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla’s automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee’s Advanced Threat Research team.
The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year’s Model S sped up by 50 miles per hour.
The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well.
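The sticker trick above is a classic adversarial example: nudge the input just enough, in just the right direction, and the classifier flips to a wildly different answer. Here’s a toy sketch of the idea — a hypothetical four-pixel linear “sign classifier” (nothing to do with Mobileye’s actual model) and an FGSM-style perturbation that pushes each pixel a small step in the direction that raises the score:

```python
# Toy adversarial example. The classifier, weights, and "sign images"
# are all made up for illustration -- this is not Mobileye's system.

def classify(x, w, b):
    """Return '85' if the linear score crosses the decision boundary, else '35'."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "85" if score > 0 else "35"

# Hypothetical weights and a four-pixel "35" sign.
w = [0.9, -0.4, 0.7, -0.2]
b = -1.0
sign_35 = [0.5, 0.5, 0.5, 0.5]   # score = 0.5 - 1.0 = -0.5  ->  reads as 35

# FGSM-style attack: shift every pixel by epsilon in the direction that
# increases the score (i.e., along the sign of the corresponding weight).
epsilon = 0.3
adversarial = [xi + epsilon * (1 if wi > 0 else -1)
               for xi, wi in zip(sign_35, w)]

print(classify(sign_35, w, b))      # "35"
print(classify(adversarial, w, b))  # "85" -- small nudge, big change
```

The point of the sketch is that the perturbation budget needed to cross a decision boundary can be far smaller than what a human would ever notice — which is exactly why a sticker the size of a strip of tape was enough.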

h/t KP

I, For One, Welcome Our New Self-Driving Overlords

The cat and the laser-pointer;

A team of researchers in Israel has successfully tricked Tesla’s and Mobileye’s driver assistance features into stopping for pedestrians that weren’t there, and driving at speed limits posted on signs that don’t exist, exposing a possible flaw in modern automotive tech that could prove dangerous if exploited.
Researcher Ben Nassi from Ben-Gurion University recently fooled a Tesla Model X running Version 2.5 hardware using a simple video projector. Nassi had previously fooled a Mobileye 630 Pro driver assist system with the same device. Nassi calls the system “Phantom of the ADAS.”

I, For One, Welcome Our New Self-Driving Overlords

Daily Mail;

New footage has emerged showing the moment a Tesla electric vehicle driving on Autopilot crashed into the back of a truck in California earlier this month.
The driver posted the video online with a summary describing the moment the Tesla, of unknown model, experienced the Autopilot failure on January 13 while driving in Milpitas, in the San Francisco Bay Area. […]

Drivers have reported several similar problems with Autopilot. A federal probe was opened into a 12th Tesla crash in December that was believed to be tied to the Autopilot system, which has already been blamed for four fatal accidents.

I, For One, Welcome Our New Self-Driving Overlords

Fox News;

A Tesla on autopilot rear-ended a Connecticut trooper’s vehicle early Saturday as the driver was checking on his dog in the back seat, state police said.
Police said they had responded to a disabled vehicle that was stopped in the middle of Interstate 95. While they were waiting for a tow, the Tesla came down the road on Autopilot.
The Tesla struck the trooper’s vehicle, then rear-ended the disabled vehicle before stopping.

h/t Harambe