Humans are simply not good at handing off 80-plus percent of a task and then staying alert enough to monitor what’s going on, which is what Autopilot demands. Level 2 systems offer no failover capability and need a human ready to take over at any moment; if you’re not paying constant attention, the wreck we see here is precisely the kind of worst-case nightmare that can happen.
It’s not just me saying this; experts have known for decades how shitty people are at “vigilance tasks” like this. Imagine a chauffeur who has been driving you for hours, and who then sees something on the road he doesn’t feel like dealing with, so he jumps into the back seat and tells you it’s your job to handle it now.
You’d fire that chauffeur. And yet that’s exactly what Autopilot is doing here.
I, For One, Welcome Our New Self-Driving Overlords
A Tesla test drive in drunk mode.
I, For One, Welcome Our New Self-Driving Overlords
The shuttles can seat up to six people at a time and can travel at speeds of up to 25 km/h. Typically, the shuttles travel along a pre-programmed route, using sensors and software to detect their surroundings and avoid obstacles.
Those sensors proved to be very sensitive during Monday’s test. Light snow, blowing leaves, and even geese that can frequently be found around Tunney’s Pasture brought the shuttle to a halt.
I, For One, Welcome Our New Self-Driving Overlords
"Full Self Driving" #Tesla chooses the wrong lane, despite cheating with pre-loaded maps and in broad daylight, then cuts across the bike lane. Safe, @NHTSAgov?
(Cred: Tesla Raj from Youtube) $TSLA $TSLAQ pic.twitter.com/WQvLNodm1T— Greta Musk (@GretaMusk) October 25, 2020
I, For One, Welcome Our New Self-Driving Overlords
British delivery robot plunges into canal…
I, For One, Welcome Our New Self-Driving Overlords
The danger of AI is weirder than you think
I, For One, Welcome Our New Self-Driving Overlords
Some advances in AI have been exaggerated
I, For One, Welcome Our New Self-Driving Overlords
The National Transportation Safety Board has discovered that the man who was killed in a crash involving a Tesla SUV was playing a game on his phone at the time of the crash.
The man was identified as Apple engineer Walter Huang.
The NTSB said Huang was playing a game on his phone while the vehicle’s Autopilot system was engaged. The SUV swerved and hit a concrete barrier on the freeway in Mountain View, California, in March 2018.
I, For One, Welcome Our New Self-Driving Overlords
Hackers have manipulated multiple Tesla cars into speeding up by 50 miles per hour. The researchers fooled the car’s Mobileye EyeQ3 camera system by subtly altering a speed limit sign on the side of a road in a way that a person driving by would almost never notice.
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla’s automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee’s Advanced Threat Research team.
The researchers stuck a tiny, nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year’s Model S sped up by 50 miles per hour.
The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well.
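The McAfee sticker is an instance of adversarial machine learning: a structured, barely visible change to the input that pushes a classifier across its decision boundary. A toy numpy sketch of the idea (not McAfee’s actual method; the linear “sign reader,” its weights, and all numbers here are invented for illustration):

```python
import numpy as np

# Toy illustration only -- NOT the real attack. A made-up linear "sign
# reader" scores an image: a positive score reads as "85", a negative
# score as "35". Perturbing every pixel slightly in the direction of the
# score's gradient (the idea behind FGSM-style attacks) flips the reading
# while keeping the per-pixel change small.

rng = np.random.default_rng(0)
w = rng.normal(size=256)                  # hypothetical classifier weights
# A synthetic "35" image: its score w @ x is clearly negative.
x = -0.02 * np.sign(w) + rng.normal(scale=0.01, size=256)

def reads_85(img):
    return bool(w @ img > 0)

eps = 0.05                                # per-pixel perturbation budget
# For a linear score w @ img, the gradient w.r.t. the image is just w,
# so the attack nudges each pixel by eps in the direction of sign(w).
x_adv = x + eps * np.sign(w)

print(reads_85(x))      # clean image: read as "35"
print(reads_85(x_adv))  # perturbed image: read as "85"
```

Real attacks against a camera pipeline like Mobileye’s are harder (the perturbation must survive printing, lighting, and viewing angle), but the mechanism is the same: many tiny, coordinated pixel changes that add up to a flipped label.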
h/t KP
I, For One, Welcome Our New Self-Driving Overlords
The cat and the laser-pointer:
A team of researchers in Israel has successfully tricked Tesla’s and Mobileye’s driver-assistance features into stopping for pedestrians that weren’t there, and into obeying speed limits posted on signs that don’t exist, exposing a flaw in modern automotive tech that could prove dangerous if exploited.
Researcher Ben Nassi from Ben-Gurion University recently fooled a Tesla Model X running Version 2.5 hardware using a simple video projector. Nassi had previously fooled a Mobileye 630 Pro driver-assist system with the same device. He calls the attack “Phantom of the ADAS.”
I, For One, Welcome Our New Self-Driving Overlords
New footage has emerged showing the moment a Tesla electric vehicle driving on Autopilot crashed into the back of a truck in California earlier this month.
The driver posted the video online, describing in a summary of the footage the moment the Tesla (model unknown) suffered the Autopilot failure on January 13 while driving in Milpitas, in the San Francisco Bay Area. […]
Drivers have reported several similar problems with Autopilot. In December, federal investigators opened a probe into a 12th Tesla crash believed to be tied to the Autopilot system, which has already been blamed for four fatal accidents.
We Don’t Need No Flaming Sparky Cars
I, For One, Welcome Our New Self-Driving Overlords
“This is not the best representation of auto pilot i know, but this is the video i have”.
I, For One, Welcome Our New Self-Driving Overlords
The number of $TSLA unintended acceleration events is shocking. Whether it’s a deficiency of the control system or a deficiency of the driver interface, it needs to be addressed. https://t.co/akwvgEx9gW
— CoverDrive (@CoverDrive12) December 25, 2019
I, For One, Welcome Our New Self-Driving Overlords
A Tesla on Autopilot rear-ended a Connecticut trooper’s vehicle early Saturday while the driver was checking on his dog in the back seat, state police said.
Police said they had responded to a disabled vehicle stopped in the middle of Interstate 95. While they were waiting for a tow, the Tesla came down the road on Autopilot.
After striking the trooper’s vehicle, the Tesla then rear-ended the disabled vehicle before stopping.
h/t Harambe




