British delivery robot plunges into canal…
The danger of AI is weirder than you think
Some advances in AI have been exaggerated
The National Transportation Safety Board has discovered that the man who was killed in a crash involving a Tesla SUV was playing a game on his phone at the time of the crash.
The man was identified as Apple engineer Walter Huang.
The NTSB said Huang was on his phone playing a game while the vehicle’s autopilot system was engaged. The SUV swerved and hit a concrete barrier on the freeway in Mountain View, California, in March 2018.
Hackers have manipulated multiple Tesla cars into speeding up by 50 miles per hour. The researchers fooled the car’s Mobileye EyeQ3 camera system by subtly altering a speed limit sign on the side of a road in a way that a person driving by would almost never notice.
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla’s automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee’s Advanced Threat Research team.
The researchers stuck a small, nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and Model S sped up by 50 miles per hour.
The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting the altered sign could fool a human into reading it as 85 as well.
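The sticker attack above is an instance of an adversarial perturbation: a change that is tiny per pixel but deliberately aligned with the direction that most shifts the classifier's output. A minimal sketch of the idea, using a hypothetical linear "sign score" in place of the real Mobileye network (the weights, image, and budget here are all illustrative, not McAfee's actual method):

```python
import numpy as np

# Toy illustration of an FGSM-style adversarial perturbation.
# A linear classifier scores a flattened sign image; a per-pixel
# change bounded by eps, aligned with the gradient, shifts the
# score far more than random noise of the same size would.

rng = np.random.default_rng(0)

w = rng.normal(size=100)      # hypothetical classifier weights
x = rng.normal(size=100)      # hypothetical sign image (flattened)

def score(img):
    # Stand-in decision rule: higher score reads as a higher limit.
    return float(w @ img)

eps = 0.05                    # perturbation budget per pixel
# Gradient of w @ x with respect to x is just w, so the worst-case
# bounded perturbation is eps * sign(w).
x_adv = x + eps * np.sign(w)

# Each pixel moves by at most eps, yet the score jumps by
# eps * ||w||_1, which grows with the number of pixels.
print(score(x), score(x_adv))
```

The point of the sketch is the asymmetry the articles describe: the attacker's change is almost invisible (bounded by `eps` per pixel), but because it is aligned with the model's sensitivities rather than random, its effect on the output is large.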
A team of researchers in Israel has successfully tricked Tesla’s and Mobileye’s driver assistance features into stopping for pedestrians that weren’t there, and into driving at speed limits posted on signs that don’t exist, exposing a possible flaw in modern automotive tech that could prove dangerous if exploited.
Researcher Ben Nassi from Ben-Gurion University recently fooled a Tesla Model X running Version 2.5 hardware using a simple video projector. Nassi had previously fooled a Mobileye 630 Pro driver assist system with the same device. Nassi calls the attack “Phantom of the ADAS.”
New footage has emerged showing the moment a Tesla electric vehicle driving on Autopilot crashed into the back of a truck in California earlier this month.
The driver posted the video online, describing in a summary of the footage the moment when the unknown model Tesla experiences the Autopilot failure on January 13 while driving in Milpitas, in the San Francisco Bay area. […]
Drivers have reported several similar problems with Autopilot. In December, federal investigators opened a probe into a 12th Tesla crash believed to be tied to the Autopilot system, which had already been blamed for four fatal accidents.
“This is not the best representation of auto pilot i know, but this is the video i have”.
A Tesla on autopilot rear-ended a Connecticut trooper’s vehicle early Saturday as the driver was checking on his dog in the back seat, state police said.
Police said they had responded to a disabled vehicle that was stopped in the middle of Interstate 95. While the trooper waited for a tow, the Tesla, operating on Autopilot, came down the road and struck the trooper’s vehicle, then rear-ended the disabled vehicle before stopping.
I am wondering whether Tesla ever helps pay for damage from a faulty vehicle.
My husband was driving his Tesla Model 3 yesterday in the rain. He was in navigation mode and it downgraded to lane follow because of the weather. He was in the HOV lane of the interstate, on a straight but wet road, going the speed limit, with no other cars near him. He went under a bridge, and the car suddenly swerved left (no warning/beeping triggered). My husband was alert and had his hands on the wheel and tried to regain control, but the car had already gone into a spin, did a 360, and hit a concrete wall. In Tesla’s system, there was no evidence of a collision. We only knew the exact time because the frunk became disengaged when it crumpled. The airbags were not deployed, which is surprising given it was a 65 mph crash.
I had previous issues with the navigation mode. The first instance was the car’s insistence on moving to the leftmost lane even though I was getting off in less than two miles. I could cancel the lane change, but a split second later it would reinitiate the merge. I could cancel it 10 times in a row, and it did not matter. This was a change from earlier behavior. I have a video of this. The Tesla phone intake person said that was not normal.
A week later, the car in navigate mode nearly led me into an accident. It tried to merge me into a car on the interstate that was going ~30 mph slower than I was. Luckily I was able to take the wheel and swerve around it, and the road was dry, so I was OK. I stopped using the self-driving features after that. My husband, a mechanical engineer, really believes in the Tesla project and kind of dismissed it as bad luck. But then his accident happened yesterday.
Unsafe at any speed – including “parked”.
The National Highway Traffic Safety Administration has launched an investigation into the possibility that battery defects in Tesla vehicles may have caused the cars to burst into flames.
The investigation will involve certain battery management system software updates in Model S and Model X vehicles made between 2012 and 2019 in response to an “alarming number of car fires that have occurred worldwide,” according to a letter the agency sent to Al Prescott, Tesla’s deputy general counsel, on Oct. 24.
The alleged defects in question are “high-voltage battery fires that are not related to collision or impact damage to the battery pack,” according to the letter.