I, For One, Welcome Our New Self-Driving Overlords

MIT Technology Review:

Hackers have manipulated multiple Tesla cars into speeding up by 50 miles per hour. The researchers fooled the car’s Mobileye EyeQ3 camera system by subtly altering a speed limit sign on the side of a road in a way that a person driving by would almost never notice.
 
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
 
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla’s automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee’s Advanced Threat Research team.
 
The researchers stuck a tiny, nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year’s Model S sped up by 50 miles per hour.
 
The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well.

h/t KP

23 Replies to “I, For One, Welcome Our New Self-Driving Overlords”

    1. “A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well. ”

      It looks more like 85 mph than any other 85 mph speed limit sign that I’ve ever seen.
      Because I’ve never seen one.

      Self-driving capability that lacks a basic sanity check… after how many years of development? Incredible.

  1. Self-driving can work, but only under limited circumstances. Basically, take the rules from autopilot and air traffic control.

    All planes must maintain X horizontal distance and X vertical distance from all other planes. All planes must follow, precisely, the pre-planned routes. And so on. The only real application I see for self-driving is parking lots. You fence the lot off so that there is never a human in the area (automated snow clearing, automated vehicle recovery, etc.). There is a designated arrival/departure area where people get in and out of the vehicle: the vehicle parks itself, comes back to that area to pick people up, and the people drive away from there.

    Self-driving will only ever work where you can control 99% of the environment. Autopilot can land an airplane, but they always have at least two pilots on every flight for a reason. Even a fully automated parking lot would require a control tower manned by humans.

  2. I’m sure there are some people who could be fooled by that sign. There are some people who buy Teslas.

  3. The biggest mistake both proponents and critics of self-driving cars make is assuming the cars should navigate the same way humans do. The correct way of handling this is for the car to be in contact with a central traffic control system that gives live updates on road states, to have an onboard map database with known speed limits coded in, to read the signs optically, and to monitor the relative speeds of the cars around it. Set the car speed to whatever all of those broadly agree on, with heuristics for priority.

    But even if you don’t have all that, this is simply bad programming. Naively reading a sign and setting the car speed automatically is foolish. The car knows how fast it’s going and has the ability to detect traffic around it. The speed limit suddenly jumping from 35 to 85 would be really, really rare in the real world and should trigger a bunch of secondary sensor checks to ensure that’s really what the car should be doing.

    1. This “simply bad programming” is simply bad driving. The programmer is driving the car. The problem is that the programmer isn’t present, isn’t seeing the real conditions at the time and place where the car is operating, has almost always never heard of (let alone seen even a picture of) the particular place where the car is operating, and made all the driving decisions before the car was built.

        1. lemme rephrase that ream:
          software written in this manner doesn’t work.
          m’kay?
          confine yerself to familiar topics like ‘what’s in my toybox today?’

        2. Yes. That’s how software doesn’t work. That’s why software can’t drive like people. As you said, it’ll only work when there is central control, which will need the entire road system to be rebuilt with electronic sensors and communications, and we won’t be able to drive ourselves anywhere, and the system won’t take us anywhere the makers of the system didn’t think we should ever want to go. We’ll all be centrally controlled by computers. It seems that sounds lovely to you. But why will it be provided for you? Such a system won’t need you, so why should it exist to serve you?

        3. True. When experiencing something outside of its experience, a computer will shut down. And no “OnError” message can be right all of the time. And the errors will kill.

          One of the strongest supporters of self-driving cars at my work hates questions about edge cases, because we keep coming up with ways that his perfect car would kill him because of bad or non-universal programming.
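The plausibility check comment 3 describes could look something like the following minimal sketch. Every threshold, input, and function name here is invented for illustration; this is not Tesla’s or Mobileye’s actual logic.

```python
# Hypothetical plausibility check for a camera-read speed limit.
# All thresholds and data sources are assumptions for illustration.

MAX_PLAUSIBLE_JUMP_MPH = 20  # assumed: posted limits rarely jump more than this at once
MAX_LEGAL_LIMIT_MPH = 85     # highest posted limit in the US (Texas SH 130)

def accept_camera_limit(camera_mph, current_limit_mph, map_limit_mph, traffic_avg_mph):
    """Return the speed limit the controller should trust.

    camera_mph        -- value OCR'd from the sign
    current_limit_mph -- limit the car is currently using
    map_limit_mph     -- limit from the onboard map database (None if unknown)
    traffic_avg_mph   -- average speed of surrounding traffic (None if no traffic)
    """
    # Reject values no US road could legally post.
    if camera_mph > MAX_LEGAL_LIMIT_MPH:
        return current_limit_mph

    # A sudden large jump needs a second source to confirm it.
    if abs(camera_mph - current_limit_mph) > MAX_PLAUSIBLE_JUMP_MPH:
        confirmations = []
        if map_limit_mph is not None:
            confirmations.append(abs(camera_mph - map_limit_mph) <= 5)
        if traffic_avg_mph is not None:
            confirmations.append(abs(camera_mph - traffic_avg_mph) <= 15)
        if not any(confirmations):
            return current_limit_mph  # keep the old limit; flag for review

    return camera_mph

# The doctored sign: camera reads 85 on a 35 mph road with traffic doing ~37.
print(accept_camera_limit(85, 35, 35, 37))  # → 35
```

With both the map and the surrounding traffic disagreeing, the spoofed 85 is rejected and the car keeps its current limit.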

  4. You can post a sign saying the limit is 85 but I have this gift called reasoning, and using that I can come to the conclusion that the sign is wrong and the speed limit is not, in fact, 85.

  5. Apparently the sign, which I read as a CLEAR 35 mph, is also able to fool the reporter… Tesla and MIT’s autonomous reporter must be running the same software.

    Laptop on wheels @ $100,000 per. SMH Only a complete MORON would purchase one…like my neighbour.

  6. what? NO subroutine telling the precious mobile battery that NOWHERE does the speed limit exceed 100 km/h on Cdn highways? hmmm?
    mebbe they are experiencing the ‘imperial-to-metric conversion syndrome’, i.e., 100 km/h becomes 100 mph.

    fcukheads.
    whats tesla stock today? 1000 ??? when will it be 1000 yen?

    1. But no happy ending. The Darwin Awards try to tell us that terminal stupidity can kill, but they miss so many easy targets.

  7. “The modified speed limit sign reads as 85 on the Tesla’s heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well. ”

    I can, sort of, see how, if you weren’t paying close attention (and had less than good eyesight) that from a distance you could read it as 85.

    But it wouldn’t have ‘fooled’ you, because people understand context and can piece together much larger datasets: ‘there are no places with an 85 mph speed limit in the US’, ‘85 mph is a ridiculous speed limit in this area’, ‘people like to mess with speed limit signs’, and all of that added up means you’d not speed up.

  8. Tesla hasn’t delivered a car with the Mobileye hardware used in this “hack” since October 2016. They transitioned initially to a pair of Nvidia processors, which were superseded in April 2019 by a custom “system on a chip.”

  9. The lack of a systemic approach to everything, and the acceptability of “disruption” in business, is to blame.
    The systemic approach would have been to install barcoded, QR-coded, or active signs with transponders before allowing self-driving vehicles on the road. But no, the la-la-land legislators rely on signs that may be knocked off by snow ploughs or skidding cars.
    I knock off a 35 mph sign on a 50 mph road, and a Tesla flies through a crowd of high school students. Nothing could be finer.

    The correct approach would be to allow self-driving only on roads certified and assumed by the municipalities for that specific purpose, just as German autobahns are the only places where one is allowed to drive with no speed limit. But no, it is okay everywhere for Tesla.

    The correct approach would be to allow lane-assist and brake-assist only on roads and between vehicles that are equipped to support the feature. But no, it is allowed everywhere and for anyone. Where is our “if it only saves one life then it is worth it” attitude? These technologies have killed many already. We are no longer allowed to fill up an over-10-year-old propane cylinder on the dubious premise that it might leak while being transported, since it is Transport Canada that administers the act. How many 11-year-old cylinders have caused an explosion of a vehicle on the road? ZERO. Tesla and other self-driving vehicles killed more people in one year than my 11-year-old BBQ cylinders have in 20.

    $20 says that the Scum Lord has bribed everyone and his sister. Forensic audit, please!
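Comment 9’s “active signs with transponders” amounts to authenticating the limit rather than reading it optically, so a doctored sticker carries no weight. A minimal sketch, assuming a shared-key HMAC scheme; the key, sign IDs, and message format are all invented for illustration:

```python
# Hypothetical "active sign" broadcast check: the car trusts only limits
# that carry a valid tag from the road authority. Key and format invented.
import hmac, hashlib

ROAD_AUTHORITY_KEY = b"shared-secret-provisioned-at-certification"  # assumption

def sign_limit(limit_mph: int, sign_id: str) -> bytes:
    """Tag a (sign, limit) pair as the road authority would."""
    msg = f"{sign_id}:{limit_mph}".encode()
    return hmac.new(ROAD_AUTHORITY_KEY, msg, hashlib.sha256).digest()

def trusted_limit(limit_mph: int, sign_id: str, tag: bytes):
    """Return the limit if the broadcast authenticates, else None."""
    expected = sign_limit(limit_mph, sign_id)
    return limit_mph if hmac.compare_digest(tag, expected) else None

# A genuine broadcast verifies; a tampered limit does not.
good = trusted_limit(35, "TX-130-0042", sign_limit(35, "TX-130-0042"))
bad = trusted_limit(85, "TX-130-0042", sign_limit(35, "TX-130-0042"))
print(good, bad)  # → 35 None
```

A real deployment would use public-key signatures rather than a shared secret, but the principle is the same: the sticker attack works only because the car believes anything it can read.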

  10. hello, y’all
    my beautiful state of Texas has a 40-mile-long toll road between San Antonio and Austin. The speed limit is 85 mph. The cost is $6.17 one way. yehaaa
