I, For One, Welcome Our New Self-Driving Overlords

Bloomberg Law:

A Tesla car crashed into a Virginia couple’s home on two occasions when its parking assistance features failed, a new lawsuit alleges.
 
Bikan and Daljit Octain paid $111,450 for a 2016 Tesla Model S 90D Automobile, including a $3,000 charge for the “Full Self-Driving Capability,” they allege in a suit in Virginia Circuit Court against Tesla Inc. and Tesla Motors Inc.
 
This capability includes the “Tesla Autopark” feature, which allows the car to be parked remotely, the complaint said. It also includes the “Tesla Summon” feature that lets owners move the car in and out of a parking space from outside the vehicle using a mobile app or the key, the complaint alleges.
 
But the car crashed itself once into their home and months later drove itself into the wall of their garage when they attempted to use the parking assist features, the Octains allege.

Maybe it was just stoned.

22 Replies to “I, For One, Welcome Our New Self-Driving Overlords”

    1. Except you will get that self-driving car soon enough: when one owned by someone else runs over or into you. And Your Betters don’t care, because they know it ain’t gonna happen to them!

  1. I can accept a “self driving car” in the leftmost highway lane (“number one lane”). All the technology is already available. It can keep a constant speed, stay in the lane, and slow down when there is a slower obstacle straight ahead, and resume speed when it is no longer there. Such a feature can be useful on a long drive, and can relieve the driver for hundreds of miles. However, that can only be minimally considered a “self driving” car.
    On the other hand, driving on city streets, parking the car in a variety of available spaces, and avoiding numerous different unknown objects are just too much to pre-program. If you really try to include all the possible variables, you need a much more expensive onboard computer. If you don’t, the car is not truly, reliably “self driving”. In the real world, a one percent failure rate is plainly unacceptable. A .01% rate is not acceptable. Even the industrial “standard” of six sigma, with a .00034% failure rate (3.4 defects per million), is not acceptable. That is roughly one failure in every 294,000 decisions. Even in industry, that is more of a slogan or goal than reality. But even that may mean an annual failure for a car that may be called on to make thousands of decisions during a trip.
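A quick back-of-envelope check of the failure-rate arithmetic above (assuming the conventional six-sigma figure of 3.4 defects per million opportunities; the 10,000-decisions-per-trip number is purely illustrative):

```python
# Six sigma, with the conventional 1.5-sigma shift, allows
# 3.4 defects per million opportunities.
failure_rate = 3.4 / 1_000_000        # 0.00034% expressed as a fraction

# Average number of decisions between failures:
decisions_per_failure = 1 / failure_rate
print(f"{failure_rate:.5%}")           # 0.00034%
print(round(decisions_per_failure))    # 294118, i.e. just under 300,000

# Illustrative only: a car making 10,000 decisions per trip
# would average one failure roughly every 29 trips at that rate.
print(round(decisions_per_failure / 10_000, 1))  # 29.4
```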

    1. OldBruin you make some interesting points. My first reaction was to totally agree. But then I asked myself, why do we expect self driving cars to be perfect? Shouldn’t they just have to be better than the current situation?

      This site: https://www.driverknowledge.com/car-accident-statistics
      says there are 6 million vehicle accidents per year in the USA, causing 3 million injuries and over 30 thousand deaths per year (more than 80 deaths per day).

      Given 276 million vehicles in the USA, we get an accident rate of 2% per year.
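The accident-rate arithmetic above checks out; a minimal sketch using only the figures quoted from that site:

```python
# Figures quoted in the comment above (driverknowledge.com).
accidents_per_year = 6_000_000
vehicles_in_us = 276_000_000

accident_rate = accidents_per_year / vehicles_in_us
print(f"{accident_rate:.1%}")   # 2.2%, i.e. roughly the 2% per year quoted
```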

      So do self driving cars need to be perfect? Possibly not.

      Do I want to ride in one? I must admit no I don’t. I like the perception that I am in control of my destiny. On the other hand I have been involved in 3 accidents where another driver hit my car, and one where I backed into a post.

      Having thought about all the above and having worked a number of years in IT, I am still not ready to trust a self driving car.

      1. OB … that would be ME driving FAST in the left lane … as it should be. As the LAW says … “Slower Traffic Keep Right”. So long as the self-driving Tesla is driving FAST in the left lane, I guess I will begrudgingly accept them into MY driving domain. However, if their fancy, cutting-edge, high-tech, computerized software also SLOWS down in the left lane … for battery-life preservation, or cold-weather slow-driving mode, or radar re-calibration mode … or any other kewl high-tech mode … then I will run the fkcuing battery behemoths off the damn road, and give their computer sensors something to REALLY work out. And give their sleeping or texting occupants something to think about.

        No Tesla will be allowed to enter MY roadway domain unless it knows how to behave there. It’s taken quite some time … but I have managed to SCARE (yes, SCARE) all the slow Prius drivers out of the LEFT lane … I hope the fancy “onboard Tesla learning-computer” takes the hint somewhat more quickly than the slow witted Prius drivers.

        “Slower Traffic Keep Right” … it’s the LAW … it’s only LOGICAL … and safe.

        1. Man oh man…I’m with ya on that bit Kenj. Some of them doing under the posted limit in the Left lane are the arrogant ones that would flip ya the finger – while others, either too stoned or too old and brain-dead, simply have no clue. I’ve discovered however that, when my Lows (HID), Highs (LED) and the 20″ LED bar mounted on my Aftermarket front bumper all come on, the 60,000 lumens of lovely HID/LED illumination in their rear view mirror does not make for, shall we say, “comfortable” driving…for them.

          …works like a charm.

          1. It doesn’t matter if one is doing the speed limit in the PASSING LANE, or 10 mph over the limit. When one has faster traffic overtaking you it’s time to get out of the PASSING LANE and yield to the faster traffic. This was taught to me by a Colorado state trooper in driving class 50 years ago and still is the rule of the road. A passive-aggressive ass in the PASSING LANE is an ass no matter what speed they are doing.

          2. Steakdude … I admire your aftermarket innovations! Tremendously!

            And BTW … I always yield (move over) when someone overtakes me in the #1 lane … it’s only logical. Please motorists! Drive logically. “Slower Traffic Keep Right”

            Note the Law says slow-ER Traffic. It doesn’t matter if you are driving the speed limit in the #1 lane … if you are driving slow-ER than some dude with a 6,000 lumen light bar … move over.

    2. No. As soon as you add construction or a broken down truck the self-drive feature gets confused. This is the problem with self-drive. It works perfectly until it goes wrong. So many of the Tesla self-drive accidents are in construction zones – running into barriers.

    3. “… If you really try to include all the possible variables, you need a much more expensive onboard computer. …”

      The programmer will always be a much greater constraint than the capacity of the computer. The “self-driving” car replaces a human driver present in the moment with a program created by someone who has never been there trying to create appropriate responses to every situation they can imagine and some that they can’t imagine. If anyone thinks they can’t be a better driver than that programmed computer, we already have taxis and buses for them.

      Humans aren’t perfect, neither will their inventions be.

  2. Anyone else catch the irony of someone with the last name ‘Octain’ having problems with a Tesla?

    Anyways, the ‘white powder’ story reminded me of an old automotive joke…

    How can you tell if it’s a DeLorean going down the highway?

    There’s no white line behind it.

    3. So … a Tesla owner notices … for the very FIRST TIME in his years of garage ownership … white powder under a specific location of his parked Tesla … and the owner’s comments suggest it is concrete efflorescence suddenly showing up after years of home ownership. That suggests only one thing to me: Tesla owners are TRUE BELIEVERS who will NEVER acknowledge or accept ANYTHING negative about their so-called eco-autos. They are self-deluded fools. They are incapable of buyer’s remorse. No amount of $$$ spent will change their opinion. No amount of failures will change their opinion that their Tesla is the GREATEST product ever invented by mankind (peoplekind). I personally consider people like this to be the most dangerous of all personalities … the self-deluded. Close-minded, true believers. Worse than the most Fundie of fundamentalist Christians. They are not “open” to any facts which might upset their eco-orthodoxy. Truly dangerous people … especially if they are given political power (see: Jerry Brown, and his ilk).

    And what is the white powder? I don’t know. But I do KNOW that efflorescence never JUST shows up in one small limited portion of a concrete slab or wall. It covers the surface.

    1. Calcium perhaps…but then, does it snow in Kali..? Do roads get salted…?
      Hmm – salt; I wonder how that would affect said Self Driving Computers?

      As for the dyed-in-the-Wool Ideologues…your leftist, socialist, eco-fascist, globalist type…? None of them are capable of truly “seeing” a damned thing; they all show the same Blinding Fanatical, Religious-like “belief”….sorta like what ya see from Rabid Islamics in Pakistan?
      Identical mental disease.

  4. If self-drive systems are so great why do we still have aircraft landing accidents? Big long runway, nice and wide, no other aircraft and … well … bang, off the runway.

    It would be much easier to outfit an aircraft with self-drive than an automobile that is meters from a concrete barrier and surrounded by idiot drivers.

  5. “Bikan and Daljit Octain”

    Are they mooselimb? Perhaps the car crashed because they have downloaded “Crash into crowd of infidels ” app, I hear it is often bundled with the “keep your wife on a leash” app.

  6. a ‘bit’ (pun intended) of techno lingo regarding teslas, aircraft, desktops etc.
    3 decades ago I found out about interrupts in the link between hardware and software.
    the hardware device gets kind of a ‘direct link’ to the goings-on inside the CPU.
    interrupts are prioritized according to the device.
    when an interrupt occurs, the CPU saves important stuff like which instruction in the active program it was on, the contents of registers etc., and then immediately if not sooner takes care of the higher-priority interrupt, which for instance may only be a click on the keyboard. this all happens in millionths of a second, depending on the CPU clock speed.
    in the old days, slower CPUs could get bogged down with interrupts to the point where some of them get ignored.

    this might be what’s happening with mr bull moose musk’s product. or something similar. in which case ‘debugging’ is next to impossible due to the impossibility of duplicating the conditions sufficiently to then confidently code the patch.
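The prioritized-interrupt behavior described above can be sketched as a toy simulation (all names here are hypothetical; real interrupt dispatch happens in hardware and kernel code, not application Python):

```python
import heapq

# Toy model of a prioritized interrupt queue: a lower number means a
# higher priority, as in many real interrupt controllers.
class InterruptController:
    def __init__(self):
        self._pending = []  # min-heap of (priority, name) tuples

    def raise_irq(self, priority, name):
        heapq.heappush(self._pending, (priority, name))

    def service_all(self):
        """Service pending interrupts highest-priority first.

        A CPU too slow to drain this queue effectively 'loses'
        low-priority interrupts, the bog-down the comment describes.
        """
        serviced = []
        while self._pending:
            _, name = heapq.heappop(self._pending)
            serviced.append(name)
        return serviced

ic = InterruptController()
ic.raise_irq(3, "keyboard click")
ic.raise_irq(1, "parking sensor")
ic.raise_irq(2, "timer tick")
print(ic.service_all())  # ['parking sensor', 'timer tick', 'keyboard click']
```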

    bottom line is, who the FCUK would pay >> 100 friggin GRAND for an unreliable and demonstrably dangerous new technology?
