
Featured Tesla Autopilot recall probed by safety regulator following new crashes

Discussion in 'Prius, Hybrid, EV and Alt-Fuel News' started by Gokhan, Apr 26, 2024.

  1. AndersOne

    AndersOne Active Member

    Joined:
    Jun 10, 2023
    Location:
    Europe
    Vehicle:
    Other Hybrid
    Model:
    N/A
    Most modern cars have front side radar just for these kinds of situations - the Prius Prime does too, which is why it can offer an automated lane-change function.

    Unlike Tesla (no radar), which cheaps out on most sensors to save money... which works great until it doesn't.
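
    Roughly, the kind of gate a side radar enables before an automated lane change looks like this - a simplified sketch where RadarTrack, lane_change_clear, and every threshold are invented for illustration, not any manufacturer's actual logic:

        # Hypothetical lane-change gate using side/rear radar tracks.
        # Thresholds are illustrative only.
        from dataclasses import dataclass

        @dataclass
        class RadarTrack:
            range_m: float            # distance to the tracked vehicle in the target lane (m)
            closing_speed_mps: float  # positive = that vehicle is gaining on us (m/s)

        def lane_change_clear(tracks: list[RadarTrack],
                              min_gap_m: float = 15.0,
                              min_ttc_s: float = 4.0) -> bool:
            """Allow the lane change only if every tracked vehicle leaves a safe gap."""
            for t in tracks:
                if t.range_m < min_gap_m:
                    return False
                if t.closing_speed_mps > 0 and t.range_m / t.closing_speed_mps < min_ttc_s:
                    return False  # time-to-collision too short
            return True

    A camera-only car has to estimate those ranges and closing speeds from vision alone, which is exactly where a dedicated radar earns its cost.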
     
  2. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    Tesla uses misleading advertising to sell cars. What else is new?
     
  3. John321

    John321 Senior Member

    Joined:
    Nov 16, 2018
    Location:
    Kentucky
    Vehicle:
    2008 Prius
    Model:
    Two
    Some would say this is what's new and different:
    The number of accidents and fatalities due to the advertising/misuse/failures of the system.
    Improper safeguards to prevent the misuse.
    The NHTSA investigation and its findings.
    The investigation of Ford for a very similar situation.

    The crux of the matter is not a particular car brand but a disregard for public safety, proper testing, and proper implementation.
    That was the finding of the NHTSA Tesla investigation, and it will probably be the finding of the Ford investigation as well.

    'Critical safety gap' between Tesla drivers, systems cited as NHTSA launches recall probe (msn.com)
    "The agency also closed a nearly three-year investigation analyzing 956 crashes involving Tesla vehicles up to Aug. 30, 2023. Nearly half of the accidents (467) could have been avoidable, ODI said, but happened because "Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.""

    Why not just temporarily pause self-driving implementation and develop a solution together?
    Why not form a cooperative committee of all auto manufacturers with NHTSA, design a universal automated-driving system with adequate safeguards and protections, do proper testing, and then roll it out? Implement a basic standard that every automated-driving system design must meet - rigorous testing by NHTSA against real-life operating situations - and then, and only then, can a system be licensed to be installed in a vehicle.
     
    #43 John321, May 1, 2024
    Last edited: May 1, 2024
    dbstoo and AndersOne like this.
  4. Zeromus

    Zeromus Member

    Joined:
    Oct 21, 2023
    Location:
    Ottawa Canada
    Vehicle:
    2024 Prius Prime
    Model:
    SE
    Of course it's not limited to Tesla. But Tesla is the biggest one pushing these technologies and using the most ambiguous language, with a blowhard CEO making wild claims on the daily about the capabilities - automating your Tesla into a robo-taxi to make money on the side being just one such claim, operating on the assumption that FSD will enable this.

    ETA:

    Like it or not, there's a human psychology to all of this. People, to someone else's point, do get lazy and do become accustomed to these automated support systems, and they do begin to offload some of that mental effort while driving to the car. This is not, in and of itself, a wholly bad thing. While you still need to pay attention, the level of attention and care given to tasks like keeping in your lane can be reduced. The inherent subconscious stress of "what if I veer from my lane by accident" lessens when you have feedback to support you in recognizing when that happens. This makes driving safer, not only because an extra check on human error exists, but also because people *can* be more attentive to other things while driving. The flip side is that people can also rely on it to the point where they are less attentive. And that is bad.

    The more people are told, and come to expect, that cars will "drive themselves," the worse the latter situation gets. It's why messaging is important, and also why the technologies that act as a barrier against people who take too much advantage need to be robust. Tweaking these systems so that they don't chime or beep at you when your hands are off the steering wheel for longer than it takes to, I don't know, sneeze unexpectedly, is bad. With the ability to push software updates to these feedback systems, the regulators need to be even more on top of it too. As I noted previously in this thread, Tesla pushed an update sometime last year that made the car beep at drivers *less* often and allowed them to have their hands off the wheel for longer periods while using the driver-assist features. There needs to be a legal minimum for the systems that try to keep people engaged with the car while using automated support features. The fact that Tesla has been playing with its previous settings is an issue.
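
    That legal minimum is basically an escalation timer: the longer the driver stays disengaged, the stronger the response, ending in a forced disengagement. A toy sketch of the idea (the function name and every timing value below are invented for illustration; nothing here is Tesla's or NHTSA's actual specification):

        # Hypothetical driver-engagement escalation ladder.
        # All timings are made up; a regulator would pin down the real ones.
        def engagement_action(hands_off_seconds: float) -> str:
            if hands_off_seconds < 10:
                return "none"
            if hands_off_seconds < 20:
                return "visual warning"
            if hands_off_seconds < 30:
                return "audible chime"
            return "disengage assist and lock it out for the rest of the drive"

    An over-the-air update that quietly raises those thresholds makes the system more permissive overnight, which is exactly the kind of change I'm saying regulators should have to sign off on first.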
     
    #44 Zeromus, May 1, 2024
    Last edited: May 1, 2024
    AndersOne and Trollbait like this.
  5. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    Current events, this week:
    • 4/29 (Mon) - Tesla downloaded 2024.3.25 with the latest Full Self Driving (FSD).
    • 4/30 (Tue) - Drove ~40 miles using FSD and found the 'Wilson weave' all but gone.
    • 5/1 (Wed) - Wash and clean car for trip and dogs to park
    • 5/2 (Thu) - Load car for 1,400 mi trip the next day. Leave after midnight.
    • 5/3 (Fri) - Will drive 700 mi to Munro & Associates for a $25 Open House.
      • Overnight charge at motel.
    • 5/4 (Sat) - Attend Open House and see about upgrading to $250 ticket.
    • 5/5 (Sun) - Drive 700 mi home
    Bob Wilson
     
  6. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    4Runners are hardly AI-controlled (sigh) .... but in the New World Order, where everyone is a victim, it won't be surprising if the murderer driver gets off scot-free
    .
     
    bisco likes this.
  7. Zeromus

    Zeromus Member

    Joined:
    Oct 21, 2023
    Location:
    Ottawa Canada
    Vehicle:
    2024 Prius Prime
    Model:
    SE
    If you read the article, it looks like it was updated with info that he was not let off, unless this is related to an appeal.

    And the article also says that the guy gave a statement to police right afterward saying his foot was stuck on the pedal... so....
     
    bisco likes this.
  8. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    OK, so I guess we're not quite that far gone yet.
     
  9. Zythryn

    Zythryn Senior Member

    Joined:
    Apr 28, 2008
    Location:
    Minnesota
    Vehicle:
    Other Electric Vehicle
    Model:
    N/A
    When I change the setting to allow FSD to be used, there is information provided that I must acknowledge. This information includes the fact that the system requires driver awareness and attention - that it is NOT a system that takes over driving duties, but one that assists.
    When I check different options I get more warnings.

    When driving, if my eyes aren't on the road, I get warnings, and if I repeat this often, FSD is no longer accessible.

    All people who purchase the car now get training on what exactly FSD is, how it works, and its limitations.


    There have been bad actors for decades, well before any assistance technologies. I have a friend who quit driving his motorcycle due to all the people out there applying make-up, reading books, and not paying any attention.

    What I would like to see is a data-driven comparison of the number of accidents Teslas have compared to other cars.
    Yes, cars with FSD can get into accidents. But is it more or fewer than other cars, and by how much?
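
    A fair comparison has to normalize by exposure - crashes per million miles rather than raw crash counts. A quick sketch of the arithmetic (both fleets and all numbers below are invented placeholders, not real NHTSA or Tesla data):

        # Crash-rate normalization: raw counts mean little without miles driven.
        # The counts and mileages here are purely illustrative.
        def crashes_per_million_miles(crashes: int, miles: float) -> float:
            return crashes / miles * 1_000_000

        fleet_a = crashes_per_million_miles(crashes=500, miles=1_500_000_000)  # ~0.33
        fleet_b = crashes_per_million_miles(crashes=20, miles=40_000_000)      # ~0.50
        # The fleet with far more total crashes can still have the lower rate.

    Without the mileage denominator (and ideally some control for road type and driver mix), a raw crash count doesn't answer the question.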
     
  10. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    Perhaps the detractors won't be happy unless & until the system gives a mild shock to those who try to screw with it.
    Then they'll be moaning because the static zapper doesn't include a defibrillator - when/if it sends hearts out of rhythm. The only thing not considered, it seems, would be the number of accidents per million miles... and who knows why that's not the litmus test
    .
     
    Zythryn likes this.
  11. John321

    John321 Senior Member

    Joined:
    Nov 16, 2018
    Location:
    Kentucky
    Vehicle:
    2008 Prius
    Model:
    Two
    I don't see how the automakers touting this technology will stay solvent if they continue to tout it:

    Vultures are already circling them

    Tesla Autopilot Lawsuits (klinespecter.com)

    "Tesla vehicles driven in Autopilot or “full self-driving mode” have been involved in far more accidents – including those resulting in at least 17 deaths and many serious injuries — than previously reported.

    That, according to a recent analysis by The Washington Post of data from the National Highway Traffic Safety Administration (NHTSA) that discovered 736 crashes since 2019 in the United States involving Teslas driven on Autopilot.

    If you or a loved one suffered a catastrophic injury or death believed to be as a result of a Tesla Autopilot vehicle, you may have grounds for a lawsuit against the automaker. Contact us for a free consultation about your case.

    Kline & Specter, PC, with 60 attorneys, has the experience and expertise to investigate and litigate auto accident claims that may have been the result of defective vehicles. We have won billions of dollars in product liability verdicts and settlements, including verdicts of $153 million and $52 million from one car company and a $30 million settlement from another, just to name a few.

    The Post article noted that the vast majority of serious accidents involving Autopilot have occurred in Tesla vehicles, or 736 of the 807 recorded by NHTSA in the past four years.

    By comparison, Subaru vehicles, with the second most, were involved in only 23 reported Autopilot-related crashes since 2019.

    Part of this may be due to the fact that Tesla CEO Elon Musk has heavily promoted the technology, saying that the self-driving cars will actually lower the number of overall accidents. But The Post noted “flaws in the technology” that is being tested on America’s roads.

    Kline & Specter handles cases on a contingency fee basis, meaning we don’t make any money until you win your case. We have offices in Pennsylvania, New Jersey, Delaware and New York. For cases outside those states we work with local attorneys in each state as applicable."
     
    #51 John321, May 1, 2024
    Last edited: May 1, 2024
    AndersOne likes this.
  12. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    Good thing Toyota managed to survive the "runaway acceleration" crisis. Ford survived the exploding Pintos. GM did not crash and burn from the "Unsafe at Any Speed" Corvair.

    Clickbait was once called "yellow journalism" but can look like a supermarket tabloid:
    [attached image]


    My "lying eyes" already bought and use Full Self Driving.

    Bob Wilson
     
    bisco likes this.
  13. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    Maybe that article should be called "When Ambulance Chasers Run Amok."

    [attached screenshot]

    Notice how they claim they've "won" awards but fail to name any, or against whom. That's proof positive these were nuisance suits brought against defendants. Those settlements always include non-disclosure agreements, whereas when you actually win a lawsuit you can run it up the flagpole. Shakedown artists get a good payout this way - whether it's Ford, Stellantis, GM, Tesla, Toyota, Honda, or the VW Group.

    With your next car purchase, when you pay out the wazoo, consider how part of the sunk cost necessarily involves a bunch of shakedown payouts - not to say that all claims are BS.
    .
     
    #53 hill, May 1, 2024
    Last edited: May 1, 2024
    Zythryn likes this.
  14. radsaq

    radsaq Junior Member

    Joined:
    Dec 10, 2007
    Location:
    Virginia, USA
    Vehicle:
    2024 Prius
    Model:
    XLE
    Uppity Americans, always wanting to sue for damages just because of serious injury/death. Imagine.

    (As a software developer, I'm all for progress and innovation. But tech bros applying "it's better to ask forgiveness than for permission" to everything is just reckless and dangerous when it comes to issues of public safety. If manufacturers/regulators can't figure out how to properly test and deploy this stuff without beta testing it on fleshy, disposable humans then perhaps we shouldn't be doing it. No one says self-driving cars need to exist. Invest in public transport infrastructure instead, sheesh.)
     
    Trollbait, AndersOne and Zeromus like this.
  15. Zeromus

    Zeromus Member

    Joined:
    Oct 21, 2023
    Location:
    Ottawa Canada
    Vehicle:
    2024 Prius Prime
    Model:
    SE
    I think Tesla folks are being extra sensitive. I don't think people are saying the company is going to be destroyed by FSD or Autopilot, and no other manufacturer working on this tech is going to be destroyed entirely either.

    But the regulators clearly see an issue, and are trying to understand it.

    At no point did I imply people need electric shocks, and at no point did I say Tesla doesn't tell people to pay attention to the road at all.

    What I'm saying is that there's an issue; they've focused on Tesla since it appears to be one of the major culprits, it makes continuous changes and adjustments in a space that doesn't have strong regulations as is, and it is an acute example of the problem. My example about Tesla making changes is pretty clear: they pushed an update that made their systems more permissive about checking that drivers are holding the wheel. They also use language that implies a significant level of autonomy that regulators are concerned isn't yet safe enough for the existing checks and balances.

    Maybe Tesla's reminders are sufficient, and if the regulators determine they are, they may be set as the legal minimum. At the very least, if they set a legal minimum, then manufacturers can't push updates to be more permissive without the regulators giving the okay first. That's it. What they probably didn't like was the tweak that let people be more hands-off. Or maybe they'll come down with some sort of hammer on *every* manufacturer implementing more advanced functions.

    My personal point on Tesla's naming convention is that it implies, to the average person, a higher level of "hands-free" operation and self-driving capability than is actually safe in as many conditions as people seem to think.

    When I tell coworkers the Prius Prime follows lanes with cruise control, and that I can hold my turn signal at highway speeds and it will change lanes on its own, they react with complete shock. They say "I thought only Teslas could do that," and some will ask about self-driving. Some will even glow about how cool it is that their friend got the trial for self-driving and that the car was parking itself, and how advanced Tesla must be to have these features - ignoring that many luxury brands have had self-parking for years, outside of the self-driving moniker.

    The average person who isn't aware of all the latest tech and features in cars ascribes these things that many manufacturers increasingly have *to Tesla* and *to FSD*, because the company is just that good at its marketing. So it makes sense that people who want an EV but aren't hyper into the tech will similarly have a somewhat naive understanding and may be more willing to engage with the features given their perceptions; the fact that Tesla looms so large in the public consciousness in this way may be why it in particular is a magnet for the NHTSA to go after in this study. Keep in mind also that because Tesla is so much more popular, and equips so many vehicles with these features, it may well have the most data available to support regulatory improvements that would go beyond Tesla as well.

    If I were a regulator, I know I'd rather have the years and volume of data that Tesla has - thanks in part to things like Sentry Mode and the cameras built into the car - available to me for assessing these new technologies than be left hoping Toyota TSS 3.0 data is robust, or hoping that enough people bought a self-driving Mercedes to give me good data from which to make a decision on legal minimums for driver-awareness hardware.

    The driver is, ultimately, the most culpable part of the equation - same as they were when seatbelts were introduced and people chose not to wear them. But at some point legal minimums and requirements need to be put in place for a base level of expected safety and function, for the benefit of everyone and to lower the number of bad drivers and/or moments of poor decision-making and inattention. The best corollary to a system that helps with a momentary lapse of judgement is probably the seatbelt chime. Sometimes a person is in a hurry and forgets to put on their seatbelt, but a chime reminding them to put it on is helpful. A chime that reminds a parent to check the back seat because a pressure sensor says there's a car seat there will help prevent a sleep-deprived and flustered parent from leaving a kid in the back seat. It's ultimately their responsibility, but that doesn't mean an extra layer isn't helpful. And it's best for there to be a baseline expectation of what that extra layer looks like across the board, along with rules about making changes that roll back some of that extra layer even if its starting point is above the legal minimum.

    It's more than fair for the NHTSA to work on this issue using Tesla, given its data, popularity, and accessibility.

    Good for you; too bad the average person may not fully understand what "requires driver attention" means. When the damn Apple Vision Pros came out, there were videos of people actively using them - on their faces, in VR - in Teslas, in the driver's seat of the car. These aren't exactly tech-illiterate folks taking these wild risks and putting significant levels of faith in Tesla's FSD capabilities, either.

    I've seen countless examples of people in Teslas making first-person POV videos of themselves letting the Tesla figure stuff out on the road and work through complex situations, actively saying "I don't know if I need to intervene, let's see how this goes," letting the car go MUCH further than I would feel comfortable with before taking over, getting into close calls because of it, and then praising the car for doing so well.

    To some extent I find that some people who fully understand that FSD has its limits are even more willing to push it, while less tech-literate people are much less trusting and would probably be safer. Bad actors are the issue, but like I said, we should be protecting them from themselves - and others from them - if that's easily doable by laying down ground rules.
     
    #55 Zeromus, May 1, 2024
    Last edited: May 1, 2024
  16. Zythryn

    Zythryn Senior Member

    Joined:
    Apr 28, 2008
    Location:
    Minnesota
    Vehicle:
    Other Electric Vehicle
    Model:
    N/A
    Good thing these people have multiple messages, warnings, and a required in-car demonstration then.

    I’m not saying some people won’t abuse this.
    Heck, people have been abusing cars for decades through inattention, drunkenness and just being negligent.
    And yes, the names (Autopilot & FSD) were bad choices.
     
    Isaac Zachary likes this.
  17. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    Tl;dr most of this thread
     
    Isaac Zachary likes this.
  18. sylvaing

    sylvaing Active Member

    Joined:
    Jul 15, 2023
    Location:
    Canada
    Vehicle:
    2017 Prius Prime
    Model:
    Plug-in Base
    When overtaking a car, I watch their blind-spot LED, and it only lights up when I'm close to the rear end of their car. I've never seen one light up earlier than that.

    The radar in those two Ford BlueCruise cars worked great, until it didn't. You're still in control of the car.

    Last Sunday, I did a 98 km drive in FSD, from my cottage driveway to my in-laws' driveway, with zero interventions. All that driving was on regional and city roads - no highways, where it's much easier for an ADAS. The drive started from my private dirt road, which is not even mapped, and it handled it flawlessly, as can be seen below.

    [embedded video]


    A few km of a 90 km/h regional road had absolutely zero lane markings - just black asphalt, with no car in front to show the way - and the car handled it perfectly.

    Stops, unprotected intersections, traffic lights, moving over a bit for parked vehicles obstructing the lane - all were handled with zero issues, and that includes a flashing green light that gave us priority to turn left. That one I wasn't sure how it would handle, as it was the first time I'd taken a flashing green light with it. We got there when it was already flashing and cars were already waiting in the opposite direction. It turned without any hesitation.

    Personally, I'm impressed with its capabilities. I will not get complacent and stop monitoring the drive, though; but for long drives, it definitely helps with the mental task of driving.

    Check this guy's drive through Manhattan traffic (and it's not even the latest iteration of FSD) and tell me what other vehicle right now could be doing that there, or basically anywhere in NA.

     
    bisco and Zythryn like this.
  19. Isaac Zachary

    Isaac Zachary Senior Member

    Joined:
    Jan 20, 2018
    Location:
    USA
    Vehicle:
    Other Hybrid
    Model:
    N/A
    I remember being told once or twice in school that a short, to-the-point answer is not only easier to understand, it also shows that the person answering understands the question. Unless you're writing an essay or something similar, of course.
     
  20. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    Yeah, too many of these posts look like manifestos.
     
    bwilson4web likes this.