Investigating Autopilot

Discussion in 'Tesla' started by bwilson4web, Jun 18, 2021.

  1. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Plus
    Source: UPDATE 2-U.S. safety agency probes 10 Tesla crash deaths since 2016

    WASHINGTON, June 17 (Reuters) - U.S. auto safety regulators said on Thursday they have opened 30 investigations into Tesla crashes involving 10 deaths since 2016 where advanced driver assistance systems were suspected of use.

    The National Highway Traffic Safety Administration (NHTSA) released a list offering details about crashes under review by its Special Crash Investigations programs.

    The agency had already confirmed some specific Tesla crash investigations but had not previously released to Reuters a full accounting of all Tesla crashes investigated where Tesla's Autopilot system was suspected of being involved.

    Of the 30 Tesla crashes, NHTSA has ruled out Tesla's Autopilot in three and published reports on two of the crashes.
    . . .

    Now if they could just do the same for ICE vehicles.

    Bob Wilson
     
  2. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
Whenever an accident with a Tesla on Autopilot hits the news, I wonder how many other accidents involved Comma.ai or other car brands' driver aids?
     
  3. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    Location:
    Montana & Nashville, TN
    Vehicle:
    2004 Prius
    Model:
    IV
  4. 3PriusMike

    3PriusMike Prius owner since 2000, Tesla M3 2018

    Joined:
    Jun 21, 2009
    Location:
    Silicon Valley
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    Interesting Tesla Auto Pilot issue.

    I was driving in the #2 lane on a freeway in AP. The lane to the left was a double-white-lined toll lane. All was fine.
    Then the left lane widened to two lanes to provide merging into and out of the toll lane and the lane marker became a dashed line for a while. At the same time I caught up to the car in front of me and AP began to do a lane change to the left to pass.
    Not what I wanted to do...nor should AP do this.
    Hey Tesla, you should fix this.
    Oh, I forgot to mention that I had just passed a Tesla service center and the factory in Fremont CA one exit back
    These toll lanes were put into service 9-10 months ago. I guess all the AP engineers are working from home.

    Mike
     
  5. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Advanced
    Thanks for posting that link.

It was interesting to see how many of those accidents were either cases where the Waymo was run into by other vehicles or were simulated accidents, i.e. accidents that were avoided when a human intervened. Many of the serious ones were situations where the Waymo vehicle was rear-ended. The single instance where the Waymo rear-ended another car was one where a car swerved into its lane and slammed on the brakes; the Waymo van (or was it an SUV?) could not stop in time.

    Significantly, there were no deaths or major injuries. I like that in a fully autonomous car. :)
     
  6. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    Location:
    Western Washington
    Vehicle:
    2012 Prius
    Model:
    Three
    "Here is a breakdown of the car data from January 1, 2019 to September 30, 2020, which covers 65,000 miles in driverless mode. The disengagement data is from January 1 to December 31, 2019, which is when Waymo’s cars drove the aforementioned 6.1 million miles." [emphasis added]

    The total miles aren't yet up to sufficiently useful numbers. Human drivers were running just barely over 1 death per 100 Million miles driven, just before the Pandemic put them on bad behavior.
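The sample-size point above can be checked with quick arithmetic, using the two figures quoted in this thread (roughly 1 human-driver death per 100 million miles, and 6.1 million Waymo miles):

```python
# Quick check of the sample-size point above. Numbers come from the posts
# in this thread: ~1 human-driver death per 100 million miles, and
# 6.1 million Waymo miles driven.
human_deaths_per_mile = 1 / 100_000_000
waymo_miles = 6_100_000

# Deaths we would expect if Waymo crashed at exactly the human rate:
expected_deaths = human_deaths_per_mile * waymo_miles
print(f"Expected deaths at the human rate: {expected_deaths:.3f}")  # ~0.061

# Observing zero deaths in 6.1M miles is therefore entirely consistent
# with being no safer than a human driver; it takes on the order of
# billions of miles before a death count can distinguish the two rates.
```

In other words, with so few miles the expected number of deaths at the human rate is only about 0.06, so a death toll of zero tells us almost nothing yet.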
     
  7. Zythryn

    Zythryn Senior Member

    Joined:
    Apr 28, 2008
    Location:
    Minnesota
    Vehicle:
    Other Electric Vehicle
    Model:
    N/A
    I haven’t switched on the pass slower cars setting yet.
    We used AP quite a bit on an out of state trip last Saturday. I really enjoyed it, but am not yet comfortable letting the car decide, and then change lanes without my input.
     
  8. 3PriusMike

    3PriusMike Prius owner since 2000, Tesla M3 2018

    Joined:
    Jun 21, 2009
    Location:
    Silicon Valley
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
You can cancel any lane change.
When AP decides to change lanes, it turns on the turn signal for a few seconds, then moves.
During this time you can flick the turn signal the other way to cancel the change.

    Mike
     
  9. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    Location:
    Western Washington
    Vehicle:
    2012 Prius
    Model:
    Three
    Teslas with Autopilot a step closer to recall after wrecks

    DETROIT (AP) — Teslas with partially automated driving systems are a step closer to being recalled after the U.S. elevated its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs.

    The National Highway Traffic Safety Administration said Thursday that it is upgrading the Tesla probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and automated systems that perform at least some driving tasks.

    Documents posted Thursday by the agency raise some serious issues about Tesla’s Autopilot system. The agency found that it’s being used in areas where its capabilities are limited, and that many drivers aren’t taking action to avoid crashes despite warnings from the vehicle. ...

    Investigators will evaluate additional data, vehicle performance and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks, undermining the effectiveness of the driver’s supervision,” the agency said. ...

    The agency found that in many cases, drivers had their hands on the steering wheel as Tesla requires, yet failed to take action to avoid a crash. This suggests that drivers are complying with Tesla’s monitoring system, but it doesn’t make sure they’re paying attention.

In crashes where video is available, drivers should have seen first responder vehicles an average of eight seconds before impact, the agency wrote. ...

    The agency document all but says Tesla’s method of making sure drivers pay attention isn’t good enough, that it’s defective and should be recalled, said Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles.

    “It is really easy to have a hand on the wheel and be completely disengaged from driving,” he said. Monitoring a driver’s hand position is not effective because it only measures a physical position. “It is not concerned with their mental capacity, their engagement or their ability to respond.”

    Similar systems from other companies such as General Motors’ Super Cruise use infrared cameras to watch a driver’s eyes or face to ensure they’re looking forward. But even these systems may still allow a driver to zone out, Walker Smith said.

    “This is confirmed in study after study,” he said. “This is established fact that people can look engaged and not be engaged. You can have your hand on the wheel and you can be looking forward and not have the situational awareness that’s required.”

In total, the agency looked at 191 crashes but removed 85 of them because other drivers were involved or there wasn’t enough information to do a definitive assessment. Of the remaining 106, the main cause of about one-quarter of the crashes appeared to be running Autopilot in areas where it has limitations, or in conditions that can interfere with its operation.

    “For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice,” the agency wrote. ...
     
  10. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Plus
    Good deal!

The facts and data are welcome from an honest and fair 3rd party.

    Bob Wilson
     
  11. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Advanced
    It's not surprising that people are failing to take over when the Tesla Autopilot is unable to safely drive the car.

    I have run into a similar situation using the smart cruise control on the Prius Prime. It's natural to get used to it when you are driving for 4 or 5 hours straight. Then when you pull off the road to grab a bite to eat it's natural to expect the auto braking to bring you to a complete stop. By the time you realize that the auto braking is not going to be activated you may be in a dangerous situation.

    There should be a standardized way to notify drivers that they are NOT in a full auto mode.
     
    John321 likes this.
  12. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    tesla needs to add an awareness brain probe embedded chip. let it unlock the doors and call it a feature.

    if elon got along with bill gates, maybe bill would allow him to add a tesla chip to his ms vaccine injection
     
  13. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Advanced
All joking aside, there have been serious injuries and deaths caused by driver confusion, precipitated by the autopilot failing to behave as expected.

    One of the lessons learned when doing system design is that the system (hardware and software) has to behave predictably. One design failure that comes to mind from my youth was the car response when the auto choke was engaged and the engine was cold. In those days, it was a crap shoot to try to cross a busy road because you never knew when the car would stumble on acceleration. You never attempted to cross a 6 lane intersection unless it was pretty clear.

    Applying that lesson to modern cars... Some cars are much too configurable and are lacking the cross checks that would ensure that the ride is predictable. For instance, when you push updates over the air, the driver is not always aware of the changes that are made. It might change the braking algorithm to work better, but with less feedback. It might go into 'one pedal driving' unexpectedly. Or it might change the autopilot such that it goes into a failsafe mode that pulls the car to the curb when it sees flashing lights from emergency vehicles. That's not a great idea, but there have been worse decisions made. :)

    ALL JOKING ASIDE.... It would probably be a morally better path to stop trying to make light of poor car designs that cause (or allow) people to die.
     
    Mendel Leisk likes this.
  14. PaulDM

    PaulDM Active Member

    Joined:
    Sep 24, 2016
    Location:
    UK
    Vehicle:
    2022 Prius
    Model:
    Excel
There are too many non-Autopilot cars on the road to consider this a safe option. Once all cars are on autopilot, happy days.
     
  15. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Plus
    Perhaps ‘hands on Autopilot time’ might be a first step?

    In 2019, I paid $2,000 for the Autopilot extra. Two months later, it paid for itself when it protected my family when a transient medical event (micro sleeps) hit me while driving.

Tesla reports that using Autopilot reduces the accident rate to about 1/8th of the rate in NHTSA accident statistics. But unlike competing systems that require up-scale trims, it is standard and improving.

When I was an engineer, we said never let perfect become the enemy of good enough. So let us see if NHTSA will confirm the 8x safety claim for Autopilot. Better still, adopt reporting rules to compare the performance of all Autopilot-like systems.

    Bob Wilson
     
  16. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    i wonder what nhtsa is doing about all the non auto pilot distracted driving accidents
     
  17. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Advanced
    From what I read, it's not that autopilot drivers were distracted so much as the drivers were not prepared to detect the situations where the autopilot was inadequate to the task.

    If nothing else, the auto pilot should flash a big red banner across the windshield to coincide with the audio that screams "Oh SHIT!" anytime that it becomes confused about what to do.
     
  18. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    Subaru's Eyesight does this, but less vulgar.
     
  19. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Plus
    It usually scares my passenger(s).

    Bob Wilson
     
  20. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    Location:
    Western Washington
    Vehicle:
    2012 Prius
    Model:
    Three
    That runs into the chicken-and-egg problem. You can't force everyone to get and use it all the time on all roads, until it is good enough to use all the time, on all roads, in all conditions. And it can't get there without the improvements and validations that come with billions of miles of user testing. So we can't use the none-or-all model, but are stuck with a mixture on the road for a significant while to come.
Beware: that 8X 'improvement' very likely contains a lot of selection bias, with most Autopilot miles on the easier and safer roads, in the conditions it can actually handle. The roads and conditions it can't handle, which aren't included in those miles because it is turned off, are likely also those where humans have a higher crash rate.

    We talked about crash rate differences on different road types sometime last year. A more careful comparison is needed.
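The selection-bias concern above can be illustrated with a small made-up example (all numbers hypothetical, chosen only to show the mechanism): even if the driver-assist system is no safer than a human on any given road type, using it mostly on the easy roads makes its overall rate look much better.

```python
# Illustration of the selection-bias point above, with hypothetical numbers.
# Crash rates per million miles, differing by road type (made up):
human_rate = {"freeway": 1.0, "surface_street": 5.0}
# Where the miles are driven: humans split evenly, the assist system
# is engaged almost entirely on freeways (also made up):
human_mix = {"freeway": 0.50, "surface_street": 0.50}
assist_mix = {"freeway": 0.95, "surface_street": 0.05}

# Key assumption: the system is NO safer than humans on any road type.
assist_rate = human_rate

human_overall = sum(human_rate[r] * human_mix[r] for r in human_rate)
assist_overall = sum(assist_rate[r] * assist_mix[r] for r in assist_rate)

print(human_overall)   # 3.0 crashes per million miles
print(assist_overall)  # 1.2 crashes per million miles
# The system looks 2.5x "safer" overall purely because of where it is
# used, not because it drives any better on a given road.
```

This is why a per-road-type comparison, rather than an overall-rate comparison, is needed before taking the 8X figure at face value.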
     
    #20 fuzzy1, Jun 13, 2022
    Last edited: Jun 13, 2022
    Trollbait likes this.