  • londons_explore 11 days ago
    > , it only records an accident if the airbags deploy,

    I've often wondered if Tesla is gaming the figures by only deploying the airbags in really bad crashes, whereas other manufacturers might deploy the airbags for pretty minor bumps.

    This in turn makes them look really good on 'number of accidents where airbags deploy'.

    • Veserv 11 days ago
      Tesla frequently loses telemetry in serious and fatal crashes [1]. For a specific example, see [2], listed as 13781-3074, where the fatal crash was detected by a complaint rather than telemetry. Of the reported accidents that were conclusively fatal, only ~45% were detected by telemetry.

      From that alone, we can conclude that their telemetry is bad at fatal crash detection. However, when we contrast it against their regular crash reporting where ~90% are telemetry, we can further conclude that either crashes are massively under-reported or fatal accidents are uniquely under-reported (even ignoring the unknown base rate vs reported rate problem). In either case, the massive unexplained discrepancy in the most visible, fatal crashes demonstrates that their data collection procedures are grossly inadequate to conclude any positive safety statistic.
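
      To make the arithmetic explicit, here is a back-of-envelope sketch (my own, assuming the ~45% fatal-crash detection rate carries over to crashes generally and taking both figures at face value):

        telemetry_detection_rate = 0.45    # share of fatal crashes detected by telemetry
        telemetry_share_of_reports = 0.90  # share of reported crashes that came via telemetry

        # If telemetry catches ~45% of crashes, yet telemetry reports make up
        # ~90% of everything reported, the implied overall reporting rate is:
        overall_reporting_rate = telemetry_detection_rate / telemetry_share_of_reports
        print(f"{overall_reporting_rate:.0%}")  # -> 50%: roughly half of crashes never reported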

      The fact that Tesla repeatedly markets safety statistics based on grossly inadequate data collection procedures without any attempts to identify the true, higher rate, and without qualification is criminally repugnant.

      [1] https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...

      [2] https://www.washingtonpost.com/technology/interactive/2024/t...

      • londons_explore 11 days ago
        Recently disassembled a 2023 Nissan and found that the telemetry/emergency crash-reporting system has its own internal battery, SIM card, and LTE antennas, allowing it to report a crash even when the car's electrical system is totally down due to large numbers of burnt/broken wires.

        I wonder if Tesla will consider that for future models...

    • FireBeyond 11 days ago
      > I've often wondered if Tesla is gaming the figures by only deploying the airbags in really bad crashes, whereas other manufacturers might deploy the airbags for pretty minor bumps.

      So, even as a Tesla "hater", but speaking as a paramedic: first-generation airbags were very simplistic. Essentially, "if speed > x mph and collision = true then deploy".

      Nowadays, advanced airbag systems do calculations based on speed, G forces, angle of intrusion, lateral movement, etc., and decide "will the airbag system cause or risk more injury?", i.e. "can the passenger restraint system (i.e. seatbelts) do sufficient work to minimize injury".

      So you can get a relatively low-speed collision with heavy lateral movement causing a deployment, while some other collisions at 30mph don't deploy airbags at all.
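
      To caricature the difference in (hypothetical) code, with every threshold invented for illustration and not taken from any manufacturer:

        # First generation: a single crude threshold.
        def first_gen_deploy(speed_mph, collision):
            return collision and speed_mph > 14  # invented threshold

        # Advanced systems: weigh several inputs and ask whether the belts alone
        # can do the work, or whether deployment adds more risk than it removes.
        def advanced_deploy(speed_mph, peak_g, lateral_g, belted):
            restraint_sufficient = belted and peak_g < 25  # invented figures
            severe_lateral = abs(lateral_g) > 5            # invented figure
            return severe_lateral or not restraint_sufficient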

      All that being said, I do not believe Tesla (or any other manufacturer) should describe accidents only in terms of airbag deployment. Other factors like speed, G force felt, and whether seatbelt pretensioners activated should all be sufficient to categorize an event as an accident.

    • bangaladore 11 days ago
      The IIHS has repeatedly shown that Tesla airbags are the best in class. Deploying airbags when not needed can do significant harm to occupants, not to mention totaling vehicles for no good reason.

      There isn't a need to wonder when independent data gatherers have already answered the question.

      • FireBeyond 11 days ago
        > The IIHS has repeatedly shown that Tesla airbags are the best in class.

        No they haven't. For a start, IIHS doesn't rank "airbags".

        And then... Tesla fans love to latch on to this as a trope.

        The IIHS -groups- vehicles into safety levels by class. It doesn't rank in class, at all.

        So what, you might say, they're still in the top or top two safety levels, they deserve credit... Except:

        There are twenty-one other vehicles at that same level or higher in the same segment ("mid-size luxury"), and sixteen if you only count the top tier.

        Of those, Tesla could be number 1, or number 16. You don't know. So this whole meme of "Tesla is the safest" needs to die, it's just bluster.

        • bangaladore 11 days ago
          I think you are misrepresenting the actual IIHS results.

          For example, the Model Y is one of five cars that achieved the highest rating in its size class, not one of 16. You are grouping their highest and second-highest tiers together.

          Additionally, the Model Y is by far the cheapest vehicle in that category.

          So, yes. If you consider a car to be classified by size and price, which I think is a very reasonable way to classify a car, the Model Y is very clearly the best in its class.

          • FireBeyond 11 days ago
            > For example, the Model Y is one of five cars that achieved the highest rating in its size class, not one of 16. You are grouping their highest and second-highest tiers together.

            Who said I was talking about the Y? I also specifically said that I was grouping them ("they're still in the top or top two safety levels"), because IIHS has "Top Safety Pick+", "Top Safety Pick" and "Others". Though I have been looking at 2023. But let's break it down:

            Tesla Model 3: Didn't make the cut.

            Model S: Didn't make the cut.

            > So, yes. If you consider a car to be classified by size and price, which I think is a very reasonable way to classify a car, the Model Y is very clearly the best in its class.

            "So yes" implies a logical conclusion. But your initial point was "IIHS says Tesla has the best airbags in class", which it does not.

            For one model of car it says that it is in the top five of that segment.

            But then somehow you blow that out to be "clearly, best in class, because it's cheapest".

            Like, no. The other cars in the category could have better safety than the Model Y. And if your argument is "it's best because it's cheapest" when it comes to safety? Wow. Huh. I suppose we all know the safest components are the cheapest?

            • bangaladore 11 days ago
              The original comment about Tesla allegedly gaming the system by not deploying airbags when they should is unfounded based on available safety metrics. It's an unsubstantiated claim that derails the conversation from the main point.

              Comparing the minor differences in airbag performance among a handful of cars is not particularly relevant to the overall argument, especially when considering the vast number of poorly performing vehicles on the market.

              And, yes, I do believe that cost matters here. Cars exist in size and price classes. If you cannot comprehend that, that's fine. But it doesn't change the argument.

              • FireBeyond 11 days ago
                > The original comment about Tesla allegedly gaming the system by not deploying airbags when they should is unfounded based on available safety metrics. It's an unsubstantiated claim that derails the conversation from the main point.

                If you read my other comments in this thread, I partially agree with you. But I also think Tesla's "we don't count it as an FSD/AP incident if airbags weren't deployed" is "convenient", given that "advanced airbags" (which are a spec) use a whole variety of means to determine deployment which don't correlate to the severity of the incident, i.e.:

                depending on other parameters, you can collide with someone at 20-30mph, but have no airbag deployment, because the algorithm decides that passenger restraint is sufficient. Great. Except if a car operating in FSD/AP mode causes a 20-30mph collision with something else, that's a notable incident. Well, most people would think so. Tesla explicitly says this is NOT an incident when reporting FSD/AP stats. Huh.

                > And, yes, I do believe that cost matters here. Cars exist in size and price classes. If you cannot comprehend that, that's fine. But it doesn't change the argument.

                I can comprehend that just fine. But it's not a factor for IIHS, which is what -you- brought into the argument when you said "IIHS says it's best in class for safety".

                1. It doesn't.

                2. You can't then say "oh, well, if you consider cost as well, then clearly it must be best in class", which is a conclusion that cannot inherently be drawn from the previous.

                It -may- be best in class "overall", not for safety alone, but that's got nothing to do with what you said. The flow of that argument was:

                You: It's best in class.

                Me: Not demonstrably, it's one of several cars that are in the top category for that class.

                You: Well, if you factor in price, too, it's "clearly" best in class.

                Everyone likes paying less money, sure. But you're already talking about the luxury segment, where just maaaaybe people are conscious of more factors than price when considering even this broader definition of best in class.

      • londons_explore 11 days ago
        It can be best in class, yet still rarely deploy...

        It can even be best in class when tested, rarely deploy, and save more lives than competitors. It just needs to correctly detect cases like this[1].

        [1]: https://youtu.be/MTX0MvqBqL0?t=10

        • bangaladore 11 days ago
          You claim that Tesla is gaming the system.

          I'm not aware of a definition of gaming in which the system is overall safer than, or equivalent in safety to, other top-tier competitors. That seems objectively better than deploying airbags when they will harm the occupants or total the vehicle when not needed.

          • thenewnewguy 11 days ago
            Tesla is not gaming the system by having a good airbag system that only deploys when necessary (that's actually really good!).

            Tesla is gaming the system by excluding accidents where the airbags don't deploy from the "Autopilot/FSD accidents" dataset, thus artificially deflating the number of accidents.

            • bangaladore 11 days ago
              This is a point you certainly could make. But it is not what the original commenter wrote.
              • thenewnewguy 11 days ago
                Agreed, but I am not the original commenter, I am making my own point separately from them. I agree with you that Tesla isn't gaming anything by having a good airbag.
          • chucksta 11 days ago
            It only makes sense if everyone else is gaming the system, and then it just proves it's still the best lol.
    • YeahThisIsMe 11 days ago
      That's how airbags work in general.

      A slight bump won't set off your airbag in any car.

    • furyofantares 11 days ago
      I have no knowledge or opinion about what thresholds various auto makers have relative to each other, beyond the fact that they probably all meet regulations.

      But I do want to point out that ~nobody would buy a car that deploys airbags for "pretty minor bumps". Airbag deployment is a serious event; it can injure or, in rare cases, kill a person itself.

  • altairprime 11 days ago
    The investigation conclusions are just over one page long and are plainly written:

    https://static.nhtsa.gov/odi/inv/2022/INCLA-EA22002-14498.pd... [pdf]

  • loudmax 11 days ago
    If Tesla hadn't named their driver assistance system "Autopilot" they would have saved so much trouble (not to mention lives).
    • stetrain 11 days ago
      The underlying issue isn't the name, it's the mismatch between marketing claims, actual performance, and what it lets you "get away" with.

      They released a system that will engage on almost any road and provides a firm steering force that will steer without driver interaction.

      Unlike some competing systems, they did not include dedicated hardware for monitoring driver attention, or restrict operation to separated highways.

      Ever since, they have been trying to work backwards from that omission: layering on increasingly annoying nags to jiggle the wheel, or using the low-res interior camera to try to determine driver attention, because they didn't provide proper driver monitoring to begin with or build a system that discourages zoning out.

    • fyrn_ 11 days ago
      It was the name _and_ the constant absurd marketing claims, not just the name.
    • lukan 11 days ago
      Autopilots in planes maintain altitude and speed; they do no landing or evasion.

      But full self driving was a broken promise.

      • bombcar 11 days ago
        Autoland exists - but it's potentially more like calling a trim wheel "autopilot" or an ILS system "autoland".
      • spamizbad 11 days ago
        Sure, but there's an expectation that an automobile will be driven by virtually any adult that can pass a basic driving test. An airplane requires extensive training and certification.
      • wil421 11 days ago
        [deleted]
      • ahahahahah 11 days ago
        That's a really dumb argument when the research exists explaining exactly how people understand autopilot in the context of cars.
      • kwhitefoot 11 days ago
        Autopilot is not FSD.
    • 1-6 11 days ago
      To be fair, I don’t think Autopilot is well defined in any governing literature. We’re in the wild west.
    • dzhiurgis 11 days ago
      Not sure. Anyone who used it for a mere hour would understand its limitations immediately.

      Otherwise - all systems can be abused.

    • tgma 11 days ago
      I can see you could potentially argue (unsuccessfully, IMHO) that it is marketing fraud that induces purchases of the vehicle, but I don't buy that a single person who actually owns the vehicle and has driven on Autopilot for more than a week is confused about the abilities of the system because the name is Autopilot. People text and drive _all the time_ in their ICE carriages without Autopilot too. Thus the claim that renaming it would have saved a life, let alone lives, is simply sensationalism.
      • FireBeyond 11 days ago
        Yeah, except that among the people who have bought (at a significant price) those features, and who understand that they're nowhere near their "aspirational" goals (when Tesla has been called out by the SEC and DMVs for some of Musk's more absurd statements, the company has quietly whispered "these are aspirational goals", and in one case, to the California DMV, "Musk's visionary statements are not reflective of engineering reality")...

        a significant portion of them still allow their car to do stupid/dangerous/risky things, because they believe that the car learns and adapts from this, and sends this data back to Tesla for further learning.

        Which it doesn't, as this article states. Those drivers are "it's okay if I run the occasional stop sign because it'll help my Tesla stop at the next one", "it's okay if I do XYZ..." when it won't do anything except put others (and themselves) at risk.

        • tgma 11 days ago
          The aspirational claims are not about Autopilot, but FSD (and I agree that having people pay for FSD in 2015 is borderline fraudulent. It is a consumer protection matter, not a safety matter.)

          Again, the statements to the California DMV or SEC or Congress are one thing. Whether the owner is confused about the abilities of their vehicle post-purchase and after a week of driving is a distinct matter.

          > Those drivers are "it's okay if I run the occasional stop sign because it'll help my Tesla stop at the next one

          I think this is simply a made-up and ridiculous characterization. People break traffic laws all the time because they are selfish and it is convenient for themselves. Tesla drivers are no different. Have you seen a single consumer, not an employee, who does this for training data for Tesla?

          • Veserv 11 days ago
            They are objectively, literally fatally confused as to the abilities of their vehicles.

            Tesla's systems, or any other ADAS for that matter, cannot be safely used without a fully attentive driver, and Tesla's system in particular further demands that you keep your hands on the steering wheel at all times. Deliberately and intentionally operating such a system without your full attention because you think it will safely drive you without your attention is like shoving your hand into a spinning saw blade.

            That is not to say that everybody who is injured by a spinning saw blade or by inattentive use indicates the product is fundamentally unsafe. Many people just inherently display too little caution, and most become complacent with dangerous devices through repeated "safe use". For a spinning saw blade, they miscalculate the risk and overestimate the safety mechanism, but they are not confused about the fact that inserting their hand into a spinning saw blade is a very bad idea.

            If Tesla ADAS operators only overestimated their attention then it would be comparable to a spinning saw blade, useful but dangerous. However, Tesla marketing aggressively misrepresents the capabilities of their systems and the Tesla ADAS operators routinely believe the marketing that it can safely drive without their attention; they believe you should insert your hand into a spinning saw blade and many people have literally died as a result. That is textbook fatal confusion about the abilities of their vehicles.

            • tgma 10 days ago
              > They are objectively, literally fatally confused as to the abilities of their vehicles.

              Citation needed.

              > Deliberately and intentionally operating such a system without your full attention because you think it will safely drive you without your attention is like shoving your hand into a spinning saw blade.

              False. Objectively and manifestly so. Regardless of it being a good idea or not, one hurts you with a small probability, one almost certainly.

              > For a spinning saw blade, they miscalculate the risk and overestimate the safety mechanism, but they are not confused about the fact that inserting their hand into a spinning saw blade is a very bad idea.

              Where is the data that attributes the behavior to risk miscalculation, rather than the temptation of doing it despite the risk? People do stupid things all the time despite the risks. They skydive and bungee jump FFS.

              > If Tesla ADAS operators only overestimated their attention then it would be comparable to a spinning saw blade, useful but dangerous. However, Tesla marketing aggressively misrepresents the capabilities of their systems and the Tesla ADAS operators routinely believe the marketing that it can safely drive without their attention; they believe you should insert your hand into a spinning saw blade and many people have literally died as a result. That is textbook fatal confusion about the abilities of their vehicles.

              I cannot take this rant seriously; but let's say we take it as true: you could substitute driving any vehicle (which can be fatal) for Tesla driving and keep the argument unchanged. Tesla has cited numbers on safety. I'd rather have a quantifiable argument about pros and cons than analogies to saw blades.

              • Veserv 10 days ago
                The analogy was provided to help illustrate the difference between complacency and confusion.

                You cannot substitute driving any vehicle into the analogy unchanged, because in normal vehicles the overwhelming majority of operators do not believe the vehicle can be operated safely without attention. Some may do so anyway, but in the overwhelming majority of those cases it is because they believe they are paying an adequate amount of attention, not because they believe no attention is needed. That is an example of complacency. If they wrongly believe that no attention is needed, that is confusion.

                It is okay if you misunderstood, as people with no experience in safety-critical engineering are usually unfamiliar with the distinction, but you should reflect on it until you understand, as it is hard to provide meaningful input to a conversation without the basics. Good luck with your learning.

                • tgma 10 days ago
                  The analogy was far off and exaggerated to the point of being meaningless. At this point you're continuously making things up using phrases like "overwhelming majority". Unless you can substantiate your claims with something other than your feelings and point to data, I don't think continuing this discussion is productive. I'm outta here.
          • FireBeyond 11 days ago
            > People break traffic laws all the time because they are selfish and it is convenient for themselves. Tesla drivers are no different.

            Never said the first scenario is acceptable, and neither is the second. What's worse is this active "I'm going to let it do what it wants because it'll get better."

            > Have you seen a single consumer, not an employee, who does this for training data for Tesla?

            YouTube has plenty of examples. TikTok. "Training my Tesla" and similar searches will net you plenty of results.

            • tgma 11 days ago
              They are doing it for content, not to train Tesla. Isn't that obvious? People have been doing performative moves with their bikes and motor vehicles since the dawn of the automobile age. You can find tons of those videos of drifting BMWs. No ADAS there.
  • FireBeyond 11 days ago
    One of the big things to remember from all of this is that "if airbags do not deploy, Tesla does not consider it an accident" when it is touting its (already inaccurate and misleading for other reasons) "safety".

    As an automobile manufacturer, there is no good-faith argument that Tesla is unaware that today's active restraint and airbag systems are far more nuanced and weigh multiple criteria when deciding to deploy (as compared to initial/early implementations, which were essentially "if collision = true and speed >= X mph then deploy"). You can have a very significant incident (two that I've witnessed recently as a paramedic involved vehicles hitting stationary objects at ~30mph) without airbag deployment. But if that were a Tesla in FSD that hit something at 30mph and didn't deploy airbags, well, that's not an accident according to Tesla.

    That also doesn't account for "incident was so catastrophic that restraint systems could not deploy", also "not counted" by Tesla. Or just as egregious, "systems failed to deploy for any reason up to and including poor assembly line quality control", also not an accident and also "not counted".

  • londons_explore 11 days ago
    > a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it

    I think ~all recalls are opt-in. The owner of the car can simply not bother taking the car in to the garage, turn off auto updates, etc.

    It would be funny if Tesla decides to separate out software updates for 'recalls' vs software updates for new features, and lets users have the new features without installing the 'nannying' recall fixes. In most cases it would be easy to implement via feature flags.
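
    Something like this hypothetical flag split (all names invented):

      updates = {
          "recall.attention_nag_update": True,  # regulator-mandated fix
          "feature.new_park_assist": True,      # unrelated new feature
      }

      def flags_to_install(updates, owner_accepts_recalls):
          # Always ship feature flags; gate "recall." flags on owner opt-in.
          return [name for name, enabled in updates.items()
                  if enabled and (owner_accepts_recalls or not name.startswith("recall."))]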

  • bottlepalm 11 days ago
    I'd rather be driving on the road with Autopilot cars than the impatient drivers, road ragers, distracted people on their phone, and people who drive like they're in a video game.

    I drive with it on 95% of the time, and I really only need to intervene when it's too slow and too cautious.

    • kurthr 11 days ago
      I generally agree it does great on freeways, but not in urban situations where there are tight corners and lots of parked cars, or rural areas with twisty roads and farm equipment. I'd turn it off in construction areas too.
      • dzhiurgis 11 days ago
        It's straight up useless in urban areas. Even on some country roads that are too winding/hilly, it becomes more of a thing to babysit than a help.

        I agree on highway. Easily 95% use, likely more.

        • bottlepalm 9 days ago
          What are you talking about? It works amazing off highway. I use it all day long. Have you used v12?
          • dzhiurgis 9 days ago
            We're talking Autopilot, not FSD
            • bottlepalm 8 days ago
              I'm the OP, and I'm talking about FSD. Sorry if that wasn't implied when I said I drive with it on 95% of the time. On all road types. It does great in construction areas as well.
    • grecy 11 days ago
      Don't forget all the drunk, high & exhausted drivers too.

      100 people are killed every single day on the roads in the US. Day in and day out.

    • flandish 11 days ago
      For sure, but also consider that those Autopilot APIs are written by impatient coders and distracted project managers, working from crappy specs.
  • d_theorist 11 days ago
    The relevant metric should be accident rate per million miles driven. What is the rate with Autopilot vs without?
    • HPsquared 11 days ago
      Not all "miles driven" are the same. Autopilot is going to be engaged on highway driving which already has a lower accident rate per mile driven. So that biases a simple comparison.

      Edit: and even within highway miles, autopilot will be enabled more in the "easy" stretches and turned off in roadworks etc.
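
      A toy example of the bias, with numbers invented purely for illustration:

        miles = {"highway": 8_000_000, "city": 2_000_000}  # Autopilot-engaged miles, skewed to highway
        human_rate = {"highway": 0.5, "city": 3.0}         # human crashes per million miles, by road type

        # Expected crashes per million miles for a *human* driving Autopilot's road mix:
        expected = sum(miles[r] * human_rate[r] for r in miles) / sum(miles.values())
        print(expected)  # -> 1.0

        # A typical human mix (say 30% highway / 70% city) works out to
        # 0.3*0.5 + 0.7*3.0 = 2.25, so a system restricted to easy miles can
        # look >2x "safer" per mile before its actual performance matters at all.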

      • FireBeyond 11 days ago
        Precisely. If these systems aren't or can't be engaged on a Pittsburgh street at night in a whiteout, then I don't much care.

        Tesla jukes the stats by "Oh, this won't go well, so we're not even going to try", which isn't an option for human drivers, short of "don't drive at all".

    • Someone 11 days ago
      On similar roads, in similar traffic, in similar weather.

      If you look at fatalities, or define “accident” as cases where somebody had to visit a doctor or something like that, you also need to compare against similar cars. Most self-driving cars are relatively new and relatively large, and both factors decrease those numbers (probably even the accident rate, because they statistically have newer brakes and tires and more active safety features).

      It’s far from trivial to decide whether some program is a better driver than the average human driver if the program gets the newer bigger car and can decide to cop out in cases where we do not expect humans to do so.

      • lurkingmba 11 days ago
        You can add, "at a similar time of day". Hours after midnight are very dangerous.

        To my knowledge, Tesla has never put out data on how Autopilot is "safer" on accidents per mile that was close to being apples to apples.

      • d_theorist 11 days ago
        I agree you need to control for other factors. But I don't think that is particularly beyond the wit of man.
  • jycr753 11 days ago
    [flagged]
  • outside1234 11 days ago
    This whole company is going to collapse into an Enron isn't it?
  • drcode 11 days ago
    Amusing that they went with a photo from 2014 to illustrate the article
    • LeafItAlone 11 days ago
      It says

      > A 2014 Tesla Model S

      the vehicle is a 2014.

      The photo is of a Jan 2018 crash

      https://www.ocregister.com/2019/09/04/ntsb-tesla-autopilot-l...

      • grecy 11 days ago
        > The photo is of a Jan 2018 crash

        So still more than 5 years out of date.

        Would you accept a photo of a 5 year old iPhone on an article about some iPhone limitation?

        Would you accept a photo of a 5 year old version of Uber on an article about some Uber limitation?

        Would you accept a photo of a 5 year old robot vacuum on an article about some robot vac limitation?

        These things all change drastically each year. Five years ago is ancient history, and it's kind of silly to think "evidence" that is more than 5 years old has any bearing on how things perform now.

        • Dylan16807 11 days ago
          It's a picture of a crashed car.

          Unless you're claiming the system doesn't crash now, a picture from 2018 is just as good as a picture from last week.

          It's not a main piece of evidence, it's an example. The evidence is in the text of the article where it talks about hundreds of crashes.

          As to your questions, as long as the issue in the article applies to the model in the picture, all of those are fine. I'd want a newer picture for phones because they cycle so fast but it wouldn't be a big deal. A crashed 4 year old 2014 model and a crashed 10 year old 2014 model look the same.

          • grecy 11 days ago
            It's like showing a video of an iPhone taking ages to load a web page or a very grainy photo that it took.. but it's from a 5 year old iPhone.

            FSD has come a LONG way in 5 years.

            https://www.youtube.com/watch?v=43Lrrhn0CMk

            • Dylan16807 11 days ago
              It's more like showing someone holding the iPhone taking a picture.

              Unless you're saying the way it crashes is qualitatively different now, in a way that you can see just looking at the car afterwards? They're not showing it in action.

              Also that's an Autopilot crash, not FSD.

            • FireBeyond 11 days ago
              > It's like showing a video of an iPhone taking ages to load a web page or a very grainy photo that it took

              LOL. The disclaimer "Sequences Shortened" on ads came about because other manufacturers complained to the FTC that very early iPhone ads edited sequences to make the iPhone look faster than it was.

              > FSD has come a LONG way in 5 years.

              And has an even longer way to go. Usable means things like driving in Pittsburgh in winter at night, not "hey, look, it no longer causes near misses attempting to figure out a roundabout in suburban California in the middle of the day".

              • grecy 11 days ago
                Why are you changing the topic and moving the goal posts?
                • reitzensteinm 11 days ago
                  Your first comment in this thread moved the goal posts from 2014 to 2018.

                  Now you say we aren't allowed to move goal posts? That's, like, meta goal post moving, man.

                  • grecy 10 days ago
                    Where did I say anything about 2014?

                    I was saying that comparing a product from 2018 to what is available today is dishonest.

    • landonxjames 11 days ago
      I believe 2014 is the model year of that Tesla, it seems like that particular crash was in 2018. https://incidentdatabase.ai/cite/320/
  • psunavy03 11 days ago
    At some point, I just don't get how a lot of this is Tesla's problem. I'll buy that they should have called it something other than "Autopilot" and "Full Self Driving." But ultimately if it's in the manual saying "this is not fully automated and you must pay attention to the road," it should be on the driver to be held liable for abusing the system.

    Sure, go after Tesla if the system endangers people via spurious inputs or dangerous maneuvers. But it's also not Tesla's responsibility to nanny people any more than alcohol manufacturers are held liable for drunk driving. If some idiot zones out on their phone or tries to "drive" from the back seat, their negligence is not Tesla's responsibility.

    Self-driving systems shouldn't need all these "are you paying attention" bells and whistles. Just hammer the people who abuse them with reckless driving charges.

    • vasco 11 days ago
      > Just hammer the people who abuse them with reckless driving charges.

      That surely helps the now dead people they crash into. The manufacturer can prevent deaths by adding safety features, so it should to a reasonable extent.

      • psunavy03 11 days ago
        So we are also going to require breathalyzer interlocks on everyone's car as well?
        • kbenson 11 days ago
          Why are we even going there when the article, and the NHTSA report note some very easy first steps?

          As Ars has noted time and again, Tesla's Autopilot system has a more permissive operational design domain than any comparable driver-assistance system that still requires the driver to keep their hands on the wheel and their eyes on the road, and NHTSA's report adds that "Autopilot invited greater driver confidence via its higher control authority and ease of engagement."

        • whoknowsidont 11 days ago
          Honestly not opposed to this.
    • osrec 11 days ago
      I remember back in 2013 when I first drove a Tesla, they were really pushing the self driving features at my local dealership. I can't help but feel like they were overselling their capabilities, and their staff could well have made a gullible person believe the car really does drive itself perfectly.
      • Uzza 10 days ago
        Autopilot did not even exist as a public feature in 2013. It was first mentioned as being in development then, but the cars didn't even have any hardware for it until September 2014.
    • jeffbee 11 days ago
      > it's also not Tesla's responsibility to nanny people

      Yes, it is. An L2 driver assist system must ensure that the driver is always actively supervising, and Tesla's systems failed to do that. That's why they recalled 2 million cars.

      • bangaladore 11 days ago
        It seems to me that there are tens of millions of "L2 driver assist" vehicles, as defined by Synopsys, with zero driver-supervision checks: most cars out there with lane-keep and cruise control.
        • jeffbee 11 days ago
          And you believe that federal regulators have simply never heard of these other systems? A more likely explanation is that the driver attention monitoring in a Honda or Cadillac or whatever works better than Tesla's.
          • bangaladore 11 days ago
            You seem to have misunderstood me. I'm talking about vehicles without attention monitoring systems. Most cars sold in the past ten years technically have "Level-2" driving features but with no driver monitoring systems.

            Tesla autopilot / FSD is immensely more capable than 99% of LKAS systems, so maybe that's why regulators care more. But from a pure attentiveness standpoint, 1000 other car models have grossly worse safety in this regard.

            However, my argument is that your original claim that an "L2 driver assist system must ensure that the driver is always actively supervising" is demonstrably false, given that, by pure numbers, "Level-2" capable cars by and large do not have driver monitoring features. Yet the government continues to focus solely on Tesla.

            • jeffbee 11 days ago
              Can you name such a vehicle? My personal car with lane keeping and TACC insists that you keep your hands on the wheel.
              • bangaladore 11 days ago
                I might be mistaken about cars requiring no input, but isn't just steering sensing a problem? The car does not know if it is you or a weight on the steering wheel. Or if you are on your phone touching the wheel with your knee.

                According to the NHTSA, Tesla was forced to enable driver monitoring via a camera because sensing driver steering input was insufficient.

                So, without a doubt, 10s of millions of cars have "insufficient" driver monitoring while having "level 2" capability. Should those cars not be recalled and the feature be disabled (or updated to more aggressive monitoring if appropriate hardware exists)?

                To be clear, I fully support driver monitoring for ADAS to whatever extent regulations require. But picking and choosing who to enforce the rules suggests they aren't working for the general public's best interest.

                • jeffbee 11 days ago
                  The NHTSA insisted that Tesla switch to camera monitoring because Tesla repeatedly monkeyed with the nag interval via OTA software pushes. All the other manufacturers, except Tesla, complied with NHTSA's 2017 rule on the issue.
    • lakhim 11 days ago
      If you sold alcohol called "Safe Drive Juice", implied in advertisements that you could drive safely while drinking it, that it even made you a better driver, and said in tweets and press releases that this is the future, that everyone should use it, and that if they did they'd be safer drivers, then it doesn't matter if you put a note at the bottom that says "doesn't actually let you drive safely, please don't drink and drive".
    • dawnerd 11 days ago
      > But it's also not Tesla's responsibility to nanny people

      It literally is.

      > Self-driving systems shouldn't need all these "are you paying attention" bells and whistles.

      It's not self driving at all. That's the problem.

    • ado__dev 11 days ago
      A lot of Tesla marketing over the years has been super deceptive to the point that a lot of people assume the car is capable of fully self driving itself. I've had a model 3 since 2018 and the number of people that have asked "do you ever drive yourself?" is too damn high.

      So Tesla actively does everything they can to make people think the system is way more capable than it really is. Then it shows a small popup the first time you use it that tells you "j/k it doesn't really do any of this, good luck". I feel like eventually they'll be held accountable for it, but the bureaucracy moves veryyyy slowly.

      • Ajedi32 11 days ago
        > A lot of Tesla marketing over the years has been super deceptive to the point that a lot of people assume the car is capable of fully self driving itself

        The thing is, it's not just marketing that's contributing to these misunderstandings. The car actually is capable of fully driving itself... most of the time. So it's not so much that their marketing is misleading as it is that the reality of the situation is itself misleading, if you don't pay close enough attention to the system's limitations and caveats.