Waymo illegal overtake into wrong-way driving

(old.reddit.com)

79 points | by stefan_ 10 days ago

13 comments

  • fabian2k 10 days ago
    I would be curious to know what the car thought the situation was; something must have gone really wrong here. It ignored the double lines, crossed them to overtake, and then stayed in the wrong lane for a really long time.

    If the car understood the state of the world around it, this is kind of a huge mistake in behaviour. And if it didn't, that is concerning as well, as it would mean it can end up in the opposite lane without knowing it. I suspect a bit of both: it misunderstood the world when it started to overtake, and then was unable to abort the process safely. In that case I would have expected it to stop driving forwards and put on its hazard warning lights for as long as it was in the wrong lane.

    • jfim 10 days ago
      One possible explanation would be that it detected the unicyclists as pedestrians, assumed that it would be able to overtake them, and was unable to do so but kept trying.

      Seems like a failure of their path planning in this case. It's the same as trying to overtake on a two-lane road: one has to abort the overtake if the vehicle being passed keeps accelerating such that passing is no longer safe.
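
      A toy sketch of the kind of abort check being described - hypothetical names and thresholds, nothing to do with Waymo's actual planner:

          def should_abort_overtake(own_speed_mps, lead_speed_mps,
                                    gap_to_clear_m, oncoming_clear_m):
              # Speed advantage over whoever we're trying to pass.
              closing_mps = own_speed_mps - lead_speed_mps
              # If they keep pace or accelerate, the pass never completes --
              # exactly the failure mode described above.
              if closing_mps <= 0.5:
                  return True
              time_to_pass_s = gap_to_clear_m / closing_mps
              # Distance spent exposed in the oncoming lane, doubled as a
              # crude allowance for traffic approaching from the other side.
              exposure_m = 2.0 * own_speed_mps * time_to_pass_s
              return oncoming_clear_m < exposure_m

      A planner that keeps re-planning the pass without ever hitting an abort condition like this would behave exactly like the car in the video.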

    • mchanson 10 days ago
      It’s kinda funny that the Waymo car is so impatient. I’m pretty sure in this situation I would have just waited until the group ride cleared out ahead of me and then continued. I would not want to crowd them or pick my way around them. If I wanted to get past such a big group ride I would probably reroute onto another block, since they would keep catching me at stop lights.
      • tmearnest 10 days ago
        Once I saw that Waymo rage pass all of those bicyclists, I knew we finally solved self driving cars.
        • ryandrake 10 days ago
          Yea, I felt the same way. Even though that was an unsafe overtake, I think the Waymo definitely passed the Turing Test with flying colors there.
      • steveBK123 10 days ago
        What's weird is that Waymo seems generally biased toward very conservative driving, so seeing it do a random aggressive move is strange.
    • gus_massa 10 days ago
      My guess is that the police car in the wrong lane confused the system. If a car is going in a direction, 99% of the time it's safe to follow the car and go in the same direction.
      • GJim 10 days ago
        This is the mentality that leads to multiple pile ups in fog, because every idiot is looking at the tail lights in front of them.

        FWIW: The driving test (at least in Blighty) teaches you to look as far down the road as you can, never to just do what the bloke in front is doing.

        • fallingknife 10 days ago
          The problem with fog is that you can't see much past the person in front of you?
    • axpy 5 days ago
      Everyone seems to be an expert on what to do in this situation. I would argue that it's not as trivial as it seems. Yes, the Waymo car stays on the wrong side of the road for far too long, but when surrounded by erratic unicyclists, stopping is not necessarily the right thing to do either. As a human driver, I have gotten into some strange situations where I had to play damage control because of an early bad decision. In a big city, never crossing the double yellow line means you will probably end up stuck every block. I personally see this as a good example to learn from, but not necessarily a reason to stop testing. I would also refrain from making any judgment before seeing the actual data from the car's camera/lidar/radar systems.
    • Timshel 10 days ago
      What I find revealing is its utter inability to reset to a safe position.

      It stays in the wrong lane. Is it because it can't rejoin, since the lane is occupied by the unicyclists (looking at the video, it does not appear to make any effort to go back)? Or because it has already forgotten the context and is now happily driving along in the lane ...

      It only stopped when its path was blocked ...

  • gnfargbl 10 days ago
    Waymo told the Chronicle in a statement that the robotaxi “detected that there may be a risk of a person within that crowd who had fallen down, and decided to carefully initiate a passing maneuver when the opposing lane was clear to move around what could be an obstacle and a safety concern.”

    “After starting that maneuver, out of an abundance of caution around these vulnerable road users, and to avoid getting too close or cutting them off, the Waymo remained in the oncoming lane for longer than necessary before returning to its original lane of travel,” the company said.

    -- https://archive.ph/24IDK

    • stefan_ 10 days ago
      It is a really really really bad sign that these companies feel the obsessive need to lie and distort around what are safety-critical incidents. Where is the safety culture? If this is how it's going to be, why on earth should we trust them to deploy them on public streets?
    • Timshel 10 days ago
      I love the:

      > before returning to its original lane of travel

      When it was only able to do so because someone started to block its path and forced it to stop ...

    • timeon 10 days ago
      The Waymo car here, like some drivers, never considers the option of slowing down. They just have to overtake at all costs.
    • fabian2k 10 days ago
      That statement is PR bullshit; I expected better from Waymo. Nobody fell, so it misidentified the situation. And there was nothing careful about the overtaking maneuver here: there was no safe place to end it in sight.

      These kinds of mistakes will very likely happen with self-driving cars, the important part is to address them and improve the cars. I don't expect the cars to be flawless from the start (I do expect some fundamental safety mechanisms to work reliably from the start like the automatic braking before obstacles). Downplaying the issue here is a bad look for Waymo.

      • gnfargbl 9 days ago
        I'm not sure it's entirely bullshit, but certainly the phrase "remained in the oncoming lane for longer than necessary" is doing some heavy lifting; it feels like some key information has been omitted.
  • oliwary 10 days ago
    This has to be one of the most scifi videos I have ever seen... The lights, the people on uniwheels, the self-driving car that misbehaves followed by people on unicycles surrounding the car to coax it back to the correct lane...
  • cj 10 days ago
    This is kind of cute, in a dystopian way.

    "Waymo you're going the wrong way!" and the human uni-cyclist steers in front of Waymo, knowing that it will brake, in an attempt to force it back into the correct lane.

    • jojobas 10 days ago
      What if something was coded/learned to recognize the "people on the road surrounding the vehicle" situation, so that the car could try to get away or even ram them?
    • GJim 10 days ago
      > the human uni-cyclist steers in front of Waymo, knowing that it will brake

      That's a level of optimism I don't have....

      ...on the same level as a bloke I saw advertising a food blender on the shopping channel, sticking his fingers into the thing to prove the safety mechanism would stop the blades.

  • bell-cot 10 days ago
    One really basic & important feature of (decent) autonomous systems might be called Situation Sanity Awareness. Human drivers are (overall) extremely good at that. And a car driving for any distance on the wrong side of a double yellow line is a very obvious SSA fail.
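
    As a sketch, the wrong-side case could be a standing invariant checked independently of whatever the planner currently believes (hypothetical names, purely illustrative):

        def ssa_violation(crossed_double_yellow: bool,
                          in_opposing_lane: bool,
                          seconds_on_wrong_side: float) -> bool:
            # A brief excursion can be a legitimate dodge around an obstacle;
            # a sustained one should trip an alarm no matter what the
            # planner thinks it is doing.
            GRACE_PERIOD_S = 3.0
            return (crossed_double_yellow and in_opposing_lane
                    and seconds_on_wrong_side > GRACE_PERIOD_S)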
  • datadeft 10 days ago
    When are we going to admit that the current level of understanding and technical capability is not sufficient for safe autonomous driving?
    • scoofy 10 days ago
      I've said it over and over: the American automobile transit system is inherently unsafe. That's why I've been a self-driving skeptic. We have a system where drivers must break the rules and/or drive in an unsafe manner regularly. This involves: excessive speed when others are driving too fast, entering the oncoming lane when needing to pass someone double-parked, making blind turns because a big truck is blocking your view, not knowing when a pedestrian will enter a crosswalk and needing to slam on the brakes, having to yield when crossing a lane of traffic... the list goes on and on. There isn't an actual safe set of rules, and the rules we have are so commonly broken that a learning algorithm won't follow them.

      I'd put this inherent danger at somewhere around 0.1% of the time: I'd say I'm forced to drive dangerously about once every 10 miles of driving. But the fact that people must occasionally drive dangerously to use the system means that a machine can't ever learn to use it safely.

    • dotancohen 10 days ago
      Probably about the time that we realize 100% safety is an impossible goal, and better-than-human safety is achievable.

      I fully admit to having done things in the car far dumber and more dangerous than this when I was young. I'm not excusing the Waymo, rather I'm putting it in perspective.

    • rufus_foreman 10 days ago
      One alternative to autonomous driving in San Francisco:

      https://www.youtube.com/watch?v=9HoqFlDuuBI

    • fallingknife 10 days ago
      First we need to admit that the current level of human capability is not sufficient for safe driving.
    • arethuza 10 days ago
      To be fair, I've seen many human drivers doing equally stupid things.
      • Timshel 10 days ago
        Yes, tired, drunk, or raging people do stupid shit.

        But the goal is not exactly to have autonomous driving at their level ...

        • arethuza 10 days ago
          The point I was trying to make, clearly not very well, was that nobody would be surprised by a human making a mistake like this, so why are we expecting perfection from a software-controlled car? They will make mistakes, but I'd also expect these mistakes to become less and less common. I'd only worry if the rate of change of performance isn't fairly positive...
          • Timshel 10 days ago
            I don't know; even by low human standards it's bad.

            I would not expect a human to drive two blocks while failing to get back into the correct lane; I would expect him to realize after 50 m that there are too many riders, slow down, and fall back behind the unicycles. In the same way, I would not expect a human driver to stop after hitting someone, then start again and drag the body along.

            The problem is that it's easy to pump up the statistics by driving miles without issue, and then it utterly fails when the situation is no longer well defined.

          • imadj 10 days ago
            > They will make mistakes

            Machines don't make "mistakes". A mistake requires intent. In the case of autonomous driving, any problem is a defect in the system that the whole fleet shares.

            > I'd also expect these mistakes to become less and less common

            So people should just tolerate it in the meantime "for the greater good"? Oh please, it's a commercial enterprise.

        • danielbarla 10 days ago
          There are a few more categories of people who often do stupid shit, many of which are fairly mundane and far less easy to disregard (such as: momentarily distracted / semi-permanently fiddling with their phone / dangerously incompetent).

          I don't think it's unfair to point out that humans, quite often, also suck quite badly at driving.

          • Timshel 10 days ago
            In general, I don't think humans suck at driving, but there are many situations in which any driver can be extremely dangerous, for many reasons.

            Many of those reasons are illegal (alcohol, fiddling with your phone), so it's not a question of fairness; an automated program was supposed to be better than that.

            I don't see anything unfair in requiring that automated driving, which is supposed to be free of our human failings, not engage in illegal and dangerous driving ...

            Especially since autonomous driving might bring more cars, not fewer (similar to the impact of Uber and Lyft: https://www.nature.com/articles/s41893-020-00678-z#Sec8)

            • danielbarla 10 days ago
              My take is that both have spectacular edge cases / failure modes that are essentially laughable from the other's point of view. It's hard to compare the two, when all we're doing is cherry picking scenarios.

              I could point to a double pedestrian crossing near my place where cars routinely speed. Various people have told me that the drivers cannot see the pedestrians, for a variety of reasons (buildings, cars, and especially a large tree near one corner), yet the human drivers are confident to go 10-20 kph over the speed limit through the area, despite this lack of information. I can give many similar examples - people really do suck at driving, they just tend to get away with it most of the time because rare events are, well, rare.

              AI sucks for a variety of other reasons. The only real question is whether on average, over all scenarios and ratios of humans to AIs populating the roads, one sucks more than the other. It's something I don't think we have the answer to.

      • blibble 10 days ago
        I'd expect to be prosecuted if I was caught driving like this, and probably lose my license as a result

        what is the threshold for permanently terminating Waymo's license?

            - an example like this of dangerous driving?
            - 1 death by dangerous driving?
            - 10 deaths by dangerous driving? (say it veered right in this video into the other road users)
            - 100 deaths by dangerous driving? (say it veers into a sidewalk full of pedestrians)
        • viraptor 10 days ago
          Why would you expect losing a license? The only actual issue I can see is crossing the double line which is just one penalty point if caught https://www.shouselaw.com/ca/defense/vehicle-code/21460/

          (Possibly also not using the indicator, which is another point, but I don't think we can be 100% sure it wasn't used before the maneuver started - California seems to require it only before, not during, a lane change, if I read it correctly)

          • ChoGGi 9 days ago
            You'd also get a misdemeanor (or felony if you hurt someone).

            https://www.shouselaw.com/ca/defense/vehicle-code/21651b/

          • blibble 10 days ago
            the UK has somewhat stricter driving standards

            doing it for a second or two would likely be classed as "driving without due care and attention", resulting in a fine if convicted

            however doing it for as long as in this video I suspect would be classed as "dangerous driving", with disqualification being the likely result

            • arethuza 10 days ago
              I'd be very surprised if someone lost their license over doing this in the UK - you might get a fixed penalty notice but I suspect if you managed to explain to the police what you thought you were doing you might get away with a good telling off. Of course, if anyone got hurt you'd be in serious trouble...
        • miki123211 10 days ago
          The problem with these self-driving companies is that they drive a lot more miles than a human ever could. Even if their error rates were 1/10th those of the most cautious human drivers, if we held them to the same standards as we hold humans, they'd lose their licenses in a week. This is due to the sheer number of cars they have and the miles they drive.

          This is a variation on the trolley problem. Would you rather have roads that are 10 times safer, with 10 times fewer accidents and 10 times fewer deaths, but where the accidents are completely unexplainable and where nobody can be held guilty, or would you rather have the world stay as it is?
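
          Back-of-envelope, with entirely made-up numbers, the scaling effect looks like this:

              # Hypothetical figures, purely to illustrate the scaling argument above.
              human_miles_per_year = 12_000
              fleet_miles_per_year = 50_000_000
              human_errors_per_mile = 1e-5
              fleet_errors_per_mile = human_errors_per_mile / 10  # "1/10th the rate"

              print(human_miles_per_year * human_errors_per_mile)   # 0.12 errors per driver
              print(fleet_miles_per_year * fleet_errors_per_mile)   # 50.0 errors per fleet

          Judged per license holder, the fleet looks roughly 400x worse even while being 10x safer per mile.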

          • peteradio 10 days ago
            If Waymo sold the software/hardware without onerous licensing costs I would rather live in world #2. If instead they become an incredibly wealthy lobbying engine based on extraction for a few rather than true wealth creation then I'd rather still live in world #1.
        • acimim_ha 10 days ago
          Negative publicity is reflected in their equity valuation. They have great incentives. The system works.

          I think with 100 deaths the company would get liquidated. This incident alone probably caused more damage than the cost of a single driver's license suspension.

      • logifail 10 days ago
        > I've seen many human drivers doing equally stupid things

        Just yesterday I got yelled at by an Italian police officer who was directing traffic at a busy junction (I was in a rental car). [Full disclosure: I stopped, wound my window down, we had a polite conversation, everything turned out fine...]

        The key difference is that if I screw up badly enough when driving, I risk a monetary fine and/or losing my licence and/or getting sent to prison. For most human drivers these represent clear incentives not to screw up when driving.

        Questions:

        Have we agreed on suitable penalties when an autonomous vehicle screws up, given there isn't a single human driver to sanction?

        Could the fleet get grounded?

        Could the company get fined?

        Could relevant executives go to prison?

        • kalleboo 10 days ago
          > Could the fleet get grounded?

          Cruise's permit was suspended after their accident

          • logifail 10 days ago
            There wasn't just the one Cruise incident, though, was there?
    • Mashimo 10 days ago
      That's why they are still in research and not widely available.
      • blibble 10 days ago
        that's funny, because when I watched the video I got the impression it was dangerously roaming the roads of San Francisco with hundreds of non-consenting people in close proximity
        • Mashimo 9 days ago
          Correct. Like I said, they are not widespread - only a few cities, with remote monitoring.
        • dotancohen 10 days ago
          Don't conflate an advanced state of research with wide availability.
        • concordDance 10 days ago
          We allow barely coherent and error-prone monkeys to control multi-ton vehicles on such roads for far worse reasons, such as "I just felt like going for a drive".
      • NewJazz 10 days ago
        Why are they conducting experimental research on public roads in densely populated cities?
        • Mashimo 9 days ago
          Probably because it's cheap and gives the most feedback.
        • dageshi 10 days ago
          Why do we allow human trials of new drugs?
          • NewJazz 10 days ago
            Consent
            • dageshi 10 days ago
              The people surrounding the trial participant didn't give consent. If the drug causes a violent psychosis which in turn causes people to be injured by the participant, how does that differ?
              • NewJazz 10 days ago
                You are quickly moving the goalposts I see. Have fun with that exercise.
                • dageshi 9 days ago
                  I don't think I am. I think they are comparable: both are regulated tests of new technology in real-world situations where there is no practical alternative, in order to determine whether they are fit for purpose.
  • botanical 9 days ago
    Google's AI is just as bad as the rest. I remember when it couldn't get past a traffic cone after not being in an accident for "countless hours":

    https://youtu.be/zdKCQKBvH-A?t=742

    These companies should be fined hefty amounts and barred from beta testing out in public and putting people in danger. Governments should carry out comprehensive tests of a self-driving car's capabilities, just as cars without proven passenger safety (Euro NCAP) aren't allowed on roads carrying passengers.

    And now this great AI company is also providing AI services to pariah states like Apartheid Israel to be used on people. That's scary.

  • everyone 10 days ago
    Self-driving is a very difficult (and expensive to develop) engineering challenge... I think it might end up being cheaper (and thus more likely) that the "solution" will be a legal one rather than an engineering one: self-driving vehicles will have a light and/or klaxon on them, and if you see or hear it, it will be your responsibility to get out of its way.

    That's how the conflict between automobiles and other road users was resolved in the 1920s. The impetus is there; corporations want more and more robots for everything.

  • infecto 10 days ago
    It's obviously in the wrong, but I do wonder if it somehow misidentified the lane it was supposed to be in as a bike lane?
    • acimim_ha 10 days ago
      I sure did. And I sure made even bigger screwups while driving.
      • infecto 10 days ago
        Totally. There is no denying it is in the wrong, but this does not seem that serious to me. It did not plow into other cars or the unicycle folks.
        • peteradio 10 days ago
          Next week, when it does plow into a bunch of people or cause a pileup, we'll have people saying "I did much worse when I was young, people do it all the time"
          • infecto 10 days ago
            Please don't project. I am not in that camp and I suspect many others are not.
  • randunel 10 days ago
    I now expect a new recaptcha challenge to be added by Google, identifying unicycles. We already have bicycles and mopeds, but it would appear that unicycles confuse their cars.
    • unhammer 10 days ago
      It's the humans' fault for not being lawful and considerate to the AI murderbot:

      > a former Baidu executive, Drive.AI board member, and one of the industry’s most prominent boosters — argues the problem is less about building a perfect driving system than training bystanders to anticipate self-driving behavior. In other words, we can make roads safe for the cars instead of the other way around. As an example of an unpredictable case, I asked him whether he thought modern systems could handle a pedestrian on a pogo stick, even if they had never seen one before. “I think many AV teams could handle a pogo stick user in pedestrian crosswalk,” he told me. “Having said that, bouncing on a pogo stick in the middle of a highway would be really dangerous.”

      > “Rather than building AI to solve the pogo stick problem, we should partner with the government to ask people to be lawful and considerate,” he said. “Safety isn’t just about the quality of the AI technology.”

      https://www.theverge.com/2018/7/3/17530232/self-driving-ai-w...

      • Dyac 10 days ago
        Reminds me of the jaywalking law in the US, brought about by the auto industry.

        This law, which makes it illegal by default to be in the road, doesn't exist in much of the world. It's really up to the self-driving cars to respect other road users at least as well as a human driver would, not to expect to mould the laws around the world to fit their limited capabilities.

        https://www.bbc.co.uk/news/magazine-26073797

      • aftoprokrustes 10 days ago
        I agree that the quote does sound cold and coercive, but this is actually already kind of how streets and urban developments are designed. Human drivers are limited, so if you allow cars to go fast, you restrict or remove the right to walk close to the road (highways). Conversely, if you want to give more room to pedestrians, you decrease the speed limit, reduce lane width, etc., such that drivers are able to react accordingly. Nothing wrong with that in principle.

        Now, this does not say anything about the Waymo incident. In that case, the car performed an illegal maneuver, which cannot be blamed on other street users.

      • simion314 10 days ago
        Making roads safe for cars sounds like rails: put down rails, isolate them from the other streets, then have super-fast carts that a computer can safely drive. You could have carts you load your car onto that would move you closer to your destination.

        If we need to design roads for Artificial Stupidity, then we should design those roads from scratch and gain some extreme speed and extreme safety from the work.

      • fallingknife 10 days ago
        This is what we do for cars now
        • everyone 10 days ago
          Yeah, there was a big PR and legal campaign in the '20s from the auto industry to promulgate the idea of jaywalking and take the streets from the people.
    • tinco 10 days ago
      You would expect a captcha that shows you two yellow lines and asks you on which side of the yellow lines a car should be driving. Whether there are unicycles present is totally irrelevant.
      • dagw 10 days ago
        Just make sure no one in the UK, Ireland, Australia or Japan gets asked to solve those captchas.
        • alchemist1e9 10 days ago
          Yeah, I’m wondering if they accidentally have training data from those places; plus, that data might be more likely to have groups of cyclists.
          • defrost 10 days ago
            Or people randomly playing cricket in the middle of the street.

            For real. Not /s.

            Not in downtown Melbourne, but a definite possibility in suburban streets.

            • dagw 9 days ago
              In parts of North America you have to deal with street hockey, so I imagine dealing with street cricket looks basically the same.
      • arethuza 10 days ago
        Actually, by looking at the behaviour of the unicycles I could infer that this was a country where people drive on the right and therefore the car was probably doing something wrong?
      • csunbird 10 days ago
        As always there is a relevant xkcd: https://xkcd.com/1897/
        • bitwize 10 days ago
          There are companies whose business model is this but unironically.
        • defrost 10 days ago
          My Driverless Car Is Driving Me To Distraction
  • sparrowInHand 10 days ago
    Sabotaging such an obviously dangerous vehicle could be seen as self-defense?
  • underdeserver 10 days ago
    No idea what the Waymo car was thinking, but what are these kids riding?

    Maybe it thought they were on a bike lane? Where I'm from it's illegal to ride like that in a group outside of police-sanctioned events - you're supposed to ride one after the other and keep right.

    • rimunroe 10 days ago
      > Maybe it thought they were on a bike lane? Where I'm from it's illegal to ride like that in a group outside of police-sanctioned events

      I’m in Massachusetts where lane splitting is legal in some cases (single lane each direction, posted speed limit below some amount, probably something else). Some other states have similar rules, but in most it’s flatly illegal.

      I believe California is an unusual outlier in that lane splitting is allowed with no additional restrictions.

      > you're supposed to ride one after the other and keep right.

      In many states bicycles are entitled to use the full lane. I often feel safer riding in the center of the lane for a few reasons:

      1. The center of the lane is less likely to have small ride-ruining debris like glass, sticks, and nails

      2. Cars surprisingly give me more space when passing if I’m not riding off to the side

      3. It seems that potholes occur more often at the side of the road than in the center, or at least get fixed more often when they're toward the middle of the road

      Note: I’m talking about the middle of the lane vs the side, not the middle of the lane vs the shoulder. The shoulder isn’t something you should travel in anyway, even on a bike.

      • GrantMoyer 10 days ago
        FYI, lane splitting conventionally refers to driving a small vehicle between lanes of stopped or slow cars, rather than multiple small vehicles riding side by side within one lane [1]. The first is unconditionally illegal in Massachusetts, while the second is conditionally legal, like you say.

        [1]: https://en.wikipedia.org/wiki/Lane_splitting

        • rimunroe 10 days ago
          Thank you! I was indeed using the wrong term. It looks like 39 states allow bikes to ride two-abreast (sometimes more).
    • liveoneggs 10 days ago
      Is it also illegal to ride into oncoming traffic, in the wrong lane?
    • Faaak 10 days ago
      Still no excuse for driving on the opposite lane ;-)
      • reisse 10 days ago
        The unicycle rider with the camera was doing the same - riding in the opposite lane behind the Waymo
        • Faaak 10 days ago
          Sure, but this article is about Waymo. We can create a new link about "unicycle rider with a camera riding in the opposite lane", but that's another topic
          • acimim_ha 10 days ago
            The topic would not gather nearly as much attention. Why?
            • rsynnott 10 days ago
              No-one is suggesting that that person should be duplicated a hundred million times and drive two tonne objects through every city in the world, you see. "Magic driving robot does stupid thing" is a much more interesting story than "arbitrary human does stupid thing", because the end goal is to have _a lot_ of the magic driving robot.
    • eklitzke 10 days ago
      The devices they're riding are called electric unicycles, often abbreviated to EUC.
  • bryanlarsen 10 days ago
    I suspect this is human error rather than a computer error. Waymo is a supervised system -- from what I've heard while there's little remote driving by humans, there is occasional real-time human labelling and confirmation. So the car likely phoned home to ask "what the heck is this?", and a human clicked on the "overtake" button or something.
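
    If it works anything like other remote-assistance setups (pure speculation on my part; none of these names are Waymo's), the flow would be roughly:

        import queue

        def ask_remote_operator(request_q, reply_q, scene_snapshot,
                                timeout_s=10.0):
            # Ship the confusing scene to a human and wait for a label.
            request_q.put(scene_snapshot)
            try:
                # Operator picks from a fixed menu, e.g. "overtake" / "wait".
                return reply_q.get(timeout=timeout_s)
            except queue.Empty:
                # No answer in time: fall back to a safe behavior.
                return "wait"

    A wrong click in a menu like that would look exactly like this incident if nothing downstream sanity-checks the choice.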
    • Jevon23 10 days ago
      >a human clicked on the “overtake” button

      Ok. So what are the consequences for this human operator and/or Waymo?

      Human drivers make mistakes all the time, it’s true. But there are consequences for those mistakes. If a human drives into the opposite lane that’s a big problem, especially if they end up causing property damage or bodily injury. Why are the same standards not being applied here?

      • bryanlarsen 10 days ago
        I've seen plenty of cars drive the wrong way down a one-way street. I've done it myself by accident a couple of times. I've never seen anyone face any consequences other than being honked at. I've seen one pulled over, but they got away with just a warning since they were obviously just confused tourists.

        OTOH, I doubt the operator still has a job, and Waymo is going to face significant consequences. Likely formal consequences, but if not, the punishment by the court of public opinion is going to be massive.

        • mannykannot 10 days ago
          If this is actually what happened, then regardless of the consequences, it would challenge the notion that human supervisors provide adequate protection of the public while these cars are being tested.
      • flutas 10 days ago
        > Ok. So what are the consequences for this human operator and/or Waymo?

        If you're Waymo, apparently you can tell the car to run a red light, cause a moped rider to crash, and not even need to report it to the state.

        https://www.forbes.com/sites/bradtempleton/2024/03/26/waymo-...