Why Waymo Has Trouble Stopping for School Buses

For years, Alphabet-owned Waymo has tried to differentiate itself from other self-driving startups by emphasizing a culture of caution and safety. Now, just before a planned nationwide rollout, it faces repeated failures in one of the most sensitive settings imaginable: school zones.

In December, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Waymo after the largest school district in Austin reported at least 19 incidents in which the company’s robotaxis failed to come to a complete stop for school buses during loading and unloading, which is illegal in all 50 states. Waymo quickly responded by issuing a voluntary software recall and releasing updates to fix the problem.

But the patch didn’t work. Since the update, the Austin Independent School District (ISD) says there have been at least four more violations, including one on January 19th, when a Waymo vehicle was filmed swerving into the oncoming lane while children waited to cross the street and board a bus with its stop arm extended. In total, at least 24 school bus safety violations involving Waymo vehicles have been reported in Austin since the start of the 2025 school year.

Waymo has defended itself in part by noting that none of the Austin school bus incidents resulted in a collision or injury. But that claim no longer holds nationwide. Last week, Waymo published a blog post admitting that one of its vehicles hit a child outside a Santa Monica elementary school on January 23rd. Although the school district told The Washington Post that the child suffered only minor injuries, the outcome could have been much worse: Waymo says the vehicle had only slowed from 17 mph to 6 mph in the moment before impact.

Experts who specialize in autonomous vehicle safety and pedestrian interaction told The Verge that these incidents are concerning, especially given the company’s stated goal of having its vehicles drive “with confident assertiveness.” In trying to shake off the stereotype of its cars driving like a cautious grandparent, Waymo has been spotted flouting traffic rules. But by making its robotaxis drive more like humans, the company also risks them inadvertently inheriting some of our more dangerous driving habits.

“These technologies are still being developed and tested in real-world environments because there are a lot of things that happen in the real world that are hard for companies and engineers to predict,” Cornell Tech professor and human-robot interaction expert Wendy Ju tells The Verge. “If you don’t understand all the things that can happen, it’s hard to know what to design.”

Waymo did not respond to repeated requests for comment. On Wednesday, however, Waymo chief safety officer Mauricio Peña addressed safety concerns raised during a Senate hearing. He said Waymo is evaluating each of the school bus incidents and developing fixes, some of which have already been incorporated into its software. Peña also said the company is working with Austin ISD to “collect data on different lighting patterns and different conditions.” Notably, Waymo has not committed to keeping its vehicles away from school buses while that data collection and testing take place.

School bus stops test the ‘logic’ of autonomous vehicles

Navigating around school buses is one of the more dangerous aspects of driving, for humans and robots alike. NHTSA attributed 61 deaths to vehicles illegally passing school buses between 2000 and 2023, and nearly half of the victims were pedestrians under the age of 18. The danger has less to do with the bus drivers themselves, who are usually licensed and careful, and more to do with the chaotic, improvisational nature of the situation. Buses often have to double-park, and kids, being kids, don’t always wait to cross the street when they’re supposed to.

“Waymo has a problem because every driver has problems with school buses,” Ju said.

As a result, drivers navigating around buses must rely on experience and intuition in addition to the fixed set of rules they learned in driver’s ed. That kind of common-sense reasoning, which comes naturally to skilled human drivers, Ju says, is especially challenging for self-driving cars.

“There are all these moments in time where you actually have to decide between different things that you should do,” Ju said.

Is Waymo running a ‘public beta test’ with pedestrians?

On a technical level, there may be more at play. According to Missy Cummings, a George Mason University professor and director of the Mason Autonomy and Robotics Center, the apparent rise in safety issues between robotaxis and school buses may be related to what she describes as Waymo’s growing shift away from traditional, modular machine learning toward a greater emphasis on end-to-end learning, an approach she calls “fashionable” and “still nascent.”

Waymo publicly says it uses a combination of the two approaches, but some experts speculate that the balance is shifting.

Earlier autonomous vehicle systems relied on more conservative, layered architectures, with separate modules responsible for detecting objects, classifying them, and applying explicit safety rules that determine how the vehicle should respond. In contrast, end-to-end learning packages much of this process into a single model that takes in everything the car’s sensors gather at once and makes driving decisions probabilistically, based on patterns gleaned from large volumes of human driving behavior. The result is something that can feel more “natural” and human, though Cummings says it can also pose additional risk, especially in high-stakes scenarios like school bus stops.
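To make the distinction concrete, here is a minimal sketch in Python of the two architectures as described above. It is purely illustrative and is not Waymo’s actual software; every name, label, and threshold in it is a hypothetical stand-in.

```python
# A minimal, hypothetical sketch contrasting the two architectures.
# This is NOT Waymo's software; all names, labels, and thresholds are invented.
from dataclasses import dataclass


@dataclass
class SensorFrame:
    objects: list[str]   # labeled detections, e.g. ["school_bus_stop_arm"]
    speed_mph: float


# --- Modular architecture: separate stages, explicit safety rules ---

def detect_objects(frame: SensorFrame) -> list[str]:
    """Perception module: return labeled objects from sensor data."""
    return frame.objects


def apply_safety_rules(objects: list[str]) -> str | None:
    """Rules module: hard-coded legal constraints override everything else."""
    if "school_bus_stop_arm" in objects:
        return "FULL_STOP"  # stopping for a stop arm is the law in all 50 states
    return None


def modular_drive(frame: SensorFrame) -> str:
    rule = apply_safety_rules(detect_objects(frame))
    return rule if rule is not None else "PROCEED"


# --- End-to-end architecture: one learned model, sensors in, action out ---

def end_to_end_drive(frame: SensorFrame) -> str:
    """Stand-in for a single learned model that decides probabilistically,
    based on patterns in human driving data. Because some humans illegally
    roll past stopped buses, the learned stopping behavior is statistical,
    not guaranteed."""
    stop_confidence = 0.7 if "school_bus_stop_arm" in frame.objects else 0.05
    return "FULL_STOP" if stop_confidence > 0.9 else "PROCEED"


frame = SensorFrame(objects=["school_bus_stop_arm", "pedestrian"], speed_mph=17.0)
print(modular_drive(frame))     # FULL_STOP: the explicit rule always fires
print(end_to_end_drive(frame))  # PROCEED: no hard guarantee from learned behavior
```

The takeaway from the sketch: a modular system can encode “always stop for an extended stop arm” as a hard rule that fires every time, while an end-to-end model can only learn that behavior statistically, with no guarantee it fires in every case.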

The incidents show “all the signs of trouble when you change the architecture,” Cummings said. “I suspect a lot of [robotaxi companies] are doing that.”

The schools asked Waymo to stop

Regardless of what is causing the errors, Austin ISD has made its position clear. Officials have reportedly asked Waymo to halt robotaxi operations around schools during loading and unloading until the issue is resolved. But Waymo refused and has continued to operate, a decision experts see as short-sighted and at odds with the company’s public image of safety and caution.

“I think it’s troubling that Waymo wouldn’t agree to that,” Cummings said. “You’re only talking about an hour and a half, 45 minutes in the morning and 45 minutes in the afternoon, so it’s not a heavy lift.”

Philip Koopman, a Carnegie Mellon professor and AV safety expert, echoed this sentiment in a recent edition of his Substack newsletter on autonomous system safety.

“Waymo clearly chose to gamble with children’s lives,” Koopman writes. “They say they like the odds, but it’s not their gamble.”

And while the vast majority of the school bus problems have been recorded in Austin, that is likely in part because the district recently outfitted its school buses with cameras. In other words, similar incidents may be happening elsewhere and simply going unnoticed. Austin ISD did not respond to our requests for comment. (Axios reported that the problem also occurs in Atlanta.)

The school bus incidents, along with the crash involving a child in California, have sparked three federal investigations in as many months. And while it is unlikely in Texas’ relaxed AV environment, incidents like these could put Waymo at risk of having its operating license revoked. Even now, with investigators reportedly heading to Austin to examine the company in person, its robotaxis continue to operate in school zones. At the very least, this cavalier approach might make other municipalities think twice before welcoming the company’s cars onto their streets.

“Waymo brought this on itself,” Cummings said. “If they had done the responsible thing and decided on their own to stay out of school zones until they fixed it, then they wouldn’t be facing this major [public] investigation, because major investigations are made public.”
