
Robotaxis, school buses and repeated incidents: in Texas, Waymo is still struggling to teach its autonomous cars to stop behind school buses

Despite software updates, an official recall and a data-collection effort, Waymo's autonomous vehicles have continued to illegally pass stopped school buses in Texas. The series of incidents highlights the persistent difficulty artificial intelligence has in understanding certain situations, and rekindles the debate over how reliable these technologies really are on the road.

This is one of the promises of the autonomous car: to learn continuously. Each incident, each error, each unforeseen situation is supposed to feed a collective intelligence, shared across an entire fleet of vehicles. “The Waymo Driver draws lessons from the collective experience acquired by our entire fleet, including previous generations of equipment,” boasts Waymo, a subsidiary of Alphabet.

In Austin, Texas, however, this well-oiled mechanism seems to have seized up. According to Wired, the company's vehicles are struggling to learn to stop when school buses pick up and drop off children. On several occasions, cars have "illegally and dangerously" passed school buses that were stopped with their red lights flashing instead of coming to a complete halt, a behavior strictly prohibited by American law because it endangers children getting on or off the bus.

Around twenty incidents recorded

According to the Austin Independent School District, there have been at least 19 such incidents, some of them near-accidents. A robotaxi reportedly overtook a bus "only moments after a student crossed in front of it, while he was still on the road."

Faced with these failures, Waymo rolled out software updates. The company also filed an official recall with the National Highway Traffic Safety Administration (NHTSA), acknowledged a dozen incidents, and claimed to have corrected the problem.

But on the ground, nothing seemed to change, and incidents kept accumulating. So the school district and the company attempted an unusual collaboration: in December, a data-collection operation was organized in a parking lot where several school buses were brought together. The objective? To let Waymo engineers train their systems to better recognize the buses' flashing lights and help the fleet understand when vehicles are not allowed to pass.

“Specifically, they wanted to focus their data collection on the amber and red lights on each of our school buses, as well as the ability of their cars to detect them at different distances,” the district adds.

An ongoing investigation

In total, at least seven buses, representing every model in the district's fleet of more than 550 vehicles, were gathered at its sports complex. The district even provided Waymo with the technical specifications for the buses' lighting. But once again, the results failed to materialize.

As early as January, another series of incidents was reported, according to a preliminary report published in early March by the NTSB, the independent federal transportation-safety agency. The document reveals that an incident on January 12 occurred after a Waymo remote assistant, a Michigan-based employee responsible for "helping" the software when it runs into difficulty on the road, wrongly told the robotaxi that the school bus ahead of it did not have its flashing lights on. Six vehicles passed the school bus while it was stopped, the agency said. An investigation is underway.

"It is alarming that the alleged incidents occurred after Waymo assured the district that it had updated its software to resolve the problem," observes the school district's attorney. "Austin ISD is considering all legal remedies available to it and intends to take all necessary measures to protect the safety of its students, if necessary," he warned.

Just days after these January incidents, a Waymo vehicle struck a child crossing in front of it near a school in Santa Monica, California. The child was reportedly uninjured. The company said its models showed that a human driver would have hit the child at a higher speed than its vehicle did.

The blind spots of artificial intelligence

"The data we have collected from the start of the school year to the end of the semester shows that approximately 98% of human drivers who receive a ticket do not receive another," a school police department official told the local NBC affiliate.

"This tells us that humans are learning, but it doesn't appear that Waymo's automated system is learning through its software updates or recalls or anything else, because we continue to see violations," he adds.

For specialists, these failures are not surprising. Artificial intelligence systems require time, sometimes months, to integrate new data. Above all, data collected outside a real-world context loses relevance: a bus parked in a lot reproduces neither the complexity nor the unpredictability of urban traffic.

Moreover, autonomous vehicles have long struggled to interpret certain visual signals, such as flashing lights, movable signs or articulated stop arms, all elements that fall outside standardized road mapping. "If (the company) hasn't corrected this problem, the more vehicles are driven, the more it becomes a problem," notes Missy Cummings, who researches autonomous vehicles at George Mason University and was a safety advisor to the NHTSA under the Biden administration. "That's exactly what's happening here."

The last percent

And it's even more complex when context must be taken into account, because a "stop" sign does not always mean the same thing. It can be fixed at an intersection, held up by a construction worker, or mounted on a school bus. For a human, context makes the difference; for a machine, that nuance is a challenge. "Waymo is trying to teach its software something very subtle," insists Philip Koopman, who researches autonomous-vehicle software and safety at Carnegie Mellon University.
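To make the context-dependence Koopman describes concrete, here is a toy sketch, purely illustrative and in no way Waymo's actual logic, of how the same octagonal "stop" signal maps to different driving rules depending on what carries it. All names and categories are hypothetical.

```python
# Toy illustration: the same "stop" octagon demands different behavior
# depending on its context. Hypothetical example, not Waymo's software.

def required_action(sign_context: str, lights_flashing: bool = False) -> str:
    """Map a detected 'stop' sign to a driving rule based on its carrier."""
    if sign_context == "intersection":
        # Fixed sign: come to a full stop, yield, then proceed.
        return "stop_then_proceed"
    if sign_context == "construction_flagger":
        # Hand-held sign: stop until the flagger waves traffic through.
        return "stop_until_released"
    if sign_context == "school_bus_arm":
        # Extended bus arm with flashing reds: stay stopped until it retracts.
        if lights_flashing:
            return "stop_until_arm_retracted"
        return "slow_and_watch"
    # The ambiguous case is precisely where the difficulty lies.
    return "unknown_context"

print(required_action("intersection"))          # stop_then_proceed
print(required_action("school_bus_arm", True))  # stop_until_arm_retracted
```

The point of the sketch is that perception (spotting a red octagon) is only half the problem; the rule to apply depends on classifying the context correctly, and a wrong classification, like the remote assistant's wrong call in the January 12 incident, yields a wrong action.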

The company is betting on "world models" to simulate, using AI, scenarios as rare as they are complex, in order to prepare its vehicles for such eventualities in the real world. But there is still work to do.

According to Missy Cummings, “Waymo should not be allowed to drive near schools during student arrival and departure times until it has resolved this problem and demonstrated it through specific tests”.

The Austin episode illustrates a truth well known to engineers: reaching 99% reliability is relatively easy, but the last percent, the rare, unexpected, atypical cases, concentrates most of the complexity. And on the road, that "last percent" can be a matter of life or death.