A driverless Waymo taxi drove directly into an active fire scene in Hollywood this week, passing road flares and fire department barricades on Melrose Avenue and remaining in the restricted area for approximately ten minutes.
The autonomous vehicle, which appeared to carry one passenger in the backseat, entered a zone that Los Angeles Fire Department personnel had cordoned off for emergency operations. After spending several minutes within the closed area, the robotaxi executed a U-turn and departed the scene.
This incident represents the latest in a series of troubling episodes involving Waymo’s autonomous vehicles operating in American cities. The pattern of erratic behavior raises substantive questions about the readiness of driverless technology for widespread deployment on public roads.
Several months ago in Atlanta, Georgia, video footage captured a Waymo vehicle illegally passing a school bus while its stop sign was extended and lights were flashing. Children were exiting the bus at the time. The footage, recorded on a cellphone by a witness, documented a clear violation of traffic safety laws designed specifically to protect schoolchildren.
During the summer months, residents of a San Francisco neighborhood reported regular disturbances at four o’clock in the morning. Waymo vehicles congregated in a parking lot, creating traffic congestion and activating their horns repeatedly, disrupting the sleep of nearby residents. Multiple residents confirmed this as a recurring problem rather than an isolated incident.
The accumulation of these incidents has generated considerable attention on social media platforms, where users have shared numerous videos documenting unusual and problematic behavior by the autonomous vehicles.
The company’s leadership has addressed the safety concerns directly, though perhaps not in the manner the public might expect. At a technology conference in San Francisco, Waymo Co-CEO Tekedra Mawakana spoke candidly about the inevitability of fatal accidents involving autonomous vehicles.
When asked whether society would accept deaths caused by robotaxis, Mawakana stated, “I think that society will.” She emphasized that while Waymo maintains high safety standards and transparency regarding crash records, perfection in autonomous vehicle technology is unattainable.
Most notably, Mawakana revealed that the company actively plans for fatal crashes. “We really worry as a company about those days,” she explained. “We don’t say ‘whether.’ We say ‘when.’ And we plan for them.”
This acknowledgment stands in stark contrast to the industry’s typical messaging about autonomous vehicles dramatically reducing traffic fatalities. While the technology may indeed lower overall accident rates compared to human drivers, the admission that fatal crashes are anticipated rather than merely possible represents a significant concession.
The fundamental question facing regulators and the public concerns whether autonomous vehicle technology has matured sufficiently for deployment on public roads. When driverless cars cannot reliably recognize and respond to road flares, emergency barricades, and school bus stop signs, the technology’s limitations become evident.
These are not minor software glitches or inconveniences. These are failures that directly endanger emergency responders, schoolchildren, and the general public. The frequency of such incidents suggests systemic problems rather than isolated malfunctions.
As autonomous vehicle companies expand their operations into more American cities, the need for rigorous oversight and accountability becomes increasingly urgent. The public has a right to expect that experimental technology will not be deployed on public roads until it can consistently obey basic traffic laws and recognize emergency situations.
