Are Waymo Self-Driving Cars Safe? Recent Incidents Raise Concerns

Waymo’s self-driving robotaxis are coming to Baltimore. A fleet of professional drivers is already mapping out the city’s streets, allowing the company to get up and running quickly once approval is granted by Maryland lawmakers. Governor Wes Moore has expressed enthusiasm for the technology, and the legislation has bipartisan support in Annapolis.

However, before Baltimore residents welcome driverless taxis onto Charm City’s notoriously narrow streets and pothole-riddled roads, there’s a critical question that deserves honest examination: Are Waymo vehicles actually safe in real-world, unpredictable situations?

A growing string of documented incidents from other cities where Waymo already operates suggests that serious concerns remain, and those concerns have direct implications for anyone who could be injured in or around one of these vehicles.

Recent Waymo Incidents Raising Safety Concerns About Self-Driving Vehicles

The transition from human-controlled traffic to AI-driven fleets has been anything but smooth. Several recent reports highlight the unpredictable dangers these vehicles pose to pedestrians, children, and emergency responders. 

Blocking Emergency Services During a Crisis

In March 2026, a Waymo vehicle in Austin, Texas, made national headlines for a terrifying reason: It blocked emergency medical services (EMS) during the response to a mass shooting. In a situation where every second meant the difference between life and death, the AI’s inability to navigate a chaotic scene created a literal roadblock for first responders. 

High-Stakes Errors in School Zones

Perhaps most concerning is a recent investigation opened by the National Transportation Safety Board (NTSB) following an incident in Santa Monica, where a Waymo vehicle struck a child who ran out from behind a parked SUV in a school zone. Furthermore, federal investigators are looking into recurring reports of Waymo vehicles illegally passing stopped school buses.

These are not minor “software bugs.” They are life-threatening failures to follow the most basic rules of the road designed to protect our most vulnerable citizens.

Confusion in Complex Urban Environments

Baltimore’s streets are unique. From the narrow cobblestones of some neighborhoods to the aggressive flow of the Beltway, the Waymo Driver, the company’s fully autonomous driving system, must navigate conditions that even seasoned Maryland drivers find challenging.

Recent footage from other cities shows Waymo vehicles becoming “confused” by construction zones, hand signals from police officers, and unpredictable pedestrian movements: the exact types of scenarios that happen every day in downtown Baltimore.

Why Self-Driving Vehicles Still Face Challenges

Self-driving cars rely on a combination of cameras, radar and lidar sensors, artificial intelligence software, mapping data, and remote human support teams. While this technology is incredibly advanced, real-world driving is messy and unpredictable.

Situations like those described above, which remain challenging for autonomous vehicles, rarely stump a human driver. A human can make quick judgment calls based on intuition and experience, whereas autonomous vehicles must interpret these scenarios through algorithms and sensor data, which can lead to hesitation or unexpected behavior.

Who Is Liable When the ‘Driver’ Is a Computer?

When a human driver causes an accident, the path to accountability is relatively straightforward: driver negligence, insurance coverage, and legal liability. But when a driverless vehicle like a Waymo car is involved, the question of who is responsible becomes far more complex, as it can involve several potentially responsible parties, including:

  • The company operating the autonomous vehicle, 
  • The manufacturer of the vehicle or software, 
  • Remote operators supervising the system,
  • Maintenance providers, and 
  • Other drivers involved in the crash.

In addition, the legal landscape of autonomous vehicle litigation is complex and evolving. Depending on the circumstances, a personal injury claim might involve:

  • Product liability: If the software or hardware sensors failed to detect a pedestrian or another vehicle.
  • Negligent maintenance: If the fleet operator failed to keep the vehicle’s complex systems in working order.
  • Systemic failure: When the AI’s “decision-making” process violates Maryland traffic laws.

What’s particularly important to understand is that Waymo is a subsidiary of Alphabet Inc., one of the most well-capitalized corporations in the world. Injured victims should not assume that navigating a claim against such an entity is something they can do alone.

At The Law Offices of Nicholas A. Parr, we believe that “innovation” should never come at the expense of your safety. Tech giants like Waymo have massive legal teams and deep pockets, but they are still required to operate safely on our public roads.

What This Means for Baltimore

Baltimore presents its own set of challenges that no other Waymo city has faced in quite the same way. Baltimoreans will soon see whether the technology is up to Charm City’s conditions: a plethora of potholes, old narrow streets, and the snow and ice the cars have never encountered in sunny Phoenix or Austin.

There’s also a significant regulatory gap. Maryland currently has no regulatory framework for autonomous vehicles; the states where Waymo currently operates (Arizona, California, Texas, Georgia, and Florida) all do. That means if something goes wrong with a Waymo vehicle in Baltimore tomorrow, the legal landscape for victims seeking accountability would be murky and largely untested in Maryland courts.

Company officials confirmed that Waymo already has about a dozen cars in Baltimore driven by professional drivers to acquaint the technology with the city, but it is not yet offering rides to passengers. Officials said their entrance into the Baltimore market would be gradual and phased, meaning residents won’t have the chance to take a ride until late 2026 at the earliest.

Injured in a Crash Involving a Self-Driving Car?

If you or a loved one has been injured in an accident involving a self-driving vehicle, it’s important to understand your legal rights.

Autonomous vehicle cases can involve multiple liable parties and complex technology investigations. An experienced personal injury attorney can help determine who may be responsible and pursue compensation for medical expenses, lost income, and other damages.

If you’ve been hurt in a collision involving a self-driving vehicle, speaking with a qualified attorney can help you understand your options.

Protecting Baltimore’s Families

Waymo may eventually prove to be a transformative technology that makes our roads meaningfully safer. But “eventually” is not the same as “right now.” The incidents in Austin, Santa Monica, and cities across the country demonstrate that this technology still has serious, unresolved limitations, particularly in the kinds of chaotic, unpredictable situations that Baltimore streets will inevitably present.

As autonomous vehicles begin operating in our city, Baltimore residents deserve to know their rights. If you or a loved one has been injured in an incident involving a self-driving vehicle, contact The Law Offices of Nicholas A. Parr for a free consultation. We are here to help you navigate the road ahead.

DON’T WAIT ANY LONGER

Call today for a free consultation. We don’t receive a fee unless we win.
