A decade ago, self-driving cars were a glimpse of the future – sleek prototypes promising safer roads and fewer human errors. Today, that future is here. Companies like Waymo, Alphabet's autonomous-vehicle subsidiary, now operate fully driverless vehicles on California streets, transforming not only how we travel but also how we think about responsibility behind the wheel. But when a self-driving car is involved in a crash, one question cuts through the technology and innovation: who's liable?
Traditional car accidents are relatively straightforward: the driver at fault pays for the damages. In the world of autonomous vehicles, however, the answer is rarely that simple. Liability can shift between human passengers, vehicle manufacturers, software engineers, and even remote operators. Each collision becomes a complex intersection of product liability, negligence law, and cutting-edge technology. In California, where autonomous-vehicle testing and deployment are accelerating, this question is no longer theoretical. Real accidents involving Waymo and other driverless cars have already tested the boundaries of existing statutes and forced regulators, insurers, and courts to rethink how fault is assigned when the driver is a machine.
In this article, we’ll examine what makes these cases different, who can be held legally responsible, and how victims of Waymo-related crashes can pursue fair compensation under California law. As technology races ahead, the law must evolve to ensure that accountability, not automation, remains at the center of every collision.

Understanding Waymo and California’s Autonomous Vehicle Landscape
What Is Waymo and How Does It Operate?
Waymo LLC, a subsidiary of Alphabet Inc. (Google’s parent company), is one of the leaders in autonomous vehicle technology, often described as “the world’s most experienced driver.” Its vehicles operate using an intricate network of cameras, radar, and LiDAR sensors, paired with artificial intelligence capable of detecting road conditions, pedestrians, and other vehicles in real time.
Unlike semi-autonomous cars, which still rely on human drivers to monitor and intervene, Waymo’s vehicles can operate without any human driver present. These vehicles are part of an expanding fleet now providing fully autonomous rides in parts of Los Angeles, San Francisco, and Phoenix. Riders summon a Waymo through an app, much like Uber or Lyft, but instead of a driver arriving, an empty vehicle pulls up, ready to navigate on its own.
While the technology is remarkable, it introduces unprecedented legal challenges. When no one is behind the wheel, who is the driver under California law? The answer to that question sets the stage for determining liability when something goes wrong.
Autonomous Vehicles Under California Law
California has become one of the most active states in regulating autonomous vehicle testing and deployment, with oversight handled primarily by the California Department of Motor Vehicles (DMV) and the California Public Utilities Commission (CPUC).
The state requires companies like Waymo to obtain special permits, maintain detailed safety reports, and comply with insurance and liability standards. According to the DMV, the “operator” of a driverless vehicle is the entity that causes the autonomous technology to engage, rather than a human driver in the car. In other words, when a Waymo vehicle is driving itself, Waymo may be legally considered the operator, even if a human passenger sits inside. That distinction reshapes how fault is viewed in collisions involving driverless technology. Instead of focusing solely on driver negligence, courts and insurance adjusters must examine corporate responsibility, software integrity, and system performance.
How “Driverless” Really Works: Human Oversight vs. Full Automation
Although Waymo’s vehicles are designed for full autonomy, there are still layers of human oversight. During testing or certain ride operations, remote operators may monitor the vehicle’s activity from a distance, ready to intervene or shut down the vehicle if a malfunction occurs. However, these remote staff do not steer or control the car in real time; they oversee its behavior through alerts and data feeds.
This hybrid of human supervision and machine decision-making creates a gray area in California tort law. If an accident occurs, was it the result of a technical malfunction, inadequate human oversight, or an unpredictable third-party action? Determining liability means unraveling the vehicle’s internal data, from braking patterns to sensor readings, and translating that evidence into a narrative of fault.
Who Can Be Liable in a Waymo Accident?
Determining liability in a Waymo self-driving car accident is rarely straightforward. Unlike traditional car crashes, there may be no human behind the wheel, only a system of sensors, code, and decision-making algorithms. Yet under California law, someone must still be accountable. Whether that’s the passenger, Waymo itself, a third-party driver, or even the vehicle’s manufacturer depends on how the accident occurred and what the evidence reveals.
The Human Passenger or Safety Operator
Even in a driverless car, a human occupant might still share some level of responsibility. If the passenger interferes with the vehicle’s sensors, disables a safety feature, or fails to follow on-screen instructions during an emergency stop, their actions could contribute to the collision. Similarly, when a safety operator is present, typically during testing phases, California law may treat that person as the “driver” if they were supposed to monitor the vehicle’s performance and intervene when necessary. However, when the vehicle is operating in fully autonomous mode with no human oversight, liability generally shifts away from the occupant toward Waymo or its underlying technology systems.
The Vehicle Manufacturer or Waymo LLC
Waymo itself may bear significant liability if the crash stems from a defect in design, manufacturing, or the algorithms governing its autonomous system. Under California’s product liability laws, companies that design and market vehicles are responsible for ensuring their products are safe for consumer use. If a software error causes the car to misinterpret a traffic signal or fail to detect a pedestrian, that malfunction could be viewed as a defective product. The same principle applies to hardware failures, such as a braking system, LiDAR sensor, or camera that malfunctions. In these scenarios, Waymo or one of its component manufacturers may face responsibility for the harm caused by its technology.
Third-Party Drivers, Pedestrians, or Cyclists
Of course, not every collision involving a Waymo is the car’s fault. If another driver runs a red light, makes an illegal turn, or rear-ends the autonomous vehicle, traditional negligence rules still apply. The fact that one of the vehicles is self-driving doesn’t automatically make it liable. Similarly, a pedestrian or cyclist who suddenly darts into the road may be considered at fault if their unpredictable action made avoidance impossible, even for a car equipped with cutting-edge sensors. Waymo’s onboard data logs are often critical in these cases, revealing whether the vehicle reacted appropriately under the circumstances.
Product Defects, Software Errors, and Maintenance Failures
Liability can also extend to third-party manufacturers, programmers, or maintenance contractors. Autonomous vehicles rely on complex supply chains, from chipmakers to software vendors, and a defect anywhere in that system can have catastrophic results. For example, if faulty code from a subcontractor causes the vehicle to miscalculate speed or distance, that vendor might share in responsibility. Additionally, routine maintenance is crucial for these vehicles to operate safely. A missed software update or improperly calibrated sensor could create a dangerous malfunction, making the party responsible for upkeep, whether Waymo itself or a maintenance contractor, legally liable for the resulting injuries.
How California Law Assigns Fault in Autonomous Vehicle Collisions
Negligence and Comparative Fault in the Age of Automation
California’s legal framework for auto accidents still rests on a familiar foundation: negligence. To establish liability, a plaintiff must show that someone owed a duty of care, breached that duty, and caused harm as a result. The challenge in Waymo-related crashes is deciding who owed the duty: the human, the company, or the technology itself.
California follows a pure comparative negligence rule, meaning that fault can be divided among multiple parties in any proportion. Even if a plaintiff is partly responsible, they can still recover damages, but reduced by their assigned percentage of fault. In an autonomous vehicle context, that division might include not only human actors but also corporate entities and technology providers. For example, if a pedestrian is found 10% at fault for stepping outside a crosswalk, but the autonomous system failed to brake in time, Waymo could still bear 90% of the liability.
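To put hypothetical numbers on that example: if the pedestrian’s total damages were $100,000, pure comparative negligence would reduce their recovery by their 10% share of fault to $90,000, with the remaining liability falling on Waymo or its insurer.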
This flexible standard allows courts to adapt traditional negligence law to modern technology, though not without controversy. As vehicles become more independent, determining what counts as “reasonable care” for a self-driving system will be one of the defining legal questions of the decade.
The Role of Vehicle Data and Onboard Recording Systems
In conventional collisions, police reports, skid marks, and witness statements form the core of the investigation. In autonomous-vehicle cases, data becomes the new eyewitness. Waymo vehicles record enormous amounts of data, including GPS positioning, LiDAR imaging, steering input, braking force, and object recognition patterns. These digital “black boxes” can reconstruct exactly what the vehicle saw, how it reacted, and whether its behavior was consistent with safe operating standards.
However, accessing this information is not always straightforward. The data belongs to the company, and release often requires legal action. Attorneys must act quickly to preserve evidence before it’s overwritten or obscured by proprietary protections. That’s where experienced legal counsel becomes indispensable: knowing what to request, how to interpret the data, and how to use it to establish negligence or a product defect.
How Insurance Coverage Works in Self-Driving Car Claims
California law mandates minimum insurance requirements for both traditional drivers and companies operating autonomous fleets. Waymo carries its own commercial insurance coverage, which is meant to protect passengers, third parties, and property owners in the event of an accident. Yet, the claims process often becomes a maze of overlapping policies and disputed responsibility. A passenger might file a claim with Waymo’s insurer, while a third-party driver’s insurer disputes coverage, arguing that the self-driving system, not human error, was at fault. In other cases, an injured party may need to pursue recovery directly from Waymo through a personal injury or product liability lawsuit.
The blurred line between “driver” and “product” complicates every insurance determination. Without clear federal standards, these disputes are currently resolved on a case-by-case basis, relying on the skill of attorneys who understand both the technical and legal dimensions of autonomous-vehicle claims.
Common Scenarios and Liability Breakdowns
When a Waymo Vehicle Hits Another Car
Imagine a Waymo vehicle traveling through a Los Angeles intersection that misreads a flashing yellow signal and collides with a car making a lawful turn. In this case, the issue turns on whether the autonomous system’s sensors or software failed to interpret the traffic environment correctly. If so, Waymo, or one of its component manufacturers, may be liable under product liability law. However, if the other driver failed to yield, the blame may shift, at least in part, to that human operator. The key evidence will come from the car’s onboard data logs and the external camera footage that records how both vehicles behaved in the seconds before impact.
In these scenarios, the law does not presume that technology is flawless. A self-driving system is still expected to operate as a reasonably careful human driver would under similar circumstances.
When Another Driver Collides with a Waymo
In many cases, a Waymo vehicle is the victim, not the cause, of a crash. Suppose a distracted driver rear-ends a stationary Waymo at a red light. The self-driving vehicle may have done everything right, but the human error of another motorist still leads to an impact. In that event, the negligent driver’s insurance company would likely bear primary responsibility.
However, Waymo’s insurer may still become involved if passengers inside the autonomous vehicle sustain injuries. These overlapping claims can quickly grow complicated, as each insurer attempts to assign liability elsewhere. Understanding where to file, what evidence to preserve, and how to coordinate multiple insurance policies requires experienced legal navigation.
Pedestrian or Cyclist Accidents Involving Waymo Vehicles
Pedestrian and cyclist collisions are among the most challenging and high-stakes autonomous-vehicle cases. If Waymo fails to detect a person crossing at night or misjudges distance due to glare, the company could face serious liability for system error or inadequate design. At the same time, if a pedestrian crossed against a signal or appeared suddenly from between parked cars, fault could be shared. California’s comparative negligence doctrine means even pedestrians can bear partial liability when their conduct contributes to an accident.
What distinguishes these cases from ordinary collisions is the evidence: radar data, LiDAR imaging, and system decision logs reveal not just what the car did, but why it did it. That insight allows investigators to determine whether the vehicle’s reactions were appropriate or unreasonably delayed.
Software Malfunctions and Sensor Failures
Perhaps the most legally complex scenario arises when a software or sensor failure plays a role in the crash. A coding error could cause the car to misinterpret lane markings or freeze in response to conflicting data inputs. In such cases, responsibility might fall on Waymo’s software engineers or even an external contractor who supplied defective hardware.
Because self-driving vehicles rely on a vast digital ecosystem, pinpointing fault can resemble investigating a plane crash: every system, sensor, and update must be examined. When software is to blame, attorneys may pursue a product liability theory rather than simple negligence, focusing on design defects, testing protocols, or the company’s failure to warn consumers of known risks.
What to Do If You’re Involved in a Waymo-Related Crash
Stay Grounded and Document Everything
The moments after a collision with an autonomous vehicle can feel disorienting, especially when there’s no driver to speak with. If you’re involved in a crash with Waymo, the most important thing you can do is stay calm and treat the scene like any other accident: check for injuries, move to safety if possible, and call 911.
Once the scene is secure, begin documenting everything you can. Take photographs of the vehicles, road conditions, and any nearby traffic controls. Note the time, location, and weather, as these details can become critical later. If witnesses are present, gather their contact information immediately because memory fades, but a credible third-party account can be powerful evidence. Even though Waymo’s vehicles record detailed sensor and camera data, that evidence belongs to the company and may not be readily shared. Your personal documentation helps preserve an independent record that supports your version of the events.
Reporting and Preserving Evidence
You should still exchange information, though in this case, it will likely come from Waymo’s support team or a law enforcement officer rather than a driver. California requires autonomous vehicle operators to report any collision to the DMV within 10 days, and most incidents also trigger internal investigations. If you’re injured, seek medical attention immediately, even if symptoms seem minor. Many injuries, particularly whiplash or concussion, develop over time. Detailed medical records create a direct link between the accident and your condition, which is essential for proving damages.
Attorneys handling autonomous vehicle cases often send evidence preservation letters (sometimes called spoliation notices) to prevent companies from deleting or overwriting key vehicle data. Acting quickly ensures that information, from braking timestamps to sensor readings, is preserved before it’s lost to routine data cycles.
When to Contact a Personal Injury Attorney
Collisions involving Waymo and other self-driving cars are not ordinary claims. They require knowledge of technology, corporate liability, and California’s evolving regulatory landscape. Insurers may try to deflect responsibility or delay resolution while fault is investigated across multiple layers: human, mechanical, and digital. An attorney familiar with these cases can cut through that complexity. By understanding how to access vehicle data, interpret logs, and coordinate with technology experts, your lawyer can transform an opaque crash into a clear story of accountability.
If you’ve been hurt in a Waymo-related accident, reaching out to a legal team early allows them to begin preserving evidence and protecting your rights before key details disappear. The earlier your counsel is involved, the stronger and more credible your case will become.
How Vaziri Law Group Handles Autonomous Vehicle Cases
Investigating Corporate Liability and Data Evidence
At Vaziri Law Group, we approach autonomous vehicle accidents differently from conventional car crash claims. These cases require not only an understanding of negligence law but also the technical acumen to decode how automated systems make decisions. When a Waymo vehicle is involved in a collision, our investigation begins with one priority: data access.
Every Waymo car is a moving data center, recording its speed, sensor readings, braking intervals, and visual field at the moment of impact. Our attorneys work with accident reconstructionists and software experts to interpret this data, transforming lines of code into a clear story of cause and fault. When necessary, we issue preservation demands and subpoenas to ensure no evidence is lost or withheld under corporate confidentiality.
This proactive approach allows us to pinpoint whether the system malfunctioned, the oversight failed, or another driver caused the crash. In cases involving shared liability, that precision becomes the foundation of successful negotiation or litigation.
Navigating Insurance and Manufacturer Negotiations
Because self-driving technology sits at the intersection of insurance, corporate accountability, and consumer safety, no two claims look alike. Insurers often attempt to delay or deny coverage by citing “system error” or by shifting blame to third-party technology providers. Our firm cuts through those tactics with data-backed arguments and a meticulous command of California’s liability laws. We engage directly with both insurers and manufacturers, holding them to their legal obligations and preventing them from hiding behind emerging technology as a defense. Our goal is simple: to ensure that injured individuals are not left to absorb the costs of corporate experimentation gone wrong.
Advocating for Victims of Technology-Driven Negligence
Vaziri Law Group’s commitment to justice extends to the frontiers of modern transportation. We believe that innovation should never outpace accountability, and that every advancement must serve public safety before profit. When a Waymo crash disrupts your life, you deserve a legal team that understands both the human story and the digital evidence behind it. Our attorneys combine compassion with command, translating complex data into plain truth and using it to fight for fair compensation.
Each autonomous vehicle case we handle reinforces a principle that defines our firm: technology may change, but responsibility does not. Whether negligence takes the form of a distracted driver or a faulty algorithm, we pursue accountability with the same precision, intensity, and purpose that have shaped Vaziri Law Group’s success across California.
Conclusion: Accountability in the Age of Automation
The future of transportation is already here: sleek, silent, and guided by code. Yet behind every autonomous vehicle gliding through California streets remains an unchanging truth: accountability cannot be automated.
When a self-driving car causes harm, responsibility does not vanish into the algorithm that made the decision. It transfers, from human hands to corporate systems, from mechanical error to data-driven negligence. The law must keep pace with that transfer, ensuring that innovation never becomes an excuse for impunity.
At Vaziri Law Group, we see technology not as an obstacle but as a new frontier for justice. Our work on autonomous-vehicle cases stands on a simple belief: progress means little if it leaves the injured behind. Each Waymo crash represents not just a failure of technology, but a moment where the law must step in to reaffirm the value of human life and safety.
So if you’ve been hurt in an accident involving a self-driving car, don’t think of your case as a legal question about machines. Think of it as a statement about accountability: about who answers when technology gets it wrong. Tell your story so we can preserve your rights. And remember that while the vehicles may drive themselves, justice still requires someone at the wheel.