LegalReader.com  ·  Legal News, Analysis, & Commentary

The Legal Challenges Facing Tesla and Waymo’s Robotaxis


— March 13, 2025

For Tesla, Waymo, and other AV companies, navigating these legal roadblocks will be just as crucial as improving the technology itself.


The race to bring fully autonomous ride-hailing services to the streets is heating up, with companies like Tesla and Waymo leading the charge. Waymo, Alphabet’s self-driving subsidiary, has already launched its robotaxi services in multiple U.S. cities, while Tesla is preparing to roll out its Cybercab ride-hailing service in Austin later in 2025.

But as exciting as these advancements are, the legal landscape surrounding autonomous vehicles (AVs) is anything but simple. From state-by-state regulations and liability concerns to safety standards and privacy issues, Tesla and Waymo face significant legal roadblocks that could determine whether robotaxis become mainstream—or remain stuck in a regulatory quagmire.

Navigating Different State and Federal Laws

One of the biggest challenges facing robotaxis is the lack of consistent national regulations. Instead, AV companies must navigate a confusing mix of federal, state, and local laws, which vary widely depending on where they operate.

California vs. Texas

  • California has some of the strictest AV regulations in the U.S., requiring companies to obtain permits, report disengagements (when a human must take control), and disclose accident data. This has made it harder for companies like Tesla to deploy their robotaxi services there.
  • Texas, on the other hand, has positioned itself as an AV-friendly state, allowing companies to operate without extensive local restrictions. This is why both Waymo and Tesla have chosen Austin as a testing ground. Texas law even prevents local governments from restricting AV operations, making it an attractive alternative to California’s red tape.

Federal Oversight

The National Highway Traffic Safety Administration (NHTSA) oversees vehicle safety at the federal level but has yet to establish comprehensive AV regulations. Instead, it has primarily acted reactively, investigating Tesla’s Autopilot crashes and monitoring AV deployments without imposing strict nationwide rules. This regulatory uncertainty leaves companies like Tesla and Waymo operating in a legal gray area.

Who’s at Fault in an Accident?

One of the most complicated legal questions surrounding robotaxis is liability—who is responsible when a driverless car crashes?

  • For traditional vehicles, liability typically falls on the driver. But in the case of AVs, is it the vehicle owner, the manufacturer, or the software developer?
  • Texas law states that the owner of an AV is responsible for its compliance with traffic laws. But this could become problematic if Tesla or Waymo claims it is not liable for software errors.

Waymo, which has accumulated millions of autonomous miles, has been involved in multiple minor accidents, many caused by human drivers. However, as the technology expands, legal challenges regarding responsibility and compensation for accident victims will become even more critical.

Tesla, meanwhile, has consistently denied liability for crashes involving its Autopilot and Full Self-Driving (FSD) systems, arguing that drivers are ultimately responsible—even as it markets its software as an advanced autonomous system. This contradiction could lead to legal battles as Tesla’s robotaxis begin operating in public spaces.

Safety Standards and Oversight

While AVs promise to reduce traffic fatalities caused by human error, skepticism remains about their real-world safety performance.

Waymo has conducted extensive safety testing, with studies showing an 88% reduction in property damage claims and a 92% reduction in bodily injury claims compared to human-driven vehicles. However, accidents involving Waymo vehicles still happen. Most recently, a Waymo was involved in a fatal crash in San Francisco, but investigations found that it was stationary and not at fault when a speeding Tesla collided with it.

National Highway Traffic Safety Administration logo; image courtesy of the U.S. Government via Wikimedia Commons, https://commons.wikimedia.org

Tesla faces even greater safety scrutiny, as its FSD software has been linked to multiple crashes. NHTSA data shows that 299 of Tesla’s 2,621 reported ADAS-related crashes since 2021 have occurred in Texas, where it plans to launch its Cybercab service.

As more AVs hit the road, regulators will need to strengthen oversight, requiring companies to disclose crash data and safety performance metrics. A key debate is whether AVs should be required to pass a driving test—similar to human drivers—before being allowed on public roads.

Who Owns the Data?

Robotaxis don’t just transport passengers—they collect massive amounts of data about their surroundings, traffic patterns, and even rider behavior. But this raises serious privacy concerns:

  • Who owns the data collected by AVs? Is it the passenger, the manufacturer, or a third-party service?
  • How is the data being used? Are companies monetizing it, and if so, are passengers aware?
  • What happens in case of a security breach? Could cyberattacks on AV fleets compromise passenger safety?

Waymo and Tesla must navigate growing concerns over data security and passenger privacy, particularly as states consider tighter consumer protection laws.

Ethical and Social Implications

Beyond legal and regulatory concerns, the rise of robotaxis presents major ethical and social dilemmas:

Job Displacement

One of the most significant concerns is the impact on jobs. If autonomous ride-hailing services become widespread, they could displace millions of drivers, including Uber and Lyft drivers, taxi operators, and truck drivers (if AV technology expands to freight transport).

How Should AVs Make Ethical Decisions?

Another unresolved issue is how autonomous vehicles should behave in emergency situations. If a crash is unavoidable, should the car prioritize protecting its passengers or pedestrians? Who decides what ethical framework AVs should follow?

These questions remain largely unanswered, but as robotaxis continue to expand, they will become more urgent.

The Legal Roadblocks Ahead

The future of robotaxis is full of promise, but legal and regulatory challenges could slow their rollout. Tesla and Waymo face significant hurdles, including:

  • Regulatory inconsistencies across states and a lack of federal guidelines.
  • Unclear liability rules, making it difficult to assign fault in accidents.
  • Ongoing safety concerns, particularly for Tesla, as regulators investigate FSD-related crashes.
  • Privacy and data security risks, which could lead to stricter regulations.
  • Ethical dilemmas surrounding job loss and decision-making in emergencies.

For Tesla, Waymo, and other AV companies, navigating these legal roadblocks will be just as crucial as improving the technology itself. Whether they succeed will depend on how well they address these challenges—and how willing governments are to adapt the laws of the road to a driverless future.
