Waymo vs Tesla Autonomy — Two Fundamentally Different Approaches to Risk

Business | SaaS | Technology
Apr 21, 2026

AUTHOR

James A. Wondrasek

If you’re evaluating autonomous vehicle technology in 2026, you’re really asking one question: who carries the liability when something goes wrong? That’s the question this article is going to answer. We’ll cover SAE Level 2 versus Level 4, sensor fusion versus camera-only, California’s December 2025 deceptive marketing ruling against Tesla, the Austin safety monitor removal, and what the safety data actually says. For the broader picture on the robotaxi market as a whole, it’s worth getting that context first.

The short version: choosing between Waymo and Tesla is a liability alignment decision. Not a feature comparison.


Is comparing Waymo and Tesla really a question of risk posture rather than performance specs?

Waymo and Tesla operate at different SAE automation levels. That difference is what determines who is on the hook legally when things go wrong.

Waymo is a commercial SAE Level 4 operator. Within its defined operational domain, the vehicle handles everything — no human required. Liability points at Waymo.

Tesla’s FSD (Full Self-Driving, Supervised) is SAE Level 2. The human operator is legally responsible at all times. Tesla has built its litigation strategy entirely around this structure: when incidents occur, the driver was responsible.

This isn’t a gap that more training data closes. It’s a categorical regulatory classification with direct legal consequences. For context on where each company currently operates, the geographic deployment picture shows just how differently these two companies have approached getting to market.


What do SAE Level 2 and Level 4 actually mean — and who pays when something goes wrong?

SAE Level 2 (partial automation): the vehicle assists with steering, acceleration, and braking. The human driver must stay engaged and legally responsible at all times.

SAE Level 4 (high automation): within a defined Operational Design Domain (ODD), the vehicle handles all driving tasks autonomously. If something goes wrong inside that ODD, the manufacturer wears the liability — not the occupant.

At Level 2 — which is where Tesla FSD Supervised sits — your employee is the legally responsible driver. If a company employee using Tesla FSD is involved in a serious incident, your organisation’s legal and insurance exposure comes in through duty-of-care obligations.

At Level 4, with Waymo, liability routes to Waymo and Alphabet. The passenger is not the driver.

Waymo’s ODD is explicit, published, and certified: defined geographic areas, road types, speed ranges, weather conditions. Tesla’s FSD (Supervised) has no formal ODD at all. The driver remains responsible regardless of conditions, and there is no bounded domain within which Tesla accepts manufacturer liability.

For the wider regulatory picture across the autonomous vehicle landscape, the liability framework is still evolving — but Level 2 versus Level 4 is the clearest structural marker in what currently exists.


How does Tesla’s camera-only approach differ from Waymo’s sensor fusion stack?

Waymo uses sensor fusion: LIDAR, radar, and cameras all running simultaneously, cross-validating each other. LIDAR gives you precise three-dimensional spatial data in conditions where cameras fall down — low light, glare, fog. If one sensor throws an anomalous reading, the others provide independent redundancy.

Tesla uses a camera-only architecture. No LIDAR, no radar. The system interprets the entire driving environment through cameras trained on supervised fleet data.

Tesla’s argument for going camera-only: human drivers navigate with eyes; cameras are cheaper to scale; 7.4 billion FSD miles produce a training dataset no purpose-built rival can match. That’s the data flywheel thesis.

The counter-argument: camera-based systems have failed to detect objects as large as trucks under degraded conditions — a failure mode LIDAR eliminates by design. A single-modality system has no independent check on a perception error.

There’s also a problem with the flywheel. Those 7.4 billion supervised miles were driven with a human ready to intervene before a situation became an incident. Driverless miles are a more informative safety signal because the system has to resolve every situation alone — no backstop.

Camera-only means the system must be right every time. Sensor fusion means a camera failure doesn’t have to become a safety event.

For a deeper technical look at why Waymo’s simulation investment matters, that’s covered separately.


What does “FSD Unsupervised” mean — and what did the California DMV’s deceptive marketing ruling find?

Tesla markets Full Self-Driving (Supervised). The “(Supervised)” qualifier is legally significant: the driver must stay attentive and ready to intervene, placing it firmly at SAE Level 2. “FSD Unsupervised” — a version that operates without a human driver — is Tesla’s stated goal, not a commercially certified product. In January 2026, Elon Musk said Tesla still needs roughly 10 billion miles before achieving safe unsupervised self-driving.

“Full Self-Driving” was a consumer product name for years before Tesla added “(Supervised),” and the confusion that created was widespread.

The California DMV ruled on this directly. In November 2023, the DMV accused Tesla of deceptive marketing: Tesla had advertised “Autopilot” and “Full Self-Driving Capability” as capable of conducting trips with “no action required by the person in the driver’s seat” — a claim the vehicles could not deliver. After an administrative hearing and ALJ ruling in 2025, the DMV issued its final decision on December 16, 2025, threatening a 30-day California licence suspension.

The ruling found deceptive marketing — not that the technology is unsafe. That distinction matters: it’s a formally adjudicated finding that Tesla’s marketing claims exceeded what the technology could deliver. A direct vendor credibility input for any procurement process.


Why did Tesla discontinue Autopilot — and what does the California DMV ruling signal?

After the December 2025 ruling, Tesla discontinued the Autopilot brand name in early 2026. The compliance timeline is worth paying attention to.

It took more than two years from accusation to compliance: marketing overclaims → partial correction (adding “(Supervised)” to FSD) → continued resistance on Autopilot → administrative hearing → judicial ruling → threatened sanctions → compliance. How a vendor responds to regulatory pressure tells you something. In this case it tells you quite a bit.

California sets national precedent on vehicle regulation. If this marketing standard extends to other major states, a Tesla FSD-based enterprise transport programme faces regulatory exposure that Waymo — built on formal certification — does not.

And discontinuing Autopilot doesn’t change what FSD (Supervised) actually is. The SAE Level 2 classification and liability posture are unchanged. One misleading brand name is gone.


How is Tesla removing safety monitors from its Austin robotaxi rides — and what does that approach reveal?

Tesla launched its Austin robotaxi service on June 22, 2025 with a human safety monitor in every vehicle. In January 2026, Elon Musk announced rides with no one in the car had begun, and Tesla’s VP for autonomy noted the unsupervised ratio would “increase over time.”

The backdrop to this decision is not reassuring. Tesla’s Austin fleet recorded 9 crashes between July and November 2025 under NHTSA’s Standing General Order — the federal requirement that compels AV operators to publicly report crashes. That’s roughly one crash every 55,000 miles. Human drivers average one police-reported crash every 500,000 miles. Tesla’s monitored fleet was crashing nine times more frequently than the human baseline — with a monitor in the car the whole time.
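The arithmetic behind that comparison is straightforward. A minimal sketch, using the figures reported above (the ~500,000-mile fleet total is the approximate figure implied by the reported rate):

```python
# Crash-rate comparison from the reported figures.
tesla_crashes = 9              # NHTSA SGO-reported crashes, Jul-Nov 2025
tesla_fleet_miles = 500_000    # approximate Austin fleet miles in that window

human_miles_per_crash = 500_000  # human baseline: one police-reported crash per ~500k miles

tesla_miles_per_crash = tesla_fleet_miles / tesla_crashes  # ~55,556 miles
ratio = human_miles_per_crash / tesla_miles_per_crash      # crash frequency vs baseline

print(f"Tesla: one crash every {tesla_miles_per_crash:,.0f} miles")
print(f"Crash frequency vs human baseline: {ratio:.1f}x")
```

Which is where the "roughly one crash every 55,000 miles" and "nine times the human baseline" figures come from.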

Waymo removed monitors incrementally, and only after demonstrating crash rates below the human baseline in each market. Tesla began removing the human backstop while its incident rate remained well above that baseline.

There’s also a structural liability gap worth noting. Removing a safety monitor from a non-Level-4-certified system leaves no human backstop — but without Level 4 certification, manufacturer liability doesn’t clearly attach either.

Transparency matters here too. Every Tesla crash narrative in the NHTSA public database is fully redacted. Waymo and other operators publish full incident narratives.


What does the safety record comparison actually show — and what does it not show?

Waymo’s published safety data through December 2025 covers 170.7 million rider-only (driverless) miles. The Waymo Driver recorded 92% fewer serious-injury crashes compared to human drivers in those same cities — independently validated by Swiss Re through insurance claims analysis.

Tesla’s Austin pilot data shows 9 NHTSA-reported crashes over approximately 500,000 miles through November 2025 — the nine-times-worse-than-human figure — with monitors present throughout.

These figures look comparable. They are not. Waymo’s benchmark comes from driverless miles with no human backstop. Tesla’s miles are supervised — a human was present who could intervene, meaning the system’s unassisted capability in edge cases was never fully tested.

Waymo has a published, independently validated driverless safety record above the human baseline. Tesla does not. Any procurement process that treats these as equivalent evidentiary standards is making a category error.


What does the Waymo–Tesla divergence mean for procurement and vendor selection?

Choosing between Waymo and Tesla is not choosing between two versions of the same product at different maturity levels. It is choosing between two different liability postures, two different regulatory histories, and two different risk management philosophies.

At SAE Level 4 (Waymo): the vendor owns liability for incidents within its ODD. Your organisation’s exposure routes through contract terms, not employee driving responsibility. Waymo operates commercially in six markets and raised $16 billion in February 2026 — Alphabet’s long-term commitment is not ambiguous.

At SAE Level 2 (Tesla FSD Supervised): your employees are legally the responsible drivers. Your organisation may carry exposure through duty-of-care obligations. The vendor’s regulatory history includes a formally adjudicated deceptive marketing finding, and no equivalent driverless safety record is published.

A structured due diligence process should cover four areas. First, verify SAE level and certifying body — Level 4 with a published ODD differs materially from Level 2 regardless of naming. Second, review regulatory history — the California DMV timeline is public record. Third, examine safety data methodology — supervised versus driverless miles, conditions, independent validation. Fourth, confirm ODD alignment — verify your use case falls within the certified domain before assuming Level 4 liability protections apply.

If your organisation’s risk frameworks require a clear, documented liability assignment, Level 4 operation with manufacturer liability is the structurally cleaner choice. Level 2 retains driver — and potentially employer — liability regardless of how capable the assistance system becomes. For what these risk postures look like when incidents actually occur, the real cases that make the abstract liability question tangible are examined separately. And for the broader robotaxi deployment context, the full market picture situates where both companies are headed.


Comparison: Waymo vs Tesla 2026

SAE level
  Waymo: Level 4 (within ODD)
  Tesla FSD (Supervised): Level 2

Sensor stack
  Waymo: LIDAR + radar + cameras
  Tesla FSD (Supervised): Camera-only

Driver required
  Waymo: No
  Tesla FSD (Supervised): Yes, legally responsible at all times

Liability (within ODD)
  Waymo: Manufacturer (Waymo/Alphabet)
  Tesla FSD (Supervised): Driver / operator

Operational Design Domain
  Waymo: Explicit, published, certified
  Tesla FSD (Supervised): None

California DMV ruling
  Waymo: Not applicable
  Tesla FSD (Supervised): December 2025 deceptive marketing finding

Published driverless safety data
  Waymo: Yes (170.7M rider-only miles; 92% fewer serious-injury crashes; Swiss Re validated)
  Tesla FSD (Supervised): No equivalent published

Safety monitor approach
  Waymo: Removed incrementally after safety milestones
  Tesla FSD (Supervised): Began removal with crash rate ~9x human baseline

NHTSA incident transparency
  Waymo: Full narratives published
  Tesla FSD (Supervised): All narratives redacted


FAQ

Is Tesla FSD actually Level 4 autonomy?

No. FSD (Supervised) is SAE Level 2 — the driver must remain attentive and legally responsible at all times. Tesla’s goal is Level 4 (“FSD Unsupervised”), but as of 2026 it has not received commercial regulatory certification. In January 2026, Elon Musk stated Tesla still needs roughly 10 billion total supervised miles before achieving safe unsupervised operation.

Why did California say Tesla’s marketing was deceptive?

The California DMV ruled in December 2025 that Tesla violated California state law by using “Autopilot” and “Full Self-Driving Capability” to imply autonomous capability the vehicles could not deliver. Tesla had advertised the system could conduct trips with “no action required by the person in the driver’s seat” — a claim the vehicles could not then, and cannot now, fulfil. The ruling addressed marketing claims, not the technology’s safety record.

Does Waymo use LIDAR and Tesla doesn’t — does that matter?

Yes, meaningfully. LIDAR provides precise three-dimensional spatial data in conditions where cameras degrade — low light, fog, glare. Without LIDAR, a perception error in Tesla’s vision-only system has no independent sensor to cross-validate against. In Waymo’s sensor fusion design, a camera failure does not propagate to a safety event because LIDAR and radar provide independent readings.

What is the difference between “supervised” and “driverless” miles?

Supervised miles (Tesla’s 7.4B FSD miles) are accumulated with a licensed human driver present and able to intervene. Driverless miles (Waymo’s 170.7M rider-only miles) are accumulated with no human present — the system resolved every situation independently. They are not equivalent safety metrics: supervised miles include human interventions that prevented incidents, so the system’s unassisted capability in those situations was never fully tested.

Who is legally responsible if a Waymo vehicle is involved in a crash?

Within its Operational Design Domain, Waymo (Alphabet) bears manufacturer liability — the passenger is not the responsible driver. Outside the ODD, liability becomes more complex and depends on circumstances and applicable state law. This is why ODD alignment matters for enterprise deployment: confirm your use case falls within the certified domain before assuming manufacturer liability applies.

How should a procurement team evaluate AV vendor risk?

Four steps: (1) Verify SAE level and certifying body — Level 4 with a published ODD differs materially from Level 2 regardless of product naming. (2) Review regulatory history — the California DMV timeline (2023–2026) is public record and documents the full pattern. (3) Examine safety data methodology — supervised versus driverless miles, conditions, and independent validation. (4) Confirm ODD alignment — verify the vendor’s certified domain covers your actual deployment use case before assuming Level 4 liability protections apply.
