tesla fsd beta v12 goes camera only

see also: Latency Budget · Platform Risk

Tesla pushed FSD Beta V12 to a wider set of drivers, leaning on cameras alone (radar is already gone from the sensor suite) and promising that an end-to-end neural network can learn the entire driving policy (Electrek). The release reignites the debate over whether cameras alone can deliver Level 4 behavior.

scene cut

V12 uses a single neural network to map camera input directly to steering and throttle, stripping out much of the hand-coded heuristics of earlier builds. Early drivers reported smoother turns, but also odd lane decisions in rain.
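
For intuition, a minimal sketch of what "camera input directly to controls" looks like as code, assuming a PyTorch-style model; the camera count, layer sizes, and two-value output are illustrative guesses, not Tesla's actual architecture.

```python
# Hypothetical sketch of an end-to-end driving policy: camera frames in,
# steering/throttle out. Shapes and layers are illustrative, not Tesla's.
import torch
import torch.nn as nn

class EndToEndPolicy(nn.Module):
    def __init__(self, num_cameras: int = 8):
        super().__init__()
        # Shared per-camera feature extractor (toy CNN backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse all camera features and regress controls directly:
        # no hand-coded lane or object heuristics in between.
        self.head = nn.Sequential(
            nn.Linear(64 * num_cameras, 256), nn.ReLU(),
            nn.Linear(256, 2),  # [steering, throttle]
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, cameras, 3, H, W)
        b, c, ch, h, w = frames.shape
        feats = self.backbone(frames.view(b * c, ch, h, w)).view(b, -1)
        return torch.tanh(self.head(feats))  # controls in [-1, 1]

policy = EndToEndPolicy()
controls = policy(torch.randn(1, 8, 3, 96, 96))  # -> tensor of shape (1, 2)
```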

signal braid

risk surface

  • Software regressions are harder to trace when the policy is a monolith (see the sketch after this list).
  • Any high-profile crash could trigger broad NHTSA action.
  • Tesla’s release cadence depends on real-world data; a pause in usage slows iteration.
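
To make the first bullet concrete: with a monolithic policy there is no individual heuristic to point at, so about the only regression signal is a behavioral diff between versions. A minimal sketch, assuming two policy checkpoints and a fixed bank of logged scenes; all names here are hypothetical, not Tesla tooling.

```python
# Hypothetical behavioral-diff check between two end-to-end policy versions.
# Flags logged scenes where control outputs diverge beyond a threshold.
import torch

def behavioral_diff(old_policy, new_policy, logged_scenes, threshold=0.1):
    """logged_scenes: iterable of (scene_id, frames) camera-tensor pairs."""
    regressions = []
    with torch.no_grad():
        for scene_id, frames in logged_scenes:
            delta = (new_policy(frames) - old_policy(frames)).abs().max().item()
            if delta > threshold:
                regressions.append((scene_id, delta))
    return sorted(regressions, key=lambda x: -x[1])

# Example, with the toy policy above standing in for two checkpoints:
# old, new = EndToEndPolicy(), EndToEndPolicy()
# scenes = [("rain_lane_change_042", torch.randn(1, 8, 3, 96, 96))]
# print(behavioral_diff(old, new, scenes))
```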

my take

Tesla is forcing autonomy to be a software product, but that raises the question of how you prove an end-to-end neural policy is safe in the first place.
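
One way to put a rough number on that question: driving without the bad event is the most direct evidence, and the miles add up fast. A back-of-the-envelope sketch using the rule of three (zero observed events in n trials bounds the rate below roughly 3/n at 95% confidence); the human baseline figure is an illustrative assumption, not a claim about Tesla's data.

```python
# Back-of-the-envelope: miles of failure-free driving needed to bound the
# failure rate below a target at ~95% confidence (rule of three: 3/n).
human_fatal_rate = 1 / 100_000_000   # assumed baseline: ~1 fatality per 100M miles
target_rate = human_fatal_rate       # "at least as safe as a human driver"

miles_needed = 3 / target_rate       # zero-failure miles for a 95% upper bound
print(f"{miles_needed:,.0f} failure-free miles")  # -> 300,000,000 miles
```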

linkage

linkage tree
  • tags
    • #autonomy
    • #automotive
    • #2023
  • related
    • [[gpt-4 release recalibrates hallucination debate]]
    • [[starship flight 3 balances rapid iteration]]

ending questions

What telemetry proof would convince regulators that V12’s camera-only policy is acceptably safe?
