Robotaxis should be licensed like drivers
Autonomous vehicles are coming to Portland. They should follow the rules of the road before being allowed to drive on them.
Waymo is coming to Portland. The mayor supports it, the company started mapping streets on April 28, and Oregon House Bill 4085 died in committee earlier this year. The deployment is happening regardless of what anyone writes about it. The question worth asking is what we demand from them when they get here.
Anyone driving a Waymo on a Portland street should hold an Oregon-issued autonomous vehicle license, work as a W-2 employee of the company, be physically located in the United States, and have every decision they make logged for audit.
When a Waymo encounters a situation it can't resolve — a construction zone, an obstruction, a stopped school bus — the car phones home. A human "fleet response agent" reviews the camera feeds and provides input on lane selection, path planning, whether to proceed. That human currently doesn't need a driver's license, doesn't need to be in the United States, and isn't disclosed by the company. During a February 2026 Congressional hearing, Waymo confirmed some of its remote operators are based in the Philippines. Senator Ed Markey warned that overseas remote operations could give hostile actors driver-like control over American vehicles.
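The escalation flow described above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not Waymo's actual system: every name here (`Scene`, `handle`, the fleet-response agent) is an assumption made for clarity.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    description: str
    resolvable_onboard: bool  # can the driving stack handle this alone?

def handle(scene, fleet_response):
    """If the car can't resolve the scene, it 'phones home' for human guidance."""
    if scene.resolvable_onboard:
        return "autonomous: continue"
    # A remote agent reviews the feeds and provides input on what to do next.
    guidance = fleet_response(scene)
    return f"remote guidance: {guidance}"

# Toy fleet-response agent standing in for the human reviewer.
agent = lambda scene: "reroute around construction zone"

print(handle(Scene("construction zone blocking lane", False), agent))
```

The point the sketch makes is structural: the decision in the unresolvable case is produced by a human, routed through software. That human is the driver in every sense except the legal one.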
"Fleet response," "rider assistance," "contextual guidance" — the marketing language describes the act of driving, performed by a human, made remote. The legal category hasn't caught up, and the companies have spent considerable effort making sure it doesn't.
The Remote Vehicle Operator Licensing Act we drafted at State Capacity AI closes the gap with five provisions:
A new license endorsement. A Class R endorsement added to any existing state-issued driver's license. Written exam, practical assessment using simulated driving environments, background check equivalent to commercial drivers, four-year renewal. Every state already has a DMV that issues endorsements, so implementation is fast.
A US location requirement. Anyone providing remote operational guidance to a vehicle on Oregon roads must hold a US state-issued license and be physically located in the United States. Companies can't route remote operations overseas. Someone driving a car on an American road needs to be reachable by American law enforcement and accountable under American liability frameworks.
A W-2 employment requirement. Remote operators must be employees, not contractors. The contractor model has been used across the platform economy to push liability onto individuals and away from the firms whose business models depend on their work. Vicarious liability requires employment.
A decision logbook. Every remote intervention logged with operator ID, timestamp, input, vehicle, decision. Cryptographically signed and uploaded to a state repository. The model is borrowed from Electronic Logging Devices used by long-haul truckers.
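A minimal sketch of what a tamper-evident log entry could look like, using HMAC-SHA256 from Python's standard library. The field names, the key-handling scheme, and the signing approach are all illustrative assumptions, not anything specified by the draft act; a real system would use per-operator keys provisioned by the state and asymmetric signatures rather than a shared secret.

```python
import hashlib
import hmac
import json
import time

# Placeholder only: a real deployment would provision per-operator keys securely.
OPERATOR_KEY = b"issued-by-state-dmv"

def log_intervention(operator_id, vehicle_id, decision):
    """Build a decision-log entry and sign it so later tampering is detectable."""
    entry = {
        "operator_id": operator_id,
        "vehicle_id": vehicle_id,
        "decision": decision,
        "timestamp": time.time(),
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["signature"] = hmac.new(OPERATOR_KEY, payload, hashlib.sha256).hexdigest()
    return entry

def verify(entry):
    """Recompute the signature over everything except the signature itself."""
    sig = entry.pop("signature")
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["signature"] = sig
    expected = hmac.new(OPERATOR_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

record = log_intervention("OR-R-10423", "WM-7731", "proceed past stopped vehicle")
print(verify(record))  # True for an unmodified record
```

The design mirrors the ELD model the provision cites: the record is useless as evidence unless an auditor can prove it wasn't edited after the fact, which is what the signature provides.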
Real penalties. Graduated penalties for unlicensed operations, license points for operators who authorize illegal actions, DMV authority to revoke a company's authorization to operate.
Other states are passing pieces of this. Texas HB 4402 and SB 2807 require DMV authorization for commercial autonomous operations and let the state revoke authorization if a company endangers the public. California AB 1777 treats driverless cars as if a person is behind the wheel for traffic citations and requires a two-way voice communication device that reaches a remote operator within 30 seconds.
The reason we keep ending up with extractive platforms running our cities is that we let companies brute-force the law. Uber operated illegally in dozens of cities for years, paid the fines as a cost of doing business, and rewrote the regulations once they had enough riders to threaten the politicians enforcing existing rules. The pattern works because billions of dollars absorb the cost of breaking the law until the law gets rewritten. A normal business or a normal person trying the same approach gets shut down within a week.
The asymmetry is the entire game.