I guess the only option they have is "pull over", which in this case just caused the car to keep circling, looking for a safe place to pull over. If they had an actual kill switch, we'd probably be watching another video of some guy on a call to Waymo support while stuck in the middle of a highway.
To be clear, I'm talking specifically about the first line of support at Waymo here. I am not precluding that they have higher levels of control behind layers of authorisation.
Yes, in much of the world there are mandatory passenger-facing emergency brake levers in every carriage of passenger trains. The US is the outlier here.
And yes, passengers should absolutely be able to bring their vehicle to an immediate stop. It's an "emergency brake"! Of course you need an emergency brake in an autonomous car! What exact alternative are you proposing for when you're in an AI-operated car hurtling under the chassis of a white truck that it failed to detect in snowy conditions?
It seems like an incredibly obvious and basic legislative requirement for self-driving cars to have some kind of immediate manual brake for emergencies. I'm kind of shocked that this apparently isn't already the case?
Sounds likely, in which case there needs to be a much more "break glass in case of emergency" style of control that gradually lowers the vehicle's maximum speed cap.
So even if the vision/pathfinding believes there is nowhere to park and nowhere else to turn, it will still coast to a stop in a way that is no less safe than a normal car running out of gas and stalling on the road.
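As a rough illustration of what that "break glass" control could look like, here's a minimal sketch (all names and numbers are hypothetical, not anything Waymo actually does): instead of an instant cut-out, the emergency request ramps the vehicle's speed cap down to zero, so the planner is forced to bring the car to rest no matter what it thinks about parking options.

```python
def emergency_speed_cap(elapsed_s: float,
                        initial_cap_mps: float,
                        ramp_rate_mps2: float = 1.5) -> float:
    """Speed cap (m/s) at `elapsed_s` seconds after the emergency control is used.

    The cap decreases linearly at `ramp_rate_mps2` until it hits zero, at which
    point the vehicle has no option but to stop where it is.
    """
    return max(0.0, initial_cap_mps - ramp_rate_mps2 * elapsed_s)


if __name__ == "__main__":
    # Example: a car travelling at 20 m/s (~45 mph) would be forced to a
    # standstill roughly 13-14 seconds after the control is triggered.
    for t in range(0, 16, 3):
        print(f"t={t:2d}s  cap={emergency_speed_cap(t, 20.0):5.1f} m/s")
```

The ramp rate is the interesting knob: fast enough that the passenger isn't stuck in a runaway car, slow enough that the stop itself isn't the hazard.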