There are tens of thousands of people either walking or driving the streets of Mountain View and Sunnyvale where Google is testing its self-driving cars. So, the members of the public are at risk.
But I understand that you mean behind the wheel. Which is funny, because the Google cars do not have a wheel. As you may know, instead of a traditional steering wheel they have a horizontally mounted, valve-style wheel with a handle. It looks a little bizarre when you first see it, but I would imagine it could be a safer, smoother alternative to the current standard steering wheel.
Unfortunately, I couldn't quickly provide an image; I've seen it multiple times on my walk to and from work.
> There are tens of thousands of people either walking or driving the streets of Mountain View and Sunnyvale where Google is testing its self-driving cars. So, the members of the public are at risk.
There is a huge difference between Google operating a handful of cars in certain areas and Tesla selling 10-20 thousand vehicles per month that can be operated by untrained consumers anywhere in the US.
Tesla is using its customers as data-generating guinea pigs. In return, Tesla may become a guinea pig itself by showing other companies how not to progress towards autonomous vehicles.
Licensed drivers. Drivers who have a license that indicates that they will take responsibility for the actions of any vehicle they control and understand how to operate any features which may put themselves or others at risk.
> Drivers who have a license that indicates that they will take responsibility
The fact that customers are licensed does not mean the seller is free from any and all regulation. Gun owners are licensed, and guns are still required to have safety switches. Cars have a long history of being regulated [1].
Basically, anything that can contribute to deaths is going to be monitored closely by consumer-protection bureaus, and will probably be heavily regulated.
I didn't mean to imply otherwise. Tesla probably could do a much better job of informing its customers what they are getting into, because they certainly aren't in a position to figure it out themselves. And Tesla can definitely make its software better.
What I'm saying is, if your car injures someone while you sit in the driver's seat, and you could easily have taken a foreseeable action to prevent it, then you are at least partially responsible.
> What I'm saying is, if your car injures someone while you sit in the driver's seat, and you could easily have taken a foreseeable action to prevent it, then you are at least partially responsible.
Oh absolutely. No question there for me at this time.
There may yet be some class action or something that reveals some unjust action by the driver-assist companies. I agree that right now, all other things being perfect, if a driver of one of these cars is in an accident, then some person is responsible.
Also, Google doesn't drive its test cars at highway speeds with humans inside. The Google self-driving car's speed is limited to 25 mph, so even if it hit something, the potential for a lethal outcome is very slim.
Google has two basic models of self-driving car. One is a modified standard car, and can drive at freeway speeds. If you drive around Mountain View, CA, you'll probably see some of them. The other is a little electric vehicle with no steering wheel, limited to 25 mph. The Computer History Museum in Mountain View has one on display.
https://www.google.com/selfdrivingcar/reports/