Reasoning about what counts as a legitimate obstacle to swerve for can be a really fine line. Oncoming car, raccoon, fawn, human, doe, coyote, elk: some of these obstacles might actually be net-safer to not swerve around [perhaps depending on speed].
I'd be really curious if any of the autopilots have a value system that tries to minimize damage/loss of life. And how does it value the exterior obstacle against the passengers in the car?
I'm pretty sure that hitting a full-grown elk is a poor choice, and often a fatal one. A large, full-grown elk can weigh 600 pounds and stand 5 feet tall at the shoulder, which means it will probably go over the crumple zone and directly strike the windshield.
This happens commonly with moose in New England, which run up to 800 pounds and 6.5 feet tall at the shoulder. A car will knock their legs out from under them but the body mass strikes much higher and does an alarming amount of damage to the passenger compartment. You can Google "moose car accident" and find some rather bloody photos.
The standard local advice is to try really hard to avoid hitting moose, even at the cost of swerving. I would prefer that an autopilot apply a similar rule to large elk. Even if you wind up striking another obstacle, you at least have a better chance of being able to use your crumple zone to reduce the impact.
In this situation, small-to-normal passenger cars are actually much safer than typical American trucks and SUVs. When the legs are knocked out, the mass of the body strikes the roof of the lower vehicle and bounces over the top. The larger vehicles direct the body through the windshield into the passenger compartment.
I'm not sure about moose, but with more nimble animals, swerving is no guarantee of avoiding collision. A deer can change direction quicker than an automobile, and if it had any sense, it wouldn't be running across the road in the first place. It's definitely safer to slow down and honk your horn than it is to try to guess which direction the damn thing will jump.
[EDIT:] When I was learning to drive, my father often recounted the story of a hapless acquaintance who, in swerving to avoid a snake, had precipitated a loss of control that killed her infant child. He's not a big fan of swerving for animals.
Moose and elk don't 'bounce over the top' of a small vehicle. They crush it, rip the roof off, or go through it. It may depend on antlers, angles, speed, and so on.
I don't think there is much to be gained in terms of safety. Let's not forget that while the largest moose are enormous, younger animals cover a whole range of smaller sizes and can still damage smaller cars fairly easily.
There is lots of ambiguity here. Depends on how you hit it, etc.
Roof landings don't always end well -- my Civic was trashed when a deer basically bounced up and collapsed my roof on the passenger side. My head would have been trashed as well had I been a passenger!
In general, if you cannot stop, hitting something head-on is the best bet. Modern cars are designed to handle that sort of collision with maximum passenger safety.
The cost of swerving is very, very minimal (especially in a modern car that won't go sideways no matter what you do).
Imagine a two-lane road with a car traveling toward an obstruction in its lane. The distance required to get the car out of the obstructed lane via the steering wheel is much less than the distance required to stop it via the brakes.
There's a reason performance driving treats the brakes as an aid to the steering wheel rather than as a standalone system.
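A rough back-of-the-envelope sketch of that claim, in Python, with made-up but plausible numbers (it ignores reaction time, combined tire limits, and the return maneuver):

    # How far the car travels while braking to a stop vs. while steering
    # far enough sideways to clear an obstacle. All figures are illustrative
    # assumptions, not measurements.
    from math import sqrt

    v = 30.0         # speed, m/s (~67 mph)
    a_brake = 8.0    # braking deceleration, m/s^2 (~0.8 g, dry pavement)
    a_lat = 5.0      # sustained lateral acceleration, m/s^2 (~0.5 g)
    clearance = 2.0  # sideways offset needed to miss the obstacle, m

    braking_distance = v ** 2 / (2 * a_brake)    # classic v^2 / 2a
    t_swerve = sqrt(2 * clearance / a_lat)       # time to move sideways
    swerve_distance = v * t_swerve               # forward travel meanwhile

    print(f"distance to stop:   {braking_distance:5.1f} m")   # ~56 m
    print(f"distance to swerve: {swerve_distance:5.1f} m")    # ~27 m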
Yes, I included it precisely because there's no reason to ever hit an elk. "Some of these obstacles ..." -- it's one of the extrema you might imagine as a test case for this kind of feature.
Technically the swerving itself is not necessarily illegal, but it makes you liable for anything that could have been avoided by not swerving. If you swerve on a free and open road and don't cause any damage or injury to anything or anyone (including yourself and your car), it's not like you will still be dragged to court just because you didn't want to unnecessarily run over someone's pet.
> obstacles might actually be net-safer to not swerve around
In the UK it's illegal to swerve for these things, in the sense that if you subsequently cause an accident by swerving to avoid a cat, you are at fault.
Wait, it's illegal to swerve for a moose (TIL: moose are called elk in Europe, American elk are totally different) in the UK? Hitting a moose is a fast route to the grave.
Such a value system would be a nightmarish thing so I hope that it doesn't exist yet.
Just imagine your car calculating that it should let you die in order to save a group of school children... except that the school children were in fact a bunch of sheep that the system mis-identified.
The danger of AI isn't in paperclip optimizers (at least for now), it's in humans over-relying on machine learning systems.
I think such a value system is fine insofar as it preserves the safety of the driver.
I'd like my car to make value judgements about what's safer - braking and colliding with an object or swerving to avoid it. I'd rather it brakes and collides with a child than swerve into oncoming traffic, but I'd rather it swerved into oncoming traffic to avoid a truck that would otherwise T-bone me.
As long as the system only acts in the best interest of its passengers, not the general public, it's fine by me.
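One way to read that preference is as a crude, occupant-first ranking over predicted outcomes. This is a purely hypothetical sketch of the stated policy (every name here is invented), not how any real autopilot is known to work:

    # Rank predicted outcomes so that anything endangering the occupants
    # always loses to hitting an external obstacle with the occupants safe.
    OUTCOME_RANK = {
        "no_collision": 0,                 # best case
        "hit_obstacle_occupants_safe": 1,  # e.g. brake and strike the object
        "occupants_at_risk": 2,            # e.g. swerve into oncoming traffic
    }

    def choose_action(predicted):
        """predicted: dict mapping an action name to its predicted outcome."""
        return min(predicted, key=lambda action: OUTCOME_RANK[predicted[action]])

    print(choose_action({"brake": "hit_obstacle_occupants_safe",
                         "swerve": "occupants_at_risk"}))   # -> brake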
> I'd rather it brakes and collides with a child than swerve into oncoming traffic
Wait, what? You'd rather hit and kill a child than hit another vehicle - a crash which at urban/suburban speeds would most likely be survivable, without serious injury, for all involved?
You cannot codify some of the "dark" choices that are typically made today in split-second, fight-or-flight situations. If we go down this road, it opens the door to all sorts of other ends-justify-the-means behaviors.
There's a threshold number of dead hypothetical children that's going to trigger a massive revolt against a tone-deaf Silicon Valley industry.
The moral choice goes in the opposite direction: the car should prefer to kill the occupant rather than a bystander. The occupant is the reason the car is in use; if it fails, the penalty for that failure should fall on the occupant, not on bystanders.
Yeah this thread has really squicked me out. Of course dehumanization is an age-old tactic for enabling asshole behaviors of all sorts. I'd like to hope, however, that outside the current commuter-hell driving context, few people would even admit to themselves, let alone broadcast to the world, their considered preference for killing children. That is, when they no longer drive, perhaps people won't be quite so insane about auto travel. The twisted ethos displayed in this thread will be considered an example of Arendt's "banality of evil".
I don't even drive at all, and yet this thread irks me from the other direction. Dehumanization is an age-old tactic for enabling asshole behaviors, and bringing children into a discussion is an age-old tactic for justifying irrational decisions ("Think of the children").
You claimed that any car may be driven slowly enough to avoid killing people on the road, at the cost of mere inconvenience to others. That's patently false: firetruck. And then there are several other posters claiming that it's the driver's fault if any accident happens at all, due to the mere fact that the driver chose to drive a car. If we removed every single vehicle in the US from the streets tomorrow, a whole lot of people would die, directly or indirectly. I'd like to see anyone state outright that "usage of cars in modern society has no benefit insofar as saving human lives is concerned".
In the end, I think both sides are fighting a strawman, or rather, imagining different scenarios. I read the original poster as imagining a case where swerving on a freeway is potentially deadly for at least two vehicles, along with a multi-vehicle pile-up. You're imagining suburban CA, where swerving means some inconvenience with the insurance company. I have little doubt that everyone would swerve to avoid the collision in the latter case.
Also since we're on HN and semantic nit-picking is a ritual around here, "avoid killing children in the road above all other priorities, including the lives of passengers" is NOT a good idea. As far as the car knows, I might have 3 kids in the backseats.
I believe you when you say you don't drive at all. You call both situations strawmen, yet each is a common occurrence. Safe speed in an automobile is proportional to sightline distance. If drivers or robocars see a child (or any hazard) near the road, they should slow down. By the time they reach the child, they should be crawling. That's true whether you're in an urban alley or on a rural interstate. If the child appeared before the driver or robocar could react, the car was traveling too fast for that particular situation.
In that "traveling too fast" failure mode, previous considerations of convenience or property damage no longer apply. The driver or robocar already fucked up, and no longer has standing to weigh any value over the safety of pedestrians. Yes it's sad that the three kids in the backseat will be woken from their naps, but their complaint is with the driver or robocar for the unsafe speeds, not with the pedestrian for her sudden appearance.
This is the way driving has always worked, and it's no wonder. We're talking about kids who must be protected from the car, but we could be talking about hazards from which the car itself must be protected. If you're buzzing along at 75 mph and a dumptruck pulls out in front of you, you might have the right-of-way, but you're dead anyway. You shouldn't have been traveling that fast, in a situation in which a dumptruck could suddenly occupy your lane. In that failure mode, you're just dead; you don't get to choose to sacrifice some children to your continued existence.
Fortunately, the people who actually design robocars know all this, so the fucked-up hypothetical preferences of solipsists who shouldn't ever control an automobile don't apply.
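To put very rough numbers on the sightline rule above, here's a minimal sketch assuming a one-second reaction time and about 0.7 g of braking (illustrative figures, not a traffic-engineering formula):

    # Fastest speed from which the car can stop within the distance it can see:
    # solve v*reaction + v^2/(2*decel) = sightline for the positive root.
    from math import sqrt

    def max_safe_speed(sightline_m, reaction_s=1.0, decel=7.0):
        """Largest speed (m/s) that still stops within sightline_m metres."""
        return -decel * reaction_s + sqrt((decel * reaction_s) ** 2
                                          + 2 * decel * sightline_m)

    for d in (10, 30, 60, 120):   # metres of clear sightline
        print(f"{d:4d} m visible -> roughly {max_safe_speed(d) * 3.6:3.0f} km/h")
    # Prints about 24, 53, 82, and 124 km/h respectively.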
Thanks for your response. That's certainly an interesting way to think about driving (in a good way). Your previous posts would have done better by elaborating in that way.
Just a bit of clarification: I didn't call the situations rare or strawmen. I said the arguments were against a strawman, as you were thinking of a different scenario and arguing against that.
Out of curiosity: if swerving were potentially deadly for the 3 kids in the car, what would the response be?
"You claimed that any car may be driven slowly enough to avoid killing people on the road with only mere convenient to others. That's patently false: firetruck."
As a firefighter I'd like to offer a few thoughts on this:
- fire engines, let alone ladder trucks, are big but slow. Older ones have the acceleration of a slug and struggle to hit high speeds
- with very, very few exceptions, arriving 10 seconds to 2 minutes sooner is likely to make no measurable difference (do the math -- most departments are likely to have policies allowing only 10 mph above posted limits at best, and if you're going to a call 3 mi away... see the rough numbers after this comment)
- again, as mentioned, most departments have a policy on how far the speed limit can be exceeded. It might feel faster when it goes by when you're pulled over, but realistically there won't be much difference
- "due caution and regard" - almost all states have laws saying that there's implied liability operating in "emergency mode" - that is, you can disobey road laws as part of emergency operations, but any incident that happens as a result thereof will be implied to be the fault of the emergency vehicle/operator until and unless proven otherwise
If I'm driving an engine, blazing around without the ability to respond to conditions as much as I would in a regular situation/vehicle, then emergency mode or not, I am in the wrong.
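For the sake of the "do the math" aside above, here's the arithmetic with assumed numbers -- a 35 mph posted limit and a 3-mile run; nothing here is any department's actual policy:

    # Response time at the posted limit vs. at the limit plus the typical
    # 10 mph policy allowance, over a 3-mile run (all numbers assumed).
    distance_mi = 3.0
    limit_mph = 35.0
    emergency_mph = limit_mph + 10

    normal_min = distance_mi / limit_mph * 60
    emergency_min = distance_mi / emergency_mph * 60
    print(f"at the limit:   {normal_min:.1f} min")                 # ~5.1 min
    print(f"limit + 10 mph: {emergency_min:.1f} min")              # ~4.0 min
    print(f"time saved:     {normal_min - emergency_min:.1f} min") # ~1.1 min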
This is a ridiculous claim to make. Emergency vehicles will be the very last kind of vehicle to get autonomous driving (if ever), because they routinely break regular traffic rules, and routinely end up in places off-road in some way.
Hell, modern jumbo jets can fly themselves, from takeoff to landing, but humans are still there specifically to handle emergencies.
> As far as the car knows, I might have 3 kids in the backseats.
Strange that you envision having a car capable of specifically determining 'children' in the street, but nothing about the occupants. Especially given that we already have cars that complain if they detect an occupant not wearing a seatbelt, but no autonomous driving system that can specifically detect an underage human.
> You're imagining suburban CA, where swerving means some inconvenience with the insurance company.
"I'd rather other people die than me" is not about imagining mere inconvenience with insurance companies.
For someone complaining about strawmen, you're introducing a lot of them.
You seem to have selectively cut things out and either skimmed or didn't read my full post. Except for the last paragraph, my comment has little to do with autonomous driving; it's about driving, and morality with regard to driving, in general.
No, I do NOT trust the car to detect children either on the road or in the car -- that's why I phrased my comment that way.
> "I'd rather other people die than me" is not about imagining mere inconvenience with insurance companies.
Yes, I specifically pointed out the scenario the poster of that quote might have been thinking of, to contrast it with the inconvenience scenario.
You should take your own advice: read the post again before commenting.
Legally speaking children are found to be at fault all the time, and under certain circumstances even treated as adults. So no, they're not incapable of being at fault.
Let me guess: you haven't done much work in public relations? Much worse for a company than doing some evil reprehensible shit is doing such shit in a fashion that's an easy, emotional "narrative hook" for the media. A grieving mother clutching a tear-stained auto repair bill itemized with "child's-head-sized dent, one" is the easiest hook imaginable.
Maybe so. But Google for "parents of dead child sent bill for airlift" - if you want to be pedantic, you can even argue that in some of those cases, there may not have been any choice, only an implied consent by law that says that "a reasonable person would have wanted to be flown to hospital".
You'll find examples. And for every waived bill you'll also hear "well, regardless of the tragic outcome, a service was rendered, and we have the right to be paid for our work".
Please note that I am carefully avoiding passing any judgment on the morality of any of the above.
Unless I really misunderstand, the invoices to which you refer are for services intended to preserve the life of a child, not services intended to relieve a third party of certain trifling inconveniences associated with a child's death?
Society will not tolerate a robocar that cheerfully kills our children to avoid some inconvenience to its occupants.
One might think that we're not talking about inconvenience, but we are, because after all any car may be driven slowly enough to avoid killing children in the road. A particular robocar that is not driven that slowly (which, frankly, would be all useful robocars) must avoid killing children in the road above all other priorities, including the lives of passengers. A VW-like situation in which this were discovered not to be the case would involve far more dire consequences for the company (and, realistically, the young children and grandchildren of its executives) than VW faces for its pollution shenanigans.
At the moment our autonomous cars can't manage to avoid a stationary guard rail on the side of the road, so it's a bit premature to be worrying about our cars making moral decisions.
The first priority of an autonomous vehicle should be to cause no loss of life whatsoever. The second priority should be to preserve the life of the occupants.
Nobody buys an autonomous car that will make the decision to kill them.