
Level 2 has 0 semblance of reliable autonomy. Those are completely different problems.


It is a defined level of vehicle autonomy by the Society of Automotive Engineers (SAE).

https://www.sae.org/news/2019/01/sae-updates-j3016-automated...


I personally believe that level 3 is completely useless. It requires the driver to be completely aware and ready to take over the driving at a second's notice whenever the car gets in over its head.

It seems like level 3 is more dangerous than a 100% manual car.

Even high-end level 2 seems dangerous.

Until we get self driving cars to level 4, where they don't require the driver to pay attention, I don't think self driving cars should be on the road.


I think they mean the green bits there are the part where the car can be relied on to drive by itself (with the blue bits requiring constant attention).

Arguing about what to call it is boring. Is Tesla close to having a system that can drive without attention, decide it isn't sure what to do and safely request driver attention? Or does it just have the bit that can successfully navigate a lot of situations?


Having driven across the US almost 30k miles on Navigate On Autopilot in our Teslas, I’d argue the former (“Is Tesla close to having a system that can drive without attention, decide it isn't sure what to do and safely request driver attention?”). Attention is given, but intervention is rare, even in construction zones (only done when zones are inactive, not with workers present, for obvious reasons). Interstate-to-interstate transitions and passing of slow vehicles without our intervention (the “lane changes without confirmation” feature) are mostly flawless.

Edit: if I didn’t make it clear, Tesla vehicles safely hand over responsibility to the driver when path planning confidence has been lost. If you don’t take over, the vehicle comes to a stop with the hazards on. If you haven’t tried it, I recommend taking a free test drive yourself to better understand the system constraints and UX.


If the system/software to safely hand over control isn't present in the vehicle, what does driving 30k miles demonstrate about how close they are to deploying it?

Much of the distinction between level 2 and level 3 is that the vehicle reliably hands over control. Successfully navigating situations doesn't provide much information about how close that capability might be.


Level 2 means the driver always has responsibility.

It's good that the Tesla system stops making inputs when it doesn't know what to do. That's different from the Tesla system having full control and safely notifying the occupant that they need to start driving.


> Tesla vehicles safely hand over responsibility to the driver when path planning confidence has been lost

You've been lucky then, not to experience phantom braking, cutting off other cars when changing lanes, or fire trucks stopped partially in your lane.



