Crashes have surged in the past four years, reflecting the hazards associated with increasingly widespread use of Tesla’s futuristic driver-assistance tech.
In short, Tesla’s Autopilot is not as safe as Tesla would have you believe.
I don’t think this is a practical take. If I’m driving a car, I’m in control and know my intentions. If I’m responsible for an accident, it’s because I wasn’t fully alert or did something stupid.
If Autopilot is driving the car, I don’t know the car’s intentions. It might cause a dangerous situation before my brain can process what it’s doing and take over. If it sees something in the road that isn’t there, it might swerve or brake, and I won’t recognize it until it has already happened. And that assumes an alert driver with full concentration behind the wheel. The whole point of Autopilot is to reduce the driver’s workload, which it does by requiring less concentration. I think it’s inherently dangerous to require human intervention in autopilot systems.
When using adaptive cruise control you can set the speed to, let’s say, 60. If you’re following someone who has slowed to 30 to take a sharp turn, they might disappear from your car’s sensors. In that case the car sees no obstacle and rapidly accelerates, trying to get back to 60. That is scary, because suddenly the car is accelerating toward a sharp turn. This is not theoretical; my friend’s Volvo has done this multiple times.
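A naive sketch of why that happens (hypothetical function names and thresholds, not any vendor’s actual logic): the controller only knows about a lead vehicle while the sensors report one, so the moment the slower car drops out of range, the “resume set speed” branch takes over.

```python
# Toy model of a naive adaptive-cruise-control target-speed decision.
# All names and numbers are made up for illustration.

def acc_target_speed(set_speed, lead_speed, lead_distance, sensor_range=120):
    """Return the speed (mph) the controller will accelerate toward."""
    if lead_distance is None or lead_distance > sensor_range:
        # No lead vehicle detected: resume the driver's set speed,
        # even if the road ahead curves in a way the sensors can't see.
        return set_speed
    # Otherwise hold the slower of the lead car's speed and the set speed.
    return min(set_speed, lead_speed)

# Following a car doing 30 at 40 m, cruise set to 60:
print(acc_target_speed(60, 30, 40))    # 30 — car holds back
# Lead car rounds the bend and vanishes from the sensors:
print(acc_target_speed(60, 30, None))  # 60 — car accelerates into the turn
```

The point of the sketch is that nothing in the controller’s inputs distinguishes “the road is clear” from “the obstacle went around a corner,” so the aggressive behavior is a direct consequence of the simple rule.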
If your argument is safety, it’s moot: Autopilot has fewer accidents than human drivers.
Autopilot is just a more advanced version of this. It is brilliant as long as you know its quirks. For highway driving with few cars around, you can probably relax as much as or more than you would just cruising. For city driving you should be alert to take over at any time, but you might not have to navigate that complex intersection yourself and can pay more attention to your surroundings.
Unless they get to the point where you can fold in the steering wheel and just be a passenger, the burden falls on the driver.