More or less: Tesla's Autopilot is not as safe as Tesla would have you believe.

  • IAm8BitWolf@beehaw.org · 19 points · 1 year ago

    They're running a beta test on the general public - only the thing they're testing is a 2-ton ball of metal and explosive material regularly traveling at 45 mph (70 km/h). They even have the gall to charge for the ability to beta test it. I really hope this gets regulated at some point; otherwise this is just the beginning.

    • Knoll0114@lemmy.world · 6 points · 1 year ago

      I’m shocked it isn’t already regulated. I get it’s a developing technology but cars can be murderous.

      • ShadowAether@sh.itjust.works · 1 point · 1 year ago

        Where I am, SAE Level 3 is effectively banned: you need authorization to test it on public roads, but SAE Level 2 is allowed. There are also SAE Level 5 vehicles in operation today; they're just on private roads/property, and nearly all of them are regulated, only under workplace safety laws instead of driving laws.

    • darkmugglet@lemm.ee (OP) · 5 points · 1 year ago (edited)

      IMO, this is the problem. Any normal person doing this would be in prison. Something like automated driving should be strictly regulated. I own a Mach-E, and while its self-driving features are limited, it errs so far on the side of caution that you cannot avoid paying attention to the road. As it should be.

    • Lumi@beehaw.org · 1 point · 1 year ago

      Maybe a controversial opinion, but I’m glad they are charging for it. I wish there were a better way to vet who gets to be beta testers, but at least by charging money, they are ensuring only people who care about the technology get to use it.

      Maybe I'm jaded, but it seems like drivers, in general, have gotten worse post-pandemic, and I wouldn't trust 90% of them with autonomous driving features in the state the technology is in.

  • jjagaimo@lemmy.ca · 14 points · 1 year ago

    While this is undeniably tragic in every instance, I can’t help but point out that the title had my sleep deprived brain thinking, “how the hell did the car crash that many times and keep on driving”

  • Wiitigo@lemmy.world · 11 points · 1 year ago

    Still almost exactly half the crash rate of human-only drivers. Therefore, we should ban human-only driving.

    • RandomBit@sh.itjust.works · 8 points · 1 year ago

      I don’t think this is a fair comparison since an Autopilot crash is a 2 stage failure: the Autopilot and then the driver both failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.

      • Kepler@lemmy.world · 1 point · 1 year ago

        If all instances of human intervention were included, I doubt Autopilot would be ahead.

        Why would you interpret non-crashes due to human intervention as crashes? If you’re doing that for autopilot non-crashes you’ve gotta be consistent and also do that for non-autopilot non-crashes, which is basically…all of them.

        • RandomBit@sh.itjust.works · 3 points · 1 year ago

          If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc.). I believe that if an advanced safety system, such as automatic braking, prevents a crash that otherwise would have occurred, the prevented crash should also be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributed to Autopilot.
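          As a toy illustration of that attribution rule (every number below is made up for the example, not real Tesla or NHTSA data), charging driver-averted crashes to Autopilot can flip which side looks safer:

```python
# Toy illustration with made-up numbers: how the attribution rule changes
# the Autopilot-vs-human comparison. None of these figures are real data.

def crashes_per_million_miles(crashes, miles_millions):
    """Crash rate per million miles driven."""
    return crashes / miles_millions

autopilot_actual_crashes = 50    # hypothetical: crashes that happened under Autopilot
driver_averted_crashes = 150     # hypothetical: near-misses where the driver took over
human_crashes = 180              # hypothetical: crashes with no assistance at all
autopilot_miles = 100.0          # millions of miles (hypothetical)
human_miles = 100.0

# Naive comparison: only crashes that actually occurred count.
naive_ap_rate = crashes_per_million_miles(autopilot_actual_crashes, autopilot_miles)

# Rule proposed above: crashes the driver had to prevent also count against Autopilot.
adjusted_ap_rate = crashes_per_million_miles(
    autopilot_actual_crashes + driver_averted_crashes, autopilot_miles)

human_rate = crashes_per_million_miles(human_crashes, human_miles)

print(naive_ap_rate, human_rate, adjusted_ap_rate)  # 0.5 1.8 2.0
```

          Under the naive count Autopilot looks far safer (0.5 vs 1.8 per million miles); once averted crashes are charged to it, it looks worse (2.0), which is exactly the point being argued.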

          As has been often studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the result can be worse than both. People are much less likely to pay full attention and have the same reaction times if the autonomous system is in full control the majority of the time.

    • darkmugglet@lemm.ee (OP) · 3 points · 1 year ago

      You're missing the point – with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. The question of "who is at fault" gets murky. And then you have the fact that Tesla is not obligated to report the crashes. And the failures of automated driving are very different from human errors.

      I don’t think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.

      • Locrin@lemmy.world · 4 points · 1 year ago

        In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot the driver would avoid punishment? The driver is using a feature of the car. It tells you to stay alert and be prepared to take over on short notice. Those crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving setup where I would be in favour of punishing the company if something went wrong is those taxis where you are purely a passenger.

        Sit behind the wheel, you are responsible for what happens.

        • JillyB@beehaw.org · 1 point · 1 year ago

          I don’t think this is a practical take. If I’m driving a car, I’m in control and know my intentions. If I’m responsible for an accident, it’s because I wasn’t fully alert or did something stupid.

          If autopilot is driving the car, I don’t know the car’s intentions. It might cause a dangerous situation before my brain can process that it has bad intentions and take over. If it sees something in the road that isn’t there, it might swerve or brake and I won’t recognize until it already happened. That’s considering an alert driver with full concentration behind the wheel. The whole point of autopilot is to reduce the driver’s workload. It does that by requiring less concentration. I think it’s inherently dangerous to require human intervention in autopilot systems.

          • Locrin@lemmy.world · 1 point · 1 year ago

            When using adaptive cruise control you can set the speed to, let's say, 60. If you're driving behind someone and they have slowed down to 30 to take a steep turn, they might disappear from your car's sensors. In that case the car might see no obstacle and rapidly accelerate trying to get back to 60. That is scary, because suddenly the car is accelerating towards a sharp turn. This is not theoretical; my friend's Volvo has done this multiple times.
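            A minimal sketch of that failure mode (toy logic only, not any manufacturer's actual controller): once the slow lead vehicle drops out of sensor view, the controller reverts to the set speed.

```python
# Toy model of adaptive-cruise-control target-speed selection.
# Purely illustrative; not any manufacturer's actual control logic.

SET_SPEED = 60.0  # km/h, the driver's chosen cruise speed


def acc_target_speed(lead_vehicle_speed):
    """Speed the controller aims for.

    lead_vehicle_speed is None when the sensors detect no vehicle ahead.
    """
    if lead_vehicle_speed is None:
        # No obstacle detected: accelerate back toward the set speed,
        # even if the lead car only "vanished" because of a sharp turn.
        return SET_SPEED
    # Otherwise, follow the lead vehicle, capped at the set speed.
    return min(SET_SPEED, lead_vehicle_speed)


print(acc_target_speed(30.0))  # following the slow car: 30.0
print(acc_target_speed(None))  # lead car leaves sensor view: 60.0
```

            The controller has no notion of "the road bends here"; losing the lead vehicle looks identical to an open road, so it accelerates.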

            If your argument is safety, it is moot. Autopilot has fewer accidents than humans.

            Autopilot is just a more advanced version of this. It is brilliant as long as you know its quirks. For highway driving with few cars around you can probably relax as much as or more than you would just cruising. For city driving you should be alert and ready to take over at any time, but you might not have to navigate that complex intersection yourself and can pay more attention to your surroundings.

            Unless they get to a point where you can fold in the steering wheel and just be a passenger the burden falls on the driver.

      • Fubarberry@aiparadise.moe · 1 point · 1 year ago

        I’m all for more accountability, but it’s still better than human driving. Cutting human car deaths in half in exchange for murky accountability is clearly a worthwhile trade.

      • Wiitigo@lemmy.world · 1 point · 1 year ago

        I was commenting on the original post, which was an assertion that “Autopilot was not as safe as Tesla would have you believe.”.

        I think you hopped topics all-together. And I actually agree with you.

      • Faceman🇦🇺@discuss.tchncs.de · 1 point · 1 year ago

        My main issue with Tesla's Autopilot is its branding and the way they advertise it.

        Almost every non-tech person I talk to about things like that thinks it is 100% a hands-off robot driver, and that is a very, VERY dangerous idea.

        It’s a very good system, and it is improving with every update, but it is far from the idea that many people have in their heads.

        The videos you see of people sleeping on Autopilot are worrying. Do Teslas not have driver-alert monitoring? If I look away from the road for 5 seconds in my Mazda, it lets me know very loudly that it wants me to pay attention; if I were to fall asleep, it would do its best to wake me up. When I use its very simple and limited self-driving function, I can't take my hands off the wheel for more than about 10 seconds before it alerts me.

    • Saik0@lemmy.saik0.com · 0 points · 1 year ago

      Irrelevant; that’s what it is. Considering that a human is still ultimately responsible when they’re behind the wheel whether or not “autopilot” is running, it’s the human that should be attributed the lower crash rate.

      Otherwise you risk incidents like this one, where the human intervenes in a near-miss and actively stops the car from causing a severe accident, being counted as "pro-Autopilot" when it was the human that actually stopped the event from occurring.

      • Communist@beehaw.org · 1 point · 1 year ago

        No, the crash rate is definitely relevant, but I do get what you’re saying about safety being human caused.

  • Oliver@feddit.de · 4 points · 1 year ago

    In an ideal world, automation could free up human workpower for relevant work (instead of taxi/Uber/etc.): science, development, or even social work.

    But as we are not in an ideal world, this would never work, as it would need equal and very good education, where you see every student for what they are: essential brain capital which must not be left ignored.

    So the reason why we get automated driving is: because we hate monotonously driving the same route through the traffic jam every day.

    • Communist@beehaw.org · 1 point · 1 year ago

      I think we really need to know what the actual crash rate is before we decide on that.

      Even more, we need the crash rate broken down by environment. I figure AI does worse than humans in snowy conditions, but in most highway conditions I bet it can do better; perhaps we could regulate it based on weather conditions if we had such data.

      I don't know, but I think we need more data before we say anything.

  • amanneedsamaid@sopuli.xyz · 1 point · 1 year ago

    I hate banning technology and stifling innovation. Let's ban automobile self-driving technology anyway: no one needs it, and the inherent risks and ethical dilemmas are not worth it at all.

    • Knoll0114@lemmy.world · 7 points · 1 year ago

      To be fair ‘no one needs it’ isn’t entirely true. There are many reasons someone who needs to get around might not be able to drive. For example, some people with epilepsy, senior citizens, teenagers going to work etc. I don’t need it but I’d love the convenience and stress relief of never having to drive again. Public transport could help some of this but some areas just aren’t populated enough for truly good public transport.

      • amanneedsamaid@sopuli.xyz · 5 points · 1 year ago

        I think solutions like better public transportation, or government services offering free rides (as some companies already do), are better options. A computer driving a car has too many real-world consequences that outweigh the convenience.

        • Stormyfemme@beehaw.org · 3 points · 1 year ago

          The solution is always better public transit but I’d be shocked if any of us saw it approach even passable levels in our lifetime here in the States. Timelines for small projects stretch on for a decade. Massive ones can’t even get off the ground. I wish it weren’t true but I’ve basically given up on it. Maybe I’ll move to Europe some day to have access to transit options.

          • Knoll0114@lemmy.world · 4 points · 1 year ago

            Even in Europe, though, rural areas are a thing. I've lived in Australia and the UK and travelled extensively in Europe. Many European cities have excellent public transport, but if you need to get to a small town for whatever reason, you can't. In Australia it's definitely better in the major cities than it is in US major cities, but there are so few people and it's such a large country that outside of those really big cities there's very little.

      • Manticore@beehaw.org · 1 point · 1 year ago

        The issue you’ve described though is not about self-driven technology. It’s that ‘driving’ is the only form of transport, and thus the only way that anybody can ever be independent.

        It's that too many areas design their infrastructure around the personal car and make it impossible to get around without one. With cars, getting around means sitting in traffic for hours at a time (because everybody else is in a car, too). Stretches of noisy, rumbling multi-lane roads that don't have walkways or crossings. Bike lanes are non-existent, or pressed up against fast-moving car traffic. And because walking/cycling isn't an option, we have more people driving than ever: children being driven to and from school or sports, driving down to a store 100 m away to pick up eggs, etc.

        Cars spend ~95% of the time parked somewhere and 4% of the time moving a single person. They're incredibly inefficient, and yet they've been painted as a symbol of "freedom" and "independence" that sees massive amounts of land converted into parking spaces to accommodate something magnitudes larger than a person, one per person.

        Cities that design around subway trains and bus lanes from the get-go have far smoother commutes. Smaller villages designed around trams and cycling are quiet, pleasant, and walkable. Both of them offer independence to a population that cannot drive - either practically or financially.

        If self-driving car-sharing was available already now, then I’d be more likely to agree. Car-sharing (not ride-sharing, but hiring cars per minute via app) is the best way for car-based infrastructure to migrate towards lower traffic. Ripping up roads for trains is expensive, but knowing you can use a town car to visit your friend, then a van to help them move, and park neither of them in your driveway, will really help.

        But right now self-driving cars are a passion project. They’re not actually practical, they’re just exciting and expensive. If accessibility for our blind, elderly, and impoverished population is the concern here, then billionaires funding the self-driving cars they can’t ever afford is not the answer.

    • darkmugglet@lemm.ee (OP) · 2 points · 1 year ago

      For me, the problem is one of justice. If I, as a meat sack, kill someone, I am liable, and most likely criminally liable, for it. When AI commits manslaughter, then what? The company has the financial incentive and very little of the legal exposure, because it's outsourced to the owner. Effectively, the human operator trusting Evil Corp gets the raw end of the deal.

      IMO, each version of the software should get a legal license.

    • shoe@beehaw.org · 2 points · 1 year ago

      One use case could be senior citizens who aren’t ready to give up driving entirely. I’m sure it’s not easy to admit that your vision and reaction time are deteriorating to the point that you’re a danger on the road. As long as we live in a car-centric society, I hope the tech has solidified by the time I reach that point.