• @TypicalHog@lemm.ee
    11 points, 2 months ago

    It only matters if the autopilot kills more people than an average human driver over the same distance traveled.

    • @NIB@lemmy.world
      48 points, 2 months ago

      If the cars run over people at 30 km/h because they rely on cameras and a bug splattered on the lens made the car go haywire, that is not acceptable, even if the cars crash “less than humans”.

      Self driving needs to be highly regulated by law, with a mandated bare minimum of sensors, including radar, lidar, etc. Camera-only self driving is beyond stupid. Cameras can’t see in snow or in the dark. Anyone who has a phone knows how fucky a camera can get under certain light exposures.

      No one but Tesla is doing camera-only “self driving”, and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his Bioshock uber-capitalist dream. Who cares if a few people die in the process of developing vision-based self driving.

      https://www.youtube.com/watch?v=Gm2x6CVIXiE

    • @Geobloke@lemm.ee
      23 points, 2 months ago

      No it doesn’t. Every life stolen matters, and if it turns out that Tesla could have replicated industry best practice and saved more lives, but didn’t so that they could sell more cars, then that is on them.

    • @mojofrododojo@lemmy.world
      21 points, 2 months ago

      this is bullshit.

      A human can be held accountable for their failure, bet you a fucking emerald mine Musk won’t be held accountable for these and all the other fool self drive fuckups.

      • @sabin@lemmy.world
        -2 points, 2 months ago

        So you’d rather live in a world where people die more often, just so you can punish the people who do the killing?

        • @mojofrododojo@lemmy.world
          3 points, 2 months ago

          That’s a terrifically misguided interpretation of what I said, wow.

          LISTEN UP BRIGHT LIGHTS, ACCOUNTABILITY ISN’T A LUXURY. It’s not some ‘nice to have add-on’.

          Musk’s gonna find out. Gonna break all his fanboys’ hearts too.

          • @sabin@lemmy.world
            -1 points, 2 months ago

            Nothing was misguided and if anything your tone deaf attempt to double down only proves the point I’m making.

            This stopped being about human deaths for you a long time ago.

            Let’s not even bother to ask whether this guy could ultimately be saving lives. All that matters to you is having a target to take your anger out on in the event that a loved one dies in an accident or something.

            You are shallow beyond belief.

            • @mojofrododojo@lemmy.world
              4 points, 2 months ago

              This stopped being about human deaths for you a long time ago.

              Nope, it’s about accountability. The fact that you can’t see how important accountability is just says you’re a Musk fanboy. If Musk would shut the fuck up and do the work, he’d be better off - instead he’s cheaping out left and right on literal life-dependent tech so Tesla’s stock gets a bump. It’s ridiculous, like your entire argument.

              • @sabin@lemmy.world
                0 points, 2 months ago

                I don’t give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I think the current state of Tesla Autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations, where the tech can be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.

                If you need to make up lies about me and strawman me to disagree you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don’t care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.

                Your political opinions should be based on principles, not whatever feels convenient in the moment.

                • @mojofrododojo@lemmy.world
                  1 point, 2 months ago

                  You stopped being a rational agent who weighs the good and bad of things a long time ago.

                  sure thing, you stan musk for no reason, and call me irrational. pfft. gonna block you now, tired of your bullshit

    • @SirEDCaLot@lemmy.today
      2 points, 2 months ago

      This is 100% correct. Look at the average rate of crashes per mile driven with autopilot versus a human. If the autopilot number is lower, they’re doing it right and should be rewarded and NHTSA should leave them be. If the autopilot number is higher, then yes by all means bring in the regulation or whatever.
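The per-mile comparison described above can be sketched in a few lines. All crash and mileage figures below are invented for illustration; real NHTSA and Tesla numbers differ.

```python
# Hypothetical sketch of the crashes-per-mile comparison.
# All figures are made up for illustration, not real data.

def crash_rate_per_million_miles(crashes: int, miles: float) -> float:
    """Crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

# Assumed example figures (hypothetical):
autopilot_rate = crash_rate_per_million_miles(crashes=300, miles=500_000_000)
human_rate = crash_rate_per_million_miles(crashes=2_000, miles=1_000_000_000)

if autopilot_rate < human_rate:
    print("Autopilot crashes less often per mile than the human baseline.")
else:
    print("Autopilot crashes at least as often per mile; regulate it.")
```

The point of normalizing by miles driven is that raw crash counts from fleets with very different mileage cannot be compared directly.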

      • @flerp@lemm.ee
        -1 points, 2 months ago

        Humans are extremely flawed beings, and if your standard for leaving companies alone to make as much money as possible is that they are at least minimally better than extremely flawed humans, I don’t want to live in the world you want to live in.

        • @AdrianTheFrog@lemmy.world
          4 points, 2 months ago

          Having anything that can save lives over an alternative is an improvement. In general. Yes, we should be pushing for safer self driving, and regulating that. But if we can start saving lives now, then sooner is better than later.

          • @flerp@lemm.ee
            1 point, 2 months ago

            I’m not sure if that was supposed to agree with or counter what I said.

            Over the past few decades, some people have looked at the enormous death toll from our reliance on driving and the vast number of hours spent on our roads, and said that that amount of death is unacceptable. Nothing has ever come of it, because of that same reliance on driving. Human nature cannot be the thing that changes: we can’t expect humans to suddenly behave differently, or to improve their ability to focus and drive safely.

            But this moment, when the shift from human to machine drivers is happening - when we go from beings incapable of doing better on a global scale to machines that can avoid the current death tolls by being vastly more precise than humans - this is the time to reduce that death toll.

            If we allow companies to get away with removing sensors from their cars, reducing safety just to pad their bottom line, I consider that unacceptable, even if the resulting death toll is slightly lower than with human-driven cars, when it could have been far lower.

            • @SirEDCaLot@lemmy.today
              1 point, 2 months ago

              One company says they can build FSD with 15 sensors and sensor fusion. Another company says they can build FSD with just cameras. As I see it, the development path doesn’t matter, it’s the end result that matters.

        • @SirEDCaLot@lemmy.today
          1 point, 2 months ago

          It is not my place, or yours, or the government’s to tell people how to spend their money. It IS our place to ensure that companies aren’t producing products that kill people.

          Thus money doesn’t matter here. What matters is whether FSD is more dangerous than a human. If it is, it should be prohibited, or only used under closely monitored conditions. If it is equal to or better than a human, i.e. the same or fewer accidents/fatalities per mile driven, then Tesla should be allowed to sell it, even if it is imperfect.

          In the US we have a free market. Nobody is obligated to pay for FSD or use it. People can vote with their wallets on whether it’s worth the money; THAT is what determines whether Tesla makes more money. It’s up to each individual customer to decide. That’s their choice, not mine or yours.

          As I see it, in a free market what Tesla has to prove is that their system doesn’t make things worse. If they can prove they’re not making roads more dangerous, i.e. there’s no need to ban it, then it’s a matter between them and their customers.

    • @PresidentCamacho@lemm.ee
      -7 points, 2 months ago

      This is the actual logical way to think about self driving cars. Stop down voting him because “Tesla bad” you fuckin goons.

      • @gallopingsnail@lemmy.sdf.org
        13 points, 2 months ago

        Tesla’s self driving appears to be less safe and causes more accidents than their competitors.

        NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

        Tesla bad.

        • @TypicalHog@lemm.ee
          5 points, 2 months ago

          Can you link me the data showing that Tesla’s competitors’ self-driving is safer and causes fewer accidents - and WHICH competitors? I would really like to know who else has this level of self-driving while also having fewer accidents.

          • @gallopingsnail@lemmy.sdf.org
            1 point, 2 months ago

            The data doesn’t exist, no other company has a level of “autonomy” that will let your car plow through shit without you paying attention.

        • @Socsa@sh.itjust.works
          2 points, 2 months ago

          I don’t quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?

          • @nxdefiant@startrek.website
            2 points, edited, 2 months ago

            The NHTSA hasn’t issued rules for these things either.

            the U.S. gov has issued general guidelines for the technology/industry here:

            https://www.transportation.gov/av/4

            They have an article on it discussing levels of automation here:

            https://www.nhtsa.gov/vehicle-safety/automated-vehicles-safety

            By all definitions laid out in that article:

            BlueCruise, Super Cruise, and Mercedes’ system are Level 3 systems (you must be alert to re-engage when the conditions for their operation no longer apply).

            Tesla’s FSD is a Level 3 system (the system will warn you when you must re-engage for any reason).

            Waymo and Cruise are Level 4 systems (geolocked).

            Level 5 systems don’t exist.

            What we don’t have is any kind of federal laws:

            https://www.ncsl.org/transportation/autonomous-vehicles

            Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.

            The guidance reinforces the voluntary nature of the guidelines and does not come with a compliance requirement or enforcement mechanism.

            (emphasis mine)

            The U.S. has operated on a “states are laboratories for laws” principle since its founding. The current situation is in line with that principle.

            These are not my opinions, these are all facts.

        • @nxdefiant@startrek.website
          -4 points, 2 months ago

          No one else has the same capability in as wide a geographic range. Waymo, Cruise, Blue Cruise, Mercedes, etc are all geolocked to certain areas or certain stretches of road.

          • @GiveMemes@jlai.lu
            3 points, 2 months ago

            Ok? Nobody else is being as wildly irresponsible, therefore tesla should be… rewarded?

            • @nxdefiant@startrek.website
              1 point, 2 months ago

              I’m saying larger sample size == larger numbers.

              Tesla announced 300 million miles on FSD v12 in just the last month.

              https://www.notateslaapp.com/news/2001/tesla-on-fsd-close-to-license-deal-with-major-automaker-announces-miles-driven-on-fsd-v12

              Geographically, that’s all over the U.S., not just in hyper-specific metro areas or stretches of road.

              The sample size is orders of magnitude bigger than everyone else, by almost every metric.

              If you include the most basic autopilot, Tesla surpassed 1 billion miles in 2018.

              These are not opinions, just facts. Take them into account when you decide to interpret the opinion of others.

              • @GiveMemes@jlai.lu
                0 points, edited, 2 months ago

                That’s not how rates work, though. A larger sample size doesn’t imply a higher rate of accidents, and a rate, not a raw count, is what any such study is about. Your bullshit rationalization is funny. In fact, a larger sample tends to give a more reliable estimate of the true rate, since a single error/fault is less likely to have an outsized impact on the data.

                • @nxdefiant@startrek.website
                  1 point, edited, 2 months ago

                  No one’s talking about rates. The article itself, and all the articles linked in these comments, are talking about counts: numbers of incidents. I’m not justifying anything, because I’m not injecting my opinion here. I’m only pointing out that without context, counts don’t give you enough information to draw a conclusion; that’s just math. You can’t even derive a rate without that context!
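The counts-versus-context point can be made concrete with a toy example. The two fleets and all numbers below are invented purely for illustration.

```python
# Two hypothetical fleets with the SAME incident count but very
# different incident rates once exposure (miles driven) is known.
# All numbers are invented for illustration.

fleets = {
    "fleet_a": {"incidents": 100, "miles": 300_000_000},
    "fleet_b": {"incidents": 100, "miles": 10_000_000},
}

# Incidents per million miles, derivable only once mileage is known.
rates = {
    name: f["incidents"] / f["miles"] * 1_000_000
    for name, f in fleets.items()
}

for name, rate in rates.items():
    print(f"{name}: {fleets[name]['incidents']} incidents, "
          f"{rate:.2f} per million miles")
```

Identical counts, yet fleet_b’s per-mile rate is roughly 30x higher: without the miles-driven context, the raw count alone tells you very little.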

                  • @GiveMemes@jlai.lu
                    1 point, 2 months ago

                    That’s not my point, though. We both know that the government agency doing this work is primarily interested in rates, whether or not media reports talk about total numbers. The only reason they started investigating was individual incidents, yes, but they’re not looking for a few cases; they’re looking for a pattern.

                    (Like this one:https://www.ranzlaw.com/why-are-tesla-car-accident-rates-so-high/)

      • @doubtingtammy@lemmy.ml
        11 points, 2 months ago

        It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.

        And where’s the LIDAR again?

        • @PresidentCamacho@lemm.ee
          1 point, edited, 2 months ago

          My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If the self-driving cars kill 500 people a year, but humans kill 1000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…