• exscape@kbin.social · 1 year ago

    This kind of serious trouble (from the article):

    The Department of Justice is currently investigating Tesla for a series of accidents — some fatal — that occurred while their autonomous software was in use. In the DoJ’s eyes, Tesla’s marketing and communication departments sold their software as a fully autonomous system, which is far from the truth. As a result, some consumers used it as such, resulting in tragedy. The dates of many of these accidents transpired after Tesla went visual-only, meaning these cars were using the allegedly less capable software.

    Consequently, Tesla faces severe ramifications if the DoJ finds them guilty.

    And of course:

    The report even found that Musk rushed the release of FSD (Full Self-Driving) before it was ready and that, according to former Tesla employees, even today, the software isn’t safe for public road use. In fact, a former test operator went on record saying that the company is “nowhere close” to having a finished product.

    So even though it seems to work for you, the people who created it don’t seem to think it’s safe enough to use.

  • obviouspornalt@lemmynsfw.com · 1 year ago

      My neighborhood has roundabouts. A couple of times when there’s not any traffic around, I’ve let autopilot attempt to navigate them. It works, mostly, but it’s quite unnerving. AP wants to go through them really faster than I would drive through them myself.

    • SirEDCaLot@lemmy.fmhy.ml · 1 year ago

        AP or FSD?
        AP is old and frankly kinda sucks at a lot of things.
        FSD Beta, if anything, I’ve found to be too cautious on such things.