Longtermism poses a real threat to humanity

https://www.newstatesman.com/ideas/2023/08/longtermism-threat-humanity

“AI researchers such as Timnit Gebru affirm that longtermism is everywhere in Silicon Valley. The current race to create advanced AI by companies like OpenAI and DeepMind is driven in part by the longtermist ideology. Longtermists believe that if we create a “friendly” AI, it will solve all our problems and usher in a utopia, but if the AI is “misaligned”, it will destroy humanity…”

@technology

  • Kwakigra@beehaw.org
    1 year ago

It reminds me of casting a golden calf and then worshipping it as a god. There aren’t even the seeds of a solution to any social problem in LLMs. It’s the classic issue of someone being knowledgeable about one thing and assuming they are knowledgeable about all things.

    • Erk@cdda.social
      1 year ago

There are some seeds there. LLMs show that automation will eventually destroy all jobs, and that at any time even jobs we thought were unassailable for decades could suddenly find themselves at risk.

      That plants a lot of seeds. Just not the ones the average longtermist wants planted.

      • Kwakigra@beehaw.org
        1 year ago

I think we agree. LLMs, and automation in general, have massive labor-saving potential, but because of the way our economic systems are structured, this could actually lead away from a utopia rather than toward one, as the “longtermists” (weird to type that out) suggest. The seeds, as you say, point somewhere very different than toward the end of conflict and want.

        • Erk@cdda.social
          1 year ago

Oh, we totally agree. I was just agreeing with you in a slightly tongue-in-cheek manner.