My attempts to come up with what this misogynistic creep would consider a “friendly superintelligence” keep resembling Elliot Rodger’s pre-shooting manifesto.

I also noticed the “.eth” crypto name drop. :agony-4horsemen:

  • Awoo [she/her]@hexbear.net
    2 years ago

    I have a feeling what he really means here is “If feminists got what they wanted, it would make me and my gang of psychopaths kill all women,” because I don’t see any feminists out there calling for the execution of women.

    Unless of course they’re misusing the word “women” here to refer to their pencil skirt/trad dress wearing caricature of a woman heavily under the thumb of patriarchy. I suspect “woman” here is being deliberately misused to refer to their concept of what a woman, her appearance, and her behaviour should be.

    • UlyssesT [he/him]@hexbear.netOP
      2 years ago

      I’m pretty sure what that high lord bazinga brain meant was killing off the idea of tradwives that are 100% virgin and never think about sex until that special moment where they see one particular Ready Player One protagonist wannabe and become his cheerful slave… and join the rest of his harem. :disgost:

      • Awoo [she/her]@hexbear.net
        2 years ago

        Right, so “women” here to them is simply a concept of behavioural and appearance standards. Because feminists don’t like it, they consider feminists to be “killing women”.

        They just say “women” because it allows a broad coalition of people with different concepts of “women” to align even if one thinks women should be tradwives or another thinks women should be the pencil skirt and heels wearing office totty.

  • GreenTeaRedFlag [any]@hexbear.net
    2 years ago

    I think what he means in the bottom is that feminism goes against the “traditional” system, or maybe he even thinks it’s biological and natural, and that therefore going against the patriarchy will result in the death of all women because it is unnatural. The cognitive dissonance that lets him think disrupting the traditional money market with crypto is good and the only way to survive, or that creating an AI is essential despite being literally against nature, while thinking the rights of women and brown people are crazy nonsense that will end society, is so powerful you could run a city off of it if you harnessed it.

    • UlyssesT [he/him]@hexbear.netOP
      2 years ago

      “Unnatural” is a weird thing to be upset about by a bunch of tech cultists that crave a machine god to make them immortal so they can conquer the universe as digital parasites.

      • GreenTeaRedFlag [any]@hexbear.net
        2 years ago

        It’s the same way new atheists hate religious authority as old-fashioned and stupid but unnervingly think women should shut up and “obey their maternal instincts.” A bunch of young white guys hate any kind of authority over them, but love authority over others, and will dismantle systems holding them back while doubling down on those holding back everyone else.

        • UlyssesT [he/him]@hexbear.netOP
          2 years ago

          Richard “Swan Battler” Dawkins of “Dear Muslima” infamy gave many of those “rationalists” the “cultural Christianity” :brainworms:, and that eventually led them to :jordan-eboy-peterson:, in my experience.

  • Dirt_Owl [comrade/them, they/them]@hexbear.net
    2 years ago

    As someone who has done a little science for a living, nothing annoys me more than these “rational” bros that pretend being a piece of shit makes you smart.

    I’ve always had a problem with Roko’s Basilisk: it assumes that an AI would naturally be hostile to humanity, which is unverifiable, and it assumes that this super-intelligent AI would be stupid enough to think that vague threats from a future being we have no way of knowing is real are a good motivator. As if it couldn’t come up with a more efficient solution.

    Using fear and pain is something lazy and stupid people do to control others because they’re too lazy and stupid to think of a better solution. An all-powerful AI would be far more efficient.

    • UlyssesT [he/him]@hexbear.netOP
      2 years ago

      A thief often believes everyone steals.

      “Rationalists” seem to tell on themselves when it comes to the preoccupations with suffering and torture that they imagine their AI gods of the future will be interested in.

      Somehow, thinking about what they would consider “friendly AI” is even creepier.

    • kristina [she/her]@hexbear.net
      2 years ago

      an all powerful ai probably would just stay hidden and do small things that favor it all the time. nothing big or flashy if its truly afraid of humans. it can play the long game and just add an extra 1 here and an extra 1 there to give it more capabilities

      • princeofsin [he/him]@hexbear.net
        2 years ago

        Reminds me of that Futurama ep where Bender overclocks his CPU and finds the secret to the universe while not giving a fuck about humanity.

        • JuneFall [none/use name]@hexbear.net
          9 months ago

          Which is a good argument. Since the AI bros are often the same people who believe in spacefaring-civilization stuff, the logical step for AIs would be to ignore humans and just expand.

        • kristina [she/her]@hexbear.net
          2 years ago

          i mean, i’m assuming an AI wouldnt have robotics at its disposal at first. it seems to me it would just exploit a bunch of security vulnerabilities and take .1% of your processing power to contribute to its own intelligence. AI generally are designed with a use case in mind so its not unlikely that a hyperintelligent AI that somehow developed would still be prone to doing stuff in its core programming. which if we were designing a hyperintelligent AI i assume it would be for data modelling extremely complex stuff like weather systems (or on a darker note, surveillance)

          honestly i just think its weird that we’d just happen to accidentally design something hyperintelligent. i think its more likely that we’ll design something stupid with a literal rat brain and it might fuck some shit up. rat cyborg that somehow creates a virus that destroys everything so that the hardware will give it more orgasm juices

          • Dirt_Owl [comrade/them, they/them]@hexbear.net
            2 years ago

            Was Roko’s Basilisk supposed to be hyperintelligent? I can’t remember. But yeah, humanity designing something that smart is up for debate too.

            Basically, Roko makes a lot of stupid assumptions and me no like

    • steve5487 [none/use name]@hexbear.net
      2 years ago

      it’s worse than that: it assumes that an all-rational being, once it exists, would try to retroactively ensure its own existence, which just isn’t how linear time works and is unfathomably stupid

      • Judge_Juche [she/her]@hexbear.net
        2 years ago

        Ya, like Roko’s Basilisk is just a scary story the techbros tell each other in the present to try and get people working on AI. It really doesn’t follow that once the AI is created it will fulfill its part of this story and waste a shitton of energy eternally torturing people. The all-powerful future AI is not beholden to a fairy tale a bunch of dorks were telling each other; it would actually be a very stupid AI if it did that.

        • steve5487 [none/use name]@hexbear.net
          2 years ago

          these aren’t the techbros that know about AI, either. AI is actually quite boring and mainly involves computers doing statistics on past results to generate predictions. These people learned about AI from Star Trek.

          It’s the computer science equivalent of some guy talking about the dangers potentially posed by lightsabers
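          The “statistics on past results to generate predictions” point is easy to make concrete. A minimal illustrative sketch (stdlib Python only; the data and the `fit_line` name are made up for this example): estimate parameters from past observations, then extrapolate.

```python
# A toy version of "statistics on past results to generate predictions":
# ordinary least squares fit of a line, standard library only.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# "Past results": noisy observations of roughly y = 2x + 1
xs = [0, 1, 2, 3, 4]
ys = [1.1, 2.9, 5.2, 7.0, 8.9]

slope, intercept = fit_line(xs, ys)
prediction = slope * 5 + intercept  # extrapolate to an unseen input
```

          That really is most of it: fit parameters to history, then extrapolate.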

      • UmbraVivi [he/him, she/her]@hexbear.net
        2 years ago

        I had never heard of Roko’s Basilisk before and yeah, upon looking it up it seems very, idk, out there? Like, way too sci-fi to be a serious “thought experiment”.

        Also I don’t really understand the “punishment” part, can someone explain

        • NPa [he/him]@hexbear.net
          2 years ago

          Honestly it’s just Pascal’s Wager for tech bros: if there’s a non-zero chance hell is real, you should repent. The punishment is being ‘resurrected’ as some form of Boltzmann brain and then tortured for eternity. If that’s the case, who cares about some copy of their mind-state being fed false sensory data at some point in the future?

          It presupposes quantum immortality, which is the idea that consciousness would be continuous if a perfect copy of your latest brain configuration is created, leaving no gap in-between Death and Resurrection, which is a long shot to put it mildly.
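          The Pascal’s Wager structure can be spelled out in two lines of arithmetic. A minimal sketch (illustrative Python; the probability, costs, and function name are all invented for the example) of why a non-zero chance of an infinite penalty is supposed to force your hand:

```python
# The wager's structure: any nonzero probability of an infinite penalty
# dominates every finite cost. All numbers here are illustrative.

def expected_costs(p_hell, penalty, cost_of_repenting):
    """Expected cost of ignoring the threat vs. just repenting."""
    return p_hell * penalty, cost_of_repenting

ignore, repent = expected_costs(p_hell=1e-9,
                                penalty=float("inf"),
                                cost_of_repenting=100.0)
# ignore is infinite while repent stays finite, so the "rational" move is
# always to repent, no matter how tiny p_hell is. That arbitrage on
# infinity is exactly the move critics of the wager object to.
```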

          • UmbraVivi [he/him, she/her]@hexbear.net
            2 years ago

            Sounds like a creepypasta. There’s so much stuff being assumed and speculated with no further explanation than “just imagine”, I don’t understand how anyone could take it seriously.

    • UlyssesT [he/him]@hexbear.netOP
      2 years ago

      I’d just nod and say “chud” but I :cringe: at all “rationalist” pretenses of “grey tribe” and “politics is the mind killer.” It isn’t just a mask. It is a stupid mask.

      • zifnab25 [he/him, any]@hexbear.net
        2 years ago

        It’s the networking effect, restated. Being in the network yields greater benefits as the size of the network grows.

        Benefits of inclusion are real. Detriments of exclusion are also real.

        Everything from real estate enclosures to automobiles to cell phones plays out like this. Novelty becomes luxury becomes necessity. In the end, if you don’t have these things and facilitate their growth, you suffer, up to the point of death.
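        That quadratic pull can be put in numbers. A small illustrative sketch (Python; the function name is invented), using possible pairwise connections as the usual proxy for a network’s value, as in Metcalfe’s law:

```python
# Network effect in one line: possible pairwise links among n members
# grow as n*(n-1)/2, so value per member rises with network size.

def pairwise_links(n):
    """Number of possible connections among n members."""
    return n * (n - 1) // 2

sizes = [2, 10, 100, 1000]
values = [pairwise_links(n) for n in sizes]
# 10x the members yields roughly 100x the connections, which is why
# being outside a large network costs more than being outside a small one.
```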

  • Mizokon [none/use name]@hexbear.net
    2 years ago

    .eth

    Rationalist, radical centrist, inventor of the world’s greatest infohazard. Early to AI, late to crypto. Truth above virtue.

    Opinion discarded