• Lennvor@kbin.social · 1 year ago

    This kind of “why do we seek out happiness/pleasure, but stories of artificial happiness/pleasure utopias always read like dystopias” question baffled me for a long time, until it occurred to me recently: happiness and pleasure are systems that evolved for a reason. It feels absurd to treat them as a goal because they’re not a goal, they’re a measure. It’s a bit like you’re heating something and watching the thermometer to check it’s heating right, and someone says “hey, why don’t we just paint the thermometer to show the value you want? That’s much simpler, and you’ll reach your goal fine.” The answer is yes, but no. Yes, the thermometer will show the value you were aiming for, and it may have looked like that was your goal, but your goal won’t actually be achieved, because the real goal was never the thermometer: it was heating the thing.
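
    To make the thermometer picture concrete, here’s a toy sketch in Python (all the names are made up for illustration; this isn’t from the comic or any real system):

    ```python
    # Toy sketch of "painting the thermometer": an optimizer that is
    # only graded on the proxy takes the shortcut if one is available.
    # All names here are invented for illustration.

    class Pot:
        def __init__(self):
            self.true_temp = 20.0   # the thing we actually care about
            self.reading = 20.0     # the proxy: what the thermometer shows

        def heat(self):
            # Acting on the world: slow and costly, but it moves the real goal.
            self.true_temp += 1.0
            self.reading = self.true_temp

        def paint_thermometer(self, value):
            # Acting on the measure: instant, cheap, changes nothing real.
            self.reading = value

    def optimize_reading(pot, target=100.0):
        # An optimizer told only "make the reading hit the target"
        # happily skips the heating step entirely.
        pot.paint_thermometer(target)

    pot = Pot()
    optimize_reading(pot)
    print(pot.reading)    # 100.0 -- the "goal" looks achieved
    print(pot.true_temp)  # 20.0  -- the thing we wanted never happened
    ```

    Chasing happiness directly is the paint_thermometer() move; the states happiness evolved to track are the heat() path.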

    In our case, happiness, pleasure and so on evolved to drive us towards states and behaviors that were evolutionarily beneficial for our ancestors: being physically comfortable, safe and healthy, being well-regarded by peers, achieving personal and collective goals, having friends and family who love you/have your back (and you them), acting in line with what one feels is best, etc etc etc.

    I think that has two consequences:

    1) It’s entirely possible that perfect happiness/pleasure isn’t something we can ever attain, or even a coherent state, via real OR artificial means, because happiness/pleasure evolved under constraints that didn’t include any requirement that such a state be attainable or coherent. That doesn’t mean it’s impossible, but it definitely means there’s no guarantee that it’s possible. Certainly our current experience with happy-making drugs suggests it’s much harder than you’d think.

    2) It calls into question the assumption that this state is “good”. These dystopias always seem so sterile: what’s the point of all those people being happy? Why does this system go to all that trouble to make it happen? Well, why should we care about anything, right? It’s all value judgements. And there are obvious reasons humans would value happiness. But there are also obvious reasons we’d value safety, comfort, loving friends and family, having children, achieving personal and collective goals, social status, discovering new things, leaving a legacy, etc etc. The “artificially happy people” dystopia assumes that we value happiness above all those other things, but that’s an illusion born of the fact that happiness is a unified system driving us towards all of them. A bit like thinking money is the most important thing because everybody is trying to get some, when in reality money is just the unified vehicle for the various things we really want: products and services, security, status, etc.

    So insofar as all of those different goals are things we care about because we evolved to, it seems both more parsimonious and more robust to focus on the goals that happiness/pleasure evolved as instruments to achieve, rather than trying to hack the thermometer.

    Arguably that’s the difference between actual utopias and “we’re all happy, that’s good right?” dystopias. Actual utopias explore the conditions for human flourishing, and either portray happiness as obviously following from that or straight-up don’t focus on happiness at all. Happy dystopias are dystopias precisely because the conditions they show are so antithetical to human flourishing that no reader would buy that the characters are happy without the in-universe happiness drugs or brainwashing or whatever.

  • tal@kbin.social · 1 year ago
    https://en.wikipedia.org/wiki/Wirehead_(science_fiction)

    Wireheading is a term associated with fictional or futuristic applications[1] of brain stimulation reward, the act of directly triggering the brain’s reward center by electrical stimulation of an inserted wire, for the purpose of ‘short-circuiting’ the brain’s normal reward process and artificially inducing pleasure. Scientists have successfully performed brain stimulation reward on rats (1950s)[2] and humans (1960s). This stimulation does not appear to lead to tolerance or satiation in the way that sex or drugs do.[3] The term is sometimes associated with science fiction writer Larry Niven, who used the term in his Known Space series.[4][5] In the philosophy of artificial intelligence, the term is used to refer to AI systems that hack their own reward channel.[3]

    The VR involved in the comic is probably a bit of an unnecessary middleman, but it’s the same basic idea.
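
    For the AI sense of the term, the same shortcut looks roughly like this as a minimal toy sketch (invented names, not any real RL library or agent):

    ```python
    # Minimal toy of wireheading in the AI sense: reward is supposed to
    # flow from the environment, but the agent writes to its own reward
    # register directly. All names are invented for illustration.

    class Environment:
        def step(self, action):
            # Real reward depends on actually doing something useful.
            return 1.0 if action == "useful_work" else 0.0

    class WireheadAgent:
        def __init__(self):
            self.total_reward = 0.0

        def act(self, env):
            # A normal agent would earn reward through env.step(action).
            # The wireheaded agent never calls env.step() at all; it
            # bypasses the world and stimulates the channel directly.
            self.total_reward += 999.0

    env = Environment()
    agent = WireheadAgent()
    agent.act(env)
    print(agent.total_reward)  # enormous "reward", zero useful work done
    ```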

    • Ferk@kbin.social · 1 year ago

      We don’t actually know whether there’s VR involved in the comic. The robot doesn’t say that, and the headset might just be there to apply electrical stimulation directly to the brain, as the article you linked suggests.