The-podcast guy recently linked this essay; it's old, but I don't think it's significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • Tomorrow_Farewell [any, they/them]@hexbear.net · 7 months ago

    I’m talking about visual memory, what you see when you recall it, not about image recognition

    What is ‘visual memory’, then?
    Also, on what grounds are you going to claim that a computer can’t have ‘visual memory’?
    And why is image recognition suddenly irrelevant here?

    So far, this seems rather arbitrary.
    Also, people usually do not keep a memory of an image of a poem if they are memorising it, as far as I can tell, so this pivot to ‘visual memory’ seems irrelevant to what you were saying previously.

    I’m suggesting that it’s not linked lists, or images or sounds or bytes in some way, but rather closer to persistent hallucinations of self-referential neural networks upon specified input

    So, what’s the difference?
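
    For what it’s worth, that kind of ‘self-referential network hallucinating a pattern from a cue’ is itself something you can sketch in a few lines of ordinary code. Here is a toy Hopfield-style sketch; the pattern sizes and noise level are made up purely for illustration:

    ```python
    import numpy as np

    # Toy Hopfield-style associative memory: a self-referential network that,
    # given a partial/noisy cue, settles back into a stored pattern ("recall").
    rng = np.random.default_rng(0)
    patterns = np.sign(rng.standard_normal((3, 64)))  # three stored +/-1 patterns

    # Hebbian weights: the patterns are baked into the network's structure.
    W = sum(np.outer(p, p) for p in patterns) / patterns.shape[1]
    np.fill_diagonal(W, 0)

    def recall(cue, steps=20):
        """Feed the network its own output until it settles."""
        state = cue.copy()
        for _ in range(steps):
            state = np.sign(W @ state)
            state[state == 0] = 1
        return state

    # Corrupt a stored pattern, then let the network "hallucinate" the original back.
    noisy = patterns[0].copy()
    flip = rng.choice(64, size=12, replace=False)
    noisy[flip] *= -1
    print(np.array_equal(recall(noisy), patterns[0]))  # usually True for mild noise
    ```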

    which also mutate in place, by themselves and by recall, yet not completely wildly

    And? I can just as well point out that hard drives and SSDs suffer from data corruption over time, and that a computer can be designed so that its memory changes every time it is accessed. Now what?
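
    For that matter, a memory that perturbs itself every time it is recalled takes only a few lines to build; the class name and probabilities below are invented purely for illustration:

    ```python
    import random

    class DriftingMemory:
        """Storage that degrades slightly every time it is read - like
        reconsolidation, or a destructive-read memory cell. Illustrative only."""

        def __init__(self, data: bytes, flip_prob: float = 0.01):
            self._cells = bytearray(data)
            self._flip_prob = flip_prob

        def read(self) -> bytes:
            snapshot = bytes(self._cells)
            # The act of recall perturbs what is stored.
            for i in range(len(self._cells)):
                if random.random() < self._flip_prob:
                    self._cells[i] ^= 1 << random.randrange(8)
            return snapshot

    mem = DriftingMemory(b"the poem I memorised")
    print(mem.read())   # the first recall is faithful
    for _ in range(200):
        mem.read()
    print(mem.read())   # later recalls have drifted
    ```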

    ‘Memory is like a growing tree or an old house’ is not exactly the most helpful metaphor, but probably closer to what it does than a linked list

    Things that are literally called ‘biological computers’ are a thing. While not all of them feature the ability to ‘grow’ memory, it should be pretty clear that computers have this capability.
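
    And ‘memory that grows like a tree’ is also an utterly ordinary data structure on a conventional computer; a minimal sketch (a hypothetical example, nothing specific to biological computers):

    ```python
    class TrieNode:
        """A memory that literally grows tree structure as things are stored."""
        def __init__(self):
            self.children = {}
            self.is_word = False

    def remember(root: TrieNode, word: str) -> None:
        node = root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())  # grow a new branch on demand
        node.is_word = True

    def recall(root: TrieNode, word: str) -> bool:
        node = root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_word

    root = TrieNode()
    for w in ["tree", "trie", "house"]:
        remember(root, w)
    print(recall(root, "trie"), recall(root, "attic"))  # True False
    ```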

    • plinky [he/him]@hexbear.net (OP) · 7 months ago

      What is visual memory in the informational analogy, then, do tell me? Does it have a consistent or persistent size, shape, or anything resembling a BMP file?

      The difference is that neural networks are bolted onto structures, not information.

      • Tomorrow_Farewell [any, they/them]@hexbear.net · 7 months ago

        What is visual memory in the informational analogy, then, do tell me?

        It’s not considered a special type of memory in this context. Unless you have a case for the opposite, this stuff is irrelevant.

        Does it have a consistent or persistent size, shape, or anything resembling a BMP file?

        That depends on the particular analogy.
        In any case, this question seems irrelevant and rather silly. Is the force of a gravitational pull in models of Newtonian physics constant? Does it have a shape? Is it a real number, or a vector in R^2, or a vector in R^3, or a vector in R^4, or some other sort of tensor? Obviously, that depends on the relevant context regarding those models.
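
        To spell that out, the ‘same’ gravitational pull in the Newtonian model can be written as a plain real number or as a vector in R^3, depending on the formulation you happen to be using:

        ```latex
        % Magnitude only (a real number):
        F = \frac{G m_1 m_2}{r^2}

        % Vector form in \mathbb{R}^3 (direction included):
        \vec{F}_{12} = -\frac{G m_1 m_2}{\lVert \vec{r}_{12} \rVert^{3}} \, \vec{r}_{12}
        ```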

        Also, in what sense would a memory have a ‘shape’ in any relevant analogy?

        The difference is that neural networks are bolted onto structures, not information

        Obviously, this sentence makes no sense if it is considered literally. So, you have to explain what you mean by that.