- cross-posted to:
- [email protected]
guy recently linked this essay. it's old, but i don't think it's significantly wrong (despite the gpt evangelists). also read weizenbaum, libs, for the other side of the coin
Why would that horrify us? That’s how science works. We observe the world, form hypotheses based on those observations, develop experiments to test those hypotheses, and build theories based on whether experimentation confirms them. Phlogiston wasn’t real, but the theory conformed to the observations made with the tools available at the time. We could have this theory of phlogiston, and we could experiment to determine its validity. When new tools allowed us to observe novel phenomena, the phlogiston theory was discarded. Science is a philosophy of knowledge: the world operates on consistent rules, and those rules can be determined by observation and experiment. Science will never be complete. Science makes no definitive statements. We build theoretical models of the world, and we use those models until we find that they don’t agree with our observations.
*because confidently relying on the model’s (in this case informational) predictions, like “ooh, we could run the brain in computer space, no problem”, is not exactly making a good scientific prediction. The good scientific prediction is that the model is likely garbage until proven otherwise, and thus shouldn’t be the end-all be-all.
But then, if you take the information-processing model, what exactly does it give you in understanding the brain? The author’s contention is that it’s a hot-garbage framework: it doesn’t fit how the brain works, your brain is not a tiny HDD with RAM and a CPU, and as long as you think it is, you will be chasing mirages.
Yes, neural networks are much closer (because they are fucking designed to be), and yet even they have to be force-fed random noise to introduce fuzziness into their responses, or they’ll do the same thing every time. Reboot and reload a neural net, and it will do the same thing every time. But the brain is not just connections of axons; it’s also the extremely complicated state of the neuron itself, with point mutations, DNA repairs, expression levels, random RNA garbage flowing about, lipid rafts at synapses, vesicles going missing because the microtubules decided to chill for a day, the hormonal state of the blood, the input from the sympathetic nervous system, etc.
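To make the determinism point concrete, here's a minimal sketch in plain Python (no ML library; the tiny "network" and its weights are made up for illustration). A net with fixed weights maps the same input to the same output every single time; any "fuzziness" in its responses has to be injected on purpose, e.g. by sampling from the output distribution instead of always taking the most likely answer.

```python
import math
import random

# Fixed "trained" weights: 4 inputs -> 3 outputs. Values are arbitrary.
W = [[0.2, -0.5, 0.1],
     [0.8, 0.3, -0.4],
     [-0.1, 0.6, 0.7],
     [0.05, -0.2, 0.9]]

def forward(x):
    """One linear layer followed by a softmax: a pure function of x and W."""
    logits = [sum(xi * wij for xi, wij in zip(x, col)) for col in zip(*W)]
    m = max(logits)                          # subtract max for stability
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]             # probabilities summing to 1

x = [1.0, 0.5, -0.2, 0.3]

# "Reboot and reload": same weights + same input = identical output, always.
assert forward(x) == forward(x)

# Variation only appears when randomness is added deliberately:
probs = forward(x)
choice = random.choices(range(3), weights=probs)[0]  # sampled, not argmax
```

This is the sense in which the noise is force-fed: the `random.choices` call at the end is bolted on from outside; nothing in the network itself ever varies.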
We haven’t even fully simulated one single cell yet.