Rationalist check-list:
- Incorrect use of analogy? Check.
- Pseudoscientific nonsense used to make your point seem more profound? Check.
- Tortured use of probability estimates? Check.
- Over-long description of a point that could just as easily have been made in one sentence? Check.
This email by SBF is basically one big malapropism.
This reads very, uh, addled. I guess collapsing the wavefunction means agreeing on stuff? And the uncanny valley is when the vibes are off because people are at each other's throats? Is 'being aligned' like having attained spiritual enlightenment by way of Adderall?
Apparently the context is that he wanted the investment firms under FTX (Alameda and Modulo) to completely coordinate, despite their being run by different ex-girlfriends at the time (most normal EA workplace), which I guess paints Elis' comment about Chinese harem rules of dating in a new light.
edit: I think the 'being aligned' thing is them invoking the 'great minds think alike' adage as absolute truth, i.e. since we both have the High IQ feat, you should be agreeing with me; after all, we share the same privileged access to absolute truth. That we aren't agreeing must mean you are unaligned/need to be further cleansed of thetans.
The amazing bit here is that SBF's degree is in physics; he knows the real meanings of the terms he's using for bad LW analogies.
They have to agree, it’s mathematically proven by Aumann’s Agreement Theorem!
Unhinged is another suitable adjective.
It's noteworthy how the operations plan seems to boil down to "follow your gut" and "trust the vibes", ranked above "Communicating Well" or even "fact-based" and "discussion-based problem solving". It's all very don't-think-about-it, let's all be friends and serve the company like obedient drones.
This reliance on instincts, or the aesthetics of relying on instincts, is a disturbing aspect of Rats in general.
Well, they are the most rational people on the planet. Bayesian thinking suggests that their own gut feelings are more likely to be correct than not, obviously.