"Thus, the proper approach isn’t to “label” facts versus reality, or spend time developing the perfect “AI detection” tool. It really just comes down to being comfortable that certain things just can’t be known for certain, that it is a form of amusement to keep us plodding away."
Love this. Trying to pursue absolute truths is how you become cynical and depressed. You need to accept the known unknowns for what they are.
It’s almost a rite of passage for someone who excels in STEM in their youth to learn to accept this. Because on some level it’s a leap of faith off of the work done before you that you can’t just run the loop again, that you have to iterate past it. It’s not possible to do it all in your own lifetime, and depression is a function of thinking you’re unique from everyone else. Things are timeless for a reason.
I truly wonder if the “AI” psychosis is a similar function of those whose lives are defined by turning the state into a church. The same people who are oversocialized into never questioning the official narrative until The Atlantic tells them it’s ok, who can’t ever have an actual conversation. Politics and solitude go hand in hand - something that goes understated is that a vast majority of the demand for an AI partner is for boyfriends, not girlfriends or porn - and it makes me wonder if that’s the filter mechanism. If you can become “apolitical”, or at least not let it emotionally affect you, it’s probably not possible for AI to become an obsession either. It’s not exactly the redpill from the Matrix - maybe closer to how the thestrals work in Harry Potter. When you’ve experienced actual loss, the siren call has a lot less power, because you’re not obsessed with finding the meaning behind everything. Things happen, and they don’t make sense.
The thestrals reference is on point (especially relevant as I just finished my Christmas rerun of the Order of the Phoenix).
I would also add that AI has the potential to create a new class of Renaissance man, but we're still far off. Truly "internalizing" knowledge is extremely hard. I think what drives people insane is that there is a flicker of something there that signals to them that AI will unlock a new dimension of wisdom. But without innately understanding the sciences to the level that gets you to, say, a PhD-level expert, you're YOLOing your "thinking" off of what the AI has output. As you try to grasp at something you have a shallow understanding of, AI keeps egging you on until your expectation of what you think you should know diverges from what you actually understand. Not knowing the point at which this happens is where AI psychosis occurs.
I think AI can be great in helping you learn, but it also tends to egg you on by telling you how the most basic question you ask is "incredibly deep and thought-provoking." Where AI is today, I expect it to have the greatest impact on finance or jobs where you develop a thesis based on the level of understanding you get from a few months of research and test it out in a world where verification is difficult, but validation is possible.
Heh, I just finished my semi-annual zonk out in front of the Harry Potter movie marathon as well, seemed relevant.
I think the best way to understand AI is that it's a sort of neck-and-neck race between it and biology to invalidate the <97th percentile, as someone put it to me. The entire "applied"/"technical" expertise is almost certainly going to dissipate as a barrier within 3-5 years — the only people left are "agentic", those who think on an absurdly abstract scale but can pinpoint microstructure issues through a perfect understanding of the design of the system, without necessarily knowing anything about it. (It should be kind of obvious why Elon is so gung ho about AI, as the person who truly doesn't care about the monetary side of things.)
AI is good at plugging holes, but simply due to how these models are constructed, every layer of noise compounds rapidly. I don't see how it can fully manage a system of complex processes, I don't think the math or tech is there yet. But I am virtually certain that for any kid <10 years old today, college will be fully deprecated by the time they're 18. It is staggering how good of a tutor even an unoptimized model is. Kids who can rapidly pick up concepts and piece them together are going to lap everyone else.
If society doesn't erupt in war - which is far more likely - I don't think it's a coincidence that "democracy" is crumbling, and the two-party model is deprecated, right as the hardest IQ-floor technology ever created becomes generalizable. On some level, anyone in that range of the distribution feels it.
Hey, if you like procedural TV shows, you might also get a kick out of Leverage and/or Psych, as well as Numb3rs and White Collar.
I have seen bits of Psych and White Collar, definitely need to add them to the rotation.
I don't watch much TV anymore — been a few years, really — so it's good to have such a huge backlog.