Interesting piece, and I think the thesis that culture is a limiting factor in converting AI access into growth/learning is correct. I'm confused by a few things, though:
- If AI progress continues the way it has been, what's the point of scaling such high-growth cultures? LLMs already perform better on knowledge tasks than a large portion of the population could. Is there a benefit to anyone outside a small slice of the population learning to use LLMs for growth in this manner?
- It seems like we're moving towards a world where having a broad knowledge base internalized is less valuable, one where we can access knowledge "just-in-time", the same way we manage (or at least managed, prior to pandemic-related supply chain issues) logistics. In a few years, your obsessions with esoteric questions may be just a new variant of entertainment, the new way people like you "distract themselves and wriggle out of work".
I suspect a key culture shift will be people moving from "just Google it" to "just ask ChatGPT". Once that happens, once a new generation grows up with LLMs and is as fluent with prompting them as millennials are with searching Google, and once AI companies make LLMs easier to use, what's the difference between the world we inhabit and the one you worry we won't?