This has been my experience. I have a CS degree from ~20 years ago because that's what you did -- even if you already knew the material, that piece of paper is what opened the doors. Like most people of my era and earlier, the degree was a formality because you had already been programming and building things as a kid before you ever chose the field.
It's different now. "Learning to code" is treated more like vocational training -- like training a cashier -- and the product is a junior dev of questionable competence. Rote memorization only goes so far; you need a knack for abstract logical thinking to be truly valuable, because every project and system has its own quirks. Those who can't work that way are the ones AI is really going to replace, and they'll be forced to move on to whatever the next equivalent of "learn to code" is. But like you said, the world does need junior developers with potential.
Where I am, we're being pushed heavily to leverage the AI tools and integrate them into our workflow. It's a dangerous path right now because they're often subtly wrong yet superficially very convincing. Junior developers can rarely tell when the output is wrong, so they run with the results, introducing glaring security holes among other problems. Senior developers can get real benefit from these tools, but much of that benefit comes from selective application -- knowing which tasks the tool can actually handle. It's very easy to spend more time verifying and fixing the results than if you had just done the work yourself.
Ultimately, in my opinion, the vast majority of people are incapable of doing much more than a mediocre job at anything, whether it's food service or programming or electrical work or writing. The world _is_ a living version of the Gell-Mann amnesia effect -- many people appear competent only because the observer lacks knowledge of their field. The manual workers will be fine for now, but the knowledge workers are at a precipice. That's the part of the world AI is really going to shake up.