The idea that hardware on networks should just be caches for movable process descriptions and the processes themselves goes back quite a ways. There's a real sense in which MS and Apple never understood networking or operating systems (or what objects really are), and when they decided to beef up their OSs, they went to (different) very old, bad mainframe models of OS design and tried to adapt them to personal computers.
[Computing] is just a fabulous place for that, because it's a place where you don't have to be a Ph.D. or anything else. It's a place where you can still be an artisan. People are willing to pay you if you're any good at all, and you have plenty of time for screwing around.
I had the fortune or misfortune to learn how to read fluently starting at the age of three. So I had read maybe 150 books by the time I hit 1st grade. And I already knew that the teachers were lying to me.
Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.
The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.
School is basically about one point of view: the one the teacher has or the textbooks have. They don't like the idea of having different points of view.