The idea that hardware on networks should just be caches for movable process descriptions and the processes themselves goes back quite a ways. There's a real sense in which MS and Apple never understood networking or operating systems (or what objects really are), and when they decided to beef up their OSs, they went to (different) very old bad mainframe models of OS design to try to adapt to personal computers.
Sun Microsystems had the right people to make Java into a first-class language, and I believe it was the Sun marketing people who rushed the thing out before it should have gotten out.
I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification.
The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.
I had the fortune or misfortune to learn how to read fluently starting at the age of three. So I had read maybe 150 books by the time I hit 1st grade. And I already knew that the teachers were lying to me.
Understanding--like civilization, happiness, music, science, and a host of other great endeavors--is not a state of being, but a manner of traveling. This great road has no final destination. The journey itself is the reward.
When the Mac first came out, Newsweek asked me what I [thought] of it. I said: Well, it's the first personal computer worth criticizing. So at the end of the [iPhone] presentation, Steve came up to me and said: Is the iPhone worth criticizing? And I said: Make the screen five inches by eight inches, and you'll rule the world.