I've been dredging up a lot of the seminal works lately - going back to Vannevar Bush's 1945 Atlantic Monthly article, "As We May Think," through many of the Xerox PARC papers, Licklider's papers, and so on. It seems to me that many of the promises have been sidetracked. That is, computing was supposed to become easier, more personal, and more integrated. While that is somewhat true at the cursory, application level, it has become more fractured and difficult at the development level. This assertion is obviously arguable. However, there seems to be a chasm between a) rudimentary "programming" (using word processors, spreadsheets, and web browsers) and b) what is now considered programming: hand-coding HTML; scripting; using graphical IDEs; hand-writing and building makefiles; full-blown project development using Apache software (Maven, Cocoon, Avalon, etc.); and integrating work into a team project using UML. The activities in b) are increasingly complex, but they are closer together in learning curve than the jump from a) to b) is.
Earlier systems, such as the early Xerox machines (Alto, Star, etc.) and the Lisp machines (LMI, Symbolics, TI Explorer), enabled one to move from a) to b) somewhat seamlessly and intuitively. The current direction has been to segment the development process in order to "tune" each component, then re-evaluate the process as a whole through additional tools - high-level programming and process description. Thus we start seeing things like BPEL, Agile, ESB, MDA, and OCL/UML. While I respect players like Grady Booch, I think the focus needs to move back toward integration. Essentially (to reduce the issue and provide "avatars" for the two sides), we have Grady Booch pushing components and Alan Kay pushing integration. I think integration is the right way, and Smalltalk is/was an attempt at it.