In a previous blog entry, I suggested a dichotomy between the approaches of Kay and Booch. Yet perhaps this is unfair, since both are working toward simplicity via high-level approaches. They do converge to some degree in the area of metaprogramming. Still, it seems to me that Kay (again, more in the metonymic sense) represents the idea of a single, simple solution, whereas Booch represents an approach where high-level processes trickle down to the lower layers. Where I am most unfair is in vilifying Booch for the "crimes" of a new software development model. Maybe I'm a paradoxical techno-luddite.
Speaking of models, in the October 2004 edition of Dr. Dobb's Journal, Gregory Wilson discusses development models in his Programmer's Bookshelf column. He points to a tool shift over the past 20+ years, implies a paradigm shift, and glosses over the increased complexity - somewhat hidden behind the ease with which tools and applications can be plugged into each other. His own background, he says, is the C->Emacs->Make->UNIX command line->CVS->character streams model, whereas the "new" one is Java->Eclipse->Ant->JUnit->Subversion->reflection->XML. Perhaps influenced by modern concepts - going back to what I previously attributed to the Booch-ites - he sees software development paradigms as analogous to the Standard Model in physics: everything from quarks and leptons to cosmology falls under this roof (personally, I think that the Theory of Everything may have been a better analogy).
Inverting Wilson's analogy, one could claim that, like software development, physics has become rather too complex of late. It reminds me of the shift from Ptolemy, whose geocentric view could account for planetary motion but whose model was overly complex and unwieldy. Copernicus dropped in with a simple model, and suddenly the layman could understand the system.
Back to Wilson's article: he reviews a book, Java Open Source Programming (JOSP), within the context of this new Standard Model. In the book, the authors demonstrate how to build "yet another online pet store," but also demonstrate how to do so with available open source tools. Wilson notes the chain of tools brought together (apparently seamlessly) before "paus[ing] to describe how they communicate via" another set of tools. "And we're not even at page 200 yet..." Hmmm... I take this as a bit of a hint that something is askew. Further, he adds, "The second half of the book goes back over the application, replacing the simple throwaway prototypes of the first half with versions that could carry their weight in the real world." I am playing Devil's Advocate a bit here, but this does seem to illustrate the point that we need simplification. Sure, there is a benefit in separating the various components of the process: one can choose the application that one favors (Maven vs. Make, for example). So is it just a preference whether one wants loosely coupled rather than tightly coupled applications and processes?
On a side note, I think that Wilson is taken in by the "pragmatic approach" to technology. He sees integration as an advance rather than a prerequisite. Thus, when reviewing another book, Effective Software Test Automation (ESTA), he gets excited about the discovery that one can use Excel as a user interface. He enjoys the inclusion of real-world applications within the examples in JOSP, is a bit critical of the low-level details in Coder to Developer, and loves the "authors' explanation of how to build it" in ESTA. There is a trend toward "what do I need to know to get my job done" that is probably reflective of the typical DDJ reader. Developers don't have time to ponder the philosophical implications of the architecture's ontology; rather, they must spend their time implementing durable designs within the framework they are handed. Yet it seems that we are in need of another Copernicus.