The Next Thing After Java Technology


Hadon Nash, a friend of mine, thought that I might be interested in this interview with one of the thinkers inside Sun. In the interview, Victoria Livschitz (why does it seem that most women in science and technology in America are not native?) says:

Processes are extremely common in the real world and in programming. Elaborate mechanisms have been devised over the years to handle transactions, workflow, orchestration, threads, protocols, and other inherently “procedural” concepts. Those mechanisms breed complexity as they try to compensate for the inherent time-invariant deficiency in OO programming. Instead, the problem should be addressed at the root by allowing process-specific constructs, such as “before/after,” “cause/effect,” and, perhaps, “system state” to be a core part of the language.

Interestingly enough, these “procedural” concepts that she mentions are where the pi-calculus is most frequently applied; maybe there’s a connection. Anyway, for anyone who’s ever done a bit of domain modeling, concepts related to time, or time itself, appear to be a bit difficult to capture. That jogged my memory, since I recall seeing and maybe reading several design patterns on handling time. Off the top of my head, I recall Kent Beck’s piece on “Time Travel” and Martin Fowler’s “History Patterns” and “Things that change with Time”. Of course, I’ve got a really bad memory (one reason that I blog), so I’ve got to revisit these articles to see what the fuss over modeling Time is all about. What confuses me is that I had thought that functional languages were the ones with the time-invariance deficiency, and that imperative languages, with their side effects, in some sense handled it. So it’s not clear to me whether Livschitz is entirely accurate.
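To make the fuss over modeling time a bit more concrete, here’s a minimal sketch (in plain Java, with names and details entirely my own rather than Beck’s or Fowler’s) of the sort of thing those patterns deal with: a property whose value changes over time, where you can ask what the value was as of a given instant instead of only seeing whatever the last assignment left behind.

import java.time.Instant;
import java.util.Map;
import java.util.TreeMap;

// A toy "temporal property": a value that varies over time and remembers
// its own history. Class and method names are mine, for illustration only.
public class TemporalValue<T> {

    // Each entry records the value that became effective at a given instant.
    private final TreeMap<Instant, T> history = new TreeMap<>();

    // Record a new value, effective from the given instant onward.
    public void put(Instant effectiveFrom, T value) {
        history.put(effectiveFrom, value);
    }

    // Answer "what was the value as of this instant?" rather than
    // "what is the value now?" -- the latest entry at or before asOf.
    public T get(Instant asOf) {
        Map.Entry<Instant, T> entry = history.floorEntry(asOf);
        return entry == null ? null : entry.getValue();
    }
}

An ordinary field overwrites its past on every assignment; something like this can still answer a question about “before”, which is roughly the kind of time-aware construct the quote seems to be asking for.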

The interview has some further interesting tidbits. One is a mention of some work inside Sun on “the next thing after java technology”:

Talk about the complexity and counter-intuitiveness of programming! What seems to be missing is a unified component architecture rich enough to cover the whole spectrum of needs, from distribution to reuse. I am convinced that it isn’t that hard to do. First, a notion of a “component” as a fully autonomous element of software must be strictly defined. An object, to be sure, is not a component, although many components may be implemented with objects. Then the rules of relation, composition, and aggregation of sub-components into higher-level components will be defined, in fully codifiable form. Familiar “Is-A” and “Has-A” relationships will be present, among many others. Finally, the rules of derivation will be defined and codified to enable a comprehensive reuse framework. Inheritance, for example, will be only one form of derivation made possible under the new model.

The distinctive take that I notice here is the attention to different forms of composition. I’ve mentioned earlier a “taxonomy of object composition” that should be relevant to this.
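Since the quote leans on the familiar “Is-A” and “Has-A” relationships, here’s a toy Java illustration (the interface and class names are mine, and say nothing about what Sun’s model actually looks like) of inheritance as just one form of derivation, with composition-plus-forwarding as another:

interface Logger {
    void log(String message);
}

// "Is-A": derivation by inheritance -- a ConsoleLogger is a Logger.
class ConsoleLogger implements Logger {
    public void log(String message) {
        System.out.println(message);
    }
}

// "Has-A": derivation by composition -- a TimestampingLogger has a Logger
// and forwards to it, adding behaviour without subclassing anything.
class TimestampingLogger implements Logger {
    private final Logger delegate;

    TimestampingLogger(Logger delegate) {
        this.delegate = delegate;
    }

    public void log(String message) {
        delegate.log(System.currentTimeMillis() + " " + message);
    }
}

The point of the quote, as I read it, is that a richer component model would codify many more such rules of derivation than just these two.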

The interview also led me to an interesting interview with Jaron Lanier, another Sun brain. Jaron is one of those guys who for the longest time has been lamenting that everything you know about computer science is wrong:

For the last twenty years, I have found myself on the inside of a revolution, but on the outside of its resplendent dogma. Now that the revolution has not only hit the mainstream, but bludgeoned it into submission by taking over the economy, it’s probably time for me to cry out my dissent more loudly than I have before.

Something I can relate to! He’s also got “some mysterious new high risk computer science project” on what he calls “phenotropic” computing.

“Phenotropic” is the catchword I’m proposing for this new kind of software. “Pheno” refers to “phenotype,” the outward appearance of something. “Tropic” means interaction. … In phenotropic computing, components of software would connect to each other through a gracefully error-tolerant means that’s statistical and soft and fuzzy and based on pattern recognition in the way I’ve described.

Now I’m not too keen on this “pattern recognition” business; it’s one of those fields in which every computer science department in the country seems to have a research project (e.g. image processing, robotics, soft computing, etc.). I avoid it because the results don’t justify the amount of intellectual effort that goes into them. In short, it doesn’t pay the bills. Nevertheless, Lanier’s got some interesting insights on protocols:

When you de-emphasize protocols and pay attention to patterns on surfaces, you enter into a world of approximation rather than perfection. With protocols you tend to be drawn into all-or-nothing high wire acts of perfect adherence in at least some aspects of your design. Pattern recognition, in contrast, assumes the constant minor presence of errors and doesn’t mind them. My hypothesis is that this trade-off is what primarily leads to the quality I always like to call brittleness in existing computer software, which means that it breaks before it bends.

The strategy of pattern recognition that tolerates minor errors, as an alternative to strict protocols, is similar to what I’ve alluded to in my earlier table (or should I say taxonomy) on loose coupling, albeit applied in a more ambitious context. “Protocols” (I used the term Schema) can be strictly adhered to using grammars or more loosely enforced using pattern matching (see Schematron). In fact, this is the route taken by many recent loosely coupled languages.
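Here’s a toy contrast of the two styles in Java, nowhere near Schematron’s scope and entirely of my own invention: a “grammar” that insists the whole message match its expected shape, versus a “pattern” that fishes out only the piece it actually needs.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LooseCoupling {

    // "Grammar" style: the whole message must match the expected shape,
    // so an extra or reordered field breaks the consumer.
    static boolean strictParse(String msg) {
        return msg.matches("name=\\w+;price=\\d+(\\.\\d+)?");
    }

    // "Pattern" style: fish out the one field we care about and ignore
    // everything else, so unanticipated fields bend rather than break.
    static String tolerantExtract(String msg) {
        Matcher m = Pattern.compile("price=(\\d+(\\.\\d+)?)").matcher(msg);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String msg = "name=widget;price=9.99;color=blue"; // a field nobody anticipated
        System.out.println(strictParse(msg));     // false: the grammar rejects it
        System.out.println(tolerantExtract(msg)); // 9.99: the pattern still works
    }
}

The strict consumer breaks the moment a field it never expected shows up; the tolerant one bends, which is more or less Lanier’s point about brittleness.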

My gut (or G.U.T.) leads me to believe that a study of loose coupling will give us insight into “the next thing after java technology”. That’s just strange; the word “coupling” itself has strong connotations of interaction.

