Saturday, June 23, 2007

Smart Module, Dumb Module



A rarely considered part of designing modules and interfaces is this: should the module be dumb, and do exactly what it's told, or should it be complex(+)? An interface like the classic C strcpy routine is a good example of simple:


char* strcpy (char* Dest, const char* Src)
{
    char* const Start = Dest;       /* remember where the string began */
    for (const char* p = Src; *p; p++)
    {
        *Dest++ = *p;
    }
    *Dest = '\0';                   /* copy the terminating NUL as well */
    return Start;
}


It does exactly and precisely what it's told: it doesn't cache the value, or decide to escape some characters in the Dest string++, or skip over quotes, or decide to stop if Dest is getting "too full".

An example of a complex module would be -- well, there aren't any good examples. Good examples should be short; complex modules aren't short. A complex version of string copying, though, would include escaping the incoming string to handle quoting rules, or adding line numbers, or converting HTML syntax to regular text.
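
For contrast, here's a sketch of what one such "complex" copy might look like. The name strcpy_escaped and its quoting rule are invented for illustration:


/* A "complex" copy: like strcpy, but it backslash-escapes any
   double quotes it finds along the way -- whether you wanted
   that or not. */
char* strcpy_escaped (char* Dest, const char* Src)
{
    char* const Start = Dest;
    for (const char* p = Src; *p; p++)
    {
        if (*p == '"')
            *Dest++ = '\\';    /* sneak in an escape the caller never asked for */
        *Dest++ = *p;
    }
    *Dest = '\0';
    return Start;
}
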

SP Rule of Complex Modules:

Never interface two complex modules


Why not? Because you'll spend all of your time trying to convince module 'a' that it should convince module 'b' to actually do what you want and not something clever that you don't want at all.

Here is a classic example of complex modules interacting:

In the early days of the IBM PC, there were two main kinds of graphics cards: the "Monochrome" card, which could only display text+++ but was very fast and very crisp, and the "CGA" card, which was slower and fuzzy but could display color and bitmaps.

The first programs would be written for one or the other.

Thus, we see two dumb modules connecting. Not ideal, because you have to buy the right program, but not bad. Note that the 'dumb' modules are in fact pretty full of stuff -- but their interactions with each other are dumb.

Next came two different phases. In one phase, the program writers wanted to sell their programs on both kinds of IBM PCs -- the ones with CGA cards, and the ones with Monochrome cards. They figured out that if they queried each card for some value, they could figure out what kind of card you had attached++++. And thus the programs became complex.
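
One common probe of the era worked roughly like this: poke a scratch value into a 6845 CRTC register and see whether it reads back at the monochrome port address or at the color one. This is a sketch, not any particular program's code, and it assumes the inp()/outp() port helpers that old DOS compilers provided:


#include <stdio.h>
#include <conio.h>   /* inp()/outp(), as in old DOS compilers -- an assumption */

/* Probe for a 6845 CRTC at the given index port: write a scratch
   value to the cursor-location-low register (R15) and see whether
   it reads back. */
static int crtc_present(unsigned index_port)
{
    unsigned char saved, readback;

    outp(index_port, 0x0F);                     /* select R15 */
    saved = (unsigned char)inp(index_port + 1);
    outp(index_port + 1, 0x5A);                 /* arbitrary test value */
    readback = (unsigned char)inp(index_port + 1);
    outp(index_port + 1, saved);                /* put the old value back */
    return readback == 0x5A;
}

int main(void)
{
    if (crtc_present(0x3B4))        /* the Monochrome adapter lives here */
        printf("Monochrome card detected\n");
    else if (crtc_present(0x3D4))   /* the CGA lives here */
        printf("CGA card detected\n");
    else
        printf("No recognizable card\n");
    return 0;
}
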

In the other phase, clever hardware people figured out that it would be keen to support both the CGA and Monochrome interfaces -- that way people would buy the fancy new graphics card and be able to use programs written for either CGA or for Monochrome cards. The new video cards would listen in at both interfaces, and report back correct information on both. And thus the video card became complex.

And now for the disaster: the complex program would probe for a Monochrome graphics card. The complex video card would detect the probe and respond correctly, switching into Monochrome mode. The program would detect the response, think that it was dealing with a Monochrome card, and switch into Monochrome mode. The two sophisticated systems would mutually agree to switch into the worse of the two modes.

And it's all because there were two complex systems talking to each other.

----------------------------------------------------

+My boss says, "call it sophisticated". I think that words like "sophisticated" should be reserved for Cary Grant and Audrey Hepburn - debonair, suave, and mouthing witty sayings over martinis. This doesn't describe any of the programs I've ever worked on.

++A pox on Microsoft Vista's competing disk and registry virtualization.

+++Young readers might not actually believe this, but it was true. Memory was too expensive in those days to waste on a bitmap display unless there was a very good reason. Instead the video card had about 4k of text memory; the internal circuitry included a "character generator ROM" that would expand the individual characters into the bitmap. You can still purchase these ROMs.
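
In software terms, the ROM's job looked something like this sketch (the glyph bitmap here is made up; the real ROM held bitmaps for all 256 characters):


#include <stdio.h>

/* One 8x8 glyph, roughly an 'A'. Each byte is one row of pixels. */
static const unsigned char glyph_A[8] = {
    0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00
};

int main(void)
{
    int row, bit;
    /* Expand the stored character into its bitmap, the way the
       character generator ROM did in hardware. */
    for (row = 0; row < 8; row++)
    {
        for (bit = 7; bit >= 0; bit--)
            putchar((glyph_A[row] >> bit) & 1 ? '#' : ' ');
        putchar('\n');
    }
    return 0;
}
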

++++Of course, you could have both attached. This was actually common in the Microsoft Windows 3.1 programming world because you would run Microsoft Windows on the graphics card and your debugger would run its interface on the monochrome card. The monochrome card was nice because once you sent data to it, the data would be displayed: the operating system didn't have to drive it or interpret the data.
