Users are Luddites

That’s right. Users are Luddites. They just don’t want change, not even if it’s an improvement. The act of change means adjustment and, as creatures of habit, the last thing we want is change. There are exceptions, of course, where users demand change because their need is just that dire. Usually, though, you will still hear complaints about certain things “not being like they were before.” Few programmers will disagree with this take, having quite likely been subjected to precisely these sorts of things before, so I shall not bother with examples here.
Lest anyone think that this is merely an l33t (or someone who regards himself as such) talking down to Joe Point ‘n Click, this is not a tendency confined to the average user. Much to the contrary, there are many very avid geeks who feel the same way; they just feel that way about more advanced topics. Take, for example, Perl 6, currently in progress. There has been a ruckus in some quarters about changes being made to the language, particularly changes to the operators. The reason? They don’t want a change. One Slashdotter went so far as to call Larry Wall “arrogant” for said changes. That strikes me as patently absurd. It isn’t arrogant for someone to make a change to the language he invented. It IS arrogant for someone else to try and tell that person not to change his own stuff. Moreover, it’s OSS. Fork it if you want to. But I digress. Other examples include FreeDOS (originally created because someone didn’t want to give up MS-DOS, but now used for other, more useful things, like games) and various APL revival attempts.
Users are users, and users are Luddites. Add to that the fact that programmers/geeks/admins are users, and we get geeks->users->Luddites. Interesting path, no doubt.

Professional

I had a compsci professor who said that we, as computer scientists, are professional problem solvers. True, quite true, but that isn’t the most concise way to define the profession. What we really are is professional junk-takers. That’s right, folks. I know, I know: all professions get this to some degree. No matter what you do, you’re going to have to put up with some of it; it’s just more so in an IT-related field. The reason is simple: the vast majority of people who use a computer do not understand it. That would be fine, except that they do not think of it as what it is: a tool they do not understand. Most people don’t really know how a microwave works, either, but they do understand that it is a tool. Nothing more, nothing less.

Sadly, it is not so in the world of computers. The average user thinks of the computer not as a tool but as a magical, mystical artifact, and of those who harness its power not as craftsmen but as wizards. Yes, behind our monitors, surrounded by a sea of blinking lights, we ply our own flavor of black magic. We could muster the world, if we would, and lay it at their feet, but we don’t. In their eyes, we simply hide in our dark lairs (whether we do or not) and come up with reasons why we can’t do what they wish. We delay, we moan, and we dodge. Why don’t we just do what they want? Surely it can’t be THAT hard? And that’s the problem. It IS that hard, but because they do not understand the issue at hand, we just can’t make them understand what goes into even the most trivial or ubiquitous application they use.

I wish I could say that education were the answer. I really do, but it seems to me that although every generation gets more technically aware than its predecessor, it doesn’t really help, because every generation adds its own technical wonders to the world and layers marble over the wood that already was. The more there is, the less they understand, and the more they assume that we are wizards uttering incantations over a cauldron.

The School of DIY

I’ve been reading some papers on lambda lifting. At its core, lambda lifting is the process of elevating (i.e. lifting) anonymous and inner lambdas to the global scope, renaming and revising the program along the way. The end result is that the output will be functionally equivalent to the input, but more or less scopeless. It’s interesting stuff for someone who has never been inside the bowels of a compiler for a functional language before.
It does, however, bring a couple of things to mind: first, how hard it can be to find out on your own what is often considered basic in the field you are studying, and second, how much more insight doing so can give you into the topic at hand.
The real reason it can be so hard is that when you have a question, you don’t know the answer, and so you do not know in what form or from what direction it will come.
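To make the idea concrete, here is a tiny sketch in Python (not Scheme, and certainly not Ocean’s actual representation; all names are illustrative): the inner function below captures a free variable, and lifting it to the top level means turning that free variable into an explicit parameter.

```python
# Before lifting: an inner function that closes over the free variable n.
def make_adder(n):
    def add(x):          # 'add' is local and captures n from the enclosing scope
        return x + n
    return add

# After lifting: 'add' becomes a top-level, scopeless function. Its former
# free variable n is now an explicit parameter, and the original definition
# site passes n along at each call.
def add_lifted(n, x):
    return x + n

def make_adder_lifted(n):
    return lambda x: add_lifted(n, x)

# Both versions behave identically.
print(make_adder(2)(3))         # 5
print(make_adder_lifted(2)(3))  # 5
```

The transformation changes where the function lives, not what the program computes, which is exactly the “functionally equivalent but scopeless” property described above.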

Take the example above. I understand the concept of lambda lifting, and now I am trying to extract a specific enough algorithm to code it in stage 1 of Ocean’s compiler, but I didn’t start out looking for lambda lifting material. I was reading Guy Steele’s paper on Rabbit, specifically the section that reviews the compilation strategy. I figured that, although compiler technology for functional programming had come a long way since then, it would be an excellent starting place. In that section, Steele casually says that the input Scheme function is alpha renamed. Alpha renaming is simple enough to understand (it is the rule in the lambda calculus that states that changing variable names does not change the fundamental program), but I could find nothing that really aided me in figuring out how to actually implement it as an algorithm. I sat down to code it up myself and realized the daunting complexity. A great deal of wandering finally revealed that lambda lifting was the cure. No doubt, if I had been doing Ocean as a master’s thesis, I could have asked my advisor, or if I had taken a related class, I might have already known the answer, but I wasn’t and I hadn’t. So the knowledge was harder to come by. Yet I am fully convinced that knowledge that was painful to gain is harder to lose. It has a distinct tendency to ingrain itself more fully into the psyche. Besides that, there is the benefit that a bit of wandering, within the field of study mind you, gave me a great deal more insight and exposure than I would otherwise have had. I love to learn for the sake of learning, but I have a harder time doing that if I cannot focus the energy on something in particular. Long and short, I do go to school; it is the Do It Yourself School, which has a very close working relationship with the School of Hard Knocks.
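For what it’s worth, the renaming step itself can be sketched over a toy lambda-calculus AST. This is my own illustrative Python, not Steele’s algorithm and not Ocean’s code; the tuple encoding and helper names are made up for the example.

```python
import itertools

# Toy AST: ('var', name) | ('lam', param, body) | ('app', fn, arg)
_counter = itertools.count()

def alpha_rename(expr, env=None):
    """Give every bound variable a globally unique name.

    Free variables are left alone; bound ones are consistently renamed,
    so the meaning of the program is unchanged.
    """
    env = env or {}
    tag = expr[0]
    if tag == 'var':
        return ('var', env.get(expr[1], expr[1]))
    if tag == 'lam':
        _, param, body = expr
        fresh = f'{param}_{next(_counter)}'
        return ('lam', fresh, alpha_rename(body, {**env, param: fresh}))
    if tag == 'app':
        return ('app', alpha_rename(expr[1], env), alpha_rename(expr[2], env))
    raise ValueError(f'unknown node: {expr!r}')

# Two nested lambdas that both bind x come out with distinct names,
# and the body refers to the inner binder, as it should.
shadowed = ('lam', 'x', ('lam', 'x', ('var', 'x')))
renamed = alpha_rename(shadowed)
```

Once every binder is unique, the later lifting pass never has to worry about one variable shadowing another, which is precisely why Rabbit does it first.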
I am certainly not advocating making learning harder than it should be, but I do think that the value of having to, in some fashion, reblaze the path that was cut is often undervalued.

Packaging: Debian vs. Gentoo

If you have read this blog, you know that I have been playing with Debian packaging. I have created packages on both Debian and Gentoo systems and so, here and now, I offer my final manifesto on how they compare.

So, what is the difference? Debian packaging basically archives an installed version of the tree (etc/, usr/, usr/bin, etc.) with a couple of text files that describe the package. What it is, what it depends on, etc. Gentoo doesn’t really package anything. Ever. Gentoo’s Portage system is a network of Python and bash scripting that tells the system where to get the parts for the package, how to configure it, how to build it, and how to install it.

There is a deep philosophical divide here. For the package creator, the tasks are almost completely different. In both cases you have to build the package, but in the case of Debian that is almost all you are doing. You are going to build the package, roll the install into one tarball and pass it around. In Gentoo, you are scripting the build for every single user thereafter. You don’t give them a finished product, you give them a machine-readable build manual. From the package creator’s point of view, it can be a pain in the neck either way. From the user’s point of view it’s a case of convenience versus flexibility. Debian packages are nice and easy to install. No fuss, no waiting. It’s just done. Gentoo’s portage system offers endless configurability and flexibility to build your system the way you want it. Ultimately, it comes down to user preference. Do you want the flexibility or do you want it now? There is no right answer here.
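To illustrate the “machine-readable build manual” point, here is roughly what a minimal ebuild for an ordinary autotools package looks like. The package name and every field value here are made up, and the exact conventions vary between Portage versions, so treat this as a sketch rather than a working ebuild:

```
# foo-1.0.ebuild -- a hypothetical, minimal ebuild for an autotools package
DESCRIPTION="A made-up example package"
HOMEPAGE="http://example.org/foo"
SRC_URI="http://example.org/foo/${P}.tar.gz"
LICENSE="GPL-2"
SLOT="0"
KEYWORDS="x86"
DEPEND=""

src_compile() {
    econf || die "configure failed"   # runs ./configure with the right paths
    emake || die "make failed"        # runs make
}

src_install() {
    make DESTDIR="${D}" install || die "install failed"  # install into the sandbox, not /
}
```

Portage runs these phases for every user who emerges the package; contrast that with a .deb, which ships the bits that a build like this produced exactly once.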

There are other concerns for the package creator. With Gentoo, you are more or less recording the build process. Sure, you sometimes have to do some tinkering to get it to play nice in the sandbox, but those times are relatively rare, and for good old-fashioned autotools software it is a piece of cake. Maybe it’s just me, but Debian seems far more fussy. There has been some software where I just said, “Oh, heck. I’ll just make install,” rather than fiddle any more with the package.

All in all, I like Gentoo’s system better. It is simple, clean, elegant, and flexible. That’s not to say it is without caveats, but my experience with it has been smoother than with Debian. But, hey, who knows? I may write here before too long about the glories of deb packages, but I doubt it.

Chicken Package–Updated

In my continued travels on the subject of Debian packages, I came across a discussion of checkinstall. Basically, the long and short of it is that checkinstall is quick and easy, but the packages will not necessarily work all that well in a clean-room environment. This, combined with the fact that the paths in the previous package were not quite right, prompted me to build a new package “the right way.” It is now available.
Before I talk about the solution, I wanted to mention a couple of things that researching this problem brought into greater perspective for me.

That out of the way, here is what I did. First, I downloaded and extracted the source and cd’d into the base source directory. Then I ran:

$ dh_make -n -e my-email@my-domain.dom

This command generates default build scripts for the Debian package (note: you will need to ensure that the dh_make package is installed). The next step is to edit the control file, which is created under debian/control by default. The main changes needed are to the description and the dependencies. The documentation on the specifics can be found here; as it is relatively straightforward, I will not go into detail on it. Once you have finished with that, you may need to edit the various rules in the rules file. If your software is a fairly standard automake build, then you probably don’t need to do anything. If, on the other hand, you were using NAnt, Ant, HMake, or some other custom build system, you would have to modify the rules file to build properly. After all of this is squared away, simply run

$ fakeroot debian/rules binary

If the previous step was completed correctly, the build will spit a shiny new Debian package out in the parent directory. A good old-fashioned dpkg -i run and the package will install on your system.
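For reference, the control file edits mentioned above amount to filling in fields like these. This is a hypothetical example; Chicken’s real section, dependencies, and description would need to be filled in properly:

```
Source: chicken
Section: devel
Priority: optional
Maintainer: My Name <my-email@my-domain.dom>
Build-Depends: debhelper (>= 4.0.0)
Standards-Version: 3.6.1

Package: chicken
Architecture: any
Depends: ${shlibs:Depends}
Description: illustrative one-line summary goes here
 The extended description goes on these indented lines. Everything in
 this example is placeholder text, not the package's real metadata.
```

The Depends line with ${shlibs:Depends} is the part that makes the package behave in a clean-room environment, which is exactly where the checkinstall-built version fell down.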