layer-8
Wednesday, April 12, 2006
  windows and unix, oil and water
Some days go by without my even remembering that there are different software platforms and different system philosophies. In this increasingly service-oriented world, everything quickly becomes a service, and platforms are readily available to do whatever they do best.

It is well known that when all you have is a hammer, pretty much everything starts looking like a nail. So the weltanschauung of a distributed-systems architect is, often, that everything is a distributed system - even your desktop.

For years, I've had two or three machines as my "desktop". Usually, one of them is a headless Linux box under the desk (does that still count as "desktop"?). Another is usually a Windows box that I run Microsoft Project, Visio and sometimes Outlook on. The third is a small Solaris box in the development datacenter with a big sticker with my name on it.

Why would an architect need - or want - three boxes for daily usage?
First, you must answer the question: what does an architect do?

Architects don't spend all their time "architecting" things away. Beneath the architect's skin, there's someone who needs to hack, to explore, to experiment.
Software and systems architecture is a relatively young field in a fast-changing environment. Designing software and systems is not a close parallel to designing houses: building materials and techniques have developed over the last 20 years, and so have people's requirements, but those changes pale in comparison with the yearly changes in the software world.
That makes software architecture a partly experimental discipline. The only architects who can do without the hands-on component are the ones with large teams to do the hands-on bit for them - and even then, it's like being a painter who never touches a brush. Most architects have the hacker bug in them, so we tend to enjoy the tinkering.

Living daily with all the different platforms gives you the true notion of how they can be best combined, so eventually you start looking at them as complements rather than alternatives. Even at home, where we run a Windows box, a couple of Macs (one OS X, one System 9), a FreeBSD and a Debian box, platforms tend to get used for what they do best (we run graphics and audio stuff on the Mac, the Linux box is for development and pen-testing, the Windows box runs OpenOffice, Firefox and a few more things).

But sometimes you have to step outside this comfort zone.

Today, I had to set up a way to run remote programs on three dozen Windows servers. I tried a number of things, from Sysinternals psexec to Windows Scripting. The solution that proved most stable and worked best, in the end, was an old friend - ssh.
So I went and installed and configured OpenSSH on 36 Windows boxes in less than an afternoon, complete with public key authentication, and was quite happy with the result.
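With public keys in place on every box, running a program across all of them is just a short shell loop. A minimal sketch - the host names, user name and remote command here are made up for illustration, and the ssh call is wrapped in echo so it stays a dry run:

```shell
# Hypothetical host list and remote command - substitute the real
# Windows servers (running OpenSSH) and the program to execute.
HOSTS="win01 win02 win03"
CMD="cmd /c ver"

for h in $HOSTS; do
  # BatchMode=yes forbids password prompts, so a host with a missing
  # key fails fast instead of hanging the loop. echo = dry run.
  echo ssh -o BatchMode=yes "admin@$h" "$CMD"
done
```

Drop the echo to run it for real; append "&" to the ssh call and a final "wait" to fan the commands out in parallel.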

And then I remembered why I dislike the Windows platform so much.
It has to do with expectations.

An API is an API. A system API, even more so.
The behaviour of a system call is a ponderous thing - it cannot be changed on a whim, because there are whole layers of software that depend on it.

Unix-heritage systems understand this, so they implement system functions with the utmost care, sometimes taking endless time to discuss, review and approve a small change. This upsets a lot of people, but changing the API would upset a lot more.
Linux takes a somewhat different approach. The kernel and libraries evolve faster, but there is a serious peer-review process on the kernel mailing lists, with well-chosen and responsible people overseeing the whole thing. So change happens, while disruption is kept to a minimum.

On Windows, they must have a process. Some process. And it's probably good, as processes go. But the outcome sometimes sucks.
Somewhere between Windows 2000 and Windows XP, someone decided to change the behaviour of the RunAs call regarding system-wide resources, such as mapped drives.
In Windows 2000, if a user maps a drive, processes that run under that same account will find the drive mapped and will be able to use it. That lets you have a session owning the desktop, under user fred, map a drive as z:, and then ssh into the box as fred to check on the drive's status (whether it is still connected, whether you can write to it, what the free space is).
In Windows XP, you can no longer do that. The RunAs call doesn't carry over user fred's environment or mapped resources. If you want to check on the drive, you have to map it again - which is neither what you need nor what you wanted in the first place.
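To make the difference concrete, here is the kind of check that works on Windows 2000 and fails on XP. The server name and user are hypothetical, and the ssh call is echoed as a dry run:

```shell
# On the Windows 2000 console, user fred has already mapped the drive:
#   net use z: \\fileserver\share
# Remotely, the same user can then inspect that mapping over ssh.
# On XP the sshd-spawned session gets a fresh environment, so z:
# simply isn't there.
HOST=winbox   # hypothetical server name
USER=fred

echo ssh "$USER@$HOST" "net use z:"   # dry run - drop echo to execute
```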

I'm not even arguing whether the decision is good or bad. It doesn't look like it adds much security: if the original user has the power to access a resource, then any other program running as the same user can access it with the exact same credentials. But it does prevent you from inspecting the logged-on user's environment, or from spawning helper programs in the background to reconnect the drive (since they are launched by SYSTEM and use the RunAs function to change into the user's identity).

The outcome of it all is that I'll have to go find yet another solution for my problem. I'm sure I will find a good one; that's not the part I'm upset about.

But this close encounter with the things that are wrong in Windows - with Microsoft's change-management process for the operating system, and with the pain of supporting complex software on top of an operating system that keeps changing for no apparent reason - reminded me why I've chosen Unix over Windows for the last 20 years.
 
layer-8: Juliao Duartenn's thoughts on people and technology
