Unix vs Microsoft Windows - The Religious Debate

Introduction
Windows or Unix?
The Windows GUI and all those configs
Which way should one go?
Gaining some control of MS Windows
Microsoft's E-Mail products
Microsoft's web server
Java
Microsoft and Intel - TCPA/Palladium project
Windows and Linux and relative virus susceptibility


Introduction

This comparison of the Unix and Microsoft NT/Win2K/XP operating systems is written from the point of view of one technical user - someone who spends some time with system administration, some with scientific and database application programming, and the remainder doing documentation, e-mail, web browsing, and following a small number of news groups.

Like many over the last few years, I've watched Microsoft's push from the desktop into the server market in particular with some misgivings. Probably the greatest fear of all has been the implicit threat to open standards. The networking aspects of commercial unix (SunOS, Solaris, HP-UX, AIX, Linux etc) have always rested on a solid foundation of the freely available RFCs (Requests For Comment). And because the free 'nixes such as Linux (GNU), FreeBSD, and so on, are available in source form, they're completely open by definition.

The effect of closed, proprietary standards such as those employed by Microsoft and other vendors is to automatically close off software development in the affected areas to anyone except the companies concerned. And where such companies can manage to gain a monopoly market share, the incentive to improve a product in real terms all but disappears.

A more serious problem, as we're now seeing, is that Microsoft have established such a fierce hold on the PC Operating Systems market (with Windows 98, XP, 2000, and so on) that they're now in a position to kill off any MS Windows applications which are built on open standards. These include network products such as Netscape, and the various E-Mailers, which Microsoft have all but destroyed by simply shipping their own, "free" versions (Internet Explorer and Outlook) as part of their OS distributions. Attempts through the U.S. court system to break up this patently obvious Microsoft monopoly have, unfortunately, failed so far.

One bright light has continued to shine. Microsoft's original attempt to destroy the open-standards based Internet in the early 1990s by offering their users a proprietary replacement version called MSN (Microsoft Network) was greeted, if not with laughter, then at least with a very loud yawn. This forced a hasty and very costly back-track, finally resulting in the inclusion of the standards-based (and U.S. DoD developed) TCP/IP Internet protocol into Windows 95 and NT 4. So for once - some sanity from the management team at Microsoft ... the use of a well established, robust open standard.

Their attempt at a proprietary E-Mail system (Microsoft Mail) in the early 1990s was similarly unsuccessful, although not before many large corporations (such as Telstra in Australia) invested hundreds of millions of taxpayers' dollars implementing it across the country. The product proved to be a complete lemon, and probably put Australia's communications networks back by at least 5 years in the early 1990s.

Leaving such historical issues aside, the comparison which follows is generally restricted to those aspects which are currently the most obvious to me in my own day-to-day work. And I won't be delving particularly into the internals of either MS Windows or the 'nixes unless it's absolutely essential. Bearing such caveats in mind, let's now wade in.


Windows or Unix?

Windows XP, 2000 and even NT now outnumber Unix variants in terms of numbers shipped, although that's hardly surprising when you consider that millions of businesses worldwide have moved up to a Microsoft server system from almost nothing. That is - they've upgraded to a client/server system from a simple setup of PCs around an office which either weren't networked at all, or which were only networked via a loosely organised MS "peer to peer" network configuration.

For these sorts of businesses, installing one or more Microsoft servers would be a major step forward. And XP/W2K/NT (and their applications) have the obvious advantage of having full-blown, comprehensive menu-driven interfaces that look (at first glance) fairly easy and user-friendly. All in all, it's a setup that comes across as a "nice, safe, and solid" option for the average small organisation that needs centralised Internet access onto a LAN.

Mind you - if you do actually stop to think about it (horrors), this general perception of "Microsoft is the easiest and safest way to go" (born almost entirely of the belief that "everyone else is using Microsoft") has more to do with their half-billion dollar saturation-advertising campaigns than with reality.

Of course, Microsoft products are not totally bad (as some would have you believe), and the various Unix and Linux flavours are not perfect for every situation either. Both have their strengths and weaknesses, and we'll now explore a few of these.

The Windows GUI and all those configs

The Windows GUI (the Graphics User Interface) is taken for granted by the vast majority of users and computer professionals. Most people simply regard the interface as being Windows itself. And that isn't far off the mark as it turns out.

On Unix and Linux, however, the GUI for the operating system and each application is actually optional. You can use it if you wish, or if you need better control, you can operate via a command-line interface. Moreover, with most Unix/Linux flavours you can pick the GUI of your choice, allowing the system to resemble (eg) Windows, SunOS, CDE or a VT220 terminal.

So when starting Unix/Linux, you first get the main operating system up and running, and then (if you want the various icons and menus), you start the GUI of your choice to run "over the top" of the operating system. A bit like applying a colourful coat of paint and some posters over some drab plasterwork. But the two parts - the operating system and the GUI - are quite distinct, and each can operate quite independently of the other.
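By way of illustration, here's roughly how that separation looks on a Red Hat-style Linux box (a sketch only - file names and runlevels vary between distributions, and the desktop commands shown are just examples):

    # /etc/inittab - boot to a plain text console by default
    # (runlevel 3 = full multi-user with no GUI, runlevel 5 = graphical login):
    #   id:3:initdefault:

    # Then, whenever you actually want the icons and menus, start the GUI
    # of your choice "over the top" of the already-running system:
    echo "exec startkde" > ~/.xinitrc     # or: exec gnome-session, exec fvwm2, ...
    startx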

With Microsoft Windows, the GUI is a fixed, permanent part of the operating system, and although the command-line interface is available in some areas, it's generally undocumented and hidden.

As one who spends about 50% of his time with Unix and the other 50% with XP or NT (in system administration and application programming), I know only too well that "easy-to-learn" menu-driven interfaces such as those that pervade the Microsoft Windows world can equally be a system's downfall.

Traditionally, MS Operating Systems, the various OS components, and the applications can only be configured via these multi-layered menus. So for each such menu, a user must click around for some period of time, occasionally returning to fine-tune things in an iterative fashion, and when things are just right the configuration is saved away ... somewhere.

But where is it saved, and in what format?

Well, some of it may end up in the Windows Registry, some may turn up in a file silently dropped into the Windows system tree, and some in one or more files in the user area under "Documents and Settings". And the format in each case? That's anybody's guess. In practice, it's so messy and difficult to figure out where the settings are that we rarely bother trying.

With unix or freenix on the other hand, all such configuration data is invariably stored in plain ASCII files whose names, locations and formats are well documented. Furthermore, all such configuration information is routinely picked up by the normal system backups.

The obscure, proprietary nature of MS Windows and the various installed applications means that there's rarely any simple way of saving these setups for later reference or recall when required. And if this wasn't annoying enough in itself - the use of such menu-driven interfaces generally means that there's no way of specifying the config data via an appropriate functional notation.

For any reasonably complex central application server, this is a problem for long-term maintenance. When a config goes haywire in any part of the system or an application, for any reason, you may even be faced with having to do a full restore of the system from your backup - just to restore the config to a working, known state.

The same sort of problem can occur when the system gets confused and refuses to allow you to make some essential change to correct a problem or an inadvertent misconfiguration. I suspect that this sort of thing is typically caused by Microsoft's ill-conceived file and application config locking - trying to protect the dumb user (ie: you) from yourself. So when something does go haywire and you find yourself locked out, or in one of those infuriating "endless config loops", you're basically stuffed. The only choice is often to do a full system restore after advising users that the system will be "unavailable until further notice".

One way around this problem is to run mirror systems. Functionally, this works well, but it doubles the number of XPpro and/or Win2000 boxes you need to find space for, and of course you now have to look after the mirroring application as well. And when you consider that the current philosophy with Microsoft servers is "Only run one major application per box if you really want long-term reliability", your server farm can end up requiring a lot of rack space (and a lot of work to set up and maintain).

In contrast, on a unix system (unless file locking is being enforced on read or write, which is unusual) you can do "the obvious": copy a whole file system, put it away somewhere, and copy it back later - all while the system is up and running. And restoring an errant config is simple in the extreme - just dig out a back-copy of the appropriate file(s), copy them back on, and restart things as appropriate.
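To make that concrete, here's the sort of thing I mean (a sketch only - Apache's httpd.conf is just an example file, and the exact paths and restart commands will differ from system to system):

    # Keep a dated copy of a config file before fiddling with it:
    cp -p /etc/httpd/conf/httpd.conf /etc/httpd/conf/httpd.conf.20031006

    # ... experiment, break something, then simply put the old file back
    # and restart the service concerned - all while the system stays up:
    cp -p /etc/httpd/conf/httpd.conf.20031006 /etc/httpd/conf/httpd.conf
    /etc/init.d/httpd restart      # or: apachectl restart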

It's funny really - I well remember when Windows 3.1 first arrived around 1992 and the "pro-Microsoft" lobby where I was working at that time sneered at our group for using Unix - insisting that "Unix is far too big and complex".

I look at Unix (or Linux) now and compare it with Windows XP or 2000 and I really have to laugh. Not only have the latter two grown absurdly large and complex in comparison with Unix, the complete absence of source code for Windows in the public arena means that users and administrators usually have little meaningful information as to what's really going on inside.

And this latter disadvantage will become an even bigger problem as Microsoft's sneaky little TCPA/Palladium project comes to fruition and starts to hit the market over the next couple of years.

Which way should one go?

For many organisations, it must seem extremely difficult to avoid Microsoft Windows as an operating system on users' desktops. Although Linux, typified by popular distributions such as Redhat, is already a serious potential alternative to Windows (especially with StarOffice now available), neither Sun nor the Linux community appear to have the funds (or the will) required to create the necessary public and corporate awareness. Which is a pity, because Linux is (a) essentially free, (b) far more reliable than any Microsoft system, and (c) an 'open software' solution - meaning that its internals are well documented, so no 'nasty surprises' are possible.

In fact, from mid 2003, public awareness of various fundamental deficiencies in Microsoft operating systems and other products (Outlook, Office, etc) rose sharply and has continued to rise. This has been brought about as a result of serious and ongoing virus and spam attacks - see below. And Linux user-friendliness and versatility continues to improve rapidly at the same time. Even Redhat 9, for example, although designed more for rugged server deployment than desktop use, is very good. I'm using it for both at the moment - one box as a server, and several others as desktops. Installation and updating is now very slick, and with KDE, you get a selection of GUI interfaces ... including one called "Redmond" :-)

Please don't construe this as a recommendation for Redhat for desktop use. RH is certainly very good for server use (as is FreeBSD), but there are probably better versions of Linux for desktop deployment. You should look for reviews on the web and/or discuss it with people who are already using Linux.

For any server - assuming, of course, that one has a choice - even though I dislike Unix/Linux for its terse and often complex aloofness, and readily admit that XP/W2K servers are great to administer while they're working, when the chips are down and I've got a major problem on my hands, give me Unix (Solaris, Linux, HP-UX, whatever) any day.

As just discussed, having all configuration information for each application readily accessible in the form of a text file is great for disaster recovery. But such an arrangement is also great for running the occasional "what if" scenario with the configuration and tuning of an application. In my case, I begin by editing the config of interest using my vib script (to ensure that I have a snapshot of the current working setup before I start). Then I can play around as much as I like, because when something breaks (as it sometimes does when one doesn't fully understand the entire package), I can always restore the original setup in a matter of seconds. And that is critical to me when that 'phone starts ringing!
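For what it's worth, the general idea behind such a "snapshot before you edit" wrapper is trivial to roll yourself. The sketch below is purely illustrative - it is not my actual vib script, just the gist of it:

    #!/bin/sh
    # Illustrative sketch only - take a timestamped copy of each named
    # config file, then hand the originals over to the editor.
    for f in "$@"
    do
        cp -p "$f" "$f.`date +%Y%m%d-%H%M%S`"
    done
    exec ${EDITOR:-vi} "$@"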

In trying similar stunts on Windows servers for which I'm also responsible, I have occasionally come unstuck badly. You cannot back up a "configuration" as such, and on more than one occasion, when carefully altering parameters for our Project and Finance control database system (MS SQL), I've discovered to my horror that it's gone completely troppo. One such experiment broke things so badly that I ended up having to reinstall the SQL server and the Windows OS ... around 5 days' work by the time I finally got everything reconfigured even half reasonably.

I have plenty of other reasons for preferring unix systems for "mission critical" applications. Unix systems are reliable - they hardly ever need rebooting, whereas the Windows systems (typically running just the one miserable application) need to be rebooted every month or two. Then there's the ease of secure remote administration, consistent user profiles, and so on.

If you're already contemplating a migration from any Windows server(s) to a Unix/Linux server, Jon C. LeBlanc's white paper Migrate With Confidence From Microsoft Windows Servers to UNIX/Linux is also a good read and a useful starting point.

A slight digression about learning unix: in the R&D lab where we set up our first pair of Unix boxes to get some initial exposure, we just played around as we found time. We read the hardcopy manuals and man entries at length, wrote lots of shell scripts (good fun), and (being all superusers, of course!) managed to thoroughly stuff it once or twice - we even had to reload the OS on one occasion.

We also followed various comp.unix newsgroups for a year or two, and tried our hand at some elementary C programming out of K&R's classic C text. All in all, just wetting our feet was really good fun! Be warned that Unix isn't something you can "learn" in a week or even a month - it's just too broad and too deep. Just play around with it a bit as you find time and do some reading and you'll get there in a year or so without going to any courses. All you need is interest, enthusiasm, and 3 or 4 good reference texts on your shelf such as a good shell book, a vi book, and a suitable unix admin book - see the O'Reilly site for some good titles.

As a last resort (provided you can get your employer to pay for it), if you find that you just can't make enough time during working hours, find a good unix/linux introductory course and go along to that.

Unix man entries can be infuriating for the uninitiated - they often seem to be written on the assumption that you're already a unix "wizard" or that you can read the author's mind. But some are quite good - in fact, some are even downright funny (esp the bug descriptions). Unfortunately, with GNU/Linux, many man entries point you to some pathetic doco system called "info", which I personally detest (but then, it's free, so one can't really complain).
(Hint for reading man entries: set "PAGER=less" in your profile - the default pager for man is usually "more", but this often makes scrolling backwards difficult)
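In a Bourne-style shell, that simply means adding the following two lines to your ~/.profile:

    PAGER=less
    export PAGER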

Anyway, that's quite enough digression. Moving on to the users' desktop machines ... well, that's a harder one. As mentioned earlier, it's often still difficult to avoid Microsoft. One still needs a degree of political bravery to move a company from Microsoft PCs to Unix or Linux or Apple (OS X). But it can be done - and many smarter organisations are doing it following Microsoft's arrogant decision to double their prices as of Aug 1, 2002.

Maintenance of user applications

It is still a fact that the "total cost of ownership" of Unix (as in maintenance manpower requirements for a given-sized site) is far lower. You only have to stop to consider the scenario of having half a dozen major applications in use around a site. If these are set up centrally on a Unix/Linux network, then you have six programmes to look after (and occasionally upgrade). The users' workstations on their desks would normally be set up to be "data-less" (ie: all loaded with only basic O/S software).

With XP/W2K and with six major applications now loaded onto (say) 150 PCs, we now move from six binaries to ... 900 binaries (and 900 configurations).

And guess what happens when a user's PC full of carefully worked out program configurations is upgraded every 4 or 5 years? Duh ... oh well, too bad - the user can go and spend the next 6 months slowly setting everything up again from scratch, can't they? (It's not the IT department's problem - we don't know what their setups were anyway if it's a M$ Windows shop)

With dataless unix or linux, users don't waste their valuable time reconfiguring dozens of programs every time a workstation is replaced, because their setups are still there, where they belong ... on the network. (What is it Sun say ... "The network is the computer" ...) With these sorts of obvious (to some) topologies, it's quickly apparent why a Unix shop on a high speed network can be run by one (or perhaps two) competent Unix administrators, whereas the XP/W2K shop needs 4 or 5 and still struggles for much of the time.

And none of the buggy systems for "pushing" installations and updates out to users' Windows desktops are any sort of substitute for the simple, elegant Unix solution either. All they amount to is a massive and unreliable "Band-Aid" for a problem of their own creation.

More to the point, of course - PC software is hardly ever designed for such convenient wide-spread deployment. Their licensing systems generally assume single box operation, so that's all you can get.

With Unix (and now Linux), we've always found the use of network applications to be quite seamless, because the networked file system is so orthogonal and straightforward. Install a given package on one central machine, spend some time setting it up (and where necessary, set up the license server), and everyone around the place can then use it via the network. And user configs live in their home directories - on the main file server - not on their desktop.
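For the curious, the usual way of achieving this is plain old NFS - the server exports the home directories (and perhaps the application tree), and every desktop mounts them. A rough sketch only, with made-up host names and addresses:

    # On the central file server - /etc/exports:
    /home       192.168.1.0/255.255.255.0(rw,sync)
    /usr/local  192.168.1.0/255.255.255.0(ro,sync)

    # On each "data-less" desktop - /etc/fstab:
    server:/home       /home       nfs  rw,hard,intr  0 0
    server:/usr/local  /usr/local  nfs  ro,hard,intr  0 0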

This is what makes upgrades such a breeze. Not just upgrades of the various applications, but upgrades of people's desktop machines. As in - upgrade the hardware and/or the operating system on someone's desktop machine and - surprise, surprise - all their old programs still work and their configurations are unchanged. (Strange idea? Well, only to MS Windows users.)

Unix has been able to do all this in a quite straightforward manner since around 1990. So why can't Windows? The answer, as we all know, is that its prime design goal from the beginning was never maximum usability, but rather to make the greatest possible amount of money for Microsoft. As such, it makes no sense to provide the ideal product. If they did, you wouldn't pay up for the next version, would you?


But what about web-based packages?

Needless to say, for packages that have been fully implemented via a web browser interface, the above differences evaporate. Where it's feasible to do so, this approach can work well, with all the intrinsic advantages of a Unix-style solution.

As with all things, of course, the devil is in the detail ... consistent printouts, the lack of direct file I/O, the full range of windows-style features (eg: drag-n-drop), the saving of user settings, and so on. But for some types of applications, it can nevertheless come close to ideal.

Consistency across machines

One could also launch off into a discussion of "user portability". The Unix user gets the same "desktop" setup and usability regardless of which machine he or she logs into. And the XP or W2K user? No, afraid not - nothing remotely like it. For example, I have Microsoft's Visual Studio C and Borland's Java Development Suite installed on my main office PC at work. But when I log in on any other Windows machine on our network, I'm just outa luck - no tools.

Contrast this with Unix or Linux, whereby once such tools (compilers or whatever) are installed, we can use them from anywhere - even from across the other side of the world.

Talking of which ...

Remote access

And finally, of course, the most obvious advantage of Unix/Linux from an administration viewpoint: its well developed multiprocessing and multi-user login capabilities make remote management and remote use such a snap. So unless there's a hardware or network connectivity problem, there's no need at all to be physically at any machine to work on it. As well, a user won't even be aware that you are logged in when you're changing or upgrading something for her - she can go right on working. (Why Microsoft didn't implement such an elegant model is truly beyond me.)
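In practice, "remote management" is as mundane as the following (host names are made up, naturally):

    # Log in to a server on the other side of the world, fix something,
    # and log out again - while its users keep right on working:
    ssh admin@mailhost.example.com
    su -                                  # become root once you're there
    vi /etc/aliases && newaliases
    exit

    # Or run the same one-off command on several machines at once:
    for h in web1 web2 db1
    do
        ssh admin@$h 'df -k /var'
    done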

Windows W2K and XP do seem to have evolved some sort of concept of User ID and Group, but somehow or other, it still seems to be locked into Microsoft's PC (Personal Computer) mindset. It still isn't really "there" in the full sense. Well, certainly not to the point where you can simultaneously establish interactive control sessions into a server from one or more remote systems, or run a program under any arbitrary user ID.

For remote access, you can of course take over the console session via the network using some sort of "carbon copy" application, but these are less than practical over analogue MODEM links, plus you can only have one such session at a time (because Microsoft have no concept of multiple simultaneous users). And in any case, you usually have to present yourself in person at the console when things go badly wrong.

Memory leaks

I've also found that applications - even those from Microsoft - running on Windows W2K/XP systems with ample physical memory installed still suffer from significant memory leaks, to the point where servers need a full reboot every 10 to 12 weeks. The symptom is sluggish performance, and a console message to the effect that we're "Out of System Memory." Once again, this is a design deficiency which seems unique to Microsoft. Is it the fault of the Operating System (W2K/XP), or the applications? Or some weird interaction between them? I have no idea. But when a badly written or misconfigured application running on a Unix system does chew up all available memory, the Unix global-memory-allocation model means that stopping the application is all that's required to return everything to normal, as one would logically expect.
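On a Linux box, for example, tracking down and dealing with a runaway process goes something like this (a sketch only - it assumes the GNU procps version of ps, and the init script name is a placeholder):

    # List the worst memory hogs, biggest resident set first:
    ps -eo pid,rss,vsz,comm --sort=-rss | head

    # Restarting just the offending application frees its memory -
    # no system reboot required:
    /etc/init.d/some-application restart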


So to me at least, Microsoft Windows operating systems still come across as poorly conceived renovation jobs - concepts that started small and have been growing ever since, having different bits added from time to time as the demand arose. (A bit like that great house that looks so terrific in the brochure, but which in reality is just something that started out as a holiday shack and has had one or more new rooms added every year.)


Gaining some control of MS Windows

As one who prefers to program (script) just about everything on a computer, I was initially aghast at the thought of having to set up my first major server application on a Windows NT 4 box in 1998.

The first thing I did was to buy and install a copy of the MKS Toolkit package, and that has since allowed me to automate many system and application tasks. The various shell scripts run via the Windows "at" command (similar to cron) and email their results to me at completion.
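As a flavour of the sort of thing involved, here's a trivial sketch (the paths are made up, and the command-line mailer "blat" and its flags are an assumption - substitute whatever mailer you have and check its options):

    #!/bin/sh
    # diskcheck.sh - trivial MKS-style example: capture disk usage and
    # mail it off.  ("blat" and its flags are an assumption - use whatever
    # command-line mailer is available to you.)
    df -k > /tmp/diskcheck.$$
    blat /tmp/diskcheck.$$ -t admin@example.com -s "Disk usage on `hostname`"
    rm -f /tmp/diskcheck.$$

    # Scheduled from a Windows command prompt with the built-in "at" command:
    #   at 06:00 /every:M,T,W,Th,F "c:\mks\mksnt\sh.exe c:\scripts\diskcheck.sh"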

The second thing I did was to install PERL in the form of ActivePerl. Perl is an acronym for Practical Extraction and Report Language (and also, unofficially, Pathologically Eclectic Rubbish Lister), but in reality it's a very powerful scripting and control language (albeit with a somewhat cryptic syntax).

Some of my PERL scripts are called from within MKS shell scripts, some run alone, and some even run via the web (using Microsoft's IIS!). Altogether, this whole system has given me back control of the various NT and Win 98 boxes that I need to look after.

PERL in particular is virtually unlimited in what it can do. I have PERL scripts running on Win 98 and NT boxes that collect data from Sun Unix systems via network sockets, reformat it suitably, and then pipe it out again through the PC's serial port to external scientific equipment, cameras or video switching boxes, and so on.

Other simpler scripts merely keep an eye on disc usage and email me weekly summaries, or synchronise a PC's time and date to a reference Unix system - whatever I need.

PERL is a bit like Linux and GNU in its background history. Originally written by Larry Wall, it's subsequently been expanded and further developed by a cast of thousands. And all sorts of powerful add-on modules to do just about anything you can imagine are free for the asking from the PERL CPAN (Comprehensive Perl Archive Network) site. (The Win32 (ActivePerl) package gives you access to this archive via an automated CLI interface called PPM.)
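Pulling a module down via PPM is about as simple as it gets (the module names below are purely examples, and the exact sub-commands vary a little between PPM versions):

    # From a command prompt on the Windows box:
    ppm install Net-Telnet

    # PPM can also search the archive for you:
    ppm search Serial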

So if you are forced to use Microsoft Windows (usually because of the availability of a particular application that your organisation has decreed "it must have", or even just because your organisation has Microsoft tunnel-vision), all is still not entirely lost. You can still get the job done with a professional, robust result - in spite of Windows.


Microsoft's E-Mail products

Microsoft Outlook

When using Microsoft operating systems such as Windows 95/98, Win2000, XP, ME, NT, etc, it's perfectly straightforward to avoid the use of Microsoft's more problematical application programs - such as Outlook for email. Moreover, such a move is also highly responsible from a corporate viewpoint since a high proportion of viruses and worms are now targeted directly at Outlook, using its well known file structures to propagate themselves rapidly. So something as simple as specifying another E-Mailer (such as Eudora, Lotus, Netscape Communicator, Pine, etc) immediately lowers an organisation's virus vulnerability by an order of magnitude. (Also see below for more on the subject of Microsoft Windows viruses)

Microsoft too are always adding proprietary extensions and MS-other-package-hooks to Outlook, specifically to cause maximum havoc with non-Outlook users in order to further maximise their market-place advantage. Their assumption here (perfectly correct) is that most organisations will then opt for the easy fix and force their users to move over to Outlook for "compatibility".

Such organisations are, however, quietly ignoring the fact that they've just multiplied their virus and security exposure by an order of magnitude.

Microsoft Exchange

Similar comments apply to the use of Microsoft Exchange as a main, central mail-exchange server. In contrast with the more established, open-source "mission critical" mail exchange packages such as Sendmail (usually on Unix), MS Exchange is again promoted as the "easy, safe option" - especially for organisations with no in-house unix or programming expertise.

The pay-back with Exchange occurs when (eg) the company finds that it needs to make large organisational changes but needs the flexibility to allow the old email addressing and forwarding structure to "run in parallel" for a while. Or when users request rather complicated intersecting and nested mail-group definitions with very specific handling requirements.

Even just trying to use an E-Mail client other than Microsoft Outlook with Exchange can cause problems. Although Exchange theoretically supports SMTP (the open standards protocol as used by other E-Mail clients), it does so in a sufficiently complicated and convoluted way that many Systems Administrators just take the easy road and stick with Microsoft's proprietary RPC protocol. Fine for running Outlook - but nothing else.

One also needs to be aware that Exchange doesn't maintain user mailboxes and folders in the traditional flat ASCII file form, but keeps them instead in a large, proprietary database. This can cause further havoc if and when a decision is made to move to another E-Mail system. It also complicates backups unless the backup system is "MS Exchange aware". (A case of the age-old "proprietary database" trick so favoured by the old mainframe application manufacturers.)

MS Exchange falls flat on its face in many such scenarios, and your local Microsoft Certified IT professional is often forced to present management with sad news such as "Sorry, guys - can't be done".

Sendmail, on the other hand, handles most such requirements with ease - provided of course that a competent unix administrator and/or programmer is employed to do the requisite configuring. Even in those rare instances where Sendmail alone is insufficient, there are 'hooks' there to allow any desired extra functionality to be added from outside.

Mind you, Sendmail admittedly has one of the most complex configuration interfaces known to the free world. It is nevertheless immensely powerful, flexible, reliable - and a snap to back up. And the use of open standards such as SMTP, POP and IMAP allows the use of any convenient E-Mail client.

But simple to configure? Nup - no way!

In reality, this is not a major problem provided that you just want to run it with its "out of the box" defaults, but for most organisations the need will eventually arise to block this domain, or prevent relaying for that domain, and so on.

In such cases, the only option is to print out Sendmail's (quite comprehensive) documentation and actually sit down and spend a day reading it (known in the industry as RTFM - as in ... Read The F______ Manual :-).
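For the record, the usual mechanism for that sort of thing is Sendmail's "access" database, which (give or take file locations - the Red Hat-style layout is shown here) looks roughly like this:

    # /etc/mail/access - example entries:
    spamking.example.com    REJECT
    192.168.1               RELAY

    # Rebuild the database (makemap creates access.db):
    makemap hash /etc/mail/access < /etc/mail/access

    # Make sure sendmail.mc contains FEATURE(`access_db')dnl, then
    # regenerate sendmail.cf and restart:
    m4 /etc/mail/sendmail.mc > /etc/mail/sendmail.cf
    /etc/init.d/sendmail restart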

It's also not a bad idea to buy a copy of O'Reilly's Sendmail book, although I must admit that I've found the doco files which come with the Sendmail source distribution and the FAQs at the Sendmail web site quite adequate for my needs so far. (But then again, I also do a bit of programming from time to time - I'm not just a Systems Administrator.)

In particular, the Sendmail Installation and Operation Guide (PDF format) which is supplied with the source distribution is well worth reading (and printing out). For some reason known only to Eric Allman (the original author of sendmail), this is only distributed in M4 macro format or as Postscript. So this PDF version is one I've just converted using ps2pdf. (The rest of the doco supplied with the source is in the form of plain text or as unix man entries, both of which are fine by me!)

Of course, there are other MTAs (Mail Transport Agents) one can use apart from Sendmail. It may have been the first kid on the block, but as Stefano Rivera put it to me after reading this page: "Sendmail isn't the only mailer, qmail (for example) is a doddle to setup and maintain, almost infinitely secure, and lightning fast. The same applies to all D.J. Bernstein's other programmes (djbdns, daemontools, etc)."

I have to admit that I haven't tried anything other than Sendmail to date. If I had, maybe I'd be mentioning one of those instead. I'm not really too fussed either way. As long as it's configurable, secure and open-source, I'm always happy to try anything. Sendmail just happens to be bundled by default with so many Unix-type systems, and when you boot up for the first time, it's there and it's running. So it almost seems a bit of a pity not to use it!


Microsoft's web server

A guy who shall remain nameless asserted at a recent computing conference that he can "configure any Microsoft IIS (web) server to be secure and reliable. Simply apply all the latest patches and then go through the machine and turn off all unnecessary services".

Well, maybe he's right or maybe he's wrong.

Or maybe he's just right in theory (this guy was in fact a rather cool security expert and a very competent hacker). But I've always tended to trust Microsoft less than most people, and I for one wouldn't risk following such advice. For one thing, setups on such systems can change all too easily with no-one being any the wiser.

The proof of the pudding here (particularly since around March 2001) has been the string of gaping holes that have been uncovered - both in Microsoft's Internet Explorer (their web browser) and IIS (their web server).

Okay - sure the patches are there, if you don't mind copping a whole string of other upgrades and changes that you didn't expect or want. Microsoft's patches (Service Packs, as they call them) come in an all or nothing form. (Too much effort to sort out the various interdependencies, presumably?) The last time I applied Service Pack 6 to one of our critical SQL server machines, it blew away the ODBC control interface and replaced it with some ghastly web-based thing that had nowhere near the required degree of functionality. So we had to devise a way of selectively backing out (yet another day wasted ... thanks, Bill).

At the research establishment where I currently work during daylight hours, our Microsoft IIS servers are well protected from the Internet by a firewall and (equally importantly) by a pair of Unix (Solaris) systems running well maintained Apache http servers. This ensures that the MS IIS machines are not even visible to the outside jungle except via the Unix systems.
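The Apache side of that arrangement is only a few lines of httpd.conf (a sketch only - the internal host name is made up, and mod_proxy must of course be compiled in or loaded):

    # Forward one URL tree to the hidden IIS box; clients only ever see Apache:
    ProxyRequests    Off
    ProxyPass        /projects/  http://iis-internal.example.com/projects/
    ProxyPassReverse /projects/  http://iis-internal.example.com/projects/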

So why do many organisations run Microsoft IIS servers directly exposed to the 'net? Is it lack of expertise (as in general stupidity)? Or are they just too mean to spend the requisite $1000 on a Linux reverse-proxy system? Or even (loud gasp) a real Sun Microsystems Ultra for around $3000?

Well, whatever it is, the thought of running any MS software such as IIS directly on the 'net sends shivers down my spine. And judging by the vast number of sites world-wide who were hit by the Code Red and NIMDA worms in 2001, it should send shivers down the collective spines of any organisation's computer group who have the slightest clue as to what they're really doing.

What is most truly amazing, though, is that many large Government departments, banks, insurance and investment groups and so on are in fact doing just that - running Microsoft servers (such as IIS) directly on the 'net. Yep, that's right - Government departments and banks, no less! Can you believe that?

Needless to say, stupidity tends to reap its just rewards, and many, many such sites were compromised by NIMDA and Code Red throughout 2001 as a result. Here in Australia, for example, users attempting to access a certain large Bank's web site (itself running on a PC server with Microsoft software) suddenly found they'd just been infected by a virus or a worm - merely by reading the Bank's home page! Mind you, the user also had to be viewing the pages using Microsoft's Internet Explorer web browser to be infected (ie: Netscape and other non-MS-web-browser users were quite safe).

Again, all of this is so predictable. After all, what's the most widely used web server? And again, what's the most widely used web browser? So once again, no prizes for guessing which pair of programs from which U.S. software company (whether individually or in unison) maximised people's chances of having their security compromised and/or having their PC damaged or destroyed.


Java

In light of the above, it's probably now apparent why Sun Microsystems were so furious when Microsoft tried to hijack Java in the 1990s.

Java as a 'browser' and 'platform-independent' computer language has always had security and platform independence as its prime focus. Security and platform independence first, functionality and speed second. To this end, all Java implementations are required to pass Sun's Java Compatibility Tests before a company is legally entitled to use Sun's JAVA COMPATIBLE trademark.

Microsoft asserted that they did not accept this compatibility requirement as binding on them, which eventually left Sun with no choice except litigation for an injunction against Microsoft if future Java language portability was to be maintained. And so began the famous court case.

Anyway, I am digressing in the extreme now (you can go and trawl through Sun Microsystems' Sun v Microsoft Case History if you want more information on that one).




Microsoft and Intel - TCPA/Palladium project

The material above was originally put together as a web page in January, 2002. In July of the same year, an ex-work-colleague made me aware for the first time of the Microsoft/Intel project called Palladium. Another colleague then pointed me to Ross Anderson's paper on the subject.

If the material above hasn't given you too much in the way of concern, then perhaps this information will. It is most interesting.




Windows and Linux and relative virus susceptibility

For another good discussion of Windows, and in particular how the system continues to encourage virus attacks, this article (6-Oct-2003) by Scott Granneman (SecurityFocus) at The Register is also worth reading. After reading it, I was reminded again of my ongoing frustration with Microsoft and their pig-headed attitude to basic file-system design.

When designing their new server-strength protected-kernel OS, Microsoft Windows NT, in the early 1990s, Microsoft decided they could do it better than Unix by employing some guys from DEC and getting them to create something called NTFS. Nothing so simple as the established Unix file system was considered - Microsoft were going to do it much better.

Yeh?

Well, no ... (duh)

In fact, standing right back from all this ... I've never really been able to quite figure out why Microsoft didn't just do the sensible thing and clone Unix in its entirety as the backbone of NT, 2000 and XP. Plenty of others have. After all, operating systems are unbelievably complex, and no-one ever manages to release one which is totally bullet-proof and bug-free. The various Unix implementations (and Linux) generally come very close though, so why not take advantage of the world of talented open source developers when it's all out there? It simply isn't worth "doing your own thing", as Apple have discovered in recent times. (Much better to have "yet another port of Unix" than these dreadful Microsoft systems with their woeful toolsets.)

Anyway, this is water under the bridge. Microsoft have gone their own way - and it's clearly not a good way for users. So it's better and ultimately much safer, simpler and less expensive to step off the MS juggernaut entirely if you can manage it.

Unix and Linux have always been designed for maximum usability, reliability, and security. And because so much of it has been designed by academics and so-called computer science "boffins", you actually feel much safer. And why? Because in that world, professional reputation is everything. It is that simple.

Scott's article on the relative virus susceptibility of the two competing operating systems again raises an issue which has really gone "critical" since September 2003. Any business which is in any way connected to the Internet and continues to use Microsoft Windows as their platform of choice would need to have irrefutable logic for doing so, considering the security implications of the most recent virus attacks.

Making the change to Linux or Unix may be costly in the first 12 months, but it nevertheless makes excellent business sense for any company that values its data and intellectual property. (In any case, "total cost of ownership" will still end up being lower in the medium to long term.)

The bottom line is that with all sorts of viruses continuing to attack virus-prone Microsoft operating systems and applications, you generally won't even know when your data has been stolen.

For a Windows system administrator, a course in Linux and/or Unix may thus be prudent. A management which discovers the sobering realities of "security" under Microsoft before the organisation's IT staff have bothered taking appropriate action is unlikely to be pleased.




Copyright © 2001-2003, Bluehaze Solutions. Last update: Tue 28-Jun-2005