RAM Emergency

How much memory does a typical computer need in 2019? When I look at machines sold at electronics shops¹, I see units that ship with between 4 and 8GB of RAM. Looking at the Lenovo and Dell websites shows much the same. Most of the notebooks that are handed out at the day job also fall into this category, with schools getting machines with 8GB, and managers getting units with 4GB². When I moved out of the classroom in 2016 I was given a Lenovo W541 notebook with 8GB of RAM, which I promptly upgraded to 32GB because it was the right thing to do. That machine has since been converted to a development server and I'm now using an X1 Carbon with 8GB. As I've lamented, perhaps too often, the sleek little notebook is great except for one little detail: 8GB is simply not enough.


As one would expect, I've brought this up with a couple of my managers who have all pretty much said the same thing: there isn't money in the budget right now for a new machine, so try to make do with what's at hand. I am certainly accustomed to working with what's on hand, which generally means that I try to find creative solutions to my problems. The "fix" that I currently have is to offload work to other machines. I can send large workloads to the development server upstairs to chug through or, if I need even more power, a potent virtual machine with lots of memory and processing horsepower has been configured for me to use at the corporate data centre. This means potentially transferring up to 50 gigabytes of compressed data³ to get work done. It's suboptimal, but it's better than struggling with a machine that is simply not up to the task.

Today was pretty rough, though. More than once I noticed the machine struggling to keep up with the workload. If I were doing data transformations today then I could understand why the physical memory was exhausted and the swap file was being thrashed. However, today's tasks were all about working with web development tools. No database work. No API development or testing. Just design and development. Why couldn't the machine keep up?

The company had a RAM emergency. The office had too much RAM.
— Jen Barber, Relationship Manager for the IT Department of Reynholm Industries

Sometimes I'm tempted to bring up the issues that I face when using this notebook to carry out my duties. I didn't have these problems when I was permitted to use my own hardware, a MacBook Pro with 16GB of RAM and a much slower SSD running the very same version of Ubuntu as the Lenovo. The previous system I requested was denied as it came out to 338,700円, which is just over $3,000 USD. If I'm a little more conservative and choose a machine similar to what I have now, an X1 Carbon with 16GB of RAM, less NVMe storage, and a higher-resolution screen, the price comes to 182,488円, which works out to just under $1,650 USD. The 2019-model X1 Carbons will be shipping in June, so the current version is priced to clear.

But am I asking too much?

For the longest time I have tried to cost the company less money than anyone in IT. This doesn't seem to be the case anymore. I work an excessive number of hours overtime and the hardware that I've managed to acquire over the last three years is not cheap. All of this is in the service of the day job, of course, but there is still a cost involved. The management has already said "no" to the request, so coming back at them for the third time in less than four months could appear to be selfish or persistent in the worst way.

While it's true that I could just "secretly" go back to using my own personal hardware to get the job done, I would be much more comfortable having sensitive, work-related data on a work-owned machine. This way, if I am terminated or decide to leave the company at some point in the future, then I'll know that there's no company data on any of my machines. Wiping a drive and re-installing an operating system isn't enough when it comes to keeping a device clear of data, as there are backups that could also contain data that does not belong to me. I treat this subject seriously as it's my responsibility to protect and maintain data not just for the day job, but for a number of people I offer services to. For this reason my machines will continue to be used for non-day job tasks. In the meantime, it will probably make the most sense to continue doing what I'm doing, working with the tools I have and finding ways to make it all work. When it comes time to discuss this year's performance with the management team, it may be possible to bring up the topic again.

Besides, I can always use the occasional system sluggishness as an excuse to get up and walk around; something I don't do nearly enough of anymore.


  1. Never buy a computer from an electronics shop unless it's an absolute emergency. You'll pay through the nose for something that's worth less than half of the amount you forked over. Buy online if at all possible.

  2. I don't understand the logic, either. Outlook alone will consume all of this just to start up, never mind what the browser(s) and operating system want.

  3. I work with a lot of databases. Right now I've been tasked to perform a number of data migrations for corporate offices around the world.

What's the Alternative?

John Gordon recently wrote a short blog post explaining that he can no longer recommend people make the switch from Windows to macOS when shopping around for a new computer. The reasons he cites are quite valid, from Apple's recent spree of abandoning software, hardware, and business sense, to the high cost of entry for machines that have arguably mid-range specs. While I can agree that the average person may not be willing to invest a grand or two in hardware before investing even more money in applications that may or may not work as the operating system evolves year over year, it's important to ask one question: what's the alternative?

After months of investigation, I settled on picking up a 2015-era MacBook Pro and replacing OS X — as it was known at the time — with Ubuntu. I've been happy with this decision for the most part and have even gone so far as to contribute updates to drivers that allow people to get better performance out of their Bluetooth radio. I chose the MacBook Pro not because I wanted a quick way to jump back to the safe confines of Apple's ecosystem, but because the alternatives were just not worth the money.

When it comes to buying a computer, a person really needs to consider how they'll be using the machine. Will it be something you're looking at for more than an hour or two a day? Then it simply cannot have a low-resolution screen. Will it be something you'll type on a lot? Then the keyboard needs to match your hands just right. Will it be something you'll carry from place to place? Then it had better have a really good battery, or be light enough that carrying the ridiculously bulky charging adapter is slightly more bearable. Then there's the problem of the hideously awful touchpads that seem to exist on every notebook not designed in Cupertino and manufactured in China. I spent months looking for a good-quality notebook that met these 4 criteria and a few other details and always came away disappointed.

You can have a good keyboard or a good screen, not both. You can have decent expandability or good battery life, not both. You can have a fast processor or a thin form factor, not both. Buying just about any product will require a person to prioritise certain features, but one expects the decision to be less painful the higher up you go in the product line.

The HP Spectre 13 x360 came very close to what I was looking for in terms of hardware, but was limited by 8GB of RAM and a keyboard that just didn't feel very good. Lenovo's T450s was also close, as it allowed for hardware swapping along with a mostly-acceptable battery life and decently-comfortable keyboard, but was limited by the screen's awful pixellation and colour fade.

As a person who looks at a glowing screen for 10+ hours a day and interacts with the keyboard almost exclusively¹, any machine that cannot offer both solid typing and crisp text² simply cannot become a tool I rely on.

So what are the options?

Dell does have some decent machines, yes. The screens aren't all that great, and the keyboards feel cheap, but they'll do. The same can be said for HP, Lenovo, Mouse, and System76. Nothing from any Japanese manufacturer is even worth mentioning anymore, as it's all lowest-quality-highest-price plastic crap. Even Sony, once the pinnacle of amazing screens everywhere, is barely worth a cursory glance at an electronics shop. Try as I might, there just hasn't been a compelling notebook from any manufacturer in the last five years — if not longer — that comes anywhere near what a MacBook Pro can offer in terms of screen quality, keyboard usability, battery longevity, and overall build quality. Yes, a person needs to resign themselves to the fact that the unit is ultimately a non-upgradeable appliance, but it's still the best-made appliance out there. And, if you're willing to go with a store model to save a few hundred dollars, you'll wind up paying the same as you would for a top-of-the-line HP or Dell that comes with an infuriating touchpad that you leave disabled 90% of the time.

When people ask me for advice on what computer they should buy next, I still ask the basic questions. What's the main purpose? How long will it be used each day³? Who will be the main person using it? And then I make a recommendation. Sometimes it's for a tablet. Sometimes it's for a notebook. And in those instances where somebody is looking for a decent quality notebook, I'll recommend either a MacBook Air or a MacBook Pro … which then has its operating system replaced with whatever the buyer is most comfortable with soon after getting it home.

Ideal? Maybe not. But it's better than the alternatives.


  1. I have not used a mouse in over five years, and I have no plans on ever going back to those horrible things.

  2. Crisp Japanese text. Roman characters are tolerable with awful pixellation some of the time, but it's brutal when trying to read complex kanji consisting of 12 or more strokes.

  3. Then multiply this number by at least two.

Lengthening Upgrade Cycles

Bryan Clark over at The Next Web seems to have a beef with the relative age of the current-selling MacBook Pro, saying that the machine hasn't been updated by Apple because the company has "forgotten what a computer is." This is an interesting hypothesis, but not one that I'm willing to entertain. Instead, I'll posit my own theory for why Apple isn't updating any of their computers on a regular 6 ~ 12 month cycle anymore: it's 2016.

Intel's Legendary 486 DX2/66

Twenty years ago when I first started to deep-dive into the world of technology, I would buy a copy of Windows Magazine from the Avondale near my high school and read it cover-to-cover and absorb the information like a sponge. Later, I'd go through the advertisements and watch as the processors described as "speed demons" just six months before were being replaced by faster, more capable CPUs at an ever-faster rate. When Intel released the Pentium-branded chips, the MHz race was on and we quickly jumped from 200 to 450 in the space of a summer, then up to 800 before the next spring, and finally hit the blazing-fast 1.0GHz by the start of 2000. Anybody who had actively tried to keep up with the speed of technology at this time was investing close to $600 a month in upgrades … and I was one of those suckers.

This was how technology (and marketing) seemed to go for well over a decade. The "l33t hotness" today would be considered obsolete by geeky peers less than 180 days later, and people would be openly mocked online as a result. This started to change for most people around 2008, though, as technology and software had mostly started to reach the "good enough" point where people were comfortable using a computer for longer than 18 months without complaints. Operating systems had become much more reliable by then, as well, meaning that wiping the hard drive and re-installing everything from scratch became a rare nuisance rather than a twice-annual happenstance. By 2012, a lot of non-gaming geeks also found that any hardware they had was good enough for at least four years. Batteries were lasting longer. Processors could keep up with 90% of the work we'd throw at them. Software crashed a lot less.

Now, there's a very real "problem" that arises when people don't have a problem with a product they own: the idea of replacing the item never enters their mind.

How often do people buy new appliances for their home? Washing machines, refrigerators, and microwaves are typically replaced once a decade if that. Many people are able to get fifteen years or more from these machines. As a result, manufacturers have a slower product refresh cycle. Does Maytag or Toshiba really need to put out a new front-load washer every 8 months in order to appear relevant? No. Having worked in the appliance industry for many years¹, I can tell you that most appliance manufacturers have refresh cycles similar to auto manufacturers. There are minor cosmetic changes every spring, maybe a new software update on the higher-end models, but the underlying hardware stays the same for a number of years.

While most manufacturers of cheap, plastic computers churn out an endless array of crap every 6 months, the companies that produce the better, more reliable hardware have slowed down. They choose to invest their time in designing better devices that they can then charge a premium for. Compare any "brand new" Toshiba, Sony, Samsung, Lenovo, HP, or Acer computer that sells for less than $1500 USD to a 440-day-old MacBook Pro and tell me how it's better in any three measures. I guarantee you can't do it. I tried for 8 months to find a better notebook and failed miserably. As a result, I bought a MacBook Pro and soon replaced the default OS X installation with Ubuntu MATE 16.04 and haven't (often) looked back.

The current MacBook Pro line ships with the 5th Generation Intel Core processors. The 6th Generation is out with reduced power consumption and generally reduced clock rates, but does it make sense to update a whole line of professional-grade notebooks to get another few hours out of a notebook that already provides 12+ hours of life for most people? Heck, according to CPU Boss, going with a 6th Generation processor would be a step down in terms of processing speed. Given that OS X's performance problems are a direct result of the file system² and not the hardware, I'd say a MacBook Pro refresh would be a bad thing.

But what do I know? I'm just a person who uses his MacBook Pro running Ubuntu for 14+ hours a day, 7 days a week. The battery is being used for ten-to-twelve hours most of these days, and this machine is still the fastest non-server-grade computer I've ever used. Would I like a little more battery life? Sure. But I won't accept it at the cost of raw throughput. I won't pretend to know how Bryan uses his computer and what he might want a refreshed Pro line for, but as a person who spends his days programming for multiple platforms, running Virtual Machines, podcasting, and just about anything else aside from gaming, I can tell you that the most recent MacBook Pro line can likely go another 440 days before a refresh is honestly necessary.


  1. Many years ago, I might add …
  2. Don't believe me? Put Ubuntu on a MacBook Pro and tell me it doesn't feel faster. Heck, put Windows on a MacBook Pro and compare it to an equivalently-spec'd Sony or Dell and tell me that it doesn't perform drastically better. Apple's hardware is incredibly capable. Their problems are all in the software.

Documenting the Ubuntu Trek

It seems like just yesterday when I switched from Ubuntu to Windows and then, less than two years later, from Windows to OS X. Each move was done with the expectation that I would spend less time fiddling with the technology and more time using it to express myself creatively. Each move did allow me to try new things and understand a different way of interacting with computers, and each move did allow me to explore novel ways of solving problems. That said, after little more than three years, I'm making the move away from OS X and returning to Ubuntu Linux — Ubuntu MATE 16.04, specifically — in an effort to leave commercial operating systems behind. When I first moved to Apple, it was because I really liked what I saw in OS X 10.6 Snow Leopard. Each revision since then has made the operating system more and more a commercial project, and I'm just not comfortable with all the deep hooks the OS has with online servers. It's time for something else.

Moving from one platform to another is never easy. Data can move without too much hassle, but this is because data is data. Finding the various applications that can satisfy our needs can be a little tricky. More than this, though, is the hardware. Linux typically supports a wide range of hardware, but there are still limitations. So in a bid to help others make the move to Ubuntu, I've decided it would be a good idea to have a website dedicated just to that flavour of Linux.

Say hello to @matigo Goes Ubuntu.

Matigo Goes Ubuntu

If you're interested in giving Ubuntu a try or just want to see what all the fuss is about, feel free to check the site out. Every post will include at least one photo of Nozomi, usually as a desktop background, and contain useful information about the platform. I can't promise a daily post schedule, but I can say that by blogging about Linux over there, I can keep this site focused on opinion pieces.

Ubuntu Server and the Fully Qualified Domain Name

Over the last few months I've been receiving a large number of questions about Ubuntu Server, and the messages it gives when starting and restarting services. The most common ones seem to revolve around the effective configuration of Apache with a particular emphasis on resolving this message:

* Starting web server apache2

apache2: Could not reliably determine the server's fully qualified domain name, using 127.0.1.1 for ServerName

… waiting apache2: Could not reliably determine the server's fully qualified domain name, using 127.0.1.1 for ServerName

One thing I would like to stress is that this message is not a system-halting message. Apache will run just fine (in most cases), and the warning is just letting us know that a single configuration file is missing: /etc/apache2/conf.d/fqdn. Luckily the solution can be applied in less time than it takes to read the intro of this blog post.

First, create the file I mentioned above. This can be done in any text editor, such as vi (root privileges are needed to write inside /etc/apache2):

sudo vi /etc/apache2/conf.d/fqdn

Next, set the ServerName value:

ServerName localhost

Alternatively, you can use your actual domain name:

ServerName jasonirwin.ca

Finally, restart Apache:

sudo service apache2 restart

Done and done! You shouldn't see the fully qualified domain name message ever again!
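
And if you'd rather not open a text editor at all, the same fix can be applied from the terminal in two lines; a minimal sketch assuming the stock /etc/apache2/conf.d layout found on this era of Ubuntu, with the ServerName changed to suit your own setup:

echo "ServerName localhost" | sudo tee /etc/apache2/conf.d/fqdn
sudo service apache2 restart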

Build an 8TB NAS On The Cheap

Since helping a friend with their network storage needs a few weeks ago, I've been inundated with requests from his friends who would like me to turn one of their old or discarded computers into a network-connected hard drive. While this is certainly something I enjoy doing, there are a number of situations where this might not be the most economically or ecologically sound option. Many of the older machines that I was asked to "revive" were not only too old to be useful, but they would have consumed incredible sums of power while sitting idle. So what's the solution?

For one particular person, I suggested going all out and spending 60,000円 to have a brand new system built, and you can do the same.

To do this, I found an HP ProLiant MicroServer on Yahoo! Auctions for just under 30,000円, four Samsung 2TB hard drives for 5,947円 apiece at BestDo, and 4GB of Samsung ECC RAM at another Yahoo! Auction for 5,000円. After shipping, all of the parts worked out to 60,368円.

HP ProLiant MicroServer (Inside)

As you can see from the picture, there isn't a whole lot of room to work with inside this little box, but slide-out trays and quick-release sockets allowed the system to be fully upgraded in less than two minutes with the new components. The 250 GB hard drive that originally came with the MicroServer was packed into a 5.25" adapter and affixed to the single bay, and the four Samsung hard drives were put into the appropriate bays, easy as pie.

From here the system was fired up, the four main hard drives were configured in RAID5, Ubuntu Server was installed on the primary partition, and the system was further configured to act as a network store, print server, and media streamer.
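
For anyone curious about the RAID step, here's a minimal sketch of how the four-drive array might be assembled with mdadm. The device names and the choice of XFS are assumptions; the drive letters will differ depending on how the disks enumerate on your system:

sudo apt-get install mdadm xfsprogs
sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sdb /dev/sdc /dev/sdd /dev/sde
sudo mkfs.xfs /dev/md0    # or format with whichever file system you prefer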

Not bad for 60,000円 and an afternoon's worth of work.

After everything was said and done, this was the final configuration: AMD 64-bit Processor, 4GB RAM, 250GB Primary / 6TB RAID (as RAID5, 8TB if RAID0), Gigabit LAN, eSATA & 8 USB ports for lots of expansion later, and running at a peak of 52 watts … which equates to just under 4,400円 a year in electricity costs here in Tokai, Japan if left running 24/7.

It's an absolute steal for anyone who wants a central place for their family to store and protect digital assets that cannot be replaced.

Cheap Man's RAID With Ubuntu

It seems people are starting to look at putting some of their older hardware to use as simple network storage servers. Since posting a quick little screen shot a few days ago showing a 32-bit Ubuntu installation with a 7.5TB partition I've had two people ask me how they could do this with a netbook and a bunch of external drives. While I would usually not suggest that people do RAID over USB unless there's a dedicated box for the hard drives, there's nothing wrong with looking at this from a strictly academic standpoint.

What We'll Need:

  • one netbook (Atom processors welcome)
  • a bunch of same-sized USB hard disks (for this example, I will be using five 2TB Seagate hard drives)
  • a powered USB Hub (this depends on how many hard drives you want to connect to the netbook)
  • Ubuntu on a bootable USB stick
  • one of your favorite beverages

Ubuntu is incredibly easy to install, so I won't describe how to do it here. Instead, I'll start from the next step after plugging all the hard drives into the system: configuring mdadm.

First, let's install the package. From the terminal, type: sudo apt-get install mdadm

This will install all of the required packages to use the software RAID tool, and will configure the system appropriately. Next, if you're like me, you'd probably prefer to use a file system like XFS for larger partitions spread across disks. I won't get into the details of why I would choose XFS over other file systems like Ext3, Ext4, or ZFS, but you can install the user-space tools with a simple sudo apt-get install xfsprogs.

Again, because this is strictly an academic exercise, this won't be necessary but it also won't hurt anything if we go this way.

Now comes the fun part. Let's create a definition with mdadm. First, identify the locations of your external hard drives. Ideally, the external disks would not be mounted, as that would make this process quite difficult. In my case, the five hard drives range from /dev/sdb to /dev/sdf. I would like to put these together in a RAID5 configuration, and I would like to use a 16K chunk size. With this in mind, I would type:

sudo mdadm --create /dev/md0 --chunk=16 --level=5 --raid-devices=5 /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf

Let's break this down.

--create /dev/md0 creates the array and gives it a device name (I typically use md0, md1, md2, etc.)

--chunk=16 this is optional, but sets the chunk size for the RAID array. Caution: bigger is not always better!

--level=5 signifies what kind of RAID array we would like (0, 1, 5, 6, etc.)

--raid-devices=5 states the number of drives (or partitions) that will be connected to the array. This is immediately followed by the locations of each drive (or partition) to be attached.

Ubuntu | mdadm started

You should now see something similar to the screen shot above, with no errors anywhere in sight. Let's format the sucker now: sudo mkfs.xfs /dev/md0

Depending on the total size of the RAID array and the speed of the system, this could take a while. When it's all done, you'll see something like the screenshot on the right. So far so good? I bet you're not even half-way through that favorite beverage!

Next, let's confirm that everything is running as it should. If you've configured RAID5 or 6, you've probably noticed that the drives are still busy even though we're not using them yet. This is normal, as the system is building the parity information it will need to recover should anything go wrong with one of the drives later. You can check on the status of the array by typing cat /proc/mdstat. You should see something like this:

Ubuntu | cat /proc/mdstat
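
If you'd rather keep an eye on that rebuild without retyping the command, something like this works nicely (it refreshes the status every five seconds):

watch -n 5 cat /proc/mdstat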

Now we need to create the mdadm.conf file. This is done with a single line (the redirection needs to happen as root, hence the sh -c wrapper): sudo sh -c 'mdadm --detail --scan >> /etc/mdadm/mdadm.conf'

Next we need a mount point for the RAID array. This can be thought of as a very large directory. I like to use things like "ebs", "volA" and "volB" to keep things simple, but you can call it anything you'd like. Let's make that mount point now: sudo mkdir /volA

Then we add a line to the end of the /etc/fstab file, which controls how drives are auto-mounted after a reboot (editing it requires sudo as well). Add: /dev/md0 /volA xfs defaults 0 2

Please remember to change /dev/md0, /volA, and xfs to the array, mount point, and file system you chose earlier, otherwise none of this will work when you reboot. Now you should be able to mount the RAID array by typing: sudo mount -a. This command will mount anything listed in the fstab file that isn't already mounted.
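
As an aside, if you'd rather not rely on /dev/md0 keeping its name between boots, the array can be mounted by UUID instead; a minimal sketch, with the UUID standing in for whatever blkid reports on your own system:

sudo blkid /dev/md0
# then use the reported UUID in /etc/fstab instead of the device name:
UUID=<uuid-from-blkid> /volA xfs defaults 0 2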

If you've made it this far, congratulations. You're all done. Just to make sure that everything is running properly, do a quick cat /proc/mdstat and df -h. You should see something like this:

Ubuntu | Final mdadm Check

Not too bad, if I say so myself. The RAID array is still being checked, and we can see the five 2TB drives have been assembled into a single 8TB partition just waiting to be crammed full of wonderful data! But remember; this is just for academic purposes. There are several advantages to this sort of situation, and several disadvantages. At the end of the day, though, it really comes down to what you need the machine to do.

Another Middle Finger

Seems I can't get anything done today …

WTF Does This Mean, PERL?

Not only is it damn near impossible for me to do what should be the simplest of tasks regarding Amazon's Route 53, but I have to read some smart-ass programmer's stupidly long texts that have sweet nothing to do with what I actually tried to accomplish. Thanks CPAN! You utterly useless stack of binary crap.

Ubuntu's Partition Magic

Kenji is going to love this the next time we chat:

Ubuntu 10.10 (32bit) | 8TB RAID5 Partition

Here we have five 2TB hard drives strung together in a RAID5 partition on the 32-bit version of Ubuntu 10.10. That's eight whole terabytes of goodness waiting to be put to use, and there's a lot more waiting to join the party.

Long story short, Kenji's asked me to build him a server that's bigger, faster, and more resilient to failure than mine. By the end of this week he'll have a monster home server, I'll have the chance to turn theory into practice, and you'll have some interesting articles showing how to get the most out of some old hardware.

This is going to be fun.

Windows Home Server 2011 vs. Linux

Windows Home Server 2011

Microsoft has recently put out the first Windows Home Server 2011 Release Candidate, and I've been playing with it for a while now. The previous release was built on Windows Server 2003 and provided some excellent stability. This new version sits on top of a slimmed-down Windows Server 2008 and requires a great deal more processing power.

WHS and WHS2K11 Requirements

This update seems to require quite a bit more server for the sake of being a server, yet after spending some time with the system (through the last two betas and this release candidate) I fail to see the reason behind the spec update. A faster CPU? That can be explained any number of ways. But why 64-bit? The system is not making true use of the 64-bit architecture, nor should this backup server be used to host software tools such as a database or web site. The same can be said for the RAM. It's true that more is generally better, and that memory is pretty cheap this year, but where is all that memory going?

Looking at the feature list of the two versions makes one wonder.

The most popular and useful feature in Windows Home Server, Drive Extender, was eliminated last year for reasons unknown. Technical problems were cited but, for a company as large and intelligent as Microsoft, this sounds like a cop out. The Drive Extender function made Home Server a real home server. People could add a hard drive to the system and not have to think about what kind of data would be stored on the drive, how it should be partitioned, or even how to manage it. The OS would do it all automatically, making the system simple enough for anyone to use and enjoy. Without this feature, Microsoft is saying that customers will need to know the purpose of each hard drive, and specify where music, pictures, videos, and everything else will be stored. This isn't a problem for people who have a solid understanding of Microsoft tools, but it would be one heck of a learning curve for people who have never had to think about more than one hard drive at a time.

With this in mind, what benefits will Microsoft offer consumers in their Windows Home Server 2011 systems that couldn't be found elsewhere? Automatic backups? No. Redundant storage? No. Flexible expansion? Not really. The system is, from what I've seen so far, little more than a spare box sitting in a corner running a version of Windows and serving files on demand. It's the sort of thing we usually use our old computers for, desktop OS and all, after they've been sent to the back of a closet. The same limitations that we see on most desktop OSes are still found in the next version of WHS despite it being built on a server platform, so why would anyone want to spend the money for Windows Home Server 2011?

Ubuntu Server

This could be a great place for Ubuntu to chime in. Ubuntu is a human-friendly version of Linux that has slowly gained traction over the years. Back in 2007, I had even mentioned that it would be the version of Linux I'd be comfortable having my family use, and it's improved markedly since then. Like many operating systems, Ubuntu comes in two flavors: desktop and server. With a little bit of tweaking, a very user-friendly version of Ubuntu Server could be put together that would do pretty much all of the same things Windows Home Server does, and carry a much smaller price tag.

Nicer still is Ubuntu Server's system requirements: a 300 MHz CPU, 128 MB RAM, and a 1 GB hard drive. This kind of configuration was popular 12 years ago and can be bought at a used computer store for less than 2,000円 most of the time. What's not to like about this?
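
To give a sense of how little tweaking is actually involved, here's a minimal sketch of turning a fresh Ubuntu Server installation into a basic family file share with Samba. The share name, path, and user name are just examples for illustration:

sudo apt-get install samba
sudo mkdir -p /srv/family
sudo chown jason /srv/family          # use an existing Linux account name here
sudo smbpasswd -a jason               # give that account a Samba password

# then append a share definition to /etc/samba/smb.conf:
# [family]
#     path = /srv/family
#     read only = no

sudo service smbd restart             # older releases may need: sudo /etc/init.d/samba restart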

The more I think about it, the more I see Microsoft as having lost touch with the world. Normal people don't want to think about the contents of their computers, let alone configure servers for optimal performance. Instead of making things simpler, they move towards greater complexity. We see this from so many companies, but it's those who offer simplicity who often win out; a lesson repeated time and time again, yet understood by few.

The next version of Windows Home Server will not have Drive Extender, cannot support partitions larger than 2 terabytes, over-complicates procedures with text-heavy wizards rather than automated functions, does not play nicely with system backups, and fails to offer a competitive pricing scheme despite Microsoft's decades of experience in over-charging. Ubuntu Server, on the other hand, doesn't need to suffer from any of these faults.

To prove the point, I'll have a post on here later this week showing just how easy it is to install and configure Ubuntu to be a central network repository. No Microsoft certifications required!

Perhaps it's time for Microsoft to shake up their management and let some of their incredibly talented engineers build the simple systems we know they can.