One hidden effect of cloud computing is that it's no longer possible to do what we all used to do: keep an older PC running, say, Windows XP and Office 2000, which was perfectly adequate for our needs. Now the auto-refresh of every cloud app forces us to "upgrade" even if it's too much for our poky older hardware. Unless, that is, you pay attention during your QA process.
One of the touted advantages of cloud computing (especially Software as a Service) is that it removes the need to support older application versions. The IT department doesn't have to worry about keeping end-user software in compliance with security patches or upgrading the business’ computers with commercial applications (such as Microsoft Office) or custom in-house software. Developers appreciate this, too, since with SaaS, nobody issues trouble tickets on the 3-year-old version of the application which users refuse to relinquish.
So far, so good. The user logs in and gets the latest version of the software. Developers and IT know that the version the user is touching is the most recent build, because that’s all that the user can access. Oh boy! Simplicity! Predictability!
But while savvy developers test their apps on older browsers (even, God help them, IE6), I don't think very many test on older, memory-constrained hardware.
Let me explain.
It used to be that you could stick with your old hardware and software because it did the job perfectly well. Many users’ software needs remain constant. My old Dell Windows system (6 or 7 years old now?), sitting on the corner of the desk, runs Windows XP and uses Office circa 2000. I bet it can read most Word docs (though slowly). I replaced that laptop with a newer Windows laptop almost two years ago, but the old Dell still does the job it was bought to do.
We computer geeks forget that most people are not motivated to buy a new computer. Most businesses try to amortize the equipment over three years, at least, and they hope all the keys will stay attached that long.
In practice, many people use hardware that’s much older. A high school friend works at a nonprofit on 10-year-old hardware, for example. I have a MacBook circa 2001 languishing on a shelf in my bedroom. I have a 4-year-old MacBook sitting on a shelf in the living room; it’s in the same spot where I "temporarily" stashed it while I decided what to do with it after I bought my new MacBook last year. (You notice that I'm not good at actually retiring these things, huh?)
Both of those older computers work just fine, in the sense that no parts are falling off and the batteries haven't given up. (Though the really old computer's keyboard has a strange smell. What’s up with that?) They do everything I asked them to do when I bought them, including running the then-standard software I relied on.
I knew it was time to upgrade the last MacBook when I could no longer run the software I needed. Not just the latest versions of Microsoft Office or Pages (though they got really painful to use), but also the sheer number of web-based things I open within five minutes of booting the system. The old hardware isn't CPU-bound nearly as much as it is memory-bound. Right this second, for instance, I'm in minimal mode on my current system, and I have 8 tabs open in Firefox and 6 in Chrome. On top of Mail, Skype, iChat, a text editor, Preview, and Office. And that's the minimum; I’ve just gotten started today.
With a desktop PC, I could make a conscious choice to keep a system in the stone age. For example, I could stick with accounting software that was four versions “out of date” (because newer versions required more CPU horsepower) as long as that version did what I needed. (And there are only so many features an accounting program needs.)
For someone like me, who embraces cloud-based applications – like the ones you build for me – the “use the old computer as long as it works” option may not be viable for long. Just as hard disks always followed The Law of Closet Space (i.e., you use up whatever you have), developers write web applications on the assumption that their end users have computer systems as powerful as their own… when, I hate to say this, guys, most developers can justify buying far more heavy-duty hardware than the rest of us mortals can. For a developer, the computer is the primary revenue-generating tool. For anyone else, it isn't. In a small business, the decision to upgrade the aging computer is too often put off because the electrician needs a new truck. (I learned that lesson the hard way, back when I owned a computer store.)
Software sloppiness is an undeniable fact, one that’s been around since I began programming on mainframes. Few developers work hard to tune performance for anything beyond speed, and those who do compare application behavior only on current browsers and current hardware. Some developers and QA staff test web apps on older browsers (including you poor slobs who must support IE6). But who among you tests applications on, say, 2006-era PCs? And optimizes the code to help that user get her work done?
With cloud apps, it’s rarely CPU or storage that’s the barrier; it’s memory. I do take the view that it’s wise to install the maximum amount of memory when I buy a new computer (not everyone does), but even that doesn’t help. What seems like “Oh wow, can you imagine using that much?” today is a ridiculous limitation in three years. Older systems simply don’t have as much RAM as newer ones, quite aside from the cost of maxing out the RAM. I upgraded my works-perfectly-fine 2006-era iMac to a newer one last year because the old system had a hard stop at 2GB of RAM; there was nowhere to add more. I spent too much time watching the ball spin as the system thrashed. (My sister-in-law, who got the hand-me-down and uses it purely as a personal computer, was grateful for this technical limitation.)
SaaS apps are just the tip of the iceberg; mobile apps are just as constrained, and things are going to get worse. As SQC contributor Cameron Laird commented to me in an e-mail discussion, “I get old-guy cranky about bloat in Web applications; somewhere I have statistics on how ‘normal’ sites have gone from roughly 100 kB per page to 1 MB over the last decade. As I occasionally and obliquely write, though, the importance of mobile, along with the technical possibilities of HTML5, provides a little countervailing pressure on bloat. Fat (in either screen real estate or bandwidth demands) Web apps can be bad on mobile. Too many organizations still don't get this. They will, more and more.”
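One lightweight guard a QA team can put in place against that kind of bloat is a performance budget: total up the transfer weight of a page's resources and fail the build when it blows past a cap. Here's a minimal sketch in Python; the 1 MB budget and the resource list are illustrative assumptions, not numbers from any real project (in practice the sizes would come from a HAR file or a headless-browser run):

```python
# Hypothetical performance-budget check for a QA pipeline.
# The budget and the sample resources below are made up for illustration.

PAGE_BUDGET_BYTES = 1_000_000  # ~1 MB cap; you'd tighten this for mobile users


def check_page_weight(resources, budget=PAGE_BUDGET_BYTES):
    """resources: iterable of (name, transfer_size_in_bytes) pairs.

    Returns (total_bytes, within_budget) so the build script can
    print the total and decide whether to fail."""
    total = sum(size for _, size in resources)
    return total, total <= budget


if __name__ == "__main__":
    # Illustrative resource list for a single page load
    resources = [
        ("index.html", 45_000),
        ("app.js", 620_000),
        ("styles.css", 80_000),
        ("hero.jpg", 350_000),
    ]
    total, ok = check_page_weight(resources)
    print(f"Total page weight: {total} bytes; within budget: {ok}")
```

It's crude, but even a check this simple makes bloat visible on every build instead of letting it creep up a kilobyte at a time, which is exactly how 100 kB pages become 1 MB pages.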
Are you doing anything to support users of older equipment? At what point do you (or your organization) decide these people no longer matter? I’d like to hear from you.
Photo credit: BarryKidd