In my last post, I talked about the four pillars of good technology:

  • Useful
  • Purposeful
  • Accessible
  • Usable
(A note: Technology needn't be computer-related, though these days it often is; in this post, "technology" means tools of any kind. When you're evaluating computer tech solutions, it's often helpful to compare them to other technologies we don't even think of as tech--you know, the wheel? doorknobs? your car?)

It might be easy to confuse "useful" with "usable," but the two concepts differ. Here's a short way to evaluate technology solutions based on these four pillars:

1) Useful: Does the technology solution solve a problem that was previously unsolved? For example, I've been searching for a while for a form-building program for non-technical people. There are lots of form-building programs out there; I haven't yet found one that doesn't insist you understand (to at least a small degree) databases and their connections, JavaScript, and the methodology, or at least the concepts, behind form submission. "Open source" is all well and good if you're a developer of some kind, but if you're a non-techie, the best you're going to do is create drag-and-drop HTML (which is, by the way, not a small thing, and quite revolutionary in its way).

The point is, all technology--whether it's a web site, a web app, a desktop app, a new console, or a new device--must at its heart solve a problem.

If it solves no problem--or, in fact, creates new problems--it's not a good technology. Take, for instance, Second Life, the most-hyped "big new thing" of 2005. Second Life solved a problem almost no one had: the lack of a First Life. It created a virtual world where one wasn't needed, and it didn't do so in a way most people found useful. I mean, it's cool to fly around; but if you have your hands full with your First Life--and your Second Life offers nothing better than your First Life except worse graphics and the ability to fly--it is not a useful application.

Less extreme and more pedestrian examples abound. Google Docs, for example, is useful in a way that many people may not have predicted. Instead of relying on check-in version control systems built for techies, Google Docs offers collaboration and document sharing that, in retrospect, it's surprising no other easy-to-use technology offered before. It's boring to share a spreadsheet; but it's thrilling that you no longer have to worry about your server filename conventions. Conversely, most much-hyped iPhone apps solve problems that didn't exist in the first place. OK, maybe some people thought it was essential to have ladies' chests jiggling at the shake of a phone, but come on.

The "Usefulness Pillar" divides that which is helpful from that which is merely new.

2) Purposeful: Does it zero in on the problem and solve it elegantly, with as few gewgaws as possible? Most people would put this under "Usability," but I think it's a different issue. There was a great Dan Clowes cartoon in the '90s that was his vision of the future, and one of his most prescient drawings was of the "hi-fi pizza." I use the "hi-fi pizza" as shorthand whenever I see a gadget or technology that tries to do too much at once, instead of laser-focusing on its purpose and executing that purpose with ease and grace. The reason everyone loved the iPod is that it was what it was: an MP3 player. It had a purpose, and it served its purpose well. The reason everyone hates their cell phones now (outside of iPhone users--though I suspect they'll hate their hi-fi pizzas soon enough) is that the cell phone is not purposeful. It wanders aimlessly between its functions as a camera, a time-minder, a phone, a calendar, an IM client, an email client, and so on. If the engineers could make the cell phone a microwave, I can assure you they'd do it without a second thought.

Purposeful technology includes dedicated GPS systems, the wheel, trivets, Travelocity, and three-hole punches. Purposeless technology includes fax/printer/scanner combos, Amazon.com, and anything labeled "all-in-one."

3) Accessible: People often conflate accessibility with Section 508 compliance or "accessible design" for the disabled and infirm. What most people forget is that we're ALL disabled and infirm in one way or another. That's why it's great that the term "universal design" has supplanted "accessible design" in many contexts (although it, too, will go the way of all euphemisms eventually).

I believe that people who are "tech savvy" are some of the most insecure people in the world, and they protect their technological fiefdoms jealously. After all, if everyone knew BASIC, we could all be out there writing confounding "GOTO" statements! (Yes, I was born a long time ago.) The point is that tech people love confusing non-tech people, and most of us believe that by doing so, we're protecting our long-term livelihood. This is just false, and self-defeating; resorting to jargon and keeping the masses away from the actual building of things like programs, web sites, applications, and models of any kind merely prolongs the inevitable--and keeps the AV geeks from developing really useful skills.

Now, this is not to say that every technology can be made accessible to everyone; but anyone who's old enough to remember the desktop publishing revolution of the early '90s knows that it's only a matter of time before all the tools become available to the great unwashed.

So why not focus on bringing the tools to people's doorsteps now, and training people in their proper use, rather than wringing one's hands when suddenly anyone can make a new mousetrap in the privacy of their own home? I mean, the open source movement is (supposedly) predicated on this idea, though in practice you have to be a tech guru to make sense of any of it--and, like Wikipedia, much of what's produced is crap. But imagining that the production of things won't be democratized eventually is like saying "OMG! Now ANYONE can set type! The world is ending!"

Anyway, "accessibility" to me means that anyone with half a brain should be able to use the technology easily; and if they can't, it's a design flaw. And trust me, I didn't always think this way; I used to be the biggest proponent of "you're obviously not qualified to use this, if you can't see its genius." But years of working with other human beings have taught me that the most beautiful solutions are almost always accessible to everyone; and if they're not, they're probably not working.

4) Usable: Look, I'll leave this to the estimable Jakob Nielsen. Or not.

Usability's gotten a bad rap among most of us who work with screen-based media, and rightly so. Principles were expounded and discarded and then thrown in our faces; rules were made and broken; suspect methodologies were tried, and people paid a lot--a LOT--of money for them. The reality of the design world is that you can only "focus group" things that exist, and even then, a closed room in Bala Cynwyd, PA, with a bunch of sandwiches and a one-way mirror is not a realistic environment in which to do it.

Well, OK, so then came the "usability experts," who claimed to have developed a scientific methodology involving, of all things, eye-tracking and "heat maps" and video cameras and so forth. This, everyone claimed, would give us "data."

Forget whether this data was actionable. I mean, a web site or an application is a living thing, so you can't test it on paper or in static form; you can only test it when it's done--and by then it's too late to change its core. I've heard of people "usability testing" wireframes and other prototypes and quite frankly I think it's BS--interactive media just does not come alive 'til it's done.

Furthermore, the whole idea that there's some universally-agreed-upon set of usability standards is nonsense. Ask ten people, you'll get ten answers. "Is it usable?" is the new media equivalent of "Did it test well?" and both questions are absurd.

Yet--yet, there's something there that people want to get their hands around, and they want to do right, and they don't have the objectivity to measure themselves. So what is this thing, this "usability" stuff? What is it, really, and how can we make it happen?

As you can see, I'm not a big fan of testing; it's a false environment no matter how you cut it, and the metrics used are usually wrong. So how can you make something usable?

First, you can follow the three principles outlined above; if you do that, you're three-quarters of the way there. But second, you can use some common sense and follow some basic rules. Rule one: does everything work? This sounds simple, but it's not; most products and technologies are not adequately QC'ed, and 90% of their "usability" problems would be solved by good quality assurance. Rule two: is it simple and uncluttered? The iPod is the best example of this. For every button, nav item, wire, or hoozit you want to put on your creation, ask yourself whether it's necessary. Eliminating hoozits (as Dr. Seussian as that sounds) is the key to most of the rest of your usability problems. And rule three: does it fit with the human body or human language? If it's a device, make sure it is sized correctly and gestural in the right ways; if it's a web site or application, make sure the words, signs, and symbols used are clear, appropriate, grammatical, and well-arranged.

Again, even if you only do the first two--making it work and making it simple--you'll have better usability than 90% of the products out there; following the other three pillars will do the rest. And that, my friends, is my manifesto!