Friday, November 18, 2005

The Problem with Computing

The Problem

The problem with present-day computing boils down to this: computers are hard to use for the majority of tasks.

Why are Computers Hard to Use?

At first this seems like it should be a simple question, one that could be answered by conducting some usability studies. In fact, it is an extremely difficult question and the answer is non-obvious.

Ancient Ideas

Let's hop in a time machine and go back to the dawn of modern computing, a little over thirty years ago. At Xerox PARC, the Alto is born. Armed with a glorious 128KB of RAM and a 2.5MB hard drive, the machine contained every major UI concept we have today: the GUI and the mouse. The modern PC is nothing more than a glorified Alto. The display is now capable of billions of colors instead of black and white. Most people use a two-button mouse, down from the Alto's three. Oh, and windows can now be translucent while you drag them around.

The rest of the power of computing is wasted. Hardware has improved by orders of magnitude so that the software will do what it has always done, except faster. Users still experience crashing systems, must learn the intricacies of networking, figure out when to double-click and when to single-click, and so on. Low-level concepts litter the user experience.

One example is how network resources are handled. For some reason, operating systems still don't assume applications are going to use the network, which means, among other things, that there is no consistent method of handling errors. If you are disconnected from the network and try to go to a website, the browser spits out some useless error instead of telling you that you are not connected.
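To make the point concrete, here is a minimal sketch of the check a network-aware application could perform before showing an error. The probe address and the messages are my own illustrative assumptions, not any browser's actual logic:

```python
import socket

def explain_fetch_failure(host: str) -> str:
    """Turn a low-level network failure into a message a user can act on."""
    try:
        # First ask: can we reach anything at all? (8.8.8.8:53 is just a
        # well-known public DNS server, used here purely as a probe.)
        with socket.create_connection(("8.8.8.8", 53), timeout=2):
            pass
    except OSError:
        return "You are not connected to the network."
    try:
        # We have a network; now see whether the host itself resolves.
        socket.getaddrinfo(host, 80)
    except socket.gaierror:
        return f"'{host}' does not appear to exist. Check the address for typos."
    return f"'{host}' exists but is not responding. It may be down."
```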

Abstract Engineering Concepts

Two key ideas make computing difficult for the majority of users: file systems and applications.

File Systems

Out of the primordial ooze of modern computing, from a time when only engineers used computers, came the idea of the file system. File systems are an OK way to store data and manipulate it when using a single computer. They make sense when the designer of the data store has no idea what facilities an application will need.

A typical modern file system is logically structured as a hierarchy. However, a rigid hierarchy is only partially useful for working with data, and soon symlinks were tacked on as an afterthought. This incremental improvement may have seemed like a good idea at the time, but the choice haunts us to the present day.

Think of the myriad ways we package and organize files: various file compression schemes, package management systems, application installers, IDE project files, countless specialized file organizers like iTunes, and document management systems, just to name a few. What all these schemes have in common is that they take a set of files and perform some action on them.

I see no reason to reinvent the wheel every time a useful new batch program is made. Perhaps more of these tasks can be pushed down into the file system. For example, a 'directory' could be designated to use some form of compression, and thereafter any file added to that directory would automatically be compressed. If that 'directory' were to be copied or uploaded, it would be sent as a single compressed file. In Unix, at least, this concept is not new: there already exist special files, like TTYs, that cause the OS to perform an action when they are interacted with.
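As a rough illustration, here is a userspace sketch of such a compressing 'directory'. A real version would live inside the file system itself (as a FUSE layer, say); the class and method names are invented for the example:

```python
import gzip
import shutil
from pathlib import Path

class CompressedDirectory:
    """A userspace stand-in for a 'directory' that compresses whatever
    is added to it. Purely illustrative; a real implementation would
    be transparent to every application."""

    def __init__(self, path: str):
        self.root = Path(path)
        self.root.mkdir(parents=True, exist_ok=True)

    def add(self, source: str) -> Path:
        """Copy a file in, transparently gzip-compressing it."""
        src = Path(source)
        dest = self.root / (src.name + ".gz")
        with src.open("rb") as fin, gzip.open(dest, "wb") as fout:
            shutil.copyfileobj(fin, fout)
        return dest

    def export(self, archive_name: str) -> str:
        """'Copying' the directory elsewhere sends one compressed file."""
        return shutil.make_archive(archive_name, "gztar", self.root)
```

Usage would be as simple as CompressedDirectory("news_images").add("photo.bmp"); the caller never thinks about compression again.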

Present file systems also suffer because they are unaware of the network. This leads to many extremely common problems. Files are duplicated between local machines and file servers. Special software must be written to enable even simple collaboration on a given file. An even more basic problem is that of limited file access. At my family's home, we have four desktops, three laptops, and a PDA. My dad needs access to some file from his PDA. I need to work on files from both a desktop and a laptop. My mom would like to play her casino game from any machine. But there is no file server. The result is information anarchy.

Applications

What is an application? It depends on who you ask. An engineer might tell you it is code that executes on a CPU and usually performs some operation on data. I think the remaining 98% of the population will say an application is a file of some kind. They will say that the letter they wrote to their grandmother is an application, or that their contacts are an application. In other words, users make no distinction between their data and the programs that let them work with their data.

If you have ever attempted to explain to a lay user how to perform a basic task with their computer, you will understand why this is a problem. The user wants to write a letter. You tell them to launch an application like Microsoft Word. You might explain the precise steps by which they can accomplish this: click on Start, go to All Programs, find the Office shortcut group, find Word, click it. Then the user can start typing.

All the steps prior to typing the document are meaningless to the user, because users think in terms of verbs, but until the user gets into the appropriate application, there is no accessible verb metaphor. In other words, the user does not want to launch Word, they want to write a letter.

This problem is further exacerbated by the lack of a unified approach to applications. I would guess that the vast majority of users would have extreme difficulty understanding what a simple text editor is and how it differs from a word processor, not because they are incapable of understanding the feature differences, but because the interfaces are so different. And that is the simple case. Consider Microsoft Outlook. Outlook uses Word as its email composition editor, yet the interface is radically different. Why doesn't every application in the system use a single text-editing component?

Intersection of File Systems and Applications

Some problems arise when the shortcomings of file systems and applications are combined.

Why are databases and file systems different? They both do essentially the same thing: store discrete bits of information. The database simply has more metadata. I think the only reason they are different is because "that's just the way it is." It would be useful for many applications to have database actions available at the file system level. One obvious area where this applies is computer games. Nearly every computer game needs a store of object/level/monster/weapon information, and most games end up implementing some sort of crude packaging tool to combine all the objects into one file. And when this problem is addressed at the operating system level, it is not done well. Consider the Windows registry: it stores only configuration information and is accessible only via Windows API functions.
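For instance, a single SQLite file can already play the role of the crude packaging tools games keep reinventing. A minimal sketch, with a schema invented purely for illustration:

```python
import sqlite3

# One file on disk holds every game asset, with queryable metadata.
con = sqlite3.connect("assets.db")
con.execute("""CREATE TABLE IF NOT EXISTS assets (
                   name TEXT PRIMARY KEY,
                   kind TEXT,   -- 'level', 'monster', 'weapon', ...
                   data BLOB)""")

def store(name: str, kind: str, payload: bytes) -> None:
    con.execute("INSERT OR REPLACE INTO assets VALUES (?, ?, ?)",
                (name, kind, payload))
    con.commit()

def load(name: str) -> bytes:
    row = con.execute("SELECT data FROM assets WHERE name = ?",
                      (name,)).fetchone()
    return row[0]

store("goblin", "monster", b"...model bytes...")
print(len(load("goblin")), "bytes loaded")
```

The point is not that every game should embed SQLite, but that "a set of named blobs plus metadata" is exactly what both a file system and a database provide.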

High Base Complexity

The minimum amount of knowledge and the number of concepts (the base complexity) a user must understand in order to use a computer effectively are too high.

Consider what a typical office worker must understand to send an email. First, the user must be aware of the concept of email. Most likely they have only a vague idea that email is a little like mail, but different. Then the user must think, "If I want to send an email, I must start Outlook." Once in Outlook, the user needs to figure out how to bring up the new-mail screen. Next, the user needs to understand the concept of email addresses. Assuming they know what an email address is, they can then begin working on the actual email. I suspect it would make more sense if the user could intuitively find the contact they want to email, without launching a program, and then click some kind of "write email" button.

Beyond conceptual problems, the task of system administration is daunting. A typical Windows user is simply incapable of administering their own system. Doing so would require knowledge of the registry, screen resolution and color settings, and storage partitioning, just to list some basics. Security administration is even more complex, as it relies on detailed knowledge of how various components function, and thus how they expose vulnerabilities.

Perhaps even more absurd is the knowledge of hardware required to intelligently purchase a computer in the first place. CPU speed, multiple cores, amount of RAM, hard drive capacity, 3D video... I suspect part of the reason manufacturers have not devised a better way of informing lay users about the attributes of a given computer is that they thrive on the user's lack of knowledge. Every time I visit a computer store, I see systems priced from $600 to $2500, and as far as I can tell, the difference between a $1000 system and a $2000 system on the display floor is negligible. I think retailers are hoping their terminology is so confusing that lay users will purchase a machine based on perceived prestige.

Once again, two hard problems (too many software concepts and a poor understanding of hardware) combine into an extremely difficult one: putting a powerful computer in front of every user means that every machine must be administered separately. Since each machine can be configured in myriad ways, administrative tasks that should be simple and automatable become impossible to implement successfully on every machine. Consider Windows security patches. One would imagine that a closed-source proprietary system would be reasonably easy to update automatically, yet consider the pains corporate IT departments go through to determine whether a given patch will break some critical functionality.

Ignoring the Problem

So how have engineers attempted to fix the problem so far? Mostly by explaining why the problem is not really a problem (a.k.a. blaming users), or by attempting to fix the wrong problem.

Common Explanations

  1. You say computers are too hard to use, but that is because they are complex. Shouldn't users learn how to use computers? And if they can't, then they are lazy/dumb/etc.
    • Actually, I think users are pretty smart. They can understand complex concepts like business plans and convoluted laws. What has happened to the computer industry is similar to what has happened to the study of English: it has evolved to the point that it is incomprehensible to those outside the field. For decades computers have been built to fulfill the needs of large sets of people, but have been designed and approved by a non-representative subset of those people. Just because the fraction of the population designing information systems thinks the designs are good and that people need training doesn't mean it's true. Perhaps the designers need training and the rest of the population is right.
  2. Users are generally not logical and therefore they need rigid methods of information organization imposed on them.
    • I suspect everyone has about the same ability to organize information. So why should they have to learn the engineer way of doing things? With our present hardware capabilities, we can let them work how they want to work.
  3. Once a user learns about computers, it's easy.
    • Nope. The problem is that users do not "learn about computers"; they actually "learn about tasks". For example, a user will learn the step-by-step details of sending an email, but the idea of adding an attachment to the email is a whole new thing to learn. Since users do not learn the concepts behind each task they perform, each new task has the same steep learning curve.
  4. Users think engineers are snobs only because they (the users) are dumb.
    • Nope. A doctor can converse with a lawyer. A business person can converse with an artist. Most professionals can talk with professionals from other fields and understand what the other person is talking about. Computing is the only industry that has a presence in every single home, school, and business, but whose professionals cannot have reasonable conversations with professionals from other fields. Thus, it must be the engineers who have the problem, not the users.
  5. Users are dumb; we cannot make computing brain-dead simple.
    • Perhaps we can't make it brain-dead simple, but we can do a lot better than we do now. Why should a user need to know about the intricacies of security when all they want to do is write a letter?
  6. The present model of computing might be difficult, but it's really useful having the power of a desktop PC.
    • Not really. My guess: 90% of tasks that 95% of users perform are basic (Office functionality, email, managing contacts, etc). The added benefit of increased system performance usually does not outweigh the burden of system administration.
  7. The current way of doing things has been around for over 30 years, so we must be doing something right.
    • No, stability in technology is harmful. Technology does not stabilize, it stagnates.
  8. Look at all the software that IT purchasers and business guys buy. The tech budget of corporations must mean we are doing things right.
    • Nope, they are just buying what is available. Most of the time it's not even a technical decision. When was the last time you saw a website selling “enterprise-class” technology and actually understood what they were talking about, or even what they were selling? The only reason I know Siebel sells CRM software is because I know Siebel sells CRM software. Nothing on their website would indicate that is what they do, or even what CRM is. Heck, my primary job is implementing CRM, but even I find it hard to explain how CRM is different from a glorified contact management system.
    • One reason corporate IT budgets are so high is because the computing industry is doing so many things poorly. So the costs add up in terms of tech support and consulting.
    • Corporations are run by business people. They trust us engineers to build what they need. We tell them “this is what you need”, they say “but I would like it if it did this”, then we say “that would require perfect AI, we can't do that.” But maybe we can do it. Maybe we just don't see the solution because our tools are so poor.
    • Popularity means nothing. Read Paul Graham's essays.
  9. Well, maybe things suck for lay users, but our development tools are awesome!
    • I see you have never written software. Our development systems are so bad that when a RubyOnRails or a TurboGears is announced, programmers are shocked that someone developed a nice, clean, integrated way to develop web applications.
  10. Ok, things might suck for users and developers, but our tools are awesome for science and research.
    • No, our tools are horrible and our abstractions are useless. That’s why all the scientists use software like Mathematica.
  11. Computing isn't stagnant; look at how much it has improved.
    • The hardware has improved. It's improved by several orders of magnitude.
    • The software has not improved. Please refer to the section titled "The Consequences".

Attempted Fixes

  1. If we make a bunch of small programs and let them pass data, everything will be great because users can connect these programs however they like!
    • This is the present *nix approach. It is usable by engineers, but useless to users.
  2. Our new UI for *insert application type* will make it simple
    • No, you are building on a broken foundation.
    • It isn't just one application that needs fixing, it's all of them.
    • You are probably only solving the problem for those who already understand computing.
    • Users don't even know they have a problem; we have beaten them into accepting our superiority.
    • People do not learn concepts, they learn tasks. A new UI will just confuse them.
  3. Konfabulator shows just how good our interfaces can become
    • Konfabulator also shows how shitty our present computing is. Oh look, pretty widgets that make sense. Cool. Oh, it's kinda weird when I want to add new widgets to my desktop; that interface is kinda ho-hum. Back to where we started. Why can't I create a task by clicking on the calendar? Shit. My task list doesn't appear in Outlook! Fuck.
  4. So you are saying that our interfaces aren’t pretty enough.
    • No, they are plenty pretty. They just don’t make any sense.

The Consequences

Now that we have seen how the problem manifests itself, what are its effects?

Monetary expense

Mistakes cost money, and computing is full of mistakes right now.

Technical support is a term dreaded by users. Countless people toil in cubes trying to help users with impossible-to-reproduce computer problems. This is in sharp contrast to web applications, which are generally support-free.

In addition to the software support costs, the millions of computers distributed across the world need physical repair. This means there are costs to transport the parts, transport the PCs, employ countless certified technicians, and so on. The downtime of a downed consumer PC is generally far greater than that of a server in a server farm, and when consumer PCs fail, they tend to lose a lot of information due to the lack of backups.

There is also the cost of distributing the hardware and software. Giant factories need to produce ultra-complex computers. These might then be sent to giant warehouses. The warehouses need trucking to deliver the units to retail stores. The retail stores employ countless salespeople to pester clients. Every step has a cost.

Some monetary expenses are less obvious. The value of lost data is intangible, but significant. And it happens all the time. Every time a consumer hard drive crashes, or a consumer gets a new PC but forgets to copy over all of their old information, data is lost. Perhaps even more significant is the enormous electrical requirements to power the millions of consumer PCs that are utilized for only a fraction of the time they are powered.

Software development

The lack of basic abstractions at the file system level means that programmers tend to re-implement the same concepts over and over again. Consider all the package managers, compression formats, and so on.

The focus on the platform also leads to the idea of "portable code". Yet the kinds of applications where portable code is important are the exact same kinds of applications that would be better implemented as web applications.

Also, targeting platforms means that software is released in versioned packages. This is an outdated model of software development. Many of the issues that making a release is supposed to solve, like bug fixing, are non-existent in the web application model. Indeed, quality assurance for web applications has more to do with ensuring scalability than, for example, worrying about poor interaction with a video card driver.

The low-level concepts littering high-level computing also leave the average person unable to code. Thus, in many companies the IT department essentially acts as a group of internal consultants. Data in databases is hard to access, so "Business Intelligence" employees design and run queries against a database so that management can get the numbers they need. I think this inability to perform aggregate operations on data could be resolved by improved UI access to data and better models of information storage.
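To see how low the technical bar actually is, here is the sort of aggregate a "Business Intelligence" employee might run for management. The table and figures are invented for the example; the point is that the query itself is one line:

```python
import sqlite3

# An in-memory database standing in for the corporate data store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("west", 120.0), ("west", 80.0), ("east", 200.0)])

# The "report" management asked for: total sales per region.
for region, total in con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(region, total)
```

That a dedicated employee is needed to produce this says more about our interfaces than about the difficulty of the question.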

Other Consequences

* Aesthetic Expense

Perhaps less important are the aesthetic consequences of having high-powered PCs. They are generally loud due to their cooling requirements, and they are generally fairly large and ugly.

* Inherent Tendency for Monopoly

Due to all of the monetary expenses and the difficulty of administration, there is a strong push to standardize on the dominant platform. I once read an article on Slashdot about a company moving from Linux back to Windows. They simply did not have the internal capacity to support both platforms, so they standardized on one.

* The dumbification of selling technology

Technology is now sold in terms of prestige, or by intentionally attempting to confuse consumers with complex specs. Consider Intel commercials from a couple of years ago: how is the Blue Man Group related to CPUs? Or consider television ads for PCs, where the announcer mumbles through a whole list of specs. Expecting the average user to understand the details of each component of a PC is absurd.

* Other Idiocy
o Self-healing machines/networks/etc., OR: how to sell someone more expensive hardware and additional software while providing no value.
o Vertical markets, OR: our software must be rigid to be reliable and developable, and therefore we need fifty versions of everything.

* Obsolescence

When was the last time you absolutely had to run out and buy a new computer to run some application? Probably never. Obsolescence in computing is a phenomenon accepted as unavoidable, but it need not be. Software tends to be stable for longer intervals than hardware, so a way to decouple software from consumer hardware would be useful.

* UI Design

Present UI design assumes engineering abstractions are good and does not attempt to fix them.

My Dad: A Case Study

I will now discuss the ultimate layman: my dad. He has two traits that hinder his ability to use computers effectively: English is his second language, and he did not grow up around computing. However, he is highly intelligent and has two master's degrees.

Abtin, can you show me something?

Every time my dad needs to perform some new task using a computer, he asks for my help. Armed with a legal pad and a pen, he takes careful notes while I guide him through the steps of accomplishing what he needs to accomplish. He then attempts to repeat the task using only his notes. If he gets stuck, I help him and he takes additional notes to clarify the step.

At my job, I frequently go through a similar process with employees who must use office applications. Try as I might, it is extremely difficult to teach users the concepts behind a task. Thus, users are given nothing more than simple algorithms with which to tackle complex tasks.

When note taking fails

My dad likes to save images from news sites to his computer. So he asks me how to do so, and I show him. What I would really like to explain to him is how the file system is structured and what that means for his files and how he works, but I have a limited vocabulary in my dad's native language, so the ultra-abstract concepts are too hard to explain.

My dad happily saves his images until, one day, I get a phone call: all of the images he has recently saved are missing! I examine his computer and immediately discover the problem. For some reason, the Internet Explorer save dialog is no longer saving his images to the "NewsImages" directory I created, but to his "My Documents" directory. Since I cannot explain to him the concept or structure of the file system, I move his files to where he expects them to be and make sure Internet Explorer is pointing to the right directory. I cross my fingers and hope he does not have the problem again.

The PDA

About two years ago, I noticed that my dad kept several legal pads full of contact names, phone numbers, and addresses. I decided I wanted to help him organize his contacts better; however, the thought of teaching him how to use Outlook or any other application was frightening.

One day I was at Costco, where I noticed they had several PDAs for sale. I examined the floor unit and realized that the interface was so simple, my dad would be able to use it. All he would have to do was click on "Contacts" from the main menu, and the resulting contact management screen would be extremely simple. I purchased an iPaq for his birthday, and he was delighted to start using it. This was the first time I did not have to give him step-by-step instructions. I simply showed him how the main menu was structured, and he figured out how to use it. I also set up Outlook on his PC so that he could plug his PDA in and back up his data; I did not show him how to use Outlook.

A few weeks later I checked in on my dad. His PDA now contained hundreds of contacts, and he claimed that he couldn't live without the device. While I was pleased by his acceptance of the technology, I was genuinely surprised when I examined his PDA and discovered he had begun to use the scheduling functionality as well: he had entered a number of meetings. I asked if he had sought help from anyone to accomplish this, and he said no, he had figured it out on his own.

The Internet

So how does my dad handle the internet? It depends on the task.

News and email are probably the two most important tools my dad uses, though he does not really understand them. He can send emails to others, and he can also pull up a news website. But frequently he confuses the two. He might see a website address and try to send an email to it. Or he sees an email address and tries to open it in a web browser. He is not the only one with these problems: his friends tell him things like "email me at www.somedomain.com".

These would not be such major issues, except that the error messages web browsers generate are hideous. They should use simple regular expressions to determine whether the address entered is an email address and tell the user about the problem. They should detect when they are not connected to the net and say "You are not connected to the internet" instead of "host not found".
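A sketch of the kind of check an address bar could run; the patterns below are deliberately crude assumptions, not production-grade validation:

```python
import re

# Rough heuristics: "something@something.something" vs. a host-like string.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
URL_RE = re.compile(r"^(https?://)?(www\.)?[\w.-]+\.[a-zA-Z]{2,}(/.*)?$")

def classify_address(text: str) -> str:
    """Decide what the user probably meant and respond in their terms."""
    text = text.strip()
    if EMAIL_RE.match(text):
        return (f"'{text}' looks like an email address. "
                "Open your mail program to write to it.")
    if URL_RE.match(text):
        return "ok"  # proceed to fetch the page
    return f"'{text}' does not look like a web address."

print(classify_address("dad@somedomain.com"))
print(classify_address("www.somedomain.com"))
```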

I also suspect that users (like my dad) frequently do not read the content of the web page they are looking at. I've noticed that users will fixate on some portion of the screen and ignore everything else. This causes problems for things like viewing shopping carts, creating accounts, or logging into websites; these actions are usually placed in the corners of the web page, and users fixate on the center of the page.

5 Comments:

Blogger Arvind N said...

Great post. Some things about computers that I always felt but could never put my finger on.

11/25/2005 6:43 AM  
Anonymous Anonymous said...

I agree. Computers fail to be accessible to everyone who could benefit from them. Ever tried teaching your grandparents to find e-mail? Almost impossible.

11/25/2005 8:48 AM  
Blogger Ole said...

This is a great summary of a problem. So, what's the solution? I read the whole post thinking you were going to propose a solution, then it ended :)

I think many of us have realized some or all of the things you've written about, and yet we continue doing things the same way, because it works for us in some fashion, and because we have no alternative.

The popularity of Macs - based on a simpler GUI - is one indication that people do want simpler and more intuitive computing experiences. The popularity of dedicated-function devices like cell phones and MP3 players and PDAs is another indication; these devices are simpler and easier to use than general purpose computers. (I use a Treo, which combines phone / PDA / MP3 player in one device, which I like, but many people take one look at the Treo's QWERTY keyboard and are horrified!)

11/25/2005 8:51 AM  
Blogger Bob said...

You have a clear grasp of the problems, and no doubt have ideas for a solution. I look forward to hearing about them.

11/25/2005 9:40 AM  
Anonymous Anonymous said...

So I read it again.

You are right, the answer is web-applications. The answer is to not give a user a computer, give them a terminal, let the hardware be at the other end.

12/13/2005 1:08 AM  
