Philip Guo (Phil Guo, Philip J. Guo, Philip Jia Guo, pgbovine)

Same interface, works better

The easiest new technologies for people to adopt are those that work better than their predecessors while maintaining the same interface to users. If your technology is better than its predecessors but requires a different interface, then in order for it to succeed, the benefits had better outweigh the inconvenience associated with adapting to that new interface.


The easiest new technologies for people to adopt are those that work better than their predecessors while maintaining the same interface. An interface is broadly defined as the manner in which users interact with a particular technology. For instance, the interface to a computer monitor is its power plug, video input connector, power switch, and menu buttons; the interface to a VCR is mainly the remote control; the interface to Google web search is a text box on a web page. If an improved version of a technology does not require a change in interface, then users who are accustomed to the existing interface do not have to learn anything new in order to upgrade, thus potentially speeding up its adoption rate.

In reality, many advances in technology require changes in interface; for example, the telephone is strictly superior to the telegraph for holding conversations, but people accustomed to sending telegrams had to adapt to learning how to use early telephones. Computer word processing software is strictly superior to typewriters, but people migrating from typewriters to Microsoft Word need to learn about saving, loading, and printing digital document files. If your technology is better than its predecessors but requires a different interface, then in order for it to succeed, the benefits had better outweigh the inconvenience and learning curve associated with adapting to a new interface.

Sadly, there are cases in which new technologies try to maintain the same interface but actually work worse than their predecessors; even more unfortunate are those technologies with different interfaces than their predecessors but which end up working worse. Needless to say, those technologies are often not widely adopted.

To summarize, here is a table providing some examples of technologies in each quadrant:

                         Works better                   Works worse

  Same interface         Energy-saving light bulbs      OpenOffice
                         LCD monitors
                         Digital signature pads
                         Algorithms
                         Google web search
                         ssh
                         rsync

  Different interface    Computer word processor        Desktop Linux for casual users
                         gmail

This article will mainly focus on describing examples in the first quadrant (same interface, works better), since they are often the most easily and widely adopted.

Same interface, works better

Energy-saving light bulbs

Let's start simple. The interface for an ordinary household light bulb is pretty darn simple: just a single screw-in socket (<insert your favorite Polish light bulb joke here>). The energy-saving compact fluorescent light bulb (CFL) has the exact same interface as a traditional incandescent light bulb, but it simply works better: It uses less than a third of the power, produces less heat, and can last up to 10 times as long before dying. Thus, for most ordinary household applications, a CFL is strictly superior to an incandescent light bulb, with the exact same interface.

Of course, CFLs aren't without their disadvantages: They take slightly longer to achieve full brightness, cannot be used with light dimmers, and might not produce the proper color temperature if you're trying to achieve a certain type of ambiance in your home. However, these limitations should not affect the majority of people who simply want to purchase an ordinary household light bulb.
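The power savings compound over a bulb's lifetime. Here's a back-of-envelope calculation; the wattages (a typical 60 W incandescent vs. a 14 W CFL of roughly equal brightness) and the 4 hours/day usage are illustrative assumptions, not measured figures:

```python
# Illustrative assumption: 60 W incandescent vs. 14 W CFL of similar brightness.
INCANDESCENT_WATTS = 60
CFL_WATTS = 14

def yearly_energy_kwh(watts, hours_per_day=4, days=365):
    """Energy consumed in a year, in kilowatt-hours."""
    return watts * hours_per_day * days / 1000

incandescent = yearly_energy_kwh(INCANDESCENT_WATTS)
cfl = yearly_energy_kwh(CFL_WATTS)
print(f"Incandescent: {incandescent:.1f} kWh/year")  # ~87.6 kWh
print(f"CFL:          {cfl:.1f} kWh/year")           # ~20.4 kWh
print(f"Savings:      {incandescent - cfl:.1f} kWh/year")
```

Under these assumptions, one swapped bulb saves on the order of 67 kWh per year, with zero change in how you screw it in or switch it on.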

LCD flat screen computer monitors

Remember the bad ole' days of bulky CRT computer monitors that weighed a ton? Nowadays, LCD flat screen computer monitors are fairly affordable and pervasive. For almost all users, they are superior to their CRT predecessors in every practical way, most notably in size and weight. And they have exactly the same interface: plugging into a computer video output.

Digital signature pads at store checkout counters

You know those digital signature pads at store checkout counters where you use a plastic stylus pen to sign your name on a touch-sensitive monitor? They provide the same interface (signing your name on a flat surface) as the paper receipts store clerks printed out and had you sign back in the bad ole' days.

Of course, some nitpicky folks might argue that you can provide a higher-quality signature using pen-and-paper than you can on a touch-sensitive monitor, but for most practical purposes, aesthetic beauty isn't the primary objective; stores just want to see some scribble that remotely resembles your signature, and most of the time the clerks don't even look at it. Not having to archive (or scan in) reams of paper receipts saves stores lots of money, space, time, and hassle.

Algorithms (e.g., for machine learning)

An improvement in algorithms to solve a particular problem (e.g., using machine learning) can usually be deployed right away without much hassle, because the interface to users remains exactly the same. (Some refer to such improvements as optimizations, since some metric of performance strictly improves while the interface remains identical.)

A classic textbook example of algorithmic improvement involves sorting an array of numbers. Students in introductory Computer Science courses learn about O(n^2)-time sorting algorithms such as bubble sort and insertion sort, and then learn about quicksort, which runs in O(n * lg n) time on average (its O(n^2) worst case is rarely hit in practice when pivots are chosen randomly). Quicksort is better than bubble or insertion sort for almost all practical applications, and its interface remains exactly the same (just give it an array of numbers). Thus, it's prudent to prefer quicksort over bubble or insertion sort.
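The interface parity is easy to see in code: both sorts below take a list and return a sorted copy, so callers literally cannot tell which one is running. A minimal sketch:

```python
import random

def insertion_sort(nums):
    """O(n^2)-time sort. Interface: list in, sorted list out."""
    result = list(nums)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]  # shift larger elements right
            j -= 1
        result[j + 1] = key
    return result

def quicksort(nums):
    """O(n lg n) average-time sort with the exact same interface."""
    if len(nums) <= 1:
        return list(nums)
    pivot = random.choice(nums)  # random pivot avoids the worst case in practice
    less    = [x for x in nums if x < pivot]
    equal   = [x for x in nums if x == pivot]
    greater = [x for x in nums if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

data = [5, 2, 9, 1, 5, 6]
# Either function can be swapped in without callers noticing:
assert insertion_sort(data) == quicksort(data) == sorted(data)
```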

As a simple real-world application, take face detection, in which the interface involves receiving a photograph as input and outputting the regions in which there are human faces. If you implement an improved face detection algorithm that achieves higher recognition rates and lower false positive rates (without much detriment in speed or memory usage), then it's trivial for the next version of your product to incorporate your improvements. The interface remains the same, while now the face detection works just a tad better.
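To make the "trivial to incorporate" point concrete, here is a sketch of how product code can depend only on the detector's signature. All names here are invented for illustration; this is not a real face-detection library, and the detectors are stubs standing in for the old and new algorithms:

```python
from typing import Callable, List, Tuple

# A detected face region: (x, y, width, height).
Box = Tuple[int, int, int, int]
Detector = Callable[[bytes], List[Box]]

def detect_faces_v1(photo: bytes) -> List[Box]:
    """Stub standing in for the original algorithm."""
    return []

def detect_faces_v2(photo: bytes) -> List[Box]:
    """Stub standing in for the improved algorithm -- same signature."""
    return []

def count_faces(photo: bytes, detect: Detector) -> int:
    """Product code depends only on the interface, so shipping
    detect_faces_v2 in place of detect_faces_v1 needs no other changes."""
    return len(detect(photo))
```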

Small and subtle algorithmic improvements can translate into large revenue increases when there are enough users. Such improvements are easy to deploy since the interface remains constant while the internal implementation changes. Consider Google search or targeted advertisement results. If some smart hacker can improve the quality of search results or advertisements in order to draw more clicks, then those improvements can be deployed immediately. The millions of Google users still keep the same dead-simple interface when interacting with the search engine (i.e., typing text into a box and pressing Enter), but now their results are just a tad better. Or take the machine learning algorithms that generate Netflix movie recommendations: If you liked Die Hard, then you might also enjoy Rambo, Total Recall, and Predator. Netflix relies on these recommendations to keep their customers coming back to rent more movies and to keep their subscriptions active. They take recommendation quality so seriously that they are offering a 1 million dollar prize to anybody who can improve the quality of their results by 10% (see Netflix Prize for more details).

Google web search

Google web search is the poster child of the simple user interface: A single unpretentious text box. Type some words into the box, hit Enter, and within a fraction of a second, the first page of search results will likely contain the information you were seeking.

Those of us who came of age during the dawn of the Web (early- to mid-1990's) might remember the bad ole' days of search engines sucking ass. Before Google came along, the results from search engines were quite dismal, especially if you simply typed natural language phrases into the search box. Natural-looking queries like toyota camry SE prices turned up tons of spam and shady web sites that had 'gamed the system' using unethical 'Search Engine Optimization' techniques. The more tech-savvy kids learned to use quotation marks and boolean operators to customize their searches, typing in awkward monstrosities like Toyota AND "Camry SE" NEAR prices NOT advertisement in the hopes of getting more relevant hits, often to no avail.

Google web search kept the same simple text box interface as previous search engines but drastically improved the quality (i.e., relevance) of the search results, and made it much more difficult for shady web sites to 'game the system' and artificially boost their rankings. Same interface; just works better!

ssh

telnet is a protocol for remotely logging into computers; unfortunately, it's completely insecure, sending your password and data through the Internet in the clear, so anyone who eavesdrops on your network packets can steal your private information. ssh has the exact same interface as telnet, except that it encrypts your connection to prevent eavesdroppers from stealing your passwords and data.

The only noticeable difference in interface between telnet and ssh is that when you connect to a particular computer for the first time, ssh will prompt you to accept the connection (and generate a pair of cryptographic keys behind the scenes). Due to the enormous security benefits of ssh and the negligible amount of extra inconvenience, I can't fathom a reason why anyone nowadays would still prefer telnet over ssh.

rsync

rsync is a UNIX utility that copies files in an efficient manner, by only transmitting the differences between two versions of the same file. rsync has roughly the same interface as the cp and scp utilities, which copy files within one machine and across two remote machines, respectively. However, for many purposes (most notably incremental backups), rsync performs the copies faster than cp or scp. Same interface, strictly better performance!

For example, say that you have two versions of a large file F on two machines A and B located across the Internet; let's call these two versions FA and FB, respectively. If you want to update FB to match FA (e.g., for backup purposes), an ordinary invocation of scp would copy the entire contents of FA across the Internet and completely clobber FB, which might take a long time. However, if FA and FB are mostly identical except for some small chunks, then rsync will only transmit the differences (called deltas) across the Internet, which can be much faster.

Using rsync might not pay off for copying small files, especially within one machine, but as the amount and sizes of files increase (especially when transferring over a slow Internet connection), so do the advantages of rsync.
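The delta idea can be sketched in a few lines: split each file into chunks, hash the chunks, and transmit only those whose hashes differ. (Real rsync uses a cleverer rolling checksum that also handles insertions and deletions; this toy version only handles in-place changes and assumes the new file is not shorter than the old one.)

```python
import hashlib

CHUNK = 4  # tiny chunk size so the example is easy to follow

def chunks(data, size=CHUNK):
    return [data[i:i + size] for i in range(0, len(data), size)]

def digest(block):
    return hashlib.sha256(block).hexdigest()

def delta(old, new):
    """Return {chunk_index: chunk_bytes} for chunks of `new` that differ
    from the corresponding chunk of `old` -- the only data 'sent'."""
    old_hashes = [digest(c) for c in chunks(old)]
    out = {}
    for i, c in enumerate(chunks(new)):
        if i >= len(old_hashes) or digest(c) != old_hashes[i]:
            out[i] = c
    return out

def patch(old, d):
    """Reconstruct the new file from the old file plus the delta.
    (Toy simplification: assumes new is not shorter than old.)"""
    result = chunks(old)
    for i, c in d.items():
        if i < len(result):
            result[i] = c
        else:
            result.append(c)
    return b"".join(result)

old = b"the quick brown fox jumps"
new = b"the quick BROWN fox jumps"
d = delta(old, new)
assert patch(old, d) == new           # FB reconstructed from FA + deltas
assert len(d) < len(chunks(new))      # only the changed chunks were sent
```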

Different interface, works better

Computer word processing software

Almost everyone today would agree that writing documents using computer word processing software (e.g., Microsoft Word) is vastly superior to typing on a typewriter. In fact, kids who are coming of age nowadays probably haven't ever used a typewriter.

Creating, saving, opening, and printing Microsoft Word files might seem like the most natural thing in the world for people who haven't used typewriters before, but for those accustomed to using typewriters, computer word processors represent a profound change in interface: Rather than typing words mechanically on a physical piece of paper, one needs to boot up a computer, open up word processing software, create and edit documents, and then print them out. Although the QWERTY computer keyboard mimics the typewriter keyboard fairly well, the concepts of digital files, saving, loading, and (gasp!) undo-ing are completely different in the physical and digital worlds.

However, word processing software has been immensely successful in spite of the required changes in interface, because the added expressive power and convenience of saving files, making edits and undo-ing changes, and sharing documents vastly outweigh the associated learning curve.

gmail

Most of my friends (technically-savvy young people) swear by gmail and can't imagine using any other email service or program. The user interface is simple, elegant, and efficient. However, try to convert an older person or someone who is not as technically-savvy from using their existing email client to gmail, and you might soon realize that it's not so easy. gmail doesn't look or act like traditional email clients; most notably, it displays emails by threads and organizes using labels rather than folders. For people used to traditional email clients (e.g., Microsoft Outlook), the interface to gmail is noticeably different than those of other email programs—different enough to deter some from switching.

Google took a risk by making gmail look and act different from Yahoo Mail, Hotmail, and their other webmail competitors, and they seem to have been fairly successful so far. To innovate significantly often requires breaking away from existing interfaces. Fortunately, the benefits of gmail (e.g., nearly unlimited storage space, convenient web accessibility, powerful search) seem to outweigh the burdens associated with an interface change.

Same interface, works worse

OpenOffice

OpenOffice is a free, open-source, and cross-platform clone of the Microsoft Office application suite. The noble goal of the OpenOffice developers is to provide a free and open alternative to the expensive (and company-controlled) Microsoft Office. Thus, OpenOffice applications such as the Word, Excel, and PowerPoint clones aim to mimic the same basic interface as their Microsoft counterparts.

However, as many people who have used OpenOffice can attest, it works worse than Microsoft Office. The Word clone creates crappier-looking documents, the PowerPoint clone creates crappier-looking presentations, and the files aren't exactly compatible with their Microsoft counterparts, so sharing files between OpenOffice and Microsoft Office applications is often a less-than-optimal experience.

I'm not claiming that the Microsoft Office way of doing things is more 'correct' or 'beautiful' than the OpenOffice way, but the undeniable truth is that far more people are already accustomed to Microsoft Office, so for any clone that tries to mimic its interface to be successful, it had better provide an improved experience without any major detriments. Unfortunately, OpenOffice currently doesn't seem up to par; it seems like Microsoft isn't going to lose their monopoly on office productivity software anytime soon.

Different interface, works worse

Desktop Linux for non-geeks

Linux has come a long, long way in its journey towards being a viable competitor to personal desktop operating systems such as Windows and Mac OS X. Some distributions, most notably Ubuntu, are extremely easy to install and provide a fairly pleasant user experience. Unfortunately, for non-geeks trying to use Linux on desktop computers, the interface is sufficiently different, and it also works worse.

The desktop Linux user interface is sufficiently different from that of Windows and Mac, so there is still somewhat of a learning curve. For example, users install software via a package manager rather than by double-clicking on a downloaded installer, and must type in administrator passwords before modifying system properties (a GUI layer atop sudo).

Most detrimentally, the software that users are accustomed to running on Windows and Mac just isn't available in Linux; instead, they must use open-source clones, which often both differ from and work worse than the proprietary originals. For instance, when users find out that they can't easily run Microsoft Office in Linux, that's often a dealbreaker. (OpenOffice just won't cut it.) It's no wonder that Linux has not taken off on the desktop as some open-source (ponytailed and often heavily-bearded) evangelists had originally hoped. (The good news, though, is that Linux has increasingly dominated the server and scientific/research computing niches.)


Smaller-scale, incremental advances in technology often involve products that maintain the same interface but work better than their predecessors. Big leaps in innovation often involve technologies with different interfaces that work far better than their predecessors.

Finally, when you aim to innovate, make sure first and foremost that you don't end up with something that works worse than its predecessors, regardless of whether you've changed its interface or not. People might be willing to learn your new technology if its benefits are compelling enough, but nobody will be willing to spend the effort to switch to a technology that works worse for them!


Created: 2008-11-28
Last modified: 2008-11-28