
Be Careful of Cognitive Friction in Program Design


My favorite session at the Las Vegas Dev Connections conference last April was the talk given by Markus Egger on the topic of Cognitive Friction. I’m embarrassed to say that after 15 years in the business I’d never even heard of it, though the idea is clear enough. The term Cognitive Friction was coined by Alan Cooper in a book he wrote back in 1999 called The Inmates Are Running The Asylum. I’m currently reading the 2004 edition on my Kindle DX and loving it. Much has been written on the topic and book reviews abound, but I think there’s room for one more developer’s 2 cents.


The Definition

I could cut-and-paste Alan Cooper’s definition, which would probably be more accurate, but I’d rather give my own. Cognitive Friction is what you call it when your brain hurts from trying to figure out how to correctly use a computer program.

One of the odd things about this concept is that it’s peculiar to computer software and hardware. Cognitive Friction was hard to find 100 years ago. A hand-operated egg-beater isn’t hard to figure out. For that matter, the Model T Ford was pretty simple. ATMs showed up 30 years ago, and as Alan points out, cognition started getting sticky.

As often as I run into bad user interfaces I complain about them to my workmates and sometimes to the manufacturers. My workmates say I should start an entertaining blog. Software companies invariably tell me to read the manual. I don’t want to read the manual and I don’t want to write software that requires others to read the manual. It shouldn’t be that difficult. Yes, there is a generic learning curve to operating a computer, like understanding the difference between a right-click and a left-click, but beyond that, programs should be intuitive.

For example, I recently spent 15 minutes in the checkout line of a Wal-Mart store behind a customer who wanted to redeem a gift card. The actions taken by the cashier caused the system to freeze. It literally required a reboot by a manager, who irritatingly chastised the cashier for not punching a particular key on the keyboard before swiping the gift card. This action is apparently not necessary when using debit or credit cards but will bring down the entire system if neglected when redeeming a gift card.

How stupid was the cashier for not knowing or remembering that? I say NOT AT ALL! The blame rests squarely on the computer programmers who designed the convoluted system. THAT is cognitive friction and I’m afraid we’re surrounded by it.

Another thing that recently fooled me at a cash register was a system for collecting my signature after I had swiped my credit card. At the top of the LCD screen was a pair of “soft” buttons labeled Approve and Cancel. Down below was the typical signature line. When the screen appeared, I read from top to bottom and instinctively pressed the top-most button that applied, the Approve button. The system then threw an error because I had neglected to provide my signature down below, and we had to reset the machine and start over.

Wouldn’t it make more sense to put the signature line at the top and the action buttons that depend upon a signature below? I thought so, and so did someone else. Another store sported a similar machine where the buttons don’t even become visible until the customer begins scribbling his/her name. At least some programmers take cognitive friction into consideration when designing interfaces. Honestly, we all should.

Reduce Your Coefficient

As an author for DBJ I’m required to use an online system for uploading articles but I’ve been cautioned against using it as an example of cognitive friction. On the bright side, the authors of this program are responsive to criticism and are doing what all good programmers must do, reducing the coefficient of cognitive friction in their applications.

Alan Cooper admits that programmers (the inmates) are possibly the worst candidates for designing application user interfaces. When he says, “the inmates are running the asylum” he means that design authority is, in a de facto way, conferred upon the wrong group of people. The right group includes “business-savvy technologists or the technology-savvy business-persons”.

Programmers often stumble on something they think is “cool” and figure it’s good to include in their applications. For example, our friends at Oracle, who generously provide the Java engine, have a unique approach to rolling out new updates. There is a setting somewhere that will automatically look for updates and inform you with a balloon message in the taskbar, but I generally turn those things off. Alternatively, you can access the Java applet through the control panel and push the button that reads Update Now. This action either tells you that your version is the latest or it opens a window that allows you to install the new one. However, in addition to the obvious “Install Me Now” pop-up window, a balloon message appears, informing you that there happens to be a new update for the Java engine … just in case you didn’t know.

One might argue that this is an annoyance but in my opinion, that’s being generous. It’s the same as when I perform updates using the Microsoft Windows Update center and an icon in the taskbar informs me that there are updates. It makes my brain hurt for a minute. I ask myself questions like “Should I abandon the update that I expressly initiated and perform the update that just coincidentally volunteered itself? Or is this volunteer update suggestion a reflection of what I’m currently executing? Will my system go Kaddy-Whompus if I don’t respond to this volunteer update request? Is it needed, nay even required, before I follow through with the updates I’ve specifically requested and launched? Should I terminate that process and follow ‘the balloon’?”

You probably think I’m nuts for thinking all of this and my therapist might tell me to take another Klonopin, but this is what runs through my head when programmers throw extraneous information at me. Granted, one’s brain can process millions of pieces of data in a period of time so small I don’t even know what it’s called, but that processing causes “friction”, and unnecessary friction at that.

Follow the Leader

Yeah, we all want to be different, but remember: you’re special… just like everyone else. When it comes to writing software that doesn’t hurt people’s heads, you have to follow the paradigm users are accustomed to. One program I recently had to use decided that, rather than having the user select an option from a drop-down box and then press a button, it would combine the two actions. Selecting an option executed an action. Imagine a program with a drop-down box containing the following list:

Make A Selection
—————————————-
Apply a 10% raise to my paycheck
—————————————-
Deliver pizza to the lunch room
—————————————-
Turn off grandma’s life support
—————————————-

Now imagine that at the instant you click on one of the options, it magically executes. (Quite literally in the third case.) Most users would not be prepared for that behavior. It would be akin to the Buy Now button on web pages actually executing a purchase without giving you a chance to review the price, shipping method, or payment options. Users would scream and the UI would quickly be changed.
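The contrast can be sketched in a few lines of code. This is a minimal illustration with hypothetical class and function names, not any particular UI toolkit: the first design fires the action the instant an option is selected, while the second merely stages the selection until the user presses a separate confirmation button.

```python
executed = []  # records which actions actually ran

def run_action(action):
    executed.append(action)

class InstantDropdown:
    """Anti-pattern: selecting an option executes it immediately."""
    def select(self, action):
        run_action(action)  # no chance to review or back out

class ConfirmedDropdown:
    """Safer pattern: selection only stages the action; a distinct
    confirm step is required before anything happens."""
    def __init__(self):
        self.pending = None
    def select(self, action):
        self.pending = action  # nothing executes yet
    def confirm(self):
        if self.pending is not None:
            run_action(self.pending)
            self.pending = None

risky = InstantDropdown()
risky.select("Turn off grandma's life support")  # runs instantly!

safe = ConfirmedDropdown()
safe.select("Deliver pizza to the lunch room")   # merely staged
# the user can still change their mind here; only confirm() commits
safe.confirm()
```

The extra `confirm()` step costs the user one click and saves them from exactly the surprise described above.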

Sure, you’re a cowboy coder and you want to do things your own way, but what I’m telling you, and I think Alan Cooper will back me up on this, is that you can’t … at least not unless it is very, very intuitive. To test that assertion, try standing behind a user who runs your app for the first time. Watch what they do, how they move through the page, where they click and mis-click. Set up a web cam to watch them for an entire day, and you’ll see exactly where your program is causing cognitive friction.

Cognitive Friction Is Not Just For End Users

Just a final note about friction for developers when it comes to writing code and designing databases: don’t make it hard for those who follow you and those who use your interfaces. At least one course on database design suggested naming every table’s primary-key field simply ID. I’m sure this made sense to somebody, but they probably never actually wrote any SQL by hand. If it’s a customer table, then call it CustomerID. For the sales person table it’s SalesPersonID. Why? Because this SQL statement …

SELECT c.*, s.* FROM tblCustomer c INNER JOIN tblSalesPerson s ON s.SalesPersonID = c.SalesPersonID

is a lot easier to write without making a mistake than this one:

SELECT c.*, s.* FROM tblCustomer c INNER JOIN tblSalesPerson s ON s.ID = c.SalesPersonID

The same applies to naming everything, from classes to variables to files. Names should be descriptive, not deceptive. Don’t make me think too hard when selecting an object. The friction heats up my brain and fogs up my glasses. I don’t like it and neither do other programmers, no more than users appreciate having to bend their brains to accomplish the simplest things.
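To see the naming advice in action, here is a runnable sketch using SQLite and a hypothetical two-table schema. Because each key is named after its table, the join condition names both sides unambiguously, and a copy-paste slip like `s.ID = c.ID` would stand out immediately.

```python
import sqlite3

# In-memory database with descriptively named key columns
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tblSalesPerson (
        SalesPersonID INTEGER PRIMARY KEY,
        Name TEXT
    );
    CREATE TABLE tblCustomer (
        CustomerID INTEGER PRIMARY KEY,
        SalesPersonID INTEGER REFERENCES tblSalesPerson(SalesPersonID),
        Name TEXT
    );
    INSERT INTO tblSalesPerson VALUES (1, 'Pat');
    INSERT INTO tblCustomer VALUES (10, 1, 'Acme');
""")

# The join reads naturally: SalesPersonID matches SalesPersonID
rows = conn.execute("""
    SELECT c.Name, s.Name
    FROM tblCustomer c
    INNER JOIN tblSalesPerson s ON s.SalesPersonID = c.SalesPersonID
""").fetchall()
print(rows)  # [('Acme', 'Pat')]
```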

Conclusion

I hope you’re not disappointed that I didn’t include more examples of really poorly designed user interfaces. I wish I had space for them, because there’s no dearth of examples, and I might well have shown you something you’d never considered to be cognitive friction. But the fact is that neither I nor anyone else can possibly hope to document all the possible permutations of bone-headed design practices. The best we can hope for is to keep the concept of cognitive friction in our thoughts as we write programs.
