Don’t make me (not) think. Responsible design

Steve Krug’s famous call for clarity is sensible and easy to understand. His book Don’t make me think deserves its place as a design classic. As a primer it is useful for understanding how other people think, touching on how cognitive barriers can prevent users from achieving their goals. Don’t make me think suggests a path towards reducing friction, but adopting it blindly as a rule is dangerous.

A lens on the past.

This book came out in 2000. Back then, computers were unfamiliar, alien objects with impenetrable interfaces. The internet was niche, and often little regard was given to the actual experience. The Space Jam website from 1996 is a beautiful artifact that shows how things were at the time. Don’t make me think was a much-needed clarion call to make digital things more usable.

How we think.

Humans are surprising and diverse in thought. Technically savvy people often think in a way that is far removed from the rest of society’s expectations. To massively generalise, engineers understand processes and logic, and think in systems. Regular humans have systems too; they just don’t follow the implementation models of software. Design’s role is to make systems behave the way the people using them expect them to work.

The conceptual model.

This isn’t the same as how things actually work. It’s a subtle and important distinction. A good real-world example can be found in your kitchen. If you have ever set your oven to a higher temperature to get it hot faster, you have (probably) fallen into a trap. Most ovens, like most thermostats, heat at the same rate no matter what temperature you set. It’s a reasonable misunderstanding. The conceptual model you hold, how you think something works, doesn’t line up with the reality of how it actually works.

There are times when your understanding of how something works is right in one situation, but wrong in a virtually identical one.

Take music as an example. Playing a song from your phone connected to a speaker over Bluetooth is different from playing it over wifi. Bluetooth acts like an invisible wire that tethers the phone to the speaker. Playing a song over wifi cuts this tether and effectively reduces the phone to a remote control.

When is it okay to trick and deceive people?

These deceptions exist in lots of software. To give the impression of speed, Apple show a screenshot of the user’s desktop when they power on their computers. Likewise, Twitter “works” when you are offline.

[Image: a magician turning the words “interaction designer” into “distraction engineer”. Interaction designer is an anagram of distraction engineer. Illusionist image by sobinsergey.]

Don’t believe me? Try it yourself. Go into airplane mode and like some tweets. It will work without an internet connection, a much nicer way of handling things than popping up an error. The likes are only actually applied when you go back online, but that doesn’t matter. Twitter assumes success, and handles failure.
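To make that concrete, here is a minimal sketch of the pattern in TypeScript for the browser. The sendLike function, the outbox queue, and the locally liked set are hypothetical stand-ins, not Twitter’s actual code; the point is simply that the interface updates immediately, and failed requests are quietly queued and retried.

```typescript
// A minimal sketch of "assume success, handle failure" (optimistic UI).
// sendLike and the tweet IDs are hypothetical stand-ins, not a real API.

type PendingLike = { tweetId: string };

const likedLocally = new Set<string>();   // optimistic UI state, updated instantly
const outbox: PendingLike[] = [];         // actions queued while offline

// Pretend network call; in reality this would POST to the server.
async function sendLike(tweetId: string): Promise<void> {
  if (!navigator.onLine) throw new Error("offline");
  // ...send the request here...
}

async function likeTweet(tweetId: string): Promise<void> {
  likedLocally.add(tweetId);              // assume success: show the like immediately
  try {
    await sendLike(tweetId);
  } catch {
    outbox.push({ tweetId });             // handle failure: queue it for later
  }
}

// When the connection returns, quietly replay anything still queued.
window.addEventListener("online", async () => {
  while (outbox.length > 0) {
    const pending = outbox.shift()!;
    try {
      await sendLike(pending.tweetId);
    } catch {
      outbox.unshift(pending);            // still failing; try again next time
      break;
    }
  }
});
```

Assuming success and deferring the real work is what makes the interface feel instant; the cost is that the design has to decide what happens when the deferred work eventually does fail.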

Feeling fast. Speed as deception.

There is a reason why you walk so far to collect your bags when you get off an airplane. This walk is designed to occupy you while they unload the plane. People hate waiting. Perceived efficiency is often as good as, if not better than, actual efficiency.

Make it easy to use.

This seems a noble goal. That said, there are times when cognitive ease is bad and friction is good. Sometimes you want to engage what Daniel Kahneman, building on his work with Amos Tversky, called System 2. In essence, thought is divided into two systems: System 1 is automatic and quick; System 2 is slow and deliberate.

Forming design.

Seemingly small design decisions, like the positioning of labels relative to input fields, can slow people down or speed them up when they are entering information on forms. Using a single checkbox for a decision puts the burden on the person to understand what the impact is, and this is often utilised as a dark pattern to manipulate and trick. A simple, cheap solution exists in the form of radio buttons, which spell out the decision without requiring cognitive leaps, as sketched below.
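Here is a small, purely illustrative TypeScript sketch of that difference; the field names and wording are made up. A checkbox collapses the decision into a boolean whose meaning lives entirely in the label, while radio buttons map to an explicit choice where “no answer yet” is its own state.

```typescript
// Checkbox-style: one boolean, with the real meaning buried in the label.
// A pre-ticked box silently decides for the person.
interface CheckboxForm {
  optOutOfMarketing: boolean; // does "false" mean opted in? The reader has to work it out.
}

// Radio-style: every outcome is spelled out, and "no answer yet" is explicit,
// so nothing is decided until the person decides it.
type MarketingChoice =
  | "send me marketing emails"
  | "do not send me marketing emails";

interface RadioForm {
  marketing: MarketingChoice | null; // null until an option is actively chosen
}

// The form cannot be submitted until an explicit choice has been made.
function canSubmit(form: RadioForm): boolean {
  return form.marketing !== null;
}
```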

As an aside, if you want to learn more, read the book that Michael Lewis wrote, The Undoing Project, instead of Kahneman’s. It’s far more fun and way less academic than Thinking, Fast and Slow. Personally, I’ll take a love story over a science journal any day.

Rules of engagement.

The Privacy by Design guidelines by Ann Cavoukian are an interesting and often-cited source worth mentioning. There is criticism that the guidelines are vague, favour corporations, and are difficult to adopt and enforce. This is probably valid, but a little harsh. Personally, I see the guidelines as a decent framework for discussing values. The European General Data Protection Regulation (EU GDPR) incorporates these principles.

Privacy by Design.

  1. Proactive not reactive; preventive not remedial
  2. Privacy as the default setting
  3. Privacy embedded into design
  4. Full functionality – positive-sum, not zero-sum
  5. End-to-end security – full lifecycle protection
  6. Visibility and transparency – keep it open
  7. Respect for user privacy – keep it user-centric

Brave new world.

As a tool, these guidelines can help ensure the impact on people is considered upfront; at the very least, they facilitate the conversation. As Neil Postman predicted, we are all amusing ourselves to death. We’ve given up our privacy and let companies look deep into our lives, more than we probably should.

It is absurd to say we understand what we are giving up. Companies claim we are willing participants, but that is clearly a deception. Honestly though, do you really trust Jeff Bezos?

Your rules and my rules are likely different. We are all moral creatures. As such, you are welcome to disagree with me, and I often contradict myself or otherwise fail to live up to the standards I set. But it is worth considering your impact on society.

Price of success.

I am in a very privileged position in that I can determine the kind of place I work and the type of work I do. I have worked on things in the past that I wasn’t proud of. Honestly, when I was starting out I didn’t give enough thought to the actual impact of the work I was doing.

Personally, I don’t really want to work on addictive software that nudges people towards compulsion, depression, and isolation. I mean, I wasn’t designing bombs, but I did work on some questionable products that treated people as a means to an end. It’s all very well to moralise about this now that I can afford to, but as I have gotten older I have become more interested in this stuff.

Creative destruction is the term for the things that get destroyed by progress; it is seen as the cost of doing business. Amazon killed bookshops and then, ironically, brought them back in 2018. When designing products, we rarely consider the bad side of what could happen if we are successful.

Sometimes make me think.

Cennydd Bowles has written an incredible book called Future Ethics. Watch Cennydd speak over on ethical.net. He makes a more compelling case than I can for when it is important to make people think. The book is essential reading for anyone interested in ethics. It is not preachy in the slightest, and it cleverly frames ethics as a constraint. Designers love and understand constraints.

In the book he challenges the idea of data transparency, making the case instead for making data more material and visible. If we could see information flow, we would be better informed about the exchanges we are making with our data. He also keenly articulates the need to widen the net when considering the stakeholders in our designs. Airbnb serves as an example of something that is well designed for two specific groups (the host and the guest) but bad for the local community. Dublin has felt the downside of Airbnb immensely.
