The problem with behaviour design

The term ‘persuasive technology’ covers a broad set of technologies and techniques combining computer science with behavioural psychology. It’s particularly topical today because of the recent (unsurprising) revelations about Facebook and what it’s been doing with user data. But it’s something I’ve been thinking about for a while, because it’s interesting and has a lot of practical consequences for life in the 21st century.

It’s no surprise that interactive digital technology can be designed to change people’s behaviour. After all, lots of things can change our behaviour, from an attention-grabbing ad campaign to a road design that makes drivers respond differently. But we sometimes forget that design has ethical consequences: a charismatic brand can mask environmental destruction; good road design can save lives.

‘Persuasive technology’ is in the same vein, but in the digital context it represents something new. This is because it’s possible to create digital interfaces that are ‘mass-customised’ to individual users, learning from their past behaviour to target them better. This isn't persuasive in the traditional, rhetorical sense, because it’s not about using language, logic or delivery to change someone’s mind: it’s about bypassing a user’s rational thought process to change their behaviour.

Nudge nudge

From the late 1960s, the psychologists Daniel Kahneman and Amos Tversky began exploring the ‘predictably irrational’ ways that we humans behave. They described two cognitive systems – System 1 and System 2 – working in parallel: System 1 is fast, involuntary and intuitive, whereas System 2 is slow, deliberate and rational. We need a balance of both in order to survive, and the interplay between them is part of what makes us human (one of the challenges I faced in moving into creative work was learning to balance the two systems, move between them, and muzzle one or the other at different times). Through this framework, and by exploring other facets of perception and decision-making, they explained how well-known cognitive biases such as the anchoring effect and loss aversion can arise. Walter Mischel worked with a similar framework, using the terms ‘hot’ and ‘cold’ decision-making to explore willpower, starting with the famous marshmallow test.

Then, in the 80s and 90s, Richard Thaler built on these ideas, and – together with Cass Sunstein – turned them into Nudge theory. Nudging is about using cognitive insights to help people make ‘better’ decisions. You don’t have to be gullible for a nudge to work – you just have to be human. Nudges have been shown to be effective at helping people, especially in situations where there is a mismatch between long- and short-term interests, such as saving for a pension or trying to lose weight.

Speeding nudge: the emoji on these signs work on our desire for social approval.

Nudges can also be used to exploit people against their own interests – as in the so-called ‘Dark Patterns’ of digital user experience design. Again, this isn’t a question of having poor judgment or being credulous: it’s something that largely bypasses reason. We are all susceptible because our brains rely on heuristics – shortcuts that simplify decisions – and for the most part we don’t even realise we’re using them. This means that if someone wants to exploit us by designing their product or website to bias users towards certain outcomes, they can. Moreover, if a company can analyse data on your past behaviour, it may be able to predict which kinds of nudge you’re likely to respond to, and adapt the site instantly. We, the users, usually have no idea this is happening. (For more on psychological profiling, see Apply Magic Sauce by Cambridge University.)
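To make that mechanism concrete, here is a minimal, hypothetical sketch – not any real platform’s code – of how a site could ‘learn’ which nudge a particular user responds to. It uses a simple epsilon-greedy choice over invented nudge variants and made-up conversion counts:

```python
import random

# Hypothetical nudge variants a site might test on a returning user.
NUDGES = ["scarcity_banner", "social_proof_popup", "default_opt_in"]

# Invented example history: per-variant (times shown, times this user 'converted').
history = {
    "scarcity_banner": (40, 2),
    "social_proof_popup": (25, 6),
    "default_opt_in": (10, 1),
}

def pick_nudge(history, epsilon=0.1):
    """Epsilon-greedy choice: mostly show the variant with the best observed
    conversion rate for this user, occasionally explore another at random."""
    if random.random() < epsilon:
        return random.choice(NUDGES)
    return max(NUDGES, key=lambda n: history[n][1] / max(history[n][0], 1))

print(pick_nudge(history))  # usually 'social_proof_popup' for this invented history
```

The user never sees any of this; the page simply looks slightly different to them than it does to someone else.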

Exploiting internal tensions

BJ Fogg, the Stanford academic, popularised the phrase ‘persuasive technology’ in the 1990s. His simple ‘Fogg Behaviour Model’ sets out three conditions that must all be met for a user’s behaviour to be shunted towards a particular path:

  1. The user wants to do the thing,

  2. They have the ability to do it, and

  3. They have been prompted to do it (‘triggered’).

The design and timing of the trigger is especially important, since it works best if it speaks to the instinctive/involuntary/‘hot’ system.

If, for example, someone wants to eat more healthily (step 1), but doesn’t because it’s too much effort, they could make healthy food more convenient, for example by stocking up on vegetables (step 2), and litter their kitchen with ‘triggers’ to inspire healthy eating just at the moments when they’re thinking about food – perhaps some pictures of healthy food or fit people stuck to the fridge (step 3).
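The model is really a conjunction of the three conditions at a single moment in time. Here is a minimal sketch of that logic – the numeric scales and threshold below are my own illustration, not part of Fogg’s model:

```python
from dataclasses import dataclass

@dataclass
class Moment:
    motivation: float  # how much the user wants to act right now (0-1)
    ability: float     # how easy the action is right now (0-1)
    prompted: bool     # did a trigger fire at this moment?

def behaviour_likely(m: Moment, threshold: float = 0.5) -> bool:
    """Behaviour happens when motivation and ability together clear a
    threshold AND a prompt arrives at that same moment."""
    return m.prompted and (m.motivation * m.ability) > threshold

# The healthy-eating example: wants to eat well, vegetables are already in
# the fridge, and the photo on the fridge door acts as the trigger.
print(behaviour_likely(Moment(motivation=0.9, ability=0.8, prompted=True)))   # True
print(behaviour_likely(Moment(motivation=0.9, ability=0.8, prompted=False)))  # False - no trigger
```

The point is not the arithmetic but the conjunction: remove any one of the three conditions and the behaviour tends not to happen.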

The simplicity of the Fogg model is part of its power. The ethical problems come into focus when we unpack it a bit, and ask who is changing whose behaviour, and why.

Looking at step 1, there are a lot of other things that we might ‘want’ to do on some basic level, but which we know we’d be better off if we didn’t. For example: if I have a pub lunch on a weekday, it might be nice to have a beer but I know it’ll make me sluggish in the afternoon, so I don’t. But a combination of factors might convince me otherwise – from peer pressure to nice weather. In this case, the ‘hot triggers’ will have got me to change my behaviour. This example is innocuous but lots of things fall into this category: drinking, gambling, overeating – and addiction to social media.

Credit: Centre for Humane Technology (http://humanetech.com/app-ratings/)

Behaviour-change models have provided us with a set of tools, nothing more. But an internet funded by advertising seems to lead inexorably to business models based on behavioural exploitation. The consequences of this include addiction, social anxiety, poor mental health, and more. Grindr makes its users miserable (but they still use it). Is this just the price we must pay for a functioning internet? Zuckerberg’s comment that "there will always be a free version of Facebook" suggests he believes data exploitation is here to stay - at least for most people.

How could we design business models for digital tech that are aligned with individual and collective wellbeing?

These questions are likely to become more pressing as digital interactions are built into the physical fabric of our world – so-called ‘ubiquitous computing’ and ‘ambient intelligence’. The Facebooks and Googles of the world will be keen to track our behaviour throughout, to improve their targeting algorithms and throw more effective ‘hot triggers’ at us, wherever our attention happens to be directed at the time. (But tellingly, Facebook execs ‘don't get high on their own supply’.)

What next?

In a world of ‘smart’ nudges, do free will and personal responsibility still mean the same things? Probably not. 

As BJ Fogg himself says: "you should design your life to minimise your reliance on willpower." This includes everything from turning off phone notifications, to choosing products and services that help you achieve what you want with a minimum of exploitation.

My position on nudging and persuasive technology is ambivalent - I see it as a designable layer of experience, another material for designers to work with. And just as with materials selection, we need to be aware of our ethical responsibilities. The philosopher of technology Peter-Paul Verbeek suggests that, as users, we should educate ourselves about how we are influenced; but in practice the time and headspace this would require is a luxury that many people can’t afford. I have less of a problem with the tools themselves than with the intentions behind them.

It’s interesting to consider how organisations and projects could move in a more humane direction, and what a humane internet might look like. The Mozilla Foundation, the EFF, the Centre for Humane Technology, and the Customer Commons initiative are all pushing for positive change in different ways. My own 2017 project, Treasure, took an experimental approach to creating practical interventions that help users identify and serve their longer-term financial interests, and thereby resist short-term exploitation.

Digital space has become an exploitative 'wild west', like a city without zoning laws, and I agree with Tristan Harris of the CHT that more should be done to protect us from being manipulated unfairly. This might include regulation, for example with enforced transparency - but this space needs a culture change more than check-boxes. A ‘humane’ approach to digital technology requires a transformational rethink of business models, encompassing branding and product design. This is an exciting creative challenge, and a huge opportunity.

The craft of materials

Photo: Wikimedia Commons, Joost Ijmuiden

On a sailing boat, you feel exposed, vulnerable even, but also closely connected to the forces of nature. You feel the sea and the wind through the tension and vibration of the ropes, the pressure on the tiller and the movements of the deck. The boat acts as a vector for this connection, channelling streams of tactile information to you, and enabling you to act through it and negotiate your way across the water.

Ashby chart for materials selection (Credit: Granta Design)

In the past, I’ve discussed how designed objects represent a kind of ‘interface’ between ourselves and the world. This interface, and the way it is designed, can determine our sense of connectedness with things outside ourselves. It can make environmental sensations feel close and immediate, or push them into irrelevance.

I've now completed two university degrees - the first in materials science and the second focused on design. In my undergraduate degree, materials selection was touched on only briefly, and the general impression was that it can – indeed ‘should’ – be reduced to a mathematical process. Having characterised all the available materials and quantified their various properties, you can simply plot a chart, or feed the information into an algorithm, and find the material with the right balance of mechanical, electrical, thermal (and other) properties, and cost.
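As a toy illustration of that chart-and-algorithm view (using rough, invented property values, not figures from any real materials database), selection by a classic Ashby-style performance index might look like this:

```python
# Rough, illustrative property values only.
materials = {
    #              E (GPa)  density (kg/m^3)
    "spruce":      (10,     450),
    "aluminium":   (70,     2700),
    "steel":       (210,    7850),
    "GFRP":        (25,     1900),
}

def panel_index(E, rho):
    """Classic Ashby index for a light, stiff panel: maximise E^(1/3) / density."""
    return (E ** (1 / 3)) / rho

ranked = sorted(materials, key=lambda m: panel_index(*materials[m]), reverse=True)
print(ranked)  # ['spruce', 'GFRP', 'aluminium', 'steel'] - wood wins on this index
```

Real selection layers cost, manufacturing and environmental constraints on top of this – which is exactly where the craft begins.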

One thing rarely mentioned in the science and engineering context is that our subjective experience of materials is a fundamental part of human experience. A wooden ship is profoundly different to sail from a modern fibreglass yacht with an aluminium mast, while a large steel cruise ship is designed to eliminate the sensations of the sea altogether for its passengers.

Different styles of ocean transport evolved from material capabilities - and result in different experiences. (Photo credit: Clipper Round the World Race; Cunard)

As materials scientists keep creating new materials, designers have an ever-widening palette from which to craft experiences. Of course, each new material came into use for reasons that may not have included the experience: perhaps it let people do things they couldn’t do before, or improved safety, or was cheaper. My point is simply that materials mediate experience - and we now have choices.

Having infiltrated the design world, it seems clear to me that the emphasis here is much more on these subjective properties of materials. I agree with this view, up to a point – aesthetics and subjective experience clearly matter. When you interact with a product, your impressions of how it looks and feels all get rolled into your overall ‘product experience’. The way you feel when you use a product can affect how easy you find it to use, whether you’re likely to use it again, and so on – and this is partly down to materials. When Apple wanted the original iPhone to be ‘seductive’, they weren’t just talking about the digital interface, but also the sleek mirror finish, the curved aluminium back, the expensive-seeming weight, and so on. The physicality and materiality of the object speaks the same language as Apple’s digital experience. (This sense of coherence is a large part of what makes Apple products so pleasant to use.)

The original iPhone (photo: Apple)

I think of materials as actors, performing different roles in an experience – whether that experience is a product or an environment. Like actors, materials can play a range of roles, but they still have a kind of underlying character or personality. Woods are usually characterised as warm, organic and pliable, and metals as cold, industrial and formal: these sensory associations derive from physical properties like mechanical elasticity, density, thermal conductivity and colour, and so are remarkably consistent across cultures. I think these associations still hold when the material in question is structural rather than aesthetic, as with the ships mentioned above (the only difference being that we sense the material’s character with apparatus other than our eyes).

There are exceptions to the general classifications – the formality of ebony, the warmth of copper – but if anything, these prove the point that each material has a distinctive character. And just like actors, materials can be typecast (yet another glass-and-steel skyscraper) and mimicked (concrete apartment blocks with a veneer of brick to look traditional).

I find the old modernist notion of ‘truth to material’ fascinating – although of course today every rule exists to be broken, every philosophy subverted. Materials are an essential part of all physical and digital objects, and materials selection is a vast topic. Stacks of books have been written about it; there are several large companies specialising in it. I’m not proposing a grand unified theory: quite the opposite, I think materials selection is a practice, a modern craft.

Decisions about physical form and materiality can’t be considered in isolation from each other. Materials awareness and intelligent materials selection are critical to the success of any design – its popularity, longevity and afterlife – and thus to humanity as a whole.

606 Universal Shelving System by Dieter Rams. Beautiful, understated, thoughtful synthesis of material and form.