How did you get involved in this work?
Lindsey (BPS): In sustainability we’re often trying to nudge people to change their behaviors – to recycle, bike, etc. – so I became interested in research about what motivates people to change their behavior. It’s hard to change, even when we want to (hello, failed New Year’s Resolutions).
While working with the Behavioral Insights Team, through the City's participation in What Works Cities, I've seen how behavioral science and evaluation can be used to improve government programs, policies, and communications.
Renata (PBOT): The Portland SmartTrips program has always been rooted in behavior change research – it’s designed to encourage people to try new ways of getting around that work best for them (including bike, transit, walking and more).
We’ve worked with Alta Planning + Design to incorporate Stages of Change theory into our individualized marketing program, and recently worked with Ideas42 through the American Cities Climate Challenge to audit our communications materials and develop a campaign to encourage more people to use neighborhood greenways.
What problems are you solving for?
Evaluation allows us to test what works and what doesn’t.
By using randomized controlled trials – the gold standard of research evaluation – we’re able to pilot multiple approaches at once and find out which works best, by how much, and for whom.
When we have clear data that something is working well, it gives us confidence to continue and expand it (and assures leadership it’s worth investing in). If something doesn’t work, while it’s hard to hear, it pushes us to try something different, rather than spend time and money on something that’s not effective.
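As an illustrative sketch (not the City's actual tooling, and using made-up numbers), analyzing a simple randomized trial with a binary outcome – say, whether residents signed up after receiving a control letter versus a redesigned one – often comes down to a two-proportion z-test, which needs nothing beyond the Python standard library:

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Compare conversion rates of two randomly assigned groups.

    Returns (rate difference, z statistic, two-sided p-value) using a
    pooled two-proportion z-test -- a standard analysis for an A/B
    trial with a yes/no outcome (signed up or not).
    """
    p_a = success_a / n_a
    p_b = success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_b - p_a, z, p_value

# Hypothetical pilot: 1,000 residents per arm, randomly assigned.
diff, z, p = two_proportion_ztest(success_a=120, n_a=1000,   # control letter
                                  success_b=168, n_b=1000)   # redesigned letter
print(f"lift: {diff:.1%}, z = {z:.2f}, p = {p:.4f}")
```

A small p-value here would suggest the difference between the two letters is unlikely to be chance, which is exactly the kind of evidence that supports expanding (or dropping) a pilot.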
Behavioral insights help us understand how people actually make decisions, so we can design communications and programs that work for real people.
Most of the decisions we make throughout the day are quick, gut decisions, based on habit, or what’s easiest, or by what people around us are doing. Yet when we design City communications and policies, we often assume people are going to be giving us their full attention. (Not to be the bearer of bad news, but we’re probably not the most exciting or important thing in people’s lives.)
Key take-aways and successes?
Lindsey: Staff across the City have used behavioral science and evaluation to design and test new ways of doing things. A few of my favorite examples are:
- Increased sign-ups by decreasing clicks: We found that by reducing one step in a sign-up process, 42% more people signed up for the program.
- Reduced angry calls and staff time: By rewriting a warning letter with easy-to-understand language and actionable steps, we saw a substantial drop in angry callers and in the number of people who filed appeals, saving hundreds of hours of staff time and paperwork.
- Descriptive images get messages across quickly: We found that simple, descriptive images of COVID-safe behaviors made it easy for people to understand what actions they needed to take – even when we removed the text completely.
Renata: Some examples from our work encouraging active transportation include:
- Timing matters: People who had just moved homes were four times as likely to try BIKETOWN as those who hadn't.
- Catch people before habits are set: New movers in Portland who received information and encouragement to try new ways of getting around (biking, transit, walking, etc.) decreased their driving by 8%.
- Reduce program costs: After a communications audit and A/B testing of our Welcome Newsletter, we found the new version was so effective at getting people to sign up that we were able to eliminate a postcard from our standard series of mailers.
What makes you excited about this work?
Lindsey: There are so many success stories from other governments we could learn from or replicate.
- Seattle made small changes to a traffic fine notice that are expected to prevent 22,000 drivers a year from going into debt collection.
- Philadelphia found that using larger envelopes increased the number of seniors signing up for a water discount program 10-fold.
- Washington, D.C. found that adding a reminder letter to families who received low-income assistance could prevent 744 families from losing their benefits every year.
And it’s not just communications – Washington, D.C. has a robust team of staff working across departments. They tested whether body-worn cameras changed police or resident behavior, and when they found the cameras did not, they applied the same rigorous testing to a new training program designed to improve police-resident interactions. New York and Chicago have improved 311 services, increased diversity in Fire Department hiring, and shown that a plastic bag fee is more effective than a ban.
Renata: Learning how to design with behavioral science in mind has given my team a valuable template for creating programs and communications. We no longer make decisions based solely on what’s available or easiest for us to carry out; instead, we use data and science that we can trace all the way through to evaluation. It makes us feel intentional and empowered in our work.
Every time I design a communication, I use The Behavioral Insights Team’s easy-to-remember EAST acronym:
- Easy: Is this easy for people to understand and do? (Get rid of hassle and jargon!)
- Attractive: Do key messages and actions stand out? Have you emphasized the benefits?
- Social: Can you show real people taking the action? Can you quote a community member encouraging others to try it?
- Timely: Are you reaching people when they’re most likely to take action? When they most need a reminder?
How can staff get involved?
There are great articles, books, and resources to learn about these tools. Here are a few to start with:
- What is a ‘randomized control trial’ and why are a growing number of local governments using them?
- What is a behavioral ‘nudge’?
- Nudge in the City: Using Science to Improve Public Services
For DIY resources, check out the Behavioral Evidence Hub checklists to use when drafting letters or emails or thinking about how to simplify complex processes so they're easier for people to navigate. For help designing a high-quality evaluation, Ideas42's A/B testing tool can do some of the heavy lifting for you.
For expert assistance testing a new idea or a change to a current practice or policy, BetaGov.Org is a great resource. They offer foundation-funded assistance at no cost to cities.
Or reach out to Lindsey Maser (BPS), our liaison for What Works Cities evaluation and behavioral science work. She can offer advice or connect you with resources. And if you're already using evaluation or behavioral science in your work, let us know!