Yokum discusses wide-ranging issues on first Behavioral Cities Webinar

Tuesday, Oct 15, 2019

David Yokum, Director of the Policy Lab @ Brown, joined Center Assistant Director Leslie Rowley and ideas42's Managing Director Anthony Barrows for the first Behavioral Cities Webinar in October 2019. The webinar continued the conversation begun at the November 2018 Behavioral Cities Summit, convened in Princeton by the Kahneman-Treisman Center and ideas42.

During the webinar, Yokum spoke about the conception of the Policy Lab @ Brown and the key elements his team made sure to put in place when setting up the partnership with the Rhode Island State House and its various agencies. He discussed some of the quick wins for establishing trust, the need to keep constituents in mind when planning behavioral interventions, and the importance of building capacity for behavioral analysis within government.

The recording of the full webinar is available.  Selected highlights:

Pre-Analysis Plans Reap Benefits Beyond Research Integrity (24:49 - 28:04)

David spoke about the benefits of writing, registering, and sharing pre-analysis plans, beyond simply increasing research integrity. After sharing the plans with community members in a number of different forums, his team received feedback that was not only…

“informative for the scientific enterprise, but it also was remarkably helpful for helping with the health of discussion once we had the results. So normally, what happens after you do evaluations…if you don’t have a process like this, in my experience, people jump to quibbling about the method as a function of whether they like the results or not. And if you don’t like the results, you start to pick apart all the things that were wrong with the design. With this approach, with the pre-analysis plan, it’s not that you don’t make mistakes – our study had tons of limitations – but we had already talked through them; we already knew them; we had already agreed that this was the best we could do in this scenario. And so it really short-circuited the jump to just want to quibble about the method and shift toward what do we do next, in a very positive way.”


Orchestrating informal touchpoints to help people be proximate is a key capacity-building step. (30:20 - 32:49)

When asked about building an internal community of practice around analysis and evaluation, David agreed:

“Community of practice really is the right phrase to have in your mind because a lot of it is not bringing in new people, it’s identifying folks that are already scattered across government, that are scattered in the community who are excited about this, that don’t know about each other or don’t have a platform for talking about this type of work or ultimately rolling up their sleeves.”

But he warned that we sometimes lose sight of the importance of creating informal touchpoints.

“It can be easy to accidentally think you need to sort of over-design some beautiful massive conference or something like that, where really maybe the first thing you need to do is just start emailing the ten people you know today who might be interested to meet next week over a beer or lunch to talk about a project, and then ask them to bring a friend next week, and then you sort of snow pile on that.”


A way to frame the value of evaluation is to make salient the costs of “remaining ignorant.” (40:59 - 42:56)

How we frame and communicate the value of evaluation matters for buy-in, but doing so effectively can seem difficult. David suggested that we often think we know how well something works when, in fact, we may not.

“In my experience, if someone is reluctant to do an evaluation—not only an RCT but almost any other sort of evaluation—it’s often coming from a flash intuition that we already know what works. And if we do, then they’re right. So, in many of my conversations, it can be cathartic to say aloud, literally ‘You are right. If we know something is working, we should just shift our focus to just implement it as well as we can and just get in the funding for it, assuming it works well enough to justify the cost. I agree!’ But now let’s press a little bit on how well we really do know this program works. And if it turns out—and this is my experience 99 out of 100 times—if you really start to press on that, it kind of chips away at that…This is an exercise of keeping really at the forefront the cost of not spending money well and anchoring on that common mission.”


“Real problems that are worth answering almost always have good stories fall right out of them.” (44:55 - 47:44)

David mused about the kind of gruesome storytelling his young boys do when engaged in imaginative play, and suggested that they are trying out scenarios and imagining what they could do differently to get a different outcome. At the root of that practice of avoiding future problems is counterfactual thinking.

“I think we might actually be wrong when we think that people aren’t deeply wired to think counterfactually; I think it’s more a reflection that we’re not at all wired to understand coefficient tables and stuff like that. Stories are an interesting gateway into how to talk about counterfactuals in a way that maps with how we intuitively want to think anyway. You just want to make sure you’re telling stories that are built on some of the quantitative data that you have. A big problem that happens now is you have stories that are anecdotal in the sense that they don’t have any quantitative data, and then you’ve got a lot of writing that is all quantitative but doesn’t try to wrap around compelling stories. Investment in doing both is really important.”

David said that running the exercise of what the story will be before an intervention is run, ideally as a single visual or graph, helps you think about whether the thing is worth doing in the first place.