Causal Explorer: a new web toy for exploring causal connections


When you look at the arrows in a theory of change, like the simple one above, can you assume everyone knows what they mean? Does this diagram mean that to be happy, you have to have enough luck and enough wealth? How much is enough? Is there some kind of threshold? Does it mean that it is enough to have a bit of either? Or perhaps you can only be happy if you have lots of both. Or perhaps luck and wealth are just kind of added up? But what if you already have 90% luck and 90% wealth: can you have more than 100% happiness? Or should we think of happiness as something that goes on forever? But does that even make sense? Could there be someone who was a million billion times happier than an ordinary person? Plus, we've assumed that all three variables vary between "none at all" and "the maximum possible". But the diagram doesn't say so. Perhaps some of them are just true-false variables? Or all of them are?

These questions are real headaches.

I've just released a new web app called "Causal Explorer". You can have fun, rather than headaches, exploring these issues by dragging sliders and watching the effects live.

The app is subtitled "the Wiggle Room" - let me know if you think that is too corny or doesn't make sense to you.

You might have had a play around with my theorymaker.info or even the experimental version at stevepowell.shinyapps.io/theorymaker3. Causal Explorer uses the same algorithm but focuses on combinations of just two or three variables.

This example shows what happens when your intervention (messaging to the public) has an inverted-U effect on your desired outcome (public support for the SAVE BEES campaign): the best results come at 50% of the maximum possible, and more messaging than that is counterproductive. You can play with this example here.
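The app's actual formula isn't published, so here is a minimal sketch of what an inverted-U dose-response could look like, assuming both variables run on a 0–1 scale from "none at all" to "the maximum possible":

```python
def inverted_u(x):
    """Inverted-U response: peaks at x = 0.5 and falls to 0 at both extremes.

    x is the intervention level on a 0-1 scale. This parabola is a guess at
    the general shape, not Causal Explorer's actual formula.
    """
    return 4 * x * (1 - x)

# Support peaks at half the maximum messaging effort...
print(inverted_u(0.5))  # 1.0
# ...and pushing harder is counterproductive:
print(inverted_u(0.9))  # 0.36, worse than moderate messaging
```

Dragging the slider past the midpoint in the app corresponds to moving down the right-hand side of this curve.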

This next diagram is more complicated. It shows the same situation when, at the same time, there is another intervention which has a direct effect on the same outcome ... and the two influences are combined in a way which comes close to ordinary addition. You can see that a small intervention on our part goes a long way if the other campaign doesn't do much. But if they are already doing a lot, our intervention won't make much difference. You can explore this very example here.
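One way to combine two influences so that the result is "close to ordinary addition" for small values but can never exceed 100% is the probabilistic sum. This is only a sketch of the idea, not the app's actual combination rule:

```python
def combine(a, b):
    """Combine two influences on a 0-1 scale.

    a + b - a*b behaves almost like plain addition when both inputs are
    small, but the result is always capped below 1. This is an assumed
    rule for illustration, not Causal Explorer's actual algorithm.
    """
    return a + b - a * b

# A small intervention goes a long way when the other campaign is quiet...
quiet = combine(0.2, 0.1)
# ...but adds little on top of a campaign that is already doing a lot:
busy = combine(0.2, 0.9)
print(round(quiet, 2), round(busy, 2))
```

With the other campaign at 0.1, our 0.2 lifts the outcome by 0.18; with the other campaign at 0.9, the same effort only adds 0.02, which matches the diminishing-returns behaviour described above.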

Causal Explorer is still a bit buggy. If it crashes on you, sorry. You might get some error messages but they tend to go away again, so be patient.
