Baby Come Back

It’s been a while, but I’m back at it with the blogging.

It’s interesting to see the backlash against “strategic philanthropy” continuing to gain force. Bill Schambra’s latest continues a theme he’s hammered for a while, but when the likes of FSG (disclosure: a competitor of the firm for which I work) begin to moderate their approach, you know something is up.

Part of this has to do with an absolutism about data, which cuts both ways. Either you have to be driven entirely by metrics, or they’re the devil. If metrics don’t work, throw ’em overboard.

But what’s most interesting, and difficult, is decision-making in conditions of uncertainty. Which is, you know, the human condition.

This is particularly important when you put data in their proper social context. As I’ve continually railed, the concept of “moving the needle” in philanthropy is inherently problematic. The changes philanthropy can foster, particularly in a social-service context, just aren’t big enough – there aren’t enough people affected – to actually move social indicators. The scale is off. Maybe I’m just being too literal, but it seems like the phrase should actually mean something….

To that point, economist Justin Wolfers has a fascinating account of how difficult it is to draw meaningful conclusions even under the best quasi-experimental conditions, allegedly the gold standard of social analysis.

To wit, North Carolina stopped extending unemployment benefits as of this past January, while surrounding states with broadly similar economies and cultural backgrounds continued them. Conservatives argued that stopping benefits would incentivize the unemployed to try harder to find a job, lowering unemployment rates. Progressives argued that those denied benefits would spend less money, exerting a negative influence on the economy.

When Wolfers crunches the numbers, thoughtfully and in accord with good standards, the answer is…we can’t tell. There are changes in both expected directions, but they’re not significantly different from the changes in neighboring states. We can’t tell what difference the reform made, or who’s right.
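If you want to see the shape of the comparison Wolfers is running, here’s a minimal sketch of the difference-in-differences arithmetic behind it. All the numbers are invented placeholders, not his actual figures; the point is the structure of the reasoning, not the result.

```python
# Illustrative-only sketch of a difference-in-differences comparison.
# The rates below are made-up placeholders, NOT Wolfers' data.

# Hypothetical unemployment rates (percent), before and after the policy change
nc_before, nc_after = 8.8, 8.0                  # "treated" state (benefits cut)
neighbors_before, neighbors_after = 8.5, 7.9    # comparison states (benefits kept)

# Each group's own change over the period
nc_change = nc_after - nc_before
neighbors_change = neighbors_after - neighbors_before

# The difference-in-differences estimate: how much more (or less) the treated
# state changed relative to its neighbors
did_estimate = nc_change - neighbors_change
print(f"NC change: {nc_change:+.1f} pts, neighbors: {neighbors_change:+.1f} pts, "
      f"difference-in-differences: {did_estimate:+.1f} pts")
```

In practice you would also need a standard error around that estimate, built from month-to-month variation across states. If the resulting confidence interval comfortably includes zero – which is essentially what Wolfers finds – the data simply can’t adjudicate between the two camps.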

If we’re hoping data will give us greater certainty, there’s a good chance they won’t. And we’ll need to go back to good old values to decide whether or not to do certain things. Now, there are values that are out of touch with lived reality on the ground; for my money, those aren’t worth cottoning to. So I’m not saying we abandon evidence. But let’s be clear that the data aren’t necessarily going to give us the anchor we thought they could. We may need a degree of faith that longer-term outcomes will ultimately result. Or we may want to place more value on process outcomes, like improving people’s dignity or promoting learning among relevant actors.

Intentionality in philanthropy is critical, but let’s be honest about what we can and can’t be certain about, and be all right with less certainty than an overly predictive view of metrics might suggest….
