Posts Tagged ‘methodology’

Brand New Key(s)

Friday, September 19th, 2014

At my day job, we just put out a paper that I was involved in, “Ten Keys, Ten Years Later: Successful Strategic Planning for Foundation Leaders.” As much as strategy has evolved and continues to evolve, we find that the fundamentals of strategic planning remain relevant for a wide variety of funders. One sign of this is that one of our most enduringly popular briefing papers is “Ten Keys to Successful Strategic Planning for Nonprofit and Foundation Leaders.” We just had a potential client say that’s how they found us – and it came out more than ten years ago.

So we thought it would be useful to revisit the Ten Keys and see what was the same and what changed. We also decided to focus specifically on funders this time around.

It was pleasantly surprising to see that most of the keys held up. We framed them a bit differently this time around, but the fundamentals remain sound – and easy to overlook.

There are two that receive different emphasis this time around, and they’re topics that are near and dear to my heart. One is about how non-grantmaking tools are no longer just an afterthought, but an integral part of the strategy discussion. And the other is that it’s more important than ever to frame strategy relationally – in terms of the ecosystem in which you’re embedded. If you’re an education funder, your strategy needs to address its relation to the strategy of the school district, the charter school network, other education funders – your strategy is not just yours alone, in other words. Those who get that do better in strategic planning.

What do you think about the updated Ten Keys? Do they ring true with your experience of strategic planning? How relevant do you find strategic planning in today’s environment?


The Gambler

Thursday, June 7th, 2012

I’m wondering whether the key to “strategery” isn’t found in the wisdom of Kenny Rogers: “You’ve gotta know when to hold ’em, know when to fold ’em, know when to walk away, know when to run.”

The song is about a card player who is reading not just a series of numbers but also a group of people. It’s said that successful poker players read their opponents, not the cards.

This strikes me as a useful metaphor for strategy in philanthropy, particularly at a time when “metrics mania” has taken hold. To me, it becomes “mania” when metrics are driven by superstition: DATA take on a totemic power and aren’t understood either in themselves or in relation to their context.

It’s not enough to gather data; you have to know how to use them. Which means being clear about why you’re gathering them. Which means being clear about what you’re hoping to accomplish through the use of data.

Strategy in this respect is about the judgment of when to use different kinds of data, and how to balance them against each other. Context is everything. Decision-making is strategic when it’s data-driven, but even that phrase is a bit deceptive. It’s not the data doing the driving; they’re the fuel – you have to be the driver. But all too often we act as if we’re in one of those Google self-driving cars and try to have the data “speak for themselves.” Ain’t no such thing, my friends.

So think about Kenny Rogers the next time you’re wondering how to be more strategic in your giving. Read the numbers on the cards and do your calculations, but only as you read the players and the table.

The unbelievable truth?

Wednesday, December 29th, 2010

Provocative piece in a recent New Yorker (hat tip to Tactical Philanthropy) about an emerging doubt among scientists about the validity of many published results. The “decline effect” is that many results that initially appear robust and statistically valid (X drug helps lessen symptoms of Y disease in Z percent of patients) fade under replication over time: later studies either fail to reproduce the finding, or the effect weakens (Z gets smaller or disappears).

The upshot?

The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.

Interesting, considering how much weight is given these days in philanthropy to randomized controlled trials and experimental design as the gold standard for evaluation, particularly in international development. Reminds us to be humble about our claims.

There are two ways this should happen: one is to be very explicit about our assumptions, and to make them publicly available. This was what I was taught in grad school: describe how you conceptualize, operationalize, and measure your variables, and talk about how you code them. And I studied one of the wanna-be sciences; I’m frankly shocked that such practices aren’t standard in medical research, if the article is to be believed.

The other way to be humble about our claims around evaluation is to triangulate: to put quantitative results in context. Another thing I learned in grad school was to specify mechanisms: in as much detail as you can, describe how you see the causal pathway working between the cause you posit and the effect you’re trying to explain. And harmonize the two: have quant and qual work with each other and reinforce each other.
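
Here is what that can look like in miniature: a codebook that states how a concept is operationalized and coded, applied to a couple of cases. It’s a toy sketch; the variable name, the coding rule, and the case notes are invented for illustration, not drawn from any real study.

```python
# A minimal sketch of "conceptualize, operationalize, measure, and document
# your coding." The variable, coding rule, and cases below are hypothetical.

CODEBOOK = {
    "community_engagement": {
        "concept": "Depth of grantee engagement with local residents",
        "operationalization": "Count of resident-led activities in site-visit notes",
        "coding_rule": "0 = none mentioned, 1 = one or two activities, 2 = three or more",
    }
}

def code_case(case_notes):
    """Apply the documented coding rule to one case's qualitative notes."""
    activities = case_notes.get("resident_led_activities", 0)
    if activities == 0:
        score = 0
    elif activities <= 2:
        score = 1
    else:
        score = 2
    return {"case": case_notes["case"], "community_engagement": score}

cases = [
    {"case": "Site A", "resident_led_activities": 0},
    {"case": "Site B", "resident_led_activities": 4},
]

coded = [code_case(c) for c in cases]
print(coded)  # coded qualitative data can now sit alongside quantitative results
```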

As a new year approaches, always good to be reminded of the importance of humility. I’m often ambivalent about transparency, for a variety of complicated reasons. This kind of transparency, about methods and assumptions that back up claims of empirical “proof” – this I can get behind.

Here’s to a happy and healthy 2011 for one and all. I’ll resume my regular Tuesday-Wednesday-Thursday schedule next week.

The data-driven, multi-method, context-sensitive life (continued)

Tuesday, September 14th, 2010

I’ve had a few posts riffing on a New York Times article about people who create databases of everything from every idea they’ve had since they were teenagers to their caffeine consumption over months. Now comes Ethan Zuckerman, a fellow Eph, aiming to monitor his consumption – of ideas. Ethan’s hypothesis is that our media diet is actually much less heterogeneous than popular images of the freedom of the Internet would have us believe. Sounds plausible to me. That’s why I’m glad to add Philanthropy Daily to my blogroll. I’ve complained about the “conservative think-tank bromides” in philanthropy writer Bill Schambra’s work; I’d better keep myself honest and not get caught in my own echo chamber. I’m telling you, the privileging of local knowledge is a bipartisan concept….

(Hat tip to Stanford Social Innovation Review.)

The data-driven, multi-method, context-sensitive life

Tuesday, August 10th, 2010

In the wake of the Shirley Sherrod fiasco, this op-ed from Van Jones struck a chord: how easy it is today to tear someone down based on a single utterance, divorced from context. This piece from Marcia Stepanek about danah boyd’s (yup, that’s how she spells it) reflections on privacy drove the point home:

“The material that is being put up online is searchable by anyone, and it is being constantly accessed—out of context and without any level of nuance,” Boyd told attendees of last week’s Supernova Conference at The Wharton School in Philadelphia. “That kind of spotlight on people can be deeply devastating, and a type of exposure that may not be beneficial to society.” Put simply, Boyd said, “we can’t divorce information from interpretation … or we risk grave inaccuracy.”

Where are the search algorithms that take a result and put it in context? Is this the next frontier Google should be exploring (rather than “being evil”)? Or is that function one that used to be called journalism?

Methodology for evaluation is like this; it needs to be put into context: what are the assumptions being made, what happens to the results if those assumptions are relaxed? The data-driven life is about more than just numbers; the data-driven, multi-method life has to be about context.
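
A small sketch of that “relax the assumptions” move, with made-up numbers: the same simple effect estimate re-run under different assumptions about participants who dropped out. The data and the attrition scenarios are invented purely to show the shape of the exercise.

```python
# A toy sensitivity check: how does a simple effect estimate move when an
# assumption about missing cases is relaxed? All numbers here are invented.

treated = [12, 15, 9, 14, 11, 13]  # hypothetical outcome scores
control = [10, 9, 11, 8, 12, 7]

def mean(xs):
    return sum(xs) / len(xs)

# Suppose two treated participants dropped out. The reported effect assumes
# they can be ignored; the scenarios below relax that assumption.
scenarios = {
    "as observed (dropouts ignored)": treated,
    "dropouts would have scored low (8, 7)": treated + [8, 7],
    "dropouts would have scored high (14, 15)": treated + [14, 15],
}

for label, group in scenarios.items():
    print(f"{label}: effect = {mean(group) - mean(control):.2f}")
```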

The data-driven, multi-method life

Wednesday, August 4th, 2010

Multi-method research involves some mixture of qualitative, quantitative, and game-theoretical approaches. As I was coming up in grad school, this was increasingly becoming the norm in my department at UC Berkeley. In my own research, I combined archival research with some quantitative analysis – in part of data that I had gathered through that archival research, in part of a dataset that I created based on existing qualitative work. The qualitative work set up the quantitative analysis: I developed concepts and a theoretical framework, and examined them in a case study involving multiple episodes over time in one country. Based on that examination, I identified ways to operationalize the concepts for a broader set of countries, gathered that data, and used it to test the theoretical framework across a set of Latin American countries. In that same chapter, I did three case vignettes, looking at how my theoretical framework applied or did not apply in three other Latin American countries.
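
As a rough sketch of that qualitative-to-quantitative handoff, the move looks something like this: a concept coded from case work, then checked against an outcome across a wider set of cases. The country codings below are placeholders, not my actual data.

```python
# A stylized version of the handoff: concepts developed qualitatively are
# coded for a set of countries, then checked quantitatively. Every coding
# below is a placeholder, not real data.

cases = [
    # (country, concept_present, outcome_present)
    ("Country A", 1, 1),
    ("Country B", 1, 0),
    ("Country C", 0, 0),
    ("Country D", 0, 1),
    ("Country E", 1, 1),
    ("Country F", 0, 0),
]

def outcome_rate(rows, concept_value):
    """Share of cases with the outcome, among cases where the concept takes the given value."""
    subset = [r for r in rows if r[1] == concept_value]
    return sum(r[2] for r in subset) / len(subset)

print("outcome rate when concept is present:", outcome_rate(cases, 1))
print("outcome rate when concept is absent: ", outcome_rate(cases, 0))
```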

This is one reason I think the “data-driven life” is of necessity a multi-method one. Conceptualization and measurement are closely tied, and while measurement is viewed as quantitative, conceptualization is intensely qualitative. It’s important to understand and be clear about the conceptual frameworks underlying measurement when doing evaluation in the philanthropic and nonprofit sectors.

The data-driven life

Tuesday, August 3rd, 2010

Came across an article by this title in the NYT from a few months back, about people who itemize their activities or ideas and turn them into searchable databases. Interesting, but it contains some basic misapprehensions about the nature of data, I think. For example:

If you want to replace the vagaries of intuition with something more reliable, you first need to gather data. Once you know the facts, you can live by them.

And:

In other contexts, it is normal to seek data. A fetish for numbers is the defining trait of the modern manager. Corporate executives facing down hostile shareholders load their pockets full of numbers. So do politicians on the hustings, doctors counseling patients and fans abusing their local sports franchise on talk radio.

But data aren’t just numbers. And the opposite of numbers is not intuition.

A) Qualitative data can be systematized, coded, and made searchable.

B) Tools of quantitative data analysis are subject to the assumptions built into the equations, and those assumptions can be mighty hard to satisfy. And there’s an element of intuition and experimentation to the way those assumptions are made.

We need a more holistic view of what count as data. Yes, to the article’s point, more things than we think can be made into databases, but that only increases the need for interpretation. Data don’t speak for themselves….
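
As a small illustration of point A, here is a toy example of qualitative notes becoming systematic and searchable once they are tagged; the field notes and tags are invented.

```python
# Toy example: qualitative field notes, once coded with tags, become a
# searchable dataset. The notes and tags below are invented.

from dataclasses import dataclass, field

@dataclass
class FieldNote:
    source: str
    text: str
    tags: set = field(default_factory=set)

notes = [
    FieldNote("interview 1", "Staff described burnout after the funding cut.",
              {"staffing", "funding"}),
    FieldNote("site visit 3", "Parents organized their own tutoring schedule.",
              {"community", "education"}),
    FieldNote("interview 4", "The new grant let us hire two case managers.",
              {"staffing", "funding"}),
]

def search(notes, tag):
    """Return all notes coded with a given tag."""
    return [n for n in notes if tag in n.tags]

for note in search(notes, "funding"):
    print(note.source, "->", note.text)
```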

Varieties of capitalism, varieties of philanthropy (part 4)

Friday, July 16th, 2010

So last time, I wondered: what kinds of innovation might an emerging hybrid CME/LME nonprofit economy be good at? Such an economy would be like a European coordinated market economy in that standard setting would happen cooperatively instead of competitively and labor relations would be “sticky” (it’s hard to fire people), but like an Anglo-American liberal market economy in that financing would happen in a public market with publicly available information and workers would continue to come in with general rather than highly specialized skills.

Coordinated market economies are good at incremental innovation, and liberal market economies are good at radical innovation, according to Hall and Soskice. Why should that be? Here I’m reminded of one of the key methodological lessons I learned in grad school: when thinking about causality, focus on the mechanism. This is one way to bridge the gap between correlation and causation: try to tease out and classify the chain of events by which one thing causes another.

For example, in my dissertation, I argue that politicized security forces – in which the army and police have roughly equal resources, the army is more professionalized, and politicians have control over police at the local level – make a government characteristically susceptible to a particular type of armed challenge: insurrection from below. The mechanism I adduce for this is the incentives such a security-force configuration generates for potential armed rebels at the local level. If the police are captured by politicians and are not very professional compared to the army, there is a higher probability that, when faced with insurrection from below, they will defect and join the rebels. (This happened a fair amount in Colombia during La Violencia, the civil war of the 1940s and 50s, which was my case study.) And if the army is not that much stronger than the police in terms of resources, then it can’t simply crush rebel-affiliated police. This creates an opening for potential rebels, and in a country with politicized security forces, like Colombia in the 1940s and 50s, you’d expect to see more insurrection than other types of armed challenges, like military coups. (These, I argue, are likelier to happen in a country with militarized security forces, like Chile or Argentina.) The mechanism is the incentives that the configuration of control and power among army, police, and politicians creates for potential armed rebels.

So what’s the mechanism for radical vs. incremental innovation in coordinated vs. liberal market economies? For Hall and Soskice, it’s the incentives that labor relations and inter-firm relations create for workers and firms (think nonprofits).

  • In a coordinated market economy, job security, peaceful labor relations, and high levels of skills give workers the freedom to try new things in the confidence that there will be uptake from management and that firms will be willing to share ideas with each other in the interest of improving overall processes.

How does that work in a hybrid model? You have relative job security, and labor relations (outside of large nonprofits like hospitals and universities) are generally peaceful. But are nonprofits open to new ideas from their workers? And do they share new ideas with each other?

  • In a liberal market economy, the job market is much more open, which means that firms wanting to try something new have the latitude to hire workers knowing they can easily lay them off if the new project doesn’t work out. Financing is also more open, so established firms can acquire other firms doing innovative things (think Microsoft or Google hoovering up the startups that created Hotmail or YouTube) – which in turn creates incentives for entrepreneurs to found startups that try radically new things. (This is not always a good thing; I remember meeting someone when I lived in the Bay Area in the late 90s who worked for a startup whose idea was “smell over the internet.” You would put a USB doohickey on your monitor that would emit different scents based on the webpage you were on. The doohickey was shaped like a nose.)

How does this work in a hybrid model? (The labor and firm relations, not the nose thing.) It’s relatively easy to hire nonprofit workers, but firing them is difficult. If financing were to become more open, more based on publicly available information, would we see established nonprofits “acquiring” other nonprofits doing innovative things?

It’s a fascinating possibility – a social-service agency that realizes it needs to do policy advocacy to really have impact on education “acquiring” a grassroots community-organizing effort that’s developing new advocacy tools. But it’s difficult enough for nonprofit mergers to happen, let alone nonprofit acquisitions.

Would this change with a social capital market? I’m not sure, because what the “varieties of capitalism” approach teaches us is that the systems of financing, labor relations, education, and inter-firm relations are connected, and their incentives shape and reinforce each other in powerful ways. So to change the financing structure without looking at the other systems might lead to some strange unintended consequences. I’ll explore those in a future post.

What does it really mean to be methodologically rigorous?

Thursday, May 6th, 2010

One of the reasons I chose to get my doctorate in political science at UC Berkeley is that our department is known for being “methodologically plural,” meaning that multiple methods are embraced and taught: statistical analysis, game theory, survey analysis, case studies, comparative-historical analysis, and others.

I came into the program somewhat skeptical about the idea of social “science” – I wanted to study comparative politics, and this seemed to be the place to do it. But I learned something simple and profound about the scientific ideal: it’s about logic, consistency, clarity, and transparency. The ideal is that you make your methods of data collection and analysis clear enough that someone else could use your data, re-run the analysis, and get the same results. In practice, this meant thinking a lot about case selection, about the potential sources of error, and about the tools of data analysis.
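
As a sketch of that ideal, here is what a re-runnable analysis can look like: every step from raw input to reported number lives in one script, with anything stochastic pinned down by a seed. The file name, the column, and the analysis itself are hypothetical stand-ins, not a recipe from any particular study.

```python
# A sketch of the reproducibility ideal: document the data source, fix the
# randomness, and spell out the analysis so a rerun gives the same result.
# The file name, column name, and analysis are hypothetical.

import csv
import random

random.seed(42)  # fix anything stochastic so reruns match

def load_scores(path="survey_responses.csv"):
    """Data collection step: read the raw responses from a documented source."""
    with open(path, newline="") as f:
        return [float(row["score"]) for row in csv.DictReader(f)]

def analyze(scores, n_boot=1000):
    """Analysis step: a mean plus a simple bootstrap interval, spelled out."""
    resamples = sorted(
        sum(random.choices(scores, k=len(scores))) / len(scores)
        for _ in range(n_boot)
    )
    return sum(scores) / len(scores), (resamples[25], resamples[975])

if __name__ == "__main__":
    scores = load_scores()
    mean, interval = analyze(scores)
    print(f"mean = {mean:.2f}, approximate 95% interval = {interval}")
```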

What I took away was the idea that rigor is about making explicit what many take for granted: where did you get your information, how did you analyze it, how else could you have analyzed it, and how do your results follow from your analysis? With so much focus on data and metrics in the nonprofit sector and philanthropy, it’s important to remember that simple idea: rigor is not an elaborate technique or a fancy spreadsheet – it’s about honesty, with yourself and your audience, about the limitations, and the possibilities, of your work. If we can message that more effectively, it may be easier for some folks to get on the metrics bandwagon, and for the public at large to trust in the results of our work.