Posts Tagged ‘context’

The unbelievable truth?

Wednesday, December 29th, 2010

Provocative piece in a recent New Yorker (hat tip to Tactical Philanthropy) about emerging doubt among scientists about the validity of many published results. The “decline effect”: many results that initially appear robust and statistically valid (drug X lessens symptoms of disease Y in Z percent of patients) either can’t be replicated over time, or the effect shrinks (Z gets smaller or disappears).

The upshot?

The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.

Interesting, given how much weight philanthropy places these days on randomized controlled trials and experimental design as the gold standard for evaluation, particularly in international development. It reminds us to be humble about our claims.
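One statistical mechanism behind the decline effect is worth making concrete: if only studies that clear a significance threshold get published, the published effects will systematically overstate the true one, and unfiltered replications will look like a “decline.” Here’s a minimal simulation sketch (my own illustration, not from the New Yorker piece; all numbers are invented):

```python
# Sketch of the "decline effect" as selection on significance.
# TRUE_EFFECT and N are invented for illustration.
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.2   # small real effect, in standard-deviation units
N = 30              # patients per arm in each small study

def run_study():
    """Return the observed effect size from one small two-arm study."""
    treated = [random.gauss(TRUE_EFFECT, 1) for _ in range(N)]
    control = [random.gauss(0, 1) for _ in range(N)]
    return statistics.mean(treated) - statistics.mean(control)

# "Published" studies: only those whose observed effect clears a rough
# significance threshold (about 2 standard errors for this design).
threshold = 2 * (2 / N) ** 0.5
published = []
replications = []
for _ in range(5000):
    effect = run_study()
    if effect > threshold:
        published.append(effect)          # the headline result
        replications.append(run_study())  # later replication, no filter

print("true effect:            ", TRUE_EFFECT)
print("mean published effect:  ", round(statistics.mean(published), 2))
print("mean replication effect:", round(statistics.mean(replications), 2))
```

In runs like this the published studies average roughly three times the true effect, while the replications hover near it: the effect “declines” even though nothing about the underlying reality changed.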

There are two ways to practice that humility. One is to be very explicit about our assumptions, and to make them publicly available. This is what I was taught in grad school: describe how you conceptualize, operationalize, and measure your variables, and explain how you code them. And I studied one of the wanna-be sciences; I’m frankly shocked that such practices aren’t standard in medical research, if the article is to be believed.

The other way to be humble about our evaluation claims is to triangulate: to put quantitative results in context. Another thing I learned in grad school was to specify mechanisms: describe, in as much detail as you can, how you see the causal pathway working between the cause you posit and the effect you’re trying to explain. And harmonize the two: have quant and qual work with and reinforce each other.

As a new year approaches, always good to be reminded of the importance of humility. I’m often ambivalent about transparency, for a variety of complicated reasons. This kind of transparency, about methods and assumptions that back up claims of empirical “proof” – this I can get behind.

Here’s to a happy and healthy 2011 for one and all. I’ll resume my regular Tuesday-Wednesday-Thursday schedule next week.


The data-driven, multi-method, context-sensitive life

Tuesday, August 10th, 2010

In the wake of the Shirley Sherrod fiasco, this op-ed from Van Jones struck a chord: how easy it is today to tear someone down based on a single utterance, divorced of context. This piece from Marcia Stepanek about danah boyd’s (yup, that’s how she spells it) reflections on privacy drove the point home:

“The material that is being put up online is searchable by anyone, and it is being constantly accessed—out of context and without any level of nuance,” Boyd told attendees of last week’s Supernova Conference at The Wharton School in Philadelphia. “That kind of spotlight on people can be deeply devastating, and a type of exposure that may not be beneficial to society.” Put simply, Boyd said, “we can’t divorce information from interpretation … or we risk grave inaccuracy.”

Where are the search algorithms that take a result and put it in context? Is this the next frontier Google should be exploring (rather than “being evil”)? Or is that function one that used to be called journalism?

Evaluation methodology is like this; it needs to be put in context: what assumptions are being made, and what happens to the results if those assumptions are relaxed? The data-driven life is about more than just numbers; the data-driven, multi-method life has to be about context.
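To make the “relax the assumptions” point concrete, here is a hypothetical sketch: the same trial data yields different headline numbers depending on a single coding assumption, namely how dropouts are handled. The data and both analysis rules are invented for illustration.

```python
# Hypothetical example: one dataset, two coding assumptions, two headlines.
# 1 = improved, 0 = did not improve, None = dropped out of the study
outcomes = [1, 1, 1, 0, 1, None, 1, 0, None, None, 1, 1, 0, None, 1]

# Assumption A: analyze completers only (drop the Nones).
completers = [o for o in outcomes if o is not None]
rate_completers = sum(completers) / len(completers)

# Assumption B: intention-to-treat style, counting dropouts as failures.
rate_itt = sum(o for o in outcomes if o) / len(outcomes)

print(f"completers only:      {rate_completers:.0%}")  # 8 of 11
print(f"dropouts as failures: {rate_itt:.0%}")         # 8 of 15
```

Neither number is “the” result; each is a result under an assumption, which is exactly why the assumptions belong in the report alongside the percentage.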