Annotated Psych Links


…exactly what it sounds like.

Every once in a while I start to think we’re getting somewhere concrete on this whole braining thing, and then I’m reminded that our method looks a lot like “poke it with a stick and see what happens.” Or, you know, “poke it with a deep brain stimulator and see if we accidentally change your music preferences.”

Meta-analyses, psychology, and Doing It Right

My colleagues and I have successfully pushed for formalizing what was previously informal and inconsistent: in conducting a meta-analysis, the source of funding for an RCT should routinely be noted in the evaluation using the Cochrane Collaboration's risk-of-bias criteria. Unless this risk of bias is flagged, authors of meta-analyses are themselves at risk of unknowingly laundering studies tainted by conflict of interest and coming up with seemingly squeaky-clean effect sizes for the products of industry.

Of course, those effect sizes will be smaller if industry-funded trials are excluded. And maybe the meta-analysis would have come to a verdict of “insufficient evidence” had they been excluded.

My colleagues and I then took aim at the Cochrane Collaboration itself. We pointed out that this flagging had been done only inconsistently in past Cochrane reviews. Shame on them.

They were impressed, set about fixing things, and then gave us the Bill Silverman Award. Apparently the Cochrane Collaboration is exceptionally big on people pointing out when they are wrong, and so they reserve a special award for whoever does it best in any given year.

I was recently shown Data Colada, the blog of Leif Nelson, Joe Simmons, and Uri Simonsohn. There are thoughtful and easy-to-read pieces on the interaction of variables, effect size measurement in the lab, and… researching people who take baths in hotel rooms.
