ISS Conference Confronts Transparency and Replication "Crisis"

By Ben Hinshaw - As the debate regarding the so-called replication crisis in social science continues, the Making Social Science Transparent conference (hosted by ISS on April 22, 2016) provided a timely opportunity to tackle the issue head-on. Scholars from an array of institutions and fields convened to assess the extent of the problem, and to develop paths towards lasting solutions.

In his opening remarks, UC Davis Professor of Statistics Duncan Temple Lang promised a “broad, interdisciplinary, yet coherent agenda.” He also thanked those in attendance for their ongoing commitment to addressing the crisis. “You’re doing extraordinarily important work,” he said, because “every single aspect of the research and publication process is touched by this issue.”

Collective action

The day’s first session was entitled Defining the Issues: Research on Replication, Reproducibility, and Transparency. Garret Christensen of the Berkeley Initiative for Transparency in the Social Sciences at UC Berkeley presented data suggesting that researchers often overestimate their own conformity to standards of transparency while assuming others are more lax. Meanwhile, Richard E. Lucas of Michigan State argued that common arguments against replication (it’s boring; replicators are “out to get” authors; reputations must inevitably suffer) are easy to discredit, and that rather than debating the value of replication, social scientists should be investing more resources in it. Ensuring that studies have enough statistical power to yield replicable results should, he said, be the rule, not the exception.
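
To make Lucas’s point about power concrete, here is a minimal sketch (an illustration, not material from the talk) of the kind of calculation he is urging researchers to do up front. It uses Python’s statsmodels package; the effect size, significance level, and power target are illustrative assumptions.

```python
# Minimal power-analysis sketch (illustrative numbers, not from the talk):
# how many participants per group does a two-sample t-test need in order
# to detect a medium effect (Cohen's d = 0.5) at alpha = 0.05 with 80% power?
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"Required sample size per group: {n_per_group:.1f}")  # ~63.8, i.e. 64
```

Halving the assumed effect size roughly quadruples the required sample, which is why studies powered only for optimistic effect sizes so often fail to replicate.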

Offering a historical point of view—specifically, that replication was key to the scientific revolution in that it separated science from superstition and dogma—Stanford’s Cristobal Young insisted that replication packages are (or should be) public goods. As the spirit of capitalism continues to infiltrate academia, he said, transparency represents a problem of collective action.

Missing pieces

After a break, the second session was devoted to Best Practices in Action. Katie Corker of Kenyon College introduced the audience to the Open Science Framework, “a one-stop resource simplifying each step of scientific workflow.” Covering planning, execution, reporting, archiving and discovery, the OSF makes the pursuit of transparency simple at every stage—including pre-registration (a process in which researchers commit to a research design before observing their outcome variable).
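
The logic of pre-registration is easy to illustrate. As a hedged sketch (the OSF’s actual registration service freezes and timestamps a project through its website; the study details below are hypothetical), the essential move is committing to a written plan, and a verifiable fingerprint of it, before any outcome data exist.

```python
# Sketch of the logic behind pre-registration (illustrative; not the OSF's
# actual mechanism): write the analysis plan down, then record a timestamped
# fingerprint proving the plan was not altered after the results came in.
import hashlib
import json
from datetime import date

plan = {  # hypothetical study details, fixed before data collection
    "hypothesis": "Group A scores higher than group B on task_score",
    "n_per_group": 64,  # e.g., from a power analysis
    "analysis": "two-sample t-test, two-sided, alpha = 0.05",
    "outcome_variable": "task_score",
}

frozen = json.dumps(plan, sort_keys=True).encode("utf-8")
print("Plan registered:", date.today().isoformat())
print("SHA-256 fingerprint:", hashlib.sha256(frozen).hexdigest())
```

Sharing the fingerprint (or the plan itself) publicly is what turns a private intention into a verifiable commitment.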

Developmental psychologist Michael C. Frank of Stanford proposed several “incremental changes” to help make social science research more cumulative. These included larger studies, multiple internal replications, an emphasis on measurement and estimation rather than statistical significance, and making everything open by default. “There’s no reason why our laundry shouldn’t be out on display,” he said. “It’s no dirtier than anyone else’s.” Multiple iterations, he added, are essential to revealing true patterns—social science should be conducted accordingly.

Based on personal experience garnered as a PhD candidate in psychology at UC Davis, Jehan Sparks presented the “Start-Local Approach” adopted by the Attitudes and Group Identities Lab, led by Alison Ledgerwood. Her recommendations included learning and critically evaluating new techniques and adopting a “gain frame” attitude. By starting local, Sparks and her lab partners have saved time and resources, accumulated knowledge about what works and what doesn’t, and developed a deeper trust in their own data. Such local advances can, she said, be used to inform the development of field-wide standards.

“Working with big data is no longer the domain of specialists,” said Tracy K. Teal, introducing Data Carpentry and its program of data skills training to enable more effective and more reproducible research. Such training, she said, is “the missing piece between data collection and data-driven discovery.”

"I would love open data to be the default." - Simine Vazire

Open by default

After lunch, UC Davis Professor of Political Science Brad Jones moderated a Journal Editors Roundtable. With academic careers so dependent on publication, he said, it is important to discuss transparency from the perspective of journals. William Jacoby of the American Journal of Political Science described a comprehensive replication policy initiated by that publication in March 2015. This policy, which involves attempting to reproduce all analyses before they are published, has thus far proved “wildly successful.” Simine Vazire, editor at Social Psychological and Personality Science, echoed Jacoby’s sentiments. “I would love open data to be the default,” she said, emphasizing that she would be open to publishing incremental research and null results in order to guarantee replicability. “Let’s move away from binary thinking,” she added. “That doesn’t mean lowering the standards”—quite the opposite, in fact.

“What we really want is for people to think ahead of time of the reasons for what they’re about to do,” said John Patty, co-editor of the Journal of Theoretical Politics. Since it is practically impossible for journals to check every claim in a given paper, they must prioritize the questions an author is most likely to face later. Also important to consider is the matter of consistency: if standards differ widely from journal to journal, the meaning of tenure (so dependent on publication) will inevitably change.

“Too often,” said Kristin Kanthak, associate editor at the Journal of Experimental Political Science, “people operate on a ‘Here are my results—they prove I’m right’ basis.” Advocating for a cultural shift that allows for more open discussion of research failures, Kanthak predicted that in a few short years it will be standard procedure for researchers to submit their code and data along with their manuscripts.

"The system is broken. It needs to be completely overthrown." - Jonathan Eisen

Revolutionary thinking

The conference’s final session was entitled Alternative Ways of Looking at Reproducibility and Transparency. Shu Shen, assistant professor of economics at UC Davis, presented evidence of statistical manipulation by the Chinese government in its reporting of “Blue-Sky Days,” revealing the extent to which self-reported and self-evaluated data can be misrepresented for political ends.

Jonathan Eisen, professor of evolution & ecology and medical microbiology & immunology, declared himself an openness evangelist. Researchers shouldn’t fear being “scooped” when they share their data, he said; rather, they should expect offers of help. But he has also come to recognize that openness does come with real risks, including facilitating biopiracy and poaching of endangered species. Nonetheless, he said, “the system is broken. It needs to be completely overthrown.”

Don A. Moore of the Haas School of Business concluded the day on an inspiring note with his presentation How to Start a Revolution. Describing his own “vigilante” efforts to insist on reproducibility in every aspect of academia (including hiring and grant writing), he challenged the audience to be part of the solution, not the problem. “The changes we’re talking about can help us to advance our fields and rebuild faith in our published research,” he said. 

Catch up on the Twitter conversation: #MSST2016. To watch videos of all the day's panels and talks, visit the Presentations page.

Photographs by Cindy Chin. This event was hosted by ISS and co-sponsored by the Department of Political Science.
