Data Publishing, Experimentation

Open Data and Trust in the Literature

David Spiegelhalter from the Royal Society makes a compelling case for the need for accurate statistical analysis of data, and the availability of that data in order to know whether one can trust a research result. In this short film, he visits with the University of Cambridge’s Nicole Janz to talk about reproducibility. One really intriguing suggestion Janz makes is that we need better training for students in this area. A course that required students to replicate published results from available data would go a long way toward building an understanding of what they need to do to generate (and publish) their own trustworthy results.

About David Crotty

I am the Editorial Director, Journals Policy for Oxford University Press. I oversee journal policy and contribute to strategy across OUP’s journals program, drive technological innovation, serve as an information officer, and manage a suite of research society-owned journals. I was previously an Executive Editor with Cold Spring Harbor Laboratory Press, creating and editing new science books and journals, and was the Editor in Chief for Cold Spring Harbor Protocols. I received my Ph.D. in Genetics from Columbia University and did developmental neuroscience research at Caltech before moving from the bench to publishing. I have been elected to the STM Association Board and serve on the interim Board of Directors for CHOR Inc., a not-for-profit public-private partnership to increase public access to research.

Discussion

3 thoughts on “Open Data and Trust in the Literature”

  1. Maybe the problem isn’t with the lack of “better training” but rather with the ad hoc statistical methods themselves

    Posted by Enrique Guerra-Pujol | Jan 15, 2016, 4:49 pm
    • This could be a chicken-and-egg argument, but if researchers were trained in better techniques, experimental design, and statistical analysis, wouldn’t they likely use better methods in their future work?

      Posted by David Crotty | Jan 15, 2016, 5:42 pm
  2. The issue is more subtle, and the terminology is confusing. Spiegelhalter emphasizes reproducibility but probably means replicability. We tried to clarify these various terms in http://www.nature.com/nmeth/journal/v12/n8/nmeth.3489/metrics/googleplus

    See also http://www.statisticsviews.com/details/news/8274061/Clarifying-terminology-that-describes-reproducibility.html

    The bottom line is that research uses numbers to derive findings. To clarify the claims of the research authors (what they found or did not find), we suggest that they generalize their findings and map a “boundary of meaning” delineating what their findings do and do not imply.

    Generalization is one of the eight information quality dimensions we describe in
    Kenett, R.S. and Shmueli, G. (2014) On Information Quality, Journal of the Royal Statistical Society, Series A (with discussion), Vol. 177, No. 1, pp. 3-38.

    See also
    http://www.statisticsviews.com/details/feature/4812131/For-survival-statistics-as-a-profession-needs-to-provide-added-value-to-fellow-s.html

    In addition to replicating study-results analysis as a technical exercise (with open data), one should replicate experiments in order to verify the claimed generalization of their findings; this is what we consider the essence of reproducibility. It stands in contrast to replicability, which attempts to reconstruct the specific experimental setup conditions in order to obtain a replication of the experimental outcomes. In fact, reproducibility requires changes, while replicability attempts to avoid them. A critical point of reproducing an experimental result is that irrelevant conditions are not necessarily replicated. A successful generalization providing extensive reproducibility allows replicating the findings of a scientific concept rather than reproducing distinct experiments.

    Posted by rkenett | Jan 30, 2016, 12:30 am

The mission of the Society for Scholarly Publishing (SSP) is "[t]o advance scholarly publishing and communication, and the professional development of its members through education, collaboration, and networking." SSP established The Scholarly Kitchen blog in February 2008 to keep SSP members and interested parties aware of new developments in publishing.
The Scholarly Kitchen is a moderated and independent blog. Opinions on The Scholarly Kitchen are those of the authors. They are not necessarily those held by the Society for Scholarly Publishing nor by their respective employers.