Reinventing the Methods Journal: Increasing Reproducibility with Video Journals

Against the Grain, Oct 2016

By Kira Henderson, Published on 10/31/16


by Kira Henderson (Deputy Director of Journal Development, JoVE)

The way science journals present research must be rehabilitated or risk becoming obsolete, causing foreseeable negative consequences to research funding and productivity. Researchers are dealing with ever-increasing complexities, and as techniques and solutions become more involved, so too does the task of describing them.
Unfortunately, simply explaining a technique with text does not always paint a clear enough picture. Scientific publishing has followed essentially the same model since the original scientific journal was published in the mid-seventeenth century. Thanks to advances in technology, we have seen some minor improvements, such as the addition of color printing and better dissemination and search functionality through online cataloging. But what has actually changed? In truth, not all that much. Articles are still published as text-heavy tomes with the occasional photograph or chart to demonstrate a point.

Dr. John Ioannidis, the C.F. Rehnborg Chair in Disease Prevention at Stanford University, and two independent teams of scientific analysts recently attempted to reproduce the findings of 18 research articles. The articles, published in Nature Genetics in 2005 and 2006, profiled gene expression from microarray data. Despite the authors' claims that the microarray data set was publicly available, the procedures were not detailed enough to allow for accurate reproduction of the findings for 16 of the 18 articles.1

Inability to reproduce findings is not an uncommon problem in modern science. Several other independent studies confirm Dr. Ioannidis's findings, including a report by researchers at the pharmaceutical company Amgen, where only six of the 53 studies they tested were reproducible,2 and an internal report at Bayer HealthCare, where results from published data were irreproducible in two-thirds of their projects.3 As research becomes more complex and the dependency on detail and accuracy grows, there is a need for more clarity in the publication of methods.

Is the lack of progress in scientific publishing affecting the productivity of science? Data from several recent studies suggest that this is a possibility. So, inevitably, we are faced with the question of what we can do to increase the productivity of science. Is the current problem a consequence of the way science is performed or of the way it is published?

Biomedical Research Budgets at Risk Due to Low Reproducibility

A recent article in the Journal of the American Medical Association detailed a large-scale biomedical research budget and spending study by the Alerion Institute. The authors of the study found that spending on biomedical research, which had doubled over the last decade to an all-time high of over $100 billion a year in the U.S. alone, has now begun to decline. The Alerion study found that industry is the largest sponsor of medical research, at 58 percent of the spending, followed by a 33 percent contribution from the federal government. This equates to approximately $30 billion contributed by the U.S. government each year (from agencies like the National Institutes of Health and the National Science Foundation), and means that the U.S. spends about six cents of every health care dollar on medical research. Dr. Hamilton Moses III, coauthor of the study and chairman of the Alerion Institute, said, "If we're going to be spending $100 billion a year, we'd better have treatments that work over a long period of time against diseases that are important today and will be more important tomorrow."4 Dr. Moses and his team also concluded something rather shocking from their study: while spending on biomedical research has doubled over the past decade, approval for new drugs and medical devices has stagnated.
Possible causes for the productivity shortcomings in biomedical fields have been linked to the current lack of reproducibility in published work. The implication is an incredible waste of resources and a risk to research funding. Drug manufacturers rely heavily on early-stage academic research and can waste millions of dollars on products if the original results are later shown to be unreliable. Moreover, when patients enroll in clinical trials based on conflicting data, they may see no benefit or, worse, suffer harmful side effects.

Unlike pharmaceutical companies, academic researchers rarely conduct experiments in a "blinded" manner. This makes it easier to handpick statistical findings that support a positive result. And, in the quest for jobs and funding (especially in an era of economic malaise), the growing army of scientists needs more successful experiments published under their names, not failed ones.

So if everyone wants and needs to reproduce experiments, why are reproducible results becoming so elusive? One reason may be that different labs and different materials can produce variant results. The more variables in an experiment, the more likely it is that accumulated errors will swing a lab's conclusions one way or another. However, given the systematic inability to reproduce experiments that is occurring (and which is publicly documented in Nature and other journals), something else must be happening.

As pressure to "publish or perish" increases, we seem to be creating a system that is collapsing in on itself. With a 25 percent increase in researchers, according to a 2007 report from the U.K.'s Royal Society, and an increase in the number of publications, it would seem that there is an adequate amount of information to properly convey progressing techniques and findings. However, the information that is available frequently does not effectively communicate the intricacies of an experiment. This is because scientific publications have not kept up with the changes in science. Most journals still depend on the same communication methods of 350 years ago. They are using text, intending for words alone to convey increasingly complex experiments. However, this need not be the only solution.

As a publisher, we have provided a new way for scientists to disseminate information. We are JoVE, the Journal of Visualized Experiments, and we create a novel publication that includes all the essential elements of a traditional text publication, such as the abstract, introduction, protocol, results, and discussion, but also features one crucial addition: video. JoVE has published over 2,500 video articles across a variety of disciplines in the biological, medical, and physical sciences. Each month, 300,000 people at over 2,000 institutions around the world view these articles to learn new techniques, increase collaboration, and teach the next generation of scientists.

They say a picture is worth a thousand words, so can you imagine what a video is worth? As the world becomes increasingly interested and involved in multimedia through the proliferation of the Internet and the expansion of access to smartphones with built-in video recorders, there is no excuse for publishers who allow themselves to be left behind. From YouTube's simple stock of videos to JoVE's professionally produced demonstrations, we are now at a point where video is not a luxury but a requirement.
As video becomes more a part of everyday life, scientists are becoming more comfortable with, and excited about, using it as their publishing preference.

Has Visualization Produced Results?

Dr. Nikolaos Giagtzoglou, a postdoctoral researcher in neuroscience at Baylor College of Medicine, needed to learn three techniques for working with Drosophila (fruit flies) for a new application of his research. Unfortunately, Dr. Giagtzoglou found it challenging to learn these techniques from traditional text-based literature, and trying to find and then learn from a researcher fluent in the technique proved difficult. "Even when you meet someone who specializes in a technique, it can be hard to coordinate busy schedules to travel and learn the method," explained Dr. Giagtzoglou.5

He discovered an article in JoVE by Dr. Greg Macleod, an expert in motor neuron backfilling in Drosophila, one of the techniques Giagtzoglou was attempting to learn. Giagtzoglou immediately saw the benefit of video publishing. "It's like night and day. JoVE's visual demonstration, from the beginning to the end, is helpful to researchers," he said. "Watching a JoVE video-article is so much more helpful than reading just materials or methods, which can have grammatical mistakes, bad syntax, or may be hard to interpret."6 Using JoVE, Dr. Giagtzoglou and his colleagues were able to backfill Drosophila motor neurons with calcium indicators in just days, instead of the weeks he had spent reading other journals. The lab also reduced the number of generations of flies required to get experimental results, saving thousands of dollars in researcher time and the cost of fly upkeep.

While Dr. Giagtzoglou found it difficult to coordinate schedules with a researcher who knew the techniques he needed, other researchers have found that even if they can coordinate schedules, the cost savings of learning by video add up tremendously. Dr. Theresa Casey, assistant professor in the Department of Animal Sciences at Purdue University and a member of Dr. Karen Plaut's lab, explains that JoVE has saved the Plaut Lab thousands of dollars, particularly in travel costs. "I had a collaborator in Buffalo who knew the [Suprachiasmatic Nuclei] surgery, and I've seen it done before. By using the JoVE video, we saved money in travel costs to go to Buffalo repeatedly to learn the technique."7

Dr. Casey goes on to say that video articles have helped her both as a refresher for techniques and as a foundation for building her own research. "I've been doing research for 20 years, and having JoVE makes things so much easier. You can educate yourself on research other scientists are doing around you and get familiarized on a technique before you try it. I like to watch techniques and refresh myself on experiments I haven't conducted in 18 years but need now."8

Using a video-based protocol is useful both for learning new techniques, as Dr. Casey did, and for validating the techniques used to produce novel results, as Dr. Jonathan T. Butcher at Cornell University has done.
Dr. Butcher explained that after publishing his group's results in a high-impact journal, they received numerous inquiries from other labs questioning the validity of their findings because, he said, "these other labs were not able to reproduce our results using the written instructions in the methods section of our novel research paper."9 It is undoubtedly frustrating for researchers who spend months or years perfecting a technique to produce potentially groundbreaking results, only to have those results challenged by other labs. However, after they published their methodology with JoVE, Dr. Butcher's group no longer receives questions about the validity or reproducibility of their results. Dr. Butcher explains, "The video format conveys complicated methods significantly better than text alone and helped to validate our novel results."10

Conclusion

With publication up and reproducibility in decline, there is a clear disconnect between peer-reviewed journals and the assistance they provide to researchers. Reproducibility is what makes science, science. Without the ability to replicate what others have done, science loses not only its credibility but also its ability to serve as the building blocks for future discoveries and breakthroughs. As JoVE has demonstrated, with new technology available there is a way to assist researchers that was not possible 350 years ago at the inception of the printed science journal. Science is responsible for many advances, and there is no reason that its partner in disseminating information, the scientific journal, should not also be progressive. Video articles have been found to reduce the time it takes for a scientist to learn a technique, thereby increasing productivity, saving research dollars and resources, and giving researchers a base from which to apply for and win new grants.

Put simply, today's scientific environment is challenging, and it is no longer reasonable to suggest that what worked 350 years ago will yield the same results today. Text-only publications are no longer advancing science, and if they do not reinvent themselves in the way JoVE has shown to be possible and productive, then the scientific community faces some very real and serious obstacles.

Endnotes

1. "Repeatability of published microarray gene expression analyses," Nature Genetics 41, 149-155 (2009). Published online: 28 January 2009. DOI: 10.1038/ng.295, http://www.nature.com/ng/journal/v41/n2/full/ng.295.html.
2. "Drug development: Raise standards for preclinical cancer research," Nature 483, 531-533 (29 March 2012). Published online: 28 March 2012. DOI: 10.1038/483531a, http://www.nature.com/nature/journal/v483/n7391/full/483531a.html.
3. "Believe it or not: How much can we rely on published data on potential drug targets?" Nature Reviews Drug Discovery 10, 712 (September 2011). DOI: 10.1038/nrd3439-c1, http://www.nature.com/nrd/journal/v10/n9/full/nrd3439-c1.html.
4. "Medical Research Got More Money Over Last Decade," New York Times (21 September 2005). http://www.nytimes.com/2005/09/21/national/21medical.html?_r=0.
5. Interview conducted in internal case study.
6. Ibid.
7. Ibid.
8. Ibid.
9. Ibid.
10. Ibid.



Kira Henderson. "Reinventing the Methods Journal: Increasing Reproducibility with Video Journals," Against the Grain, 2016. https://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=6605&context=atg