The Evolution of Technology


By Craig Whittington, PhD, Associate Director (Clinical Effectiveness), NCCMH; Senior Research Fellow, UCL; Director, UK GRADE Centre*

We live in interesting times. In the world of evidence synthesis and guideline development, as in other areas of life, technology is advancing rapidly, with the promise of more efficient, more accurate, and more transparent use of evidence. It is driving huge advances in, among other things, text mining, methods for synthesizing direct and indirect data (network meta-analysis), approaches for appraising the overall quality of evidence, linked data, and big data. At the same time, we are starting to see increased transparency and access to clinical trial data in Europe and the US.

On the flip side, evidence-based medicine (EBM) itself is under increased scrutiny and criticism (for example, see Spence, 2014). At the end of last year, Carl Heneghan (Professor of EBM at the University of Oxford) wrote in the BMJ that the problems can be categorized, in broad terms (and he acknowledged this doesn’t do justice to the full range of issues), into three groups: “distortion of the research agenda, very poor quality research, and a lack of transparency for published evidence.”

Nevertheless, those conducting systematic reviews and developing clinical guidelines face increasing pressure to be more efficient while, at the same time, addressing the issues that Heneghan and others have described.

Technology alone is no panacea, but it promises to improve efficiency and may help tackle some of the criticisms surrounding EBM. One example is GINDER (the Guidelines International Network Data Extraction Resource), an online system for extracting study characteristics and outcome data based on templates developed by the G-I-N Evidence Tables Working Group (for further information about the templates, see Mlika-Cabanne, 2011). GINDER was designed to help address the objectives of G-I-N, which include the promotion of “international collaboration in guideline activities to avoid duplication of effort and to facilitate information-sharing, education and knowledge transfer.” Unfortunately for GINDER, technology alone was not sufficient; funding constraints and difficulty achieving a critical mass of users were barriers to success, and ultimately led to development being stopped in 2013.

Fast-forward to 2015, and guideline developers now have a number of options that promise to overcome these barriers. That said, technological solutions are not new in the world of systematic reviewing, but in the past their usability has left much to be desired. This point was made recently by Elliott and colleagues, who noted that “review authors commonly conduct the majority of their work on a patchwork of general software products poorly adapted to their needs, much of the data they handle is not captured for future use, and the core review output of a static PDF document limits the ability to search and process the contents of the review.” These problems point to the need for end-to-end solutions, that is, systems that allow a seamless process from creating a review protocol, to searching for evidence, to synthesizing it, to drafting a report.

In 2012, when looking for an end-to-end solution, I found Doctor Evidence (DRE). Their suite of products includes a library feature (DOC™ Library), a data management and analysis feature (DOC™ Data), a reporting feature (DOC™ Create), as well as products for searching drug labels (DOC™ Label) and analyzing data from NHANES (DOC™ NHANES). At the time, few solutions were as comprehensive or linked to such a large database of digitized studies. A subsequent pilot project using the first version of the DRE platform demonstrated the power of an integrated solution. As you might expect, the full range of tasks associated with a systematic review could be conducted online, for example, screening search results, data extraction, critical appraisal of individual studies, meta-analysis, and the overall assessment of quality using the GRADE approach. The multiple ways to query, tabulate, and report data extracted from each study were intuitive and efficient. Nevertheless, because of the way the system had been developed and its original purpose, DRE staff were needed to set up each review, import studies into DOC Library, and create the data extraction template. Of course, for some users, this level of support is both needed and wanted. For others, however, greater autonomy is required. Version 2.0 of the platform addresses these issues by giving greater control to the user and, in addition, integrates access to the R software environment for a wider range of meta-analysis options, including network meta-analysis.
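To give a flavor of the kind of calculation these integrated tools and R packages automate, below is a minimal sketch of a DerSimonian-Laird random-effects meta-analysis written in Python. The log odds ratios and standard errors are entirely hypothetical; they are not taken from the DRE platform or from any real review.

# Minimal sketch of a DerSimonian-Laird random-effects meta-analysis.
# The data below are hypothetical log odds ratios, used purely for illustration.
import numpy as np

# Hypothetical per-study effect estimates (log odds ratios) and standard errors
yi = np.array([-0.35, -0.10, -0.52, 0.05, -0.28])
sei = np.array([0.20, 0.15, 0.30, 0.25, 0.18])

# Fixed-effect (inverse-variance) weights and pooled estimate
wi = 1.0 / sei**2
fe_pooled = np.sum(wi * yi) / np.sum(wi)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance
q = np.sum(wi * (yi - fe_pooled) ** 2)
c = np.sum(wi) - np.sum(wi**2) / np.sum(wi)
tau2 = max(0.0, (q - (len(yi) - 1)) / c)

# Random-effects weights, pooled estimate, and 95% confidence interval
wi_re = 1.0 / (sei**2 + tau2)
re_pooled = np.sum(wi_re * yi) / np.sum(wi_re)
re_se = np.sqrt(1.0 / np.sum(wi_re))
print(f"tau^2 = {tau2:.3f}")
print(f"Pooled log OR = {re_pooled:.3f} "
      f"(95% CI {re_pooled - 1.96 * re_se:.3f} to {re_pooled + 1.96 * re_se:.3f})")

In practice, of course, established R packages such as metafor and netmeta handle this, and far more (including network meta-analysis), with well-tested code, which is exactly what integration with the R environment makes available.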

Others have recognized the importance of end-to-end solutions and have moved, or are moving, in this direction; for example, see EPPI-Reviewer, the Guideline Development Tool (GDT), the Cochrane Author Support Tool (based on Covidence), DistillerSR, and SUMARI. Other systems are designed specifically for part of the review process, for example, the Systematic Review Data Repository (SRDR), EROS, and Rayyan. Many developers also recognize the benefit of sharing data extracted from individual studies and of harmonizing data models; the linked PICO project is one example. Projects like these, as well as initiatives like GROWTH, will clearly play an important role in facilitating co-operation among guideline and research organizations. It is through co-operation that we’ll see the real evolution in the world of evidence synthesis and guideline development.

* National Collaborating Centre for Mental Health; Centre for Outcomes Research and Effectiveness, Research Department of Clinical, Educational & Health Psychology, University College London, London, UK

Twitter: @whittington_cj