Building an RWD “Ecosystem” Requires Standardization, Collaboration
Interest in making greater use of real-world data (RWD) is growing among clinical researchers, regulators and healthcare providers alike, but effective collection and analysis of RWD on a large scale will require a collaborative “ecosystem” in which data collection is standardized for easier analysis.
A critical problem is that RWD is often unstructured and difficult to transfer to clinical trial use, noted Laura Esserman, professor of surgery and radiology at the University of California, San Francisco (UCSF) and director of the UCSF Breast Care Clinic. Electronic health records (EHRs) provide a wealth of RWD, but clinical data management is not done through EHRs, running instead through study-specific case report forms (CRFs) and proprietary data management systems.
“The model wasn’t developed to leverage digital healthcare data because it wasn’t available,” said Stephanie Reisinger, vice president and general manager of Veradigm Life Sciences. “We are trying to plug RWD into a model not designed to accept it.”
Standardizing data collection practices so that “clinical trials look more like care and care like clinical trials” would help create and scale up the RWD ecosystem, Esserman said at a two-day workshop sponsored by the FDA and Duke University’s Margolis Center for Health Policy.
It could be more efficient to take the data as it is collected and move it directly from the EHR into the clinical research database in response to the demands of a study protocol, suggested Monica Bertagnolli, professor of surgical oncology at Harvard Medical School. The Minimal Common Oncology Data Elements (mCODE) could offer a model for future RWD collection efforts, Bertagnolli noted; mCODE has been used successfully for low-burden data collection, tracking disease history and treatment response in clinical trials.
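Bertagnolli's idea of moving data straight from the EHR into a research database depends on mapping raw records onto a small set of common data elements. The sketch below illustrates that mapping in Python; the field names are invented for illustration and are not actual mCODE element names.

```python
# Illustrative sketch: projecting a raw EHR export row onto a minimal set of
# common data elements. Field names here are hypothetical, not real mCODE
# elements; a production pipeline would use coded terminologies throughout.

def to_common_record(ehr_row: dict) -> dict:
    """Map a raw EHR row onto a standardized minimal record."""
    return {
        "patient_id": ehr_row["mrn"],          # would be pseudonymized in practice
        "diagnosis_code": ehr_row["icd10"],    # coded diagnosis, not free text
        "stage": ehr_row.get("stage"),         # may be missing in RWD
        "treatment": ehr_row.get("regimen"),
        "response": ehr_row.get("last_response"),
    }

record = to_common_record({
    "mrn": "P-0001",
    "icd10": "C50.9",
    "stage": "IIA",
    "regimen": "AC-T",
    "last_response": "partial response",
})
```

The point of the common-elements layer is that every source system maps into the same shape once, so downstream analysis does not need per-site logic.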
Structured data fields can help support the systematic collection of key variables, said Leslie Harrold, chief scientific officer at biotech company Corrona. Those fields could include patient demographics, patient lifestyle information, treatment history, disease characteristics and activity, patient-reported outcomes, laboratory measurements and adverse events. The electronic data capture design is also important, she said. It needs to include automatic edit checks, routine site performance checks and routine analytic data checks based on logic. In addition to the data itself, terminology, codes and formatting need to be standardized.
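The automatic edit checks Harrold describes can be thought of as named predicates run over each structured record, with failures surfaced as data queries rather than silently accepted. This is a minimal sketch under assumed field names and thresholds, not any specific EDC system's implementation.

```python
# Minimal sketch of automatic edit checks: each check is a named predicate
# over a structured record. Field names and thresholds are hypothetical.

CHECKS = [
    ("age_in_range", lambda r: 0 <= r["age"] <= 120),
    ("visit_after_enrollment", lambda r: r["visit_date"] >= r["enroll_date"]),
    ("lab_value_plausible", lambda r: r["alt_u_per_l"] < 1000),
]

def run_edit_checks(record: dict) -> list:
    """Return the names of checks the record fails (empty list = clean)."""
    return [name for name, check in CHECKS if not check(record)]

failures = run_edit_checks({
    "age": 54,
    "enroll_date": "2023-01-10",   # ISO dates compare correctly as strings
    "visit_date": "2023-02-01",
    "alt_u_per_l": 35,
})
# failures == []
```

In practice such checks fire at data entry, generating a query back to the site, which is what makes the collection "regulatory-grade" rather than best-effort.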
But the real challenge is not in collecting data per se, Harrold said, but in how to collect “regulatory-grade” data.
From the FDA’s point of view, the most important concern is that the RWD be fit for purpose, said Jacqueline Corrigan-Curay, director of the Office of Medical Policy in the agency’s Center for Drug Evaluation and Research. The data must be relevant, accurately representing the conditions it is supposed to reflect, and it must meet key quality standards.
Patient-generated data is considered critical for creating real-world evidence (RWE) that can be used for regulatory decision-making. Patient surveys are one way of getting this information, but increasingly, researchers and healthcare providers alike are turning to technologies capable of objectively measuring patient statistics and reporting that information electronically in real time.
The FDA provides one model of an effective RWD ecosystem. Amy Abernethy, FDA principal deputy commissioner of food and drugs, discussed the COVID-19 Evidence Accelerator, an initiative by the Reagan-Udall Foundation for the FDA in collaboration with Friends of Cancer Research. The Evidence Accelerator provides a venue for the rapid collection, analysis and sharing of data and results related to COVID-19 research.
“At its heart, an RWD ecosystem is an RWD community, with people sharing data and ideas,” Abernethy said. The Evidence Accelerator relies on the following features to work toward that goal in the COVID-19 research and treatment communities:
- Common data elements;
- FDA-provided translation tables between common data models;
- Common protocols, including ideas for master protocols;
- Parallel analysis, including how to review and use different findings and how to handle lack of convergence;
- Individual accelerator communities focused on certain topics; and
- Frequent meetings and forums for rapid cycle feedback and learning.
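The translation tables between common data models mentioned above can be pictured as field-name mappings that let the same query run against differently structured datasets. OMOP and Sentinel are real common data models used in RWD research, but the specific field pairings below are invented for illustration only.

```python
# Hedged sketch of a translation table between two common data models.
# The pairings below are illustrative assumptions, not an official mapping.

TRANSLATION = {
    # hypothetical source field : hypothetical target field
    "person_id": "patid",
    "condition_start_date": "admit_date",
    "condition_source_value": "dx_code",
}

def translate(record: dict, table: dict) -> dict:
    """Rename a record's fields per the translation table.

    Fields not listed in the table pass through unchanged."""
    return {table.get(key, key): value for key, value in record.items()}

source_row = {
    "person_id": 42,
    "condition_start_date": "2021-03-05",
    "condition_source_value": "U07.1",
}
translated = translate(source_row, TRANSLATION)
```

Publishing such tables centrally, as the FDA did for the Evidence Accelerator, spares each participating group from redoing the same cross-model reconciliation.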
The draft principles governing the Evidence Accelerator also reflect priorities already identified within the clinical research industry, including respect for patient privacy, transparency about how data is collected and analyzed, traceability and provenance of the data, and prompt dissemination, not only to other researchers but back to healthcare providers for use in treatment plans.
But merely collecting data is not sufficient to create an ecosystem in which all stakeholders have ready access to both the data and its analysis. Work spurred by the COVID-19 pandemic has shown that researchers can perform extraordinarily well both in generating and analyzing data at speed and in sharing results, said Lesley Curtis, professor and chair of the Department of Population Health Sciences at the Duke University School of Medicine.
“We need to make sure we keep some of these attributes going forward,” Curtis said.