Pilot-to-Portfolio Transition Remains Barrier to Novel Data Management Methods
An overly compartmentalized approach to testing new data management solutions is one of the key obstacles to widespread adoption within an organization, research by the Tufts University Center for the Study of Drug Development (CSDD) and Saama Technologies indicates.
For instance, most companies — 84 percent, according to CSDD — are unlikely to adopt any novel data management solutions without first conducting a pilot test or proof-of-concept study. And 87 percent report having trouble generalizing pilot experience to portfolio-related activities.
The problem, said Kenneth Getz, deputy director of CSDD, is that the clinical research industry tends to conduct pilot tests as separate entities from the organization’s overall development strategy.
“There is no continuity planning,” Getz told attendees at a recent Xtalks webinar. “Once a pilot ends, there is no plan to move it to a broader level of activity. Often, the people involved in the pilot are no longer available to help the organization learn lessons from the pilot and begin to integrate it into drug development activities.”
To overcome this weakness, pilot or proof-of-concept programs should include concrete plans for integrating successful tests into a company’s broader development program. This could include incorporating insights from clinical research partners and staff involved in moving new products through the development pipeline, as well as regulatory specialists familiar with FDA demands for clinical data.
CSDD has identified several options that can help companies move new data management methods from concept to reality and handle the massive amounts of data generated by a growing variety of sources. These include:
- Protocol simplification;
- Assessment of protocol feasibility using novel technology;
- Quality-by-design and use of risk-based approaches to determine which data is more critical;
- Integrated platform solutions and unified data;
- Automation with validation tools to drive more rapid data collection; and
- Application of artificial intelligence (AI) to leverage computing capacity to supplement limited personnel and resources.
None of these approaches is new, Getz noted. The industry has been applying many of them at the pilot or proof-of-concept scale for up to 15 years, but little progress was made until the COVID-19 crisis created a sense of urgency.
Response to the COVID-19 pandemic has certainly led to faster adoption of many technology solutions, particularly around remote monitoring and virtual studies. CSDD reported that 40 percent of trials ongoing at the time of the outbreak moved to remote or virtual models that included home health visits, direct-to-patient drug and supply delivery, and collection of data via portable devices, among other approaches.
A poll taken of webinar participants indicated that 90 percent have seen accelerated digital technology uptake and the need for remote monitoring due to the COVID-19 pandemic, a figure Getz said was borne out by CSDD research. The greatest data management challenges reported by webinar attendees were data integration (38 percent of respondents), data analysis (28 percent), data capture (23 percent) and database lock (10 percent).
But while pressures due to the COVID-19 pandemic have accelerated testing and adoption of new analytics and technology solutions for clinical data management, several factors in the industry can still impede adoption of data management technologies, Getz noted. The issues CSDD has identified include lack of staff skills, company culture, lack of leadership, risk to research integrity, lack of trust in the solutions, economic risks and regulatory risks. Of these, economic and regulatory risks are the most difficult to manage, Getz said; most of the other challenges can be mitigated through study design and changes to corporate policies.
Necessity — such as that posed by the COVID-19 outbreak — is an answer to the economic issue. Under the stresses of the pandemic, the risk profile for trying new data management approaches has changed, Getz said. And regulatory agencies have been encouraging use of virtual and remote models and allowing more flexibility in data collection and management approaches during the pandemic, he said.
Adopting novel approaches to collecting and managing data will be important in a clinical research world where vast volumes of data must be handled. CSDD research indicates that the volume of data points collected for a typical clinical trial has increased by about 200 percent over the past 10 years. The fastest-growing category is tertiary or exploratory data, rather than data related to core endpoints, Getz said. This type of data is often used to position or support new drug products commercially.
One effect of the increased inflow of data has been longer development cycle times; CSDD data indicates that the average cycle time is up 14 percent compared with three years ago. There is also greater variation around the mean cycle time, Getz said, which indicates that performance has become less predictable.
“Studies with higher numbers of data sources contributing to the study databases show increases in average cycle time and in variation around the mean cycle time,” Getz said.
Adding to the challenge is the fact that an increasing amount of the data collected is more subjective, patient-provided information. The movement toward use of more remote, unstructured data from patients began before the pandemic, Getz said, but has increased rapidly in 2020.