Human Experience, Analytical Systems Join to Strengthen Trials

August 12, 2019

Drawing objective conclusions from subjective measurements is a common challenge in pain management research and one that can lead to skewed results without a system to analyze data, spot problems and evaluate their impact.

But identifying aberrations is not the only challenge, says Arturo Morales, chief technology and data officer for WCG Analgesic Solutions. “It’s very easy to tell that one thing is not like the others. But merely looking at data and saying, ‘This one looks different,’ describes the way people have been approaching monitoring in clinical trials for years.”

That approach is no longer enough. Today's trials require more sophisticated methods, ones that sidestep the data quantity and quality requirements inherent in machine learning tools, Morales says. Statistical process control (SPC), a technique borrowed from the manufacturing and quality domains, is one such approach. SPC, combined with clinical review by central clinical and statistical monitors, can be used to cull through clinical data for differences that are both mathematically significant and clinically relevant. This approach, he says, then allows human experience and judgment to interpret those differences, assess their impact on outcomes and guide interventions that mitigate risks to study outcomes.
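The core SPC idea can be sketched in a few lines. The example below is an illustrative sketch only, not the QDSS implementation: it computes Shewhart-style control limits (mean plus or minus three standard deviations) from baseline data and flags new observations outside those limits for human review. The scores are hypothetical.

```python
# Illustrative sketch of statistical process control (SPC), not the
# QDSS implementation: flag observations outside mean +/- 3 sigma.
from statistics import mean, stdev

def control_limits(baseline, sigmas=3.0):
    """Compute Shewhart-style control limits from baseline data."""
    m, s = mean(baseline), stdev(baseline)
    return m - sigmas * s, m + sigmas * s

def out_of_control(values, lower, upper):
    """Return (index, value) pairs that fall outside the limits."""
    return [(i, v) for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical weekly mean pain scores (0-10 scale) from one site.
baseline = [4.1, 4.3, 3.9, 4.0, 4.2, 4.1, 3.8, 4.0]
lower, upper = control_limits(baseline)

new_weeks = [4.0, 4.2, 7.5, 3.9]
flagged = out_of_control(new_weeks, lower, upper)
print(flagged)  # -> [(2, 7.5)]
```

The machine only marks the points; as the article stresses, deciding whether a flagged point is clinically relevant, and what to do about it, remains a human judgment.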

Consider the sheer quantity of variables in today’s trials, Morales says. Only a handful are likely to have a direct impact on the outcome. If you try to tackle the task without the systems to help you, you are quickly overwhelmed by the number of signals, the amount of work, the quantity of data and the challenge of consistently applying interventions to avoid introducing bias in the study.

But you also need to formulate solutions. “No machine, no matter how learned or intelligent, can handle that task,” he says. “You need human knowledge to make sense of the data.”

Focusing on too large a set of variables will lead to many signals that are clinically irrelevant, Morales says. “We aren’t looking for every questionable blip on the radar. The key is that we know which blips are likely significant.” For example:

  • In a patient: Extreme variability in daily symptom reporting or discordance between caregiver and clinician assessments of the same symptom;
  • At a particular site: An outlier site with multiple subjects with high anxiety scores;
  • Across the entire study: A change over time in perceived disability scores or number of adverse events.
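The site-level signal above can be illustrated with a simple across-site comparison. This is a hypothetical sketch, not the QDSS method: it flags any site whose mean anxiety score sits more than two standard deviations from the mean across all sites. Site names and scores are invented for the example.

```python
# Hypothetical sketch of outlier-site detection, not the QDSS method:
# flag sites whose mean anxiety score is an outlier across the study.
from statistics import mean, stdev

def outlier_sites(site_scores, threshold=2.0):
    """Flag sites whose mean score lies more than `threshold`
    standard deviations from the across-site mean."""
    site_means = {site: mean(scores) for site, scores in site_scores.items()}
    m, s = mean(site_means.values()), stdev(site_means.values())
    return [site for site, sm in site_means.items() if abs(sm - m) > threshold * s]

# Hypothetical per-subject anxiety scores, grouped by site.
site_scores = {
    "site_01": [12, 14, 13, 11],
    "site_02": [13, 12, 14, 13],
    "site_03": [28, 30, 27, 29],  # high-anxiety outlier site
    "site_04": [11, 13, 12, 14],
    "site_05": [12, 13, 11, 12],
    "site_06": [14, 12, 13, 13],
    "site_07": [12, 11, 13, 12],
}
print(outlier_sites(site_scores))  # -> ['site_03']
```

A flagged site is a candidate for review, not a verdict; the threshold and the choice of metric are exactly the kind of clinically informed decisions the article says humans must make.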

Humans select those metrics, focusing on variables there is a clinical reason to examine, not just a grab bag that may or may not be relevant. After all, variation itself isn't a bad thing: aberrations that are mathematically significant aren't necessarily clinically relevant, Morales says.

Analgesic Solutions’ answer to the challenge is its Quantitative Data Surveillance System (QDSS), which combines SPC with interpretation and analysis by a team of clinical experts – trial monitors, clinicians, subject matter experts, etc. – appropriate to the individual trial. Then, working with trial staff and a sponsor, the team comes up with recommendations on what to do based on clinical operations, disease knowledge and regulatory expertise.

But timing is key. Waiting until the trial is completed and the database locked leaves you with no recourse if you find a problem. “In the past,” Morales says, “you set up your trial, and you hoped you designed your protocols perfectly and the sites executed them accurately. Then you closed your eyes and hoped for the best for two years.”

“Today, sponsors and CROs can look under the hood while the trial is running and blinded and identify threats to the outcome of the study,” he says. “That’s a monumental change.”

The QDSS goal is to recommend mitigation strategies and corrective actions systematically and consistently. The system lets you triage a problem in real time and act while you still have the opportunity to make changes. "You have to stop the bleeding first," Morales says.

For example, when data quality in patient-reported outcomes appears to be declining at certain sites, staff and patients may need more training in accurate symptom reporting. That training refreshes key concepts shown to affect data quality and thus can potentially improve the outcome of the study.

Other solutions QDSS might propose include adjusting screening criteria to recruit more accurately or monitoring specific sites for procedural problems.

Beyond helping current trials, QDSS can apply the lessons it learns to make future trials better, Morales says. "The learnings we derive from past trials can increase sensitivity earlier and optimize the variables we look at and how we look at them for clinical relevance." That approach, he says, minimizes bias by standardizing analysis and responses.


-By Leslie Ramsey