MCC is the place to meet industry peers, exchange ideas, collaborate and think.

Clinical Trial Risk & Performance
Management vSummit

Sept. 8-10, 2020  |  Early access to pre-recorded sessions beginning Aug. 26, 2020

Schedule

Pre-recorded Sessions

An analysis of Jazz clinical trial protocols and of the MCC Protocol Operational Complexity Tool

Alec Vardy

Executive Director
Jazz Pharmaceuticals

The MCC protocol operational complexity tool was used to score the complexity of several ongoing Jazz clinical trial protocols. The findings were used to identify the key drivers of that complexity and to suggest modifications that could reduce it. In addition, the MCC tool itself was analyzed in order to propose modifications that make it more relevant to the therapeutic areas in which Jazz is active and to the nature of the trials it conducts. This adaptation is intended to demonstrate quantitatively that steps taken to reduce protocol complexity do indeed result in lower complexity scores when the tool is applied.


Bridging the Gap Between Issue and Risk Management – Why You Need to Connect the Programs

Linda Sullivan, MBA

Executive Director
WCG MCC

When it comes to implementing corrective and preventive action (CAPA) plans in clinical trials, we consistently hear similar frustrations from members of the research community. Teams identify and implement corrective actions, but identifying and implementing preventive actions, which aim to minimize the likelihood that an issue will recur in other studies, still gets lost in the shuffle. This happens largely because, in root cause analyses (RCAs), root causes often aren’t identified beyond the individual study being reviewed. The focus is often solely on the tactical issue or issues at the study level, rather than also at the system level or across studies. Consequently, issues are resolved only within the individual study where they were identified and are not further addressed with preventive measures for other ongoing and future clinical trials. Now that organizations have implemented risk-based quality management programs to comply with ICH E6 (R2), they have an opportunity to improve the effectiveness of CAPA preventive actions by applying learnings from current and emerging issues to risk control and reduction at the system level.


Case Study: Determining the Right Metrics to Monitor and Develop a Sponsor-CRO Partnership

Stephen Crow

Associate Director, Performance & Training, Clinical Operations
GW Pharmaceuticals

Keith Dorricott

Director
Dorricott Metrics & Process Improvement Limited

Rapid expansion at a biotech company has led to a change in strategy: outsourcing work to strategic partner CROs. But how can metrics assist with the oversight and development of these partnerships? GW Pharmaceuticals is working with an MCC Ambassador to select and define metrics and to lead discussions with the partner CROs on developing metrics that add value for both organizations. The engagement has also led to training opportunities in risk-based quality management, metrics definition and use, and root cause analysis. Join us to find out where we are on the journey and what the future plans are.


Case Study: The Merz Journey to Successful RBQM

Nico Wegener

Senior Clinical Project Manager
Merz Pharmaceuticals

Johann Proeve

Chief Scientific Officer
Cyntegrity

Many healthcare companies and CROs struggle with the best approach to implementing RBQM, largely because there is little history to build on: not much that other companies have already developed and shared. Cyntegrity built an entire course on RBQM using a 'belt' approach, and Merz Pharmaceuticals was the first company to adopt it, using it for a fully-fledged roll-out across its R&D organization.

The training was split into four parts (white, green, black, and executive belt), each with a different focus for different staff levels. Merz staff took the white belt course to get comfortable with the basic idea of RBQM, while the green belt course was tailored to a specific Merz study in order to link an actual study with the RBQM process.

The black belt course was taken by the staff who will be in charge of RBQM management going forward, mainly covering the change management aspects of RBQM implementation. Finally, senior management was familiarized with the concepts and the return on investment of an RBQM implementation through the 'executive belt' course.

This session will address the lessons learned in developing such a course and the participants' experience. We will also discuss a real study as an RBQM example and how Merz will implement RBQM in future studies.


Evolving Risk-Based Monitoring Metrics

Vera Pomerantseva

Sr. Central Monitoring Lead, Central and Risk Based Monitoring
Bristol-Myers Squibb

Metrics measure performance and progress, but they are much more than that: a sufficient set of metrics should be able to tell a story, track progress, and give direction. Despite having been in the pharma space for a while, RBM remains a rapidly evolving area, and as it evolves, so do its metrics. Many established functions (user experience, risk management, project management, etc.) started in the same manner and progressed by developing their methodologies. Is it time to benchmark RBM? This presentation is designed to open a discussion on the best approach to RBM metrics.


Remote Monitoring: Data Quality Risk Assessment During and After a Pandemic

Amy Jimmerson, RN, BSN, MSHCA, CCRA

Clinical Monitoring Director, Global Monitoring
Medtronic

Todd Johnson

Principal Consultant
Halloran Consulting Group, Inc.

This session will focus on how one company developed and implemented a Remote Monitoring Quality Assessment tool and process in response to COVID-19. The discussion will focus primarily on process development and will help listeners understand why the process was necessary and the determinants/components used to develop the tool. The discussion will include factors to consider when determining whether there is a difference in data quality between remote monitoring and on-site monitoring.


The RBM Training Gap: Risk-Based Monitoring the Noun vs. the Verb

Sandra (SAM) Sather, MS, BSN, CCRA, CCRC

Vice President
Clinical Pathways

Have you heard "We are not doing RBM for this study . . ."? What? A quality system should support a risk-based approach to many clinical trial activities: site monitoring, data quality oversight, vendor oversight, safety surveillance, audit planning, and more. Organizations train staff on what risk management is and on their process for risk assessment, but team members who work directly with sites or project teams often do not understand that there is no such thing as "RBM trials" and "non-RBM trials." This misunderstanding holds us back from critical thinking during project management, vendor oversight, and site monitoring. Using RBM as a noun during vendor selection at study start-up often creates the false impression that risk-based thinking does not apply to all trials. There is also a large gap in the "elevator speech" from senior management, vendor selection defense teams, and audit representatives describing how risk management is ensured at the project level. Come to this session to:

  • Identify the gaps in RBM training,
  • Recognize the misuse of the term RBM and its impact on SOW and monitoring plan development during study set-up,
  • Describe some ways to assess and address the gaps.


Case Study: Gaining Access to and Utilizing Operational Data to Improve the Study Start-Up Process

Anamika Sarkar

Business Relationship Manager, Global Development Systems
Regeneron

With more trials outsourced, in part or in whole, many sponsors do not have timely access to their operational data. They work with multiple CROs, each with their own dashboards, portals, etc. By getting CROs to use sponsor systems, or by having regular data feeds into sponsor systems (e.g. Clinical Trial Management Systems, Study Start-Up), the reporting capabilities of the sponsor systems can be used. With the increase in outsourcing, is there a point at which using the sponsor’s clinical trial systems in this way becomes inefficient? Would it be more effective to manage decisions using a data aggregation platform? During this session, the presenter will discuss the challenges faced by a mid-sized biotechnology organization and describe the approach used to integrate clinical operational data from CROs and other vendors into a sponsor’s system that allows users to view both planned and actual study start-up data in real time. This has transformed the approach to start-up and enabled real time decision making to keep clinical trials on track.


Risk Based Quality Management Conversations in the Field

Nechama Katan

Director, Data Science Lead
Pfizer

Christine Panetti

Sr. Associate Central Monitor Risk Based Monitoring
Pfizer

Assertiveness: Learn to be assertive within your team discussions. Work beyond challenging personalities, limited resources, and unclear expectations by conveying your needs and interests confidently within the same arena as your teammates. Analyze your own style of communicating. Recognize your personal triggers and obtain techniques for overcoming them when the pressure is on. Learn to provide and accept feedback willingly while keeping the relationship in mind.

Influence: Discuss the power within you to influence the success of your study team meetings. There are many ways in which you can have an impact on the team. Knowing your role and how to leverage your influence will help you to build trust with your teammates and rely on each other for accountabilities. Effective RBQM requires the effort of each member of the team. Identify how you can make your best contribution.


Case Study: How to Use Quality Metrics in Vendor Oversight

Nancy Dynes, MBA

Metrics Consultant, Medicines Quality Organization
Eli Lilly and Company

The success of Sponsor-Vendor relationships is often measured by milestones achieved, budgets met, and data collected. But the success of clinical development is based on the quality and reliability of the information submitted. Tracking and measuring quality is key to a successful submission and partnership, and this session will review some fundamental indicators to ensure quality is always in the picture.


Improving Measurement of Sponsor/Vendor Relationship Health

Keith Dorricott

Director
Dorricott Metrics & Process Improvement Limited

Feedback from sponsor/vendor surveys is an important part of evaluating a vendor relationship. But what happens when the results look OK but those in your organization know there are real problems? Some people refer to this phenomenon as “the watermelon” – it’s green on the outside but red on the inside. What questions should be asked in surveys? How should results be quantified? How should the data be interpreted so the right actions can be taken to improve the relationship? How can positive results be identified, acknowledged and celebrated? This presentation summarizes one of the key outputs of the MCC Vendor Oversight Work Group – how to design, conduct, interpret, and use the data from relationship surveys.


Case Study: Gates MRI Experience Utilizing a Site Audit Selection Tool Derived from the MCC Site Study Conduct Scoring Tool

Maryann Livolsi, MSN, RN

GCP Compliance Leader
Bill & Melinda Gates Medical Research Institute

Linda Sullivan, MBA

Executive Director
WCG MCC

During this session, presenters will discuss the approach, criteria and performance metrics used by the Bill & Melinda Gates Medical Research Institute to select sites for onsite audits of a Phase 2 study. The approach allows users to weight the importance of selection criteria based on study-specific factors, align performance measures to review for each criterion, and identify high-scoring sites to consider for audits. Additionally, this session will review the importance of using leading indicator versions of metrics when evaluating and comparing site performance during study conduct.
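The weighted audit-selection scoring described above can be sketched in a few lines. The criteria names, weights, and site data below are purely illustrative assumptions for demonstration, not the Institute's actual tool or parameters.

```python
# Illustrative sketch of weighted site scoring for audit selection.
# Criteria names, weights, and site data are hypothetical examples.

CRITERIA_WEIGHTS = {  # higher weight = criterion matters more to audit selection
    "protocol_deviations_per_subject": 0.4,
    "query_rate_per_crf_page": 0.25,
    "overdue_action_items": 0.2,
    "days_since_last_monitoring_visit": 0.15,
}

def score_site(metrics, weights=CRITERIA_WEIGHTS):
    """Weighted sum of normalized (0-1) risk metrics for one site."""
    return sum(weights[name] * metrics[name] for name in weights)

def select_for_audit(sites, top_n=2):
    """Return the IDs of the top_n highest-scoring (highest-risk) sites."""
    ranked = sorted(sites, key=lambda s: score_site(s["metrics"]), reverse=True)
    return [s["site_id"] for s in ranked[:top_n]]

sites = [
    {"site_id": "101", "metrics": {"protocol_deviations_per_subject": 0.9,
                                   "query_rate_per_crf_page": 0.2,
                                   "overdue_action_items": 0.1,
                                   "days_since_last_monitoring_visit": 0.3}},
    {"site_id": "102", "metrics": {"protocol_deviations_per_subject": 0.1,
                                   "query_rate_per_crf_page": 0.8,
                                   "overdue_action_items": 0.9,
                                   "days_since_last_monitoring_visit": 0.7}},
    {"site_id": "103", "metrics": {"protocol_deviations_per_subject": 0.2,
                                   "query_rate_per_crf_page": 0.1,
                                   "overdue_action_items": 0.2,
                                   "days_since_last_monitoring_visit": 0.1}},
]

print(select_for_audit(sites))  # site IDs ranked highest for audit
```

Adjusting the weights is how such a tool can be tuned to study-specific factors, as the abstract describes.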


Foundational IT Systems Selection and Contracting Best Practices

Gary Tyson

Partner
Pharma Initiatives

Failed implementations of large-scale IT systems (e.g., EDC, CTMS, QMS, eTMF) waste hundreds of thousands of dollars and hundreds of hours of precious time. Selecting the right large-scale IT system is the most important step in ensuring a successful implementation. Yet many organizations take an informal, unstructured approach to IT system selection and open themselves up to the risk of failure.

Gary Tyson from Pharma Initiatives Consulting will share the best practices they have developed by successfully leading over 40 large-scale IT system selections and implementations over the past 25 years. This program, which includes a presentation and a workshop, will focus on the practical steps that organizations can take to increase the likelihood of IT system selection success. The workshop will be a deep dive into building a Value Case for your next IT system, a step that will serve as the foundation for your selection process.


Fulfill Requirements and Ensure the Reliability of Clinical Trial Results Using Risk-Based Monitoring Methods

Nathaniel Katz, MD

Chief Scientific Officer
WCG Analgesic Solutions

ICH requires sponsors to use risk-based monitoring approaches to ensure the reliability of clinical trial results, but does not define reliability or related concepts. Accuracy and reliability do, however, have consensus definitions that can be applied to clinical trials. This presentation will describe how sponsors can implement definitions of accuracy and reliability in their RBM methods, which will fulfill regulatory requirements and improve the success rate of clinical trials.


Building and Assessing Vendor Relationships

Maria Makarovskaya, MA

Global Strategic Sourcing
Clinical Category Management Lead
Corbus Pharmaceuticals

How do you define a meaningful relationship, and how do you measure it? When is the right time to establish a governance plan? Is it always determined by the size or length of the engagement? What risk factors do you need to consider? What are common partnership challenges, and how can they be overcome? Do you have tools to build a culture of collaboration that supports a productive relationship?


Machine Learning: How to Implement Operational Predictions and Why these Insights are Key to Business Success

Elvin Thalund

Director, Industry Strategy
Oracle Health Sciences

This session introduces the implementation of machine learning for milestone prediction and how it can evolve. Clinical operations staff need to have confidence in machine learning predictive models and be able to validate the accuracy of their outcomes. By knowing which indicators have the most impact on these models, organizations can focus on those indicators to refine their models and learn from the insights, which can ultimately drive behavioral changes (i.e., less reliance on subjective decisions) to optimize business processes.

Machine learning allows organizations to continuously improve with direct implications on timelines and associated costs of clinical trials.
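As a rough illustration of identifying which indicators have the most impact, the following sketch ranks synthetic operational indicators by how strongly they track a milestone outcome (days of enrollment delay). The indicator names and data are hypothetical assumptions; a production approach would typically inspect a trained predictive model's feature importances rather than raw correlations.

```python
# Illustrative sketch: rank operational indicators by how strongly they
# correlate with a milestone outcome. Indicator names and data are made up.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One value per completed study: indicator readings and the observed delay.
indicators = {
    "site_activation_rate":  [0.9, 0.7, 0.5, 0.8, 0.3],
    "screen_failure_rate":   [0.1, 0.3, 0.5, 0.2, 0.6],
    "query_resolution_days": [4, 9, 15, 6, 20],
}
delay_days = [5, 20, 45, 12, 60]

# Order indicators by absolute strength of association with the delay.
ranked = sorted(indicators,
                key=lambda name: abs(pearson(indicators[name], delay_days)),
                reverse=True)
print(ranked)  # indicators ordered by strength of association with delay
```

The top-ranked indicators are the ones worth focusing on when refining a predictive model, per the reasoning above.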


Day 1

Tuesday, Sept. 8, 2020

9:00 AM

Private time to view pre-recorded sessions and prep for live discussions

9:00 AM - 10:00 AM EDT
10:00 AM

Opening comments

10:00 AM - 10:15 AM EDT

Linda Sullivan, MBA

Executive Director
WCG MCC
10:15 AM

Community Discussions

10:15 AM - 11:30 AM EDT

  • Community Discussions use a virtual white board platform and polling functionality
  • Presenters of related presentations will participate in the discussion (rather than having individual presenter Q&A sessions)
  • A summary of the discussion is included in the Summit Report – which is received by all participants for no additional cost
  • To enhance the richness of the discussions, participants are requested to view the presentations related to topic prior to the discussion
Bridging the Gaps between CAPA and RBQM
Does your organization connect the programs? How does your organization track and stratify “issues”?

Linda Sullivan, MBA (Discussion Leader)

Executive Director
WCG MCC

Oleg Shevaldyshev (Discussion Leader)

Associate Director, Quality Assurance
PRA Health Sciences
Risk-based Auditing
How does your organization decide which vendors and/or investigative sites to audit?

Maryann Livolsi, MSN, RN (Discussion Leader)

GCP Compliance Leader
Bill & Melinda Gates Medical Research Institute

Liz Wool, CCRA, CMT (Discussion Leader)

President
Wool Consulting Group, Inc.
Virtual/Remote Trial Oversight
How do you oversee and monitor virtual trials? What data do you need to collect and review? What is the role of the "site" monitor?

Arturo Morales, PhD (Discussion Leader)

Vice President, Technology Solutions
WCG
RBQM Training — The Foundation for Successful RBQM Implementation
RBQM training has surfaced as one of the most critical areas, frequently preventing companies from taking the first step toward RBQM. This hurdle was discussed by Cyntegrity and Merz in their pre-recorded presentation. This community discussion session will look at the training content in more detail: how to get started (big bang or step by step), whom to involve, which studies are good candidates and which are not, retrospective analyses versus RBQM implementation in a brand-new or ongoing study, when to be strict and when to be more relaxed, and other hand-holding processes that may be required for a successful start in the RBQM world. This is also your opportunity to ask any question related to the training or the RBQM implementation steps.

Johann Proeve (Discussion Leader)

Chief Scientific Officer
Cyntegrity

Nico Wegener (Discussion Leader)

Senior Clinical Project Manager
Merz Pharmaceuticals

Working Groups

10:15 AM - 11:30 AM EDT

  • Working Groups are a roll-up-your-sleeves and build-a-solution group with the use of a virtual white board platform
  • The Work Group Leader will provide participants with ideas as a starting place for the Work Group to build on
  • To make best use of the available time, participants are requested:
    • to view presentations related to topic prior to the session
    • to complete survey questionnaire prior to the session
    • to gather information from their organization and be ready to share
  • A summary of the work group product is included in the Summit Report – which is received by all participants for no additional cost
Key Risk Indicators (KRI)
Develop a list of commonly used KRIs and their effectiveness

Kevin Douglass (Working Group Leader)

Associate Director-Process Excellence & Risk Management
DSI

Steve Young (Working Group Leader)

Chief Scientific Officer
CluePoints
Vendor Oversight
What are the most important questions about vendor performance that your organization seeks to answer? What do you measure to answer the questions?

Keith Dorricott (Working Group Leader)

Director
Dorricott Metrics & Process Improvement Limited
11:30 AM

Break

11:30 AM - 12:00 PM EDT
12:00 PM

Group leaders recap discussion and working group outcomes

12:00 PM - 12:30 PM EDT
12:30 PM

Group Exercise – Data Analytics Team Exercise Part 1

12:30 PM - 1:45 PM EDT
In this group exercise, participants will be separated into teams competing to uncover the root cause(s) of issues described in a case study. Each team will be provided with a case study packet that includes a description of a clinical trial, data reports, questions to explore and a worksheet to record the discussion.

Exercise Wrap-up
Teams will compare results, discuss how they worked through the analysis, and reflect on lessons learned.

Keith Dorricott (Facilitator)

Director
Dorricott Metrics & Process Improvement Limited

Linda Sullivan, MBA (Facilitator)

Executive Director
WCG MCC
1:45 PM

Recap and Closing Comments

1:45 PM - 2:00 PM EDT

Linda Sullivan, MBA

Executive Director
WCG MCC
2:00 PM

Private time to view pre-recorded sessions and prep for live discussions

2:00 PM - 4:00 PM EDT
4:00 PM

Day 1 Adjourns

Day 2

Wednesday, Sept. 9, 2020

9:00 AM

Private time to view pre-recorded sessions and prep for live discussions

9:00 AM - 10:00 AM EDT
10:00 AM

Opening comments

10:00 AM - 10:15 AM EDT

Linda Sullivan, MBA

Executive Director
WCG MCC
10:15 AM

KEYNOTE: A Regulatory Compliance Perspective on Improving Clinical Trial Quality, Protection of Trial Participants, and Data Integrity

10:15 AM - 10:45 AM EDT
  • What should clinical research industry stakeholders keep in mind when implementing quality-by-design and risk-based quality management programs?
  • Has FDA seen a shift in inspectional findings that suggest good clinical practice compliance and data quality are improving?
  • How does CDER use submission data to quantify risk to determine where to conduct BIMO inspections?
  • What has been the impact of the COVID-19 pandemic?

Jean Mulinde, MD

Policy Advisor
Division of Clinical Compliance Evaluation
Office of Scientific Investigations, CDER, FDA
10:45 AM

Community Discussion Groups

10:45 AM - 12:15 PM EDT

  • Community Discussions use a virtual white board platform and polling functionality
  • Presenters of related presentations will participate in the discussion (rather than having individual presenter Q&A sessions)
  • A summary of the discussion is included in the Summit Report – which is received by all participants for no additional cost
  • To enhance the richness of the discussions, participants are requested to view the presentations related to topic prior to the discussion
Centralized/Onsite Monitoring Process Oversight and Metrics
How well are the centralized and onsite monitoring processes working? What does your organization measure?
Vendor Oversight
How do you assess partnership quality?

Maria Makarovskaya, MA (Discussion Leader)

Global Strategic Sourcing, Clinical Category Management Lead
Corbus Pharmaceuticals
Monitoring Data Quality
How has the COVID pandemic changed how your organization monitors data quality?

Working Groups

10:45 AM - 12:15 PM EDT

  • Working Groups are a roll-up-your-sleeves and build-a-solution group with the use of a virtual white board platform
  • The Work Group Leader will provide participants with ideas as a starting place for the Work Group to build on
  • To make best use of the available time, participants are requested:
    • to view presentations related to topic prior to the session
    • to complete survey questionnaire prior to the session
    • to gather information from their organization and be ready to share
  • A summary of the work group product is included in the Summit Report – which is received by all participants for no additional cost
Quality Tolerance Limit (QTL) Parameters
Develop list of commonly used QTLs
IT System Selection – How to Define and Estimate System Benefits before Selection
Defining the benefits of a new system is critical for:
  • Making the case to senior leadership
  • Understanding what functionality truly drives the system’s value and
  • Determining how much the organization can really afford to invest in the new system
Yet defining benefits is very challenging, and as a result many system selections lack a clear benefit statement. During this work group session, participants will each develop a set of benefits and value estimates and share them with the group for feedback and support.

Gary Tyson (Working Group Leader)

Partner
Pharma Initiatives
Risk Based Quality Management Conversations in the Field
In this working group we will focus on a break-out session where concepts can be applied to real work problems. Bring your study team conflicts to the discussion and work collaboratively to develop well-thought-out resolution strategies. Through this exercise you will learn to write, mitigate, and action signals like a champion at your next study team meeting!

Nechama Katan (Working Group Leader)

Director, Data Science Lead
Pfizer

Christine Panetti

Sr. Associate Central Monitor Risk Based Monitoring
Pfizer

Jennifer Campbell

Clinical Data Analyst
CluePoints
12:15 PM

Break

12:15 PM - 12:45 PM EDT
12:45 PM

Group leaders recap discussion and working group outcomes

12:45 PM - 1:15 PM EDT
1:15 PM

Group Exercise – Data Analytics Team Exercise Part 2

1:15 PM - 2:15 PM EDT
In this group exercise, participants will be separated into teams competing to uncover the root cause(s) of issues described in a case study. Each team will be provided with a case study packet that includes a description of the organization and outsourcing vendors, protocol synopsis, data reports, questions to explore and a worksheet to record the discussion. Teams may ask facilitators for additional information as the need arises.

Exercise Wrap-up
Teams will compare results, discuss how they worked through the analysis, and reflect on lessons learned.

Keith Dorricott (Facilitator)

Director
Dorricott Metrics & Process Improvement Limited

Linda Sullivan, MBA (Facilitator)

Executive Director
WCG MCC
2:15 PM

Recap and Closing Comments

2:15 PM - 2:30 PM EDT

Linda Sullivan, MBA

Executive Director
WCG MCC
2:30 PM

Private time to view pre-recorded sessions and prep for live discussions

2:30 PM - 4:00 PM EDT
4:00 PM

Day 2 Adjourns

Day 3

Thursday, Sept. 10, 2020

9:00 AM

Private time to view pre-recorded sessions and prep for live discussions

9:00 AM - 10:00 AM EDT
10:00 AM

Opening comments

10:00 AM - 10:15 AM EDT

Linda Sullivan, MBA

Executive Director
WCG MCC
10:15 AM

Community Discussion Groups

10:15 AM - 11:45 AM EDT

  • Community Discussions use a virtual white board platform and polling functionality
  • Presenters of related presentations will participate in the discussion (rather than having individual presenter Q&A sessions)
  • A summary of the discussion is included in the Summit Report – which is received by all participants for no additional cost
  • To enhance the richness of the discussions, participants are requested to view the presentations related to topic prior to the discussion
Quality by Design: Protocol and Operational Complexity
How are organizations assessing protocol and operational complexity? How do we use the assessment information to drive change?
The Risk-Based Monitoring Training Gap
Discuss ways to assess and address training gaps
Artificial Intelligence & Machine Learning
People, Process, Tools – How and Why to Upskill Your SMEs
  • What AI/ML applications is your organization using to support/improve in clinical trial operations?
  • How do you transform current SMEs into people who can leverage and/or design data science solutions?

Nechama Katan (Discussion Leader)

Director-Data Science Lead
Pfizer

Working Groups

10:15 AM - 11:45 AM EDT

  • Working Groups are a roll-up-your-sleeves and build-a-solution group with the use of a virtual white board platform
  • The Work Group Leader will provide participants with ideas as a starting place for the Work Group to build on
  • To make best use of the available time, participants are requested:
    • to view presentations related to topic prior to the session
    • to complete survey questionnaire prior to the session
    • to gather information from their organization and be ready to share
  • A summary of the work group product is included in the Summit Report – which is received by all participants for no additional cost
Risk Re-Assessment
Develop a list of events & milestones that trigger re-assessment during conduct

Gary Tyson (Working Group Leader)

Partner
Pharma Initiatives
11:45 AM

Break

11:45 AM - 12:30 PM EDT
12:30 PM

Group leaders recap discussion and working group outcomes

12:30 PM - 1:00 PM EDT
1:00 PM

vSummit Concludes

About MCC

MCC leads the drug-development enterprise in the adoption and utilization of standardized metrics and benchmarks to drive performance improvement. Founded in 2006, MCC is the leading industry association dedicated to the development of standardized performance metrics to improve clinical trials. MCC provides a collaborative environment for biopharmaceutical and device sponsors, service providers, and sites to improve clinical-trial development through the use of MCC standardized performance metrics.


300 N. Washington St., Suite 200, Falls Church, VA 22046, USA
Phone: 317.622.0266

metricschampion.org