August 24, 2016
Just Because You Can Measure Everything, Doesn’t Mean You Should – Measure What Matters
Eli Alford, COO, Schulman IRB
About Schulman IRB
Established in 1983
Superior audit history with FDA: five consecutive audits with no findings
21 CFR Part 11 compliant electronic systems
Compliant with FDA, OHRP and Health Canada requirements
Full Board meetings five days a week
Dedicated daily expedited review of qualifying minimal risk protocols
Review outcome provided within one business day of new study review
One business day turnaround for complete new site submissions
Dedicated streamlined processes tailored to Phase I timelines
Expert oncology IRB members experienced in all phases of oncology research
National IRB for Cancer MoonShot 2020 initiative
Customized services for institutions
Experienced primary points of contact for sponsors, CROs, institutions and sites
Clinical Quality Assurance (CQA) and Human Research Protection (HRP) consulting services provided by Provision Research Compliance Services
www.provisionrcs.com
schulmanirb.com
About Today’s Presenter
Eli Alford, Chief Operating Officer, Schulman IRB
With Schulman since 2011
Provides strategic leadership for Schulman by establishing plans, policies and performance goals that support client needs and ensure protection of human research subjects
Retired Army colonel with over 25 years of service in combat arms leadership and research/analysis roles
Experience as a Contracting Officer’s Representative under federal acquisition regulations; has also project-managed federal contracts on the vendor side
Experience at a global CRO in clinical operations management, business improvement, proposals, and corporate development
Six Sigma Black Belt trained
BA, MS, former Harvard Fellow
What’s Your Comfort Level with Performance Metrics?
Guru
Expert
Capable and comfortable
Novice
Not sure about any of this
Today’s Thought Line
Think outside what you do
Think about what you’re measuring and why
Think about how to apply what we talk about to what you’re measuring
Disclaimers
Today’s discussion is not prescriptive
Today’s discussion is not specifically about Schulman or IRB metrics
None of these ideas are originally mine
Opinions presented are my own, not necessarily those of Schulman IRB
Sabermetrics: the search for objective knowledge about baseball
Do better hitters get fewer good pitches to hit?
Which is correlated with more wins: on-base percentage or slugging percentage?
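As a toy illustration of how the second question might be checked, here is a short pandas sketch; the team statistics below are made-up placeholders, not real season data:

```python
import pandas as pd

# Hypothetical team-season stats; real sabermetric work would use actual league data
teams = pd.DataFrame({
    "wins": [97, 88, 76, 69, 91, 83],
    "obp":  [0.345, 0.338, 0.321, 0.310, 0.341, 0.329],  # on-base percentage
    "slg":  [0.440, 0.425, 0.401, 0.395, 0.452, 0.410],  # slugging percentage
})

# Which batting statistic tracks wins more closely?
print(teams[["wins", "obp", "slg"]].corr()["wins"])
```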
A Wealth of Information Creates a Poverty of Attention
“In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” – Herbert Simon, 1971
The 7 Deadly Sins of Performance Measurement
1. Vanity: measures that make the organization, people, and especially managers look good
2. Provincialism: letting organizational boundaries and concerns dictate measurements
[Diagram: CRO/Sponsor/Site process → IRB process → CRO/Sponsor/Site process]
3. Narcissism: measuring from one’s own point of view, rather than the customer’s
4. Laziness: assuming one knows what’s important without giving it adequate thought or effort
5. Pettiness: measuring only a small component of what matters
6. Inanity: giving no thought to the consequences of metrics on human behavior and enterprise performance
7. Frivolity: not being serious about measurement in the first place
Source: Michael Hammer, “The 7 Deadly Sins of Performance Measurement [and How to Avoid Them],” MIT Sloan Management Review, Spring 2007, www.hammerandco.com/pdf/0407_SMR_7DeadlySinsPerfMeas.pdf
Chasing the Metric: The Cobra Effect
The British colonial governor in India thought there were too many cobras in Delhi and placed a bounty on them
The financial incentive resulted in cobra farming, which created a flood of cobra skins turned in to earn the bounty
The government decided the scheme was a bad idea and rescinded the bounty
Cobra farmers now had cobras and no market . . . so they released them
In the end, the cobra population in Delhi was bigger than before
http://freakonomics.com/podcast/the-cobra-effect-a-new-freakonomics-radio-podcast/
Defining Terms
Measure: An amount or degree of something (concrete and objective)
Metric: A derivative of a measure (e.g., site activation rate = # sites / time)
Key Performance Indicator: Measure of progress toward a desirable outcome
Leading Indicator: Signal of future events (yellow traffic light)
Lagging Indicator: Signal of past events (financial statement)
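To make these distinctions concrete, here is a minimal Python sketch; the counts, dates, and the activation_target threshold are hypothetical, used only for illustration. A measure is a raw amount, a metric is derived from measures, and a KPI compares a metric to a desired outcome.

```python
from datetime import date

# Measures: concrete, objective amounts (hypothetical values)
sites_activated = 12
study_start = date(2016, 5, 1)
as_of = date(2016, 8, 24)

# Metric: a derivative of measures, e.g. site activation rate (# sites / time)
weeks_elapsed = (as_of - study_start).days / 7
activation_rate = sites_activated / weeks_elapsed  # sites per week

# KPI: progress toward a desirable outcome (the target here is an assumed example)
activation_target = 1.0  # sites per week needed to stay on the enrollment plan
on_track = activation_rate >= activation_target

print(f"Activation rate: {activation_rate:.2f} sites/week; on track: {on_track}")
```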
Leading or Lagging?
In a clinical trial, is “site activation rate” a leading or lagging indicator?
Leading
Lagging
Both
Neither
Don’t know
Commonly Used Metrics
Cycle Time: Period required to complete a task
Example: start 17 Aug, end 24 Aug = 7 days . . . business or calendar days?
Timeliness: Was the milestone met?
Example: planned 20 Aug vs. actual 24 Aug = 4 days late
Efficiency: Ratio of useful work to money expended
Example: 1,664 billed hours vs. 2,080 paid hours = 80% utilization
Others: protocols per FTE, revenue per FTE
Quality: Degree of excellence
Example: 5,467 error-free items vs. 5,500 total items = 99.4% accuracy rate
Other: % of studies with amendments prior to first patient enrolled
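A small worked sketch of these calculations in Python, using the figures from the slide; NumPy's business-day calendar is just one way to handle the "business or calendar days?" question:

```python
import numpy as np

# Cycle time: 17 Aug to 24 Aug 2016 -- calendar vs. business days
calendar_days = (np.datetime64("2016-08-24") - np.datetime64("2016-08-17")).astype(int)
business_days = np.busday_count("2016-08-17", "2016-08-24")  # end date exclusive
print(calendar_days, "calendar days vs.", business_days, "business days")  # 7 vs. 5

# Timeliness: planned 20 Aug vs. actual 24 Aug
days_late = (np.datetime64("2016-08-24") - np.datetime64("2016-08-20")).astype(int)  # 4

# Efficiency: utilization = billed hours / paid hours
utilization = 1_664 / 2_080  # 0.80

# Quality: accuracy rate = error-free items / total items
accuracy = 5_467 / 5_500  # ~0.994
print(f"{days_late} days late, {utilization:.0%} utilization, {accuracy:.1%} accuracy")
```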
Apples and Oranges
[Slide graphic: a temperature reading in °F]
What does this tell you about Davis, the patient? Nothing, really. Davis is a dog.
What are the risks of combining metrics:
Oncology with Infectious Disease studies . . . for enrollment rate?
Phase I with Phase III studies . . . for timeliness?
U.S. central IRBs with local IRBs . . . for approval cycle time?
Our World
[Clinical trial process flow: Planning → Protocol & CRF Development → Regulatory and IRB/EC Approval → Trial Documents → Investigator Selection → Site Initiation → Recruitment → Treatment → Periodic Monitoring → Data Collection → Site Closeout → Query Resolution → Biostats → Clinical Study Report]
Where do you live on this planet?
What matters to you (and your customer)?
How do you know how well you’re doing what matters?
Drilling Down
[Study start-up flow: Country Selection, Site Feasibility & Selection, Essential Doc Collection, Site Contract, Central IRB/EC, Local IRB/EC(s), National Competent Authority, IP Release, IP Shipment, SIV, Study Start]
Team Accuracy Metrics, as of Jan 2015
My boss said, “Improvement! Keep up the great work!”
Team Accuracy Metrics, as of May 2015
Jul-14: 98.2%   Aug-14: 98.8%   Sep-14: 98.7%   Oct-14: 99.0%   Nov-14: 98.5%   Dec-14: 99.7%
Jan-15: 100.0%  Feb-15: 97.8%   Mar-15: 99.7%   Apr-15: 98.0%   May-15: 97.1%
“Still looks good, but can’t really see it. Re-do, please”
Team Accuracy Metrics, as of May 2015 (revised)
“This is unacceptable, Eli! What don’t you understand about 100%!”
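The original charts are not reproduced here, but the underlying issue is easy to recreate: an accuracy series hovering near 99% looks flat on a 0-100% axis and alarmingly volatile once the axis is zoomed to the data. A minimal matplotlib sketch using the May 2015 numbers above:

```python
import matplotlib.pyplot as plt

months = ["Jul-14", "Aug-14", "Sep-14", "Oct-14", "Nov-14", "Dec-14",
          "Jan-15", "Feb-15", "Mar-15", "Apr-15", "May-15"]
accuracy = [98.2, 98.8, 98.7, 99.0, 98.5, 99.7, 100.0, 97.8, 99.7, 98.0, 97.1]

fig, (full, zoomed) = plt.subplots(1, 2, figsize=(10, 3))

# Same data, two y-axis scales: the apparent story changes with the frame
for ax, ylim, title in [(full, (0, 100), "0-100% scale: looks flat"),
                        (zoomed, (97, 100), "Zoomed scale: looks volatile")]:
    ax.plot(months, accuracy, marker="o")
    ax.set_ylim(*ylim)
    ax.set_title(title)
    ax.tick_params(axis="x", labelrotation=45)

fig.tight_layout()
plt.show()
```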
To Save My Job, I Got Serious
What’s important? I decided: accuracy, speed, efficiency
What metrics will provide insight? I decided: monthly error rates, cycle time, volume per FTE
What might that tell me?
Errors: a KPI and a lagging indicator; may help point to improvement opportunities
Cycle time: delivery when the client wants it (fast!)
Volume per FTE: do I have enough people to do the work? Perhaps related to quality and speed?
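As a rough illustration of how those three metrics could be derived from raw work records, here is a minimal Python sketch; the WorkItem fields and the example values are hypothetical stand-ins, not Schulman's actual data model:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WorkItem:          # hypothetical record of one completed review item
    item_id: str
    had_error: bool
    submitted: datetime
    completed: datetime

def monthly_metrics(items: list[WorkItem], team_fte: float) -> dict:
    """Derive the three chosen metrics from one month of raw measures."""
    volume = len(items)
    errors = sum(item.had_error for item in items)
    cycle_hours = [(i.completed - i.submitted).total_seconds() / 3600 for i in items]
    return {
        "error_rate": errors / volume if volume else 0.0,            # quality (lagging)
        "mean_cycle_time_hrs": sum(cycle_hours) / volume if volume else 0.0,
        "volume_per_fte": volume / team_fte,                         # capacity signal
    }

# Example with two hypothetical items and a 9-person team
items = [
    WorkItem("A-001", False, datetime(2015, 6, 1, 9), datetime(2015, 6, 1, 14)),
    WorkItem("A-002", True,  datetime(2015, 6, 1, 10), datetime(2015, 6, 1, 16)),
]
print(monthly_metrics(items, team_fte=9))
```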
I Gathered All the Data I Thought Was Relevant

Month  | Accuracy Rate | Volume | Errors | Error Rate | FTEs | Volume/FTE | Mean Cycle Time (work hrs)
Jul-14 | 98.2%  | 221 | 4  | 1.8% | 10 | 22.1 | No data
Aug-14 | 98.8%  | 345 | 4  | 1.2% | 10 | 34.5 | No data
Sep-14 | 98.7%  | 225 | 3  | 1.3% | 10 | 22.5 | No data
Oct-14 | 99.0%  | 399 | 4  | 1.0% | 10 | 39.9 | No data
Nov-14 | 98.5%  | 325 | 5  | 1.5% | 10 | 32.5 | No data
Dec-14 | 99.7%  | 345 | 1  | 0.3% | 10 | 34.5 | No data
Jan-15 | 100.0% | 250 | 0  | 0.0% | 9  | 27.8 | No data
Feb-15 | 97.8%  | 625 | 14 | 2.2% | 8  | 78.1 | No data
Mar-15 | 96.8%  | 590 | 19 | 3.2% | 9  | 65.6 | No data
Apr-15 | 97.6%  | 589 | 14 | 2.4% | 9  | 65.4 | No data
May-15 | 97.1%  | 450 | 13 | 2.9% | 9  | 50.0 | No data
Jun-15 | 100.0% | 225 | 0  | 0.0% | 9  | 25.0 | 5.3
Jul-15 | 99.3%  | 276 | 2  | 0.7% | 9  | 30.7 | 5.9
Aug-15 | 99.6%  | 260 | 1  | 0.4% | 9  | 28.9 | 5.8
Sep-15 | 99.0%  | 286 | 3  | 1.0% | 8  | 35.8 | 5.7
Oct-15 | 97.0%  | 600 | 18 | 3.0% | 8  | 75.0 | 6.9
Nov-15 | 99.1%  | 465 | 4  | 0.9% | 9  | 51.7 | 5.8
Dec-15 | 97.4%  | 627 | 16 | 2.6% | 9  | 69.7 | 7.0
Jan-16 | 98.5%  | 478 | 7  | 1.5% | 9  | 53.1 | 5.9
Feb-16 | 99.7%  | 382 | 1  | 0.3% | 9  | 42.4 | 5.8
Mar-16 | 97.6%  | 630 | 15 | 2.4% | 9  | 70.0 | 6.9
Apr-16 | 98.1%  | 428 | 8  | 1.9% | 9  | 47.6 | 5.6
May-16 | 97.7%  | 352 | 8  | 2.3% | 9  | 39.1 | 5.7
Jun-16 | 98.8%  | 324 | 4  | 1.2% | 9  | 36.0 | 5.6
Jul-16 | 99.6%  | 282 | 1  | 0.4% | 9  | 31.3 | 5.4
I Plotted the Dots
[Scatter plot: error rate vs. volume per FTE, with points annotated by FTE count (8, 9, or 10) and question marks flagging points of interest]
More Dots
[Scatter plot: error rate vs. mean cycle time (work hours), with question marks flagging points of interest]
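A minimal sketch of how these scatter plots could be reproduced from the table above, using the months (Jun-15 onward) for which all three metrics are available; matplotlib is assumed:

```python
import matplotlib.pyplot as plt

# Monthly values from the table above, Jun-15 through Jul-16
volume_per_fte = [25.0, 30.7, 28.9, 35.8, 75.0, 51.7, 69.7, 53.1, 42.4, 70.0, 47.6, 39.1, 36.0, 31.3]
error_rate =     [0.0,  0.7,  0.4,  1.0,  3.0,  0.9,  2.6,  1.5,  0.3,  2.4,  1.9,  2.3,  1.2,  0.4]
cycle_time =     [5.3,  5.9,  5.8,  5.7,  6.9,  5.8,  7.0,  5.9,  5.8,  6.9,  5.6,  5.7,  5.6,  5.4]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.scatter(volume_per_fte, error_rate)    # "I plotted the dots"
ax1.set_xlabel("Volume per FTE")
ax1.set_ylabel("Error rate (%)")

ax2.scatter(cycle_time, error_rate)        # "More dots"
ax2.set_xlabel("Mean cycle time (work hours)")
ax2.set_ylabel("Error rate (%)")

fig.tight_layout()
plt.show()
```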
Summary
What is important to you?
Define success
What are you going to do with the measurement?
Define the terms
Plot the dots
Keep score over time
What does it tell you? What does it mean?
Refine and repeat
Back to the Beginning
What’s your comfort level with performance metrics?
Guru
Expert
Capable and comfortable
Novice
Not sure about any of this
Are the results a measure, metric or KPI?
What’s the difference between a guru and expert?
Did the order of the choices influence your thinking?
This was self-reported. Would an objective test have provided more accurate results?
Additional Reading and Resources
Davis Balestracci, Data Sanity Newsletter, http://davisdatasanity.com/
Example situation derived from “Don’t Ever Forget ‘Beginner’s Mind’,” August 15, 2016: http://archive.aweber.com/davisnewslettr/EA0jK/h/From_Davis_Balestracci_.htm
Nassim Taleb, Fooled by Randomness, http://www.fooledbyrandomness.com/
Metrics Champion Consortium, http://metricschampion.org/
Michael Lewis, Moneyball: The Art of Winning an Unfair Game, 2004
Contact Information
Eli Alford
[email protected]
919-287-4927
https://www.linkedin.com/in/elialford