What follows is more or less a transcript of my presentation at the annual meeting of the Pacific Region Chapter of the Society of Quality Assurance on 6 Nov 2014. The conclusion? A proposal that the same requirements for data integrity apply to GLP, GCP, and GMP data and that there is no regulatory basis for claiming data are “draft data” in GLP or GCP studies.
I believe we’ve stepped off the trail to data integrity and that we’ve been lost for a while now. I’d like to take this meeting as an opportunity to stop, take our bearings, and see if we can work toward agreement on where the trail to data integrity is. It’s my hope this presentation will help start a conversation here, which can be continued back home at your companies and with your service providers.
There was a quip earlier that Part 58 only requires data to be recorded in ink, so it should be interesting to see how the use of invisible ink could be a problem. And you're right – when you use invisible ink, there's usually a way to reveal what was written. In the case studies we'll discuss today, however, there is no way to reveal what was written – it's as if the data were never recorded at all.
Let’s start by stopping and taking a look at recent FDA enforcement in the GMPs. In addition to their efforts to protect us from unsanitary manufacturing conditions at compounding pharmacies, they’ve really upped their game looking for data integrity problems in the manufacturing processes for both active pharmaceutical ingredients and finished pharmaceuticals. We’ve seen a number of important data integrity Warning Letters over the past few years, and earlier this year, FDA issued a helpful set of FAQs on the GMPs.
For example, FDA has communicated that electronic chromatography records supporting GMP work must be maintained rather than relying exclusively on printed “copies” of chromatograms. Why aren’t printed copies sufficient? Because they typically don’t include attributes like “the injection sequence, instrument method, integration method, or the audit trail, all of which were used to create the chromatogram or are associated with its validity.”
What does this mean for us?
Let’s take our bearings
I’m not a GMP practitioner, so let’s come back to the GLPs and GCPs.
Some of you don’t have experience working outside of a quality or compliance group performing GLP or GCP regulated activities. I’m going to share a little history with you in the hopes it gives you some new insights into how we got where we are today.
When I started in industry back in 1990, I was a statistician and statistical programmer supporting GLP studies in general toxicology, reproductive toxicology and pharmacology. We followed well-written SOPs and validated all of our computer programs. Why? Because as 21 CFR Part 58 was interpreted, these activities were required.
Our QAU audited our department on a yearly basis; they audited our computer system validation deliverables for every system as we produced them; and they QC’d our work products in every study report right back to the raw data. Our QAU taught us the importance of working in a way that would give all our customers confidence in the conclusions of the reports we contributed to.
While none of us used the phrase “data integrity,” the way we worked helped ensure data integrity for the studies in non-human animals.
Our counterparts in the clinical statistics group, who worked with data from human subjects, did not have SOPs requiring system validation. Why not? As we understood the world at that time, there was no regulatory requirement to do so. Our QAU was not engaged with the clinical statistics group in any way.
What game changing regulatory action happened in 1997?
FDA promulgated 21 CFR Part 11: Electronic Records; Electronic Signatures.
As a consequence, I soon found myself project managing the retrospective evaluation of over 100 clinical systems. Validation was just starting to get talked about in clinical. The GLP QAU was restructuring and beginning to extend its reach into GCP.
Our initial efforts were met with resistance, anger, even derision. One PhD clinical statistician, when questioned by an auditor about a table in a study report shouted, “I’m a PhD statistician! I don’t make mistakes!” At a meeting between clinical management and the combined GLP/GCP audit function, a very senior manager on the clinical side called one of the auditors a Nazi. (There’s just nothing that stops a dialog better than calling one of the participants a Nazi.)
Fortunately, we’ve made progress in the following 17 years.
Nearly all sponsors have a clinical audit function that provides compliance advice to clinical teams and routinely performs audits of clinical investigator sites, internal processes, and service providers. However, for a number of complex reasons, we’re still wrestling with the meaning of data integrity and how to apply the Part 11 meaning of computer system validation, especially in clinical research.
Why do we validate computer systems?
Why do we bother? It’s not just because QA makes us. We validate computer systems to ensure they perform accurately, reliably, and consistently with their intended performance AND so we have “the ability to discern invalid or altered records” (21 CFR Part 11).
It’s hard for anyone to discern invalid or altered records when the recorder uses invisible ink!
Before we look at two case studies and how they give the impression of using invisible ink, let’s work toward agreement on the answer to the question:
Are there any regulatory provisions for draft GLP or draft GCP data?
Post-meeting note: the audience appeared to agree (with the possible exception of pathologists’ reports), that the concept of draft data does not exist in FDA regulations.
Case Study 1: Flow Cytometry
Flow cytometry has become an important tool to assess the cellular impact of biologics on clinical trial subjects and is often a key technology used to establish biomarkers.
For background, a flow cytometer is a lab instrument made up of 3 components: fluidics, optics, and an electronic computer system. The fluidics system streams individual cells suspended in fluid through a beam of light. As each cell passes through the light, it scatters light and fluoresces, creating light signals. The optics system detects the light signals and converts them to electronic signals. These are stored by the computer in a standard format, allowing the scientist to perform analyses.
Once the data are saved, they can be subset using a process called gating. Gating helps the scientist subset the data to focus the analysis on the cells of interest. Gates can be changed manually.
When scientists do change the gates, the changes may not be included in an audit trail, because
- The instrument was not designed with an audit trail
- The instrument has an audit trail, but the lab did not turn it on
- There is an audit trail, but for it to work, the gates must be saved first, and the scientist believes:
“I can change the gates as often as I like! It’s not data until I say it’s data!”
When this happens, the previous gate settings have been recorded in invisible ink. That is to say, the previous gate values haven’t been recorded at all.
FDA Warning Letters issued to manufacturers of active pharmaceutical ingredients and finished pharmaceuticals say that’s not ok. Why is it ok for a GLP or GCP study?
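To make the alternative concrete, here is a minimal sketch, in Python, of what an instrument could record each time a gate is changed, so that earlier settings remain discernible instead of vanishing. The names (`GateChange`, `GateAuditTrail`) and the vertex format are my own illustrations, not any vendor's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class GateChange:
    """One immutable audit-trail entry for a single gate edit."""
    timestamp: str
    user: str
    gate_name: str
    old_vertices: tuple  # previous gate boundary; None on first save
    new_vertices: tuple

@dataclass
class GateAuditTrail:
    """Append-only log: gate changes are recorded, never overwritten."""
    entries: list = field(default_factory=list)

    def record(self, user, gate_name, old_vertices, new_vertices):
        self.entries.append(GateChange(
            timestamp=datetime.now(timezone.utc).isoformat(),
            user=user,
            gate_name=gate_name,
            old_vertices=old_vertices,
            new_vertices=new_vertices,
        ))

# The scientist adjusts a lymphocyte gate twice; both settings survive.
trail = GateAuditTrail()
trail.record("jsmith", "lymphocytes", None,
             ((200, 50), (600, 50), (600, 400)))
trail.record("jsmith", "lymphocytes",
             ((200, 50), (600, 50), (600, 400)),
             ((220, 60), (580, 60), (580, 380)))
```

The key design choice is that the trail is append-only: anyone reviewing the record can discern that the gate was changed, by whom, and from what.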
Case Study 2: Electronic Data Capture
Electronic data capture (EDC) systems have become sponsors’ technology of choice for recording case report form data in clinical trials. The records in the EDC system form part of the subjects’ case histories, for which the investigator bears sole responsibility under 21 CFR Part 312. This puts investigators in the uncomfortable position of being responsible for records over which they have limited control once they have saved them.
I want to talk with you today about what can happen with these records before the investigator saves them.
In a well-meaning effort to make sure electronic Case Report Form (eCRF) data are as clean as possible as quickly as possible, some EDC systems are designed to allow queries to fire on data entered as soon as the computer cursor leaves the current data entry field and before the data on the eCRF are saved. Let’s look at what that means in practice, using 2 scenarios.
Let’s say a sponsor is developing a blood pressure medication designed to bring blood pressure in hypertensive patients down to a “normal” range of 130/80. The medical charts for Subject 1 show a blood pressure of 145/80 at Visit 5, after the Subject has been on study for 2 months.
Scenario 1:

- The Study Nurse enters 145 in the field for systolic blood pressure and tabs to the diastolic blood pressure field.
- A warning message pops up: “Systolic blood pressure is too high. Please correct.” (You might think the language used could be perceived as coercive, but that’s the topic of another talk.)
- The Study Nurse checks the medical record; leaves the 145 unchanged; closes the warning message; enters the diastolic blood pressure; and saves the eCRF.

Scenario 2:

- The Study Nurse enters 154 in the field for systolic blood pressure and tabs to the diastolic blood pressure field.
- A warning message pops up: “Systolic blood pressure is too high. Please correct.”
- The Study Nurse checks the medical record; realizes they entered the value incorrectly; changes the 154 to 145; and tabs to the diastolic blood pressure field.
- A warning message pops up: “Systolic blood pressure is too high. Please correct.”
- The Study Nurse closes the warning message, enters the diastolic blood pressure, and saves the eCRF.
The problem in each scenario is that the EDC system has been designed not to record the
- Original value entered in Scenario 2
- Fact that the query fired in either Scenario
- New value entered in Scenario 2
It’s all been done using invisible ink.
In my experience, this happens because IT has not thought through the data integrity implications, and it’s a generally held belief that
“It’s not data until the site saves it!”
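For contrast, here is a minimal sketch, in Python, of a save-then-query design: every save is written to an append-only audit trail, and edit checks fire only against saved values, so the original entry, the correction, and the fact that the query fired are all retained. The class and check names are hypothetical illustrations, not any EDC vendor's design.

```python
from datetime import datetime, timezone

def systolic_range_check(value):
    """Hypothetical edit check: flag saved systolic values above 140."""
    if value > 140:
        return "Systolic blood pressure is too high. Please verify."
    return None

class Ecrf:
    """Sketch of an eCRF page where queries fire only after a save."""
    def __init__(self):
        self.audit_trail = []  # append-only: (timestamp, user, field, old, new)
        self.queries = []      # queries raised against saved data
        self.values = {}

    def save(self, field_name, new_value, user):
        old_value = self.values.get(field_name)  # None on first save
        self.audit_trail.append((
            datetime.now(timezone.utc).isoformat(),
            user, field_name, old_value, new_value,
        ))
        self.values[field_name] = new_value
        # Edit checks run only on saved data, and their firing is recorded.
        if field_name == "systolic_bp":
            message = systolic_range_check(new_value)
            if message:
                self.queries.append((field_name, new_value, message))

# Scenario 2 replayed: the nurse saves a transcription error, then corrects it.
form = Ecrf()
form.save("systolic_bp", 154, "study_nurse")  # original value retained in trail
form.save("systolic_bp", 145, "study_nurse")  # correction; both query firings kept
```

With this design, the original 154, the corrected 145, and both query firings are all part of the record rather than being written in invisible ink.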
Having the discussion
Yesterday, John Bolling discussed with us what’s important to scientists about their data: quality, integrity and reproducibility. I can’t promise success, but I can tell you I’ve been successful most of the time when I walk a scientist through their process and demonstrate to them where they’ll have problems with quality, integrity, and reproducibility. John’s right: I nearly always see the light come on – the penny drops – they get it.
I just planted a seed and watched it germinate. Now, whether it’s germinated in a rocky place, among thorns, or in good soil is difficult to tell. So much of what happens next depends on senior management, the culture they establish, and the behaviors they reward and punish.
Senior management may have a different understanding of data quality, integrity and reproducibility than a bench scientist. To be successful, the conversation may have to sound different. Where have data integrity Warning Letters affected stock prices or the ability to do business? How have data integrity problems delayed submissions, resulting in lost revenue? When did the lack of data reproducibility cause an expensive merger to go south?
Where is the trail to data integrity?
In GLP studies, we work with animals, who cannot give their consent to participate, to help predict safe doses of experimental compounds to give to human subjects.
In GCP studies, we work with sick people (who could be considered vulnerable just on the basis of being sick) who do give their consent to participate in an experiment to help predict safe and efficacious doses to give to human patients in the future.
In the manufacturing process, people work with machines and ingredients to create a medicine that has been approved for use in human patients on the basis of the GLP and GCP data.
We are seeing very clear messages about data integrity in the manufacturing setting. It’s my hope that we can agree how to think about data integrity in GLP and GCP research, align it with the regulatory expectations for GMP data, and eliminate the use of invisible ink.
To that end, I make the following proposal, which I hope will start a conversation here and which can be continued when you get back home to your companies and with your service providers.
- The same requirements for data integrity apply to GLP, GCP, and GMP data.
- There is no regulatory basis for claiming data are draft data in GLP or GCP studies.
Two of the many implications of this proposal are that
- All changes to data, including metadata (like gate settings) should be recorded and maintained with the electronic record of interest.
- Queries prompting investigator site staff to change data on eCRFs should only fire after saving the data.
On the first day of the conference, speakers from the Northwest Association for Biomedical Research offered us encouragement: “QA provides a road map to change.” Another presenter reminded us that change is gradual.
I feel confident we can work together as QA professionals to bring all parties back to the trail to data integrity, whether they’re currently in a rocky place, or are surrounded by thorns.
Leave a comment below – I look forward to the conversation!