Nine Pitfalls To Avoid In Data Integrity
Peter H. Calcott, D.Phil., president and CEO of Calcott Consulting LLC, wrote in the current issue of meddeviceonline:
1. Part 11 Compliance
Part 11 compliance has been around for about 20 years,1 but still today I see confusion in the industry. While most elements have been implemented successfully, in part attributable to the array of great software available in the marketplace, there are still areas where people stumble. I will illustrate two areas.
First, while almost all my clients purchase software in the marketplace, the systems fall into two types for the purpose of this point. Systems that reside in the cloud, where you access the application via the internet, are usually very robust. However, I have seen clients disable features that have a direct impact on data integrity (e.g., audit trails). On systems that reside on a PC at the site of use, I have witnessed several problematic incidents. These all concern the integrity of the clock used to date-stamp data. In these examples, the software used the operating system clock to date-stamp the data rather than a protected clock internal to the software. In each case, the system clock was not protected, and I could change the date and time with a click of the mouse, rendering the date stamp worthless. The simple fix was for the administrator to lock the computer setting so that analysts could not change it. Of course, the system administrator can still change it.
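Beyond locking the clock setting, the date-stamp problem can also be mitigated in the software itself. The sketch below is a hypothetical illustration (not taken from any particular chromatography data system, and not a validated implementation) of a tamper-evident audit trail: each entry carries a hash of the previous entry, so a retroactive change to any record or its date stamp breaks the chain and is detectable.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only audit trail; each entry includes a hash of the
    previous entry, so editing any earlier record or its date stamp
    invalidates every subsequent hash."""

    def __init__(self):
        self.entries = []

    def append(self, user, action, data):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
            "data": data,
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self):
        """Recompute every hash; return False if any record was altered."""
        prev_hash = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev_hash:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True
```

With a chain like this, backdating a record (e.g., by editing its timestamp after the fact) causes `verify()` to fail, even if the local computer clock itself was changed.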
Second, assignment of appropriate access based on job function is critical for meeting Part 11 compliance. At larger companies, there are at least three easily recognizable levels of access. I will use setting up access levels for a high-performance liquid chromatography (HPLC) system as an example. The most restrictive level is for staff who simply review data, either rejecting or approving the results; this is common for supervisor- or manager-level staff. The next level is assigned to analysts, who need to be able to set up runs, review data, and make adjustments to, for instance, integration parameters. And finally, there is the system administrator, who has complete access to the inner workings of the system. In smaller companies, these assignments are often blurred, with “super users” holding administrator rights even though they actually run samples and process data. Such analysts can access the file system and make major adjustments, which leads to situations where DI can be questioned. It is particularly important to assign access based on job function and to separate administrative functions from analyst roles; this tends to be a problem especially in smaller companies.
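The three levels described above amount to a role-to-permission mapping that the software should enforce. A minimal sketch, with hypothetical role and permission names (they are illustrative, not drawn from any particular system):

```python
# Hypothetical role-based access control for a chromatography data system.
# The key property: no analyst role carries administrative permissions.
ROLE_PERMISSIONS = {
    "reviewer": {"review_data", "approve_result", "reject_result"},
    "analyst": {"review_data", "set_up_run", "adjust_integration"},
    "administrator": {"review_data", "set_up_run", "adjust_integration",
                      "approve_result", "reject_result",
                      "manage_users", "edit_file_system", "set_clock"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant a permission only if the role explicitly includes it."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

A "super user" problem, in these terms, is an analyst who has been handed the `administrator` role; keeping the mapping explicit makes that misassignment easy to spot in a periodic access review.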
2. Integration Of HPLC Chromatograms
Ideally, software should be set up to run automatically and integrate correctly every time. In many analyses, however, integration is not perfect and requires post-analysis adjustments. If this is the case, it is paramount to incorporate into the method SOP a procedure to follow, with appropriate documentation, so that reintegration is performed in a reproducible, compliant, and documented manner. Without these controls and checks, the company is left open to DI questions.
3. Out Of Specification (OOS) Investigations
Even after the “Barr Case” of 1993,2 companies run into problems with how they conduct OOS investigations. There needs to be a robust SOP detailing how you proceed when a suspected OOS is encountered. Initially, before any investigation into product quality, there needs to be an assessment as to whether the method was run correctly in the laboratory. If an error can be demonstrated, the whole result or even the whole run can be nullified and the run or sample repeated. Once the run is shown valid, or at least cannot be nullified, then a product investigation can be considered. A detailed investigation plan, including repeat testing or retesting, must be drawn up and executed. Failure to follow a structured plan can create doubt about your DI status and conclusions.
4. Environmental Monitoring (EM) Data
Particularly in sterile or aseptic processing operations, many EM data are generated and analyzed. Obviously, the numbers of colonies on the plates and the results recorded on the test forms must correlate perfectly. So, when I routinely audit operations, I often review these forms. Even well-run operations will quite normally pick up counts on plates. If I see page upon page of zero colony forming units, my suspicions are triggered. If it looks too good to be true, it usually is.
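The "too good to be true" intuition can be made roughly quantitative. Assuming plate counts follow a Poisson distribution with some historical mean recovery rate (that model, the mean, and the threshold below are all assumptions for illustration, not a regulatory standard), the probability of an unbroken run of zero-CFU plates can be computed and implausible runs flagged for review:

```python
import math

def all_zero_probability(n_plates: int, mean_cfu: float) -> float:
    """P(every one of n_plates shows 0 CFU) under a Poisson(mean_cfu)
    model per plate: exp(-mean_cfu) ** n_plates."""
    return math.exp(-mean_cfu * n_plates)

def flag_suspicious(counts, mean_cfu=0.5, threshold=0.01):
    """Flag a data set of plate counts whose all-zero outcome is
    implausible given the site's historical mean recovery rate."""
    if any(c > 0 for c in counts):
        return False  # normal recovery observed; nothing to flag
    return all_zero_probability(len(counts), mean_cfu) < threshold
```

A short run of zeros is unremarkable, but fifty consecutive zero-CFU plates at a site that historically averages 0.5 CFU per plate has a vanishingly small probability and warrants a closer look at the raw plates and forms.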
5. Reports That De-emphasize Data That Compromises The Study – Validation And Investigations
As I indicated in part 1 of this article series, any discrepancy or deviation or anomalous data that is generated in an investigation or validation must be considered in the context of the end result of the report. Too many times, I have found results that might cast doubt on a conclusion are ignored and not discussed. Not all “failing” results or deviations will necessarily nullify the conclusion. In many cases, other tests can be performed to address the anomaly. At the end of the day, you want a report that can be read by others (including an inspector) that is correct and convincing.
6. Cherry-Picking Data
In many MHRA and FDA presentations and guidances,3-7 they have described cherry-picking of data. That is the tendency to keep testing until you get the result you “want” – usually a passing result. It often manifests in using unofficial databases to house data, running trial samples, using test samples to “calibrate” systems, and the list goes on. In the GMP world of validated methods, you get a chance to run a sample once according to the SOP. Only if you can prove there was a lab error can you justify nullifying the test and repeating it. While FDA warning letters are rife with incidents, I have found in auditing that it often happens in smaller companies transitioning from being solely a research company to a development company moving into clinical trial manufacturing. Often, the senior staff is research-trained and not familiar with GMP requirements. It is a hard transition to make in a career. I know because I made that transition a long time ago.
7. Making Data And Sampling Ports Accessible
Although I have never seen an example of this in my years of auditing, I am including it because the MHRA used this example in its 2018 guidance.6 By this, they mean that a sampler might be tempted to sample not from the correct port in, for instance, a WFI loop, but rather from another, more accessible one if the correct port is difficult to reach. So, we must be vigilant to ensure we do not install obstacles in the way of our staff getting their jobs done correctly.
8. QA-Issued Forms
Any blank form used in your operations (on the shop floor or in QC labs) must be a controlled form. That is, it must be QA issued (appropriately reviewed and approved) and issued with a unique identifier. QA needs to keep an inventory of the forms issued, when, and to whom. If the forms are available online and can be printed off by an operator, then that control is lost: a form can be filled in or destroyed with no record. I have found this is a difficult principle for some smaller companies to grasp. If you do not track issuance of your forms, you are open to questions about the integrity of your documentation.
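The issuance control described above is, at its core, an inventory keyed by unique identifiers. A minimal sketch of such a registry (the identifier format and record fields are hypothetical):

```python
import itertools
from datetime import date

class FormRegistry:
    """QA-controlled issuance log: every blank form receives a unique
    identifier, and the registry records when it was issued and to whom."""

    def __init__(self, prefix="FRM"):
        self.prefix = prefix
        self._counter = itertools.count(1)  # sequential, never reused
        self.issued = {}

    def issue(self, form_name: str, issued_to: str) -> str:
        """Issue one controlled copy and return its unique identifier."""
        form_id = f"{self.prefix}-{date.today().year}-{next(self._counter):05d}"
        self.issued[form_id] = {
            "form": form_name,
            "issued_to": issued_to,
            "issued_on": date.today().isoformat(),
            "returned": False,
        }
        return form_id

    def reconcile(self):
        """Identifiers issued but never returned -- the gaps an
        inspector will ask about."""
        return [fid for fid, rec in self.issued.items()
                if not rec["returned"]]
```

The value of the system is in reconciliation: because every identifier is accounted for, a form that was filled in and discarded leaves a visible gap rather than vanishing without trace.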
Please find the complete article here.
For further information, please get in touch with us: