Common DI points of failure: Incomplete Datasets
Accurate and complete data are the lifeblood of pharmaceutical laboratories: they are essential to ensure drug safety, efficacy, and quality. However, ensuring that data are complete is a major hurdle for pharmaceutical companies, which must comply with both 21 CFR Part 11 with its predicate rule 21 CFR 211.194 and the ALCOA+ principles. Incomplete data sets in pharmaceutical quality laboratories can have serious consequences. First, they undermine the reliability and validity of scientific findings. Second, inaccurate or missing data lead to faulty conclusions and incorrect drug release decisions, which can put patient safety at risk.
Real-life examples of incomplete data sets can be found in the Form 483 observation reports issued by the U.S. Food and Drug Administration (FDA). These reports list deficiencies found during company inspections, common errors that can seriously impact data quality. Here are some typical deficiencies from the FDA's Form 483 reports to watch out for.
1. Written records of stand-alone devices are not maintained
The FDA observed that the GCs and HPLCs in a company's quality control laboratory were stand-alone rather than server-based, and that written records, specifically equipment use logs, were not maintained. This makes it impossible to track which tests were performed, which in turn calls the quality of the drugs into question. Another challenge with stand-alone devices is verifying the original data, which requires a secondary record to ensure data security in the short term and a transition to centralized solutions in the long term. Short-term transitional measures include device logbooks and regular audits of the electronically generated data; checklists are often used to structure the audit trail review, the logbook review, and the application administrator plans. For more insight into long-term solutions, see the blog “Common DI points of failure: Legacy Systems”. That blog suggests, for example, master executor systems (e.g., LIMS or LES for lab systems) to offload records (e.g., audit trails, data) or to run devices electronically.
2. Insufficient reporting of laboratory test data
The FDA observed that data obtained from laboratory tests were not properly reported. The reported result of environmental monitoring samples showed fewer colonies than the analyst had observed on the plate, indicating possible deficiencies in quality control procedures or personnel training. Adequate processes should be detailed in an approved work instruction to create a controlled execution of the evaluation and release of analytical results. Data flow diagrams are often an essential tool for defining these processes, and regular training on the processes helps reduce human error. Historically, measurements labeled “test” have been misused to disguise failed (OOS, Out of Specification) measurements so that they could be repeated with new parameters, which is why this topic is a focus of the authorities. This suspicion arises whenever data are generated that cannot be precisely assigned to any measurement or other regular activity (“orphan data”).
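One practical safeguard against orphan data is a routine reconciliation of raw-data files against the registered tests. The sketch below is a hypothetical illustration; the file-naming convention and sample IDs are invented for this example, not taken from any specific chromatography system.

```python
# Hypothetical sketch: reconcile raw-data files against registered tests to
# flag "orphan data". The filename convention (sample ID before the first
# underscore) is an assumption for illustration.

def find_orphan_data(raw_files, registered_samples):
    """Return files whose sample ID does not match any registered test."""
    orphans = []
    for filename in raw_files:
        sample_id = filename.split("_")[0]  # e.g. "S-1023_run1.dat" -> "S-1023"
        if sample_id not in registered_samples:
            orphans.append(filename)
    return orphans

raw_files = ["S-1023_run1.dat", "S-1024_run1.dat", "TEST_run1.dat"]
registered_samples = {"S-1023", "S-1024"}
print(find_orphan_data(raw_files, registered_samples))  # flags ['TEST_run1.dat']
```

Any file this check flags would then need a documented explanation or a deviation investigation rather than quiet deletion.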
3. Lack of verification of sample test results
In another instance, the FDA observed that not all data required for the verification of results were provided or included in the review. In this case, paper printouts were used as raw data for analyses without reviewing the original electronic data and the associated audit trail. The actual conduct of the analysis can be tracked through the audit trails. Audit trails should be reviewed regularly according to their criticality and must also be available during inspections to ensure the traceability of laboratory processes. Only a complete review of all data can lead to unambiguous analytical results and, in the case of OOS and OOT results, support the follow-up investigation of the deviation.
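A periodic audit-trail review can be supported by a simple pre-filter that pulls out the entries needing a second-person check. This is a hypothetical sketch; the entry fields and action names are invented and would differ between chromatography data systems.

```python
# Hypothetical sketch of an audit-trail pre-filter. Action names and entry
# fields are assumptions for illustration, not taken from a real CDS.

REVIEW_ACTIONS = {"modify_result", "delete_result", "reintegrate_peak"}

def entries_needing_review(audit_trail):
    """Return audit-trail entries whose action warrants a reviewer's sign-off."""
    return [entry for entry in audit_trail if entry["action"] in REVIEW_ACTIONS]

audit_trail = [
    {"user": "analyst1", "action": "create_result", "record": "S-1023"},
    {"user": "analyst1", "action": "reintegrate_peak", "record": "S-1023"},
]
print(entries_needing_review(audit_trail))  # only the reintegration entry
```

Such a filter does not replace the documented review itself; it only helps focus the reviewer's attention on the critical entries.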
4. Missing HPLC data
Another common observation is that labs inadequately record data obtained during testing, which leads to insufficient traceability. For example, when a faulty peak integration occurred, changes to the processing method were not documented: the original version of the chromatogram was not saved, so the need for the changes could not be documented, and the second analysis overwrote the original evaluation. It was determined that the analyst did not adequately document that the initial integration of the peaks was incorrect. Data storage should be configured in the analysis software so that re-analyzing results cannot cause data loss.
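Configuring the software to version results rather than overwrite them prevents exactly this kind of loss. As a hypothetical sketch (not the behavior of any particular chromatography data system), re-analysis could append a new version while keeping the original:

```python
# Hypothetical sketch: store every evaluation of a chromatogram as a new
# version instead of overwriting the original. The data model is invented.

def save_evaluation(store, sample_id, evaluation):
    """Append an evaluation as a new version; never overwrite earlier ones."""
    versions = store.setdefault(sample_id, [])
    versions.append(evaluation)
    return len(versions)  # version number of the saved evaluation

store = {}
save_evaluation(store, "S-1023", {"peak_area": 104.2, "note": "initial integration"})
save_evaluation(store, "S-1023", {"peak_area": 101.7, "note": "re-integration, baseline corrected"})
print(len(store["S-1023"]))  # prints 2: both evaluations preserved
```

With both versions retained, the reviewer can see that a re-integration happened and judge whether it was justified.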
Further observations in quality control laboratories concerned missing documentation and evaluation of interrupted injections and sequences. Such gaps affect the traceability and trustworthiness of laboratory results. To report interrupted sequences without compromising data integrity, the laboratory can take several approaches to pre-emptively mark the data as incomplete. Many laboratories are not aware of features in chromatography software, such as “Verify incomplete Data”, that mark incomplete sets. Even without this digital feature, however, complete traceability can be achieved in the documented review process: all generated data must be used for the evaluation, or sequence interruptions must be justified and documented in detail in the evaluation. To ensure complete and transparent traceability, each affected chromatogram should be marked with “Data missing” or “Incomplete dataset”. Good documentation practice (GDocP) forms the basic building block of this documentation.
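Even a simple completeness check of expected versus acquired injections can produce the required “incomplete dataset” marking automatically. The sketch below is illustrative only; the injection identifiers and the marking text are assumptions, not tied to any vendor's software.

```python
# Hypothetical sketch: mark a sequence as an incomplete dataset when
# expected injections are missing from the acquired set.

def annotate_sequence(expected, acquired):
    """Compare planned and acquired injections and flag any gaps."""
    missing = [inj for inj in expected if inj not in acquired]
    return {
        "status": "complete" if not missing else "incomplete dataset",
        "missing_injections": missing,
    }

result = annotate_sequence(
    expected=["blank", "std1", "std2", "sample1"],
    acquired={"blank", "std1", "sample1"},
)
print(result)  # {'status': 'incomplete dataset', 'missing_injections': ['std2']}
```

The point of such a marking is that the gap is declared up front, so the reviewer evaluates the interruption instead of discovering it later.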
5. Incorrect distribution of administrator rights
The FDA also frequently observes cases where analysts who perform and review analyses have unrestricted rights to modify or delete data. This creates security vulnerabilities and opportunities for data manipulation. The FDA expects appropriate segregation of duties, especially in the laboratory environment; it is not acceptable for quality control personnel to hold administrator rights. This vulnerability can be closed by maintaining a thorough role-and-permission matrix in which roles are defined so that no single role has access to both data review and data editing.
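A role-and-permission matrix can be checked mechanically for segregation-of-duties conflicts. The sketch below is hypothetical; the permission names are invented for illustration.

```python
# Hypothetical sketch: detect roles that combine data review with rights to
# modify or delete data, violating segregation of duties.

MODIFY_RIGHTS = {"modify_data", "delete_data"}

def sod_violations(role_permissions):
    """Return roles holding both review rights and modification rights."""
    return [
        role
        for role, perms in role_permissions.items()
        if "review_data" in perms and perms & MODIFY_RIGHTS
    ]

roles = {
    "analyst": {"acquire_data", "process_data"},
    "reviewer": {"review_data"},
    "admin_reviewer": {"review_data", "delete_data"},  # conflicting role
}
print(sod_violations(roles))  # ['admin_reviewer']
```

Running such a check whenever the role matrix changes turns segregation of duties from a one-time design decision into a continuously verified control.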
6. Missing backup of electronically generated data
The FDA observed that a company performed a software update during which electronic raw data were moved, and at the end of the process the data at the destination were incomplete. The integrity of electronically generated data must be guaranteed throughout the retention period, and the data must be fully available at all times. When data storage changes, previously secured backups are essential to prevent data loss. Here too, the detailed process should be described in a work instruction.
The FDA has observed many pharmaceutical labs that failed to generate complete data or to ensure its quality and integrity. These observations show that data integrity is often lost not through malicious intent, but through simple mistakes in the setup of data storage or data access rights.
Incomplete data sets are one of the seven causes of common data integrity errors first discussed in the article “7 Common Data Integrity Points of Failure and How to Avoid Them”.
If you would like to learn more about data integrity, have some help in validating your CSV system to avoid data integrity concerns, or have your organization trained on how to avoid data integrity implementation errors, contact GxP-CC.
We are happy to support you.
Blog “Common DI points of failure: Legacy Systems”, published by Dr. Jennifer Roebber and Dr. Ulrich Köllisch on June 5, 2023.
Blog “7 Common Data Integrity Points of Failure and How to Avoid Them”, published by Dr. Elham Abdollahi-Mirzanagh, Dr. Ulrich Köllisch and Sarah Wittmar on September 15, 2022.