7 Common Data Integrity Points of Failure and How to Avoid Them

Reliable data is the foundation of your entire product offering, but ensuring its reliability is not always easy. Many life science companies rely on the ALCOA+ framework to ensure data integrity. It is the gold standard recognized by regulators around the world for maintaining compliance with data integrity regulations such as U.S. FDA 21 CFR Part 11.

Yet success in applying this framework requires clear oversight and regular process reviews. It also helps to pay close attention to the most frequent obstacles in the life science industry.

Below, we list seven of the most common points of failure around ALCOA+, based on FDA audit findings. While this list is by no means exhaustive, it provides a good starting point from which companies can audit their processes and strengthen their ALCOA+ framework.

  1. Inadequate access control

Computer system access control is the cornerstone of every data integrity program. FDA 21 CFR Part 11.10(d) sets clear requirements for limiting system access to authorized individuals. Chief among these is the stipulation that each system user have a unique login and password for accessing critical systems. This is an important protection for ensuring data is not deleted or otherwise manipulated, and for maintaining a clear trail of the individuals interacting with the data.

However, this requirement is repeatedly ignored, and the FDA catches it. As just one example, in March 2022 the FDA issued a Form 483 to a pharmaceutical manufacturer, citing it for not exercising appropriate controls over computers and related systems. By sharing a single user ID and password across the IT and computer system validation departments to access a tablet test system for in-process checks, the company exposed itself to the risk that unauthorized personnel could alter master production and control records. In this case, the offense was a repeat observation.

In some instances, the risk of inadequate access control is exacerbated by a lack of training and awareness among staff. It is more convenient, after all, to share access to a system than to log out so that someone else can log in. In other cases, outdated technology and legacy systems may not support individual logins at the scale the company needs.

To prevent shared access, companies must ensure they are using up-to-date technology. This is best supported by a structured Standard Operating Procedure (SOP) governing logical access control and by regular training sessions that raise awareness of login protection best practices.
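
To make the principle concrete, here is a minimal sketch of what individual attribution can look like at the application level. The account names, roles, and blocklist of generic accounts are illustrative assumptions, not a prescribed implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical blocklist of shared/generic account names; adapt to your systems.
GENERIC_ACCOUNTS = {"admin", "lab", "qc", "shared"}

@dataclass(frozen=True)
class User:
    user_id: str  # unique per individual, never shared
    role: str     # e.g. "analyst", "reviewer", "system_admin"

def authenticate(user_id: str, role: str) -> User:
    """Refuse shared or generic accounts so every action maps to one person."""
    if user_id.lower() in GENERIC_ACCOUNTS:
        raise PermissionError(f"Shared account '{user_id}' is not permitted")
    return User(user_id=user_id, role=role)

def record_action(user: User, action: str, record_id: str) -> dict:
    """Attribute every change to exactly one authenticated individual."""
    return {
        "user_id": user.user_id,
        "role": user.role,
        "action": action,
        "record_id": record_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```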

  2. Legacy systems

While sometimes costly, regular software and computer system updates provide a clear line of defense against data integrity issues. As the FDA points out in its Facts About the Current Good Manufacturing Practices (CGMPs), “Systems and equipment that may have been ‘top-of-the-line’ to prevent contamination, mix-ups, and errors 10 or 20 years ago may be less than adequate by today’s standards.” Consider also the data integrity guidance from the United Kingdom’s Medicines and Healthcare products Regulatory Agency (MHRA): where an appropriate software update or a compliant system does not yet exist, it suggests, organizations should document their search for a compliant solution, along with any mitigation measures temporarily supporting the continued use of the legacy system. Dated systems open companies up to the risk of non-compliance with ALCOA+ procedures.

In addition, proactive system updates prevent a costly and time-consuming response to a citation. It takes time and resources to identify a new system that best meets your needs, perform adequate qualification and validation testing, and seamlessly implement the chosen solution. Your timeline for this process is likely to be more forgiving than the FDA’s.

So how do you know it is time to update your system? In a 2020 warning letter, the FDA made clear that a company’s inability to troubleshoot software issues is a violation of good manufacturing practices. In this instance, the FDA found during a demonstration (Observation 2) that the computer operating a spectrophotometer was not secured, so data files could be deleted without the knowledge of the company’s quality unit. While this was problematic enough, the FDA also determined that the company “acknowledged that [their] software was not working as intended and [they] lacked the necessary knowledge or experience to troubleshoot the issue.”

As this example highlights, the clearest sign that systems are outdated is that the organization can no longer effectively meet all data integrity regulations. Another clear indication is that a software vendor has stopped providing support or updates. To stay on top of potential updates, talk with trusted vendors or look to conferences and trade publications that track changes in electronic compliance requirements and capabilities.
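
One lightweight way to stay ahead of end-of-support deadlines is to keep a machine-readable inventory of GxP-relevant systems and flag those nearing or past vendor support. A minimal sketch; the system names, dates, and inventory format are invented for illustration:

```python
from datetime import date, timedelta

# Illustrative inventory; names and end-of-support dates are invented examples.
SYSTEMS = [
    {"name": "hplc-workstation", "vendor_support_ends": date(2026, 6, 30)},
    {"name": "spectro-ctrl",     "vendor_support_ends": date(2024, 12, 31)},
]

def needs_attention(systems: list[dict], horizon_days: int = 365) -> list[dict]:
    """Flag systems whose vendor support has ended or ends within the horizon."""
    cutoff = date.today() + timedelta(days=horizon_days)
    return [s for s in systems if s["vendor_support_ends"] <= cutoff]
```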

  3. Incomplete datasets

A complete dataset is essential for reconstructing all associated GxP activities as needed. As a result, companies must retain full datasets around batch release decisions. However, the FDA finds all too often that companies have difficulty capturing the entire dataset for each batch release.

As one chemical company found out the hard way in December 2021, part of the difficulty is that batch release data often extends beyond a single system. As the Form 483 it received made clear, controls must be established to ensure the authenticity, integrity, and security of all computerized systems and raw data. As ALCOA+ principles indicate, this includes original electronic data, laboratory notebooks, and audit trails.

The FDA notes in its Q&A guidance on Data Integrity and Compliance with Drug CGMP that maintaining only result files, which are static datasets, may not be enough. Dynamic record formats, which allow interaction between the user and the record content, must be retained as well. “For example, a dynamic chromatographic record may allow the user to change the baseline and reprocess chromatographic data so that the resulting peaks may appear smaller or larger. It also may allow the user to modify formulas or entries in a spreadsheet used to compute test results or other information such as calculated yield,” the guidance explains.

To prevent such omissions, companies must account for this full chain of data and its storage as early as possible in the process design phase. Identifying early on which data will be considered part of the complete dataset helps determine risk-based measures for ensuring adequate review and retention.
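
One way to make “complete dataset” auditable is to define it explicitly in the system, as a manifest of required data categories per batch. A simplified sketch; the category names are assumptions drawn from the examples above (raw electronic data, result files, audit trails, notebook references):

```python
from dataclasses import dataclass, field

@dataclass
class BatchDataset:
    batch_id: str
    raw_data_files: list[str] = field(default_factory=list)   # dynamic records
    result_files: list[str] = field(default_factory=list)     # static summaries
    audit_trail_refs: list[str] = field(default_factory=list)
    notebook_refs: list[str] = field(default_factory=list)

REQUIRED_CATEGORIES = ("raw_data_files", "result_files",
                       "audit_trail_refs", "notebook_refs")

def completeness_gaps(dataset: BatchDataset) -> list[str]:
    """Return any required data category that is still empty;
    a batch should not be released while this list is non-empty."""
    return [name for name in REQUIRED_CATEGORIES if not getattr(dataset, name)]
```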

  4. Lack of audit trails and their review

As noted above, maintaining a complete dataset is essential. Yet in the case of audit trails, organizations must not only retain this data but also have a strategy for reviewing it.

It is easy to focus solely on meeting FDA data retention requirements while overlooking the value the retained data can provide. In fact, that was the mindset that earned one pharmaceutical laboratory a citation (Observation 5) when FDA investigators determined that neither the reviewer nor laboratory personnel knew how to access the audit trail they had saved. The laboratory had followed all the appropriate steps to retain records, including an audit trail checklist confirming that the audit trail was enabled and there was no risk of data deletion or modification. But without access to the audit trail, the lab lost most of the value of this requirement.

Audit trails provide useful data for driving process improvements and identifying deficiencies. Moreover, reviewing them may uncover unauthorized changes to data. It is for this reason that the ALCOA+ framework requires data to be available.

Organizations may overlook this step because of the overwhelming amount of data audit trails generate. Sifting through it all is pointless without a clear plan for regular review. Organizations must therefore first define a meaningful, risk-based strategy that determines how frequently audit trails are reviewed and what to look for during each review.
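
What a risk-based strategy can mean in practice is prioritization: rather than reading every entry, a review pass first surfaces the event types most likely to signal a problem. A toy sketch, assuming audit trail entries are available as dictionaries with an `action` field; the high-risk event types listed are illustrative, not exhaustive:

```python
# Illustrative set of event types a reviewer should see first.
HIGH_RISK_ACTIONS = {"delete", "modify_result", "reprocess", "disable_audit_trail"}

def triage_audit_trail(entries: list[dict]) -> list[dict]:
    """Surface the audit trail entries a reviewer should inspect first."""
    return [e for e in entries if e.get("action") in HIGH_RISK_ACTIONS]
```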

  5. Inadequate segregation of duties

The FDA is clear that the system administrator role, which holds the rights to alter files and settings, should be assigned only to personnel who are not responsible for the record content. This separation prevents an individual with a direct interest in the results from having the ability to modify or delete critical data.

This requirement protects the integrity of the data, but it also protects your laboratory from allegations regarding the integrity of your quality control processes. To mitigate this risk, organizations should perform a thorough risk assessment of their segregation of duties and maintain a list of authorized access privileges for each GxP-relevant computer system in use.
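
Expressed as code, the rule is simple: the right to administer a system and the responsibility for the record content it holds must not meet in one person. A minimal sketch with invented role names:

```python
def may_alter_system_settings(role: str, is_record_owner: bool) -> bool:
    """Grant administration rights only to users with no stake in the
    record content (role names are illustrative)."""
    return role == "system_admin" and not is_record_owner

def may_approve_result(role: str, authored_the_result: bool) -> bool:
    """A result may not be approved by the person who produced it."""
    return role == "reviewer" and not authored_the_result
```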

  6. Orphan and unreported data

Cherry-picking results is a tremendous concern when it comes to quality control, and a prime reason for putting ALCOA+ controls in place. The prevention of orphan and unreported data underscores that good data integrity processes do not exist solely for electronic records. These controls are essential to maintaining quality and should therefore also exist within your quality control management. Be aware that this applies throughout the complete product lifecycle, including pre-commercial stages such as research and development and clinical trials.

That was the reminder the FDA gave a pharmaceutical lab in a warning letter issued in 2020. As a result of the lack of laboratory oversight by the quality control unit (Observation 1), “several chromatographic injections of samples and standards associated with an out-of-specification (OOS) investigation were not included in your investigation, reviewed by your quality unit, and communicated to your client,” the warning states. For instance, the FDA determined that four samples tested OOS for assay. They were retested as part of the investigation, yet one of the samples was reinjected in duplicate under a separate series for assay later that day. The second dataset was not captured or documented, and the quality unit was unaware of the reinjection.

If test results are out of specification, it is critical that your team be able to document and report them without fear of repercussion. With an intact quality culture and proper data integrity procedures, laboratories can be confident that data will not be modified or deleted.
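
Orphan data can often be detected mechanically by reconciling what an instrument acquired against what was reported and reviewed. A simplified sketch, assuming injection identifiers can be collected from both the acquisition log and the reported results (the IDs below are invented):

```python
def find_orphan_injections(acquired_ids: set[str], reported_ids: set[str]) -> set[str]:
    """Injections present in the instrument's acquisition log but never
    reported to or reviewed by the quality unit are orphan data."""
    return acquired_ids - reported_ids

# Example: every ID returned here should trigger an investigation.
orphans = find_orphan_injections({"inj-001", "inj-002", "inj-003"},
                                 {"inj-001", "inj-003"})
print(orphans)  # {'inj-002'}
```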

  7. Inadequate third-party management

Limited control over industry third parties such as contract research and manufacturing organizations (CROs/CMOs) can be a direct path to disaster. These third parties deliver huge amounts of critical data to the pharmaceutical industry but remain a major blind spot in many companies’ data integrity processes. This is a critical problem, as the sponsor of any third-party research remains accountable for data integrity.

Several laboratories learned this lesson in 2021, when an FDA notification informed them that clinical and bioanalytical studies conducted by two CROs were not acceptable due to data integrity concerns. Thus began the lengthy process of contracting new organizations and repeating the affected bioequivalence and bioavailability studies before production could proceed.

Reliable control over suppliers and service providers is critically important, not only for data integrity but also for supply chain security and business continuity. To ensure such control, do not rely on vendor assurances alone: verify real-life data and put adequate contracts in place, including provisions that make regular vendor audits a condition of continued cooperation.

Data integrity guidance issued by the Pharmaceutical Inspection Co-operation Scheme (PIC/S) in 2021 provides a good overview of this topic. It advises companies to conduct regular risk reviews of their supply chain partners and other outsourced activities to evaluate the extent of data integrity controls required.
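
One way to operationalize that advice is a simple model that sets audit frequency from vendor criticality and audit history. The scoring below is a toy assumption meant only to illustrate the idea of risk-based review intervals, not a validated model:

```python
def audit_interval_months(criticality: int, open_findings: int) -> int:
    """Higher criticality (1=low .. 3=high) and more open findings
    shorten the interval between vendor audits (illustrative values)."""
    base = {1: 36, 2: 24, 3: 12}[criticality]
    return max(6, base - 6 * open_findings)
```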

Conclusion 

It is all too easy to make a mistake as simple as omitting a single form of documentation or forgetting to update GxP-relevant software. But these and the other mistakes listed above can be prevented with the right oversight and process controls.

Preventing these common data integrity points of failure becomes easier still with the right independent partner to assist you. This is an area where GxP-CC can help. To take the next step in strengthening your data integrity, contact us today.


