Data integrity remediation and cGMP facilities

Feature Articles | 01 November 2021 | By Loganathan Kumarasamy, MS, RAC, and Poorani Karunanidhi, MTech

Data integrity remediation is a critical activity performed by analytical laboratories, quality control laboratories, and manufacturing plants to comply with good manufacturing practice (GMP) and data integrity requirements. The activity poses various challenges, and current GMP (cGMP) facilities struggle to comply due to technical limitations and resource constraints. This article examines the data integrity remediation challenges faced by cGMP facilities and provides recommendations, illustrated with case studies, for overcoming them.
Data integrity refers to data being attributable, legible, contemporaneous, original, and accurate, commonly abbreviated as ALCOA.1 In recent years, the US Food and Drug Administration (FDA),2 the European Medicines Agency (EMA),3 the UK’s Medicines & Healthcare products Regulatory Agency (MHRA),4 and other agencies have identified several data integrity gaps with GMP across organizations in the industry. To help organizations close these gaps, the agencies have released guidance documents, which may raise the need for a data integrity assessment to identify gaps, followed by a data integrity remediation exercise. Most quality control and analytical laboratories are able to comply with these guidelines by upgrading computerized instrument applications and integrations and by using procedural controls, such as source data review and periodic review. However, manufacturing facilities may struggle to comply with these guidance requirements because their systems are outdated, their resources are limited, and/or the costs are high.
Data integrity remediation
In the past 5 years, the FDA has issued more than 100 warning letters to cGMP facilities about data integrity gaps and has mandated remediation exercises. In addition, the agency has recommended that remediation be conducted through independent cGMP consultants to identify the gaps. With the issuance of such warning letters and data integrity guidance from regulatory agencies, the remediation exercise across the pharmaceutical cGMP space has become more important. This exercise typically involves:
  • Reviewing the existing data process to understand the data flow between various systems.
  • Identifying gaps in the process flow where there are no controls on data modification/deletion or where the ALCOA principles are not met.
  • Creating a remediation plan to identify the steps to address the gaps.
  • Establishing technical controls to meet ALCOA principles by implementing software upgrades, changing the computer system configuration, or installing additional software controls.
  • Establishing additional procedural controls where technical controls cannot be established, along with controls for user training, access management, and the data review process.
  • Testing and validating to ensure the technical controls are implemented appropriately and meet the intended use.
  • Summarizing the data integrity remediation activities and establishing a monitoring or periodic review process to achieve ongoing data integrity compliance.
Challenges for data integrity remediation in cGMP facilities
When the need for data integrity remediation activity was originally conceptualized, companies began focusing on their analytical and quality control laboratories, which typically consist of instruments and computer applications such as electronic lab notebooks (ELN) and laboratory information management systems (LIMS). Although the remediation activities required substantial effort and cost, they were comparatively easier because most of these computer applications have inbuilt capabilities to read data from instruments, thereby reducing manual intervention and resulting in higher data integrity compliance. However, conducting the remediation activity in cGMP facilities presents a range of challenges because it typically involves standalone process equipment.5
Heterogeneity of systems
Typical cGMP facilities involve a variety of standalone equipment and systems from different vendors. Each vendor has its own approach to data acquisition, processing, and storage, so numerous custom integrations between applications are required to limit manual intervention during data transfer. Most of these systems also lack electronic transfer capabilities, thereby requiring manual printouts to be generated and attached to achieve the required compliance.
Outdated systems lacking technical advancements
Most of the systems that handle data from equipment, batch record systems, and manufacturing execution systems rely on old technologies and do not offer the latest capabilities found in ELN and LIMS systems, which restricts the technical controls that can be established for data integrity compliance. Some of these standalone equipment and systems are not even Part 11 compliant, lacking audit trail and individual user account capabilities, which creates a significant limitation on data integrity.5,6
Increased personnel responsibility
In smaller manufacturing facilities and pilot manufacturing facilities with a limited workforce, it is challenging to segregate duties and provide restricted access to users. Adding to this, the systems are not sophisticated enough to allow nonconflicting permissions that satisfy both business and data integrity requirements. Manual data entry must be supplemented with witnessing to meet data integrity needs, which also adds to the workload.1,6
Data flow
Because manufacturing processes typically include multiple equipment and technology platforms, data flow is generally complex. Primary records can easily be lost in the process, leaving the sequence of events impossible to trace completely.
How data integrity remediation can be approached for cGMP facilities
It will take several years for vendors to implement technical advancements to their existing systems that comply with data integrity requirements. In the meantime, cGMP facilities must focus on achieving data integrity compliance by overcoming the above challenges. A practical approach to remediate manufacturing systems is detailed below.
Perform data assessment
The first step to achieve data integrity compliance is to understand the data usage. Data from a manufacturing process may be classified as:
  • Regulated data ‒ Data that must be retained to demonstrate product quality.
  • Operational ‒ Data that is required by business areas to operate efficiently.
  • Nonregulated (other) ‒ Research and development data that is not required for regulatory filings or data that does not add any value or context to the process and is not required to reconstruct the process if needed.6
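The classification above can be captured in a simple data inventory structure. The following Python sketch is purely illustrative — the class names, fields, and classification rule are assumptions for demonstration, not part of any cited guidance:

```python
from dataclasses import dataclass
from enum import Enum

class DataClass(Enum):
    REGULATED = "regulated"        # must be retained to demonstrate product quality
    OPERATIONAL = "operational"    # needed by business areas to operate efficiently
    NONREGULATED = "nonregulated"  # no regulatory or reconstruction value

@dataclass
class DataElement:
    name: str
    source_system: str
    classification: DataClass
    retention_required: bool

def classify(name: str, source_system: str,
             linked_to_cqa_or_cpp: bool, business_critical: bool) -> DataElement:
    """Hypothetical rule: CQA/CPP-linked data is regulated; other
    business-critical data is operational; the rest is nonregulated."""
    if linked_to_cqa_or_cpp:
        cls = DataClass.REGULATED
    elif business_critical:
        cls = DataClass.OPERATIONAL
    else:
        cls = DataClass.NONREGULATED
    return DataElement(name, source_system, cls, cls is DataClass.REGULATED)
```

An inventory built this way makes the retention decision explicit and reviewable for each datum before the data flow is documented.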
To identify the regulated data, it is essential to identify the critical quality attributes (CQA) and critical process parameters (CPP) in a manufacturing process, as required in the International Council for Harmonisation (ICH) Q8 (R2).7 Control strategies should have been established in manufacturing facilities to ensure product quality based on the CQAs and CPPs. Regulated data includes control strategy records, validation records, monitoring records, and batch records, for example.6
Document data flow
To understand the current control strategy for regulated data, it is essential to document the data flow, capturing the entire data lifecycle for each datum under consideration. Interviews with the current operations team at this stage can help identify the key business requirements and assess the team’s interpretation of, and compliance with, existing standard operating procedures.
Assess functional risks
Identifying regulated data and mapping the current data flow helps in understanding the impact of the data. The next step is to evaluate the system recording the data and its functionalities to identify potential data integrity risks. Technical and/or procedural controls to mitigate data integrity risk must be identified depending on the risk probability and detectability.
Depending on the organization’s validation strategy, one of the following options may be used:
  • The data integrity requirements may be built into the requirement specifications, and the organization’s risk assessment strategy may be used to identify the data integrity controls, residual risks, mitigations, and supporting evidence.
  • A data integrity risk assessment questionnaire may be used to evaluate each system and document the data integrity controls, residual risks, mitigations and supporting evidence.
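As a sketch of how such a questionnaire might be scored, the fragment below multiplies risk probability and detectability into a simple risk priority number. The scales, thresholds, and outcome labels are illustrative assumptions; each organization would define its own as part of its validation strategy:

```python
# Illustrative ordinal scales: hard-to-detect gaps score higher on detectability.
PROBABILITY = {"low": 1, "medium": 2, "high": 3}
DETECTABILITY = {"high": 1, "medium": 2, "low": 3}

def risk_priority(probability: str, detectability: str) -> int:
    """Return a 1-9 risk priority number for one questionnaire finding."""
    return PROBABILITY[probability] * DETECTABILITY[detectability]

def control_strategy(rpn: int) -> str:
    """Map a risk priority number to a control decision (thresholds are assumed)."""
    if rpn >= 6:
        return "technical control required"
    if rpn >= 3:
        return "procedural control with planned technical remediation"
    return "accept with periodic review"
```

A likely, hard-to-detect gap (probability "high", detectability "low") scores 9 and would drive a technical control, while a low-probability, readily detected gap might be accepted with periodic review.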
Implement technical controls
The technical controls identified to mitigate the data integrity risks should be implemented in the system or supporting systems. Systems with less susceptibility to manual intervention are less prone to data integrity noncompliance. Procedural controls may be implemented as interim risk mitigations, with organizations demonstrating efforts to move to a long-term technical control strategy.
Some of the immediate remedies include:
  • Synchronizing system clocks throughout the manufacturing facility and restricting unauthorized changes to date and time.
  • Implementing individual logins to ensure attributability and disabling shared logins or generic accounts.
  • Protecting fixed configuration settings against unauthorized access through physical or logical controls. This reduces the amount of metadata that needs to be retained with the original data and reviewed.
  • Providing a user who must hold multiple roles with separate individual accounts, each carrying only the permissions needed for the respective job function.
  • Developing reports that are an accurate representation of data and include details about data processing.
  • Ensuring uninterrupted power supply to simple controllers or instruments that store data in volatile memory.
Validate technical controls
All technical data integrity controls must be validated. For existing systems, the controls may have been validated as part of the initial validation activity. Existing documentation should be reviewed and traced back to the functional risk assessments. For controls that have not been adequately validated, additional validation should be completed and traced back to the functional risk assessments. For new systems, it is recommended that data integrity requirements are tested in the initial validation of the system.
Implement procedural controls
For data integrity risks that do not have a feasible technical control, standard operating procedures should be available that provide clear instructions to help users avoid data integrity noncompliance.
Some aspects that require procedural controls include:
  • Ensuring that there are alternate means, such as change control, audit trail reviews, or reporting of alarm settings, to prevent or detect unauthorized changes when the alarm settings are modifiable by operators.
  • Ensuring that there is manual witnessing or automatic verification for manual activities and/or data entry.
  • Establishing attributability to the individual using the system when the system is incapable of individual logins.
Review residual risks
The functional risk assessment should be updated to document the current data integrity controls, supporting evidence, residual risks, and risk mitigations. The data flows should be updated with the technical and procedural controls.
Monitor risks
The data integrity controls should be monitored periodically to evaluate whether they continue to function as expected and whether the procedural controls effectively reduce the data integrity risks. At this stage, it is recommended to evaluate whether system upgrades may replace any procedural controls used for the system.

Case Study 1
Case statement. A cGMP manufacturing facility used standalone manufacturing equipment that was not equipped with a PLC/SCADA (programmable logic controller/supervisory control and data acquisition) system. Temperature was one of the process parameters and was measured every 10 milliseconds. The standalone equipment offered no timestamped audit trail and no data management, archival, or retrieval of records.
Risk evaluation. A risk-based approach was used to define what data must be archived, considering the data criticality, process dynamics, and the level of manual intervention allowed. First, the impact of the data on product quality or patient safety was determined. The process development team confirmed that a temperature exceeding 30℃ for more than a minute caused product degradation. Hence, this data was critical and had to be monitored to ensure it did not deviate from the specification. Next, based on how frequently the equipment measured temperature and how much the temperature could change within the measurement period, the minimum number of data points that had to be collected to ensure steady-state operation of the equipment was identified. In this case, the equipment measured temperature every 10 milliseconds; however, it took 10 seconds to observe a sizable fluctuation in temperature. Finally, the team identified that no manual intervention was required for the data collection process.
A similar risk-based approach was used for audit trail management. Because the system did not provide audit trail functionality, the risks were evaluated. The recorded temperature values and the corresponding timestamps were restricted from modification. However, all users had access to the system configuration settings that could affect the recorded temperature values, and changes to these settings were not tracked. Changes in user and access management were not tracked, and equipment usage information was not tracked.
Remediation solution. To mitigate the risks associated with managing and archiving data, temperature data collected every 10 seconds was archived into a historian. This integration was validated, and the historian trends were used to monitor any deviations during the data review process. Basic metadata, such as the timestamp and temperature unit, was transmitted alongside the data value, and metadata such as the batch number, equipment ID, and operator ID was traceable via the electronic logbook. In a fully automated system, it was acceptable to retain data captured at 10-second intervals based on the risk assessment. Alternatively, if a system allows unauthorized access to data or configuration, the initially acquired data (the 10-millisecond data) must be stored directly in a data archival system (historian) to ensure the data is not manipulated.
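The risk-based downsampling and excursion check in this case can be sketched as follows. The function names, parameter defaults, and strictly-greater-than comparisons are illustrative assumptions, not the facility’s actual implementation:

```python
def downsample(samples, period_ms=10, archive_interval_s=10):
    """Keep one reading per archive interval from a high-frequency stream.
    With 10 ms samples and a 10 s interval, one of every 1,000 readings is kept."""
    step = int(archive_interval_s * 1000 / period_ms)
    return samples[::step]

def find_excursions(archived, interval_s=10, limit_c=30.0, max_duration_s=60):
    """Flag archived readings where temperature has exceeded the limit
    for longer than the allowed duration (per the 30 C / 1 minute criterion)."""
    excursions, run = [], 0
    for i, temp_c in enumerate(archived):
        run = run + 1 if temp_c > limit_c else 0  # consecutive over-limit readings
        if run * interval_s > max_duration_s:
            excursions.append(i)
    return excursions
```

In the historian-based solution, a trend equivalent to `find_excursions` would be reviewed as part of the data review process rather than run ad hoc.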
To mitigate the risks associated with the lack of audit trail, the system configurations such as alarm range, calibration settings, sample rate, user and access management were restricted for operators. This ensured that the temperature data collected cannot be inadvertently or deliberately falsified. To establish attributability for the actions that must be tracked, an electronic logbook was maintained to record the user information, date and time of use and reason for use. The electronic logbook was reviewed as part of data review process.
Case Study 2
Case statement. A cGMP facility used a tablet tester to support in-process quality testing during tablet production. To operate the system, the operator used a shared login with operator permissions. The password was provided by the vendor and listed in the equipment user manual. The master recipe did not include the settings required to operate the equipment and the operator did not record/report the settings used in the production report. Upon completing the testing, the operator copied the raw data from the system memory to a CD manually and then deleted the raw data from the hard drive to free up space.
Risk evaluation. The tablet tester generated quality data that was used to accept or reject the tablets produced in a batch, so the equipment had to be operated correctly to generate reliable data. The shared login did not provide attributability to the user operating the equipment, and the password provided by the vendor at the time of installation was not changed prior to cGMP use, thereby enabling unauthorized access. The settings (metadata) used to generate the quality data were not recorded or reviewed. Data was manually archived to an external device and deleted without supervision or review, and the readability of archived data was not verified.
Remediation solution. To mitigate the risks associated with access and password management, individual logins were set up for each user with user-generated passwords. An administrative procedure was established for access management by the administrator.
To mitigate the risks associated with the completeness of data, the master recipe included the critical settings that must be used to control the equipment and the settings were reported to the production batch record and reviewed.
To mitigate the risks associated with data availability, simple programs that automated the data transfer from the source to the target location were validated and used. The permission for operators or supervisors to delete data was restricted. The system was configured to generate an alarm when the memory was full, and the equipment operation procedure detailed the measures to be taken by the operator upon encountering the alarm. A periodic monitoring procedure was established to monitor the automatic transfer of data and to periodically test the data retrieval process. The requirement for data compatibility tests on legacy data when systems are upgraded was included in the qualification procedure.
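A validated transfer program of the kind described above might, at minimum, copy the raw file to the archive and verify it byte-for-byte before any deletion is even considered. The Python sketch below is an assumption about how such a program could work — the case study does not specify the mechanism, and the function names and SHA-256 check are illustrative:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path):
    """Hash a file's contents so source and archive copy can be compared."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def archive_raw_data(source, target_dir):
    """Copy a raw-data file to the archive and verify the copy.
    Deletion of the source is deliberately NOT performed here; per the
    case study, delete permissions stay restricted and reviewed."""
    target = Path(target_dir) / Path(source).name
    shutil.copy2(source, target)  # preserves timestamps as well as content
    if sha256(source) != sha256(target):
        raise IOError(f"archive verification failed for {source}")
    return target
```

Separating the verified copy from the (restricted, reviewed) deletion step mirrors the remediated workflow, in which operators could no longer free up space by deleting raw data themselves.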
When conducting data integrity remediation for cGMP facilities, it is important to understand the risks and challenges and to identify an approach that balances the risks against the benefits. A traditional remediation approach, like that used for analytical and quality control laboratories, will not work for cGMP facilities because of the unique and varied systems in use. Organizations should not simply use procedural controls as the sole remediation measure; instead, they should identify technical opportunities that can improve efficiency as well as help them adhere to data integrity requirements.
CD, compact disc; cGMP, current good manufacturing practice; CPP, critical process parameter; CQA, critical quality attribute; ELN, electronic laboratory notebook; EMA [EU], European Medicines Agency; FDA, [US] Food and Drug Administration; ICH, International Council for Harmonisation; LIMS, laboratory information management system; MHRA, [UK] Medicines & Healthcare products Regulatory Agency; PLC, programmable logic controller; SCADA, supervisory control and data acquisition.
About the Authors
Loganathan Kumarasamy, MS, RAC, is head of scientific informatics, validation and compliance services at Zifo Technologies. He has more than 11 years of regulatory experience, with a focus on the pharmaceutical, biotech, and medical device industries. Kumarasamy has led several research and development and manufacturing projects and has first-hand experience in lab informatics, scientific informatics, quality assurance, computer system validation, instrument qualification, and data integrity. Kumarasamy has a master’s degree in regulatory affairs from Northeastern University and holds Regulatory Affairs Certification (US). He can be reached at
Poorani Karunanidhi, MTech, is a senior GxP subject matter expert/consultant at Zifo Technologies. She has worked as a regulatory compliance consultant with several pharmaceutical and biotech Industries for more than 5 years. Karunanidhi has led the data integrity initiatives of several cGMP facilities. Her areas of expertise include computer system validation, instrument qualification and data integrity. She has a master’s degree in industrial biotechnology from SASTRA University, India. Karunanidhi can be reached at
Citation Kumarasamy L, Karunanidhi P. Data integrity remediation and cGMP facilities. Regulatory Focus. 31 October 2021.
References with URLs were last accessed on 21 October 2021.
  1. Pharmaceutical Inspection Co-operation Scheme (PIC/S). Good practices for data management and integrity in regulated GMP/GDP environments. Dated 1 July 2021.
  2. Food and Drug Administration. Data integrity and compliance with drug cGMP: Questions and answers guidance for industry. Dated December 2018.
  3. European Medicines Agency. Guidance on good manufacturing practice and good distribution practice: Questions and answers. Last updated June 2019.
  4. Medicines & Healthcare products Regulatory Agency [UK]. ‘GXP’ data integrity guidance and definitions. Dated March 2018.
  5. International Society for Pharmaceutical Engineering. Data integrity: A vertical journey. Published January/February 2019.
  6. ISPE GAMP® RDI Good practice guide: Data integrity - Manufacturing records. Published May 2019.
  7. International Council for Harmonisation. ICH harmonised tripartite guideline: Pharmaceutical development, Q8(R2). Dated August 2009.


© 2022 Regulatory Affairs Professionals Society.
