Computer System Validation
CSV is a comprehensive approach that ensures the reliability, accuracy, and integrity of
computer systems used in the pharmaceutical industry. It encompasses a series of activities,
processes, and documentation that collectively establish the validity and compliance of
computer systems. From research and development to manufacturing, distribution, and
beyond, CSV touches every facet of pharmaceutical operations.
Importance of CSV:
1. Regulatory Compliance: Regulatory agencies such as the U.S. Food and Drug Administration (FDA), the European Medicines Agency (EMA), and others require pharmaceutical companies to validate computer systems used in GxP ("good practice," e.g., GMP, GLP, GCP) environments.
2. Data Integrity: CSV ensures that the data generated and processed by computer systems
are accurate, reliable, and consistent, contributing to maintaining data integrity.
3. Patient Safety: Many computer systems in the pharmaceutical industry control critical
processes that directly impact patient safety. Ensuring the proper functioning of these systems
is vital to prevent errors that could lead to adverse events.
4. Risk Management: CSV helps identify and mitigate risks associated with computer
systems, ensuring that potential vulnerabilities are addressed before they impact product
quality.
5. Operational Efficiency: Validated computer systems are more likely to operate effectively,
minimizing downtime and disruptions in manufacturing processes.
CSV Process:
1. Planning and Strategy: Defining the scope, objectives, and resources required for CSV.
2. User Requirements Specification (URS): Capturing what the system must do from the user and regulatory perspective.
3. Functional Specification (FS): Describing how the system will meet user requirements.
4. Design Specification (DS): Detailing the technical design that implements the functional specification.
5. Installation Qualification (IQ): Verifying that the system is installed correctly in its intended environment.
6. Operational Qualification (OQ): Verifying that the system operates according to its specifications.
7. Performance Qualification (PQ): Ensuring that the system consistently performs within defined parameters.
8. User Acceptance Testing (UAT): Confirming that the system meets user requirements.
9. Risk Assessment: Identifying and addressing potential risks associated with the system.
10. Change Control: Managing changes to the validated system to ensure ongoing compliance.
I was asked to support younger pharmaceutical industry professionals with a CSV Q&A. You will find it below:
Q1: How will you start and decide on the validation approach? Explain everything you will do, from project start-up through release of the system for use.
A1: Here's a step-by-step approach for managing a Computer System Validation (CSV) project in the pharmaceutical industry:
1. Planning and Stakeholder Identification:
• Identify key stakeholders, including users, IT, quality assurance, and regulatory teams.
• Determine the validation approach based on system complexity, risk assessment, and regulatory requirements.
2. Requirements Gathering:
• Collect detailed user requirements, including functional, technical, security, and regulatory requirements.
3. Vendor Assessment:
• Conduct vendor audits to assess their quality systems and ability to meet regulatory requirements.
4. Risk Assessment:
• Identify potential risks associated with the system's intended use, data integrity, patient safety, and regulatory compliance.
• Perform a risk assessment to prioritize validation activities and determine the level of testing required.
5. Validation Planning:
• Develop a Validation Master Plan (VMP) outlining the overall validation strategy, scope, resources, and documentation requirements.
6. System Design:
• Design the system architecture, including hardware, software, interfaces, and data flows.
• Include security measures, data backup, and disaster recovery plans.
7. Testing:
• Conduct PQ tests to demonstrate that the system meets user requirements under realistic conditions.
• Involve end users in testing to ensure the system meets their requirements.
8. Maintenance and Continuous Improvement:
• Perform periodic reviews to ensure the system continues to meet regulatory requirements.
• Update documentation, perform periodic testing, and address any issues that arise.
• Use lessons learned from the project to improve future CSV projects and enhance the quality system.
Q2: What are Operational Qualification (OQ) and Performance Qualification (PQ), and what is the difference between them?
A2: In the pharmaceutical industry, Computer System Validation (CSV) is a critical process that ensures that computerized systems used for manufacturing, testing, and quality control comply with regulatory requirements and are fit for their intended use. CSV includes various stages of testing to verify and document the system's functionality and performance. Two important stages of CSV are Operational Qualification (OQ) and Performance Qualification (PQ). Let's delve into the definitions and differences between these two stages:
Operational Qualification (OQ):
OQ verifies that the system operates according to its functional and design specifications in its intended environment. Typical OQ activities include:
1. Functional Testing: Each system function is tested to ensure it operates as intended. This may include testing user interfaces, data entry, calculations, data retrieval, and reporting.
2. Performance Testing: OQ also includes testing the system's performance under normal operating conditions. This can involve assessing response times, transaction throughput, and data retrieval times.
3. Boundary Testing: Boundaries of the system's functionality are tested, including input limits, error conditions, and exceptions.
4. Security and Access Control Testing: OQ verifies that the system's security measures, such as user authentication and access controls, are functioning correctly.
5. Data Integrity Testing: Data integrity controls, including data entry, storage, retrieval, and audit trails, are validated.
6. Interface Testing: If the system interfaces with other systems or instruments, OQ verifies that the data exchange and integration are working as intended.
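As a sketch of what automated boundary testing might look like in an OQ script, here is a hypothetical weight-entry function; the input limits and error behavior are invented for illustration and do not come from any real system:

```python
# Hypothetical OQ boundary test. The accepted range and error handling
# are illustrative assumptions, not values from a real specification.

LOWER_LIMIT_KG = 0.5    # assumed lower input limit
UPPER_LIMIT_KG = 300.0  # assumed upper input limit

def record_patient_weight(weight_kg: float) -> float:
    """Accept a weight entry only if it falls within the configured limits."""
    if not (LOWER_LIMIT_KG <= weight_kg <= UPPER_LIMIT_KG):
        raise ValueError(
            f"weight {weight_kg} kg outside {LOWER_LIMIT_KG}-{UPPER_LIMIT_KG} kg"
        )
    return weight_kg

# Boundary testing exercises values at, just inside, and just outside the limits.
record_patient_weight(LOWER_LIMIT_KG)            # at lower boundary: accepted
record_patient_weight(UPPER_LIMIT_KG)            # at upper boundary: accepted
try:
    record_patient_weight(UPPER_LIMIT_KG + 0.1)  # just outside: rejected
except ValueError:
    print("out-of-range input correctly rejected")
```

In a real OQ protocol each such check would be a documented test case with expected results and recorded evidence rather than a bare script.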
Performance Qualification (PQ):
PQ demonstrates that the system performs consistently and reliably under real-world operating conditions. Typical PQ activities include:
1. Real-world Testing: The system is subjected to real-world scenarios, data, and inputs that simulate actual operational conditions.
2. Stability and Reliability: The system's stability and reliability are evaluated over an extended period to ensure it can consistently perform without errors.
3. Fail-over and Recovery Testing: If applicable, PQ tests the system's ability to recover from failures, data loss, or system crashes.
4. Batch Processing: If the system is used for batch processing, PQ ensures that batch records are accurately generated and processed.
5. Regulatory Compliance: PQ verifies that the system generates accurate and compliant records required by regulatory agencies.
The primary difference between OQ and PQ lies in the focus of testing. OQ primarily focuses
on verifying that the system's design specifications and functional requirements are met,
whereas PQ emphasizes demonstrating the system's consistent performance in real-world
conditions, ensuring it can reliably support production and quality processes. Both stages are
essential to ensure that the computerized system is validated and fit for its intended use in the
pharmaceutical industry.
Q3: How will you do the risk assessment during CSV validation in pharmaceutical industry?
A3: Performing a risk assessment during Computer System Validation (CSV) in the
pharmaceutical industry is crucial to identify, evaluate, and mitigate potential risks associated
with the computerized system and its impact on product quality, patient safety, and data
integrity. Here's a comprehensive approach to conducting a risk assessment during CSV:
1. Define Scope and Objectives:
Clearly define the scope of the risk assessment, including the computerized system, its functionalities, interfaces, and intended use. Set objectives for the risk assessment process.
2. Assemble a Cross-Functional Team:
Form a team of experts from various relevant disciplines, such as quality assurance, IT, compliance, regulatory affairs, process owners, and subject matter experts.
3. Hazard Identification:
Identify potential hazards and risks associated with the computerized system, its components, interfaces, data integrity, and impact on patient safety, product quality, and regulatory compliance.
4. Risk Identification:
Use tools such as brainstorming, process mapping, and Failure Modes and Effects Analysis
(FMEA) to systematically identify potential failure modes, vulnerabilities, and scenarios that
could result in harm.
5. Risk Assessment:
Assess the identified risks based on severity, likelihood, and detectability. Use a risk matrix
to categorize risks into low, medium, and high levels of risk.
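The severity, likelihood, and detectability scoring described above can be sketched as a small risk matrix. The 1-3 scales and class thresholds below are illustrative assumptions, not values prescribed by ICH Q9 or any regulation:

```python
# Illustrative risk-matrix sketch. The scales and thresholds are
# assumptions for demonstration, not regulatory values.

SEVERITY = {"negligible": 1, "moderate": 2, "critical": 3}
LIKELIHOOD = {"rare": 1, "occasional": 2, "frequent": 3}
DETECTABILITY = {"easy": 1, "moderate": 2, "hard": 3}  # harder to detect = riskier

def risk_class(severity: str, likelihood: str, detectability: str) -> str:
    """Combine the three factors into a low/medium/high risk class."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood] * DETECTABILITY[detectability]
    if score >= 12:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A critical, frequent, hard-to-detect failure mode lands in the high band
# and would be prioritized for mitigation and additional testing.
print(risk_class("critical", "frequent", "hard"))  # high
print(risk_class("negligible", "rare", "easy"))    # low
```

The high- and medium-class items are the ones that drive mitigation strategies and deeper testing.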
6. Risk Evaluation:
Evaluate the assessed risks to determine their significance and prioritize them for mitigation.
Focus on risks that have the potential to affect patient safety, product quality, data integrity,
and regulatory compliance.
7. Risk Mitigation:
Develop risk mitigation strategies for high and medium-risk scenarios. These strategies may include process changes, system enhancements, additional controls, procedural safeguards, or training.
8. Documentation:
Document the entire risk assessment process, including identified risks, their assessment, prioritization, and mitigation strategies. This documentation is crucial for regulatory compliance and audit purposes.
9. Implementation:
Implement the risk mitigation measures based on the strategies developed in the previous step. Ensure that necessary changes are made to the system, processes, or procedures.
10. Risk Assessment Report:
Summarize the findings of the risk assessment, including identified risks, assessment results, mitigation strategies, and their implementation status. This documentation serves as evidence of a systematic approach to risk management.
11. Regulatory Alignment:
Ensure that the risk assessment process aligns with relevant regulations and guidelines, such as ICH Q9 (Quality Risk Management), FDA's 21 CFR Part 11, and other industry-specific guidelines.
12. Continuous Improvement:
Use the insights gained from the risk assessment to improve the validation process, enhance system functionality, and strengthen overall quality management practices.
The risk assessment process is iterative and should be an integral part of the CSV lifecycle. It
ensures that potential risks are addressed proactively, leading to a robust and compliant
computerized system that supports patient safety, product quality, and regulatory compliance.
Q4: What kind of applications need computer system validation and why?
A4: Computer System Validation (CSV) is needed for applications that are used in regulated
industries, such as pharmaceuticals, medical devices, biotechnology, and other industries
where product quality, patient safety, and data integrity are critical. The primary purpose of
CSV is to ensure that computerized systems operate reliably, consistently, and in compliance
with regulatory requirements. Here are some examples of applications that require CSV and
the reasons why:
Laboratory Information Management Systems (LIMS):
LIMS are used to manage laboratory workflows, data, and sample tracking. They play a crucial role in maintaining data integrity and traceability in laboratories. CSV ensures accurate data recording, sample tracking, and adherence to testing and reporting procedures.
Electronic Document Management Systems (EDMS):
EDMS manage electronic documents, records, and workflows. They are vital for maintaining controlled and organized documentation, including SOPs, batch records, and regulatory submissions. CSV ensures that documents are securely stored, accessible, and under version control.
Quality Management Systems (QMS):
QMS manage quality-related processes, such as deviations, CAPAs, change controls, and audits. These systems are critical for maintaining compliance, identifying and resolving quality issues, and tracking corrective actions. CSV ensures that quality processes are consistent and well-documented.
Manufacturing Execution Systems (MES):
MES manage manufacturing processes, batch records, equipment, and personnel. These systems ensure that manufacturing operations are controlled, monitored, and compliant with GMP requirements. CSV helps prevent errors and discrepancies in batch records, ensuring product consistency and quality.
Clinical Trial Management Systems (CTMS):
CTMS manage clinical trial data, including patient enrollment, study protocols, and regulatory submissions. These systems help ensure the integrity and accuracy of clinical trial data, critical for regulatory submissions and patient safety. CSV safeguards the reliability of clinical trial data.
Pharmacovigilance Systems:
Pharmacovigilance systems manage adverse event reporting and safety surveillance for
pharmaceutical products. These systems are crucial for ensuring patient safety and regulatory
compliance. CSV ensures that adverse event data is accurately captured, assessed, and
reported.
Regulatory Information Management (RIM) Systems:
RIM systems manage regulatory submissions, approvals, and compliance information. These systems support the timely submission of regulatory documents and the maintenance of regulatory compliance. CSV helps ensure that regulatory information is accurate and up-to-date.
Process Control Systems:
Process control systems are used in manufacturing environments to monitor and control critical process parameters. In industries like pharmaceuticals and biotechnology, these systems ensure consistent product quality and adherence to GMP requirements. CSV safeguards the accuracy of process control data.
Data Analysis and Reporting Software:
Any software used for data analysis, reporting, and decision-making in regulated environments should undergo CSV. This includes statistical analysis tools, data visualization software, and reporting tools used to generate data-driven insights.
Overall, CSV is necessary for any application that handles critical data, supports regulatory
compliance, impacts patient safety, and contributes to product quality in regulated industries.
It ensures that these applications are developed, implemented, and maintained in a controlled
and documented manner to mitigate risks and maintain data integrity.
Q5: What are the phases in software development life cycle in pharmaceutical industry?
A5: In the pharmaceutical industry, the software development life cycle (SDLC) consists of
several phases that ensure the proper development, validation, and deployment of
computerized systems used in various processes. The SDLC phases in the pharmaceutical
industry typically include:
1. Requirements Gathering and Analysis:
In this phase, the requirements for the software system are gathered from stakeholders, users, and regulatory guidelines. These requirements are analyzed, documented, and translated into functional and non-functional specifications.
2. System Design:
During this phase, the detailed system design is created based on the requirements. This
includes designing the architecture, data flow, user interfaces, and interactions. Design
specifications are created, which will guide the actual development process.
3. Implementation (Coding):
The coding phase involves writing the actual software code based on the design specifications. Programming must follow industry standards and good coding practices to ensure maintainability, traceability, and ease of future modification.
4. Testing:
Testing is a critical phase in the SDLC. It includes unit testing, integration testing, system
testing, and user acceptance testing (UAT). The software is tested for functionality, accuracy,
performance, security, and compliance with requirements.
5. Validation:
This phase is specific to the pharmaceutical industry. The software undergoes validation to ensure that it meets regulatory requirements and is fit for its intended use. Validation includes verification (did we build it right?) and validation (did we build the right thing?). Documentation is generated to demonstrate compliance.
6. Deployment:
Once the software has passed validation and testing, it is deployed to the intended environment. Installation processes and procedures are followed to ensure that the software is correctly set up.
7. Operation and Maintenance:
After deployment, the software enters the operational phase. This involves ongoing monitoring, support, and maintenance. Regular updates, bug fixes, and enhancements are performed as needed.
8. Change Management:
Throughout the software's lifecycle, changes may be required due to user feedback, regulatory updates, or evolving business needs. A structured change management process ensures that any changes are documented, tested, and validated to maintain the software's integrity.
9. Retirement and Decommissioning:
At the end of its useful life, the software is retired and decommissioned. Data and information are archived, and any remaining regulatory requirements are fulfilled. This phase ensures the proper closure of the software's lifecycle.
Each phase of the SDLC plays a crucial role in ensuring that computerized systems used in
the pharmaceutical industry are developed, validated, and maintained in a controlled and
compliant manner. Regulatory agencies, such as the FDA, require adherence to these phases
to ensure the safety, efficacy, and integrity of products and processes.
Q6: What is V model, agile model and waterfall model and their differences?
A6: The V Model, Agile Model, and Waterfall Model are three distinct software development
methodologies, each with its own approach to managing the development process. Here's an
overview of each model and their key differences:
Waterfall Model:
The Waterfall Model is a linear and sequential approach to software development. It follows
a structured step-by-step process, where each phase must be completed before moving to the
next. The key phases in the Waterfall Model include requirements gathering, system design,
implementation, testing, deployment, and maintenance. This model is suited for projects with
well-defined and stable requirements, where changes are less likely to occur. However, it can
be rigid and less adaptive to changing requirements.
Agile Model:
The Agile Model is an iterative and incremental approach that focuses on collaboration,
flexibility, and customer feedback. It breaks the development process into small, manageable
iterations or sprints. Each iteration includes requirements gathering, design, coding, testing,
and delivery of a working increment of the software. Agile methods prioritize customer
satisfaction and embrace changing requirements even late in the development process.
Examples of Agile methodologies include Scrum, Kanban, and Extreme Programming (XP).
V Model:
The V Model, also known as the Verification and Validation Model, is an extension of the
Waterfall Model. It emphasizes the relationship between development phases and their
corresponding testing phases. The V Model involves a parallel development and testing
process. For every development phase, there is a corresponding testing phase, forming a "V"
shape. For example, the requirement phase is followed by the requirement verification phase,
design phase by design verification, and so on. This model ensures that testing and
verification are closely tied to each development step.
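The pairing the V Model describes can be written down as a simple mapping. The phase names below follow a common GAMP-style convention and may differ between organizations:

```python
# V-model pairing sketch: each specification phase on the left arm of the V
# is verified by a corresponding testing phase on the right arm. Phase names
# follow a common GAMP-style convention and vary between organizations.

V_MODEL_PAIRS = {
    "User Requirements Specification (URS)": "Performance Qualification (PQ)",
    "Functional Specification (FS)":         "Operational Qualification (OQ)",
    "Design Specification (DS)":             "Installation Qualification (IQ)",
}

def verifying_phase(spec_phase: str) -> str:
    """Return the testing phase that verifies a given specification phase."""
    return V_MODEL_PAIRS[spec_phase]

print(verifying_phase("Functional Specification (FS)"))
# Operational Qualification (OQ)
```

The mapping makes the model's core idea concrete: no specification exists without a named verification activity on the opposite arm of the "V".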
Differences:
1. Approach: Waterfall is linear and sequential; Agile is iterative and incremental; the V Model is sequential, with each development phase paired with a corresponding testing phase.
2. Flexibility: Waterfall is rigid and accommodates change poorly; Agile embraces changing requirements, even late in development; the V Model, like Waterfall, offers limited flexibility once a phase is complete.
3. Phases:
- Waterfall: Distinct sequential phases (requirements, design, implementation, testing, deployment, maintenance), each completed once.
- Agile: Iterations with phases like planning, designing, coding, and testing in each iteration.
- V Model: Development phases on the left arm of the "V", each matched by a verification or testing phase on the right arm.
4. Customer Involvement: Waterfall involves the customer mainly during requirements gathering and final delivery; Agile involves the customer continuously throughout development; the V Model involves the customer mainly at requirements definition and acceptance testing.
5. Documentation: Waterfall and the V Model rely on extensive upfront documentation; Agile favors lighter documentation and values working software.
6. Project Size:
- Waterfall: Best suited for small to medium-sized projects with stable requirements.
- Agile: Suitable for various project sizes, particularly beneficial for complex and evolving projects.
- V Model: Well-suited for projects with clearly defined requirements and significant testing needs.
Q7: How do you handle a discrepancy found during the CSV lifecycle?
A7: Handling a discrepancy in the Computer System Validation (CSV) lifecycle in the pharmaceutical industry requires a systematic approach to identify the root cause, assess the impact, and implement appropriate corrective and preventive actions. Here's a step-by-step process for handling a discrepancy in CSV validation:
1. Documentation of the Discrepancy:
- Document the nature of the discrepancy, including its description, location, and the stage of validation where it occurred.
- Capture all relevant details, such as date, time, personnel involved, and any observed deviations from expected behavior.
2. Immediate Containment:
- If the discrepancy poses an immediate risk to patient safety, product quality, or data
integrity, take necessary steps to contain the issue. This might involve stopping the affected
process or system.
3. Root Cause Analysis:
- Assemble a cross-functional team with expertise in validation, IT, quality assurance, and relevant business areas.
- Use tools like fishbone diagrams, 5 Whys, or Failure Mode and Effects Analysis (FMEA) to explore potential causes.
4. Impact Assessment:
- Evaluate the impact of the discrepancy on product quality, patient safety, data integrity, and
regulatory compliance.
- Determine whether the discrepancy has affected other related systems, processes, or data.
5. Corrective and Preventive Actions (CAPA):
- Develop a corrective action plan to address the immediate issue and prevent recurrence.
- Define steps to rectify the discrepancy and bring the system back into compliance.
- Identify preventive actions to mitigate the risk of similar discrepancies occurring in the future.
6. Change Control:
- Route the resulting corrective changes through the formal change control process, with documented assessment and approval.
7. Revalidation:
- Plan and execute the necessary revalidation activities, such as IQ, OQ, and PQ, as applicable.
8. Discrepancy Report:
- Prepare a discrepancy report that outlines the incident, investigation findings, root cause analysis, corrective actions, and preventive measures.
9. Regulatory Notifications:
- Assess whether the discrepancy triggers any reporting obligations and notify the relevant regulatory authorities as required.
10. Training and Communication:
- Provide training to personnel involved in the discrepancy and its resolution to prevent recurrence.
- Communicate the findings and actions to relevant teams to enhance awareness and prevent similar issues.
11. Monitoring and Follow-Up:
- Monitor the effectiveness of the corrective and preventive actions over time.
- Conduct periodic reviews to verify that the discrepancy has been effectively resolved and that the system remains in a compliant state.
12. Continuous Improvement:
- Use the lessons learned from the discrepancy to improve validation processes, procedures,
and documentation.
- Implement changes that can enhance the overall CSV process and prevent similar
discrepancies in the future.
Q8: What is change control in the CSV lifecycle?
A8: Change control in the Computer System Validation (CSV) lifecycle within the pharmaceutical industry refers to the systematic process of managing and documenting changes to computerized systems, software, hardware, or related components to ensure that these changes are implemented in a controlled and compliant manner. Change control is a fundamental aspect of maintaining the integrity, reliability, and regulatory compliance of computerized systems throughout their lifecycle.
1. Change Request:
- Any proposed change to a computerized system or its components starts with a formal change request. This request includes details such as the reason for the change, the scope of the change, and the potential impact on the system.
2. Impact Assessment:
- The team determines the potential impact of the change on system functionality, data integrity, regulatory compliance, and other critical factors.
3. Risk Assessment:
- A risk assessment is conducted to evaluate the potential risks associated with the change.
This assessment considers factors such as the criticality of the system, the nature of the
change, and potential impacts on patient safety, product quality, and data integrity.
4. Review and Approval:
- Based on the assessment and risk analysis, the change control board or relevant decision-making body evaluates the change request.
- The decision is made to approve, reject, or require further analysis for the proposed change.
5. Change Implementation:
- If the change is approved, an implementation plan is developed. This plan includes details
such as the timeline, responsible individuals, and necessary resources.
- The change is executed according to the plan, which may involve updating software,
modifying configurations, or making hardware adjustments.
6. Validation and Testing:
- Changes to computerized systems often require validation and testing to ensure that the changes do not adversely affect the system's functionality, data integrity, or compliance.
7. Documentation:
- All changes and associated activities are documented in change control records. These records capture the details of the change request, assessment, risk analysis, implementation plan, testing results, and any deviations or issues encountered.
- A change control report is prepared summarizing the entire change process, including the rationale, assessment outcomes, and validation results.
8. Verification and Approval:
- After implementation and testing, the changes are verified to ensure that they were successfully executed and have achieved the desired outcome.
- Verification and approval may involve a final review by the change control board to confirm that the change has been properly executed and meets the intended objectives.
9. Communication and Training:
- Stakeholders affected by the change, including system users, are informed about the implemented change and any relevant training required to adapt to the changes.
10. Post-Implementation Monitoring:
- A period of post-implementation monitoring ensures that the change has not introduced unintended consequences or issues.
- Ongoing monitoring also helps confirm that the system continues to operate as intended after the change.
Change control is a crucial process in maintaining the integrity of computerized systems and
ensuring that modifications are implemented in a controlled manner to minimize risks and
maintain compliance with regulatory requirements.
Q9: What is a Requirement Traceability Matrix (RTM), why is it required, and what are its contents?
A9: A Requirement Traceability Matrix (RTM) is a document that maps each requirement to the design elements, test cases, and validation evidence that address it, so that nothing is lost between specification and release.
Purpose of an RTM:
1. Ensuring Fulfillment: The primary purpose of an RTM is to ensure that all requirements, whether functional, technical, or regulatory, are met during the development, testing, and validation phases of a project.
4. Verification and Validation: The RTM aids in verification by confirming that each
requirement has been implemented as intended. It also assists in validation by showing that
the implemented system meets the intended business needs and user expectations.
Contents of an RTM:
1. Requirement ID: A unique identifier assigned to each requirement for easy reference and tracking.
4. Test Cases: The test cases or scenarios developed to verify and validate each requirement.
This includes details such as input data, expected outputs, and pass/fail criteria.
5. Validation Criteria: The criteria that will be used to determine if the requirement has been
successfully validated during user acceptance testing.
6. Status: The current status of each requirement, indicating whether it has been implemented,
tested, validated, or any other relevant stage.
7. Change History: Any changes made to the requirement, including modifications, updates,
and related decisions.
9. Traceability Links: Links to related documents, such as user stories, use cases, functional
specifications, design documents, and test cases.
10. Comments and Notes: Any additional comments, notes, or observations that provide
context or explanations for the requirement's status or implementation.
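In its simplest form, an RTM with the columns above can be represented as structured records, which also makes coverage gaps easy to detect mechanically. The requirement IDs, test cases, and statuses below are invented for illustration:

```python
# Illustrative RTM sketch: requirement IDs, test cases, and statuses are
# invented for the example, not taken from any real project.

rtm = [
    {"req_id": "URS-001", "description": "System shall require unique user login",
     "test_cases": ["OQ-TC-01"], "status": "validated"},
    {"req_id": "URS-002", "description": "System shall keep an audit trail of data changes",
     "test_cases": ["OQ-TC-07", "PQ-TC-02"], "status": "tested"},
    {"req_id": "URS-003", "description": "System shall export batch reports as PDF",
     "test_cases": [], "status": "implemented"},
]

def untraced(matrix):
    """Flag requirements that have no linked test case (a traceability gap)."""
    return [row["req_id"] for row in matrix if not row["test_cases"]]

print(untraced(rtm))  # ['URS-003']
```

A check like this is exactly the kind of gap analysis an RTM exists to support: every requirement without a linked test case is a validation hole to close before release.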
The RTM serves as a critical tool for project managers, business analysts, developers, testers,
and other team members to ensure the successful and accurate execution of the project's
requirements. It aids in maintaining transparency, consistency, and accountability throughout
the project lifecycle, ultimately contributing to the delivery of a high-quality end product that
meets the defined business needs and objectives.
Q10: How many environments need to be used for CSV validation of GMP-relevant software?
A10: For CSV (Computer System Validation) of a GMP (Good Manufacturing Practice)
relevant software, typically three environments are used: Development, Validation, and
Production environments. Each environment has a specific purpose in the validation process
and helps ensure the integrity and compliance of the software system.
1. Development Environment:
- Purpose: The development environment is where the software is created, programmed, and
configured by developers.
- Activities: Developers write and test code, build software components, and integrate
features.
- Characteristics: This environment is not intended for validation or testing; it's focused on
building and coding.
2. Validation Environment:
- Purpose: The validation environment is where the software is tested rigorously to ensure
that it meets the predefined requirements and regulatory standards.
- Activities: Testing, validation, and verification activities are performed here, including unit
testing, integration testing, system testing, and user acceptance testing (UAT).
- Characteristics: The validation environment closely mirrors the production environment and
should be set up to simulate real-world conditions.
3. Production Environment:
- Purpose: The production environment is the live environment where the validated software
is used for its intended purpose.
- Activities: The software is used by end-users to perform actual tasks and operations.
It's important to note that the three environments should be distinct and isolated from each
other to prevent any unintended interactions or risks to the validated system. The data and
configurations in these environments should also be consistent to ensure accurate testing and
validation results.
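That "mirrored but isolated" property can be sanity-checked programmatically. The hostnames, version strings, and checks below are assumptions for illustration, not features of any particular deployment tool:

```python
# Hypothetical environment configs: hostnames and versions are invented.
environments = {
    "development": {"host": "dev.example.internal",  "app_version": "2.4.0-dev"},
    "validation":  {"host": "val.example.internal",  "app_version": "2.3.1"},
    "production":  {"host": "prod.example.internal", "app_version": "2.3.1"},
}

def check_envs(envs):
    """Validation should mirror production's version while staying isolated."""
    issues = []
    if envs["validation"]["app_version"] != envs["production"]["app_version"]:
        issues.append("validation does not mirror production version")
    hosts = [e["host"] for e in envs.values()]
    if len(set(hosts)) != len(hosts):
        issues.append("environments share a host (not isolated)")
    return issues

print(check_envs(environments))  # [] -> mirrored and isolated
```

In practice such checks would cover far more configuration (databases, interfaces, user roles), but the principle is the same: verify the mirror, verify the isolation.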
In addition to these primary environments, some organizations may also have a Staging or
Pre-Production environment, which serves as an intermediate step between the Validation
and Production environments. This environment is used for final testing and validation before
software is deployed to the live Production environment.
The use of these environments helps ensure that the software system is thoroughly tested,
validated, and ready for production use, while also adhering to regulatory requirements and
GMP standards.
Q11: Describe the difference between 21 CFR Part 11 and Annex 11, and what sets Annex 11 apart from 21 CFR Part 11.
A11: 21 CFR Part 11 and Annex 11 are both regulatory guidelines that provide requirements
for electronic records and electronic signatures in the pharmaceutical industry. However, they
are associated with different regulatory bodies and cover slightly different aspects of
electronic record-keeping and compliance.
21 CFR Part 11:
- Regulatory Body: 21 CFR Part 11 is a regulation issued by the U.S. Food and Drug Administration (FDA).
- Scope: It specifically addresses electronic records and electronic signatures used in FDA-
regulated industries, including pharmaceuticals, biotechnology, and medical devices.
- Requirements: Part 11 outlines the criteria for ensuring the integrity, authenticity, and
reliability of electronic records and electronic signatures. It covers various aspects, such as
system validation, audit trails, electronic signatures, access controls, and security measures.
- Applicability: Part 11 is applicable to all FDA-regulated organizations that use electronic
records and electronic signatures in their processes.
Annex 11:
- Regulatory Body: Annex 11 is part of the EU GMP guidelines (EudraLex Volume 4), associated with the European Medicines Agency (EMA).
- Scope: It provides guidance on the use of computerized systems in the GMP (Good Manufacturing Practice) environment, with a focus on ensuring data integrity, reliability, and compliance in the pharmaceutical industry.
Key Differences:
1. Regulatory Source: The most significant difference is that 21 CFR Part 11 is a regulatory requirement issued by the FDA, while Annex 11 is a guideline provided by the EMA.
3. Scope: Part 11 covers a broader range of electronic records and signatures used in FDA-
regulated industries, while Annex 11 specifically addresses computerized systems in the
GMP environment.
5. Validation: Both guidelines emphasize the importance of system validation, but the details
and terminology may vary between the two.
6. Audit Trails: Both guidelines discuss the need for comprehensive and secure audit trails,
but there might be variations in the specifics of implementation.
7. Electronic Signatures: Both guidelines address electronic signatures, but the terminology
and requirements may differ.
8. Data Integrity: Annex 11 places a significant focus on data integrity throughout the entire
data lifecycle, including creation, modification, storage, retrieval, and archiving.
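As an illustration of one way to keep an audit trail tamper-evident across the data lifecycle, each entry can embed a hash of the previous entry. Neither Part 11 nor Annex 11 mandates this specific mechanism; it is a sketch of a common technique:

```python
import hashlib
import json

# Sketch of a hash-chained audit trail: each entry includes the hash of the
# previous entry, so any later modification breaks the chain. This is one
# illustrative technique, not a requirement of Part 11 or Annex 11.

def add_entry(trail, user, action):
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    payload = json.dumps({"user": user, "action": action, "prev": prev_hash},
                         sort_keys=True)
    trail.append({"user": user, "action": action, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return trail

def verify(trail):
    """Recompute every hash; return False if any entry was altered."""
    prev = "0" * 64
    for e in trail:
        payload = json.dumps({"user": e["user"], "action": e["action"],
                              "prev": prev}, sort_keys=True)
        if e["prev"] != prev or e["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = e["hash"]
    return True

trail = []
add_entry(trail, "analyst1", "created batch record B-001")
add_entry(trail, "qa1", "approved batch record B-001")
print(verify(trail))                               # True
trail[0]["action"] = "deleted batch record B-001"  # tampering
print(verify(trail))                               # False
```

The point of the sketch is the property both guidelines care about: modifications to recorded data must be detectable, not silently absorbed.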
It's important for pharmaceutical companies to be aware of the specific requirements of the
regulatory bodies in their respective regions and to ensure compliance with the relevant
guidelines, whether it's 21 CFR Part 11 in the U.S. or Annex 11 in the EU.
Q12: How do you decide if a software system is GxP relevant in pharmaceutical industry?
A12: Determining whether a software system is GxP (Good Practice) relevant in the
pharmaceutical industry involves assessing its impact on processes that affect product quality,
patient safety, and regulatory compliance. GxP encompasses various regulations and
guidelines, such as GMP (Good Manufacturing Practice), GLP (Good Laboratory Practice),
and GCP (Good Clinical Practice). Here's how you can decide if a software system is GxP
relevant:
1. System Functionality:
- Consider whether the software system directly or indirectly affects any GxP processes,
including manufacturing, testing, packaging, labeling, distribution, and quality control.
- Assess if the software system is involved in any decision-making, data capture, or reporting
that affects regulatory compliance, product quality, or patient safety.
2. Data Integrity:
- Examine if the software system handles critical data, such as raw data from laboratory
equipment, batch records, or electronic signatures.
- Determine whether the system maintains data integrity, audit trails, and electronic
signatures in accordance with GxP requirements.
3. Process Control:
- Evaluate whether the software system controls and automates critical manufacturing or
testing processes that impact product quality and safety.
- Consider whether the system ensures consistent and accurate execution of GxP processes.
4. Documentation and Records:
- Determine if the software system generates reports, documents, or records that need to
comply with GxP regulations and guidelines.
- Check if the system provides traceability and auditability for all relevant activities.
5. Regulatory Requirements:
- Check if the software system needs to support compliance with specific regulations, such as
21 CFR Part 11 or Annex 11.
6. Product Impact:
- Assess the impact of the software system on the final pharmaceutical product's quality,
safety, efficacy, or patient outcomes.
- Determine if the software system influences the batch release process, stability studies, or
other critical aspects of product quality assurance.
7. Patient Safety:
- Consider whether the software system plays a role in clinical trials, adverse event reporting,
patient data management, or pharmacovigilance activities.
8. Supplier Qualification:
- Evaluate the software system's vendor or supplier to ensure they follow GxP principles and
provide necessary documentation.
9. Risk Assessment:
- Conduct a risk assessment to identify potential risks associated with the software system's
usage in GxP processes.
- Determine the criticality of the software's impact on product quality, patient safety, and
regulatory compliance.
- Assess whether the software system requires validation activities to demonstrate its fitness
for intended use, data integrity, and regulatory compliance.
Ultimately, the decision to classify a software system as GxP relevant should involve cross-
functional collaboration between IT, quality assurance, regulatory affairs, and relevant
business units. It's crucial to thoroughly analyze the software system's functions, impact, and
compliance requirements to make an informed determination.
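The assessment criteria above can be condensed into a simple screening checklist. The following is an illustrative Python sketch, not a regulatory tool: the question wording and the `screen_gxp_relevance` helper are assumptions for demonstration only.

```python
# Hypothetical screening checklist: each question mirrors one criterion
# from the assessment above. Any "yes" answer flags the system for a
# formal GxP-relevance determination by QA and regulatory affairs.
GXP_SCREENING_QUESTIONS = [
    "Does the system support a GxP process (manufacturing, testing, QC)?",
    "Does it capture data or drive decisions affecting compliance or patient safety?",
    "Does it handle critical records (batch records, raw lab data, e-signatures)?",
    "Does it control or automate critical manufacturing or testing processes?",
    "Must it support compliance with 21 CFR Part 11 or EU GMP Annex 11?",
    "Does it influence batch release, stability studies, or product quality?",
]


def screen_gxp_relevance(answers):
    """Return (flagged, triggered_questions) for a dict of question -> bool."""
    triggered = [q for q in GXP_SCREENING_QUESTIONS if answers.get(q)]
    return bool(triggered), triggered
```

Note that a single positive answer does not classify the system by itself; it only triggers the cross-functional review described above.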
Q13: What are the CSV validation deliverables required for each category of software?
A13: Computer System Validation (CSV) deliverables can vary based on the categories of
software and their impact on GxP processes in the pharmaceutical industry. Here are the
typical CSV deliverables for different categories of software:
1. Custom and Configured Software (GAMP Category 4/5, highest GxP impact):
- User Requirements Specification (URS): Clearly defines the user's functional requirements
and expectations for the software.
- Functional Specification (FS): Detailed description of how the software will meet the
defined user requirements.
- Design Specification (DS): Detailed design of the software including architecture, data
flows, and user interfaces.
- Risk Assessment: Documented assessment of potential risks associated with the software's
usage, data integrity, and impact on GxP processes.
- Validation Plan: Outlines the approach, scope, responsibilities, and resources for the
validation process.
- Test Protocols (IQ, OQ, PQ): Detailed scripts for Installation Qualification, Operational
Qualification, and Performance Qualification testing.
- Traceability Matrix: Links requirements to tests and verifies that each requirement is
adequately tested and documented.
- Validation Summary Report: Summarizes the validation process, results, deviations, and
overall compliance status.
- User Training Documentation: Describes how users should interact with and operate the
software to ensure data integrity and compliance.
- Change Control Documentation: Tracks any changes made to the software and their impact
on validation status.
2. Non-Configured Off-the-Shelf Software (moderate GxP impact):
- Risk Assessment: A simplified assessment of potential risks associated with the software's
usage.
- Validation Plan: A streamlined validation plan focusing on critical aspects of the software.
- User Training Documentation: Basic user training materials to ensure accurate usage.
- Change Control Documentation: Tracks significant changes that impact the software's
validation status.
3. Standard and Infrastructure Software (low GxP impact):
- Risk Assessment: An assessment of potential risks considering the software's intended use.
- Validation Plan: An overview of how the software will be validated and integrated into GxP
processes.
- Installation Qualification (IQ) Record: Documentation confirming successful installation of
the software.
- Functional Testing Records: Basic functional testing results ensuring the software works as
intended.
The level of detail and complexity of these deliverables may vary based on factors such as the
software's impact on GxP processes, criticality, complexity, and regulatory requirements. It's
essential to align the validation approach and deliverables with the software's intended use
and the associated risks. Cross-functional collaboration between IT, quality assurance, and
business units is crucial to ensure that the appropriate level of validation is conducted and
documented for each category of software.
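The traceability matrix listed among the deliverables can be checked mechanically. The sketch below assumes a simple in-memory representation (requirement IDs mapped to test results); real projects typically manage this in a validated tool or a controlled spreadsheet, and the helper name is illustrative.

```python
def traceability_gaps(requirement_ids, test_results):
    """Return requirement IDs not covered by any passed test.

    requirement_ids: iterable of IDs such as "URS-001".
    test_results: dict mapping test ID -> (set of covered requirement IDs,
                  passed: bool).
    """
    covered = set()
    for covered_ids, passed in test_results.values():
        if passed:
            covered.update(covered_ids)
    return sorted(set(requirement_ids) - covered)
```

A requirement linked only to a failed test still appears as a gap, which matches the intent of the matrix: every requirement must be adequately tested and documented before the Validation Summary Report can be approved.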
Q14: What are the 5 additional requirements for open systems compared to closed
systems?
A14: In the context of computer systems and software validation in the pharmaceutical
industry, "open systems" and "closed systems" refer to different types of software
environments:
Closed System:
A closed system is an environment in which system access is controlled by the persons
responsible for the content of the electronic records on the system (the 21 CFR Part 11
definition). Because access and data remain under internal control, closed systems present a
smaller attack surface and a simpler validation scope.
Open System:
An open system, on the other hand, is a software application or environment that is designed
to interact with external systems and can exchange information or data with other systems.
Open systems are more flexible and versatile, allowing for integration with third-party
software, data sharing, and interoperability. However, the increased connectivity of open
systems can introduce additional complexity and potential risks.
When it comes to computer systems validation in the pharmaceutical industry, there are
additional requirements for open systems compared to closed systems:
1. Data Integrity:
Open systems need more robust data integrity controls to ensure the accuracy, completeness,
and reliability of data exchanged between systems.
3. Interoperability:
Open systems must be designed to seamlessly communicate and integrate with other systems,
requiring compatibility testing and validation.
4. Change Management:
Open systems may undergo frequent changes due to evolving external systems or data
sources. A robust change management process is needed to control updates and assess their
impact.
5. Risk Assessment:
Open systems introduce additional risks related to data integrity, security breaches, and
interoperability issues. A thorough risk assessment and mitigation strategy are essential to
address these risks.
Overall, open systems provide increased flexibility and potential benefits through their ability
to interact with other systems, but they also require more comprehensive validation efforts
and stringent controls to ensure data integrity, security, and compliance with regulatory
requirements. It's important for pharmaceutical companies to carefully assess the nature of
the system (open or closed) and tailor their validation approach and requirements
accordingly.
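One common data-integrity control for open systems is to transmit a cryptographic checksum alongside each payload so the receiving system can detect corruption or tampering in transit. A minimal Python sketch using SHA-256 (the function names are illustrative assumptions):

```python
import hashlib


def sha256_digest(payload: bytes) -> str:
    """Compute a SHA-256 digest of the payload as a hex string."""
    return hashlib.sha256(payload).hexdigest()


def verify_transfer(payload: bytes, expected_digest: str) -> bool:
    """Confirm data received from an external system arrived unmodified."""
    return sha256_digest(payload) == expected_digest
```

A checksum detects accidental or malicious alteration but does not prove who sent the data; open systems that need authenticity as well typically add digital signatures and encryption on top of integrity checks.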
Q15: How are document approval and execution performed in the CSV lifecycle in the
pharmaceutical industry?
A15: In the computer system validation (CSV) lifecycle in the pharmaceutical industry,
document approval and execution play crucial roles in ensuring that the validation process is
conducted effectively and in compliance with regulatory requirements. Here's how document
approval and execution are typically managed:
Document Approval:
4. Document Version Control: Documents are often assigned version numbers or revisions to
track changes and updates. Any changes made to the documents are documented and
reviewed to ensure they do not compromise the integrity of the validation process.
Document Execution:
Execution refers to the process of conducting the actual validation activities as outlined in the
approved validation documents. This phase ensures that the system or software meets its
intended specifications and functions as expected. The steps involved in execution include:
1. Test Script Execution: Following the approved test scripts, the validation team executes
various tests on the system or software. These tests may include functionality tests,
performance tests, security tests, and more.
2. Data Collection: During test script execution, data is collected to document the outcomes
of the tests. Data collected includes observations, measurements, screenshots, and any
deviations encountered.
3. Issue Identification: If any issues or deviations are identified during test script execution,
they are documented and reported. These issues may include system failures, unexpected
behaviors, or discrepancies from expected outcomes.
4. Issue Resolution: Any identified issues are investigated and resolved. The resolution
process involves determining the root cause of the issue and implementing corrective and
preventive actions to address it.
5. Review and Approval: The results of test script execution, including data collected and
issue resolution, are reviewed by validation team members and QA personnel. Once the
results are deemed satisfactory and compliant, they are approved.
6. Documentation: The outcomes of test script execution, including any issues encountered
and their resolutions, are documented in validation reports. These reports provide a
comprehensive overview of the validation activities conducted.
Both document approval and execution ensure that the validation process is thorough,
accurate, and compliant with regulatory requirements. These processes contribute to the
overall quality and reliability of computer systems used in the pharmaceutical industry.
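Steps 1 to 3 of the execution phase, running approved test steps, collecting results, and flagging deviations, can be illustrated with a small Python sketch. The record fields (`executed_by`, timestamps) mirror typical good documentation practice; the structure shown is an assumption, not a prescribed format.

```python
from datetime import datetime, timezone


def execute_test_script(steps, executed_by):
    """Run (step_id, check_fn, expected) steps; return (results, deviations).

    Each result records the expected vs observed outcome with an executor
    and timestamp, mirroring the evidence collected during execution.
    """
    results, deviations = [], []
    for step_id, check_fn, expected in steps:
        observed = check_fn()
        record = {
            "step": step_id,
            "expected": expected,
            "observed": observed,
            "passed": observed == expected,
            "executed_by": executed_by,
            "executed_at": datetime.now(timezone.utc).isoformat(),
        }
        results.append(record)
        if not record["passed"]:
            deviations.append(record)  # routed into deviation handling
    return results, deviations
```

Every step produces a record whether it passes or fails; only failing steps additionally enter the deviation log, which then feeds the issue identification and resolution steps described above.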
Q16: What is Computer System Assurance (CSA)?
A16: Computer System Assurance (CSA) refers to the comprehensive and systematic
approach taken by organizations to ensure the reliability, security, and compliance of
computer systems and software applications. CSA encompasses a wide range of activities
aimed at providing confidence in the proper functioning and integrity of computer systems
used in various industries, including pharmaceuticals, healthcare, finance, and more. It goes
beyond traditional validation processes and focuses on the ongoing assurance of the entire
computerized environment.
1. Risk Management: Identifying, assessing, and managing risks associated with computer
systems to ensure data integrity, patient safety, and compliance with regulatory requirements.
3. Lifecycle Management: Addressing the full lifecycle of computer systems, from planning
and design to operation, maintenance, and retirement, while ensuring compliance with
relevant regulations and guidelines.
5. Data Integrity: Ensuring the accuracy, consistency, and reliability of data generated and
processed by computer systems, including preventing unauthorized changes, deletions, or
tampering.
10. Training and Competency: Ensuring that personnel using, managing, and maintaining
computer systems are adequately trained and competent in their roles.
CSA emphasizes the holistic approach of maintaining and assuring the performance,
reliability, and compliance of computer systems over time. It aligns with the principles of
good manufacturing practices (GMP), quality management systems (QMS), and risk-based
decision-making. Ultimately, CSA contributes to the overall assurance that computer systems
consistently produce accurate and reliable results while meeting regulatory and business
requirements.
Q17: Describe the contents of a Validation Summary Report for a CSV project.
A17: A Validation Summary Report typically contains the following sections:
1. Introduction:
2. Validation Approach:
- Explanation of any risk-based decisions and rationale for the approach taken.
3. Validation Activities:
- Mention of any deviations, changes, or unexpected events encountered during the validation
process.
4. Results:
- Explanation of the impact of these findings on the validation process and the system's
integrity.
6. Conclusions:
- Overall assessment of the validation results and their alignment with pre-defined criteria.
- Determination of whether the system, process, or equipment meets the required standards
and specifications.
8. Final Approval:
9. Appendices:
10. Annexes:
Q18: Describe the Contents of validation plan for CSV validation project
A18: A Validation Plan for a Computer System Validation (CSV) project is a comprehensive
document that outlines the strategy, scope, objectives, and approach for validating a software
application or computerized system. The plan serves as a roadmap for the entire validation
process and provides a clear framework for all stakeholders involved. The contents of a
typical Validation Plan for a CSV project include:
1. Introduction:
- Explanation of the overall validation strategy, including the rationale for the approach
chosen.
- Explanation of how the project aligns with regulatory requirements and industry best
practices.
3. Project Organization:
5. System Description:
- Reference to relevant regulations, guidelines, and standards that the project aims to comply
with.
- Breakdown of the validation activities into phases (IQ, OQ, PQ, etc.).
8. Validation Requirements:
- Establishment of clear and measurable acceptance criteria for each validation phase.
- Explanation of how changes to the system or deviations from the plan will be managed.
- Description of the change control process and how deviations will be documented and
resolved.
- Explanation of the reporting structure, including progress reports and final documentation.
The Validation Plan provides a structured approach to ensure that the CSV project is
conducted systematically, aligns with regulatory requirements, and produces reliable and
compliant results. It serves as a reference document for project execution, ensuring that all
activities are carried out as planned and that the software application or system is validated
effectively.
Q19: Describe the contents of a CSV Qualification Summary Report.
A19: A CSV Qualification Summary Report typically includes the following sections:
1. Introduction:
2. Project Summary:
4. System Description:
- Concise overview of the system or software application, including its functionalities and
features.
5. Validation Phases:
- Summary of the activities carried out in each validation phase (IQ, OQ, PQ).
- Overview of the key tests, test scripts, and test cases executed.
6. Acceptance Criteria:
- Presentation of the acceptance criteria used to determine the success or failure of each test.
7. Test Results and Findings:
9. Risk Assessment:
- Summary of the risk assessment process conducted, including identified risks and
mitigations.
- Overview of any changes made to the system during the validation process.
12. Conclusion:
- Overall assessment of the validation effort and its success in meeting the objectives.
- Statement on whether the system is deemed validated and ready for use.
13. Recommendations:
- Sign-off section for approval by relevant stakeholders, including project team members and
management.
The CSV Qualification Summary Report serves as a concise record of the validation process,
results, and conclusions. It provides stakeholders with a high-level view of the validation
effort and helps ensure transparency, compliance, and accountability throughout the
validation lifecycle.
Q20: How is CAPA evaluated and managed in the CSV lifecycle?
A20: Corrective and Preventive Action (CAPA) in the CSV lifecycle typically proceeds as
follows:
1. Identification of Issues:
During various validation phases (e.g., IQ, OQ, PQ), discrepancies, deviations, or non-
conformances may be identified. These issues could include deviations from requirements,
failures in tests, or other problems affecting the system's integrity or functionality.
2. CAPA Initiation:
The investigation aims to identify the underlying reasons for the issue. Root cause analysis
involves a thorough examination of the process, system, personnel, and environmental factors
that contributed to the problem.
Based on the findings of the root cause analysis, a CAPA plan is developed. This plan
outlines the corrective actions to address the immediate issue and the preventive actions to
prevent similar issues from occurring in the future.
Corrective actions involve addressing the immediate problem identified. This could include
fixing the issue, modifying the system, updating documentation, or making other necessary
changes.
Preventive actions involve addressing the root cause of the issue to prevent its recurrence.
This could include process improvements, training, procedural changes, or system
enhancements.
Before closing the CAPA, it's essential to verify that the corrective and preventive actions
taken are effective. This may involve retesting the affected system components, reviewing
updated documentation, and confirming that the issue has been resolved.
9. CAPA Closure:
Once the corrective and preventive actions have been implemented and verified as effective,
the CAPA can be closed. A formal closure report is generated, summarizing the issue, actions
taken, and their outcomes.
All aspects of the CAPA process, including the initial issue, investigation, actions taken,
verification, and closure, are documented. These records serve as a historical record of the
issue and its resolution.
The successful closure of CAPA ensures that any identified issues are addressed and the
system is brought into compliance. CAPA outcomes are considered during the final review
and approval of the validation efforts, as they demonstrate the system's integrity and
readiness for use.
Evaluating CAPA in the CSV lifecycle ensures that any deviations or non-conformances
identified during the validation process are appropriately managed, documented, and
resolved. This contributes to the overall quality, compliance, and reliability of the validated
computerized system.
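The CAPA flow above is effectively a state machine: a record moves from identification through investigation, planning, implementation, and effectiveness verification before closure. A minimal Python sketch (the state names are illustrative assumptions) that rejects out-of-order transitions:

```python
# Hypothetical CAPA lifecycle; the allowed transitions mirror the
# sequence above: identify -> investigate -> plan -> implement ->
# verify effectiveness -> close. Out-of-order moves are rejected.
CAPA_TRANSITIONS = {
    "identified": {"under_investigation"},
    "under_investigation": {"plan_approved"},
    "plan_approved": {"actions_implemented"},
    "actions_implemented": {"effectiveness_verified"},
    "effectiveness_verified": {"closed"},
    "closed": set(),  # terminal state: a closed CAPA is never reopened here
}


def advance_capa(current_state, target_state):
    """Move a CAPA record to the next state, enforcing the defined order."""
    if target_state not in CAPA_TRANSITIONS.get(current_state, set()):
        raise ValueError(
            f"invalid CAPA transition: {current_state} -> {target_state}")
    return target_state
```

Modeling the lifecycle this way makes the key control explicit: a CAPA cannot be closed until its actions have been implemented and verified as effective.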
Q21: How do you perform a system assessment in the CSV process?
A21: Performing a system assessment is a crucial step in the Computer System Validation
(CSV) process to ensure that a software system meets regulatory requirements, business
needs, and quality standards. Here's how you can perform a comprehensive system
assessment:
Clearly define the scope of the system assessment, including the specific software
application, its functionalities, and the intended use. Identify the objectives of the assessment,
such as ensuring compliance, functionality, and security.
2. Gather Requirements:
Collect and document the system requirements, including functional, technical, regulatory,
and business requirements. This information serves as a benchmark for evaluating the
system's capabilities.
3. Review Documentation:
Examine existing documentation related to the software system, such as user requirements,
technical specifications, design documents, and user manuals. This helps in understanding the
system's architecture, design, and intended use.
4. Functional Assessment:
Evaluate the system's functionalities against the documented user requirements. Verify that
the software meets the intended purpose and performs as expected.
5. Technical Assessment:
Review the technical aspects of the system, including its architecture, interfaces, data flows,
and integrations. Ensure that the system is technically sound, scalable, and compatible with
other systems.
6. Security Assessment:
Assess the system's security features and controls. Ensure that appropriate access controls,
authentication mechanisms, data encryption, and other security measures are in place to
protect sensitive data.
Verify that the system maintains the integrity of data throughout its lifecycle. Check for data
validation, audit trail capabilities, data backups, and recovery processes.
8. Risk Assessment:
Conduct a risk assessment to identify potential risks associated with the system's use,
functionality, security, and data integrity. Evaluate the impact and likelihood of these risks.
9. Compliance Assessment:
Evaluate the system's compliance with relevant regulations and industry standards, such as 21
CFR Part 11, GAMP 5, and other applicable guidelines.
If the system is purchased from a vendor, assess the vendor's qualifications, support, and
validation documentation to ensure that the system meets regulatory requirements.
Review and update relevant documentation, such as the Validation Plan, User Requirements
Specification, Functional Specifications, and Test Scripts, based on the assessment findings.
Identify any gaps or discrepancies between the system's current state and the desired state.
Document these gaps and develop plans to address them.
13. Risk Mitigation Strategies:
Develop strategies to mitigate identified risks and gaps. Determine whether changes,
enhancements, or additional controls are needed to address the identified issues.
Based on the assessment findings, adjust the overall CSV validation strategy to ensure that all
risks and requirements are adequately addressed.
Obtain approvals and sign-offs from relevant stakeholders, including quality assurance,
regulatory affairs, and system users, to proceed with any necessary changes or enhancements.
As part of a continuous improvement approach, document lessons learned from the system
assessment to enhance future validation efforts.
Performing a comprehensive system assessment allows you to identify potential issues early
in the CSV process, address them effectively, and ensure that the software system is fit for its
intended use while meeting regulatory and quality requirements.
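The gap-identification step of the assessment can be expressed as a simple comparison between required and verified controls. An illustrative Python sketch, assuming controls are tracked as name-to-boolean maps (a simplification of real assessment records):

```python
def assessment_gaps(required_controls, verified_controls):
    """Return required controls that were not verified during assessment.

    required_controls: dict of control name -> required (bool), e.g. drawn
    from the requirements gathered in step 2.
    verified_controls: dict of control name -> confirmed in place (bool).
    """
    return sorted(
        name
        for name, required in required_controls.items()
        if required and not verified_controls.get(name, False)
    )
```

Each gap returned would then feed the risk mitigation strategies and validation strategy adjustments described above.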
Q22: How do you decide if a system is GxP relevant?
A22: Determining if a system is GxP (Good Practice) relevant is an important step in the
Computer System Validation (CSV) process within the pharmaceutical industry. Here's how
you can decide if a system is GxP relevant:
Consider the purpose of the system and its impact on regulated processes, data, products, or
services. If the system plays a role in producing, testing, controlling, or documenting GxP-
related activities, it is likely GxP relevant.
Review the applicable regulations and guidelines that govern your organization's activities.
Check whether the system's functionalities, data, or processes fall under regulatory oversight.
Common regulations include 21 CFR Part 11, EU GMP Annex 11, and ICH Q7.
Determine if the system manages, generates, or stores data critical to patient safety, product
quality, or regulatory compliance. Systems handling data subject to data integrity
requirements are often considered GxP relevant.
Evaluate whether the system supports robust audit trail capabilities that ensure accountability
for changes and actions. Systems with audit trail functionality are commonly required for
GxP purposes.
Check if the system enforces appropriate user access controls, authentication, and
authorization mechanisms. GxP systems often require strict control over user roles and
permissions.
7. Electronic Signatures:
Determine if the system allows electronic signatures that meet regulatory requirements for
approval, review, or other critical actions. Electronic signatures are essential in GxP
environments.
Verify that the system maintains accurate, complete, and reliable data. GxP-relevant systems
must ensure data accuracy and integrity throughout their lifecycle.
Consider whether the system supports controlled workflows and processes to ensure
compliance with GxP requirements. This includes approval workflows and change control
processes.
Assess the complexity of validating the system. GxP-relevant systems often require thorough
validation efforts to ensure compliance and data integrity.
Determine if the system's functionality directly impacts patient safety and product quality. If
there's a risk to either, the system is likely GxP relevant.
Conduct a risk assessment to evaluate the potential risks associated with the system. Consider
risks related to data integrity, process control, regulatory compliance, and patient safety.
Engage with relevant stakeholders, including quality assurance, regulatory affairs, and
subject matter experts, to determine if the system meets GxP criteria.
Document the rationale for considering the system GxP relevant. Clearly outline the factors,
regulations, and considerations that led to the decision.
It's important to note that the decision of whether a system is GxP relevant is a critical one, as
it will impact the level of validation effort and regulatory requirements. Always consult with
internal regulatory experts and quality professionals to ensure accurate classification.
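Audit trails of the kind required above must be tamper-evident. One common technique, shown here as a hedged Python sketch rather than a reference implementation, is hash chaining: each entry stores the hash of its predecessor, so any retrospective edit breaks the chain.

```python
import hashlib
import json


def append_audit_entry(trail, user, action, record_id):
    """Append a tamper-evident entry; each entry hashes its predecessor."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {"user": user, "action": action, "record": record_id,
             "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)
    return entry


def chain_intact(trail):
    """Re-derive every hash; any retrospective edit breaks the chain."""
    prev_hash = "0" * 64
    for entry in trail:
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True
```

Production audit trails in GxP systems would add secure timestamps, access controls, and protected storage; the sketch only demonstrates why an edited historical entry becomes detectable.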
Q23: Explain backup, restore, archival, and business continuity plans.
A23: Backup, restore, archival, and business continuity plans are critical components of an
organization's data management and disaster recovery strategies. In the context of the
pharmaceutical industry, these plans are essential for ensuring data integrity, regulatory
compliance, and the ability to maintain operations even in the face of unexpected disruptions.
Here's an overview of each concept:
1. Backup Plan:
A backup plan involves creating copies of critical data and information to protect against data
loss due to hardware failures, software glitches, human errors, or other unexpected events.
Backups are typically stored in a separate location from the original data to ensure their
availability if the primary data is compromised.
2. Restore Plan:
A restore plan outlines the process of recovering data from backups to the original or
replacement systems. This plan defines the steps to restore data, the order in which data
should be restored, and the resources required to perform the restoration. Regular testing of
the restore process is crucial to ensure its effectiveness.
3. Archival Plan:
An archival plan focuses on the long-term retention of data for regulatory compliance,
historical records, and reference purposes. It involves identifying which data should be
retained, establishing retention periods, and organizing data in a structured manner. Archival
plans ensure that data remains accessible, searchable, and secure over time.
A business continuity plan is a comprehensive strategy that outlines how an organization will
continue to operate during and after disruptive events, such as natural disasters, power
outages, cyberattacks, or other emergencies. It encompasses not only data recovery but also
overall business processes, resources, personnel, and communication strategies.
- Data Classification: Identify data types, categories, and their criticality to prioritize backup,
restoration, and archival efforts.
- Backup Frequency: Define how often backups are performed (e.g., daily, weekly),
considering the frequency of data changes and the acceptable level of data loss.
- Backup Locations: Determine where backups will be stored, ensuring they are physically
separate from primary data to protect against data loss.
- Retention Policies: Specify how long backups and archived data will be retained based on
regulatory requirements and business needs.
- Testing and Validation: Regularly test backup, restore, and archival processes to ensure
their effectiveness. Validation ensures that data can be recovered accurately and within
acceptable timeframes.
- Data Encryption: Ensure that data backups and archives are encrypted to maintain data
confidentiality and integrity.
- Incident Response: Outline the steps to be taken when a data loss or disruption occurs.
Define the roles and responsibilities of individuals involved in responding to incidents.
- Testing and Drills: Regularly conduct testing and drills of the business continuity plan to
ensure that all stakeholders are familiar with their roles and responsibilities during a crisis.
These plans collectively contribute to safeguarding critical data, maintaining compliance, and
enabling a quick recovery from unforeseen disruptions. In the pharmaceutical industry, where
data integrity and patient safety are paramount, these plans play a crucial role in ensuring
operational resilience and regulatory adherence.
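The "Testing and Validation" point above, proving that data can be recovered accurately, can be illustrated with a small Python sketch that records a SHA-256 digest at backup time and verifies it at restore time. Real backup plans would add encryption, rotation, and off-site storage; the helper names here are assumptions.

```python
import hashlib
import shutil
from pathlib import Path


def backup_file(source: Path, backup_dir: Path) -> str:
    """Copy a file to a separate location; return its SHA-256 digest.

    Recording the digest at backup time lets a later restore be
    verified bit for bit.
    """
    backup_dir.mkdir(parents=True, exist_ok=True)
    target = backup_dir / source.name
    shutil.copy2(source, target)  # copy2 also preserves file timestamps
    return hashlib.sha256(target.read_bytes()).hexdigest()


def verify_restore(restored: Path, recorded_digest: str) -> bool:
    """Confirm a restored file matches the digest recorded at backup."""
    return hashlib.sha256(restored.read_bytes()).hexdigest() == recorded_digest
```

Running the verification as part of regular restore drills, rather than only after an incident, is what turns a backup copy into a tested recovery capability.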
Q24: In which phase of the CSV lifecycle are deviations handled?
A24: Deviations in the context of Computer System Validation (CSV) are typically handled
during the "Execution and Testing" phase of the CSV lifecycle. This phase involves testing
the system to ensure that it meets the defined requirements and functions correctly. During
this phase, any deviations or discrepancies that are identified are documented, investigated,
and resolved.
Here's how deviations are typically handled during the "Execution and Testing" phase of
CSV:
1. Identification of Deviations:
During testing, if any discrepancies, errors, or deviations from the expected behavior or
requirements of the system are identified, they are documented as deviations.
2. Documentation of Deviations:
The deviations are documented in a formal manner, which includes capturing details such as
the nature of the deviation, where and how it occurred, the impact on the system, and the
potential risk associated with it.
3. Assessment of Deviations:
The identified deviations are assessed to determine their significance and potential impact on
the system's functionality, data integrity, and compliance.
4. Investigation:
A thorough investigation is conducted to understand the root cause of the deviation. This may
involve analyzing logs, code, configuration settings, or any other relevant data to identify
why the deviation occurred.
The root cause analysis aims to identify the underlying reasons for the deviation. This helps
in implementing corrective and preventive actions to prevent similar deviations in the future.
6. Risk Evaluation:
The impact and severity of the deviation are evaluated to determine the level of risk
associated with it. This assessment guides the decision-making process on how to address the
deviation.
7. Decision-Making:
Based on the assessment of the deviation and associated risks, a decision is made on how to
proceed. This decision could involve accepting the deviation, implementing corrective
actions, or deciding to reject the system if the deviation is critical.
Corrective actions are planned and executed to address the deviation. This may involve
modifying the system configuration, fixing the code, updating documentation, or other
necessary actions.
All actions taken to investigate, address, and resolve the deviation are documented, including
a description of the deviation, the investigation findings, actions taken, and verification of the
resolution.
The resolution of the deviation, along with supporting documentation, is reviewed, approved,
and signed off by appropriate stakeholders, such as quality assurance, validation, and project
management.
12. Reporting:
A deviation report is generated summarizing the details of the deviation, investigation, root
cause analysis, corrective actions, and verification.
Handling deviations in the "Execution and Testing" phase ensures that any issues are
identified, addressed, and documented in a systematic manner. This helps in maintaining the
integrity of the validation process and ensures that the system meets the defined requirements
and compliance standards.
Q25: In which phase of CSV is the Functional Risk Assessment (FRA) prepared?
A25: FRA (Functional Risk Assessment) is typically prepared during the "Planning" phase of
Computer System Validation (CSV). The "Planning" phase is the initial phase of the CSV
lifecycle and involves defining the scope, objectives, and approach for the validation project.
FRA is an important component of the planning process as it helps in identifying potential
functional risks associated with the computer system.
Here's how FRA preparation fits into the "Planning" phase of CSV:
1. Scope Definition:
In the "Planning" phase, the scope of the validation project is defined. This includes
identifying the computer systems that require validation, determining the functionalities that
need validation, and understanding the criticality of the systems to the overall operation.
2. Objective Setting:
The objectives of the validation project are established. This involves clarifying the goals of
the validation, such as ensuring data integrity, compliance with regulatory requirements, and
system reliability.
4. FRA Preparation:
As part of the "Planning" phase, the FRA is prepared. The FRA involves identifying and
assessing potential functional risks associated with the computer system. This includes
analyzing the system's functionalities, interactions, interfaces, and potential failure points.
5. Risk Identification:
During FRA preparation, potential functional risks are identified. These risks could relate to
data integrity, system functionality, process impact, regulatory compliance, and patient
safety.
6. Risk Assessment:
Each identified risk is assessed to determine its potential impact and likelihood. The risk
assessment helps prioritize risks based on their significance and the potential consequences
they could have.
7. Risk Mitigation:
Based on the risk assessment, strategies to mitigate identified risks are developed. These strategies could involve implementing controls, process improvements, validation activities, or other measures to reduce the likelihood or impact of identified risks.
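As a rough illustration, the impact-and-likelihood scoring described above can be sketched in Python. The risk items, the 1–5 rating scales, and the mitigation threshold below are hypothetical assumptions for the sketch, not values prescribed by any regulation or standard:

```python
# Hypothetical sketch of functional risk scoring: score = impact x likelihood,
# then sort so the highest-risk items get validation focus first.

def risk_score(impact: int, likelihood: int) -> int:
    """Score a functional risk on assumed 1-5 scales; higher means riskier."""
    return impact * likelihood

risks = [
    {"id": "FR-01", "description": "Audit trail can be disabled", "impact": 5, "likelihood": 2},
    {"id": "FR-02", "description": "Rounding error in dose calculation", "impact": 4, "likelihood": 3},
    {"id": "FR-03", "description": "Report footer shows wrong site name", "impact": 1, "likelihood": 4},
]

for r in risks:
    r["score"] = risk_score(r["impact"], r["likelihood"])

# Prioritize: highest score first; scores at or above an assumed threshold
# of 8 are flagged for mitigation, the rest are accepted and monitored.
prioritized = sorted(risks, key=lambda r: r["score"], reverse=True)
for r in prioritized:
    action = "mitigate" if r["score"] >= 8 else "accept/monitor"
    print(f'{r["id"]}: score {r["score"]} -> {action}')
```

In practice the scales, categories, and thresholds come from the organization's risk management SOP; the point of the sketch is only that scoring and ranking make the prioritization repeatable and auditable.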
8. Documentation:
The FRA is documented, detailing the identified risks, their assessment, potential
consequences, and proposed mitigation strategies. This documentation becomes an important
reference for the validation project.
9. Integration into the Validation Plan:
The FRA findings and mitigation strategies are incorporated into the overall validation plan.
This ensures that the validation activities are aligned with the identified risks and that
appropriate focus is given to addressing them.
10. Review and Approval:
The FRA documentation is reviewed and approved by relevant stakeholders, such as quality
assurance, validation, and project management. Their input ensures that all critical risks are
adequately addressed.
11. Execution Planning:
The FRA outcomes inform the planning of validation execution, guiding testing and validation efforts toward the areas of highest risk.
The FRA is a foundational document that sets the stage for the validation activities to follow.
By identifying and assessing functional risks early in the planning phase, organizations can
develop a targeted and effective validation strategy that addresses potential challenges and
ensures the integrity of the computer system and its impact on regulated processes.
Q26: If discrepancies that are GxP relevant are found during PQ and the customer wants to go live with the application, what is your advice to the client on using the application in production, and how will the discrepancies be tracked for closure in the future?
A26: When discrepancies are found during Performance Qualification (PQ) that are GxP
relevant and the customer wants to proceed with the application going live in production, it's
important to carefully consider the potential risks and implications. Here's my advice to the
client on how to proceed and ensure proper tracking for closure of the discrepancies in the
future:
Advice to Client:
1. Risk Assessment:
Conduct a thorough risk assessment to evaluate the impact of the discrepancies on GxP
compliance, patient safety, data integrity, and overall system functionality. This assessment
will help you make an informed decision.
2. Risk Mitigation:
Implement risk mitigation strategies to minimize the impact of the discrepancies. This could
involve additional procedural controls, manual workarounds, enhanced monitoring, or other
measures that reduce the risk associated with the identified discrepancies.
3. Go-Live Decision:
If the risk assessment indicates that the identified discrepancies do not pose significant risks
to GxP compliance and patient safety, you can consider proceeding with the application
going live in production. However, ensure that appropriate controls and monitoring
mechanisms are in place to manage the identified discrepancies.
4. Communication:
Communicate the identified discrepancies, their associated risks, and the planned mitigations to all relevant stakeholders, including quality assurance and management, so that the go-live decision is made transparently and with full awareness of the residual risk.
5. Document Decision:
Document the decision to proceed with the application going live, along with the risk
assessment, risk mitigation strategies, and the approval of relevant stakeholders. This
documentation will serve as a record of the decision-making process.
6. Ongoing Monitoring:
Implement an ongoing monitoring plan to closely track the performance of the application in
production. This includes monitoring for any adverse events, deviations, or incidents related
to the identified discrepancies.
Tracking for Closure:
1. Discrepancy Tracking System:
Set up a discrepancy tracking system that captures all identified discrepancies, their
associated risks, and the actions taken for mitigation. This system will serve as a central
repository for managing and tracking discrepancies.
2. Action Items:
Assign action items to responsible individuals or teams for addressing each identified
discrepancy. Specify the corrective and preventive actions that need to be taken to resolve the
discrepancies.
3. Due Dates:
Assign due dates for each action item to ensure that the discrepancies are addressed within a
reasonable timeframe. Due dates should consider the urgency and potential impact of each
discrepancy.
4. Review and Approval:
Establish a review and approval process for the corrective and preventive actions. This
ensures that proposed actions are well-documented, effective, and aligned with regulatory
requirements.
5. Follow-Up and Verification:
After implementing the corrective and preventive actions, conduct follow-up and verification
to ensure that the discrepancies have been effectively addressed and resolved. This may
involve testing, validation, and documentation review.
6. Closure:
Once the discrepancies have been satisfactorily addressed and resolved, close them in the
discrepancy tracking system. Document the actions taken, the outcomes, and any relevant
supporting evidence.
7. Periodic Review:
Periodically review the discrepancy tracking system to ensure that all identified discrepancies
have been properly addressed and closed. This review is important for maintaining GxP
compliance and ensuring the ongoing integrity of the application.
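The tracking-for-closure steps above can be sketched as a minimal data structure. This is a hypothetical illustration of the record-keeping logic, not a real tracking tool; the IDs, field names, and closure rule are assumptions:

```python
# Minimal sketch of a discrepancy record with action items, due dates,
# and a closure rule requiring all actions to be completed first.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ActionItem:
    description: str
    owner: str
    due: date
    done: bool = False

@dataclass
class Discrepancy:
    disc_id: str
    description: str
    risk: str                                      # e.g. "high", "medium", "low"
    actions: list[ActionItem] = field(default_factory=list)
    status: str = "open"

    def close(self, evidence: str) -> None:
        # Closure requires every assigned action item to be completed;
        # the supporting evidence is recorded with the closure.
        if not all(a.done for a in self.actions):
            raise ValueError("Open action items remain; cannot close.")
        self.status = f"closed (evidence: {evidence})"

# Example: log a PQ discrepancy, complete its action, then close it.
d = Discrepancy("DISC-001", "PQ report mismatch in batch yield", "medium")
d.actions.append(ActionItem("Correct report template", "QA", date(2024, 1, 31)))
d.actions[0].done = True
d.close("re-executed PQ test case TC-12 passed")
print(d.status)
```

A real system would add audit trails, electronic signatures, and periodic-review queries on top of this kind of record.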
By following these steps, the client can make an informed decision about proceeding with the
application going live while managing the identified discrepancies and ensuring that they are
effectively addressed and tracked for closure in the future.
Q27: What procedures need to be followed after a system is released for use to maintain it in compliance during its life cycle?
A27: After a system is released for use, several procedures need to be followed to ensure that
the system remains in compliance and continues to operate effectively throughout its
lifecycle. Here are the key procedures that should be followed:
1. Change Control:
Implement a robust change control process to manage any changes to the system, including
software updates, configuration changes, and hardware modifications. All changes should be
properly evaluated, documented, reviewed, approved, and validated to ensure that they do not
negatively impact system compliance or functionality.
2. Periodic Review and Monitoring:
Conduct regular periodic reviews and monitoring of the system's performance, data integrity,
and compliance. This includes reviewing system logs, audit trails, and user access records to
identify any anomalies or deviations from expected behavior.
3. User Access Management:
Maintain strict user access control to ensure that only authorized personnel have access to the
system. Regularly review and update user access permissions based on job roles and
responsibilities.
4. Security Measures:
Implement and maintain security measures to protect the system from unauthorized access
and ensure the integrity of data. This includes measures such as encryption, password
policies, and user authentication.
5. Backup and Restore:
Establish and maintain robust backup and restore procedures to ensure that critical system
data is regularly backed up and can be restored in case of data loss or system failure.
6. Disaster Recovery:
Develop and maintain a comprehensive disaster recovery plan that outlines procedures for
recovering the system and its data in the event of a major system failure, natural disaster, or
other emergencies.
8. Vendor Management:
Maintain oversight of system vendors and service providers, including vendor assessments, support and maintenance agreements, and evaluation of vendor-supplied updates and patches before deployment.
9. Validation Maintenance:
Periodically review and update the system validation documentation to ensure that it remains
current and accurately reflects the system's configuration, functionality, and compliance.
10. Regulatory Updates:
Stay informed about relevant regulatory updates and ensure that the system remains
compliant with evolving regulatory requirements throughout its lifecycle.
11. System Retirement:
Develop procedures for the eventual retirement of the system, including proper data archival
and migration to ensure data integrity and compliance with data retention requirements.
By following these procedures, organizations can maintain the system in compliance, ensure
data integrity, and uphold GxP standards throughout the entire lifecycle of the system.
Q28: What is root cause analysis in pharmaceutical industry and what are the tools used?
A28: Root cause analysis (RCA) is a systematic process used in the pharmaceutical industry
to identify the underlying factors that contribute to problems, issues, or non-conformances.
The goal of RCA is to determine the primary cause of a problem rather than just addressing
its symptoms. By identifying and addressing the root cause, organizations can implement
effective corrective and preventive actions to prevent recurrence of the issue.
Common tools used for root cause analysis in the pharmaceutical industry include:
1. Fishbone Diagram (Ishikawa Diagram): This tool organizes potential causes of a problem into categories (such as people, methods, machines, materials, measurement, and environment), providing a visual structure for the investigation.
2. 5 Whys: This technique involves asking "Why?" repeatedly (usually five times) to explore the cause-and-effect relationships underlying a problem. It helps to drill down to the root cause by uncovering multiple layers of causation.
3. Fault Tree Analysis (FTA): FTA is a systematic approach that uses logical diagrams to
analyze the relationships between various potential causes and their effects. It is particularly
useful for complex systems with multiple interrelated factors.
4. Failure Modes and Effects Analysis (FMEA): FMEA is a proactive approach that assesses
potential failure modes, their causes, and their potential effects. It assigns a risk priority
number (RPN) to each failure mode to prioritize corrective actions.
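The RPN calculation behind FMEA is simple arithmetic: RPN = Severity × Occurrence × Detection, each conventionally rated on a 1–10 scale. A minimal sketch, using made-up failure modes and ratings purely for illustration:

```python
# Hypothetical FMEA sketch: compute RPN = severity x occurrence x detection
# (each rated 1-10) and rank failure modes to prioritize corrective actions.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number; a higher detection rating means harder to detect."""
    return severity * occurrence * detection

# (name, severity, occurrence, detection) - illustrative values only.
failure_modes = [
    ("Sensor drift goes unnoticed", 7, 4, 6),
    ("Batch record not e-signed", 8, 2, 3),
    ("Label printer jams", 3, 5, 2),
]

# Highest RPN first: these failure modes get corrective actions first.
ranked = sorted(failure_modes, key=lambda fm: rpn(fm[1], fm[2], fm[3]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {rpn(s, o, d)}")
```

Note that a moderate-severity failure that occurs often and is hard to detect can outrank a severe but well-detected one, which is exactly the prioritization FMEA is designed to surface.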
5. Pareto Analysis: Also known as the 80/20 rule, Pareto Analysis helps identify and
prioritize the most significant contributing factors by focusing on the few vital factors that
account for the majority of the issues.
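The "vital few" selection in Pareto analysis can be sketched in a few lines: sort causes by frequency and accumulate until roughly 80% of issues are covered. The deviation categories and counts below are invented for the example:

```python
# Sketch of Pareto analysis: identify the few causes accounting for ~80%
# of observed issues. Categories and counts are hypothetical.
from collections import Counter

issue_counts = Counter({
    "Documentation errors": 45,
    "Training gaps": 25,
    "Equipment calibration": 15,
    "Environmental excursions": 10,
    "Other": 5,
})

total = sum(issue_counts.values())
cumulative = 0
vital_few = []
# Walk causes from most to least frequent until 80% coverage is reached.
for cause, count in issue_counts.most_common():
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.80:
        break

print(vital_few)  # the causes to prioritize for corrective action
```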
6. Change Analysis: Examining changes that were made before the issue occurred can help
identify whether they are related to the problem. This tool can be particularly useful for
investigating deviations and discrepancies.
7. Process Mapping: Visualizing the process involved in the problem can help identify
potential areas where issues might arise. Process maps help in understanding the sequence of
steps and interactions.
8. Data Analysis and Trending: Analyzing data, trends, and statistical information can
provide insights into patterns and anomalies that could be causing the problem.
10. Root Cause Tree Analysis: Similar to FTA, this method breaks down causes and sub-
causes in a tree structure to systematically identify root causes.
The choice of tool depends on the complexity of the problem and the context in which it
occurs. Often, a combination of tools may be used to thoroughly investigate and identify the
root cause. It's important to note that the RCA process should be documented, and the
identified root causes should be verified and validated before implementing corrective
actions.
Q29: What is data integrity, and what are the ALCOA and ALCOA++ principles?
A29: Data integrity refers to the accuracy, completeness, consistency, and reliability of data
throughout its entire lifecycle, from creation to archival. It is a critical aspect in the
pharmaceutical industry to ensure the quality, safety, and efficacy of products and to maintain
compliance with regulatory requirements.
ALCOA is an acronym that represents key principles of data integrity. It stands for:
1. Attributable: Data should be attributable to the person who generated or recorded it. It should be clear who performed an action and when.
2. Legible: Data should be easily readable and understandable. Handwritten entries should be
clear and indelible.
3. Contemporaneous: Data should be recorded at the time of the activity or event. Delayed
recording of data can raise concerns about accuracy and authenticity.
4. Original: Data should be the first recording of an observation or result. Transcription errors
and copies of data should be avoided whenever possible.
5. Accurate: Data should be error-free and reflect the true values and observations.
Calculations, measurements, and other data should be precise.
ALCOA++ extends the ALCOA principles by adding additional requirements to ensure data
integrity:
6. Complete: Data should be complete, including all data generated, any repeat or reanalysis results, and associated metadata.
7. Consistent: Data should be uniform and coherent, both within a single record and across
related records.
8. Enduring: Data should be retained for the required retention period and remain accessible
and legible throughout its lifecycle.
9. Available: Data should be easily retrievable and accessible when needed for review, audits,
and regulatory inspections.
10. Originality: Data should be generated in its original form, and any changes should be
appropriately documented and justified.
11. Traceable: There should be a clear and documented trail of data, showing its creation,
modification, and review history.
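The "Traceable" principle is typically realized as an append-only audit trail recording who changed what, when, and why. A minimal sketch under assumed field names (a real system would also enforce access controls and tamper protection):

```python
# Hypothetical append-only audit trail: each change records who, when,
# old and new values, and the reason. Entries are never edited or deleted.
from datetime import datetime, timezone

audit_trail: list[dict] = []

def record_change(user: str, record_id: str, old: str, new: str, reason: str) -> None:
    """Append one immutable audit entry for a data change or review."""
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "record": record_id,
        "old_value": old,
        "new_value": new,
        "reason": reason,
    })

# Example: a correction followed by a second-person review of the same record.
record_change("jdoe", "BATCH-42/pH", "6.8", "7.1", "transcription error corrected")
record_change("asmith", "BATCH-42/pH", "7.1", "7.1", "reviewed and verified")

for entry in audit_trail:
    print(entry["timestamp"], entry["user"], entry["record"], entry["reason"])
```

Because the trail preserves the old value alongside the new one, the full history of the record can be reconstructed during audits and inspections.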
ALCOA and ALCOA++ principles are essential to maintain data integrity and ensure that
data is trustworthy and reliable. Regulatory agencies, such as the FDA and EMA, emphasize
the importance of adhering to these principles in various guidance documents and regulations
to prevent data manipulation, errors, and fraud in the pharmaceutical industry. Proper
implementation of these principles helps to build confidence in the accuracy and authenticity
of data generated during various stages of drug development, manufacturing, and distribution.
Conclusion:
In the pharmaceutical industry, where precision, accuracy, and compliance are paramount,
Computer System Validation plays a pivotal role. It safeguards patient safety, data integrity,
and regulatory compliance by ensuring that computer systems operate as intended and
contribute to the production of high-quality pharmaceutical products. Embracing CSV not
only meets regulatory requirements but also instills confidence in the industry's commitment
to excellence and patient welfare.