
Computerized System Operational Qualification (OQ)

1. Purpose and Scope

Operational Qualification establishes documented evidence that a computerized system operates as intended across all defined functional and regulatory requirements.

This phase verifies system behavior under controlled conditions and confirms that configured functionality, workflows, and controls perform consistently and reproducibly. It applies to all GxP-relevant computerized systems where system operation directly impacts data integrity, product quality, or regulatory compliance. Whereas Installation Qualification addresses how the system is deployed, Operational Qualification focuses on what the system does.


2. Role of OQ within the Validation Lifecycle

Operational Qualification follows Installation Qualification and precedes Performance Qualification. Within the lifecycle:

  • Installation Qualification confirms correct deployment
  • Operational Qualification verifies functional operation
  • Performance Qualification confirms performance under routine use

OQ provides objective evidence that the system meets defined requirements and that all critical functions operate within established limits prior to release.


3. OQ Testing Strategy

Operational Qualification testing must be structured, controlled, and directly aligned with system requirements. The objective is to demonstrate that all critical functions operate as intended and that testing is performed in a consistent, reproducible, and fully traceable manner. The following principles define how OQ testing is designed and executed.

  • Requirement traceability: Each test must be directly linked to a specific user requirement or functional specification. This ensures that all critical system functions are verified and that no requirement is left untested.
  • Risk-based test coverage: Testing must prioritize functions that impact product quality, patient safety, and data integrity. High-risk functions require deeper and more rigorous testing, including negative and boundary conditions.
  • Defined test conditions: Tests must be executed in a controlled environment with predefined inputs, user roles, and system states to ensure repeatability and consistency of results.
  • Positive and negative testing: Normal operation must be verified alongside challenge conditions such as invalid inputs, unauthorized access attempts, and incorrect workflow execution.
  • Boundary condition testing: System behavior must be verified at operational limits such as maximum field lengths, data ranges, and system thresholds.
  • Controlled test execution: All tests must follow approved scripts with predefined expected results. Ad hoc testing is not acceptable for validation evidence.
  • Objective evidence collection: Each test step must be supported by evidence such as screenshots, audit trail entries, or system-generated records demonstrating actual system behavior.

These principles ensure that OQ testing is systematic, defensible, and capable of demonstrating compliance with regulatory expectations. They establish the foundation for consistent execution, reliable evidence generation, and complete traceability across all tested functions.
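As a minimal illustration of the requirement-traceability principle, the check below flags any user requirement that no OQ test script covers. The requirement and test identifiers are hypothetical, not taken from any real system:

```python
# Minimal sketch of a requirement-to-test traceability check.
# URS and OQ-TC identifiers below are illustrative only.

def untested_requirements(requirements, test_map):
    """Return requirement IDs not covered by any test script."""
    covered = {req for reqs in test_map.values() for req in reqs}
    return sorted(set(requirements) - covered)

requirements = ["URS-001", "URS-002", "URS-003"]
test_map = {
    "OQ-TC-01": ["URS-001"],              # each test lists the requirements it verifies
    "OQ-TC-02": ["URS-001", "URS-003"],
}

gaps = untested_requirements(requirements, test_map)   # requirements left untested
```

In practice this linkage is maintained in a traceability matrix; a scripted check of that matrix gives objective evidence that no requirement is left untested.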

The diagram below illustrates the structure of Operational Qualification testing as a requirement-driven and evidence-based process. It shows how user requirements are translated into test scripts, executed under controlled conditions, and evaluated based on objective evidence. The traceability layer demonstrates continuous linkage between requirements, test execution, and recorded results.

[Diagram: Operational Qualification Functional Testing Model, showing flow from user requirements to test scripts, test execution, system response, objective evidence, and pass/fail evaluation, with a traceability layer linking requirements to tests and evidence.]

4. Functional Verification

Functional testing confirms that system features operate according to defined requirements. Typical verification includes:

  • System messages and prompts: Confirm that system prompts, warnings, and confirmations are appropriate, clear, and triggered under correct conditions.
  • Workflow execution: Verify that defined workflows execute correctly, including status transitions, required steps, and enforcement of process sequence.
  • Data entry and processing: Confirm that data can be entered, saved, modified where permitted, and processed according to system rules without loss or corruption.
  • Calculation and logic verification: Validate that all calculations, formulas, and system logic produce correct and consistent results under defined conditions.
  • Configuration-driven behavior: Confirm that system configuration settings control functionality as intended, including enabling or restricting features.
  • Report generation and output accuracy: Verify that reports reflect accurate, complete, and current data, and that formatting and calculations are correct.

Each function must be tested with defined inputs and expected outputs, with results documented and verified.
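The calculation-verification pattern above can be sketched as a table of defined inputs and expected outputs executed against the function under test. `concentration()` is a hypothetical stand-in for a configured system calculation:

```python
# Hedged sketch: verifying a calculation against defined inputs and
# expected outputs. concentration() stands in for the real system
# function; the test cases and values are illustrative.

def concentration(mass_mg, volume_ml):
    return mass_mg / volume_ml

test_cases = [
    # ((inputs), expected output)
    ((100.0, 10.0), 10.0),
    ((50.0, 25.0), 2.0),
]

results = [(args, concentration(*args) == expected)
           for args, expected in test_cases]
```

Each row of the table becomes one documented test step with a pass/fail outcome.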


5. Data Integrity and Regulatory Controls

OQ must verify that the system enforces data integrity and regulatory compliance requirements. Key verification areas include:

  • audit trail generation and completeness
  • capture of original and modified values
  • timestamp accuracy and synchronization
  • user identification associated with actions
  • record protection against unauthorized changes

Testing must demonstrate compliance with 21 CFR Part 11 and ALCOA+ principles where applicable.
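A simple completeness check over exported audit-trail entries can support several of the verification areas above. The field names (`user`, `timestamp`, `old_value`, `new_value`) are assumptions about the export format, not a defined standard:

```python
# Sketch of an audit-trail completeness check. Field names are
# assumed for illustration; the actual export format is system-specific.

REQUIRED_FIELDS = {"user", "timestamp", "action", "old_value", "new_value"}

def audit_entry_complete(entry):
    """True only if the entry carries every required attribute."""
    return REQUIRED_FIELDS.issubset(entry)

entry = {
    "user": "jdoe",
    "timestamp": "2024-05-01T09:30:00Z",
    "action": "modify",
    "old_value": "7.2",
    "new_value": "7.4",
}
```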


6. Security and Access Control Testing

System security must be verified through controlled testing of access and permissions. Typical tests include:

  • Auditability of access events: Ensure that login attempts, access failures, and permission changes are recorded in the audit trail.
  • Role-based access enforcement: Verify that users can only access functions and data permitted by their assigned roles, and that restricted functions are inaccessible.
  • Unauthorized access prevention: Confirm that attempts to access the system without valid credentials or appropriate permissions are blocked.
  • Password policy enforcement: Verify that password complexity, expiration, reuse restrictions, and lockout rules are enforced as defined.
  • Session management controls: Confirm that session timeouts occur after defined periods of inactivity and that re-authentication is required.
  • Access changes and propagation: Verify that changes to user roles or permissions take effect correctly and do not allow residual access.
  • Segregation of duties: Confirm that critical functions cannot be executed by a single user where separation is required.

Negative testing must confirm that restricted actions cannot be performed.
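The role-based access and negative-testing expectations above can be sketched as a permission model challenged from both directions: permitted actions succeed, restricted actions are refused. The roles and action names are illustrative:

```python
# Minimal role-based access sketch for positive and negative testing.
# Role and permission names are hypothetical examples.

PERMISSIONS = {
    "analyst":  {"enter_data", "view_report"},
    "reviewer": {"view_report", "approve_record"},
}

def is_allowed(role, action):
    """True only if the role's permission set includes the action."""
    return action in PERMISSIONS.get(role, set())
```

A negative OQ test would attempt `approve_record` as an analyst, or any action with an unknown role, and record the refusal as evidence.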


7. Electronic Signatures

Where applicable, electronic signature functionality must be verified. Testing includes:

  • signature execution process
  • linkage of signature to record
  • capture of user identity and timestamp
  • enforcement of dual authentication where required

Electronic signatures must be permanently linked to the corresponding electronic records.
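One way to verify the signature-to-record linkage is to confirm that the signature embeds a hash of the signed record content, so any later change to the record invalidates the linkage. The record layout below is hypothetical:

```python
# Sketch of signature-to-record linkage verification: the signature
# stores a hash of the record, so tampering breaks the linkage.
# Record fields and values are illustrative.

import hashlib
import json

def sign(record, user, timestamp):
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return {"user": user, "timestamp": timestamp, "record_hash": digest}

def signature_valid(record, signature):
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return digest == signature["record_hash"]

record = {"batch": "B-123", "result": "7.4"}
sig = sign(record, "jdoe", "2024-05-01T09:30:00Z")
```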


8. Interface and Data Exchange Testing

Interfaces must be verified to ensure accurate and complete data transfer. Verification includes:

  • Interface configuration control: Confirm that interface parameters such as endpoints, credentials, and formats are correctly configured and controlled.
  • Data mapping verification: Confirm that data fields are correctly mapped between systems and that transferred data retains accuracy and meaning.
  • Transfer completeness: Verify that all expected data is transferred without omission, truncation, or duplication.
  • Trigger and timing behavior: Confirm that data transfers occur at the correct time, based on defined triggers or schedules.
  • Error detection and handling: Verify that failed or incomplete transfers are detected, logged, and flagged for investigation.
  • Prevention of duplicate or partial records: Confirm that system controls prevent creation of duplicate entries or incomplete data records.
  • Data reconciliation capability: Verify that transferred data can be reconciled between source and destination systems.

Exception handling must be tested to confirm that errors are detected, controlled, and traceable.
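The reconciliation and completeness checks above can be sketched as a comparison of record identifiers between source and destination, surfacing omissions, unexpected records, and duplicates. Identifiers are illustrative:

```python
# Sketch of source-to-destination reconciliation for interface testing:
# detect missing, unexpected, and duplicate record IDs.

from collections import Counter

def reconcile(source_ids, dest_ids):
    src, dst = Counter(source_ids), Counter(dest_ids)
    return {
        "missing":    sorted((src - dst).elements()),   # omitted in transfer
        "unexpected": sorted((dst - src).elements()),   # extra at destination
        "duplicates": sorted(i for i, n in dst.items() if n > 1),
    }
```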


9. System Behavior and Error Handling

OQ must verify system response under both expected and abnormal conditions. Testing includes:

  • system response to invalid inputs
  • error message clarity and accuracy
  • system stability during processing
  • handling of interrupted operations

All errors must be controlled and must not compromise data integrity.
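A negative test for invalid-input handling checks that out-of-range or malformed values are rejected with a clear error rather than accepted or silently truncated. `validate_ph()` below is a hypothetical stand-in for a system input check:

```python
# Sketch of invalid-input rejection. validate_ph() is a hypothetical
# example of a system-side input check; the limits are illustrative.

def validate_ph(value):
    if not isinstance(value, (int, float)):
        raise TypeError("pH must be numeric")
    if not 0.0 <= value <= 14.0:
        raise ValueError("pH must be between 0 and 14")
    return value
```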


10. Performance under Defined Conditions

OQ includes verification that the system performs within expected operational limits under controlled conditions. Typical checks include:

  • response time for key operations
  • processing time for data transactions
  • system behavior with multiple users
  • system stability during extended operation

Performance limits must be defined and verified where relevant.
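Where response-time limits are defined, a timed check of the operation against its limit yields an objective pass/fail result. The two-second limit and `process_transaction()` below are illustrative:

```python
# Sketch of a response-time check against a predefined limit.
# The operation and the limit are illustrative assumptions.

import time

def within_limit(operation, limit_seconds):
    start = time.perf_counter()
    operation()
    return (time.perf_counter() - start) <= limit_seconds

def process_transaction():
    time.sleep(0.01)   # stands in for the operation under test
```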


11. OQ Protocol Structure and Execution

Operational Qualification is executed using a detailed protocol or test plan that defines:

  • test scripts and step-by-step instructions
  • expected results for each step
  • acceptance criteria
  • traceability to requirements
  • evidence collection requirements

Each test must include:

  • test step
  • expected result
  • actual result
  • pass or fail determination
  • reference to objective evidence

Deviations must be documented, investigated, and resolved.
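The per-step record listed above can be sketched as a simple data structure; pass/fail follows from comparing actual against expected results, with a reference to the supporting evidence. Field values are illustrative:

```python
# Sketch of the minimum information recorded per executed test step,
# mirroring the fields listed above. Values are illustrative.

from dataclasses import dataclass

@dataclass
class TestStepResult:
    step: str
    expected: str
    actual: str
    evidence_ref: str          # e.g. a screenshot or audit-trail reference

    @property
    def passed(self):
        return self.actual == self.expected
```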


12. Acceptance Criteria

Acceptance criteria for OQ must be objective, measurable, and linked to requirements. Typical criteria include:

  • Traceable and complete evidence: All tests must be supported by clear, reviewable evidence demonstrating that acceptance criteria are met.
  • Functional completeness: All critical system functions must operate as defined in the requirements without errors or unexpected behavior.
  • Data integrity enforcement: Audit trails, record protection, and traceability must function correctly and consistently across all tested scenarios.
  • Security compliance: Access controls must prevent unauthorized actions and enforce defined user permissions without exception.
  • Accurate data processing: All calculations, data transformations, and outputs must produce correct and reproducible results.
  • Interface reliability: Data exchanges must be accurate, complete, and consistently executed without data loss or corruption.
  • System stability: The system must operate without crashes, data loss, or inconsistent behavior during testing.
  • Controlled error handling: Errors must be detected, communicated, and managed without compromising system integrity or data reliability.

All failures must be resolved or justified prior to system release.


13. Deliverables and Approval

Operational Qualification results in formal documentation demonstrating that the system operates according to defined requirements.

Typical deliverables include:

  • executed OQ protocol or test plan
  • completed test scripts
  • screenshots and audit trail records
  • deviation reports and resolutions
  • summary report

Approval confirms readiness for Performance Qualification, or for system release where PQ is not required.


14. Link to Ongoing Lifecycle Control

Operational Qualification establishes verified system functionality under controlled conditions.

This verified state must be maintained through:

  • change control
  • periodic review
  • requalification where required

Any change to functionality, configuration, interfaces, or security controls must be evaluated for impact and may require partial or full re-execution of OQ.