Verification and Validation Policy

EFFECTIVE DATE: October 10, 2016

APPROVED BY: Mike McCoy

LAST UPDATED: May 5, 2017

PURPOSE

The purpose of this document is to establish the policy defining the minimum expectations and success criteria for all configuration items and artifacts that require a quality review.

SCOPE OF THE POLICY

This policy applies to specified project artifacts related to Government software/services developed by ISI.

APPLICABILITY

Responsibility for conducting and documenting quality checks is divided among many different roles and is highly dependent upon the type of quality check (peer review, testing, demonstration, walkthrough, simulation, etc.) being conducted. Project Leaders (PL) are accountable for ensuring that all project personnel who conduct quality reviews adhere to this policy.

AUTHORITY AND COMPLIANCE

This policy is authorized by the COO. Compliance with this policy will be evaluated through an appraisal/audit process. Results will be provided to the appropriate personnel, and non-compliance and other issues will be submitted to the Business Unit Leads and the Executive team for remediation as part of the BUL/Executive Cadence.

POLICY STATEMENT

The Verification and Validation Policy establishes minimum expectations to ensure that quality checks, in the form of verification and validation methods, are performed on organizationally identified work products throughout the software development lifecycle.

InnovaSystems government projects shall:

  1. At a minimum, ensure the following quality checks are performed:

    • Peer Reviews
    • Testing (manual or automated)
    • Demonstrations
  2. Plan and schedule these quality checks for the following project work products:

    • Requirements (of all types)
    • Software Code
    • Test Cases
    • Software Supporting Documentation (SUM, SAM, On-line Help)
  3. Ensure appropriate verification/validation environments are available for the type of quality check to be conducted.

  4. Perform checks against a defined set of quality criteria using established organizational or project-derived procedures.

  5. For events such as peer reviews and demonstrations, document the stakeholders, relevant discussions, defects found, event time, and resulting actions (with due dates) based on the outcome of the review or demonstration, and store the record per the project’s specified artifact type and storage location.

  6. Ensure all action items are tracked and verify that defects were corrected prior to closure.

Process Guidance Version: 10.4