Trial By Fire Solutions / Internal Audit Checklist
Revision x.x

CSV Audit Checklist

1. Management

Aspect / What to Look For / Findings
Personnel assigned to the project are qualified based on appropriate education, training, and/or experience / Are the requirements for the various roles established?
Is there evidence (resumes, training files) that employees are qualified for their role(s)? Identify the various roles and sample from select types to assess.
If additional training was required, was training arranged, and are training records stored in personnel files?
Are the personnel records properly maintained, protected, readily retrievable, etc.?
The company has an SDLC / Is the SDLC defined?
Are the deliverables defined by the SDLC?
Is there a mechanism to establish traceability between the requirements and the tests?
Does the SDLC cover infrastructure (SW CM, change management, release management, etc.)?
Are employees trained / briefed on the SDLC?
Do affected employees demonstrate comprehension of the SDLC?
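As an illustration of the kind of requirements-to-test traceability spot-check an auditor might perform, the sketch below cross-references requirement IDs against the tests that claim to cover them. The requirement and test-case IDs are hypothetical; the project's actual trace mechanism may be a matrix, a tool, or document cross-references.

```python
# Illustrative traceability spot-check. Requirement IDs and the
# test-to-requirement mapping below are hypothetical examples.

requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

test_coverage = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-002", "REQ-003"},
}

covered = set().union(*test_coverage.values())
untraced = requirements - covered   # requirements with no covering test
orphaned = covered - requirements   # tests citing unknown requirements

print("Untraced requirements:", sorted(untraced))  # → ['REQ-004']
print("Orphaned references:", sorted(orphaned))    # → []
```

Any untraced requirement or orphaned reference is a candidate audit finding.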

2. Planning

Aspect / What to Look For / Findings
Is a SW Development Plan established? / Is the plan documented?
Is the plan baselined (approved)?
Is the plan kept up-to-date?
How are changes to the plan communicated to the team?
Do affected employees have access to the plan?
Are affected employees using the latest plan?
Plan Infrastructure / Does the plan define, as applicable, standards that apply, methods used, and tools used?
Is there evidence that applicable standards are being followed?
Is there evidence that the defined methods are used (and only the defined methods used)?
Is there evidence that the defined tools are used (and only the defined tools used)?
Artifacts / Does the plan define the artifacts to capture throughout the process?
List the reviews indicated by the Plan
Are all the defined artifacts captured per the Plan?
CM Planning / Is the CM planned (may be a separate document)?
Is the responsibility for CM defined?
Are the methods / tools for CM defined?
Does the plan incorporate CM methods for external software (e.g., libraries, COTS / SOUP)?
Maintenance / Is maintenance planned?
Does maintenance address the procedures for receiving & documenting issues / complaints?

3. Development

Aspect / What to Look For / Findings
Requirements / Are requirements established for the system?
How are requirements controlled?
How are requirements communicated to the team?
How are changes to the requirements managed?
Were the requirements reviewed for completeness, consistency, verifiability, etc.?
Were security requirements established?
Were requirements approved before implementation (possibly incrementally)?
Is the requirements change history captured?
Is there any evidence of outdated specifications being used?
Risk Management / ?????? Need to think through this a bit…
Interfaces / Did the requirements and/or design define the interfaces between the major components (internal)?
Did the requirements and/or design define the interfaces between internal and external components?
Database / Is the schema documented and controlled?
Design / Were the design artifacts developed and captured per the Plan?
Were design reviews conducted per the Plan?
Are the artifacts (including design review records) being controlled per Document / Record control procedures?
Implementation / If coding standards were used, were they followed? Consistently / throughout?
If unit testing was required (full or subset), were unit test records captured and controlled per document control procedures?
If unit testing was required, was objective evidence captured to confirm the unit meets the pre-defined acceptance criteria?
If changes were made to the unit after unit testing (and unit testing was required), was unit testing re-done?
Was unit testing completed per the Plan?
Integration / Is integration conducted per the Plan?
If integration testing was required (full or subset), were integration test records captured and controlled per document control procedures?
If integration testing was required, was objective evidence captured to confirm the integrated components meet the pre-defined acceptance criteria?
If changes were made to components involved in integration testing (and integration testing was required), was integration testing re-done?
Was integration completed per the Plan?
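The "objective evidence" sought above typically means a test record that ties a pre-defined acceptance criterion to an actual result, a tester, and a date. A minimal sketch, with a hypothetical unit under test (`scale_dose`) and illustrative criteria:

```python
# Illustrative unit-test record showing the elements of objective
# evidence: a criterion defined before execution, the actual result,
# pass/fail, tester, and date. The unit under test is hypothetical.
import datetime

def scale_dose(dose_mg: float, factor: float) -> float:
    """Hypothetical unit under test."""
    return round(dose_mg * factor, 2)

def run_unit_test(tester: str) -> dict:
    expected = 12.5  # acceptance criterion defined BEFORE execution
    actual = scale_dose(10.0, 1.25)
    return {
        "unit": "scale_dose",
        "criterion": f"scale_dose(10.0, 1.25) == {expected}",
        "actual": actual,
        "result": "PASS" if actual == expected else "FAIL",
        "tester": tester,
        "date": datetime.date.today().isoformat(),
    }

print(run_unit_test("J. Tester"))
```

A record missing any of these elements (especially the pre-defined criterion) would not demonstrate conformance.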

4. V&V

Aspect / What to Look For / Findings
Verification / Validation / Was V&V planned (could be part of the overall plan or maintained in a separate plan)?
Were V&V activities conducted according to the Plan?
Were the V&V artifacts captured per the Plan?
Are the V&V artifacts managed per Document Control procedures?
Were acceptance criteria defined for V&V?
Were V&V protocols approved before execution?
Was objective evidence captured to show conformance to the acceptance criteria?
Were the tester's identity and the date the test was conducted captured?
Was sufficient information about the test / environment captured?
Were results captured in accordance with GDP?
For update releases, was a V&V Plan established that defines the minimum (re)test requirements?
For update releases, was the V&V Plan followed?
For update releases, were the V&V artifacts captured and managed per document control?
Were non-conformances (test failures, protocol deviations, anomalies) from V&V documented and managed per document control?
Were all issues from V&V activities tracked to closure?
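Tracking V&V issues to closure reduces to confirming that no issue remains in an open state at release. A trivial sketch of that check, with hypothetical issue IDs and statuses:

```python
# Illustrative closure check for issues raised during V&V.
# Issue IDs and statuses are hypothetical examples.
issues = [
    {"id": "VV-101", "status": "closed"},
    {"id": "VV-102", "status": "closed"},
    {"id": "VV-103", "status": "open"},
]

open_issues = [i["id"] for i in issues if i["status"] != "closed"]
print("Issues not closed:", open_issues)  # → ['VV-103']
```

In practice the auditor would pull this list from the project's issue tracker and reconcile it against the release record.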

5. Operations

Aspect / What to Look For / Findings
Service Providers / Has the service provider been qualified?
Has the physical security (premises) provided by the service provider been validated?
Has the logical security (“hacking”) provided by the service provider been validated?
Configuration Management / Are the classes, types, categories, or lists of items to be controlled as defined in the Plan being controlled per the Plan?
Is the responsibility for CM being carried out by those designated in the Plan?
Are the methods / tools for CM as defined in the Plan being used?
Are only the methods / tools for CM as defined in the Plan used?
How are baselines identified (e.g., tagged)?
Are baselines fully identified? Should include code base as well as support software, libraries, etc.
Change Control / Can specific changes (to a baselined component) be associated with a change approval vehicle?
Are only the approved changes implemented for a particular change approval vehicle?
For a specific change, was the change verified?
Issue tracking / Were issues to baselined elements documented and tracked to closure?
Are all remaining known issues documented?
Are all known issues with non-developed software documented and assessed for impact?
Tools / Identify the tools used in the development and management of the system.
Has an assessment been made as to which tools require validation?
For each tool requiring validation, has the validation been conducted?
For each validation, have the periodic reviews been conducted to ensure the tool remains in a validated state?
For all tools, are updates assessed before roll-out?
For all tool updates, is impact coordinated with the affected users?
For all tools, are the tools adequately protected from change (deliberate or unintentional)?
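One way a baseline can be "fully identified" (code base plus support software and libraries) is a manifest listing every controlled item with a cryptographic hash; comparing manifests then reveals any change. A minimal sketch, with illustrative file paths:

```python
# Sketch of a baseline manifest: each controlled item (source files,
# libraries, support software) mapped to its SHA-256 digest.
# File paths supplied by the caller are illustrative.
import hashlib
import pathlib

def baseline_manifest(paths):
    """Return {path: sha256-hex} for every item in the baseline."""
    manifest = {}
    for p in paths:
        digest = hashlib.sha256(pathlib.Path(p).read_bytes()).hexdigest()
        manifest[str(p)] = digest
    return manifest
```

Two manifests taken at different times can be diffed to show exactly which baselined items changed, supporting the change-control questions above.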

6. Release / Post-Release

Aspect / What to Look For / Findings
Installation / Are the installation (and check-out) procedures defined?
Were the installation procedures followed for each installation?
Release / For each release, were the required artifacts provided (e.g., VDD, updated User information, etc.)?
Was an archive made for each release?
Did the archive contain the complete configuration, including compilers, SOUP / COTS, IDE, etc.?
Is the archive readily identifiable and retrievable?
Is the archive safely protected from change?
Is the archive safely secured from damage?
Are users made aware of all known issues with the release?
Were update releases coordinated with the customers to minimize impact?
Issues / Are records regarding post-release issues documented and tracked to closure?
Are records from post-release issues managed per document control?
Are post-release issues assessed for impact?
If post-release issue impact analysis determines a likely impact on use / users, are users notified?
Support Software / Are periodic checks being made to determine if new issues have arisen with support software?
If new issues with support software have been identified, are they being assessed for (potential) impact?
If new issues with support software have been identified and determined to have a likely impact on use / users, are users being notified?
Continuity / Are backups being conducted to ensure preservation of the system deployed?
Are backups being conducted to ensure preservation of the customer’s data?
Has the backup system been validated?
Are backups stored in a separate location from the source?
Are backups stored in an environmentally-controlled location?
Is there an agreement / plan for retention of backup data?
How much data could be lost in the event of a catastrophe? Is this acceptable?
Is this worst-case data loss understood by the customer(s)?
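The worst-case data loss question can be answered with simple arithmetic: if backups complete every N hours, a catastrophe just before the next backup loses up to N hours of data. A back-of-envelope sketch, with an assumed nightly backup schedule and illustrative timestamps:

```python
# Back-of-envelope worst-case data loss (recovery point) check.
# The backup interval and timestamps below are illustrative assumptions.
from datetime import datetime, timedelta

backup_interval = timedelta(hours=24)         # assumed nightly backups
last_backup = datetime(2024, 1, 10, 2, 0)     # last completed backup
failure_time = datetime(2024, 1, 10, 18, 30)  # hypothetical failure

worst_case_loss = backup_interval             # up to one full interval
actual_exposure = failure_time - last_backup  # data at risk right now

print(f"Worst-case loss window: {worst_case_loss}")  # → 1 day, 0:00:00
print(f"Current exposure: {actual_exposure}")        # → 16:30:00
```

The audit question is whether this worst-case window has been computed, accepted, and communicated to the customer(s).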

Reviewed by (Project Manager): ______ Date: ______

Signature by Project Manager indicates that the findings have been reviewed and accurately represent the project status on the date signed.

Auditor (sign): ______ Date: ______

Confidential