4Phones WEBREP test plan

Version 1.13

Preface

This test plan provides a complete framework for testing throughout the software development lifecycle. Using consistent testing and recording techniques across the entire system minimises the chance that errors in the system go undiscovered and uncorrected.

Table of contents

Preface

1 Introduction

1.1 Objectives

1.2 Background

1.3 Scope

1.4 References and documents

1.5 Distribution

2 Test items

2.1 Program modules

3 Features to be tested

3.1 Requirements documentation

3.2 Unit testing

3.3 Integration testing

3.4 System testing

3.5 Acceptance testing

4 Features not to be tested

4.1 4Phones financials system

4.2 Other modules of E-Commerce project

5 Approach

5.1 Ambiguity review

5.2 Unit tests

5.3 Integration tests

5.4 System tests

5.5 Smoke tests

5.6 Regression tests

6 Item pass/fail criteria

7 Test deliverables

8 Suspension criteria

8.1 Suspension criteria

8.2 Resumption criteria

9 Environmental needs

9.1 Hardware

9.2 Software

10 Responsibilities

10.1 Test group

10.2 Development team

11 Staffing and training needs

11.1 Test group

12 Schedule

13 Risks and contingencies

14 Testing tasks

15 Approvals

Appendix A — ambiguity review checklist

Appendix B – unit test cases

Unit test procedure specification 1301

Procedure steps

Appendix C — integration test cases

Integration test procedure specification 1401

Procedure steps

Appendix D — system test cases

Appendix E – acceptance test cases

Acceptance test procedure specification 1601

Procedure steps

Appendix F — test schedule

1 Introduction

1.1 Objectives

The test plan for the WEBREP project is intended to assist in the organisation and execution of testing by:

  • Identifying the responsibilities for testing each part of the software.
  • Identifying what parts of the software are to be tested.
  • Identifying how testing is to be conducted, reported, analysed and acted upon.
  • Ensuring comprehensive testing coverage of the software.
  • Ensuring the software is of high quality, will replace or support the intended business functions and meets the standards required by the company for the development of new systems.

1.2 Background

WEBREP forms the reporting component of the 4Phones E-Commerce website.

Franchisees and 4Phones staff will be able to access the WEBREP system and obtain reporting information about their sales and customers for the current period and for monthly histories. This will allow them to interrogate the data to identify the products they have sold, the types of customers and buying patterns. They will also be given prospective customer information from promotions and advertising. Most of this information will come from the current financial system on a monthly basis.

1.3 Scope

This test plan is for the 4Phones WEBREP system being developed by the WEBREP project team. Testing will comprise requirements, unit, integration, system and acceptance testing for the program modules that make up the WEBREP software.

1.4 References and documents

  • ANSI/IEEE Std 829–1998, IEEE standard for software test documentation
  • WEBREP business requirements report document — cp_2.htm
  • WEBREP data flow diagram — cp_4.htm
  • WEBREP functional hierarchy diagram — cp_5.htm
  • WEBREP regional buying patterns report — Excel spreadsheet — QMS_CP8.xlw
  • WEBREP regional buying patterns data entry prototype – cp_9.htm
  • WEBREP E-Commerce website development — start and end dates — cp_11.htm
  • WEBREP system test plan version 1.06
  • WEBREP test plan schedule – Gantt chart
  • WEBREP system test schedule – Gantt chart

1.5 Distribution

David Blair, IT Manager; Eloise de Silva, Project Manager; Andrea Williams, Systems Administrator; John Green, Network Administrator; Tony Meyer, Business Analyst; Greg Carr, Business Analyst; Elaine Smith, Analyst Programmer; Greg O'Neill, Analyst Programmer; Bea Pedantic, Software Tester (Contractor).

2 Test items

All items that produce the WEBREP reports are considered correct if they comply precisely with the following documents:

  • WEBREP business requirements report document — cp_2.htm
  • WEBREP data flow diagram — cp_4.htm
  • WEBREP functional hierarchy diagram — cp_5.htm
  • WEBREP regional buying patterns report — Excel spreadsheet — QMS_CP8.xlw
  • WEBREP regional buying patterns data entry prototype — cp_9.htm
  • WEBREP regional product growth data entry prototype — cp_9.htm

The versions to be tested will be placed in the appropriate libraries by the configuration administrator. The administrator will also control changes to the versions under test and notify the test group when new versions are available.

2.1 Program modules

The following modules have been identified as the components that make up the WEBREP project.

Type / Name
Executable / WEBREPInterface.exe
Report File / RegionalBuyingPatterns.rpt
Report File / RegionalProductGrowth.rpt

3 Features to be tested

3.1 Requirements documentation

The requirements specification is to be tested to ensure that no ambiguities occur in functional requirements and to improve the quality of those requirements (requirements-based testing methodology). An ambiguity review will be conducted to ensure that all requirements are testable.

3.2 Unit testing

  • All source code for WEBREPInterface.exe, including any class libraries created for WEBREPInterface.exe
  • Report files RegionalBuyingPatterns.rpt and RegionalProductGrowth.rpt

Covering test items: TI8001, TI8002… etc.

3.3 Integration testing

As all other E-Commerce modules operate independently of WEBREP, they can be excluded from integration testing. Integration testing will cover WEBREP's ability to retrieve requested data from the 4Phones financials system and its integration with the 4Phones network security system, with particular emphasis on the data passed between the 4Phones financials database, WEBREPInterface.exe, the browser client, printers and the report files RegionalBuyingPatterns.rpt and RegionalProductGrowth.rpt.

Covering test items: TI9001, TI9002… etc.

3.4 System testing

System testing will comprise functionality, usability, performance, multiple-site and disaster recovery testing. These are detailed in the WEBREP system test plan and the WEBREP system test schedule (in Appendix B of the system test plan).

3.5 Acceptance testing

Acceptance testing will be conducted in conjunction with an end user (a 4Phones staff member) to ensure that WEBREP functions according to the business requirements and the use cases generated from those requirements. All elements of the business requirements report are to be tested. The acceptance test procedure specification can be found in Appendix E of this document.

Covering test items: TI0024, TI0025… etc.

4 Features not to be tested

4.1 4Phones financials system

Programs and data in this system function independently of WEBREP, apart from acting as the data source for WEBREP. It is assumed that all data validation, data integrity and program behaviour are correct and maintained independently of WEBREP within the financials system. The one exception is the data validation routines built into WEBREP, which are unit tested within WEBREP.

4.2 Other modules of E-Commerce project

The WEBREP module functions independently of the other 4Phones E-Commerce Project modules, such as WEBIN and WEBORD, as data for the reports is extracted directly from the 4Phones Financials System.

5 Approach

5.1 Ambiguity review

The test team will conduct an ambiguity review to ensure that all requirements are testable. This will involve checking every requirement against the ambiguity checklist. Any requirement that fails this review will be reworked until it passes. The ambiguity review checklist can be found in Appendix A of this document.

5.2 Unit tests

It is the responsibility of the development team to:

  • design and build drivers or stubs for this unit testing process
  • conduct the unit tests
  • record and report test metrics from the code coverage analyser software to the test manager
  • ensure adequate code coverage of unit testing

The unit test procedure specification can be found in Appendix B of this document.
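
By way of illustration only, the following sketch shows the kind of driver the development team might write for the date-validation unit test described in Appendix B. The function IsValidEntryDate is a hypothetical stand-in for the validation logic assumed to sit inside WEBREPInterface.exe; it is not the production code.

    Option Explicit

    ' Hypothetical stand-in for the date-validation logic assumed to be inside
    ' WEBREPInterface.exe; only the 29 February boundary is modelled here.
    Function IsValidEntryDate(dayPart, monthPart, yearPart)
        Dim leap
        leap = (yearPart Mod 4 = 0 And yearPart Mod 100 <> 0) Or (yearPart Mod 400 = 0)
        If monthPart = 2 And dayPart = 29 Then
            IsValidEntryDate = leap
        Else
            IsValidEntryDate = True   ' other boundaries are out of scope for this sketch
        End If
    End Function

    ' Driver: 29 February 2003 must be rejected, 29 February 2004 accepted.
    WScript.Echo "29/02/2003 valid: " & CStr(IsValidEntryDate(29, 2, 2003))
    WScript.Echo "29/02/2004 valid: " & CStr(IsValidEntryDate(29, 2, 2004))

Run under cscript.exe, the driver echoes the result of each boundary case so it can be recorded in the test log.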

5.3 Integration tests

The test team will develop test cases to verify that the components of WEBREP function together correctly. Specifically, the test cases generated will cover the interaction of the 4Phones financials database, WEBREPInterface.exe, client browsers (intranet and internet), printers and the report files RegionalBuyingPatterns.rpt and RegionalProductGrowth.rpt.

The integration test procedure specification can be found in Appendix C of this document.
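
As a sketch only, the data comparison between a WEBREP report and the corresponding financials extract could be partially automated along the following lines, assuming both sides can export the report data as plain text files (the file names below are placeholders).

    Option Explicit

    ' Compare a WEBREP report export with the matching extract from the
    ' 4Phones financials system, line by line (file names are placeholders).
    Dim fso, webrep, financials, lineNo, a, b
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set webrep = fso.OpenTextFile("webrep_export.csv", 1)          ' 1 = ForReading
    Set financials = fso.OpenTextFile("financials_extract.csv", 1)

    lineNo = 0
    Do While Not webrep.AtEndOfStream And Not financials.AtEndOfStream
        lineNo = lineNo + 1
        a = webrep.ReadLine
        b = financials.ReadLine
        If a <> b Then WScript.Echo "Mismatch at line " & lineNo
    Loop
    If Not (webrep.AtEndOfStream And financials.AtEndOfStream) Then
        WScript.Echo "Files differ in length"
    End If
    webrep.Close
    financials.Close

Any mismatches reported by a comparison of this kind would be raised as test incident reports in the usual way.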

5.4 System tests

System testing will encompass functionality, usability, performance, multiple-site and disaster recovery testing. The system test methodology can be found in the WEBREP system test plan.

5.5 Smoke tests

The test team will select or develop the test cases to be used for smoke tests. Smoke test cases are simple but critical cases that exercise the software from end to end; failure of any of them indicates the software is not ready for further testing. Smoke tests will be implemented in unit, integration and system testing.
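
By way of illustration only, the sketch below shows one possible smoke case: it simply checks that the WEBREP report page can be reached end to end. The URL is an assumption, not the real deployment address, and the script signals failure with a non-zero exit code so it can be picked up by an automated run.

    Option Explicit

    ' Smoke case sketch: confirm the WEBREP report page is reachable
    ' (the URL is an assumption, not the real deployment address).
    Dim http
    Set http = CreateObject("MSXML2.XMLHTTP")
    http.Open "GET", "http://ecommerce.4phones.example/webrep/reports", False
    http.Send

    If http.Status = 200 Then
        WScript.Echo "SMOKE PASS: report page reachable"
        WScript.Quit 0
    Else
        WScript.Echo "SMOKE FAIL: HTTP status " & http.Status
        WScript.Quit 1   ' non-zero exit code flags the failure to a calling script
    End If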

5.6 Regression tests

Automated unit, integration, system and smoke tests will form the regression test suite. The test and development teams will develop the automated scripts, software stubs and data needed to run regression tests against nightly builds and version releases.
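
A nightly regression run could be driven by a small script along the lines of the sketch below. The individual test script names are placeholders, and each script is assumed to signal failure with a non-zero exit code, as in the smoke-case sketch in section 5.5.

    Option Explicit

    ' Nightly regression run sketch: execute each automated test script with
    ' cscript and count failures (the script names are placeholders).
    Dim shell, script, failures
    Set shell = CreateObject("WScript.Shell")
    failures = 0

    For Each script In Array("smoke_login.vbs", "smoke_report_open.vbs", _
                             "regress_date_ranges.vbs")
        ' Run synchronously; a non-zero exit code is treated as a failure.
        If shell.Run("cscript //nologo " & script, 1, True) <> 0 Then
            failures = failures + 1
            WScript.Echo "FAIL: " & script
        End If
    Next

    WScript.Echo failures & " failure(s) in this run"

Any smoke-test failure reported by such a run would trigger the suspension criteria in section 8.1.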

6 Item pass/fail criteria

A test of a component or item passes if the component or item complies with the relevant documentation. Each test result will be verified by at least two members of the project group.

7 Test deliverables

The testing phase of development will generate the following documents:

Test documentation:

  • Test plan
  • Test procedure specifications
  • Test logs
  • Test summary report

Test data:

  • Copies of all data entry and reply screens
  • Copies of database tables.

8 Suspension criteria

8.1 Suspension criteria

Testing will be suspended if WEBREPInterface.exe or the report files:

  • cause corruption to the indexes or data of the 4Phones financials system
  • generate operating system faults on the test platforms
  • generate a load on the system hardware that affects other 4Phones users (such as excessive network traffic)
  • cause a failure in a smoke test.

8.2 Resumption criteria

Testing will resume once a new build of WEBREPInterface.exe or the .rpt report files that resolves the issue or issues causing the suspension has been distributed to the test team.

9 Environmental needs

9.1 Hardware

The test server will be of similar specification to the 4Phones web servers; currently this is a dual 2 GHz Pentium 4 server with 1 GB of RAM. The hardware specification of the client machines is independent of this test, provided they meet the minimum specification for the current client web browser.

9.2 Software

Operating system

  • Server — Windows Server 2003 with the same service packs as the 4Phones web servers.
  • Clients — Microsoft Windows XP with Service Pack 2.

Tools

Microsoft VBScript, BugTraker database, Windows Performance Monitor.

Software to be tested

WEBREPInterface.exe, RegionalBuyingPatterns.rpt, RegionalProductGrowth.rpt

10 Responsibilities

10.1 Test group

The test group provides overall management of the test process and testing expertise:

  • Tony Meyer — business analyst
  • Greg Carr — business analyst
  • Bea Pedantic — software tester.

10.2 Development team

This team produces the software to be tested and responds to system test incident reports:

  • Elaine Smith — analyst programmer
  • Greg O’Neill — analyst programmer.

11 Staffing and training needs

11.1 Test group

  • Test Manager — Greg Carr
  • Senior Test Analyst — Tony Meyer
  • Test Analysts — Elaine Smith, Bea Pedantic

12 Schedule

See Appendix F of this document.

13 Risks and contingencies

If the testing schedule is significantly affected by system failure, the Project Manager, Eloise de Silva, has agreed to provide a full-time analyst programmer to assist with debugging.

If hardware problems impact the system during the day, the test team has agreed to reschedule testing for the evening.

14 Testing tasks

The following table outlines the tasks to be completed, the order in which they must be completed and the people responsible for each.

Table 1: Testing timetable and responsibilities

No. / Task / Predecessor tasks / Responsibility
1 / Prepare test plan / None / Greg Carr, Tony Meyer
2 / Prepare test procedure specifications / 1 / Bea Pedantic, Elaine Smith, Tony Meyer
3 / Conduct unit testing / 2 / Greg O'Neill, Elaine Smith
4 / Integration testing / 3 / Greg O'Neill, Elaine Smith, Bea Pedantic
5 / System testing (see system test plan and schedule for details) / 4 / Greg Carr, Tony Meyer, Elaine Smith, Bea Pedantic
6 / User acceptance testing / 5 / Bea Pedantic, Tony Meyer

15 Approvals

IT Manager
______________________ Date ____/____/____

Project Manager
______________________ Date ____/____/____

Appendix A — ambiguity review checklist

The ambiguity review checklist is made up of 15 classes of ambiguities that are commonly found in requirements. It is used to manually review requirements and identify all of the ambiguities so they can be eliminated.

Ambiguity review checklist

1 / The dangling Else / 
2 / Ambiguity of reference / 
3 / Scope of action / 
4 / Omissions / 
5 / Ambiguous logical operators / 
6 / Negation / 
7 / Ambiguous statements / 
8 / Random organisation / 
9 / Built-in assumptions / 
10 / Ambiguous precedence relationships / 
11 / Implicit cases / 
12 / Etc. / 
13 / I.E. versus E.G. / 
14 / Temporal ambiguity / 
15 / Boundary ambiguity / 

Note: For a more complete description of the ambiguity review process and checklist, search the net for the ‘Bender-Ambiguity Review White Paper’.

Appendix B – unit test cases

Unit test procedure specification 1301

Test procedure specification identifier

Test leap year handling in WEBREPInterface.exe

Purpose

Determine whether leap years are handled correctly by WEBREPInterface.exe.

Special requirements

In order to conduct this test, the necessary hardware needs to be available and operational.

A software stub must be written and tested to supply WEBREPInterface.exe with 29 February in a non-leap year.
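
A minimal sketch of such a stub is shown below. The function name is an assumption, and in practice the stub would be wired into the test build in place of the real date-entry routine.

    Option Explicit

    ' Stub sketch: always supplies 29 February of a non-leap year so the
    ' date-validation path can be exercised in isolation. StubGetEntryDate
    ' is an assumed name, not a routine taken from WEBREPInterface.exe.
    Function StubGetEntryDate()
        StubGetEntryDate = "29/02/2003"
    End Function

    WScript.Echo "Stub supplies: " & StubGetEntryDate()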

Procedure steps

Log

Record the execution of this procedure on the test log labelled ‘test log 1301.’

Set up

1 Latest build of WEBREPInterface.exe to be compiled on developer's machine.

2 Software stub for date entry compiled, tested and ready for use.

Proceed

1 Start WEBREPInterface.exe on developer's machine.

2 Start test stub on developer's machine.

3 Enter date 29/02/2003 into test stub.

4 Record error message, if any.

Measure

The error message should read 'That date is not permitted' and the cursor should return to the date field in the test stub. If so, the test is marked as a pass; otherwise it is marked as a fail.

Wrap up

Terminate WEBREPInterface.exe and test stub programs. Complete log.

Appendix C — integration test cases

Integration test procedure specification 1401

Test procedure specification identifier

Test parameter passing between WEBREPInterface.exe and report files.

Purpose

Determine whether dates are passed correctly between WEBREPInterface.exe and the report file RegionalBuyingPatterns.rpt.

Special requirements

Necessary test environment hardware needs to be available and operational.

  • Latest builds of WEBREPInterface.exe and RegionalBuyingPatterns.rpt installed.
  • User login required with access to WEBREP module.

Procedure steps

Log

Record the execution of this procedure on the test log labelled ‘test log 1401’.

Set up

Have WEBREP module running in browser on test machine.

Proceed

1 Log in as the dummy user created for testing.

2 Start reports from the menu.

3 Select 'Regional Buying Patterns' report.

4 Enter start and end dates for the current month.

5 Select 'Run'.

6 Wait for the 'Regional Buying Patterns' report to open in the browser window.

7 Confirm data is correct for the given date range by comparing it with Excel reports run from the 4Phones financials system for the same date range.

Measure

Print the report from the browser window and the Excel report from the 4Phones financials system; the test passes if the two reports match for the given date range.

Wrap up

Log dummy user out of system. Complete log.

Appendix D — system test cases

See Appendix A in System Test Plan for details of System Test Cases.

Appendix E – acceptance test cases

Acceptance test procedure specification 1601

Test procedure specification identifier

Generate date-driven regional buying sales report in timely fashion for dial-up users.

Purpose

Determine if the regional buying report can be generated for the specified dates within 30 seconds of the initial request.

Special requirements

In order to conduct this test the necessary hardware needs to be available and operational.

  • User login for a franchisee needs to have been created and known to be functioning with regards to access and printing.
  • An established dial-up link of no less than 28.8K to the Internet.
  • Stopwatch, to record start and end times of report generation.

Procedure steps

Log

Record the execution of this procedure on the test log labelled ‘test log 1601.’

Set up

  • Established network access to the 4Phones financials system via a dial-up modem (minimum connection speed is 28.8K)
  • Log in as a 4Phones franchisee via an Internet browser outside the 4Phones offices

Proceed

1Select ‘reports’ from menu

2Select ‘regional buying sales report’

3Select June date range in the report criteria

4Select ‘Go’

5Start stopwatch

6Once data is received in browser window, stop the stopwatch.

Measure

1 Confirm that data in browser is correct for the June date range.

2 Record the time taken to generate the report. If it is less than 30 seconds and the data is correct, the test is recorded as a pass (a supplementary automated timing sketch follows).
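
As a supplementary cross-check of the 30-second target, and not a replacement for the manual dial-up procedure above, the request could also be timed with a short script along these lines. The report URL and date parameters are assumptions, and franchisee login and session handling are omitted.

    Option Explicit

    ' Timing sketch: request the report and measure elapsed seconds against
    ' the 30-second target (the URL and query string are assumptions).
    Dim http, started, elapsed
    Set http = CreateObject("MSXML2.XMLHTTP")

    started = Timer
    http.Open "GET", "http://ecommerce.4phones.example/webrep/RegionalBuyingPatterns" & _
              "?start=2004-06-01&end=2004-06-30", False
    http.Send
    elapsed = Timer - started

    WScript.Echo "HTTP status: " & http.Status & ", elapsed seconds: " & elapsed
    If http.Status = 200 And elapsed < 30 Then
        WScript.Echo "PASS (under 30 seconds)"
    Else
        WScript.Echo "FAIL"
    End If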

Wrap up

Log franchisee user out of system and complete log.

Appendix F — test schedule

Figure 1: Test schedule Gantt chart
