Quality Assurance Test Plan

Project Name: Home Based Practice Software Testing
Md Zahirul Hoque Bhuiyan
8/30/2014
Version No. 2.0

This document is a test plan for the functionalities of the website; it describes the test strategy and the features under test in detail.


Document Information and Approvals

Version History
Version # / Date / Revised By / Reason for change
1.0 / PMO Review
Document Approvals
Approver Name / Project Role / Signature/Electronic Approval / Date


Table of Contents

Introduction

Scope

Test Objectives

Testing Goals

What to be tested

What not to be tested

Test Methodology

Entrance Criteria

Exit Criteria

Test Execution

Test Scenarios

Test Case/Script Development

Defect Reporting

Test Environment

Requirements

Testing Platform

User Acceptance Test Plan

Definition

Testing Requirements

Testers/Participants

Testing Schedule

Assumptions and Risks

Assumptions

Risks

Go/No-go Meeting

Roles and Responsibilities

Sign-off and Acknowledgement

Test Director – Defect Tracking Process


Introduction

This document develops a strategic test plan for the Hotwire group, which is adding new functionality to the existing functionality on its official website. The plan lays out a road map for how the testing will be conducted.

Scope

The overall purpose of testing is to ensure that the web application meets all of its technical, functional and business requirements. The purpose of this document is to describe the overall test plan and strategy for testing the website application. The approach described in this document provides the framework for all testing related to this application. Individual test cases will be written for each version of the application that is released. This document will also be updated as required for each release.

Test Objectives

The quality objectives of testing the application are to ensure complete validation of the business and software requirements:

  • Verify software requirements are complete and accurate
  • Perform detailed test planning
  • Identify testing standards and procedures that will be used on the project
  • Prepare and document test scenarios and test cases
  • Regression testing to validate that unchanged functionality has not been affected by changes
  • Manage defect tracking process
  • Provide test metrics/testing summary reports
  • Schedule Go/No Go meeting
  • Require sign-offs from all stakeholders

Testing Goals

The goals in testing this application include validating the quality, usability, reliability and performance of the application. Testing will be performed from a black-box approach, not based on any knowledge of internal design or code. Tests will be designed around requirements and functionality.

Another goal is to make the tests repeatable for use in regression testing during the project lifecycle, and for future application upgrades. A part of the approach in testing will be to initially perform a ‘Smoke Test’ upon delivery of the application for testing. Smoke Testing is typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing frequently, or corrupting databases, the software is not in a stable enough condition to warrant further testing in its current state. This testing will be performed first. After acceptance of the build delivered for system testing, functions will be tested based upon the designated priority (critical, high, medium, low).
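The smoke-test gate described above can be sketched as a small harness. The check names, the lambda stand-ins, and the all-checks-must-pass acceptance rule below are illustrative assumptions, not part of this plan:

```python
# Minimal smoke-test harness (sketch): run a handful of quick checks
# against a new build and decide whether it is stable enough to accept
# for the major testing effort.

def run_smoke_test(checks):
    """Run each (name, check_fn) pair; return (results, accept_build)."""
    results = {}
    for name, check in checks:
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False  # a crash counts as a failed check
    # Accept the build for full system testing only if every check passes.
    return results, all(results.values())

# Example checks (hypothetical stand-ins for real probes of the build):
checks = [
    ("home page loads", lambda: True),
    ("sign-in form renders", lambda: True),
    ("database reachable", lambda: False),
]
results, accept = run_smoke_test(checks)  # accept is False: build rejected
```

A real harness would replace the lambdas with HTTP or browser probes, but the gating logic stays the same.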

What to be tested:

The following features of the website will be tested for accuracy

Home page

Hotels Menu

Cars Menu

Flights Menu

Cruises Menu

Deals Menu

Activities Menu

Inside the Home page, user registration and the sign-in process will be tested in detail. Other options will be tested only for the user interface. The Deals, Activities and Cruises menus will be tested for user interface design.

What not to be tested:

As this is a practice testing effort, all testing will be done to demonstrate the testers' command of the procedures. No deep testing of features, such as purchasing with a credit card, will be performed. Load testing will not be done.

Quality

Quality software is reasonably bug-free, meets requirements and/or expectations, and is maintainable. Testing the quality of the application will be a two-step process of independent verification and validation. First, a verification process will be undertaken involving reviews and meetings to evaluate documents, plans, requirements, and specifications to ensure that the end result of the application is testable, and that requirements are covered. The overall goal is to ensure that the requirements are clear, complete, detailed, cohesive, attainable, and testable. In addition, this helps to ensure that requirements are agreed to by all stakeholders.

Second, actual testing will be performed to ensure that the requirements are met. The standard by which the application meets quality expectations will be based upon the requirements test matrix, use cases and test cases to ensure test case coverage of the requirements. This testing process will also help to ensure the utility of the application – i.e., the design’s functionality and “does the application do what the users need?”

Reliability

Reliability is both the consistency and repeatability of the application. A large part of testing an application involves validating its reliability in its functions, data, and system availability. To ensure reliability, the test approach will include positive and negative (break-it) functional tests. In addition, to ensure reliability throughout the iterative software development cycle, regression tests will be performed on all iterations of the application.
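As a sketch of the positive/negative split described above, the example below exercises a hypothetical input validator both ways; the validator and its password rules are invented purely for illustration:

```python
# Positive and negative ("break-it") functional tests against a simple
# input validator. The validator and its rules are hypothetical.

def validate_password(pw):
    """Accept string passwords of 8-64 characters containing a digit."""
    if not isinstance(pw, str):
        return False  # malformed input is rejected cleanly, not crashed on
    return 8 <= len(pw) <= 64 and any(c.isdigit() for c in pw)

# Positive test: a valid input is accepted.
assert validate_password("summer2014pass")
# Negative (break-it) tests: boundary and malformed inputs are rejected.
assert not validate_password("short1")        # below minimum length
assert not validate_password("nodigitshere")  # missing required digit
assert not validate_password(None)            # wrong type entirely
```

The point is the pairing: every function under test gets both the inputs it should accept and the inputs it must survive rejecting.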

Audience

  • Business Team
  • Project Management Team
  • Application Development Team

Role of Home Based Software Testing Team:

The role of Home Based Software Testing Team during Integration Testing includes the following:

  • Creation of this Test Approach/Plan
  • Creation of framework for testing:
    - Identification of all release inclusions
    - Determining the best approach for getting all inclusions tested
    - Defect lifecycle management
    - Preparation and distribution of testing metrics
    - Coordination of QA resources doing the testing
    - Meetings
  • Overall monitoring/coordination of the QA testing effort, regardless of who is actually performing the work during the integration testing phases, excluding User Acceptance Testing (UAT)

  • Home Based Software Testing team works closely with:
    - Project Lead and SME
    - Developers

QA Testing Support Phases

The QA testing support period consists of the following phases:

Start / End / Task
7/01/14 / 7/25/14 / Development – The time period for the development team to complete the Technical Design Documents, develop the new code base, and develop and execute the Unit Test Plan in the Development environment, as well as accomplish the initial code build, deployment and smoke test in the INT test environment.
7/05/14 / 7/28/14 / QA Planning
- QA and the Hotwire SME ensure test data is ready for testing.
- QA attends regularly scheduled cross-training and weekly project meetings.
- QA performs test analysis, planning, and documentation.
- QA reviews this test plan, the test requirements, test case scripts, and test data, and obtains feedback from Business SMEs and AD.
- QA reviews the QC project test artifacts.
7/30/14 / 8/15/14 / QA Test Execution INT – The time frame within which QA executes planned testing, validation, and defect reporting and tracking in the INT environment.
- The Hotwire SME provides ongoing support in application knowledge, regression test cases and test data.
- When the conditions of the Success Criteria are met, QA provides sign-off to migrate to QA.
8/16/14 / 8/21/14 / QA Smoke Test and Test Execution in QA (subject to sign-off in INT by 8/15/14)
- QA executes planned testing, validation and defect reporting and tracking in the QA environment.
- QA executes planned light functional testing for new features and smoke regression tests to verify the build is ready for UAT.
- The SME provides ongoing support in application knowledge.
Note: QA testing and UAT may overlap in the QA environment.

Project Quality Assurance

All project artifacts are posted to the project SharePoint site, located at: {site SharePoint URL}

Fast Track Project Required Documents

Project Artifacts / Complete
Project Proposal in EPM / 
Initiation Phase Checklist
Project WBS
Project Charter
Business Requirements Document
Project Plan Review Checklist
Analysis Phase Checklist
Design Review Checklist
Conceptual IT Architecture Review Checklist
Application Architecture Design
System Architecture Design
Code Review Checklist
Implementation Plan Checklist
QA Test Plan
Test Planning Checklist
Deployment Readiness Assessment Checklist
User Acceptance Sign Off
Service Level Agreement and Checklist
Lessons Learned
Close out Report

Test Methodology

Entrance Criteria

  • All business requirements are documented and approved by the business users.
  • All design specifications have been reviewed and approved.
  • Unit testing has been completed by the development team, including vendors.
  • All hardware needed for the test environment is available.
  • The application delivered to the test environment is of reliable quality.
  • Initial smoke test of the delivered functionality is approved by the testing team.
  • Code changes made to the test site will go through a change control process.

Exit Criteria

  • All test scenarios have been completed successfully.
  • All issues prioritized and priority 1 issues resolved.
  • All outstanding defects are documented in a test summary with a priority and severity status.
  • Go/No-go meeting is held to determine acceptability of product.

Test Execution

The test execution phase is the process of running test cases against the software build to verify that the actual results meet the expected results. Defects discovered during the testing cycle shall be entered into the project SharePoint Team Site Defect list or Quality Center (offered by OIT). Once a defect is fixed by a developer, the fixed code shall be incorporated into the application and regression tested.

These following testing phases shall be completed (if applicable):

Unit Testing

Unit testing is performed by the developers on the Hotwire development team in their development environment. The developers know, and will be testing, the internal logical structure of each software component. A description of the unit testing should be provided to the project team.

Functional Testing

Functional testing focuses on the functional requirements of the software and is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are properly working.

Regression Testing

Regression testing shall be performed to verify that previously tested features and functions do not have any new defects introduced, while correcting other problems or adding and modifying other features.

Integration Testing

Integration testing is the phase of software testing in which individual software modules are combined and tested as a group. In its simplest form, two units that have already been tested are combined into a component and the interface between them is tested. In a realistic scenario, many units are combined into components, which are in turn aggregated into even larger parts of the program. The idea is to test combinations of pieces and eventually expand the process to test your modules with those of other groups. Eventually all the modules making up a process are tested together.
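The "two tested units combined into a component" case can be sketched concretely. The parser and price calculator below are hypothetical units; the integration test exercises the interface between them, i.e. that the parser's output is exactly the structure the calculator expects:

```python
# Integration-test sketch: two already unit-tested components are
# combined, and the interface between them is exercised as a pair.
# Both components are hypothetical stand-ins.

def parse_booking(raw):
    """Unit 1: parse a 'nights x rate' string into a booking record."""
    nights, rate = raw.split("x")
    return {"nights": int(nights), "rate": float(rate)}

def total_price(booking):
    """Unit 2: compute the total from a booking record."""
    return booking["nights"] * booking["rate"]

def book_total(raw):
    """The integrated component: parser feeding the calculator."""
    return total_price(parse_booking(raw))

# Interface contract under test: parse_booking's output keys must match
# the keys total_price reads. Each unit can pass its own tests and this
# can still fail if the key names drift apart.
assert book_total("3x100.0") == 300.0
```

In the realistic scenario the plan describes, such pairs are aggregated into ever-larger groups until all modules of a process are tested together.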

Interface Testing

This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing shall be performed to ensure that the product actually works in the way a typical user would interact with it.

Destructive Testing

Destructive testing focuses on the error detection and error prevention areas of the product. This testing is exercised in an attempt to anticipate conditions where a user may encounter errors. Destructive testing is less structured than other testing phases and is determined by individual testers.

User acceptance testing

User acceptance testing activities will be performed by the business users. The purpose of this testing will be to ensure the application meets the users' expectations. This testing also focuses on usability and will include: appearance, consistency of controls, consistency of field naming, accuracy of drop-down field information lists, spelling of all field names/data values, accuracy of default field values, tab sequence, and error/help messaging.

Browser Testing

Functional and regression testing as defined in this document are executed using the Hotwire standard browser:

INT Test Environment Browsers Under Test

Hotwire Standard Browser / All versions of Internet Explorer (IE)

Other supported Browsers:

Hotwire Supported Browsers / All versions of Mozilla Firefox / All versions of Safari

Test Artifacts:

Some new functionality will be added to the website. The new functionality and some existing functionalities will be tested, and a regression test will be performed after every functionality addition to check the functional integrity of the web application.

Below are the high-level scenarios that will be tested. These scenarios are derived from the Requirements Matrix and Use Cases. From these, detailed test scripts will be created.

Validate the following new functionalities:

Validate that two new menu options called Current Offers and Member Options have been added in the menu bar.

Validate that the Facebook Like option has been added near the footer of the website.

Validate that the Twitter (Tweet) option is also included near the footer of the website.

Validate the following existing functionalities:

In the home page, on top of the menu bar in the top right corner of the screen, there will be two lines of text and links.

Validate that the text Welcome appears with a Sign in/Register link, and that the sign-in and register options work accordingly.

Validate that underneath the above link there are two more links, 'My account' and 'New to Hotwire', and that those links are working.

Validate that the Deals menu in the menu bar is working accordingly.

Validate that the Activities menu in the menu bar is working accordingly.

Validate that the Cruises menu in the menu bar is working properly.

Validate that the Packages menu in the menu bar is working properly.

Validate that the Flights menu in the menu bar is working properly.

Validate that the Cars menu in the menu bar is working properly.

Validate that the Hotels menu in the menu bar is working properly.

In the home page, underneath the menu bar there is a label called Hotwire Hot Rate Locator. Underneath that there are four option tabs and a Bundle + Save label.

Validate that Hotel reservation option and related features are working properly.

Validate that Car reservation option and related features are working properly.

Validate that Flight reservation option and related features are working properly.

Validate that Cruises reservation option and related features are working properly.

Underneath the Bundle + Save label, there are 4 options.

Validate that the Flight + Hotel option is working according to the system specifications.

Validate that when Flight + Hotel + Car is clicked, the appropriate options are displayed and they are working.

Validate that the Flight + Car option is working according to the system specifications.

Validate that when the Hotel + Car option is clicked, the appropriate fields are displayed and they are working according to the system specifications.

In each case, validate that all of the options and sub-pages open in the same window.
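The menu-validation scenarios above lend themselves to a data-driven script: each high-level scenario becomes one row that a detailed test script executes. The URL paths and the `fetch` stub below are illustrative assumptions standing in for real browser navigation:

```python
# Data-driven sketch of the menu scenarios: one (menu, path) row per
# scenario. Paths are assumed for illustration, not taken from Hotwire.

MENU_SCENARIOS = [
    ("Hotels", "/hotels"),
    ("Cars", "/cars"),
    ("Flights", "/flights"),
    ("Cruises", "/cruise"),
]

def fetch(path):
    """Stand-in for a real HTTP request or browser navigation."""
    known = {"/hotels", "/cars", "/flights", "/cruise", "/deals"}
    return 200 if path in known else 404

def run_menu_scenarios(scenarios):
    """Return the list of menus whose page did not load successfully."""
    return [menu for menu, path in scenarios if fetch(path) != 200]

failures = run_menu_scenarios(MENU_SCENARIOS)  # empty list: all menus pass
```

Keeping the scenarios as data means adding a new menu to the regression suite is a one-line change rather than a new script.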

Test Script Development

Test script design is the central focus of a software quality assurance process. A test script is defined as a written specification describing how a single or group of business or system requirement(s) will be tested. The test script consists of a set of actions to be performed, data to be used, and the expected results of the test. The actual results of the test are recorded during test execution. Test scripts will also be updated as testing proceeds.

Test Scripts written for this project include the following:

  • Test Script ID
  • Test Cases verified
  • Requirements verified
  • Purpose of test
  • Any dependencies and/or special set-up instructions required for performing the test
  • Test description and steps
  • Expected results
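The fields listed above can be captured in a reusable record so every script carries the same structure. The field names mirror the list; the example values are placeholders invented for illustration:

```python
# A test-script record matching the fields this plan requires.
# Actual results start empty and are filled in during test execution.

from dataclasses import dataclass, field

@dataclass
class TestScript:
    script_id: str            # Test Script ID
    test_cases: list          # Test Cases verified
    requirements: list        # Requirements verified
    purpose: str              # Purpose of test
    setup: str                # Dependencies / special set-up instructions
    steps: list               # Test description and steps
    expected_results: list    # Expected results
    actual_results: list = field(default_factory=list)  # recorded later

# Placeholder example (all identifiers and values are hypothetical):
script = TestScript(
    script_id="TS-001",
    test_cases=["TC-101"],
    requirements=["REQ-12"],
    purpose="Validate the Sign in/Register links on the home page",
    setup="Browser open at the home page; no user signed in",
    steps=["Click Sign in/Register", "Enter valid credentials", "Submit"],
    expected_results=["User is signed in and the Welcome text appears"],
)
```

Leaving `actual_results` empty until execution mirrors the plan's note that actual results are recorded during test execution and scripts are updated as testing proceeds.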

Defect Reporting

Issues/defects are tracked for resolution with the following guidelines:

  • Issues will be reported based upon documented requirements.
  • Issues will be tracked by the testing team, reported and entered into Quality Center.
  • Issues will be fixed by the development team based on the priority/severity assigned by the test lead.
  • All critical/priority 1 defects will be fixed before release to production.

See the Defect Tracking Process at the end of this document for detailed instructions on how to log and track defects in Quality Center.
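The release rule in the guidelines (critical/priority-1 defects must be fixed before production) can be sketched as a triage check. The record fields and the numeric priority scale below are assumptions for illustration, not Quality Center's schema:

```python
# Triage sketch: decide which logged defects block the production release.

def blocks_release(defect):
    """Critical or priority-1 defects must be fixed before release."""
    return defect["priority"] == 1 or defect["severity"] == "critical"

# Hypothetical defect log entries:
defects = [
    {"id": "D-1", "priority": 1, "severity": "high"},
    {"id": "D-2", "priority": 3, "severity": "low"},
]

blocking = [d["id"] for d in defects if blocks_release(d)]
assert blocking == ["D-1"]  # only the priority-1 defect gates the release
```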

Test Environment

Requirements

Client Server Technical Requirements:

  • Mixed browsers supported (Internet Explorer, Mozilla Firefox, Safari)
  • Oracle Database
  • Client Platform: PC and Macintosh
  • Production server location:

Testing Platform

  • Desktop PC – the application supports all A-Grade browsers for Windows and Mac operating systems, as defined by Yahoo!’s Graded Browser Support standards. Windows 2000/IE6 may be excluded.
  • Test server location:

User Acceptance Test Plan

Definition

The overall purpose of testing is to ensure the application performs at an acceptable level for the customer. This section outlines the detailed plan for user acceptance testing of this application.

This test plan will be used to record the customer’s sign off of the documented scenarios. Detailed test scripts/cases have been developed and will be used to record the results of user testing. This document is a high level guide, and is not intended as a replacement for any specific user acceptance testing procedures that individual areas might have.