QUALITY CENTER BEST PRACTICES
Mercury Quality Center Best Practices Overview
This section lists the quality areas that are optimized using Mercury Quality Center; the Mercury Quality
Center best practices follow the same outline. Selected topics are detailed in the following section.
Mercury Quality Center best practices are divided into the following groups:
• People: Describes best practices for organizing your testing team, interacting with other teams, and
training staff to manage your Mercury Quality Center effectively.
• Process: Describes guidelines for processes enabled and automated by Mercury Quality Center and
processes required to manage and operate a Mercury Quality Center of Excellence.
• Product: Mercury Quality Center Deployment – Describes best practices for implementing Mercury
Quality Center, including initial implementation and ongoing support and maintenance.
A brief overview of the subprocesses for each process follows.
People:
• The Mercury Quality Center Organization
1. Quality group personnel profiles and skill levels
2. Building the team
3. Roles and responsibilities
4. Mercury Quality Center team education program
Process:
• Test Strategy Creation
· Manual and automated testing approach
• Requirements Management
· Requirements gathering and definition process
· Requirements implementation in Mercury Quality Center
• Test Plan Development
· Risk assessment based on quality goals
· Estimation of work effort
· Organizing the test plan
• Test Design and Execution Strategy, by test type
· Business function/component test
· Business process test
· Business integration test
· System integration test
· Regression test
· User acceptance test (UAT)
• Functional Testing Automation
· Test automation approach
· Automated script design and development
· Script maintenance and overall test asset management
• System, Environment, and Data Management
· Test environment management (including test system)
· Test data management
1. Test data planning
2. Creating data
3. Maintaining data
4. Supplying and scrubbing data
• Defect Management
· Defect lifecycle
· Defect attributes and definitions
· Defect analysis
· Alignment with Problem Management process
• Quality Management and Reporting
· Determining key performance indicators
· KPI tracking and analysis
· Reporting
• Mercury Quality Center of Excellence Processes
· Determining quality goals
· Assessing the impact of Mercury Quality Center on existing testing and development processes
· Planning for change in communication
· Making the transition
· First implementation iteration
· Expanding operation to an entire department
· Expansion to several lines of business (LOBs)
· Elements of a shared services structure
· Work request process and SLAs
· Project management
· Charge-back models
Product:
• Mercury Quality Center Infrastructure and Setup
· Identifying requirements for infrastructure deployment
· Planning the architecture
· Identifying the hardware – sizing considerations
· Identifying the security environment
· Optimal setup
· Setup sequence and time estimates
· Scalability considerations for large deployments
• Mercury Quality Center Integrations
• Mercury Quality Center Administration
· Managing user groups and user group permissions
· Customizing module access
· Project customization
1. Suggested field customizations
2. Field customization models
· Template project definition, implementation, and maintenance
· Project initiation
· Project archiving
• Mercury Quality Center System Maintenance
· Administrator communication process and feedback
· Server management
1. Web and application server
2. Database server
3. File server
· Version control
· Backup administration/maintenance
· Data archiving
· Upgrades and patches
· Maintaining high availability
· Disaster recovery
· Localization
Mercury Quality Center Best Practices Examples
This section contains a small number of excerpts from the best practices used in Mercury Quality
Process Service and Mercury Functional Testing Automation Service. This white paper is not intended to
summarize the complete best practices, so only a few are shown: one example for each of the people,
process, and product areas. The examples in this section include:
• Quality group personnel profiles and skill levels.
• Test data planning and creation.
• Project customization.
Quality Group Personnel Profiles and Skill Levels
The organizational design detailed below is a flexible model that can be adapted to organizations of
various sizes and resources, and will contribute to the successful deployment and operation of Mercury
Quality Center.
Building the Team
To maximize resource utilization and make efficient use of available skill sets, it is imperative to build a
team of people to support the Mercury Quality Center deployment. Involving the right people from the
start will help expedite the planning and investigation stage, and uncover potential implementation
setbacks early in the process.
The matrix below contains an example of the details that are appropriate for a subset of the team. This
includes the individual’s function, required technical skill sets, prerequisite knowledge, and more
concrete responsibilities of the team members within the context of the overall process. A complete list
must contain specifications for all roles on the team. Full details on each of the positions are available
during a deployment of Mercury Quality Center.
Personnel with the appropriate skill sets are selected based on the size and scope of the operation,
and in alignment with the new quality management processes. Successful implementation of Mercury
Quality Center requires educated users. Although formal user training takes only one day, it is
recommended to provide the team with additional on-the-job mentoring for approximately one week.
Subsequently, establish a communication plan that gives the team access to the Mercury Quality Center
Champion for follow-up questions.
Information on other Organizational Design sub-topics, such as when best to expand the team, the
interface between the team and other IT and business groups, and how to transfer knowledge to the
individual development teams, is included as part of a full Mercury Quality Center engagement.
Test Data Planning and Creation
Test data management is an important part of the system, environment, and data management process.
It is recommended to form a team to develop a set of data deliverables to be completed at critical
points in the project lifecycle. These deliverables help ensure that test data is planned and delivered
appropriately. In addition, data review sessions should be held with all key stakeholders to review
deliverables and to provide feedback.
The following section details some of the deliverables and processes that should be implemented to
provide high quality support.
Test Data Planning
Test data planning activities first occur in the software development lifecycle at the completion of
design.
As each project's requirements or change requests are finalized for a release, the Test Data
Management team completes a release-level Test Data Plan. This document is created after Release
Management has packaged the overall release. The Test Data Plan and its milestones need to align
with Release Management's timelines and milestones for each release.
The Test Data Plan should include input from all stakeholders. This allows the Test Data Management
team to coordinate and facilitate necessary data-related activities to aid in a successful release. Further,
the Test Data Plan should focus on the planning and level of effort required of the Test Data
Management team. Testing and development teams should work with the Test Data Management team
to complete the Test Data Plan. The Test Data Plan includes, but is not limited to, information such as:
• Phases of testing to be supported.
• Scope of release.
• Test Data Management level of effort.
• Logistics and milestones by project, test phase, and environment.
• Identification of resources and procedures for test data creation.
• Requirements for data refresh(es) and/or data restoration(s).
Once the Test Data Plan document is created, it should be reviewed by the test execution organization
for approval.
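To make the plan concrete, the sketch below models the bulleted contents above as a simple structured
record. This is a minimal Python illustration; the class and field names are assumptions of this example,
not artifacts of Mercury Quality Center.

```python
# Minimal sketch of a release-level Test Data Plan as a structured record.
# All class and field names are illustrative; they mirror the bulleted
# contents listed above (phases, scope, effort, milestones, resources,
# refresh/restoration requirements).
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Milestone:
    project: str
    test_phase: str       # e.g., "System integration test"
    environment: str      # e.g., "SIT2"
    due: date

@dataclass
class TestDataPlan:
    release: str
    test_phases: list[str]            # phases of testing to be supported
    scope: str                        # scope of the release
    effort_person_days: float         # Test Data Management level of effort
    milestones: list[Milestone] = field(default_factory=list)
    creation_resources: list[str] = field(default_factory=list)
    refresh_required: bool = False    # data refresh(es) and/or restoration(s)

plan = TestDataPlan(
    release="R2",
    test_phases=["Business integration test", "Regression test"],
    scope="Billing and order-entry changes",
    effort_person_days=12.5,
    milestones=[Milestone("Billing", "Regression test", "REG1", date(2024, 6, 1))],
    creation_resources=["Test Data Management team", "Supporting DBAs"],
    refresh_required=True,
)
print(f"{plan.release}: {len(plan.milestones)} milestone(s)")
```

Keeping the plan in a structured form also makes it straightforward to compare plans between releases
and to generate the review document for the test execution organization.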
Creating Data
The Test Data Management team is responsible for providing test data to the testing organization based
on the requirements approved in the test plan. The Test Data Management team provides the test data
prior to the test execution phase.
The most efficient and effective way to provide and manage test data is through formalized data
processes and tools. To ensure success, the Test Data Management team implements numerous test
data preparation processes such as data re-use, data consolidation, data restoration, and data sharing.
The Test Data Management team, with the supporting DBAs and System Administrators, can facilitate
data creation in a number of different ways, depending on the unique requirements of the data
requested.
One technique is to copy data from production or other environments into the target test environment.
This method should be used when data meeting the requirements already exists in other environments.
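As a rough illustration of this technique, the sketch below copies rows from a source database into a
target test database, masking sensitive columns along the way (see "Supplying and scrubbing data" in
the outline above). The table and column names are hypothetical, and in-memory SQLite databases
stand in for real environments.

```python
# Sketch: copy data from a source environment into a target test
# environment, masking sensitive columns during the copy. The schema
# ("customers" with an email and SSN column) is hypothetical.
import sqlite3

def copy_and_scrub(source: sqlite3.Connection, target: sqlite3.Connection) -> int:
    rows = source.execute("SELECT id, name, email, ssn FROM customers").fetchall()
    # Replace personally identifiable values with safe test equivalents.
    scrubbed = [(r[0], r[1], f"user{r[0]}@test.example", "***-**-****") for r in rows]
    target.execute("CREATE TABLE IF NOT EXISTS customers (id, name, email, ssn)")
    target.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)", scrubbed)
    target.commit()
    return len(scrubbed)

# Demonstration with throwaway in-memory databases.
src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id, name, email, ssn)")
src.execute("INSERT INTO customers VALUES (1, 'A. Smith', 'a@corp.example', '123-45-6789')")
src.commit()
print(copy_and_scrub(src, tgt), "row(s) copied and scrubbed")
```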
A second method is manual data entry, although the actual entry may be performed by any of several
organizations based on data input documents provided by the Test Data Management team. This
method should be used when the data does not already exist in other environments and manual entry
is feasible given the volume of data requested.
Other possibilities include using data generation tools such as the Usage Generator, restoring
data stored in the data repository, and modifying existing data in the environment to meet the new
data needs. Although test scripts should be created by the testing organization, the Test Data
Management team provides the test data that serves as input to those scripts.
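The generation tools themselves are not shown here, but the underlying idea can be sketched
generically. The example below produces seeded, reproducible records from simple rules; it is not the
Usage Generator, and every field name is an assumption of this illustration.

```python
# Generic sketch of rule-based test data generation. A fixed seed makes
# runs reproducible, which matters when the same data must back repeated
# regression runs. Field names and value ranges are illustrative only.
import random

def generate_records(count: int, seed: int = 42) -> list[dict]:
    rng = random.Random(seed)
    regions = ["NA", "EMEA", "APAC"]
    return [
        {
            "order_id": 1000 + i,
            "region": rng.choice(regions),
            "amount": round(rng.uniform(10.0, 500.0), 2),
        }
        for i in range(count)
    ]

for record in generate_records(3):
    print(record)
```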
If any modification of existing scenarios is needed prior to the start of testing, the Test Data
Management team would be responsible for facilitating those changes from a data perspective. Any
data modifications required as part of a test become the responsibility of the execution team.
Additional information on the system, environment, and data management processes is included as
part of a full Mercury Quality Center engagement.
Project Customization
Mercury Quality Center provides extensive flexibility in customizing the product for specific projects. It
is important to plan customization carefully. With a greater number of available user fields, and the
ability to add memo fields and create input masks, users can customize their Mercury Quality Center
projects to capture any data required by their testing process.
The Mercury Quality Center Administrator should customize projects to meet the specific needs of the
testing team. This includes adding and customizing fields, and creating categories and lists that reflect
the needs of a specific testing project and suit the project’s unique quality objectives, standards, and
testing approach. The Administrator can modify the behavior of Mercury Quality Center fields by:
• Restricting users to selecting only values from associated lists.
• Making entry in certain fields mandatory.
• Preserving a history of values entered in a specific field.
• Including data unique to your project by creating user fields.
• Associating these fields with Mercury Quality Center and user-defined lists.
The Administrator can also add fields that may be critical to collecting relevant quality metrics. Data
quality increases when drop-down lists and automatic fill-ins are used, since they reduce free-text
entry errors.
Identify the information required for evaluating application readiness and progress of the testing,
development, and other relevant IT processes. Proper customization of Mercury Quality Center helps to
manage multi-application testing efforts.
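To illustrate how such field rules behave, the sketch below validates a record against mandatory-field
and list-restriction rules before submission. It is a generic Python sketch, not the Mercury Quality
Center API; all field names, values, and rules are assumptions of this example.

```python
# Generic sketch of enforcing customized field rules (mandatory fields
# and list-restricted values), analogous to the Administrator settings
# described above. Field names and allowed values are illustrative.
FIELD_RULES = {
    "severity":        {"mandatory": True,  "allowed": ["1-Low", "2-Medium", "3-High"]},
    "detected_by":     {"mandatory": True,  "allowed": None},  # required free text
    "regression_risk": {"mandatory": False, "allowed": ["Yes", "No"]},
}

def validate(record: dict) -> list[str]:
    errors = []
    for name, rule in FIELD_RULES.items():
        value = record.get(name)
        if rule["mandatory"] and not value:
            errors.append(f"'{name}' is mandatory")
        elif value and rule["allowed"] and value not in rule["allowed"]:
            errors.append(f"'{name}' must be one of {rule['allowed']}")
    return errors

print(validate({"severity": "4-Urgent", "detected_by": "jdoe"}))
# -> ["'severity' must be one of ['1-Low', '2-Medium', '3-High']"]
```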
Suggested Field Customizations
The suggested field customization options below relate to the major IT processes supported or
impacted by Mercury Quality Center:
• Requirements Management
Various types of requirements can be differentiated by creating custom fields. These fields can
indicate whether a specific requirement relates to sizing, system, performance, business process
priority, business criticality, and so forth. Considerations such as the cost of a requirement change
can also be expressed with a custom field.
• Change Management
Change requests on requirements can be tracked and managed with custom fields indicating the
current status of the request (new / pending / cancelled). Another custom field can track the number
of design changes requested after the release process.
• Configuration Management
Use custom fields to monitor the number of configuration errors detected for each module in the test
plan tree.
• Application Development
For more complete information regarding costs and resources, it is possible to create custom fields
that estimate the development time for tests and the deviation between expected and actual
development times.
• Quality Assurance
To track weighted defect metrics specific to the testing project, create custom fields for easy
reference; a small scoring sketch follows this list.
• Release Management
Custom fields can be created to track versions before each release or, in some cases, the version
number in which certain defects or enhancements will be implemented.
• Production Management
Tracking response time with custom fields can help to detect performance and availability problems.
• Problem Reporting and Management
Monitor problems that arise after tuning or upgrading by creating custom fields that indicate
the number of problems, their causes, and the cost of fixing. The cost field could be visible to
a select set of project planners and managers or QA analysts.
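As a concrete illustration of the Quality Assurance item above, the sketch below computes a weighted
defect score from a custom severity field. The severity labels and weights are hypothetical; each project
would calibrate its own.

```python
# Minimal sketch of a weighted defect metric built on a custom severity
# field. Weights are hypothetical; a project would choose its own.
SEVERITY_WEIGHTS = {"1-Low": 1, "2-Medium": 3, "3-High": 7, "4-Critical": 15}

def weighted_defect_score(defects: list[dict]) -> int:
    """Sum severity weights over defects that are not yet closed."""
    return sum(
        SEVERITY_WEIGHTS.get(d["severity"], 0)
        for d in defects
        if d["status"] != "Closed"
    )

defects = [
    {"severity": "3-High",   "status": "Open"},
    {"severity": "2-Medium", "status": "Fixed"},   # fixed but not closed
    {"severity": "1-Low",    "status": "Closed"},  # excluded from the score
]
print(weighted_defect_score(defects))  # 7 + 3 = 10
```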
Field Customization Models
Use the defect entity field examples below as a guideline for how to name and define the function of
each option in a customized field.
• Defect Type
Provides a drop-down list of values that can be used for defect analysis:
Configuration – Select this type when the defect is due to a problem in the configuration of the
application, application server, or database server.
Data – Select this type when the defect is data-related, such as incorrect values for a particular region.
Process – Select this type when the business process does not match the system.
System – Select this type when the problem can be identified with an application area.
• Impact
Provides a drop-down list of the following values to track impact: