UNIVERSIDADE FEDERAL DE MINAS GERAIS

Graduate Program in Computer Science

Course: Software Engineering

Professor: Rodolfo Resende

First Semester of 2011

The Realities of Software Process Improvement

Review Protocol Document

1.  Introduction

Implementing a successful Software Process Improvement (SPI) program is a challenge that many software companies face today. Many companies have invested huge amounts of money in improving their software processes, as confirmed by the several research papers that present the results of SPI programs. However, there is still little evidence of a dramatic change in either the productivity of these companies or the quality of their products [1]. Examining the facts about productivity and quality, we observe a common agreement among researchers about the difficulty of achieving “order of magnitude” improvements. This belief is shared by leading figures in Software Engineering, such as Frederick P. Brooks [2], “There is inherently no silver bullet.”, and Robert L. Glass [3], “most software tool and technique improvements account for about a 5 to 35 percent increase in productivity and quality”, not improvements of powers of 10 as has been claimed. Thus it is important to investigate the real benefits of SPI.

1.1.  Software Process Improvement

Software process improvement is defined as an approach to improving software-intensive organizations, based on software process capability models, through the establishment, evaluation, and improvement of the capability of their critical processes. The purpose is to make these processes more efficient and effective with respect to the strategic goals of the organization. Typically, the strategic goals consist of a selection and composition of factors such as controlling and reducing delays, costs, and other resources for the development, maintenance, and operation of software-intensive systems that meet the users’ requirements with a minimum number of errors.

Process improvement approaches such as IDEAL [4] and ISO/IEC 15504 [5] use a reference process model that explores and represents best practices, define a measurement framework for assessing the capability of processes, and provide a roadmap for process improvement. The two main types of reference models used are CMMI and ISO/IEC 15504.

2.  Research question

The goal of this study is to investigate the SPI results described in the literature and the data that exist to support their value. We follow Kitchenham’s guidelines [6] for Systematic Literature Reviews (SLR).

The research question (RQ) of the study:

What are the realities of SPI programs?

This question can be split into the following more specific research questions:

RQ1: What reasons for SPI are the most important to organizations? (e.g., increase profit, enlarge market share, etc.)

RQ2: Which reference/assessment models did the organizations adopt? (e.g., CMMI, ISO 15504, ad hoc)

RQ3: What are the main aspects measured in SPI initiatives? (e.g., productivity, quality, ROI, market share)

RQ4: What are the lessons learned with the SPI programs?

RQ4.1: What are the main problems? What are the main problems during the different phases of a cycle? What are the main problems during the different cycles? What are the usual solutions?

RQ4.2: How do the lessons learned change with the cycles/unfolding of SPI initiatives?

3.  Search Strategy

In order to perform an exhaustive search for primary studies, we will conduct an online search of the relevant digital libraries listed below.

Online search (publication year: 2010):

·  IEEE Xplore: http://www.ieeexplore.ieee.org/

·  ACM Digital Library: http://portal.acm.org/

·  Science Direct: http://www.sciencedirect.com/

·  SpringerLink: http://www.springerlink.com/

·  Wiley InterScience: http://www3.interscience.wiley.com/cgi-bin/home

The search results can be manually organized using the Mendeley tool (http://www.mendeley.com/).

3.1.  Search String

For the online search in digital libraries, the search keywords are very important for the quality of the retrieved results, so they must be chosen carefully. Based on the PICO [6] criteria, search strings can be derived and adjusted so that they can be executed on different digital libraries. Below we present the search string SS, derived from the research questions, listing the keywords identified for the (P), (I), (C), and (O) structure.

·  (P) Population: software.

·  (I) Intervention: process improvement.

·  (C) Comparison: not applicable.

·  (O) Outcomes: not applicable.

SS: (software) AND (“process improvement”)
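
As an illustration only, the sketch below shows how a PICO-style search string such as SS could be assembled programmatically before being adapted to each library's syntax. The function and variable names are ours, not part of the protocol.

    # Keywords taken from the PICO structure above; names are illustrative.
    population = ["software"]
    intervention = ["process improvement"]

    def build_query(facets):
        """OR the terms inside each facet, AND the facets together,
        quoting multi-word terms so engines treat them as phrases."""
        def quote(term):
            return f'"{term}"' if " " in term else term
        parts = ["(" + " OR ".join(quote(t) for t in terms) + ")"
                 for terms in facets]
        return " AND ".join(parts)

    print(build_query([population, intervention]))
    # -> (software) AND ("process improvement")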

4.  Study selection

The study selection will be conducted in four steps:

·  First step: the goal of this step is to remove duplicate and irrelevant papers. This will be carried out by evaluating only the title of the article (a sketch of the duplicate removal appears after this list).

·  Second step: the goal of this step is to eliminate articles whose abstract is not related to any of the research questions. This step will evaluate the abstract and the keywords.

·  Third step: the goal of this step is to scan the paper in search of relevant information.

·  Fourth step: the goal of this step is to evaluate the paper and collect the required information.
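
As an illustration of the first step, the sketch below (our own, not prescribed by the protocol) removes duplicates by comparing normalized titles, assuming the library exports were merged into simple (title, source) records. All function and field names are ours.

    import re

    def normalize_title(title):
        """Lowercase and strip punctuation so formatting differences
        between libraries do not hide duplicates."""
        return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

    def deduplicate(records):
        seen = set()
        unique = []
        for record in records:
            key = normalize_title(record["title"])
            if key not in seen:
                seen.add(key)
                unique.append(record)
        return unique

    # Hypothetical records exported from two libraries:
    results = [
        {"title": "The Realities of Software Process Improvement.",
         "source": "IEEE Xplore"},
        {"title": "The realities of software process improvement",
         "source": "ACM"},
    ]
    print(deduplicate(results))  # keeps only the first of the two entries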

The inclusion criteria(*) to be considered are:

·  Papers that contain an explicit result of a process improvement initiative in a software development organization.

·  Papers whose focus, or main focus, is an SPI initiative (articles in which the SPI initiative was only one element were included, as well as articles for which the SPI initiative was the main purpose).

The following studies should be excluded(*):

·  Publications/reports for which only an abstract or a PowerPoint slideshow are available.

·  Short papers, editorials, posters, position papers, and introductions to keynotes, workshops, mini-tracks, special issues, or tutorials

·  Studies based only on expert opinion, i.e., mere “lessons learned” reports based on expert opinion (not research papers)

·  Studies presented in languages other than English

·  Studies not related to any of the research questions

·  Studies whose findings are unclear and ambiguous (i.e., results are not supported by any evidence)

·  Studies external to Software Engineering

·  Duplicate reports of the same study. When several papers originating from the same study exist in different journals or conferences, the most complete version of the study will be included in the review.

·  Studies containing several unsupported claims or frequently referring to existing work without providing citations

·  Studies that present tool evaluation, methodology experimentation or process implementation in a company without an SPI focus.

·  Studies that deal with an SPI initiative but report no practical application in a company

·  Studies that focus on the evaluation of an SPI assessment method/standard but do not provide enough description of the improvement initiative (e.g., they report only the assessed CMMI level, x before and y after, without identifying which areas were improved).

·  Papers that do not include an explicit result of a process improvement initiative in a software development organization.

(*) Note that some entries will meet neither the inclusion nor the exclusion criteria; these entries MUST be kept.
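
The selection rule above is thus tri-state: a study is excluded if it matches an exclusion criterion, included if it matches an inclusion criterion, and kept as undetermined otherwise. A minimal sketch of this rule, with hypothetical predicates standing in for the textual criteria, follows.

    def classify(paper, inclusion_checks, exclusion_checks):
        """Return 'excluded', 'included', or 'undetermined' for one paper."""
        if any(check(paper) for check in exclusion_checks):
            return "excluded"
        if any(check(paper) for check in inclusion_checks):
            return "included"
        return "undetermined"  # kept in the set, per the note above

    # Hypothetical predicates; real screening is done by human reviewers:
    inclusion = [lambda p: p.get("explicit_spi_result", False)]
    exclusion = [
        lambda p: p.get("language") != "English",
        lambda p: p.get("type") in {"poster", "editorial", "short paper"},
    ]

    paper = {"language": "English", "type": "full paper",
             "explicit_spi_result": False}
    print(classify(paper, inclusion, exclusion))  # -> undetermined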

5.  Study quality assessment

The following checklist will be used to assess the quality of the primary studies. Its contents and scoring rules were adapted from Zhang et al. [7] and Sulayman & Mendes [8]. For each question, the study’s quality is evaluated as ‘yes’, ‘partial’, or ‘no’, scored 1, 0.5, and 0, respectively.

Table 2. Study Quality Assessment checklist

No. / Question / Detailed questions
1 / Is there a clear statement of the aims of the research? / Is there a rationale/purpose/objective/research question/hypothesis/proposition for why the study was undertaken?
Is there a clear statement of the study’s primary outcome (e.g., reduction of cost or defects, productivity improvement)?
2 / Is there an adequate description of the context in which the research was carried out? / Did the researcher describe the software development organization (e.g., size, nature, structure)?
Did the researcher describe the software process to be improved (e.g., requirements engineering, project management)?
Did the researcher describe the identified problems and proposed solutions?
Did the researcher describe the pilot project in which the SPI initiative was implemented?
3 / Is the purpose of the SPI initiative clear? / Is there a rationale behind the execution of the SPI initiative (e.g., the software development organization would like to receive a certification, reduce costs, improve quality)?
4 / Was the study research design appropriate to address the aims of the research? / Has the researcher justified the research design (e.g., discussed how the methods were decided upon)?
5 / Are threats to validity addressed in a systematic way? / Are the limitations of the study discussed explicitly?
Does the paper explicitly mention possible threats to validity (internal and external)?
Does the report give a realistic and credible impression?
6 / Does the paper explicitly mention the possibility of bias? / Has the researcher critically examined potential selection, reporting, or experimenter bias?
7 / Was there a control group with which to compare treatments? / Is there a control group?
Was it representative of a defined population?
8 / Were the study cases appropriate for the study? / Were the study cases appropriate to assess the SPI implementation results?
9 / Are the data collection procedures sufficient for the purpose (data sources, collection, storage, validation)? / Were all measures well defined (e.g., units and counting rules)?
Are the measurements relevant to address the objective of the study?
Has the researcher justified the methods that were chosen?
Were the data collected in a way that addresses the research question?
Is it clear how the data were collected (e.g., semi-structured interviews, focus group, etc.)?
Are there sufficient data for the analysis?
10 / Are the analysis procedures sufficient for the purpose? / Is the analysis methodology defined, including roles and review procedures (i.e., is there a description of the analysis process)?
Are there sufficient raw data to support the findings?
11 / Are the findings (positive and negative) presented? / Are the results explicit?
Are the findings discussed in relation to the original research questions?
Has the researcher discussed the credibility of the findings (e.g., triangulation, respondent validation, more than one analyst)?
Are the conclusions justified by the results?
12 / Are the results useful to other organizations or researchers? / Are there clear conclusions from the analysis, including recommendations for practice/future research?
Are the proposed questions and corresponding answers reported?
Does the researcher discuss the contribution of this study to the existing literature or current practice?

The quality scores of each paper will be aggregated using the median value.
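
One reading of this scoring and aggregation rule, sketched below with hypothetical answers: each paper’s twelve answers are mapped to 1, 0.5, or 0 and summed, and the per-paper totals are summarized by their median. The names and sample data are ours.

    from statistics import median

    # Map the three possible answers to their scores.
    SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

    def quality_score(answers):
        """Total quality score of one paper across the checklist questions."""
        return sum(SCORES[answer] for answer in answers)

    # Hypothetical answers of three papers to the 12 checklist questions:
    papers = {
        "paper_1": ["yes"] * 8 + ["partial"] * 2 + ["no"] * 2,
        "paper_2": ["yes"] * 5 + ["partial"] * 4 + ["no"] * 3,
        "paper_3": ["yes"] * 10 + ["partial"] + ["no"],
    }

    scores = {name: quality_score(answers) for name, answers in papers.items()}
    print(scores)                   # {'paper_1': 9.0, 'paper_2': 7.0, 'paper_3': 10.5}
    print(median(scores.values()))  # 9.0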

6.  Data collection

Title
Year
Source
Organization / Number of organizations*
Business model*
Organization context*
Size of organizations*
Number of employees*
Geography*
Methodology / Research method*
Data collection*
Data analysis*
Analysis method*
SPI Program / Motivation
Reference model/standard*
SPI Implementation approach/model*
List of problems detected
Qualitative measures*
Quantitative measures*
Main topic area of the SPI program*
SPI project duration*
List of proposed solutions
Main lessons learned
Changes in lessons learned
Success factors - Failed Factor*
Benefits
Notes
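
For illustration, the extraction form could be represented as a record type so that every reviewer fills in the same fields. The sketch below covers a subset of the fields above (the remaining ones follow the same pattern); the structure and names are our choice, not mandated by the protocol.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ExtractionRecord:
        # Identification
        title: str
        year: int
        source: str
        # Organization (subset; the remaining fields are analogous)
        number_of_organizations: Optional[int] = None
        business_model: Optional[str] = None
        # Methodology
        research_method: Optional[str] = None
        # SPI Program
        motivation: Optional[str] = None
        reference_model: Optional[str] = None
        problems_detected: List[str] = field(default_factory=list)
        main_lessons_learned: List[str] = field(default_factory=list)
        notes: str = ""

    # Hypothetical usage:
    record = ExtractionRecord(
        title="An SPI case study", year=2010, source="IEEE Xplore",
        reference_model="CMMI",
    )
    print(record)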

7.  Deadlines

No. / Activity / Deadline / Template Worksheet
1 / First Step – Title Selection / 03/05/2011 / 1a. etapa
2 / Second Step – Abstract / 17/05/2011 / 2a. etapa
3 / Third Step – Scanning / 31/05/2011 / 3a. etapa
4 / Fourth Step – Complete Analysis / 07/06/2011 / 4a. etapa + the complete information of each selected paper (#1, #2, etc.)

8.  References

  1. Glass, R. L. 1999. The realities of software technology payoffs. Commun. ACM 42, 2 (Feb. 1999).
  2. Brooks, F. P. 1987. No Silver Bullet: Essence and Accidents of Software Engineering. Computer 20, 4 (Apr. 1987), 10-19.
  3. Glass, R. L. 2002. Facts and Fallacies of Software Engineering. Addison-Wesley Longman Publishing Co., Inc.
  4. McFeeley, B. 1996. IDEAL: A User's Guide for Software Process Improvement. Handbook CMU/SEI-96-HB-001, 236 pages.
  5. ISO/IEC 15504. International Organization for Standardization / International Electrotechnical Commission. http://www.iso.org/iso/catalogue_detail.htm?csnumber=38932
  6. Kitchenham, B. 2004. Procedures for Performing Systematic Reviews. Keele University, Technical Report TR/SE-0401.
  7. Zhang, H., Kitchenham, B., and Pfahl, D. 2008. Reflections on 10 years of software process simulation modelling: A systematic review. In International Conference on Software Process (ICSP’08), Leipzig, Germany, 2008. Springer.
  8. Sulayman, M. and Mendes, E. 2009. A Systematic Literature Review of Software Process Improvement in Small and Medium Web Companies. In International Conference on Advanced Software Engineering and Its Applications (ASEA 2009), held as part of the Future Generation Information Technology Conference (FGIT 2009), Jeju Island, Korea, December 10-12.

Appendix

Search String

Digital Library / Search String
IEEE Xplore / String: "process improvement" and "software"
Publisher: IEEE + IET
Content types: Conferences and Journals
Publication year: 2010 to 2010
Full text + metadata
Subject: Computing & Processing (Hardware/Software)
ACM / String: all of this text (and): “software” “process improvement”
Published since: 2010; published before: 2010
Wiley InterScience / Search for:
software and “process improvement”
In: All Fields
Product type: all
Between 2010 - 2010
Science Direct / Subject: Computer science
String: “software” and “process improvement”
Dates: 2010 to 2010
Include: Journals and All Books
SpringerLink / Search For (All words) > "software" "process improvement"
Subject > Computer Science
Publication Date > between January 01, 2010 and December 31, 2010
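
For reproducibility, the per-library settings in the table above could also be recorded as data, as in the sketch below; the values are transcribed from the table, while the dict layout and key names are our own.

    QUERY = '(software) AND ("process improvement")'

    # One entry per digital library, mirroring the table above.
    LIBRARY_FILTERS = {
        "IEEE Xplore": {
            "publisher": "IEEE + IET",
            "content_types": "Conferences and Journals",
            "publication_year": (2010, 2010),
            "search_in": "Full text + metadata",
            "subject": "Computing & Processing (Hardware/Software)",
        },
        "ACM": {
            "published_since": 2010,
            "published_before": 2010,
        },
        "Wiley InterScience": {
            "search_in": "All Fields",
            "product_type": "all",
            "publication_year": (2010, 2010),
        },
        "Science Direct": {
            "subject": "Computer science",
            "publication_year": (2010, 2010),
            "include": "Journals and All Books",
        },
        "SpringerLink": {
            "subject": "Computer Science",
            "publication_date": ("January 01, 2010", "December 31, 2010"),
        },
    }

    for library, filters in LIBRARY_FILTERS.items():
        print(library, "->", QUERY, filters)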
