Feasibility Evidence Description (FED)
A. Description of the FED
Sections A1-A3 of this Feasibility Evidence Description (FED) section of the LeanICM guidelines are intended to be read by CS577 students, to help them understand what the FED is about and how to write their projects’ FEDs; they are not intended to be included in the submitted FED.
1. Purpose
The purpose of the Feasibility Evidence Description (FED) is to show that the other artifacts (OCD, SSRD, SSAD, and LCP) are consistent and complete, in the sense that:
· Satisfaction of all the requirements specified in the SSRD will result in satisfaction of the Win-Win agreements made by the critical stakeholders, including the organizational goals (benefits, ROI, etc.) desired by the customer and the client.
· An implementation of the system whose architecture / design is specified in the SSAD will satisfy all product-related requirements specified in the SSRD and the Prototypes.
· Execution of the project described in the LCP will:
o Result in the production of the system whose architecture / design is specified in the SSAD.
o Satisfy all project-related requirements specified in the SSRD, including the requirement of completion within schedule and within budget
That is, the purpose of the FED is to ensure that the system developers have not just created a number of system definition elements, but have also demonstrated that these elements are complete and consistent enough to ensure, with high probability, the feasibility of accomplishing the desired goals, both product-wise and project-wise.
That such a document is critical to project success is motivated by the fact that:
· In industrial and governmental software development projects, and of course in CS577 projects, different development team members write the operational concept, requirements, architecture / design, and project management artifacts. (Note that “Operational Concept Description,” “System and Software Requirements Description,” “System and Software Architecture Description,” and “Life Cycle Plan” are the ICM terms for these artifacts.)
· There is often no team member who reads the two or more artifacts needed to check any one of the four points listed above, and rarely any team member who reviews the complete set of artifacts.
· As a result, project success can be severely jeopardized. (Constructing the FED should alert its writers to inconsistencies within and among the project artifacts, and to various types of incompleteness. The FED’s authors should then urge the authors of the affected artifacts to edit them to remove the identified problems, and should follow up on that editing to ensure that the feasibility arguments remain correct with respect to the actual contents of the other artifacts.)
2. FED Life Cycle Process
The Expected Benefits (Section 2.2) and Benefits Chain (Section 2.3) sections of the Operational Concept Description (OCD) should provide starting points for estimating the required client investments, the added value or cost savings resulting from the product’s use, and the resulting business-case return on investment (ROI).
The FCR version of the FED will usually be incomplete, as key decisions usually remain to be made during the Foundations phase. Any shortfalls in the evidence of feasibility with respect to the criteria above should be treated as risks, documented in FED section 5, and reflected in the Top-N risk resolution status summary in Life Cycle Plan (LCP) section 4.1.4. Shortfalls in subsequent versions of the FED should follow the same procedure.
3. Completion Criteria
3.1 Valuation Commitment Review (VCR)
· Risks and their mitigations, gathered from the first few meetings with the client and from the OCD’s Expected Benefits and Benefits Chain sections
3.2 Foundations Commitment Review (FCR)
· Assurance of consistency among the system definition elements above for at least one feasible architecture (this should be done via appropriate combinations of analysis, measurement, prototyping, simulation, modeling, reference checking, etc.)
· Assurance of a viable business case analysis for the system as defined
· Requirements traceability and architecture feasibility
· Assurance that all major risks are either resolved or covered by a risk management plan
3.3 Development Commitment Review (DCR)
· Assurance of consistency among the system definition elements above for the architecture specified in the SSAD
· Assurance of a viable business case analysis for the system as defined
· Assurance that all major risks are either resolved or covered by a risk management plan
· Elaboration of Requirements traceability and architecture feasibility
3.4 Initial Operational Capability (IOC)
· Feasibility rationale for future increments beyond IOC
· Validation of business case and Benefits Chain (OCD 2.3) assumptions
B. Sections of the FED Document
The (sub-)sections listed below describe the base format for the FED. For readability by faculty, TAs, graders, and CS577b students, every submitted FED document (Early FED, FCR FED, DCR FED, and IOC FED) should contain each of the indicated sections, with the titles and section numbering unchanged. (See the General Guidelines for exceptions.) The text under each (sub-)heading describes what should be in the corresponding (sub-)section. (The texts in these (sub-)sections of the present document are to be read by CS577 students, to help them prepare the contents of the corresponding (sub-)sections of their FED documents, and are not intended to be included in the submitted FEDs.)
1. Introduction
1.1 Purpose of the FED Document
Summarize any significant difference between the content of the FED and the Win-Win negotiated Agreements. Identify any major FED issues that have not yet been resolved.
1.2 Status of the FED Document
Summarize the major changes to the FED document compared to the previous version, or the major changes that might affect the project’s feasibility.
2. Business Case Analysis
In this section, analyze the software system’s return on investment (ROI), where ROI = (Benefits - Costs) / Costs, using supporting data from the subsections below.
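A minimal worked sketch of the ROI formula (in Python, for the FED authors’ own checking; it is not required FED content) is shown below. The example figures are the 2008 cumulative values from Table 24, with both benefits and costs measured in staff-hours.

def roi(cumulative_benefits, cumulative_costs):
    # ROI = (Benefits - Costs) / Costs
    return (cumulative_benefits - cumulative_costs) / cumulative_costs

# Illustrative figures from Table 24's 2008 row (hours, not dollars).
print(round(roi(761.5, 401), 2))  # -> 0.9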
2.1 Cost Analysis
· Costs include actual client costs for system development, transition, operations, and maintenance. Project team costs are zero, but participation by non-developer stakeholders does incur costs (salary, overhead, etc.). Transition costs can include equipment purchase, facilities preparation, COTS licenses, training, conversion, and data preparation costs. Operation costs can include COTS licenses, supplies, system administration, and database administration costs. Maintenance costs can include hardware and software maintenance. The COCOMO II maintenance estimator can be helpful in producing the cost analysis.
· Costs incurred should include one-time and recurring costs of personnel, hardware, software, etc.
2.1.1 Personnel Costs
· Personnel costs should be estimated in terms of effort. One-time effort includes development and transition effort by clients, users, etc., while recurring effort includes operational and maintenance effort.
· Even when a project has zero budget and no hardware or software purchases, this does not mean that there is no cost: costs can still occur as effort or time spent on the project. Table 19 shows an example of personnel costs in terms of hours spent on the project (a worked tally of these hours is sketched after the table). Hours spent by the student development team are not counted.
Table 19: Example Personnel Costs of Volunteer Tracking System
Activities / Time Spent (Hours)
Development Period (24 weeks)
Valuation and Foundation Phases: Time Invested (CS577a, 12 weeks)
Client: Meeting via email, phone, and other channels [3 hrs/week * 12 weeks] / 36
Client Representatives: Meeting via email, phone, and other channels [2 hrs/week * 12 weeks] / 24
Architecture Review Board [1.5 hrs * 2 times * 2 people] / 6
Development and Operation Phases: Time Invested (CS577b, 12 weeks)
Client: Meeting via email, phone, and other channels [2 hrs/week * 12 weeks] / 24
Maintainer: Meeting via email, phone, and other channels [8 hrs/week * 12 weeks] / 96
Architecture Review Board and Core Capability Drive-through session [1.5 hrs * 3 times * 2 people] / 9
Deployment of system in operation phase and training
- Installation & Deployment [5 hrs * 3 times * 2 people]
- Training & Support [5 hrs * 2 times * 2 people] / 50
Total / 245
Maintenance Period (1 year)
Maintenance [3 hr/week * 52 weeks] / 156
Total / 156
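· The development-period and maintenance totals in Table 19 can be reproduced directly from the bracketed formulas. The Python sketch below is a minimal illustration using the example figures above; it is not a required part of the FED.

# Tally of the example personnel hours in Table 19 (illustrative values only).
one_time_hours = [          # development and transition effort
    ("Client meetings, CS577a", 3 * 12),
    ("Client representative meetings, CS577a", 2 * 12),
    ("Architecture Review Board, CS577a", 1.5 * 2 * 2),
    ("Client meetings, CS577b", 2 * 12),
    ("Maintainer meetings, CS577b", 8 * 12),
    ("ARB and Core Capability Drive-through, CS577b", 1.5 * 3 * 2),
    ("Installation, deployment, training, and support", 5 * 3 * 2 + 5 * 2 * 2),
]
recurring_hours = [("Maintenance, year 1", 3 * 52)]  # operations and maintenance effort

print(sum(hours for _, hours in one_time_hours))   # -> 245.0 (development period)
print(sum(hours for _, hours in recurring_hours))  # -> 156 (maintenance period, 1 year)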
2.1.2 Hardware and Software Costs
· Table 20 and Table 21 show examples of costs incurred in the Volunteer Tracking System project: Table 20 shows the hardware and software costs required during the development period, while Table 21 shows the costs required after transition.
Table 20: Hardware and Software Costs – Development
Type / Cost / Rationale
Hardware – Web Server / $1500 / A new machine is needed to act as a web server for the system.
Hardware – Web Hosting / $200 / Although the CSC already has its own host, this new system requires an additional annual hosting fee; this amount covers the period from fall 2006 through the end of spring 2007.
Software – Adobe Dreamweaver CS3 / $399 / Used in developing the system and the team website.
Table 21: Hardware and Software Costs – Production
Type / Cost / Rationale
Hardware – Web Server / $0 / Since the development machine can be used as a production machine, no cost is required here.
Hardware – Web Hosting / $200 / Although the CSC already has its own host, this new system still requires an additional annual hosting fee.
2.2 Benefit Analysis
· Where possible, benefits should be expressed in financial terms comparable to the costs, such as increased sales and profits or reduced operating costs.
· Non-financial benefits and costs should also be included.
· The value added may also describe non-monetary improvements (e.g. quality, response time, etc.), which can be critical in customer support and satisfaction.
· Table 22 shows an example of benefits expressed as hours saved by using the developed application (the derivation of the Time Saved column is sketched after the table).
Table 22: Benefits of California Science Center System
Current activities & resources used / % Reduction / Time Saved (Hours/Year)
Application data entry (Volunteer coordinator) [50 applications * 10 mins = 500 mins] / 90% / 7.5
Time sheet data entry (Volunteer coordinator) [5 hrs * 52 weeks] / 90% / 234
Job request (Supervisors, 7 departments) [7 * 1 hr * 52 weeks] / 50% / 182
Job request (Volunteer coordinator) [1 hr * 52 weeks] / 50% / 26
Job assignment (Volunteer coordinator) [10 hrs * 52 weeks] / 60% / 312
Total / 761.5
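· The Time Saved column in Table 22 is simply the current annual effort multiplied by the expected percentage reduction. A minimal Python sketch, using the illustrative figures above, is:

# Deriving Table 22's "Time Saved" column (illustrative figures only).
activities = [  # (activity, current annual hours, expected reduction)
    ("Application data entry (volunteer coordinator)", 50 * 10 / 60, 0.90),
    ("Time sheet data entry (volunteer coordinator)", 5 * 52, 0.90),
    ("Job requests (supervisors, 7 departments)", 7 * 1 * 52, 0.50),
    ("Job requests (volunteer coordinator)", 1 * 52, 0.50),
    ("Job assignment (volunteer coordinator)", 10 * 52, 0.60),
]
time_saved = {name: hours * reduction for name, hours, reduction in activities}
print(round(sum(time_saved.values()), 1))  # -> 761.5 hours/year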
2.3 ROI Analysis
· Include a Return-On-Investment (ROI) analysis. Table 24 shows an example ROI analysis for the Volunteer Tracking System.
· Figure 10 plots the results of the ROI analysis graphically (a computation sketch follows the figure).
Table 24: Example ROI Analysis of Volunteer Tracking System
Year / Cost / Benefit (Effort Saved) / Cumulative Cost / Cumulative Benefit / ROI
2007 / 245 / 0 / 245 / 0 / -1.00
2008 / 156 / 761.5 / 401 / 761.5 / 0.90
2009 / 171 / 761.5 / 572 / 1,523 / 1.66
2010 / 188 / 761.5 / 760 / 2,284.5 / 2.01
Figure 10: Example ROI Graph of Volunteer Tracking System
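· The cumulative columns and the ROI column of Table 24 follow mechanically from the yearly cost and benefit figures. A minimal Python sketch (using the example values above, with effort in hours) is:

# Reproducing Table 24 from the yearly cost and benefit figures (illustrative values).
yearly = [  # (year, cost in hours, benefit in hours saved)
    (2007, 245, 0),
    (2008, 156, 761.5),
    (2009, 171, 761.5),
    (2010, 188, 761.5),
]
cumulative_cost = cumulative_benefit = 0.0
for year, cost, benefit in yearly:
    cumulative_cost += cost
    cumulative_benefit += benefit
    roi = (cumulative_benefit - cumulative_cost) / cumulative_cost
    print(year, cumulative_cost, cumulative_benefit, round(roi, 2))
# The computed ROI values match Table 24: -1.0, 0.9, 1.66, and 2.01 for 2007-2010.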
3. Architecture Feasibility
· In this section, provide arguments to justify the feasibility of achieving the desired system architecture.
3.1 Level of Service Feasibility
· Provide arguments to justify the feasibility of satisfying the Level of Service (L.O.S.) requirements specified in the SSRD. You do this by demonstrating -- through analysis, detailed references to prototypes, models, simulations, etc. -- how the designs will satisfy the SSRD L.O.S. requirements (SSRD section 5). Complete coverage of the L.O.S. requirements is essential. If applicable, provide links to files containing detailed evidence of the feasibility of required levels of performance, interoperability, and dependability.
· Note that ambitious Level of Service requirements and their tradeoffs can be the most difficult requirements to satisfy, and the most difficult requirements for which to argue satisfiability in advance of actual implementation. Table 10 summarizes some useful product and process strategies for addressing Level of Service requirements (a hedged benchmarking sketch for gathering performance evidence follows the table). Additional useful tables are in FED section 2.2.5 of the previous set of MBASE (not LeanMBASE) Guidelines (version 2.4.2) at http://cse.usc.edu/classes/cs577a_2004/guidelines/index.html.
Table 10: Level of Service Product and Process Strategies
Attributes / Product Strategies / Process Strategies
Dependability / Accuracy Optimization, Backup/Recovery, Diagnostics, Error-reducing User Input/Output, Fault-tolerance Functions, Input Acceptability Checking, Integrity Functions, Intrusion Detection & Handling, Layering, Modularity, Monitoring & Control, Redundancy / Failure Modes & Effects Analysis, Fault Tree Analysis, Formal Specification & Verification, Peer Reviews, Penetration, Regression Test, Requirements/Design V&V, Stress Testing, Test Plans & Tools
Interoperability / Generality, Integrity Functions, Interface Specification, Layering, Modularity, Self-containedness / Interface Change Control, Interface Definition Tools, Interoperator Involvement, Specification Verification
Usability / Error-reducing User Input/output, Help/ explanation, Modularity, Navigation, Parameterization, UI Consistency, UI Flexibility, Undo, User-programmability, User-tailoring / Prototyping, Usage Monitoring & Analysis, User Engineering, User Interface Tools, User Involvement
Performance / Descoping, Domain Architecture-driven, Optimization (Code/ Algorithm), Platform-feature Exploitation / Benchmarking, Modeling, Performance Analysis, Prototyping, Simulation, Tuning, User Involvement
Adaptability (Evolvability / Portability) / Generality, Input Assertion/type Checking, Layering, Modularity, Parameterization, Self-containedness, Understandability, User-programmability, User-tailorability, Verifiability / Benchmarking, Maintainers & User Involvement, Portability Vector Specification, Prototyping, Requirement Growth Vector Specification & Verification
Development Cost / Schedule / Descoping, Domain Architecture-driven, Modularity, Reuse / Design To Cost/schedule, Early Error Elimination Tools And Techniques, Personnel/Management, Process Automation, Reuse-oriented Processes, User & Customer Involvement
Reusability / Domain Architecture-driven, Portability Functions / Domain Architecting, Reuser Involvement, Reuse Vector Specification & Verification
All of Above / Descoping, Domain Architecture-driven, Reuse (For Attributes Possessed By Reusable Assets) / Analysis, Continuous Process Improvement, Incentivization, Peer Reviews, Personnel/Management Focus, Planning Focus, Requirement/ design V&V, Review Emphases, Tool Focus, Total Quality Management
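· For example, when a performance L.O.S. requirement is addressed with the Benchmarking or Prototyping process strategies from Table 10, the evidence can be as simple as timed measurements of a prototype operation. The Python sketch below is hypothetical: handle_request stands in for whatever operation the SSRD performance requirement constrains, and the 2-second threshold is an assumed example, not a value taken from any project.

# Hedged, hypothetical sketch of gathering performance L.O.S. evidence by benchmarking.
import statistics
import time

def handle_request():
    # Placeholder for the operation under test (e.g., a prototype database query).
    time.sleep(0.05)

samples = []
for _ in range(30):                      # 30 timed trials of the operation
    start = time.perf_counter()
    handle_request()
    samples.append(time.perf_counter() - start)

print("median response time:", round(statistics.median(samples), 3), "s")
print("worst observed:", round(max(samples), 3), "s")
# Record these measurements in (or link them from) this section as evidence
# that the assumed 2-second L.O.S. threshold can be met.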
3.2 Capability Feasibility
· Provide arguments to justify the capability requirements specified in the SSRD. You do this by demonstrating -- through analysis, detailed references to prototypes, models, simulations, etc. -- how the designs will satisfy the SSRD capability requirements (SSRD section 3). Complete coverage of the capability requirements is essential.
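· One lightweight way to demonstrate complete coverage is a traceability check from each SSRD capability requirement to the SSAD design element(s) intended to realize it. The Python sketch below is purely hypothetical; the requirement IDs and component names are placeholders, not taken from any particular project.

# Hypothetical capability-coverage check (placeholder requirement IDs and components).
capability_requirements = ["CR-1", "CR-2", "CR-3"]
traceability = {  # SSRD capability requirement -> SSAD elements realizing it
    "CR-1": ["UC-1: Submit volunteer application", "ApplicationController"],
    "CR-2": ["UC-2: Log volunteer hours", "TimesheetService"],
}

uncovered = [req for req in capability_requirements if not traceability.get(req)]
if uncovered:
    print("Capability requirements lacking design coverage:", uncovered)  # -> ['CR-3']
else:
    print("Every capability requirement traces to at least one design element.")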