Testing and Code Coverage Document
Team 1: Zach Broderick, Daniel Sacco, Soe San Win
CS509 – HW2 Deliverable – March 24, 2010
Introduction
This design document describes the testing and code coverage evaluation strategies deployed by Team 1: Zachary Broderick, Daniel Sacco, and Soe San Win for the CS509 class at Worcester Polytechnic Institute.
Following the testing strategies discussed in class, our team decided to perform unit testing and integration testing using the JUnit testing utility available in the Eclipse IDE. We define unit testing as testing the behavior of every class we implemented for WordSteal. To that end, we will write JUnit test cases for all of the methods we implemented in each class. Our integration testing, on the other hand, is two-fold. First, we will identify the functional subsystems and subject them to rigorous subsystem testing. Once the subsystems meet their functional standards, we will assemble the whole system for the final integration testing.
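To illustrate the shape of such a unit test, the sketch below emulates a JUnit-style test case for a hypothetical Tile entity. Both Tile and the method names are invented for this example (the project's actual classes may differ), and the @Test method is driven from a plain main() so the sketch stays self-contained; in the real project it would be a JUnit test method run from Eclipse.

```java
// Hedged sketch of a WordSteal-style unit test. Tile and its
// letter/points accessors are illustrative stand-ins, not the
// project's actual API.
public class TileTest {
    static class Tile {
        private final char letter;
        private final int points;
        Tile(char letter, int points) { this.letter = letter; this.points = points; }
        char getLetter() { return letter; }
        int getPoints() { return points; }
    }

    // In JUnit 4 this would be: @Test public void letterAndPointsAreStored()
    static void letterAndPointsAreStored() {
        Tile t = new Tile('Q', 10);
        if (t.getLetter() != 'Q' || t.getPoints() != 10)
            throw new AssertionError("Tile did not store its state");
    }

    public static void main(String[] args) {
        letterAndPointsAreStored();
        System.out.println("all tests passed");
    }
}
```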
Apart from testing the software for functional correctness, we will also run nightly code coverage tests using the EclEmma plug-in available for the Eclipse IDE. Code coverage testing is important because it helps us evaluate how well our implementation covers the functional and non-functional requirements defined in Homework-1. We will record the coverage reported by the nightly tests and plot it against a daily time axis to show the incremental evolution of our application and its tests.
The rest of the document is organized as follows:
- The Testing section describes the steps we took toward unit testing each of the entity, boundary, controller, mainframe, and utility classes we implemented, as well as their current status.
- The Code Coverage section describes the status of our code coverage evaluation tests.
- The Conclusions section sums up the important information described throughout the document.
Testing
This section describes the unit and integration testing methods we deployed. We organized the project files by creating the “src” and “Test” folders in the top-level WordSteal folder, containing the packages of the implemented classes and of the test cases respectively, as shown in the following figure.
With this layout, as discussed in class, the test cases can see the source files of the project, and we can evaluate the code coverage of the source files and of the test cases separately.
Unit Testing
Our goal for unit testing is to cover all of the scenarios we can think of via test cases. Writing such tests is straightforward for the entity classes but more complicated for the boundary, mainframe, and controller classes, because those operate in response to events. We cover these event-driven classes by creating artificial, deterministic events from within the JUnit tests.
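As a minimal sketch of that technique, the example below constructs a synthetic ActionEvent and delivers it directly to a listener, with no GUI or user interaction involved. CounterController is invented for this illustration; the project's own event-driven classes would be exercised the same way from a JUnit test.

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;

// Sketch: drive an event-handling class with an artificial,
// deterministic event instead of a real button click.
public class EventSimulationSketch {
    // Hypothetical listener; stands in for a real controller class.
    static class CounterController implements ActionListener {
        int clicks = 0;
        public void actionPerformed(ActionEvent e) { clicks++; }
    }

    public static void main(String[] args) {
        CounterController controller = new CounterController();
        // Fire a synthetic event; the outcome is fully deterministic.
        ActionEvent fake = new ActionEvent(new Object(), ActionEvent.ACTION_PERFORMED, "click");
        controller.actionPerformed(fake);
        controller.actionPerformed(fake);
        if (controller.clicks != 2)
            throw new AssertionError("expected 2 clicks, got " + controller.clicks);
        System.out.println("clicks=" + controller.clicks);
    }
}
```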
Boundaries
For boundaries, we have the Graphical User Interface (GUI) classes, namely: AboutFrame, InstructionFrame, WinningFrame, and NewGameFrame as we described in our Homework-1.
We decided to concentrate our tests on the critical application boundary classes first, i.e., the boundary classes required to get a basic, functional application working. We therefore tested NewGameFrame first, because it is required to bring the players’ information into the system. We simulated the event of entering the players’ information by setting the inputs on the NewGameFrame, and then checked that the information was properly stored in the Game entity object it created.
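The sketch below shows the shape of such a test. The setter and factory names (setPlayerName, createGame, getPlayers) are assumptions made for this example, and both classes are minimal stand-ins; the project's actual NewGameFrame and Game signatures may differ.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hedged sketch of the NewGameFrame test: set the inputs, trigger
// game creation, and verify the Game entity received the data.
public class NewGameFrameTestSketch {
    static class Game {
        private final List<String> players;
        Game(List<String> players) { this.players = players; }
        List<String> getPlayers() { return players; }
    }
    static class NewGameFrame {
        private final List<String> names = new ArrayList<>();
        void setPlayerName(String name) { names.add(name); }          // simulates typing a name
        Game createGame() { return new Game(new ArrayList<>(names)); } // simulates pressing "Start"
    }

    public static void main(String[] args) {
        NewGameFrame frame = new NewGameFrame();
        frame.setPlayerName("Alice");
        frame.setPlayerName("Bob");
        Game game = frame.createGame();
        if (!game.getPlayers().equals(Arrays.asList("Alice", "Bob")))
            throw new AssertionError("player names were not stored in the Game entity");
        System.out.println("players=" + game.getPlayers());
    }
}
```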
We will test and evaluate the information-only boundary classes via JUnit after most of our functionally essential classes have been implemented and tested. The following figure shows the EclEmma coverage of the source code for NewGameFrame.
MainFrame
Similar to the boundaries, the mainframe is also made up of event-driven GUI classes. However, since most of the classes under the mainframe are used for drawing the game’s main window, we can test them by simulating events for the stub decorator, as described in our design case study document, and then evaluate the Drawer classes used to do the drawing, namely: DrawBoard, DrawGrabbedTile, DrawLayerBase, DrawRack, DrawTilesOnBoard, and DrawTilesOnRack. The remaining classes, BoardRackPanel and MainFrame, are tested on their own. To unit test the mainframe, we developed JUnit test cases for StubDecorator, MainFrame, and BoardRackPanel.
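One common way to unit test drawing code of this kind, sketched below, is to paint into an off-screen BufferedImage and inspect the pixels, so no window ever has to be shown. DrawSquare is invented for this illustration; the project's Drawer classes could be exercised with the same pattern.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Sketch: test a drawing class against an off-screen image.
public class DrawerTestSketch {
    // Hypothetical drawer; stands in for classes like DrawBoard.
    static class DrawSquare {
        void draw(Graphics2D g) {
            g.setColor(Color.RED);
            g.fillRect(10, 10, 20, 20);
        }
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(50, 50, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        new DrawSquare().draw(g);
        g.dispose();
        // A pixel inside the square should be red; one outside should not.
        if (img.getRGB(15, 15) != Color.RED.getRGB())
            throw new AssertionError("square was not drawn");
        if (img.getRGB(45, 45) == Color.RED.getRGB())
            throw new AssertionError("paint leaked outside the square");
        System.out.println("drawer test passed");
    }
}
```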
The code coverage evaluation for the mainframe is shown in the following figure:
Entities
Entities are the simplest classes to test; however, there are many test cases to cover because of the number of entities we defined. To test the entities, we tried to cover all of the behavior of the methods implemented in the entity classes, i.e., to verify that the information stored in the entities can be manipulated as expected in the situations we came up with.
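A minimal sketch of such a state-manipulation test is shown below. Rack and its methods are invented stand-ins for the project's entity classes; the point is the pattern of mutating the entity and checking every expected outcome, including the failure case.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of an entity test: manipulate state, verify results.
public class RackTestSketch {
    // Hypothetical entity holding a player's tiles.
    static class Rack {
        private final List<Character> tiles = new ArrayList<>();
        void addTile(char c) { tiles.add(c); }
        boolean removeTile(char c) { return tiles.remove(Character.valueOf(c)); }
        int size() { return tiles.size(); }
    }

    public static void main(String[] args) {
        Rack rack = new Rack();
        rack.addTile('W');
        rack.addTile('O');
        if (rack.size() != 2) throw new AssertionError("both tiles should be on the rack");
        if (!rack.removeTile('W')) throw new AssertionError("tile should be removable");
        if (rack.removeTile('Z')) throw new AssertionError("missing tile must not be removed");
        System.out.println("rack size=" + rack.size());
    }
}
```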
The test and evaluation of the entities is shown in the following figure. The code coverage for the entities is low compared to the other classes we have described so far. This is because of the entity classes that, as described in our task list, will be implemented only after we have met the minimum requirements for Homework-2; those classes are Dictionary, UndoManager, GameTimer, and GameLog. As of right now, some of the entity test cases fail because only stubs of those classes are in place and their implementations have not been completed yet.
Controllers
Controllers are similar to the boundaries and the mainframe in that they are all event-driven classes. We therefore simulated predetermined events and checked that the outcomes of those events matched our expectations.
Following the same strategy as for the boundaries, we tested the most essential controller class, NewGameController, before AccessibilityController and MoveTileController. The following figure shows the code coverage evaluation of the controllers.
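The controller pattern can be sketched as follows: invoke the handler directly with a deterministic, simulated event and then verify the resulting model state. The MoveTileController handler and Board shown here are invented for this example; the real controllers would be driven the same way from JUnit.

```java
// Hedged sketch of a controller test: simulate the event, check the model.
public class ControllerTestSketch {
    // Hypothetical minimal board model.
    static class Board {
        char[] cells = new char[5];
    }
    // Hypothetical controller; onDrop stands in for a real event handler.
    static class MoveTileController {
        void onDrop(Board board, int cell, char tile) { board.cells[cell] = tile; }
    }

    public static void main(String[] args) {
        Board board = new Board();
        new MoveTileController().onDrop(board, 2, 'S');   // simulated drop event
        if (board.cells[2] != 'S')
            throw new AssertionError("tile was not placed on the board");
        System.out.println("cell 2 = " + board.cells[2]);
    }
}
```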
Utilities
As for the utility classes that we used to build our other classes, their coverage can be seen in the following figure.
Integration Testing
Since our essential test cases and code evaluation are going well, we are ready to move on to integration testing of the current system while implementing the remaining classes described earlier.
Test Cases Check-in Policy
To avoid the pitfalls of conflicting code in the repository, we decided to check in the initial test case code only after a team meeting to resolve any issues that might have arisen. Each of us worked on his own copy of the project, came to the meeting with his changes and updates, and checked the code into the repository only after all of the teammates agreed on its quality. For future integrations we will hold team meetings whenever we believe they are necessary; at these meetings we will review and discuss each other’s code and then check it in, again to avoid code conflicts.
Currently, Zach is the main person reviewing the code, since additional features in his mainframe implementation could require changes to other classes throughout the system. He notifies Soe San and Dan about the changes and makes sure that our test cases match the standards he implemented.
Overall Code Coverage Evaluation
Currently, our system has met the test case requirement described in the Homework-2 assignment, since we can launch the unit test cases for the whole system from the test folder: right-click the Test folder in the Package Explorer -> Coverage As -> JUnit Test Case.
The final EclEmma code coverage evaluation of the latest system can be seen as follows:
Conclusions
In conclusion, we have stated our unit and integration testing strategies, our code coverage evaluation steps, and the latest status of our system. Parallel implementation, testing, and evaluation have saved us from several traps and pitfalls of team development, and we will continue to practice the methodologies described in this document throughout the project.