Adaptable Process Model
Task II.12 Verify and Validate the Deliverable



IMPORTANT NOTICE: The complete Adaptable Process Model (APM) is provided for informational purposes and for assessment by potential users. The APM is copyrighted material and may not be downloaded, copied, or extracted for use in actual project work. The full hypertext (html) version of the APM may be licensed for use and customization within your organization. Contact R.S. Pressman & Associates, Inc. for complete licensing information.


For non-executable deliverables, the intent of this task is to conduct appropriate reviews (see Section 5.2 for details). For executable deliverables, the intent is to design and execute tests that uncover errors. Subtasks for executable deliverables are considered below.

II.12.1 Finalize test plan and strategy.

Intent: The intent of this task is to finalize the plan for unit, integration and validation testing.

Mechanics: The preliminary test plan and procedure, created in Task II.10.6, is revised to reflect the source code implementation of the software. The final testing strategy is then developed.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Does the test plan define an adequate number of milestones that will enable the testing activity to be tracked?

    2. Has time been scheduled for debugging the errors that are uncovered during testing?

    3. Has provision been made for change control as the software is modified during testing?

    4. Are distinct test case design methods to be used to generate test cases as part of the test strategy?

    5. Have provisions been made for recording metrics on errors?

Do's & Don'ts

    Do: Take the project schedule into account. There is no point in defining a testing strategy that calls for the integration of a module that has not yet been designed.

    Don't: Assume that every module that "enters" the test strategy will be perfect. Be sure to define a unit testing approach as well as approaches for integration and validation testing.

Deliverables: Integration Test Plan and Strategy
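
One way to make the checklist items above easier to verify is to keep the finalized plan's milestones, change-control reference, and error-metric provisions in a machine-readable form. The sketch below is illustrative only and is not part of the APM; the project, module, and procedure names are hypothetical placeholders.

    # Illustrative only -- all names and dates are hypothetical placeholders.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class TestMilestone:
        name: str                      # used to track the testing activity (checklist item 1)
        level: str                     # "unit", "integration", or "validation"
        planned_date: date
        modules: List[str] = field(default_factory=list)

    @dataclass
    class TestPlan:
        project: str
        milestones: List[TestMilestone]
        debug_days_reserved: int       # time scheduled for debugging (checklist item 2)
        change_control_procedure: str  # change control reference (checklist item 3)
        error_metrics: List[str]       # metrics recorded per error (checklist item 5)

    plan = TestPlan(
        project="Order Processing Subsystem",
        milestones=[
            TestMilestone("Unit tests complete: parser", "unit",
                          date(2010, 3, 1), ["parser"]),
            TestMilestone("Integration build 1 tested", "integration",
                          date(2010, 3, 15), ["parser", "validator"]),
        ],
        debug_days_reserved=5,
        change_control_procedure="SCM-PROC-04",
        error_metrics=["severity", "module", "phase detected"],
    )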


II.12.2 Design white-box tests for selected program components.

Intent: The intent of this task is to design white-box test cases that will fully exercise each program component.

Mechanics: The preliminary test plan and procedure, created in Task II.10.6, is revised to reflect the source code implementation of the software. New and revised white-box test cases are developed.

Application of Formal Methods: white-box test case design methods

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Have test cases been generated for each program module?

    2. Will the test cases result in path coverage?

    3. Have program loops been explicitly exercised?

    4. Are all conditions exercised on both their true and false sides?

    5. Have regression tests for reusable components been acquired? Have they been scheduled for execution?

Do's & Don'ts

    Do: Recall that white-box tests are best applied to unit level testing.

    Do: Select critical modules for white-box testing if resources are limited. Critical modules are those that are most error prone or those that address multiple requirements.

    Do: Be sure that every statement in each program component is executed at least one time during testing.

    Don't: Rely on black-box tests only. Certain classes of errors will go undetected!

Deliverables: White-box test cases and expected results
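
As an illustration of the path, loop, and condition coverage called for in checklist items 2 through 4 above, the sketch below shows white-box cases for a small hypothetical component, written with pytest (the APM does not prescribe a test framework).

    # Hypothetical component under test: counts values above a threshold.
    def count_above(values, threshold):
        count = 0
        for v in values:           # loop: exercised zero, one, and many times
            if v > threshold:      # condition: exercised on both true and false sides
                count += 1
        return count

    # White-box cases chosen so every statement executes and both branch
    # outcomes and the loop boundaries are covered.
    def test_empty_input_skips_loop_body():
        assert count_above([], 10) == 0

    def test_single_value_true_branch():
        assert count_above([11], 10) == 1

    def test_single_value_false_branch():
        assert count_above([9], 10) == 0

    def test_many_values_mixed_branches():
        assert count_above([5, 10, 15, 20], 10) == 2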

 

II.12.3 Design black-box tests for selected program components and for the integrated system.

Intent: The intent of this task is to design black-box test cases that will fully exercise the input and output domain of the software.

Mechanics: The preliminary test plan and procedure, created in Task II.10.6, is revised to reflect the source code implementation of the software. New and revised black-box test cases are developed.

Application of Formal Methods: black-box test case design methods

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Have test cases been organized by input class?

    2. Have tests been designed to exercise the program at its boundaries?

    3. Have unusual and erroneous input conditions been reflected in test case designs?

    4. Have all expected results been documented?

    5. Have test cases been tracked to requirements to ensure that all requirements have been tested?

Do's & Don'ts

    Do: Recall that black-box tests are best applied throughout testing and are used heavily for regression testing.

    Do: Use test case design methods (e.g., equivalence partitioning) that result in the discovery of classes of errors.

    Don't: Rely on black-box tests only. Certain classes of errors will go undetected. White-box tests should also be conducted.

Deliverables: Black-box test cases and expected results
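
The sketch below illustrates the organization of cases by input class and the exercising of boundaries called for above, again using pytest against a small hypothetical component; the component, partitions, and limits are invented for illustration.

    import pytest

    # Hypothetical component under test: classifies an order quantity.
    def classify_quantity(qty):
        if qty < 1 or qty > 999:
            raise ValueError("quantity out of range")
        return "bulk" if qty >= 100 else "standard"

    # Black-box cases organized by input class and exercised at partition
    # boundaries (checklist items 1 and 2); expected results are recorded
    # in the parameter table (checklist item 4).
    @pytest.mark.parametrize("qty, expected", [
        (1, "standard"),      # lower boundary of the valid range
        (99, "standard"),     # just below the bulk partition
        (100, "bulk"),        # first value in the bulk partition
        (999, "bulk"),        # upper boundary of the valid range
    ])
    def test_valid_input_classes(qty, expected):
        assert classify_quantity(qty) == expected

    # Erroneous input class (checklist item 3).
    @pytest.mark.parametrize("qty", [0, 1000, -5])
    def test_invalid_quantities_rejected(qty):
        with pytest.raises(ValueError):
            classify_quantity(qty)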

 

II.12.4 Review all test cases to ensure that adequate test coverage has been achieved. Make revisions as required.

See guidelines for formal technical reviews discussed in Chapter 5, Umbrella Activities, Section 5.2.

 

II.12.5 Execute white-box tests for selected program components according to test plan.

Intent: The intent of this task is to execute white-box tests by following the testing strategy defined in Task II.12.1.

Mechanics: Use information in the test plan to determine the sequence of white-box tests. As errors are found, conduct necessary debugging.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Have the results of each white-box test been saved?

    2. Have all white-box test cases been executed?

    3. Have all errors uncovered been corrected?

    4. Have expected results and actual results been compared for all tests?

Do's & Don'ts

    Do: Keep careful records. The more information you collect on errors, the better your ability to improve the software engineering process.

    Do: Be certain to archive all white-box test cases and the results obtained from them.

Deliverables: Test results and error records
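
A minimal sketch of one way to execute the unit-level suite and archive the results and coverage report, assuming pytest with the pytest-cov plugin is the chosen tool (the APM does not prescribe one); the package and directory names are hypothetical.

    # Assumes pytest and pytest-cov are installed; the tool choice, package
    # name, and paths are hypothetical, not prescribed by the APM.
    import pathlib
    import subprocess

    def run_white_box_suite(results_dir: str = "test-results/white-box") -> int:
        out = pathlib.Path(results_dir)
        out.mkdir(parents=True, exist_ok=True)
        completed = subprocess.run([
            "pytest", "tests/unit",
            f"--junitxml={out / 'results.xml'}",      # archive machine-readable results
            "--cov=myproduct",                        # measure statement coverage
            f"--cov-report=html:{out / 'coverage'}",  # archive the coverage report
        ])
        return completed.returncode   # nonzero means failures remain to be debugged

    if __name__ == "__main__":
        raise SystemExit(run_white_box_suite())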

 

II.12.6 Execute black-box tests for selected program components according to test plan.

Intent: The intent of this task is to execute black-box tests by following the testing strategy defined in Task II.12.1.

Mechanics: Use information in the test plan to determine the sequence of black-box tests. As errors are found, conduct necessary debugging.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Have the results of each black-box test been saved?

    2. Have all black-box test cases been executed?

    3. Have all errors uncovered been corrected?

    4. Have expected results and actual results been compared for all tests?

Do's & Don'ts

    Do: Keep careful records. The more information you collect on errors, the better your ability to improve the software engineering process.

    Do: Be certain to archive all black-box test cases and the results obtained from them.

Deliverables: Test results and error records
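
Keeping the error records called for above in a structured form pays off when the process is later analyzed. The sketch below shows one possible record layout; the field names and CSV format are illustrative choices, not part of the APM.

    # Illustrative only -- the APM requires that error data be recorded,
    # but does not define a record format; these fields are hypothetical.
    import csv
    import os
    from dataclasses import dataclass, asdict, fields

    @dataclass
    class ErrorRecord:
        test_case_id: str        # black-box case that uncovered the error
        module: str              # component in which the error was found
        severity: str            # e.g., "critical", "major", "minor"
        expected_result: str
        actual_result: str
        corrected: bool = False  # updated once debugging is complete

    def append_error_record(record: ErrorRecord, path: str = "error-log.csv") -> None:
        """Append one record to the project error log."""
        write_header = not os.path.exists(path) or os.path.getsize(path) == 0
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(
                f, fieldnames=[fld.name for fld in fields(ErrorRecord)])
            if write_header:
                writer.writeheader()
            writer.writerow(asdict(record))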

 

II.12.7 Prepare all documentation for release.

Intent: The intent of this task is to be certain that all software documentation (both technical documents for internal use and customer documentation) reflects the changes that have been made as a consequence of testing. All documents are prepared for release to the customer.

Mechanics: The log of all modifications to the software made as a consequence of testing is reviewed to ensure that documentation reflects all changes. Documents are reviewed and modified as required.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

    1. Has documentation been revised to reflect changes made during debugging?

    2. Have all errors been recorded and categorized?

    3. Does the documentation accurately reflect user interaction and system behavior?

Do's & Don'ts

    Do: Be certain to use configuration management and control tasks (see Chapter 5, Umbrella Activities) as changes are made to software documentation.

Deliverables: Revised/updated software documentation

 

II.12.8 Define system integration strategy and initiate system integration testing.

Intent: The intent of this task is to develop an approach for integrating the software into the product and conducting tests to validate operation, interfaces and performance.

Mechanics: An integration strategy is developed in conjunction with product engineering and system integration. The software is delivered to the system testing group.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.


