Task definition: Task IV.22 Evaluate the request for changes
IV.22.1 Identify requirements modifications required due to a change request;
IV.22.2 Perform SCM functions;
IV.22.3 Generate engineering change order;
end Task Definition Task IV.22
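Task IV.22.3 above generates an engineering change order from the evaluated change request. A minimal sketch of what such a record might capture (the field names and example values are assumptions for illustration, not part of the APM):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EngineeringChangeOrder:
    """Illustrative engineering change order record; field names are assumptions."""
    eco_id: str                                   # e.g. "ECO-0001"
    change_request_id: str                        # the originating change request
    description: str                              # what is to change and why
    affected_components: list[str] = field(default_factory=list)
    requirements_impacted: list[str] = field(default_factory=list)
    priority: str = "normal"                      # e.g. "low" / "normal" / "urgent"
    approved: bool = False
    date_raised: date = field(default_factory=date.today)

# Hypothetical example: record the outcome of evaluating change request CR-0042.
eco = EngineeringChangeOrder(
    eco_id="ECO-0001",
    change_request_id="CR-0042",
    description="Add export-to-CSV option to the report module",
    affected_components=["report_writer", "ui_menu"],
    requirements_impacted=["REQ-REPORT-03"],
)
print(eco.eco_id, eco.priority, eco.approved)
```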
Task definition: Task IV.23 Write a project plan for making changes
IV.23.1 Develop cost/effort and schedule estimates for the maintenance;
IV.23.2 Develop a Project Plan for the maintenance activity;
end Task Definition Task IV.23
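Task IV.23.1 above develops cost/effort and schedule estimates for the maintenance. A minimal decomposition-based sketch of the arithmetic; the task breakdown, labor rate, and team size are invented figures, not values the APM prescribes:

```python
# Decomposition-based estimate: sum per-task effort, then derive a cost and a
# rough schedule from an assumed team size (illustrative figures throughout).
maintenance_tasks = {                 # person-days per task
    "revise analysis model": 3.0,
    "revise design model": 4.0,
    "code and unit-test changes": 8.0,
    "integration and regression testing": 5.0,
    "documentation and release": 2.0,
}

labor_rate_per_day = 600.0            # assumed fully loaded cost per person-day
team_size = 2                         # assumed full-time staff on the change

effort_days = sum(maintenance_tasks.values())
cost = effort_days * labor_rate_per_day
schedule_days = effort_days / team_size   # crude: ignores task dependencies

print(f"Effort:   {effort_days:.1f} person-days")
print(f"Cost:     ${cost:,.0f}")
print(f"Schedule: about {schedule_days:.1f} working days with {team_size} people")
```

These figures would then feed the Project Plan called for in Task IV.23.2.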
Task definition: Task IV.24 Assess the potential impact of side effects
IV.24.1 Identify maintenance-specific risks and side effects;
IV.24.2 Project likelihood that risks and side effects will occur;
IV.24.3 Project impact of risks and side effects on developers / customers and on overall maintenance strategy;
IV.24.4 Write a Risk/Side Effect Mitigation, Monitoring and Management Plan. Review as required;
end Task Definition Task IV.24
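Tasks IV.24.2 and IV.24.3 above project the likelihood and impact of each risk or side effect. One conventional way to combine the two is risk exposure, RE = probability x cost of occurrence; the risks and numbers in this sketch are placeholders, not data from the APM:

```python
# Risk exposure: RE = probability of occurrence * cost (impact) if it occurs.
# The risks and figures below are illustrative placeholders only.
risks = [
    # (description,                           probability, cost if it occurs)
    ("regression defect in billing module",   0.30,        12_000.0),
    ("schedule slip due to unfamiliar code",  0.50,         8_000.0),
    ("side effect breaks nightly batch job",  0.10,        20_000.0),
]

for description, probability, cost in risks:
    exposure = probability * cost
    print(f"{description:42s} RE = {exposure:9,.0f}")

total_exposure = sum(p * c for _, p, c in risks)
print(f"{'Total projected exposure':42s} RE = {total_exposure:9,.0f}")
```

Risks with the largest exposure are natural candidates for the mitigation, monitoring, and management plan of Task IV.24.4.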
Task definition: Task IV.25 Make necessary changes to the engineering models
IV.25.1 Refine analysis model to reflect application maintenance information;
IV.25.2 Modify the analysis model;
Begin Task IV.25.2
case of: analysis approach
analysis approach = structured analysis
repeat until (all data objects, relationships, and attributes are defined)
25.2.1 Modify the data model using appropriate notation;
25.2.1.1 Identify data objects and control items that require change;
25.2.1.2 Define new data objects required for maintenance;
25.2.1.3 Indicate connections among objects and items;
25.2.1.4 Specify attributes for revised and new data objects;
25.2.1.5 Identify the relationships associated with each connection;
25.2.1.6 Develop or revise an entity relationship diagram, if required;
FTR: review the data model internally;
endrep
repeat until (data flow representation is complete)
25.2.2 Modify the functional model using appropriate notation;
25.2.2.1 Evaluate existing DFDs and define the domain of change;
25.2.2.2 Use grammatical parse on revised statement of scope for the maintenance;
25.2.2.3 Refine data flow model in domain of change;
25.2.2.4 Develop process specifications (PSPECs) for new transforms (functions) at lowest data flow level;
25.2.2.5 Revise process specifications (PSPECs) for modified transforms (functions) at lowest data flow level;
FTR: review the revised data flow model internally;
endrep
repeat until (behavioral representation is complete)
25.2.3 Modify the behavioral model using appropriate notation;
25.2.3.1 Make a list of "events" that drive the system;
25.2.3.2 Indicate those system states that are externally observable;
25.2.3.3 Develop a top-level state transition model that shows how the product moves from state to state;
25.2.3.4 Refine the state transition model;
FTR: review the behavioral model internally;
endrep
analysis approach = object-oriented analysis
do while (class model needs refinement)
repeat until (all classes are defined)
25.2.1' Identify new classes to accommodate the maintenance;
25.2.1'.1 Identify classes/objects using grammatical parse of the enhanced software scope;
25.2.1'.2 Use inheritance to create enhanced classes (see the sketch following Task IV.25.2);
25.2.1'.3 Define relationships between classes;
25.2.1'.4 Define aggregate classes, as appropriate;
25.2.1'.5 Specify the class hierarchy;
FTR: review the class model internally;
review the class model with the customer;
endrep
repeat until (all attributes are defined)
25.2.2' Specify attributes and operations for each class;
25.2.2'.1 Specify attributes for each class by identifying generic information that is relevant to all instances;
25.2.2'.2 Identify the methods (operations) that are relevant to each class;
FTR: review the class model internally;
review the class model with the customer;
endrep
repeat until (class hierarchy is defined)
25.2.3' Modify the class hierarchy using appropriate notation;
25.2.3'.1 Specify public vs. private class attributes and methods;
25.2.3'.2 Describe object behavior;
FTR: review the class model internally;
review the class model with the customer;
endrep
enddo
25.2.4 Specify technical constraints and validation criteria;
25.2.5 Conduct formal technical reviews (FTRs) of model;
25.2.6 Make revisions as required;
25.2.7 Create requirements documentation (form will vary with project) using models as input;
25.2.8 Begin preliminary test planning;
endcase
end Task IV.25.2
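Task 25.2.1'.2 above suggests using inheritance to create enhanced classes during object-oriented analysis of the maintenance change. A minimal sketch of that idea; the Report and CsvReport classes and their behavior are invented for illustration and are not part of the APM:

```python
# The existing class is left untouched; the enhancement required by the
# maintenance change lives in a subclass.  Class names are illustrative.
class Report:
    def __init__(self, title: str, rows: list[dict]):
        self.title = title
        self.rows = rows

    def render(self) -> str:
        return "\n".join([self.title] + [str(row) for row in self.rows])


class CsvReport(Report):
    """Enhanced class introduced for the maintenance change: CSV output."""

    def render(self) -> str:
        if not self.rows:
            return self.title
        header = ",".join(self.rows[0].keys())
        body = [",".join(str(v) for v in row.values()) for row in self.rows]
        return "\n".join([header] + body)


report = CsvReport("Monthly totals", [{"month": "Jan", "total": 120}])
print(report.render())
```

Keeping the base class unchanged localizes the maintenance change, which simplifies the internal and customer reviews called for in the class-model FTRs above.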
IV.25.3 Modify the design model;
Begin Task IV.25.3
case of: design approach
design approach = structured design
repeat until (all data structures are defined)
25.3.1 Modify data structures based on analysis model;
25.3.1.1 Develop a data structure for each new data object;
25.3.1.2 Modify existing data objects, as required;
25.3.1.3 Modify database schema, if required;
25.3.1.4 Specify restrictions associated with each data structure;
FTR: review the data design internally;
endrep
repeat until (program architecture is defined)
25.3.2 Modify program architecture;
25.3.2.1 Map changes in data flow model into a revised program architecture;
25.3.2.2 Partition the architecture both vertically and horizontally and develop a revised structure chart;
FTR: review the architectural design internally;
endrep
repeat until (all program modules are described)
25.3.3 Describe new or revised program modules;
25.3.3.1 Adapt PSPECs (Tasks 25.2.2.4 and 25.2.2.5) to become processing narratives for each new or revised module;
25.3.3.2 Develop description of each new or revised module interface (see the sketch following Task IV.25.3);
25.3.4 Develop procedural (detailed) design for each new or revised module;
25.3.4.1 Develop a list of processing steps, based on processing narrative developed in Task 25.3.3.1;
25.3.4.2 Use stepwise refinement to develop a processing algorithm;
25.3.4.3 Apply "proof of correctness" techniques to verify correctness of the procedural design;
FTR: review the procedural design internally;
endrep
design approach = object-oriented design
repeat until (all subsystems are defined)
25.3.1' Modify each subsystem affected by the maintenance;
25.3.1'.1 Establish the level of concurrency for the problem;
25.3.1'.2 Allocate subsystems to processors;
25.3.1'.3 Allocate system functions to tasks;
25.3.1'.4 Define methods for managing global resources;
25.3.1'.5 Establish system control structure;
25.3.1'.6 Define mechanisms for initiation, termination and error handling;
FTR: review the system design internally;
endrep
repeat until (object design is complete)
25.3.2' Engineer reusable components corresponding to the class structure;
25.3.2'.1 Design classes in the problem domain;
25.3.2'.2 Design classes for the human-computer interface;
25.3.2'.3 Design classes for data management;
25.3.2'.4 Design classes for task management;
25.3.3' Modify object structure;
25.3.3'.1 Translate attributes into corresponding data structures;
25.3.3'.2 Design algorithms for each method, apply steps noted in Task 25.3.4;
25.3.4' Modify message structure;
25.3.4'.1 Review the class relationship model developed in Task 25.3.1'.2;
25.3.4'.2 Define messages associated with each relationship;
25.3.4'.3 Create a message template [Booch, 1992];
25.3.4'.4 Consider message synchronization issues;
FTR: review the system design internally;
endrep
25.3.5 Specify revised interface design, if required;
25.3.6 Modify design documentation (form will vary with project) using the modified design model as input;
25.3.7 Conduct formal technical reviews (FTR) of model;
25.3.8 Make revisions as required;
endcase
end Task IV.25.3
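Tasks 25.3.3.2 and 25.3.4 above call for an interface description and a procedural (detailed) design for each new or revised module. A minimal sketch of what those might look like for a single module; the function, its parameters, and the CSV format are assumptions used only for illustration:

```python
def export_report_to_csv(rows: list[dict], path: str, delimiter: str = ",") -> int:
    """Revised module interface (illustrative).

    Inputs:  rows      - report records, one dict per row
             path      - destination file for the CSV output
             delimiter - field separator (default: comma)
    Output:  number of data rows written
    """
    # Procedural design by stepwise refinement:
    #   1. Handle the empty-input case.
    #   2. Derive the header line from the first row's keys.
    #   3. Write the header, then one line per row.
    if not rows:
        return 0
    header = delimiter.join(rows[0].keys())
    with open(path, "w", encoding="utf-8") as out:
        out.write(header + "\n")
        for row in rows:
            out.write(delimiter.join(str(v) for v in row.values()) + "\n")
    return len(rows)
```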
IV.25.4 Create a revised test plan and procedure for maintenance;
end Task Definition Task IV.25
Task definition: Task IV.26 Implement the change
repeat until (deliverable is finalized)
if deliverable is non-executable then
DPP: develop appropriate documentation;
else {source code is to be generated}
do while (new or revised modules remain to be coded)
IV.26.1 Select program components to be revised;
IV.26.2 Modify code in the appropriate programming language;
IV.26.3 Conduct code walkthroughs (FTR) of modified or new components;
IV.26.4 Make revisions as required;
enddo
endif
endrep
end Task Definition Task IV.26
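Task IV.26.2 above modifies the code itself. One common way to keep such a change low-risk is sketched below: the new behavior sits behind a defaulted parameter so existing callers are unaffected, and a comment records the (hypothetical) change order it traces to; the function and ECO number are invented for illustration:

```python
# Before the maintenance change (illustrative):
#
#     def format_total(amount: float) -> str:
#         return f"Total: {amount:.2f}"
#
# After the change (traces to hypothetical ECO-0001): an optional currency
# symbol is added behind a defaulted parameter, so existing call sites keep
# their original behavior.
def format_total(amount: float, currency_symbol: str = "") -> str:
    return f"Total: {currency_symbol}{amount:.2f}"

print(format_total(19.5))          # Total: 19.50   (unchanged behavior)
print(format_total(19.5, "$"))     # Total: $19.50  (new capability)
```

A before/after view like this also gives the code walkthrough of Task IV.26.3 a clear, reviewable diff.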
Task definition: Task IV.27 Verify and validate revised deliverable
IV.27.1 Revise existing test plan and procedure to accommodate the maintenance changes;
Begin Task IV.27.1
repeat until (test strategy is finalized)
27.1.1 Review validation criteria and existing test plan, if one has been written; modify to accommodate the maintenance;
27.1.1.1 Review the test plan to determine if project schedule provides completed modules when needed for testing;
27.1.1.2 Modify the test plan to conform to the actual module completion dates;
27.1.1.3 Review validation criteria to ensure that requirements changes have been accommodated;
27.1.1.4 Add new classes of tests, as required;
27.1.2 Describe the integration strategy for the maintenance;
27.1.2.1 Select order of functional implementation for the maintenance;
27.1.2.2 Define integration strategy for the maintenance;
27.1.2.3 Design clusters (builds) and indicate where stubs/drivers will be required (see the stub/driver sketch following Task IV.27.1);
parallel activity: CreateTestSoftware
do while (stubs and drivers remain to be developed)
analyze requirements/interfaces for stubs and drivers;
design stubs and drivers;
generate code for stubs/drivers;
test stubs and drivers to ensure accurate interfaces and operation;
DPP: create operating instructions for stubs and drivers;
enddo
endparallel
27.1.2.4 Consider regression testing requirements and indicate where regression tests are to be used;
27.1.2.5 Specify special testing resources;
27.1.3 Identify program components that require unit testing;
27.1.3.1 Define criteria for selection of unit test candidates;
27.1.3.2 Determine whether unit tests can be appropriately scheduled;
27.1.3.3 Estimate resources required to conduct unit testing;
endrep
end Task IV.27.1
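The CreateTestSoftware activity above develops stubs and drivers for integration testing. A minimal sketch of both, assuming a hypothetical currency-conversion module under test (none of these names come from the APM): the stub stands in for a dependency that is not yet integrated, and the driver exercises the module and compares expected with actual results.

```python
# Stub: replaces a lower-level module that is not yet integrated, honoring the
# expected interface but returning canned data.
def fetch_exchange_rate_stub(currency: str) -> float:
    canned = {"EUR": 1.10, "GBP": 1.27}
    return canned.get(currency, 1.0)


# Module under test (illustrative): takes its dependency as a parameter so the
# real rate source can be substituted once it is integrated.
def convert_to_usd(amount: float, currency: str, fetch_rate=fetch_exchange_rate_stub) -> float:
    return round(amount * fetch_rate(currency), 2)


# Driver: invokes the module under test and reports expected vs. actual results.
def driver() -> None:
    cases = [((100.0, "EUR"), 110.0), ((50.0, "GBP"), 63.5)]
    for (amount, currency), expected in cases:
        actual = convert_to_usd(amount, currency)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: convert_to_usd({amount}, {currency!r}) = {actual}, expected {expected}")


if __name__ == "__main__":
    driver()
```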
IV.27.2 Design test cases to exercise all changes made during the maintenance activity;
Begin Task IV.27.2
repeat until (white box test cases are designed)
27.2.1 Review module procedural design to determine most appropriate white-box method for new and revised modules;
27.2.1.1 Compute module cyclomatic complexity, if this has not already been done;
27.2.1.2 Isolate all independent logical paths;
27.2.1.3 Isolate all loops;
27.2.2 Design basis path tests;
27.2.2.1 For each independent path, examine component interface to determine inputs that will force module to execute that path;
27.2.2.2 Specify output values/conditions that are expected;
27.2.2.3 Indicate any special testing considerations;
27.2.3 Design loop tests;
27.2.3.1 For each loop specify type;
27.2.3.2 For each independent path, examine component interface to determine inputs that will force module to execute loops as required by loop test criteria;
27.2.3.3 Specify loop exit values/conditions that are expected;
27.2.4 Design other white-box tests;
27.2.4.1 Consider efficacy of data flow testing and design tests if appropriate;
27.2.4.2 Consider efficacy of branch and relational operator testing and design tests if appropriate;
27.2.4.3 Consider efficacy of other white-box tests and design tests if appropriate;
endrep
repeat until (black box test cases are designed)
27.2.5 Review existing black box tests to determine those that are applicable to the enhanced system;
27.2.5.1 Examine architecture to isolate modified component clusters (builds) that are candidates for black-box testing;
27.2.5.2 Isolate local requirements for each program cluster;
27.2.5.3 Review broad program requirements and test classes specified during analysis activities;
27.2.6 Design black-box tests for each new or revised component cluster for use during integration testing;
27.2.6.1 Define equivalence classes for data that flow into the cluster;
27.2.6.2 Define boundary values for data that flow into the cluster;
27.2.6.3 Specify corresponding equivalence class and boundary value tests (see the sketch following Task IV.27.2);
27.2.6.4 Specify expected output;
27.2.7 Design black-box tests for the enhanced program as a whole;
27.2.7.1 Define equivalence classes for data that flow into the program;
27.2.7.2 Define boundary values for data that flow into the program;
27.2.7.3 Specify corresponding equivalence class and boundary value tests for the program;
27.2.7.4 Specify expected output;
endrep
end Task IV.27.2
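Tasks 27.2.6 and 27.2.7 above derive black-box tests from equivalence classes and boundary values. A minimal sketch using Python's standard unittest module; the apply_discount function and its valid 0-100 range are invented purely to illustrate the technique:

```python
import unittest


def apply_discount(price: float, percent: float) -> float:
    """Module under test (illustrative): percent must lie in [0, 100]."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (1 - percent / 100), 2)


class DiscountBlackBoxTests(unittest.TestCase):
    def test_valid_equivalence_class(self):
        # One representative value from the valid class (0 < percent < 100).
        self.assertEqual(apply_discount(200.0, 25.0), 150.0)

    def test_boundary_values(self):
        # Values at the boundaries of the valid range and just outside them.
        self.assertEqual(apply_discount(200.0, 0.0), 200.0)
        self.assertEqual(apply_discount(200.0, 100.0), 0.0)
        with self.assertRaises(ValueError):
            apply_discount(200.0, -0.01)
        with self.assertRaises(ValueError):
            apply_discount(200.0, 100.01)


if __name__ == "__main__":
    unittest.main()
```

The same pattern extends from a single component cluster (Task 27.2.6) to the enhanced program as a whole (Task 27.2.7); only the inputs and expected outputs change.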
IV.27.3 Execute white-box and black-box tests with application maintenance refinements;
Begin Task IV.27.3
27.3.0 Review all test cases to ensure that adequate test coverage has been achieved. Make revisions as required;
repeat until (white box tests have been executed)
27.3.1 Execute a white box test case;
27.3.1.1 Compare expected and actual results;
27.3.1.2 Note discrepancies;
27.3.1.3 Record all errors for later correction or immediate action. Create an error log (see the sketch following Task IV.27.3);
parallel activity: debugging
do while (errors remain to be corrected)
diagnose error symptom;
if cause of error is known then
27.3.2 Correct error
invoke Task IV.27.3.1;
27.3.3 Modify source code and design model based on debugging changes. Create a change log;
27.3.4 Review (FTR) all substantive changes for side effects;
else {cause of error is not known}
invoke Task IV.27.3.1 to exercise code connected to symptom;
ask colleagues to assist in diagnosis;
endif
enddo
endparallel
endrep
repeat until (black box tests have been executed)
27.3.5 Execute a black box test case;
27.3.5.1 Compare expected and actual results;
27.3.5.2 Note discrepancies;
27.3.5.3 Record all errors for later correction or immediate action. Create an error log;
parallel activity: debugging
do while (errors remain to be corrected)
diagnose error symptom;
if cause of error is known then
27.3.6 Correct error
invoke Task IV.27.3.5;
27.3.7 Modify source code and design model based on debugging changes. Create a change log;
27.3.8 Review (FTR) all substantive changes for side effects;
else {cause of error is not known}
invoke Task IV.27.3.5 to exercise code connected to symptom;
ask colleagues to assist in diagnosis;
endif
enddo
endparallel
endrep
end Task IV.27.3
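Tasks 27.3.1.1-27.3.1.3 and their black-box counterparts compare expected with actual results and record every discrepancy in an error log. A minimal sketch of such a log; the field names are assumptions, not a format the APM prescribes:

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class ErrorLogEntry:
    """Illustrative error-log record for a failed test case."""
    test_id: str
    component: str
    expected: str
    actual: str
    logged_at: datetime = field(default_factory=datetime.now)
    status: str = "open"              # e.g. "open" -> "fixed" -> "verified"


error_log: list[ErrorLogEntry] = []


def record_result(test_id: str, component: str, expected, actual) -> None:
    """Compare expected and actual results; log a discrepancy if they differ."""
    if expected != actual:
        error_log.append(ErrorLogEntry(test_id, component, str(expected), str(actual)))


# Hypothetical example: a white-box test whose output disagrees with expectation.
record_result("WB-017", "report_writer", expected=150.0, actual=149.99)
for entry in error_log:
    print(f"{entry.test_id} [{entry.component}] expected {entry.expected}, got {entry.actual} ({entry.status})")
```

Entries marked fixed would then drive the change log and the side-effect reviews of Tasks 27.3.3 and 27.3.4.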
IV.27.5 Conduct regression tests;
IV.27.6 Prepare all application maintenance documentation;
IV.27.7 Develop a strategy for re-introduction of the software;
Begin Task IV.27.7
27.7.1 Define system integration strategy and initiate system integration testing (see Task II.12.8.1);
27.7.2 Establish customer integration and live-test strategy;
27.7.3 Revise and implement customer training, as required (see Task II.13.2);
end Task IV.27.7
IV.27.8 Create a revised Software Release Package;
Begin Task IV.27.8
27.8.1 Ensure that all documents, data and programs are placed under formal configuration control (see Chapter 5, Umbrella Activity U.4);
27.8.2 Construct a set of deliverables for the customer;
27.8.3 Define a set of internal documentation for technical reference;
27.8.4 Develop a System Support Plan (large projects only);
27.8.5 Apply final SQA procedures to ensure completeness;
27.8.6 Write release memo;
end Task IV.27.8
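Task 27.8.5 above applies final SQA procedures to ensure completeness. One simple completeness check is sketched below; the deliverable file names and package directory are assumptions for illustration only:

```python
from pathlib import Path

# Illustrative deliverables list; the real contents come from Tasks 27.8.2-27.8.4.
REQUIRED_DELIVERABLES = [
    "release_memo.txt",
    "customer_documentation.pdf",
    "system_support_plan.pdf",
    "software_release_package.zip",
]


def check_release_package(package_dir: str) -> list[str]:
    """Return the required deliverables missing from the package directory."""
    root = Path(package_dir)
    return [name for name in REQUIRED_DELIVERABLES if not (root / name).exists()]


missing = check_release_package("release/v2.1")   # hypothetical location
if missing:
    print("Release package incomplete; missing:", ", ".join(missing))
else:
    print("All required deliverables are present.")
```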
IV.27.9 Roll out to each customer;
end Task Definition Task IV.27
Task definition: Task IV.28 Release the modified software
IV.28.1 Deliver the modified software to each customer;
IV.28.2 Define feedback mechanisms;
IV.28.3 Update the help-desk to accommodate multiple customer locations;
IV.28.4 Conduct training and provide implementation support;
IV.28.5 Collect and analyze customer feedback;
end Task Definition Task IV.28