Adaptable Process Model
New Application Development
Process Design (Tasks)

Task definition: Task II.7 Translate concept to software scope

    II.7.1 Conduct meeting with customer to acquire application-domain-specific or product-domain-specific information;
    II.7.2 Define data domain of the application;
    II.7.3 Define the functions/behaviors that the software is to perform and the computing environment in which the software will reside;
    II.7.4 Define technical constraints;
    II.7.5 Define validation criteria;
    II.7.6 Make a quick estimate of project size; {a sizing sketch follows this task}
    II.7.7 Create a statement of Software Scope;

endTask definition: Task II.7
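
Task II.7.6 asks only for a quick estimate. One common way to get one, which the APM does not mandate, is basic COCOMO; the sketch below uses the published organic-mode coefficients, and the 33.2 KLOC input is a hypothetical size figure.

# Quick project-size/effort estimate (Task II.7.6).
# Basic COCOMO, organic mode -- one possible technique; the APM does not
# prescribe a specific estimation model.

def basic_cocomo(kloc: float, a: float = 2.4, b: float = 1.05) -> dict:
    """Estimate effort (person-months) and duration from size in KLOC."""
    effort = a * kloc ** b            # person-months
    duration = 2.5 * effort ** 0.38   # elapsed months (organic mode)
    return {"effort_pm": round(effort, 1), "duration_months": round(duration, 1)}

if __name__ == "__main__":
    print(basic_cocomo(33.2))  # e.g. a 33.2 KLOC estimate from the scope statement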

 

Task definition: Task II.8 Plan the software project

    II.8.1 Ensure that scope is defined adequately;
    II.8.2 Research or reassess the availability of existing software that can achieve part or all of the defined scope;
    II.8.3 Develop the project plan;
    II.8.4 Consider viability of subcontracting software development to a third party;
    II.8.5 Review the project plan with management and customer;
    II.8.6 Make revisions as required;

endTask definition: Task II.8



Task definition: Task II.9 Assess project risk

    {implemented via Risk analysis umbrella task}

endTask definition: Task II.9

 

Task definition: Task II.10 Develop an engineering model

II.10.1 Review description of data, function and behavioral information;
II.10.2 Refine data, function and behavioral information, as required;
II.10.3 Create or extend a prototype of the software;
begin Task II.10.3
    II.10.3.1 Create a paper model (e.g., a mathematical model) of the software;
    II.10.3.2 Create a mock-up prototype of the software;
    II.10.3.3 Create a simulation model of the software;
    II.10.3.4 Build an operational prototype;
endtask Task II.10.3

II.10.4 Engineer an analysis model for the software;
begin Task II.10.4
case of: analysis approach

analysis approach = structured analysis
    repeat until (all data objects, relationships, attributes are defined)
        10.4.1 Engineer a data model using appropriate notation; {a data-model sketch follows this block}
            10.4.1.1 Identify data objects and control items;
            10.4.1.2 Indicate connections among objects and items;
            10.4.1.3 Specify attributes for data objects;
            10.4.1.4 Identify the relationships associated with each connection;
            10.4.1.5 Develop an entity relationship diagram, if required;
        FTR: review the data model internally;
             review the data model with the customer;
    endrep
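
To make Task 10.4.1 concrete, the data objects, attributes and connections collected in steps 10.4.1.1 through 10.4.1.4 can be recorded directly in code. The sketch below is a minimal illustration in Python; the Customer and Order objects are hypothetical examples, not part of the APM.

# Hypothetical data model fragment for Task 10.4.1: two data objects, their
# attributes (10.4.1.3), and a named 1:N relationship (10.4.1.4).
from dataclasses import dataclass, field

@dataclass
class Customer:
    customer_id: int
    name: str
    orders: list["Order"] = field(default_factory=list)  # "places" relationship, 1:N

@dataclass
class Order:
    order_id: int
    total: float
    customer_id: int  # reference back to the owning Customer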

    repeat until (data flow representation is complete)
        10.4.2 Engineer a functional model using appropriate notation;
            10.4.2.1 Develop a "context level" data flow model;
            10.4.2.2 Use grammatical parse as method for refining to next level;
            10.4.2.3 Refine data flow model to multiple levels;
            10.4.2.4 Develop process specifications (PSPECs) for transforms (functions) at lowest data flow level; {a PSPEC sketch follows this block}
        FTR: review the data flow model internally;
             review the data flow model (top levels only) with the customer;
    endrep
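
A PSPEC (Task 10.4.2.4) specifies the processing of a single lowest-level transform in the data flow model. A minimal sketch follows; the "validate sensor reading" transform and its value range are hypothetical.

# Hypothetical PSPEC for one lowest-level transform (Task 10.4.2.4):
# "validate sensor reading" -- the inputs and outputs match the DFD arrows.
def validate_reading(raw_value: float, low: float, high: float) -> tuple[bool, float]:
    """PSPEC: clamp the reading to its legal range and flag out-of-range input."""
    in_range = low <= raw_value <= high
    clamped = min(max(raw_value, low), high)
    return in_range, clamped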

    repeat until (behavioral representation is complete)
        10.4.3 Engineer a behavioral model using appropriate notation;
            10.4.3.1 Make a list of "events" that drive the system;
            10.4.3.2 Indicate those system states that are externally observable;
            10.4.3.3 Develop a top-level state transition model that shows how the product moves from state to state; {a state-model sketch follows this block}
            10.4.3.4 Refine the state transition model;
        FTR: review the behavioral model internally;
             review the behavioral model with the customer;
    endrep
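
One lightweight way to record the top-level state transition model of Task 10.4.3.3 is as a transition table keyed by (state, event). The states and events below are hypothetical, for illustration only.

# Hypothetical top-level state transition model (Task 10.4.3.3): externally
# observable states (10.4.3.2) and the events (10.4.3.1) that move the
# product between them.
TRANSITIONS = {
    ("idle",       "start_command"): "monitoring",
    ("monitoring", "alarm_event"):   "alarm",
    ("alarm",      "reset_command"): "idle",
}

def next_state(state: str, event: str) -> str:
    """Return the successor state; undefined (state, event) pairs are ignored."""
    return TRANSITIONS.get((state, event), state)

assert next_state("idle", "start_command") == "monitoring"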

analysis approach = object-oriented analysis
    do while (class model needs refinement) {a class-model sketch follows this block}
        repeat until (all classes are defined)
            10.4.1' Identify candidate classes for the system;
                10.4.1'.1 Identify classes/objects using grammatical parse of the Software Scope;
                10.4.1'.2 Define relationships between classes;
                10.4.1'.3 Define aggregate classes, as appropriate;
                10.4.1'.4 Specify the class hierarchy;
            FTR: review the class model internally;
                 review the class model with the customer;
        endrep
        repeat until (all attributes are defined)
            10.4.2' Specify attributes and operations for each class;
                10.4.2'.1 Specify attributes for each class by identifying generic information that is relevant to all instances;
                10.4.2'.2 Identify the methods (operations) that are relevant to each class;
            FTR: review the class model internally;
                 review the class model with the customer;
        endrep
        repeat until (class hierarchy is defined)
            10.4.3' Engineer class hierarchy using appropriate notation;
                10.4.3'.1 Specify public vs. private class attributes and methods;
                10.4.3'.2 Describe object behavior;
            FTR: review the class model internally;
                 review the class model with the customer;
        endrep
    enddo
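
The class model produced by Tasks 10.4.1' through 10.4.3' can be sketched as skeleton code. The fragment below is a hypothetical illustration: a small hierarchy, attributes common to all instances, an operation, and a private attribute marked by Python's underscore convention.

# Hypothetical class-model fragment for Tasks 10.4.1'-10.4.3': a hierarchy
# (Sensor -> TemperatureSensor), shared attributes (10.4.2'.1), an operation
# (10.4.2'.2), and a private attribute (10.4.3'.1).
class Sensor:
    def __init__(self, sensor_id: str, location: str):
        self.sensor_id = sensor_id      # public attributes shared by all instances
        self.location = location
        self._last_value = None         # private: internal state only

    def read(self) -> float:            # operation relevant to every Sensor
        raise NotImplementedError

class TemperatureSensor(Sensor):        # specialization in the class hierarchy
    def read(self) -> float:
        self._last_value = 21.5         # stand-in for a real device read
        return self._last_value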

10.4.4 Specify technical constraints and validation criteria;
10.4.5 Conduct formal technical reviews (FTRs) of model;
10.4.6 Make revisions as required;
10.4.7 Create requirements documentation (form will vary with project) using models as input;
10.4.8 Begin preliminary test planning;

endcase
endtask Task II.10.4

II.10.5 Engineer a design model for the software;
begin Task II.10.5
case of: design approach

design approach = structured design
    repeat until (all data structures are defined)
        10.5.1 Engineer data structures based on analysis model;
            10.5.1.1 Develop a data structure that is appropriate for each data object;
            10.5.1.2 Identify a database schema, if required;
            10.5.1.3 Specify restrictions associated with each data structure;
        FTR: review the data design internally;
    endrep
    repeat until (program architecture is defined)
        10.5.2 Engineer program architecture;
            10.5.2.1 Select appropriate architectural style(s) from available architectural patterns;
            10.5.2.2 Evaluate alternative styles to determine best fit for application;
            if architectural style is a call-and-return pattern then
                10.5.2.3 Map data flow model into a program architecture;
                10.5.2.4 Partition the architecture both vertically and horizontally and develop a structure chart;
            endif
        FTR: review the architectural design internally;
    endrep
    repeat until (all program modules are described)
        10.5.3 Describe program modules;
            10.5.3.1 Adapt PSPECs (Task 10.4.2.4) to become processing narratives for each module;
            10.5.3.2 Develop description of each module interface;
        10.5.4 Develop procedural (detailed) design for each module;
            10.5.4.1 Develop a list of processing steps, based on processing narrative developed in Task 10.5.3.1;
            10.5.4.2 Use stepwise refinement to develop a processing algorithm; {a refinement sketch follows this block}
            10.5.4.3 Apply "proof of correctness" techniques to verify correctness of the procedural design;
        FTR: review the procedural design internally;
    endrep
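
Stepwise refinement (Task 10.5.4.2) elaborates the processing narrative into an algorithm one level of detail at a time. The sketch below traces two refinement steps in comments for a hypothetical module; the code itself is the final refinement.

# Stepwise refinement (Task 10.5.4.2) on a hypothetical module whose
# processing narrative reads: "read readings, discard invalid ones,
# report the average".
def average_valid_readings(readings: list[float], low: float, high: float) -> float:
    # Refinement 1: filter, then aggregate.
    # Refinement 2: "filter" = keep values inside [low, high];
    #               "aggregate" = arithmetic mean, guarding the empty case.
    valid = [r for r in readings if low <= r <= high]
    if not valid:
        raise ValueError("no valid readings")
    return sum(valid) / len(valid)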

design approach = object-oriented design
    repeat until (all subsystems are defined)
        10.5.1' Define each subsystem;
            10.5.1'.1 Establish the level of concurrency for the problem;
            10.5.1'.2 Allocate subsystems to processors;
            10.5.1'.3 Allocate system functions to tasks;
            10.5.1'.4 Define methods for managing global resources;
            10.5.1'.5 Establish system control structure;
            10.5.1'.6 Define mechanisms for initiation, termination and error handling;
        review the system design internally;
    endrep
    repeat until (object design is complete)
        10.5.2' Engineer reusable components corresponding to the class structure;
            10.5.2'.1 Design classes in the problem domain;
            10.5.2'.2 Design classes for the human-computer interface;
            10.5.2'.3 Design classes for data management;
            10.5.2'.4 Design classes for task management;
        10.5.3' Design object structure;
            10.5.3'.1 Translate attributes into corresponding data structures;
            10.5.3'.2 Design algorithms for each method, applying the steps noted in Task 10.5.4;
        10.5.4' Design message structure; {a message sketch follows this block}
            10.5.4'.1 Review the class relationship model developed in Task 10.4.1'.2;
            10.5.4'.2 Define messages associated with each relationship;
            10.5.4'.3 Create a message template [Booch, 1992];
            10.5.4'.4 Consider message synchronization issues;
        review the system design internally;
    endrep
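
A message template in the spirit of Task 10.5.4' pairs each class relationship with the messages that travel along it. The sketch below is a rough, hypothetical rendering of the sender/receiver/operation form, with a queue standing in for the synchronization decision of Task 10.5.4'.4 (asynchronous delivery, in this case).

# Hypothetical message structure for Task 10.5.4': a message template and a
# queue that makes the synchronization choice explicit.
import queue
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    receiver: str
    operation: str
    arguments: dict

outbox: "queue.Queue[Message]" = queue.Queue()   # asynchronous: sender never blocks
outbox.put(Message("Controller", "Sensor", "read", {}))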

10.5.5 Specify interface design, if required;
10.5.6 Create design documentation (form will vary with project) using the design model as input;
10.5.7 Conduct formal technical reviews (FTRs) of model;
10.5.8 Make revisions as required;

endcase
endtask Task II.10.5

II.10.6 Create preliminary test plan and procedure for the software;

endTask definition: Task II.10

     

Task definition: Task II.11 Construct a deliverable

repeat until (deliverable is finalized)
    if deliverable is non-executable then
        DPP: develop appropriate documentation;
    else {source code is to be generated}
        do while (modules remain to be coded)
            II.11.1 Select program components to be coded;
            II.11.2 Code in the appropriate programming language;
            II.11.3 Conduct code walkthroughs (FTR) of selected components;
            II.11.4 Make revisions as required;
        enddo
    endif
endrep

endTask definition: Task II.11

     

Task definition: Task II.12 Verify and validate the deliverable

II.12.1 Finalize test plan and strategy;
begin Task II.12.1
repeat until (test strategy is finalized)
    12.1.1 Review validation criteria and preliminary test plan, if one has been written; modify, if req'd;
        12.1.1.1 Review the test plan to determine if project schedule provides completed modules when needed for testing;
        12.1.1.2 Modify the test plan to conform to the actual module completion dates;
        12.1.1.3 Review validation criteria to ensure that requirements changes have been accommodated;
        12.1.1.4 Add new classes of tests, as required;
    12.1.2 Describe the integration strategy for the system;
        12.1.2.1 Select order of functional implementation;
        12.1.2.2 Define integration strategy for each function;
        12.1.2.3 Design clusters (builds) and indicate where stubs/drivers will be required;
        parallel activity: CreateTestSoftware {a stub/driver sketch follows this task}
            do while (stubs and drivers remain to be developed)
                analyze requirements/interfaces for stubs and drivers;
                design stubs and drivers;
                generate code for stubs/drivers;
                test stubs and drivers to ensure accurate interfaces and operation;
                DPP: create operating instructions for stubs and drivers;
            enddo
        endparallel
        12.1.2.4 Consider regression testing requirements and indicate where regression tests are to be used;
        12.1.2.5 Specify special testing resources;
    12.1.3 Identify program components that require unit testing;
        12.1.3.1 Define criteria for selection of unit test candidates;
        12.1.3.2 Determine whether unit tests can be appropriately scheduled;
        12.1.3.3 Estimate resources required to conduct unit testing;
endrep
end Task II.12.1
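
The CreateTestSoftware activity produces stubs (stand-ins for modules not yet coded) and drivers (stand-ins for callers of the module under test). The sketch below shows one of each; the pricing module and process_order function are hypothetical.

# Hypothetical stub and driver for the CreateTestSoftware activity.
# process_order() depends on a pricing module that is not yet coded; the
# stub honors its interface with canned data, and the driver supplies
# inputs and checks results in place of the real caller.

def pricing_stub(item_id: str) -> float:
    """Stub: same interface as the real pricing module, canned response."""
    return {"A1": 10.0, "B2": 25.0}.get(item_id, 0.0)

def process_order(item_id: str, qty: int, price_lookup=pricing_stub) -> float:
    """Module under test: total cost for an order line."""
    return price_lookup(item_id) * qty

def driver() -> None:
    """Driver: exercises process_order and verifies expected results."""
    assert process_order("A1", 3) == 30.0
    assert process_order("ZZ", 5) == 0.0   # unknown item priced at zero by the stub
    print("all driver checks passed")

if __name__ == "__main__":
    driver()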

     

II.12.2 Design white-box tests for selected program components;
begin Task II.12.2
repeat until (white box test cases are designed)
    12.2.1 Review module procedural design to determine most appropriate white-box method;
        12.2.1.1 Compute module cyclomatic complexity, if this has not already been done; {a complexity sketch follows this task}
        12.2.1.2 Isolate all independent logical paths;
        12.2.1.3 Isolate all loops;
    12.2.2 Design basis path tests;
        12.2.2.1 For each independent path, examine component interface to determine inputs that will force module to execute that path;
        12.2.2.2 Specify output values/conditions that are expected;
        12.2.2.3 Indicate any special testing considerations;
    12.2.3 Design loop tests;
        12.2.3.1 For each loop, specify type;
        12.2.3.2 For each independent path, examine component interface to determine inputs that will force module to execute loops as required by loop test criteria;
        12.2.3.3 Specify loop exit values/conditions that are expected;
    12.2.4 Design other white-box tests;
        12.2.4.1 Consider efficacy of data flow testing and design tests if appropriate;
        12.2.4.2 Consider efficacy of branch-relational operator testing and design tests if appropriate;
        12.2.4.3 Consider efficacy of other white-box tests and design tests if appropriate;
endrep
end Task II.12.2
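
Task 12.2.1.1 computes cyclomatic complexity, which also fixes the number of basis paths to cover in Task 12.2.2. The sketch below uses the decision-count form V(G) = d + 1 with a deliberately crude keyword count; a production tool would walk the parse tree rather than use a regular expression.

# Cyclomatic complexity sketch for Task 12.2.1.1, assuming straightforward
# Python source as input.
import re

DECISION = re.compile(r"\b(if|elif|for|while|and|or|case)\b")

def cyclomatic_complexity(source: str) -> int:
    """Approximate V(G) = decision points + 1; equals the number of
    independent paths to exercise in basis path testing (Task 12.2.2)."""
    return len(DECISION.findall(source)) + 1

snippet = """
if a > 0 and b > 0:
    x = 1
elif a < 0:
    x = -1
"""
print(cyclomatic_complexity(snippet))  # 4 -> four basis paths to cover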

     

II.12.3 Design black-box tests for selected program components and for the integrated system;
begin Task II.12.3
repeat until (black box test cases are designed)
    12.3.1 Review program architecture and software requirements to determine most effective black-box testing strategy;
        12.3.1.1 Examine architecture to isolate component clusters (builds) that are candidates for black-box testing;
        12.3.1.2 Isolate local requirements for each program cluster;
        12.3.1.3 Review broad program requirements and test classes specified during analysis activities;
    12.3.2 Design black-box tests for each component cluster for use during integration testing; {a test-design sketch follows this task}
        12.3.2.1 Define equivalence classes for data that flow into the cluster;
        12.3.2.2 Define boundary values for data that flow into the cluster;
        12.3.2.3 Specify corresponding equivalence class and boundary value tests;
        12.3.2.4 Specify expected output;
    12.3.3 Design black-box tests for program as a whole;
        12.3.3.1 Define equivalence classes for data that flow into the program;
        12.3.3.2 Define boundary values for data that flow into the program;
        12.3.3.3 Specify corresponding equivalence class and boundary value tests for the program;
        12.3.3.4 Specify expected output;
endrep
end Task II.12.3
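
Tasks 12.3.2.1 through 12.3.2.4 translate directly into table-driven test cases. The sketch below assumes pytest as the runner; the accept_quantity interface and its 1..100 range are hypothetical.

# Equivalence-class and boundary-value test design (Tasks 12.3.2.1-12.3.2.4)
# for a hypothetical cluster input: an integer quantity valid in 1..100.
import pytest  # assumes pytest is available as the test runner

def accept_quantity(qty: int) -> bool:
    """Hypothetical cluster interface under test."""
    return 1 <= qty <= 100

@pytest.mark.parametrize("qty,expected", [
    (50, True),                  # valid equivalence class (representative)
    (-3, False),                 # invalid class: below range
    (999, False),                # invalid class: above range
    (0, False), (1, True),       # lower boundary and its neighbor
    (100, True), (101, False),   # upper boundary and its neighbor
])
def test_quantity_boundaries(qty, expected):
    assert accept_quantity(qty) is expected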

II.12.4 Review all test cases (FTR) to ensure that adequate test coverage has been achieved; make revisions as required;

II.12.5 Execute white-box tests for selected program components according to test plan;
begin Task II.12.5
repeat until (white box tests have been executed)
    12.5.1 Execute a white box test case;
        12.5.1.1 Compare expected and actual results;
        12.5.1.2 Note discrepancies;
        12.5.1.3 Record all errors for later correction or immediate action; create an error log;
    parallel activity: debugging
        do while (errors remain to be corrected)
            diagnose error symptom;
            if cause of error is known then
                12.5.2 Correct error;
                invoke Task II.12.5;
                12.5.3 Modify source code and design model based on debugging changes; create a change log;
                12.5.4 Review (FTR) all substantive changes for side effects;
            else {cause of error is not known}
                invoke Task II.12.2 to exercise code connected to symptom;
                ask colleagues to assist in diagnosis;
            endif
        enddo
    endparallel
endrep
end Task II.12.5

     

II.12.6 Execute black-box tests for selected program components according to test plan;
begin Task II.12.6
repeat until (black box tests have been executed)
    12.6.1 Execute a black box test case;
        12.6.1.1 Compare expected and actual results;
        12.6.1.2 Note discrepancies;
        12.6.1.3 Record all errors for later correction or immediate action; create an error log;
    parallel activity: debugging
        do while (errors remain to be corrected)
            diagnose error symptom;
            if cause of error is known then
                12.6.2 Correct error;
                invoke Task II.12.6;
                12.6.3 Modify source code and design model based on debugging changes; create a change log;
                12.6.4 Review (FTR) all substantive changes for side effects;
            else {cause of error is not known}
                invoke Task II.12.3 to exercise code connected to symptom;
                ask colleagues to assist in diagnosis;
            endif
        enddo
    endparallel
endrep
end Task II.12.6

     

II.12.7 Prepare all documentation for release;

II.12.8 Define system integration strategy and initiate system integration testing;
begin Task II.12.8
    12.8.1 Define a strategy for integrating the software with product hardware and other system components;
    12.8.2 Design black-box tests that exercise the software in the product context; focus should be on interface testing and timing/performance issues;
    12.8.3 Select regression tests from the black-box and white-box test suites (Tasks II.12.2 and II.12.3) for use in system testing;
    12.8.4 Deliver software to the system testing group;
    12.8.5 Conduct system tests;
    12.8.6 Collect system test results and evaluate quality issues raised; initiate SCM procedures if changes to the software must be made;
end Task II.12.8

endTask definition: Task II.12



Task definition: Task II.13 Evaluate the deliverable

II.13.1 Create a Software Release Package;
begin Task II.13.1
    13.1.1 Ensure that all documents, data and programs are placed under formal configuration control (see Chapter 5, Umbrella Activity U.4);
    13.1.2 Construct a set of deliverables for the customer;
    13.1.3 Define a set of internal documentation for technical reference;
    13.1.4 Develop a System Support Plan (large projects only);
    13.1.5 Apply final SQA procedures to ensure completeness;
    13.1.6 Write release memo;
end Task II.13.1

II.13.2 Design and implement customer training, as req'd;

II.13.3 Establish customer integration and live-test strategy;
begin Task II.13.3
    13.3.1 Define an alpha-testing environment within the development facility;
    13.3.2 Observe alpha tests and collect and evaluate quality issues raised during alpha testing; initiate SCM procedures if changes to the software must be made;
    13.3.3 Define a beta-testing strategy for the software product;
    13.3.4 Ship the software to selected customers along with beta-test evaluation guidelines;
    13.3.5 Evaluate beta-test reports from the customer(s) and collect and evaluate quality issues raised; initiate SCM procedures if changes to the software must be made;
    13.3.6 Produce a customer test status report as alpha and beta testing are conducted;
end Task II.13.3

II.13.4 Transmit deliverable(s) to customer(s);
II.13.5 Define customer feedback mechanism(s);
II.13.6 Establish "help desk" capability for customer(s);
II.13.7 Monitor early customer application of product/system;
II.13.8 Collect and analyze customer feedback;

endTask definition: Task II.13



Task definition: Task II.14 Define recommended customer modifications

II.14.1 Conduct meeting(s) with customer(s) to review/define modifications to requirements implied by feedback collected during Task II.13;
II.14.2 Define impact on data domain of the software;
II.14.3 Define impact on the functions/behaviors that the software is to perform;
II.14.4 Review in the context of technical constraints;
II.14.5 Make a quick estimate of new costs, time requirements and resources;
II.14.6 Negotiate modifications vs. cost, schedule, resources;
II.14.7 Write a mini-spec defining those modifications that are to be undertaken by the developer;

endTask definition: Task II.14

