The Airlie Software Council and the Software Program Managers Network (SPMN) have been sponsored by the United States Department of Defense in an effort to establish a practical set of software best practices and accompanying guidelines and tools to support them.
The checklists that follow are reproduced from the Little Yellow Book of Software Management Questions (July 1995), the result of work conducted by the Airlie Council and the SPMN. They address nine "Principal Best Practices" that should exist on all software projects. (SEPA, 5/e, Part Two addresses project management in detail.)
Full information about the activities of the Airlie Council and the SPMN, along with the complete content of many useful resources for software project managers and others, can be found at http://spmn.com.
Formal Risk Management
1. What are the top ten risks as determined by government, technical, and program management staff? When did you last check your top ten risks?
2. How do you identify a risk?
3. How do you resolve a risk?
4. How much money and time have you set aside for risk resolution?
5. What risks would you classify as show-stoppers and how did you derive them?
6. How many risks are in the risk database? How recently did you update the database?
7. How many risks have been added in the last six months? Describe the most recent risk added to the database. When was it added? What was your mitigation plan for it?
8. Can you name a risk that you had six months ago and describe what you did to mitigate it?
9. What risks do you expect to mitigate or resolve in the next six months?
10. Are risks assessed and prioritized in terms of their likelihood of occurrence and their potential impact on the project? Give an example (a computational sketch follows this list).
11. Are as many viewpoints as possible (in addition to the project team's) involved in the risk assessment process? Give an example.
12. If you will not remain for the project's completion, what risks have been identified that will remain after you leave? Are any of them imminent? Will you leave a transition plan?
13. Pick a risk and explain the risk mitigation plan for it.
14. What is your top supportability risk?
15. What percentage of risks impact the final delivery of the system? How did you arrive at that figure?
16. To date, how many risks have you closed out?
17. Who in this meeting/briefing is the risk officer? Is the role of risk officer their primary responsibility? If not, what percentage of the officer's time is devoted to being the project's risk officer?
18. What percentage of your risks have been identified by non-managerial workers? How many risks were identified by the contractor?
19. How are identified risk items given project visibility?
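Question 10 above asks for an example of likelihood/impact prioritization. One common technique (discussed in SEPA, 5/e) is to compute a risk exposure for each entry in the risk database: exposure = probability of occurrence x cost incurred if the risk occurs. The Python sketch below is a minimal illustration; the risk records and the show-stopper threshold are invented assumptions, not values from the SPMN material.

    # Minimal sketch: prioritizing a risk database by risk exposure.
    # Exposure = probability of occurrence * cost incurred if the risk occurs.
    # All records and the $50,000 show-stopper threshold are illustrative
    # assumptions, not values taken from the SPMN checklist.

    risks = [
        {"id": "R1", "desc": "Key external interface still TBD", "probability": 0.50, "cost": 120_000},
        {"id": "R2", "desc": "Turnover on the driver team",      "probability": 0.25, "cost": 60_000},
        {"id": "R3", "desc": "COTS tool upgrade slips",          "probability": 0.10, "cost": 30_000},
    ]

    for r in risks:
        r["exposure"] = r["probability"] * r["cost"]  # expected cost of the risk

    # Highest exposure first: these are candidates for the "top ten" list.
    for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
        flag = "  <-- show-stopper?" if r["exposure"] > 50_000 else ""
        print(f"{r['id']}: exposure ${r['exposure']:,.0f}{flag} ({r['desc']})")

Ranking by exposure also gives a defensible answer to question 5: risks whose exposure exceeds an agreed threshold can be classified as show-stoppers.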
Agreement on Interfaces
1. Who approves the user interfaces?
2. Which user organization approves the product?
3. When was the last time a joint user/project meeting was held?
4. Describe the amount of user involvement on this project over the last year.
5. How does the team encourage frequent communication between themselves and stakeholders?
6. Identify all key interfaces and external systems that interface with your system.
7. Who at the operator level has reviewed these inputs/outputs and interfaces?
8. Can you take a complete census of inputs/outputs across system boundaries?
9. Can you take a complete census of the interfaces in the hardware/software configuration?
10. Which interfaces are yet "to be determined" and when will they be resolved?
11. Give the name and organization of real users that have participated in defining these interfaces and inputs/outputs.
12. What is the target date for the resolution of the last operational interface issue?
13. How do you plan to test external interfaces?
14. Has the user agreed to all risks that have been identified as not resolvable by the project?
15. Is there a representative of the user attending this meeting?
16. What method did the user representatives use to describe the user interface and external interfaces?
17. Do you have a program level interface database? Is it on-line? Is it controlled by the prime contractor? Is it under configuration management control?
18. Are there any external interfaces to your system that are changing while you are building the interfaces?
Peer Reviews, Inspections, Walkthroughs, etc.
1. Are you doing peer reviews? If so, is there a written procedure for conduct of reviews?
2. Show a recent review report for this project.
3. What kinds of products are reviewed?
4. Do you have a review schedule for this project?
5. Who has reviewed the project schedule?
6. Who has reviewed the software development plan?
7. Do people from the government program office participate in the reviews? Are their comments included in review reports?
8. Is there a place where I can access information about the reviews?
9. What are the major categorizations of problems identified in reviews?
10. Have any products failed a review (if no, add a quality risk to the risk list)?
11. How many significant defects or problems were found?
12. How many products have been required to be re-reviewed because of insufficient quality?
13. How do you track defects back to the review process? How do you relate defect tracking and the review process?
14. What is the feedback loop between review results and risk management?
15. How many previously conducted project reviews have had an impact on the schedule?
16. How much slack have you put in the schedule to deal with problems from the reviews?
17. Show how your review process reflects your defect target.
18. What are the measurable criteria for reviews?
19. How do you track and close action items from reviews?
Metric-Based Scheduling and Management
1. What is the current CPI (cost performance index) for your project?
2. What is the TCPI (to-complete performance index) productivity needed to complete your project on budget? (A worked sketch of both indices follows this list.)
3. What are the key program issues and the metrics that you have selected to provide visibility into the project (e.g., goals, risks)?
4. Do you have a customer/contractor agreed upon program metric plan?
5. How do your metrics tie in to your management process (e.g., tracking, controlling)?
6. How do you use metrics for prediction and process improvement?
7. How are these metrics reported to you? How do you analyze them? What do you do with the data? How frequently is the data reported?
8. How is your project metric program tied to your risk management program?
9. Have you collected the data you need for the control panel?
10. Can you show your current control panel?
11. What is the current estimate? How were the estimates calculated?
12. What is your growth rate for software size? What do you expect software size to be two years after deployment?
13. What was the method for doing the cost/schedule estimate for the program office? Who did it?
14. What is the basis for the earned value you are reporting? Who in your office is responsible for supporting it?
15. What is the difference between predicted earned value and actual earned value for the software?
16. What percentage of total employee hours is overtime? What percentage of staff hours billed on a task is for direct efforts vs. indirect support, administration, etc.?
17. How are new tasks added to the work breakdown structure?
18. What percentage of time is charged against previously closed tasks (i.e., rework)?
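Questions 1, 2, 14, and 15 rest on standard earned value arithmetic: CPI = BCWP / ACWP and TCPI = (BAC - BCWP) / (BAC - ACWP), where BCWP is the budgeted cost of work performed (earned value), ACWP the actual cost of work performed, and BAC the budget at completion. The sketch below works one example; the dollar figures are invented for illustration only.

    # Earned value sketch: CPI and TCPI from invented figures.
    bac = 1_000_000   # budget at completion
    bcwp = 400_000    # budgeted cost of work performed (earned value to date)
    acwp = 500_000    # actual cost of work performed to date

    cpi = bcwp / acwp                    # value earned per dollar spent
    tcpi = (bac - bcwp) / (bac - acwp)   # efficiency required on the remaining work

    print(f"CPI  = {cpi:.2f}")   # 0.80: earning $0.80 of planned work per dollar spent
    print(f"TCPI = {tcpi:.2f}")  # 1.20: remaining work must run 20% more efficiently

    # Question 15's "difference between predicted and actual earned value" compares
    # bcwp against the value planned to be earned by this date (BCWS).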
Binary Quality Gates at the Inch-Pebble Level
1. Do you use binary quality gates on your project? (Quality gates are checkpoints that ensure the quality and integrity of products before they are used in the next step of development.)
2. How many quality gates are in your development schedule?
3. Do quality gates serve as checkpoints for product quality at discrete points in the project schedule prior to product release for general use?
4. How are you assured that your criteria and requirements are flowing down to the quality gate level?
5. Describe your requirements traceability mechanism. How are you tying it to your quality gates and their criteria?
6. List five representative quality gates for the project.
7. How many defects were discovered for products that have previously passed quality gates?
8. What quality gates have you established that ensure proof of concept (i.e., early deliveries, demos, time on machine)?
9. What is your definition of an inch-pebble? Do you use inch-pebbles on your project? What are the lowest level tasks on your activity network? (A sketch of a binary gate check over inch-pebble tasks follows this list.)
10. Give an example of the lowest level tasks on your activity network.
11. How many tasks do you have at the lowest level of the work breakdown structure?
12. What is the longest duration of the lowest level task?
13. What is the highest-cost item among your lowest level tasks?
14. What is the average cost and duration of the lowest level tasks that have yet to be determined or decomposed?
15. What provisions exist in the plan for tasks that have yet to be determined or decomposed?
16. What types of reviews are conducted to assess the quality of all engineering products before they are released for project use?
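"Binary" in this practice means a gate is either passed completely or not passed at all; partial credit hides schedule risk. The Python sketch below shows that all-or-nothing evaluation over small inch-pebble tasks. The gate criteria and task records are illustrative assumptions, not criteria taken from the SPMN material.

    # Minimal sketch of a binary quality gate: a task passes only when every
    # criterion holds; there is no partial credit. Criteria and task data
    # are illustrative assumptions.

    def gate_passes(task: dict) -> bool:
        return all((
            task["peer_review_done"],       # reviewed per the written procedure
            task["open_defects"] == 0,      # no known defects outstanding
            task["requirements_traced"],    # criteria flowed down to this gate
        ))

    # Inch-pebble tasks are small (days, not months), so pass/fail is unambiguous.
    tasks = [
        {"name": "parser unit tests",  "peer_review_done": True, "open_defects": 0, "requirements_traced": True},
        {"name": "driver integration", "peer_review_done": True, "open_defects": 2, "requirements_traced": True},
    ]

    for t in tasks:
        print(f"{t['name']}: {'PASS' if gate_passes(t) else 'FAIL'}")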
Program-Wide Visibility of Project Plan and Progress vs. Plan
1. What progress indicators are visible to all members of the project?
2. What is the feedback mechanism for indications of unacceptable health in the project?
3. Do you have an anonymous reporting channel? How does it work? Who has access to it? How do you use it?
4. What is on the critical path today?
Defect Tracking Against Quality Targets
1. What kind of defects do you track (faults, failures, or both)?
2. What is your delivered quality target? What evidence do you have that you are currently managing toward that target?
3. What kind of analysis do you perform on defect data? How are results used (e.g., tracking improvement, prediction)?
4. How many defects are currently open? What are their priorities?
5. How are metrics on defect data gathered and analyzed to provide feedback to management?
6. How do you project expected defects?
7. Are you collecting metrics on the length of time between defect opening and closing?
8. What percentage of defects are found in an activity subsequent to the one in which they were inserted? (A sketch follows this list.)
9. On average, how many defects passed at least one quality gate prior to the gate in which they were detected?
10. Are project defects entered in a configuration management system? At what activity do you start collecting defect data, and when do you plan to stop?
11. How do you trace defects in quality gates back to their insertion point?
12. What do you do with items that are of an unacceptable quality?
13. Do you produce a configuration status account? If so, how often?
14. What is the average cost of fixing defects?
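Questions 8 and 9 concern phase containment: defects that escape the activity in which they were introduced. A related measure discussed in SEPA, 5/e is defect removal efficiency, DRE = E / (E + D), where E is the count of problems found before delivery and D the count found after. The sketch below computes the escape percentage and DRE from an invented defect log; the record fields are assumptions.

    # Minimal sketch: phase containment and defect removal efficiency (DRE).
    # DRE = E / (E + D): E = problems found before delivery, D = after.
    # The defect records are illustrative assumptions.

    defects = [
        {"id": 1, "inserted_in": "design", "found_in": "design", "after_delivery": False},
        {"id": 2, "inserted_in": "design", "found_in": "code",   "after_delivery": False},
        {"id": 3, "inserted_in": "code",   "found_in": "test",   "after_delivery": False},
        {"id": 4, "inserted_in": "code",   "found_in": "field",  "after_delivery": True},
    ]

    escaped = sum(d["found_in"] != d["inserted_in"] for d in defects)
    print(f"Escaped insertion activity: {100 * escaped / len(defects):.0f}%")  # question 8

    e_count = sum(not d["after_delivery"] for d in defects)  # found before delivery
    d_count = sum(d["after_delivery"] for d in defects)      # found after delivery
    print(f"DRE = {e_count / (e_count + d_count):.2f}")      # 0.75 for this log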
Configuration Management
1. Is there a documented configuration management process for this project?
2. What classes of information does your project control (e.g., CASE files, budgets)?
3. What items are under control? How is the decision to control them made?
4. List items that aren't under configuration control.
5. Describe the process for change control on the project.
6. Who on the project is responsible for change control of baselined and non-baselined items?
7. Do you have a configuration control board? If so, who are its members?
8. Do you have a process for controlling non-product software that is shared?
9. How does the developer make releases to the acquirer?
10. How does the acquirer take delivery of items from the developer?
People-Aware Management Accountability
1. How do you assess project morale? What is your plan for correcting morale problems when they occur?
2. Who are your experts? What are they expert in? What percentage of their time is spent on this project?
3. What are the weakest parts of your team and the contractor team? What is being done about them?
4. What feedback about individual performance is provided to the contractor's program manager?
5. Are you using the concept of an integrated product team (IPT)? If so, how are they organized?
6. If the project is not using integrated teams or if users are not on the integrated product team, what are you doing to ensure user involvement?
7. What incentives do you have in place to encourage productivity?
8. How is the project prepared to deal with productivity shortfalls?
9. How do dispersed work groups (including subcontractors) communicate?
10. Do dispersed work groups (including subcontractors) use the same development environments?
11. What was your total staff turnover for the year? How many departures were voluntary, and how many were involuntary?
12. Describe the training that you have given to technical staff on this project. Is such training made available to all members of the IPT?
13. Are all people on the project using common tools?
14. Is there a consistent set of project standards that all team members are following? What are they?