Introduction to Software Engineering

Lecture 9 - Overview of software project management

Jim Briggs, 7 December 1998

Read Sommerville (5th ed.) chaps 28-31.

Software Management

Different from other types of project management because the product is intangible, there are no standard software processes, and large projects are often one-off.

Management activities

  1. Proposal writing
  2. Project costing
  3. Project planning and scheduling
  4. Project monitoring and reviews
  5. Personnel selection and evaluation
  6. Report writing and presentations

Software management structures

Programmer productivity

Possible measures (per man-month?), each with potential down-sides: lines of source code, object code instructions, pages of documentation, function points.

The paradox is that design, documentation, testing, etc. are all subsumed by one measure.

Quality of programmer very much affects productivity; very often it is the dominant factor. So does the use of methodologies.

Project planning

Effective management depends on thorough planning.

Milestones

Intangible product => only measure of progress is documentation. A good milestone is a finished document.

Tie in with lifecycle model: at sub-stage level.

Too many milestones implies team spends more time writing reports than doing work. One every 2-3 weeks?

Option analysis

Analyse alternative paths to goals.

Score options against goals (weighted). Arbitrary, but useful.

Cannot override critical objectives or constraints.

Economic analysis (payoff matrix): weight each outcome by the probability of that scenario occurring.
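Both analyses can be sketched in a few lines; all goals, weights, scenarios and payoffs below are invented for illustration:

```python
# Hypothetical option analysis: weighted scoring of alternatives against goals,
# plus an expected-value calculation over scenarios (payoff matrix).

def weighted_score(scores, weights):
    """Combine per-goal scores (0-10) using goal weights that sum to 1."""
    return sum(scores[goal] * weights[goal] for goal in weights)

goal_weights = {"cost": 0.5, "time_to_market": 0.3, "maintainability": 0.2}

options = {
    "buy_package":    {"cost": 8, "time_to_market": 9, "maintainability": 4},
    "build_in_house": {"cost": 4, "time_to_market": 3, "maintainability": 9},
}

for name, scores in options.items():
    print(name, weighted_score(scores, goal_weights))

def expected_payoff(payoffs, probabilities):
    """Expected payoff: weight each scenario's payoff by its probability."""
    return sum(payoffs[s] * probabilities[s] for s in probabilities)

probs = {"demand_high": 0.3, "demand_low": 0.7}
print(expected_payoff({"demand_high": 100, "demand_low": -20}, probs))
```

Note that the weighted score is arbitrary in exactly the sense above: the weights are a judgement, not a measurement, but they make the judgement explicit.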

Scheduling

Problems to anticipate: staff leaving or being ill; hardware breakdown; late delivery of essential components; intractable problems.

Estimate, then double it!

Task interdependencies. Critical path analysis.

PERT charts (best, worst, expected cases). Gantt charts (bar chart).
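The critical-path and PERT calculations can be sketched as follows; the task names, durations and three-point figures are invented:

```python
# A minimal critical-path sketch over a task dependency graph, plus the PERT
# three-point estimate. Tasks and durations are illustrative only.

tasks = {            # task: (duration in weeks, prerequisites)
    "spec":     (2, []),
    "design":   (3, ["spec"]),
    "code":     (4, ["design"]),
    "testdata": (2, ["spec"]),
    "test":     (3, ["code", "testdata"]),
}

earliest_finish = {}

def finish(task):
    """Earliest finish time: own duration plus the latest prerequisite finish."""
    if task not in earliest_finish:
        duration, prereqs = tasks[task]
        earliest_finish[task] = duration + max((finish(p) for p in prereqs), default=0)
    return earliest_finish[task]

project_length = max(finish(t) for t in tasks)
print(project_length)   # 12 weeks: spec -> design -> code -> test is the critical path

def pert_estimate(best, likely, worst):
    """PERT expected duration from best, most likely and worst cases."""
    return (best + 4 * likely + worst) / 6
```

The project length is set by the critical path; slack on non-critical tasks (here, testdata) is what a Gantt chart displays as float.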

Cost Estimation

Boehm in Software Engineering Economics identifies 7 techniques for cost estimation:

  1. Algorithmic - relate some metric (e.g. size) to previous projects to come up with a prediction.
  2. Expert judgement.
  3. Estimation by analogy (with similar projects).
  4. Parkinson's law (project will cost whatever there is available to spend on it!).
  5. Pricing to win (whatever the customer has to spend). Very common.
  6. Top-down estimation - consider overall properties of project then sub-divide among components.
  7. Bottom-up - estimate cost of each component and sum.

COCOMO model: organic, semi-detached and embedded mode projects. Basic model PM = A * (KDSI)^b, where A ranges from 2.4 to 3.6 and b from 1.05 to 1.20 according to mode.
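The basic model can be sketched directly from the formula, using Boehm's published mode constants:

```python
# Basic COCOMO sketch: PM = A * KDSI ** b, with the mode constants from
# Boehm's Software Engineering Economics.

COCOMO_MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo(kdsi, mode="organic"):
    """Effort in person-months for `kdsi` thousand delivered source instructions."""
    a, b = COCOMO_MODES[mode]
    return a * kdsi ** b

# e.g. a 32 KDSI organic-mode project comes out at roughly 91 person-months
print(round(basic_cocomo(32, "organic"), 1))
```

Because b > 1, effort grows faster than linearly with size: doubling KDSI more than doubles the predicted person-months, most sharply in embedded mode.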

The intermediate model applies multipliers to the basic model, based on attributes of the project. The complete model estimates cost as the sum of sub-system costs.

Assume you know (or are told) what a man-month costs (for different types of staff).

Configuration Management

CM planning

Concerned with the development of procedures and standards for managing an evolving software system. How to control change? How to manage systems that have been subject to change? How to release those changed systems to customers? Tied in with quality assurance.

Decide what items are to be managed (normally formal documents). Project plans, specifications, designs, programs, test data, etc.

Then need a database to record configuration information (what items exist, their versions and change histories).

Change control

Comes into effect once product is released (not applicable during development?).

Submit change request; validate it; assess and cost; decide whether to accept; apply.
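The change-request sequence above can be sketched as a small state machine; the state names and transitions here are illustrative, not a prescribed process:

```python
# Hypothetical change-control workflow: submit -> validate -> assess/cost ->
# decide (accept or reject) -> apply. Illegal jumps are refused.

VALID_TRANSITIONS = {
    "submitted": {"validated", "rejected"},
    "validated": {"assessed"},
    "assessed":  {"accepted", "rejected"},
    "accepted":  {"applied"},
}

class ChangeRequest:
    def __init__(self, description):
        self.description = description
        self.state = "submitted"

    def advance(self, new_state):
        """Move to the next state, refusing transitions that skip a step."""
        if new_state not in VALID_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state

cr = ChangeRequest("fix report layout")
for step in ("validated", "assessed", "accepted", "applied"):
    cr.advance(step)
print(cr.state)
```

The point of encoding the sequence is that no change can reach "applied" without having been validated, costed and formally accepted first.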

System building

  1. Have all the components that make up a system been included in the build?
  2. Has the appropriate version of each required component been included?
  3. Are all required data files available?
  4. Are data file names used correctly?
  5. Is the appropriate version of the compiler, etc. available?
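The component and data-file checks can be partly automated. A sketch, assuming a hypothetical manifest that records each required component and its expected version:

```python
# Build-sanity sketch: compare a manifest of required components/versions
# against what the build tree actually contains. All names are invented.

manifest = {                 # component -> required version
    "parser.o":  "1.4",
    "ui.o":      "2.0",
    "rates.dat": "1.1",
}

installed = {                # what the build tree actually contains
    "parser.o": "1.4",
    "ui.o":     "1.9",       # wrong version: should be reported
}

def check_build(manifest, installed):
    """Return a list of problems: missing components or version mismatches."""
    problems = []
    for component, wanted in manifest.items():
        have = installed.get(component)
        if have is None:
            problems.append(f"missing: {component}")
        elif have != wanted:
            problems.append(f"version mismatch: {component} {have} != {wanted}")
    return problems

print(check_build(manifest, installed))
```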

Version control

Versions and releases.

Numbering/naming scheme.

Make and SCCS

Documentation

User documentation: introductory manual, reference manual, installation guide.

System documentation: requirements, design documents, source code listings, test plans.

Document quality. Writing style.

Configuration control applies to these too.

Quality assurance

Related to verification and validation, though really V&V is a technical function and QA is a management function.

Often carried out by a separate QA team that reports outside the normal development management chain.

Bersoff: "Quality assurance consists of those procedures, techniques and tools applied by professionals to ensure that a product meets or exceeds pre-specified standards during a product's development lifecycle; and without specific prescribed standards, quality assurance entails ensuring that a product meets or exceeds a minimal industrial and/or commercially acceptable level of excellence."

Quality plan to set out attributes that product should exhibit, and (importantly) how those should be assessed.

Wrong to assume we know what "high quality" actually means. Cannot assume that a quality development process results in a quality product. But many people do, because process quality is easier to assess!

Quality of process: define process standards (such as how reviews should be conducted); monitor process to ensure standards complied with; report process to management and customer.

Software reliability

Primary task of QA is to ensure product reliability.

Remember - perceived reliability is paramount.

Assess operational reliability, not latent faults.

Reliability metrics: probability of failure on demand (POFOD), rate of occurrence of failure (ROCOF), mean time to failure (MTTF), availability.

All dependent on time, but could be elapsed time, processor time, or some discrete unit such as transactions.

Reliability growth models. If we assume that each repair reduces the number of faults (not true) then reliability is monotonically increasing. The improvement in reliability per repair decreases as faults are repaired.
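A sketch of such a growth model in the Jelinski-Moranda style, assuming an invented per-fault hazard constant and the (unrealistic) perfect-repair assumption:

```python
# Simple reliability growth sketch: N initial faults, each repair removes
# exactly one fault, and the failure rate is proportional to faults remaining.
# PHI is an invented per-fault hazard contribution, not a real measurement.

PHI = 0.05

def failure_rate(initial_faults, repairs):
    """Failure rate after a given number of (perfect) repairs."""
    remaining = max(initial_faults - repairs, 0)
    return PHI * remaining

rates = [failure_rate(20, r) for r in range(0, 21, 5)]
print(rates)   # monotonically decreasing towards zero
```

Under these assumptions the rate falls linearly; real data is messier precisely because repairs can introduce new faults and faults differ in how often they are triggered.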

Process standards

Encapsulate best practice.

Provide framework for QA process. QA can become checking that software adheres to standards.

Assist in continuity. Learning effort is reduced.

Standards for: documents, coding, methods and tools.

Standards often set remote from development process (e.g. by another department or higher management).

Better to: involve practitioners in developing standards; review and update standards regularly; provide tool support where possible.

Software metrics

The state of the art is that few metrics are useful as predictors of, or can be related to, the product attributes we wish to discover.

Control metrics (used by management): effort expended, elapsed time, disk usage, etc.

Predictor metrics (used to predict product quality): readability index, complexity. Based on the (flawed) assumptions that (i) we can measure something; (ii) that a relationship exists between what we can measure and what we would like to know; and (iii) the relationship is understood, valid and can be expressed by a formula or model.