
Established September, 1992

Newsletter of the Boston SPIN

Issue 9, March/April 1996




APRIL 30 (Tuesday) -- Deadline for Boston SPIN Steering Committee nominations.


MAY 8 (Wednesday) -- BCS Software Quality Group

Angie Kacher (Atria) on Atria's Configuration Management Tools.
A technical overview and demonstration of Clearcase, Attache,
ClearTrack and MultiSite. Cosponsored by the Software Division
of the American Society for Quality Control.
7PM, BCS Waltham office.
Info: John Pustaver, 508-443-4254,

MAY 19-20 -- The National Software Council Open Meeting

Held in conjunction with Quality Week '96
Sheraton Palace Hotel, San Francisco, California
Info: 707-643-4423, ext 2;

MAY 20-23, 1996 -- 8th SEPG Conference, Atlantic City, N.J. "Broadening the Perspective for the Next Century"

MAY 28 (FOURTH Tuesday) -- Boston SPIN Monthly Meeting

6:30 PM (refreshments), 7:00-8:30 PM (meeting)
GTE, Building #5, 77 A Street, Needham, MA
(Admission free, Wheelchair accessible)



Nominations are being collected for next year's Steering Committee. For a nomination form, contact Jack H. Arabian (Nominating Committee Chair) by email. The deadline for nominations is April 30.

We have 2 separate email lists, one for just membership (this newsletter) and one for LOTS of announcements that we receive from process organizations and forward out. To add yourself to the announcements list send email to

Issue #6 (Spring 1996) of the IEEE TCSE Software Process Newsletter is now available electronically. This issue focuses on the results of Phase 1 of the SPICE Trials. To get copies you can:

  • Go to the SPN home page at URL: and view (and print) the file using ghostview
  • Use anonymous ftp at the following site: ""; directory "pub/spn"; and file "" or ""

For more info contact Khaled El-Emam,



January 1996 Meeting Report
by Ed Maher, courtesy of Digital Equipment Corporation


Cleanroom Engineering


Philip Hausler, Manager of IBM's Cleanroom Software Technology Center


Philip has managed development projects using Cleanroom techniques, and currently manages a department which provides education and consultation for the technology transfer of Cleanroom processes. He did a good job of selling his services. He was knowledgeable, could support his points with anecdotes and metrics, and was very good at answering questions. His emphasis was more on describing the problems with traditional software processes and the benefits of Cleanroom (the "why" of Cleanroom) than on the "what and how" of Cleanroom.

Cleanroom is a technique for removing all of the defects before executing the code. It includes a very disciplined spec/design/code process and approaches "testing" as "certification". As a way of contrasting Cleanroom with typical software engineering, Philip characterized typical software engineering as: first inject defects during design and code, then attempt to achieve quality by test and debug.

The benefits of Cleanroom include: decreased life-cycle costs, improved quality, higher productivity, easier maintenance, better software documentation, strong team cross-training, and the existence of a truly repeatable process.

He passed out a 1994 IBM Systems Journal paper on this topic ("Adopting Cleanroom software engineering with a phased approach" by P.A. Hausler, R.C. Linger, and C.J. Trammell) and referenced the following articles:

  1. W.W. Gibbs, "Software's Chronic Crisis"; Scientific American, Sept. 1994
  2. Alice LaPlante, Computerworld, survey of 150 corporate IS managers, March 1995.
  3. Mary Shaw, "Prospects for an Engineering Discipline of Software", IEEE Software, Nov. 1990

(Please note that Cleanroom is a series of complex techniques. He mentioned that they teach a three day class on it. What I am providing is a summary of a summary. This report is likely to have some misplaced emphasis and absurd simplifications.)



He started out by providing some industry data demonstrating the problems with the state of software engineering today (from the Gibbs and the LaPlante articles mentioned above):

  • For every three new large-scale software systems put into operation, one is canceled.
  • The average software development project exceeds its schedule by half.
  • 75% of all large systems are "operating failures".
  • Only 25% of all projects complete on time and within budget, and meet customers' expectations.
  • Less than 11% of IS project managers know their critical paths.

One of the obstacles when introducing Cleanroom is the mindset that it is easy to fix software defects, so it's OK to inject them (as contrasted with hardware development, where a lot of focus is placed on ensuring that most of the defects are removed during design). He stated that it is not really easy or cheap to test and debug defects out of software; it's just that the cost of software rework isn't as obvious as the cost of hardware rework.

To support this point, he put up a slide on the "cost of defect correction": a defect costs 5 times as much to fix in design as it does in requirements/spec, 50 times as much in test, and 100+ times as much post-release. In other words, the best way to produce quality software products is to originate the quality during development.
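The multipliers on that slide lend themselves to a quick back-of-the-envelope calculation. The sketch below uses the phase multipliers quoted in the talk; the $200 base cost for a requirements-phase fix is an invented figure for illustration only:

```python
# Phase multipliers from the "cost of defect correction" slide.
# The $200 base cost is a made-up number purely for illustration.
PHASE_MULTIPLIER = {
    "requirements/spec": 1,
    "design": 5,
    "test": 50,
    "post-release": 100,
}

def correction_cost(phase: str, base_cost: float = 200.0) -> float:
    """Cost of fixing one defect discovered in the given phase,
    relative to fixing it during requirements/spec."""
    return base_cost * PHASE_MULTIPLIER[phase]

for phase in PHASE_MULTIPLIER:
    print(f"{phase:>18}: ${correction_cost(phase):,.0f}")
```

Under these assumptions a defect that would have cost $200 to fix in the spec costs $20,000 after release, which is the economic argument for originating quality during development.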

At this point, he mentioned a frequent complaint of any proposed process improvement: namely, that it is an attempt to stifle creativity. His response was that Cleanroom moves the creativity into the spec and design activities where it belongs.

Underlying the Cleanroom technique are four principles:

  • Defect prevention -- Produce defect-free software before executing the code.
  • Reliability certification -- Certify that there are no defects (versus trying to test the quality in).
  • Intellectual control -- Control process and work products using statistical control, risk management, and other techniques (i.e., having concrete knowledge of where you are).
  • Teamwork -- Small teams of people with shared responsibilities peer-review all work products.

He noted that the synergy from combining these things is more powerful than the sum of their individual contributions.

The Cleanroom process includes the following phases:

  • Specification (Black Box spec and usage-based spec)
  • Incremental Development Planning
  • Development (increments of: design, preparation of test cases, updating of specs, implementation, and certification).

Some key attributes of the process include:

  • Each incremental build contains a set of user-visible features that are each 100% complete (each one is a superset of its predecessor).
  • All deliverables are peer reviewed, and the design and code go through "Correctness Verification" (more about this below).
  • The concept of teamwork.
  • The use of measurement. Cleanroom also includes a process improvement loop to allow for dynamic changes to the process as needed based on the certification activity and other measurements.

He very briefly explained that there are underlying mathematical principles that support these techniques. Fortunately he didn't go into them and said that you don't have to understand them to use them.

What follows is a brief description of each major phase:



Specification

He felt that it was very important to get the whole product specified up front and that the spec should contain both a black box specification and a detailed description of the external behaviors. The black box spec is the primary input to the design activity and the external behaviors are used to assist in preparing test cases (for the certification). The spec should be peer reviewed and should be validated against the product requirements. In response to a question, he did state that he has seen Cleanroom work with the spec done in increments, but that he doesn't recommend it.

Incremental Development

The explicit objective of Cleanroom development is to produce a complete fully documented design that meets the spec, and to deliver increments of zero-defect code to the certification team. The functions undergo "correctness verification" rather than unit testing. He described "correctness verification" as being more than an inspection; it isn't just a reading against the upstream documents. It is very detailed and precise, and it includes analysis, demonstrations, and proofs of correctness. All code has accompanying comments that describe the black box behavior (including every side-effect) of the code.

The code verification then involves checking the code, the comments, and the specification. This is important for meeting the zero defect goal, but it also makes for cheaper and easier maintenance.


Certification

One of the underlying principles of Cleanroom is that quality comes from rigorous development, not from testing and debugging. Testing should resolve two key questions:

  • What test cases should be executed?
  • When is reliability sufficient to stop testing?

They use the usage models (which were created as part of the specification activity) in conjunction with statistical quality control methods to determine the most appropriate testing scenarios. By tracking Mean Time To Failure (MTTF) they are able to predict the reliability of a product. If the MTTF doesn't start to grow dramatically over time, then they are likely to send it back to design. To do otherwise could be an attempt to debug-in the quality (a violation of one of the fundamental Cleanroom principles).
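The certification rule he described -- ship only if reliability is demonstrably improving -- can be sketched in a few lines. This is a toy illustration, not IBM's actual statistical model; the `window` and `factor` thresholds and the sample data are invented:

```python
# Toy sketch of MTTF-based certification: track the time between
# successive failures during statistical usage testing, and treat the
# product as certifiable only if MTTF is growing over time.

def mean_time_to_failure(interfailure_times):
    """Average observed time between failures (MTTF estimate)."""
    return sum(interfailure_times) / len(interfailure_times)

def mttf_is_growing(interfailure_times, window=3, factor=2.0):
    """Crude growth check: the most recent failures must be spaced at
    least `factor` times farther apart than the earliest ones."""
    early = mean_time_to_failure(interfailure_times[:window])
    recent = mean_time_to_failure(interfailure_times[-window:])
    return recent >= factor * early

# Invented sample: hours between successive failures during testing.
times = [2, 3, 4, 10, 25, 60]
print(mean_time_to_failure(times))
print(mttf_is_growing(times))  # growing -> keep certifying, not debugging
```

If the growth check fails (interfailure times stay flat), the Cleanroom answer is to send the increment back to design rather than continue testing, since continued test-and-fix would amount to debugging-in the quality.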

Ideally, there should be a physical split between the development team and the certification team.

To illustrate his point about how ineffective traditional software testing can be, he used an analogy where we are asked to "test" a book by reading 20% of the pages. We find 100 defects and they all are fixed and verified. Is the book now ready for release?

Related to this was a discussion around the fact that management frequently starts to worry if defects are NOT being found during test.

In response to a question, he stated that the Cleanroom projects at IBM apply all of the same techniques to bug fixes.

He closed with some data to demonstrate the benefits of Cleanroom. This is based on over 1.5 million lines of code (commercial & government; and a variety of languages, environments, and applications). They have measured defect density to be 2.6 defects per KLOC pre-release and 0.2 defects per KLOC post-release. (Contrast that with Howard Rubin's data -- presented at the April 1995 SPIN -- which shows an industry average post-release defect density of 3.79 defects per KLOC.) They also have measured productivity to be 1.5 to 5 times better than non-Cleanroom software development.
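The defect-density figures above are simple arithmetic; this sketch shows the calculation, using an invented 200 KLOC project size with defect counts chosen to reproduce the reported rates:

```python
# Defect density = defects / thousands of lines of code (KLOC).
def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    return defects / (lines_of_code / 1000)

# Hypothetical 200 KLOC project at the reported Cleanroom rates:
loc = 200_000
pre_release = defects_per_kloc(520, loc)    # 2.6 per KLOC
post_release = defects_per_kloc(40, loc)    # 0.2 per KLOC

# Rubin's quoted industry average, for comparison:
industry_post_release = 3.79
print(pre_release, post_release)
print(industry_post_release / post_release)  # roughly 19x fewer escapes
```

At these rates a Cleanroom project of that size would ship with about 40 latent defects versus roughly 760 at the industry-average density.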


Some additional questions and answers:

Q: If I was to attempt to introduce Cleanroom, on what kind of project should I first try it?

A: Pick a fairly small project -- 3 - 6 months in duration with 5 - 8 people. They don't recommend that all aspects of Cleanroom be introduced into a project at once. The Cleanroom paper that he passed out contained a table suggesting how to phase in Cleanroom techniques.

Q: What about rapidly changing requirements?

A: With incremental development, you really need to have a process that allows the spec to be completed before design. A Cleanroom project must have good requirements management. He also said that you can't use unstable requirements as an excuse to not spec completely or properly.

Q: Why not extend Cleanroom back to encompass the "requirements" phase?

A: It is unnecessary since part of the specification includes clarifying and documenting the requirements from the users' perspective.

The handout contains a slide showing the "Cleanroom Paradigm Shift". He didn't present this, but I found it interesting:

  • Individual operations --> Team operations
  • Waterfall development --> Incremental development
  • Informal specification --> Black box specification
  • Informal design --> Box structure refinement
  • Defect correction --> Defect prevention
  • Individual unit testing --> Team correctness verification (pre-execution)
  • Path-based inspection --> Function-based verification
  • Coverage testing --> Statistical usage testing
  • Indeterminate reliability --> Certified reliability
  • Difficult & costly maintenance --> Effective and efficient maintenance 

(I think that this demonstrates how much change is involved in a switch to Cleanroom; both in the processes followed and in the results achieved.)


March 1996 Meeting Report
by Ed Maher, courtesy of Digital Equipment Corporation


Managing Customers' Expectations


Naomi Karten, consultant;

Author -- "Managing Expectations: Working With People Who Want More, Better, Faster, Sooner, NOW!";

Publisher -- "Perceptions & Realities" -- a newsletter that focuses on how to deliver superior service and build win-win relationships.


(For this discussion, the term "customer" refers to your internal customers -- individuals or groups that receive deliverables or service from you -- as well as the paying end-users of products or services.)

Some of the key points:

  • When trying to determine and manage your customer's expectations, it is important to consider how the customer is treated and not just focus on the service or product itself.
  • There is value in establishing a formal "Service Level Agreement" as a communication tool for creating a common understanding between the two parties regarding services, responsibilities, and priorities.
  • Seven ways to manage customer expectations:
    1. Build a strong foundation.
    2. Identify communication preferences.
    3. Clarify your services.
    4. Create service standards.
    5. Implement service tracking and reporting.
    6. Conduct service reviews.
    7. Plan conflict resolution process.

Even though she focused on how suppliers can better manage their customers, I also learned some things that I can do to be a better customer. For example, help my suppliers have a better understanding of my expectations, and acquire a better understanding of their constraints.



Naomi started out by asking us to think about what is important to us when we are the customer (in a restaurant, auto shop, supermarket, airport, etc.) The responses included: respect, courtesy, attention, predictability, honesty, quality products, convenience, and trust.

She noted that these desires can be separated into two categories:

  • Those that are attributes of the product or service (for example, quality and predictability). She labels this category the technical element of the service.
  • Those that are attributes of how we are treated (for example, respect and honesty). She labels this category the human element of the service.

Naomi then described a study that was done on the causes of people changing product or service providers. It showed that 30% of the time, the change was because of dissatisfaction with the product and that 70% of the time it was because of dissatisfaction with how they were treated. The clear message is that we should focus more on the human element as we work on improving how we deal with our customers. This doesn't mean that you can have a lousy product as long as you treat people with respect. However, it does indicate that putting focus on how people are treated can allow you to get away with a small slip in product or service.

She emphasized the importance of listening to customers. To support this, she described a situation where she was working with a company that was planning to redesign their processes. They got 15 internal people in a room, broke off into three groups and started brainstorming about how best to deliver their service. Someone suggested that perhaps they should involve customers in this exercise. They quickly recruited three customers and put one in each of the three break-out groups. She observed that in all three cases, the single customer was now doing all the talking -- with the other people taking lots of notes. They had been prepared to go ahead and redesign a process without input from the most appropriate source.

You also need to be flexible and recognize that you can't satisfy all customers with the same process. This was illustrated by her presentation of the following quotes from two different customers of the same company:

  • "I want to be kept informed on a regular basis."
  • "I want to be advised of only the exceptions."

Service Level Agreements (SLAs)

A Service Level Agreement (SLA) is a formal negotiated agreement between two parties designed to manage expectations about:

  • The services and service level quality
  • The responsibilities of both parties
  • The steps both parties can take to succeed

A key benefit of an SLA is increased awareness by both parties of each other's sensitivities. It also addresses the fact that frequently one side (often the customer) doesn't even know that they have responsibilities.

Attributes of an SLA include:

  • It is a communication tool (this is often missing in customer relationships).
  • It can be a conflict prevention tool.
  • It can be a living document -- as contrasted with a contract. (Contracts are frequently created unilaterally at the outset and thrown on the shelf until one party feels the need to use one to make a point or gain leverage.)
  • It should provide an objective process for gauging the effectiveness of the service.

As great as an SLA is, it is not a magic bullet. Naomi corrected some common misconceptions about how an SLA could be used:

  • It shouldn't be used as a forcing function.
  • It shouldn't be looked at as a "complaint-stifling" mechanism. On the contrary, an SLA invites complaints.
  • An SLA won't work if it is unilaterally created by one side and imposed on the other. Part of what makes it work is that its creation and subsequent updates involve the two parties coming together.
  • It is not a quick fix.


The Seven Ways to Manage Expectations

The last part of her presentation was a description of seven things that can be done to help manage customer expectations. It is important to remember that successful management of expectations doesn't mean that you always meet expectations. It does increase the likelihood that there is a common understanding of expectations.

  1. Build a strong foundation: This involves taking steps to better understand your customer's perspectives so that you can then appreciate what's important to them. It also involves helping your customer to better understand your perspective so that they can appreciate your options and constraints. She suggests that both sides ask themselves the following questions regarding the other party:
    - What do you need from them to do the work?
    - What are you willing to do for them?

    Just having this dialogue can result in both parties knowing each other better and having an improved working relationship.

  2. Identify communication preferences: This involves analyzing the communication styles of both parties, expecting changes in the communication preferences over the life of the project, and aligning your communication style with their communication style.

    She described an example involving a VP who was having a difficult time getting on his CEO's calendar. The VP noticed that he often ran into the CEO on the elevator and had 30 seconds to make his pitch. Over time he became very good at making succinct 30 second pitches that the CEO could approve on the spot. Sounds great....then he got a new CEO! He continued with his style of making pitches in 30 seconds and almost got fired. The good news is that he noticed in time that his new CEO had different communication preferences than his old CEO (and different from the style that he himself had become used to).

    She provided another personal example involving a nasty working situation that was improved when she learned that the other party preferred to get status using pictures and graphs -- as opposed to pure text. Again, the recognition and resolution of a problem that was based in communication preferences improved a working relationship.

  3. Clarify your services: Ask yourself: "Do my clients understand my services?" It frequently isn't the case. If you perceive unreasonable expectations, perhaps they are due to them not understanding your side of the relationship.
  4. Create service standards: Help to manage expectations between the parties by ensuring the same understanding of the timeframe and conditions of the service delivery. This is most appropriate for those things that you know your customer doesn't like. An example of this involves a service provider that heard from a customer that they were uncomfortable with the Voicemail system. They didn't mind the tool, but they feared that no one was ever going to listen to their call ("Voicemail is a Black Hole"). The provider created a service standard that required all calls to be returned within a fixed period of time regardless of the status. This satisfied the customers and didn't require the provider to abandon Voicemail.

    She provided another personal example: She was on a plane that was sitting at the gate. Someone got on the speaker and said: We are experiencing mechanical problems, we don't know when they will be fixed, and we will give you an update every 15 minutes regardless of status. This was an excellent service standard: they kept everyone informed; they had a schedule for communication; and they told the customers their plan. She knew exactly what to expect. (This would stop being so excellent after a few hours.)

  5. Implement service tracking and reporting: Identify a small number of things that can be tracked and reported. Consider the customer perspective, identify red flag situations, and be alert to a pattern.

    This is an area where she spent more time cautioning us on the things not to do than she did describing what we should do. To illustrate how measuring can negatively influence behavior she cited pizza delivery being "guaranteed within 30 minutes or it's free." The intent was to make the customers confident that they would get their pizza promptly and if they didn't, they would be happy because it was free. The result was automobile accidents (some serious).

    Another example involved a phone-based service provider that stated that they would answer all calls by the second ring. Soon they started rushing off the phone to get the next call. This solved one problem and introduced a new one. This measure did not last long.

    She asked us if we thought the following was true or false: "Measuring anything is better than measuring nothing."

    Her answer was "false". However, there was discussion about how, in some instances, there is value in measuring as a way to internalize the characteristics of a measure (LOC for example). But, measuring as an end in itself is not recommended. Later on (in response to a question) she said that you really have to be careful when designing measures. You may achieve the desired value but not the desired behavior.

  6. Conduct service reviews: These reviews can be simple; just meet and find out how things are going. This revisiting of the relationship helps to preempt conflict. These reviews are more important if the relationship is new or critical. She recommended that the following be answered up front:
    - What is the objective of these reviews?
    - What review process should be followed?
    - How often should these reviews be conducted?
    - What are the grounds for an interim review?
  7. Plan conflict resolution processes: This includes an escalation procedure, the use of service reviews, and an expectations manager (Naomi referred to this person as a relationship ombudsman). Conflict cannot be avoided -- but acknowledging its existence and having mechanisms to deal with it can help. The environment should also allow it to be OK for someone to voice a grievance with the other party.

As a summary she reiterated the importance of:

  • Ongoing communication with the customer
  • Understanding the customer's expectations
  • Involving both parties in the agreements (otherwise they really wouldn't be agreements)
  • Explicit relationship building

Questions & Answers

Q: In a provider/customer relationship, the customer wants maximum capability and the provider wants to perform with maximum efficiency. How do you balance this?

A: It is true that the two parties frequently have different agendas. The key is for both parties to recognize this and to develop an understanding of what is driving the other party. You don't have to agree, but there is value in the heightened sensitivity. Doing this may result in trade-offs that might not otherwise have come up (for example, trading cost for speed). As for external customers, you can't always have this dialogue with all of them, but you can educate them on why things are the way they are.

Q: When you're trying to get a Service Level Agreement going with an internal customer and they just don't have the time to be involved (early or at all), can you do it unilaterally?

A: The reality is that you can't collaborate on everything. When that happens, do some of the stuff on your own and give them the opportunity to review it. This shows them that you care about service, and may even influence them to make the time to be more involved.



From the January minutes of the North New Jersey SPIN, as recorded by Peter Spool

-- Quotes or paraphrases from Susanne Kelly (Vice President in Citibank's Corporate Technology Office), as she spoke about Citibank's assessment program. Susanne will be speaking at the SEPG Conference in Atlantic City next month:

"In the past, there was a big risk in changing too fast. Today, the risk is in not changing fast enough....

"Good comedy improvisation can be wonderful, but bad comedy improvisation can be terrible. The same is true of software development....

"[Both processes and bicycles] must be fast, balanced, tuned, and customized to the owners. Both bicyclists and software developers must have clear vision, skills, be part of a professional team, and always be in training."



The Boston SPIN is a forum for the free and open exchange of software process improvement experiences and ideas. Meetings are usually held on third Tuesdays, September through June.

We thank our sponsors, GTE and Raytheon. We also thank U/Mass at Lowell for hosting our Web page.

For information about SPINs in general, including HOW TO START A SPIN, contact: DAWNA BAIRD of SEI, (412) 268-5539,

Boston SPIN welcomes volunteers and sponsors. For more information about our programs and events contact:

The Software Engineering Institute (SEI) Technical Assessments, Inc.
ESC/ENS (Bldg 1704)
5 Eglin St
Hanscom AFB MA 01731-2116
(617) 377-8324; Fax (617) 377-8325;

IN THE SPIN is published monthly or bimonthly September through June. Letters, notices (including job postings), and contributed articles are welcomed. Articles do not necessarily represent the opinions of anyone besides their authors.

IN THE SPIN is available only by email. TO SUBSCRIBE send email address to

SEND letters-to-editor, notices, job postings, calendar entries, quips, quotes, anecdotes, articles, offers to write, and general correspondence to Sallie Satterthwaite, 508-369-2365, If possible, please format input as text with explicit line breaks and the maximum line length seen here.

Send SPIN Doctor questions to Judy Brodman,

Our WEB HOME PAGE is at The following will also work:
