Established September, 1992
Newsletter of the Boston SPIN
Issue 5, June/July 1995
SEP 19 (Tue)
Next Boston SPIN meeting, program tba
8th Software Engineering Process Group (SEPG) Conference, Atlantic City, N.J.,
"Broadening the Perspective for the Next Century"
At the June SPIN meeting, the following officers were elected:
Many thanks to the Nominating Committee headed up by Matt Cohn and also to GTE in Needham for continuing to host our meetings. "Facilities" consistently gets "Good" to "Excellent" in the Feedback forms.
Count Us Among The Weberati
The BOSTON SPIN WEB HOME PAGE is at http://www.cs.uml.edu/Boston-SPIN/
Hearty congratulations to Rachel Silber (email@example.com) and team.
SEPG95 Conference Materials In The Bag
The Boston SPIN still has some full sets of the Software Engineering Process Group SEPG95 Conference Materials (including Tutorials). We will give you a full set (including a handy tote bag) for a $10 donation to the Boston SPIN. You can obtain the SEPG 95 materials in one of three ways:
For further information such as postage costs ($6-$21 domestic), contact:
ESC/ENS (Bldg. 1704) (TAI)
LOGOS International is under contract with the Air Force at ESC, Hanscom AFB, MA to produce a handbook for Software Process Improvement, based on the CMM. The target group for the handbook is software organizations that are small or have small projects, but interest in the handbook is coming from a wider range of groups, including large businesses.
The handbook is intended to help organizations with insufficient resources or know-how to initiate a full-scale process improvement effort by providing these organizations with a means to jump start their process improvement programs. The handbook will provide templates for documentation, examples of best practices in organizations (what practices have worked well in satisfying KPA goals), pitfalls to avoid in process improvement initiatives, and anything else that might be helpful to immature organizations.
If you have any information that could be useful to the handbook (document templates, best practices, etc.), please contact
Burlington Job Opening
This person has an opportunity to make a tremendous difference in our software process.
Project lead position for Software Tools and Release Management group. Responsible for tools to support development (CM, development environments) for both Unix and Windows NT platforms. Responsible for release management and build scheduling to support engineering. Responsible for installation and software packaging issues. Work closely with engineering, QA, doc, and product production.
Programs "R" You:
Your input is critical to our putting on good programs. Our Program Committee welcomes your suggestions and your offers to present. Please contact Ken Oasis at (617) 734-1017, firstname.lastname@example.org.
STRIVING FOR MATURITY
                   URGENT           NOT URGENT
IMPORTANT          Priority = 1     Priority = 2
                   Tendency = 1     Tendency = 3
NOT IMPORTANT      Priority = 3     Priority = 4
                   Tendency = 2     Tendency = 4
(Adapted from S. Covey)
Urgent but not important tasks tend to get done before important but not urgent tasks; to be most effective, it should be the other way around. Although SEPG members regard software improvement as important, its tasks usually are not urgent, so they are often neglected. Dangerously, if this situation is allowed to continue, and there are no consequences for SEPG tasks left undone, those tasks start to be perceived as unimportant "busy work".
CHANGE OURSELVES FIRST
When the SEPG was formed, the challenge we faced was to change the way we responded to the corporate culture, and begin putting our future well-being ahead of interruptions and short term demands. And we needed to demonstrate our own commitment to doing up-front work before we could, for example, ask a software practitioner to delay coding until he or she had completed and reviewed the requirements and design specs. However, the corporate culture was just too strong, and the perceived risk too great, and the SEPG members continually relegated SEPG tasks to the back seat. We tried a variety of approaches to improve this situation:
The following table helps to illustrate our situation:
Vision + Skills + Incentives + Resources + Action Plan ==> Change
... + Skills + Incentives + Resources + Action Plan ==> Confusion
Vision + ... + Incentives + Resources + Action Plan ==> Anxiety
Vision + Skills + ... + Resources + Action Plan ==> Slow Change
Vision + Skills + Incentives + ... + Action Plan ==> Frustration
Vision + Skills + Incentives + Resources + ... ==> False Starts
(C. Myers, SEI)
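The table's diagnostic logic is simple enough to sketch in a few lines of code. The following is my own illustration, not part of the Myers table: given the elements an improvement effort has in place, look up the outcome the table predicts.

```python
# Sketch of the change-factor table above. Element and outcome names
# are taken directly from the table; the code itself is illustrative.

ELEMENTS = ["Vision", "Skills", "Incentives", "Resources", "Action Plan"]

# Outcome the table predicts when exactly one element is missing.
MISSING_OUTCOME = {
    "Vision": "Confusion",
    "Skills": "Anxiety",
    "Incentives": "Slow Change",
    "Resources": "Frustration",
    "Action Plan": "False Starts",
}

def predict(present):
    """Return the table's prediction for a set of in-place elements."""
    missing = [e for e in ELEMENTS if e not in present]
    if not missing:
        return "Change"
    if len(missing) == 1:
        return MISSING_OUTCOME[missing[0]]
    # The table only covers single missing elements; report the rest.
    return "Multiple elements missing: " + ", ".join(missing)

# Example: an SEPG with everything in place except Incentives.
print(predict({"Vision", "Skills", "Resources", "Action Plan"}))  # Slow Change
```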
Our most obvious missing element has been Resources (i.e., time), but the deeper missing element is Incentives. Given sufficient Incentives, the SEPG team members would allocate their time, thus providing the missing Resources. With these factors identified, we have been working to institute the changes which will help the SEPG to succeed.
We have created a software improvement tactical plan and preliminary schedule. And we have secured our sponsor's support for treating the software improvement plan as a project with equal priority, and equal commitment, as our product development projects. We will be held accountable, by management, for completing the tactical plan. Our sponsor has been strong in his messages that SEPG tasks are important and must be delayed no more, nor less, than other tasks. And he is backing us up, with management and with other departments, when SEPG work takes time away from other work. With the importance and urgency of SEPG work elevated to the same level as other work, we have the incentives to provide the resources to create the desired cultural change.
= An SEPG must recognize that its first task is to change itself. Then it can lead the greater organization to substantive change.
= Software improvement work must not be less important, nor less urgent, than the organization's nominal work. Otherwise the software improvement work will not get done.
= More than just management support is needed. Ongoing management reinforcement is key to convincing SEPG members, and software practitioners, that it's OK to change.
Guy Mott is SEPG team leader in the Test Engineering Department at a high technology Fortune 500 company in Irvine, California. Guy will be moving to the Boston area in fall of 1995 and would like to establish contacts in the Boston area to continue his work striving for software excellence. For questions or comments, contact Guy at 714-633-6716, email@example.com.
Process Improvement + Metrics = Performance Improvement
Dr. Howard A. Rubin (consultant, speaker, and writer in the areas of technology, modeling software dynamics, implementing software metrics, and management "flight simulation")
This was a very entertaining and informative presentation. It was delivered using multimedia -- music, interview clips, scenes from movies and TV, and the execution of process simulation software tools. The only downside was that (as happens frequently at the SPIN) he was delivering in one hour a presentation that was obviously constructed for more time. He covered a lot of ground:
- The competitive importance of having a quantitative understanding of your organization. This included presentation of data showing where the U.S. stands relative to the rest of the world in a number of areas (allocation of engineering budget, productivity, quality, and process initiatives).
- A warning that the organizations that are leading the pack (in terms of productivity and quality) are extending their lead at a rapid pace.
- Advice that metrics should be incorporated into any process improvement program. In addition, he recommends that there be quantitative business goals that can be used to help manage the improvement activity and not just used to measure the outcome.
- A demonstration of a process simulation tool that can be used to help plan and manage process/technology changes.
The conclusions in his talk are based on a large collection of industry data (18,000 observations, 11,500 organizations sampled).
Warning! This report contains a lot of data. This was much easier to digest as part of a well-delivered multimedia presentation with graphs than it is to read in a report.
A serious problem with how most organizations use metrics is that they use them as a measure of the end state and not as a navigation tool throughout the process. He repeatedly stressed the importance of continuous measurement to provide an organization with competitive advantage. His data shows the following:
- Only 1 in 5 organizations understands the size of their software products.
- Only 1 in 30 organizations understands how that size changes from year to year.
- Only 1 in 100 organizations has a quantitative understanding of their product quality.
This indicates that if you are part of the minority that is able to measure and manage this information, then you have a significant competitive advantage over the companies that are not. He provided some comparisons of the United States with the rest of the world:
- The USA is currently in line with the rest of the world in how engineering activities are divided: 51% new development, 18% systems migration, and 31% maintenance/support.
- The USA is slightly ahead of the rest of the world in effort spent up front (the USA spends 18% of effort on requirements analysis versus 16% for the rest of the world).
- For productivity, measured as product size per person-year, the USA is below average (6.9 KLOC per person-year for the USA versus 12 KLOC per person-year for all surveyed).
- The USA was lower than the rest of the world in average defect rates (2.3 defects per KLOC for the USA versus 3.79 defects per KLOC for all surveyed). However, the USA rate was 15 times higher than the lowest observed.
In the use of process standards/models:
- ISO 9000: The USA is last in the world in ISO 9000 certification (2% of USA software organizations versus 20% for the rest of the world). It is also last in the world in interest in ISO 9000 certification (only 6% of U.S. software organizations plan to become ISO 9000 certified).
- Software Process Assessments: The USA is on par with the rest of the world (33% of U.S. software organizations have had an assessment versus 35% for all surveyed). For process maturity, the USA has slightly more Level 1 organizations (77% USA versus 74% for everybody), significantly fewer Level 2 organizations (13% versus 18%), and slightly more at Level 3 (10% as opposed to 8%). [A downside from my USA-centric point of view was that his data was mostly sliced into the following categories: everyone, USA, Canada, and non-Canada. This allowed Canada to be compared to the rest of the world (excluding Canada), but the USA could not be compared to the rest of the world (excluding the USA). EM]
WHAT DOES ALL OF THIS MEAN?
All of his data (which has been collected since 1978) indicates that the leaders are learning and moving ahead of the competition at a rapid pace. If an organization stands still, it will lose significant ground. Right now the best in class are 200 times more productive, and are producing products of 100 times higher quality, than the average. (More recent data indicates that the productivity gap is approaching 600 to 1.) He described a traditional approach to process improvement (attributed to Bill Curtis, VP and Chief Scientist of TeraQuest Metrics, Inc.):
The suggested approach was to:
- In the first clip he stressed the importance of getting control of events before you try to change them, and then building your ability to manage and control your process.
- In the second clip he addressed what he thought was the most common misunderstanding of the CMM -- that the CMM is some kind of mold that you have to fit. He stressed that the CMM structure and framework are useful, but that the CMM should not be used as a checklist. The (frequently referenced) Hughes Aircraft experience was cited: Hughes went from Level 2 to Level 3 in two years at a cost of $450,000, and calculated the value of this change to be $2 million annually. A detailed description of their experience was published in the July 1991 issue of IEEE Software.
PROCESS IMPROVEMENT + METRICS (the SEI Time Warp)
Again, he stressed the importance of integrating process improvement with metrics that can then accelerate the change. It is also critical to focus the goals of any improvement activity on engineering targets. Improvement programs frequently define goals such as "inspect 80% of new code" or "use formal estimating techniques". These are not good goals; they are mechanisms for achieving a goal. He provided an example of a good goal for a process improvement initiative:
- Within 18 months from the start of the initiative, plan for the attainment of: a 33% reduction in development cycle time, a 25% increase in development productivity, and a 50% decrease in support costs related to rework and repair.
Note that all of these are measurable and based on engineering and business objectives. To reiterate: integrate metrics into your process improvement activities. Use them to navigate toward your goal as well as to see the benefit -- NOT JUST AS A MECHANISM FOR MEASURING THE END STATE. It is also important that process improvement activities include making appropriate changes in the reward schemes (to encourage the change and discourage resisting it).
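Using metrics to navigate toward a goal, rather than only to measure the end state, can be sketched in a few lines. This is my own illustration (not from the talk); the target percentages come from the example goal above, but the baseline and current measurements are invented for demonstration.

```python
# Track progress against quantitative improvement targets as you go,
# instead of only measuring the end state. Numbers below are illustrative.

# Target percent changes from the example goal (negative = reduction).
TARGETS = {"cycle_time": -33.0, "productivity": 25.0, "support_cost": -50.0}

def percent_change(baseline, current):
    return 100.0 * (current - baseline) / baseline

def progress(baseline, current):
    """Fraction of each target achieved so far (1.0 = target fully met)."""
    out = {}
    for metric, target in TARGETS.items():
        actual = percent_change(baseline[metric], current[metric])
        out[metric] = actual / target  # same sign convention as the target
    return out

# Hypothetical mid-initiative checkpoint:
baseline = {"cycle_time": 12.0, "productivity": 8.0, "support_cost": 100.0}
current  = {"cycle_time": 10.0, "productivity": 9.0, "support_cost": 80.0}
print(progress(baseline, current))
```

Checking progress like this at each checkpoint is what turns the metrics into a navigation tool.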
DIMENSIONS OF READINESS
People need to understand why change is needed, understand what the change is, and be prepared to deal with it. To this end, he identified eight dimensions of readiness for technology change:
Each one of these can be rated on a scale of 0 to 5 for the current state and the target state (based on readiness to implement the change). Progress (in terms of readiness) can then be periodically measured and tracked. The discipline of assessing and tracking readiness of each factor independently forces you to think about all of them. This then allows you to make more informed decisions. The alternative would be to make decisions or perceive readiness based on your gut or based on the subset of factors that happen to be more visible.
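The rating-and-tracking discipline described above is easy to sketch. The eight dimension names were not captured in this report, so the ones below are purely illustrative placeholders; the 0-to-5 scale and the current-versus-target ratings are from the text.

```python
# Sketch of readiness tracking: rate each dimension's current and target
# state on a 0-5 scale, then rank dimensions by readiness gap so that no
# factor is overlooked. Dimension names here are hypothetical.

def readiness_gaps(ratings):
    """ratings: {dimension: (current, target)}, each on a 0-5 scale.
    Returns (dimension, gap) pairs sorted largest gap first."""
    gaps = {d: target - current for d, (current, target) in ratings.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative ratings for a few hypothetical dimensions:
example = {
    "sponsorship":    (2, 5),
    "skills":         (3, 4),
    "infrastructure": (1, 4),
}
for dim, gap in readiness_gaps(example):
    print(f"{dim}: gap of {gap}")
```

Re-rating the current state periodically and re-running the ranking gives the progress-tracking the text describes.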
PROCESS FLIGHT SIMULATION
Airlines do not send pilots up in a new plane without first having them run a few simulations, but engineering managers are often expected to manage critical projects using processes, tools, or technology with which they have no experience. Running a simulation allows you to model your current process, test out the impact of potential new technologies, and be prepared for unexpected side effects of a chosen strategy. He then ran the simulation software live. The focus of his simulation was on moving up the CMM scale (note that the simulation is not limited to CMM-based process improvement -- that is just what he chose for the demonstration). The simulation showed that when moving up the maturity scale, productivity dips initially, but then rises dramatically. Another thing that was evident was that the productivity gap between the junior people and the senior people starts to narrow. I point these out as examples of the types of things that were simulated. This was done very fast, and there were a number of other factors changing on the screen; he did not have time to describe them all, and I did not have time to capture everything that he did describe.
Q: Since the range of organizational productivity is currently observed to be 600 to 1 (most productive versus average), how is this distributed?
A: It is very bottom heavy; the organization exhibiting the 600 productivity factor is an outlier. However, since 1991, the average has gone up by a factor of 4! This reinforces the point that if you do not improve, then you are quickly falling behind.
Q: How can you improve productivity if you are maintaining a legacy system that was poorly developed in the first place?
A: You need to see what you can do differently. For example: contract out the work, re-engineer the system, or move the application to another platform.
Q: Are any Japanese companies on the high end of productivity?
A: Not in his data.
Q: How does the formal use of software re-use affect productivity?
A: His data does not show that as a significant factor.
Q: Do the organizations that are exhibiting the highest productivity have anywhere to go? Can they improve any more?
A: These organizations probably cannot improve with what we now know -- but there are always new things becoming available. The organizations that are already pushing the envelope are the ones most prepared to find and capitalize on new technologies as they come along.
Copies of his presentation are available from Dr. Rubin; FAX: (914) 764-0536, e-mail: firstname.lastname@example.org
Dear SPINners: Summer Solstice Greetings! I apologize for not being here last month. I probably missed writing the column more than you missed reading it. The weeks right before the SEPG National Conference were very hectic for all of us on the planning committee! I hope that many of you had a chance to attend the conference. Being somewhat biased, I think we produced one of the best conferences thus far for gathering practical experience data... at least that's what you told us!
To open the second day of the conference, there was the infamous "Beantown Brawl" between Bill Curtis and James Bach, exquisitely executed by Herb Krasner. It was, as touted, a very interesting exchange of ideas; and, in some cases, Bill's and James' views were not so far apart (a warm-up debate for Pres. Bill and Newt?). The "round breakers" were an interesting touch, the final one being Watts Humphrey, who managed to add a few comments of his own as he crossed the stage carrying the final round number! We need to create more vehicles similar to the "brawl" for the exchange of differing views; this mechanism allows us to listen and decide for ourselves which side we belong to or which views fit our current situation. I personally think it was a tie... and I'm looking forward to the rematch! How about it, Bill and James?
For a first time at an SEPG Conference, there was a SPIN booth in the Exhibit area... and, for those of you who purchased the now famous Dragon T-shirt, "May the Process be with you"! The Boston SPIN booth ended up being a great place to meet other SPINners and I did manage to meet a lot of you there.
In the last column, because so many of you had questions about establishing and continuing meaningful metric programs, I said that I would spend the next few columns discussing metrics and measurement. This month, based on questions and comments that you had following the Return-on-Investment (ROI) presentation that Donna and I made at the conference, I have decided to discuss two areas together -- metrics and ROI.
You all appear to be very interested in ROI data. I assume that the interest stems from the need to present ROI information to your management to ensure that they initiate or continue your process improvement funding. In the early stages of your process improvement program, your management might be content with seeing what other organizations have been able to realize as returns on their investments in process improvement. But at some point, your management will want to see a real return based on their investment -- a real return being a representation of process improvement for your organization based on your organization's investment. And how do you represent the improvement within your own organization? The question of how to calculate ROI within an organization is probably one of the most difficult and critical questions that many of you face right now.
Let me try to quickly and simply present a number of steps that may aid you in answering the ROI question from an improvement point of view, not a dollar point of view, for your organization:
* Define your return(s) This is the hardest definition to agree upon within your organization. For your corporation, a return will be dependent on corporate and customer goals; for your software organization and your project, a return will be dependent on the organization's and project's goals for products, e.g., quality, productivity, etc.; for your business area or program managers, a return may be judged by customer satisfaction and by return or increased business.
* Define the representation mechanism for the return(s) This is the point at which you very carefully choose the metric(s) that will represent the return(s) to all concerned parties.
* Create a "footprint" In order to calculate ROI, you need to have a "footprint", as Dr. Howard Rubin refers to it. The footprint represents where you are now in the areas that you have chosen to track as the representation of your improvement.
* Collect the return data This data is collected by means of your chosen metric(s).
* Calculate your ROI A simple ROI is a calculation based on the comparison of your footprint or last measurement with a current or updated measurement.
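The final step, the simple footprint-versus-current comparison, can be sketched in a few lines. This is my own illustration of the idea, not a formula from the column; the metric names are invented for the example.

```python
# Simple ROI per the five steps above: percent change of each chosen
# metric between the baseline "footprint" and the current measurement.
# Metric names below are illustrative.

def simple_roi(footprint, current):
    """Percent change per metric between footprint and current values.
    The sign's meaning depends on the metric (e.g., fewer defects is good)."""
    return {m: 100.0 * (current[m] - footprint[m]) / footprint[m]
            for m in footprint}

footprint = {"defects_per_kloc": 4.0, "kloc_per_person_year": 8.0}
current   = {"defects_per_kloc": 3.0, "kloc_per_person_year": 10.0}
print(simple_roi(footprint, current))
# defects_per_kloc fell 25% (improvement); kloc_per_person_year rose 25%
```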
Even the simple way sounds too simple -- five steps and you have ROI data. But each step above contains areas where incorrect or misguided decisions can send you off in the wrong direction or, worse yet, can cause the demise of your metrics program.
Metrics used for ROI are part of your basic set of metrics. You should be careful not to collect too many or too few metrics.
- Collection of too much data can cause frustration on the part of your software engineers. First, it takes time from their "software duties"; second, perhaps more importantly, the data is never used.
- Too little data can lead to incorrect decisions or conclusions. In the metrics and ROI research that Donna and I conducted, we found that Level 1 organizations were collecting, on average, 11 metrics -- with "collecting" being the key word, not "using"! Level 5 organizations were found to be collecting AND using 31+ metrics. Level 1 organizations were not always collecting useful data, or the type of data that would be useful in a historical database as they matured. So you also need to be aware of what you will need for historical data as your organization matures.
Therefore, choose your metrics carefully -- make them useful to YOUR organization, and choose ones that will be useful to you at Level 5 as well as at Level 2. Don't be overly ambitious in your choices! And, plan for success!!
If you need to calculate ROI from a dollar point of view, your task will be considerably more difficult. You will have to go through similar steps for investments as I outlined above for returns. Investments can also come in many 'flavors' -- resources (including $$), training, tools, facilities, etc. You need to be able to convert these investments into a dollar value and then convert your return(s), which can be more than just the simple metric representation outlined above, into dollars also. You can then calculate a true ROI. It is not an easy task, nor one to be taken lightly if you want a correct representation of what your organization is accomplishing. And remember, a return may be negative at any time based on the types of investments being made, such as training and tool investments. But the return should take a positive turn in the future if all is working correctly in the organization.
ROI calculation is really too large a subject to cover in this column but I hope that the ideas I have suggested will help many of you initiate your ROI program as part of your metrics program. If you continue to have questions in this area, we can discuss it again. It is a favorite topic of mine.... Can you tell?
Please continue to send your comments and questions through the summer. I will be back in print in September but will be available all summer to answer any questions you might have that require an immediate response.
Here in the North, summer means a less "formal" way of life, fewer rigid "plans and procedures" -- a less "institutionalized" way of life -- more cookouts, long walks on the beach at Cape Cod, summer nights of gardening, midnight swims, picnics by a waterfall in Vermont.... I guess life reverts to "Level 1" for us for a few months.... enjoyably "chaotic" !!!
Enjoy your summer.
This column is for you; let's make a difference!! Send your comments and questions to "Dear SPIN Doctor" at: email@example.com or directly to the Editor. Sign them or use a "pen-name" -- I respect your confidentiality!!
--The SPIN Doctor
The Boston SPIN is a forum for the free and open exchange of software process improvement experiences and ideas.
Meetings are held on third Tuesdays, September - June.
We thank our sponsors: GTE and BULL INFORMATION SYSTEMS.
For information about SPINs in general, including ***HOW TO START A SPIN***, contact:
DAWNA BAIRD of SEI,
Boston SPIN welcomes volunteers and new sponsors.
For more information about our programs and events, contact:
Technical Assessments, Inc.,
ESC/ENS (Bldg 1704), 5 Eglin St,
Hanscom AFB MA 01731-2116; (617)
377-8324; FAX (617) 377-8325;
IN THE SPIN is published monthly September - June, with occasional exceptions (q.e.d.).
Letters, notices (including job postings), and contributed articles are welcomed. Articles do not necessarily represent the opinions of anyone besides their authors.
IN THE SPIN is available only by email.
TO SUBSCRIBE send email address to firstname.lastname@example.org (Ron Price).
SEND letters-to-editor, notices, job postings, calendar entries, quips, quotes, anecdotes, articles, offers, and general correspondence to
Jim Holloway,