Use Historical Data for More Accurate Software Project / Release / Sprint Estimation

Software cost estimation is difficult, and it is still not a genuine profession. Many organizations still use the ‘expert estimate’ in some form, which is basically an opinion without a solid basis in data. The practice of planning poker – assigning story points to backlog items – is in fact also a form of expert estimate. Story points are very useful at the team level, but not suitable for high-level project planning and forecasting. Unfortunately, human estimates are very likely to be optimistic, and many projects, agile and traditional, start with unrealistic estimates: a team that is too small, a duration that is too short and an effort/cost estimate that is too low. See for instance the research of Daniel Kahneman (https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow).

Steve McConnell’s great book ‘Software Estimation: Demystifying the Black Art’ shows us that optimistic expert estimates are likely to result in non-linear overruns of cost and schedule. Some reasons for this are the extra management attention, stress in the team (more defects, lower maintainability) and the fact that ‘adding people to a late project only makes it later’. Pessimistic estimates result in linear extra costs due to Parkinson’s law: people will find some way to use the extra hours when the work could have been completed earlier. Parametric estimates, based on functional size, relevant historical data and parametric models, result in a more realistic estimate and therefore in no or limited extra costs. The next figure shows these ideas.

[Figure: cost and schedule impact of optimistic, pessimistic and parametric estimates]
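To make the parametric idea concrete, here is a minimal sketch of a power-law effort and schedule model. The coefficients are purely illustrative assumptions; in a real estimate they would be calibrated against your own or ISBSG historical data.

```python
# Minimal parametric estimation sketch. The coefficients a, b, c, d are
# illustrative assumptions, not calibrated values; in practice they are
# derived from relevant historical data.

def estimate_effort_hours(size_fp: float, a: float = 8.0, b: float = 1.1) -> float:
    """Power-law effort model: effort = a * size^b (b > 1 models diseconomies of scale)."""
    return a * size_fp ** b

def estimate_duration_months(effort_hours: float, c: float = 0.4, d: float = 0.4) -> float:
    """Schedule modelled as a power law of effort, a common shape in parametric models."""
    return c * effort_hours ** d

size_fp = 500  # measured functional size in function points
effort = estimate_effort_hours(size_fp)
months = estimate_duration_months(effort)
print(f"{size_fp} FP -> ~{effort:,.0f} hours, ~{months:.1f} months")
```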

Even in the world of agile software development, overruns are very common, although they are disguised by the fact that the scope of the functionality delivered is variable while cost and schedule are rather fixed. However, overruns become evident when the desired functionality, or minimum viable product, is not ready in time and extra sprints are required to deliver the minimum functionality. Great news for suppliers, as the risk they used to run in fixed-price projects is gone, but not such good news for customer organizations.

As most companies in the IT industry have not yet reached the estimation process maturity needed for parametric estimation, overruns are still very common. The estimation maturity model constructed by Dan Galorath makes clear that organizations need to reach at least maturity level 2 to be able to use parametric estimation.

The International Software Benchmarking Standards Group (ISBSG) helps organizations to improve their estimation process maturity by providing industry data on completed projects, releases and sprints. The current version of the database contains over 8000 projects and can be obtained as an Excel spreadsheet. It helps organizations that don’t have their own historical data to get an idea of the average productivity for their type of project. Once the size of a project is known, the ISBSG data can be filtered down to the relevant, similar projects, and the most likely productivity can be chosen on that basis.

Check the site http://isbsg.org/project-data/ for a free sample of the data available. As the data is collected from industry, not all fields are filled in, but the main fields like effort, duration, size and defects are usually available.
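As an illustration of such filtering, the sketch below loads the ISBSG spreadsheet into pandas and derives a project delivery rate (PDR) from similar projects. The file name, column names and filter values are assumptions for the example; check the field list of the release you actually obtain.

```python
import pandas as pd

# Load the ISBSG Excel release (file and column names are assumptions
# for this example; check the field list of your actual release).
df = pd.read_excel("isbsg_release.xlsx")

# Keep only projects similar to the one being estimated: same development
# type and a comparable functional size.
my_size_fp = 400
similar = df[
    (df["Development Type"] == "New Development")
    & (df["Functional Size"].between(my_size_fp * 0.5, my_size_fp * 2))
]

# Project delivery rate (PDR) in hours per function point: lower is better.
pdr = similar["Normalised Work Effort"] / similar["Functional Size"]
print(f"{len(similar)} similar projects, median PDR: {pdr.median():.1f} h/FP")
print(f"Most likely effort: {pdr.median() * my_size_fp:,.0f} hours")
```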

 

Software Cost Engineering is becoming an acknowledged profession!

Software Cost Engineering is a profession! However, in many organizations even large projects are estimated by simply asking the engineers, architects, testers, project managers and other experts to come up with an estimate based on their experience. In the Software Cost Engineering community, this is often referred to as ‘estimation maturity level 0’. Although these expert estimates are sometimes quite accurate, they are also subjective, not repeatable, not backed up by data, not verifiable and therefore not defensible. The problem is that humans (experts) systematically underestimate software projects, and starting a software project with unrealistically optimistic expectations usually results in huge overruns in the end. When the customer asks what the estimate is based on, you really don’t want to answer: “our engineers just think so”.

The International Cost Estimating and Analysis Association (ICEAA), the International Function Point Users Group (IFPUG) and the Dutch Nesma (formerly known as the Netherlands Software Metrics Association) are forming a consortium to develop a Software Cost Engineering Body of Knowledge (SCEBoK) and a certification program for Certified Software Cost Engineers. Although this initiative has only just started, a lot of enthusiastic reactions have already been received from the industry, and many well-known software metrics gurus have offered their support. The SCEBoK and the associated certification have huge potential in an industry that is struggling to come up with accurate software project estimates, and will surely result in more successful projects and less waste.

ISBSG supports this initiative, and I am confident that the data ISBSG provides will help many software cost engineers to create accurate estimates based on actual capabilities.

October 2015 – International Workshop on Software Measurement & ISBSG IT Confidence Conference

October: international events and the thrill of new ideas

October is always a special month for me as there are usually a number of inspiring international events to attend.

This year I was very fortunate to be able to attend the International Workshop on Software Measurement (IWSM-Mensura) in Krakow (Poland) and the IT Confidence conference together with the yearly workshop of the International Software Benchmarking Standards Group (ISBSG) in Florence (Italy). In this blog I would like to give you a little taste of the excitement that these events gave me.

IWSM 2015

The IWSM Mensura conference is the result of joining the forces of the International Workshop on Software Measurement (IWSM) and the International Conference on Software Process and Product Measurement (Mensura). Together they form the conference where new ideas from the world of academic research meet practical improvements from industry on topics of software measurement. This is especially interesting because many people in the industry are struggling with similar challenges, and I often pick up new ideas from the presentations as well as from conversations with peers in the coffee and lunch breaks.

Take for instance the ongoing struggle that many of us face when dealing with software projects that have more (or fewer) non-functional requirements than usual. There is always a lot of discussion about what non-functional requirements really are. At the IWSM, a joint IFPUG/COSMIC glossary was presented that can serve as a guideline here.

There were of course many more interesting presentations; most of them (and the accompanying papers) can be downloaded from http://www.iwsm-mensura.org/

 

IT Confidence 2015 

The IT Confidence conference is all about IT data collection and benchmarking and is usually combined with the yearly ISBSG workshop. This year was the third edition, hosted by the Italian Function Point User Group / Italian Software Metrics Association (GUFPI-ISMA) in the beautiful city of Florence. One of the most interesting new ideas I encountered was the concept of ‘Triangle Benchmarking’, proposed by Pekka Forselius from Finland. Most of us are familiar with the iron triangle in project management, with scope, duration and effort (cost) as the three sides of the triangle. In fact, the triangle can take a number of shapes, depending on a number of variables. Some examples:

[Figure: four example project triangles – scope, effort and duration]

 

These four triangles show different projects, either planned or completed. They also show how important it is to measure the scope of the functionality, because otherwise it is simply not possible to understand the shape of the triangle. Forselius proposes the following units for drawing the triangles:

1 cm = 200 FP

1 cm = 100,000 EUR or 1,000 hours

1 cm = 3 months

This way the triangles can be drawn and compared to, for instance, ‘industry typical’ triangles, like the government example above. He adds another step: using colors to show which elements are good, normal or bad, which makes the picture easy for management to understand.

[Figure: traffic-light color coding of triangle elements]

I think this concept of Triangle Benchmarking is quite interesting, because seeing the shape of the triangle conveys additional information at a glance.
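As a sketch, the triangle sides can be computed directly from Forselius’ proposed scales; the traffic-light comparison against a reference triangle is my own illustrative interpretation, with assumed thresholds.

```python
# Convert a project's scope, cost and duration into triangle side lengths (cm)
# using the scales proposed by Forselius. The traffic-light thresholds below
# are illustrative assumptions, not part of his proposal.

def triangle_sides_cm(size_fp: float, cost_eur: float, months: float) -> dict:
    return {
        "scope": size_fp / 200,         # 1 cm = 200 FP
        "cost": cost_eur / 100_000,     # 1 cm = 100,000 EUR (or 1,000 hours)
        "duration": months / 3,         # 1 cm = 3 months
    }

def traffic_light(ratio: float) -> str:
    """Color a side by comparing it to an 'industry typical' reference side."""
    if ratio <= 1.0:
        return "green"    # at or better than the reference
    if ratio <= 1.25:
        return "yellow"   # somewhat worse (threshold is an assumption)
    return "red"

project = triangle_sides_cm(size_fp=800, cost_eur=650_000, months=12)
reference = triangle_sides_cm(size_fp=800, cost_eur=500_000, months=9)
for side, length in project.items():
    print(f"{side}: {length:.1f} cm -> {traffic_light(length / reference[side])}")
```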

ISBSG workshop (www.isbsg.org)

As ISBSG president, I always find the yearly ISBSG workshop an important and exciting event. The representatives of the ISBSG members come together to discuss and work on topics like data collection, data validation, the new website, new reports, new products, pricing and much more. The member representatives present in Florence this year were Stavros Pechlavinidis (DASMA, Germany), Pekka Forselius (FiSMA, Finland), Ton Dekkers (Nesma), Thomas Fehlmann (Swiss-ICT), Luigi Buglione (GUFPI-ISMA and IFPUG), Arlene Minkiewicz (Price Systems, USA), Raul Fernandez (Leda-MC, Spain), Dan Galorath (Galorath, USA) and myself (Nesma and ISBSG president). This year we welcomed a new ISBSG member at the workshop whom we know very well in Nesma: Nesma Gold partner René Notten, founder and CEO of MetricsQuest. ISBSG’s current CEO, John Ogilvie, was also present.

A few of the actions and decisions we made:

Data collection

ISBSG needs more data, especially for the Maintenance & Support repository.

A few important decisions were made regarding data collection. To make it easier and less time-consuming to submit data, four different types of data collection forms will be issued: very concise, concise, normal and extended. The very concise questionnaire contains only a few questions. ISBSG is going to try to embed this in the new website, which is planned to go live soon. In addition, ISBSG is going to become more proactive in acquiring data. We are going to work with tool vendors to embed functionality in their tooling to submit data automatically. An automatic extraction tool for QSM SLIM Data Manager is already available on request.

Reports

ISBSG is going to refresh a number of older special analysis reports with new data. In addition, we are planning to issue a number of one-pagers that address well-known goals and questions, like ‘How do I select the IT supplier with the best productivity?’
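As a sketch of how repository data could answer that question: compute each supplier’s project delivery rate and rank them. The column names and toy numbers below are assumptions for illustration only.

```python
import pandas as pd

# Toy data with assumed column names, for illustration only.
projects = pd.DataFrame({
    "Supplier": ["A", "A", "B", "B", "C"],
    "Functional Size": [300, 500, 400, 250, 600],
    "Normalised Work Effort": [2400, 4500, 2800, 2000, 6600],
})

# Project delivery rate (PDR) in hours per function point: lower is better.
projects["PDR"] = projects["Normalised Work Effort"] / projects["Functional Size"]

# Rank suppliers by median PDR; the lowest value indicates the best productivity.
print(projects.groupby("Supplier")["PDR"].median().sort_values())
```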

Corporate subscribers

ISBSG is going to improve the service to its corporate subscribers by issuing additional releases of the repository, exclusively to them, during the year whenever more data becomes available.

Portal

Because most users find the data portal hard to use, ISBSG has decided to discontinue this service. We are going to investigate the possibility of supplying an online version of the comparative estimation tool instead.

 

Annual General Meeting

At the Annual General Meeting, the office bearers were elected; all existing office bearers were unanimously re-elected. The ISBSG Executive for the coming year is therefore:

President: Harold van Heeringen (Nesma)

Vice-President: Thomas Fehlmann (Swiss-ICT)

Honorary Treasurer: Christine Green (IFPUG)

For more information about ISBSG, please contact me or visit the site www.isbsg.org

Of course I have left out many great presentations, topics and discussions from this blog, but I hope I have given you some insight into some of the things that are happening in our industry. Maybe this inspires you to attend an event in the future and/or to contribute some of your own thoughts. Knowledge sharing remains very important, as we still need to convince the vast majority of the industry of the value of software metrics in areas like benchmarking, project estimating, outsourcing, productivity measurement and project control. So please feel free to ask your questions and share your ideas on the Nesma forum.