2016 IT CONFERENCE

ISBSG and ICEAA Southern California co-organised the 2016 IT Confidence Conference

In 2013, ISBSG organized a one-day conference aligned with its traditional yearly workshop. The name of the conference, IT Confidence, was chosen to stress that the IT industry can gain more confidence in delivering successful projects when it uses mature measurement and estimating techniques and relevant historical data.

Since 2013 (Rio de Janeiro), two further successful events have been organized: 2014 (Tokyo) and 2015 (Florence). In 2016, the IT Confidence Conference was organized together with the Southern California Chapter of the International Cost Estimating and Analysis Association (ICEAA) in Los Angeles. ISBSG Gold member Galorath hosted the event. Visit the 2016 IT Confidence Conference website.

 Displayed below are the presentations from the 2016 IT Confidence Conference:

 

Data Driven Cost Estimating – Karen McRitchie (Galorath)

Download the presentation – Data Driven Cost Estimating and the Role of Industry and Private Data.

 

Cloud Total Ownership Costing: Considering the Technologies, Costs and Benefits – Dan Galorath (Galorath Inc) and Steve Woodward (Cloud Perspectives)

Download the presentation – Cloud Total Ownership Costing: Considering the Technologies, Costs and Benefits

 

Measuring and Estimating an Internet of Things Project – Thomas Fehlmann (Euro Project Office)

Download the presentation – Measuring and Estimating an IoT Project

 

Cloud Solutions – Infrastructure, Platform or Software – Arlene Minkiewicz (PRICE Systems)

Download the presentation – Cloud Solutions – Infrastructure, Platform or Software

 

Improve Estimation Maturity – Harold van Heeringen (Metri)

A large proportion of software projects are delivered over budget and behind schedule. This presentation investigates the importance of accurate project estimation.

Download the presentation – Improve Estimation Maturity Using Functional Size Measurement and Industry Data.

 

Software Data Collection – A Historical Perspective – Randall Jensen (Software Acquisition Consultant)

Download the presentation – Software Data Collection – A Historical Perspective

 

Rates vs Cost per Function Point – An updated cost analysis – Rafael de la Fuente (Leda mc), Dácil Castelo (Leda mc) & Raúl Fernández (Leda mc)

Download the presentation – Rates vs Cost per Function Point – An updated cost analysis

 

Agile Benchmarks: What Can You Conclude? – Reifer Consultants

Download the presentation – Agile Benchmarks: What Can You Conclude?

 

No Estimates/Yes Measurements – Why Shouldn’t Agile Teams Waste Their Time and Effort in Estimating – Pekka Forselius (4Sum Partners Ltd)

Download the presentation – No Estimates/Yes Measurements – Why Shouldn’t Agile Teams Waste Their Time and Effort in Estimating

International Software Benchmarking Standards Group Overview

ISBSG Mission

ISBSG is a not-for-profit organisation that has been helping IT professionals since 1997.  It aims to help you improve the planning and management of your IT projects.  This, in turn, will improve your productivity and give you better control of your project costs.

We achieve this aim through the sharing of knowledge.  Two Repositories of IT project data exist for you to benchmark your projects against.

Compare your estimates against similar project data.  Are your estimates of effort, team size, phase duration, number of defects, etc. similar to projects in the Repository?  If not, why not?
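As an illustration, a benchmarking check of this kind can be a few lines of analysis over a repository extract. The following Python sketch is a minimal example; the file name, column names and figures are invented for illustration and are not the actual ISBSG field names:

    import pandas as pd

    # Load a Development & Enhancement repository extract (hypothetical file and columns).
    projects = pd.read_csv("isbsg_d_and_e_extract.csv")

    # Keep projects comparable to ours: same application type, similar functional size.
    peers = projects[
        (projects["application_type"] == "Business Application")
        & (projects["functional_size_fp"].between(400, 600))
    ]

    # Compare our effort estimate with the peer group's median effort.
    our_effort_estimate_hours = 4200
    median_peer_effort = peers["effort_hours"].median()
    ratio = our_effort_estimate_hours / median_peer_effort
    print(f"Peer median effort: {median_peer_effort:.0f} h; our estimate is {ratio:.2f}x that")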

An extensive collection of reports and books also provides a wealth of information to help you.

How does it work?

IT professionals from around the world send us metrics about their IT projects.  Their anonymity is guaranteed, of course. In return for their data, we offer them free ISBSG reports, books, subscriptions and data.

This data undergoes stringent validation processes.  If it adheres to our standards, it is added to one of our Repositories.

This Repository data is then used by IT professionals for benchmarking purposes.

The Repositories

Our two Repositories contain:

Development & Enhancement data from more than 6,000 projects, and

Maintenance & Support data for more than 1,000 projects.

The data originates from a wide range of countries and organisations.  Many different application types and industries are represented.

 

For more details on the ISBSG, download a PowerPoint presentation.

2015 IT CONFERENCE

Visit the IT Confidence Conference 2015 website

 

Automated FPA within a Continuous Integration Environment – T. Barbieri, M. Pasquale

We present a prototype of a technical framework to perform AFP (Automated Function Point Analysis) on software built on a canonical architecture (Java Enterprise with entities modeled using JPA 2.0, the Java Persistence API).  The solution is integrated into SonarQube.  It allows for on-the-fly assessment of a new quality metric that we propose, called “Technical Delay”.

We define Technical Delay as the Technical Debt of the project expressed in Function Points per hour instead of LOC per hour.  Using AFPs as a normalization factor, Technical Delay can be used as an SLA metric on Agile projects, especially in contexts where Continuous Integration is used and the metric needs to be constantly recomputed and monitored.  The presentation will show early results from pilot projects applying the methodology.

Download Presentation – Automated FPA in Continuous Integration Environment

 

Boundary or No Boundary? That’s the (asset) question! – L. Buglione

This presentation is valid for one credit in the IFPUG Certification Extension Program.  It discusses the issue of properly defining boundaries in modern multi-layered applications, starting from the CPM statements and analyzing the issue from a different viewpoint: that of Asset Management.

Download Presentation – Boundary or No Boundary

 

Metrics in Software Verification and Validation: Some Research Challenges (Keynote speech) – A. Fantechi

The complex software applications controlling safety-critical domains, such as transportation, raise increasing concerns about the ability of software development, verification and validation processes to avoid software faults in such applications.  How to measure this is a matter of debate, especially since theory and practice tell us that perfect complex software does not exist.  An account of currently applied techniques and of open research challenges will be given.

Download Presentation – Metrics in Software Verification & Validation

 

Triangle Benchmarking in Practice – P. Forselius

The speaker introduced a new approach to IT project performance benchmarking at IT Confidence 2014 in Tokyo.  Feedback from the audience was enthusiastic and encouraging.  Both customers and suppliers welcomed a new, simple but effective way to compare IT programs and projects against each other, and against the industry’s top performers.

The Triangle Benchmarking concept is based on very few measurements and reasonably light data collection compared to traditional benchmarking services. It can easily be applied within all kinds of organizations, and works especially well with Agile software development projects, whereas the old, heavier benchmarking services were applicable and affordable only for very large corporations.

In this presentation Pekka Forselius will introduce how to use Triangle Benchmarking effectively, and what kind of decisions can be made based on the results.

Download Presentation – Triangle Benchmarking

 

Why Are Estimates Always Wrong: Estimation Bias and Strategic Mis-estimation – D. Galorath

Officially an estimate is “the most knowledgeable statement one can make regarding cost, schedule, effort, risk and probability.” That is an excellent definition.  However, research shows people are generally hardwired to mis-estimate, and those estimates are nearly always on the low, extremely optimistic side. This is a disaster because viable estimates are core to project success as well as ROI determination and other decision making.

After decades of studying estimation, it has become apparent that most people don’t like to estimate and/or don’t know how; those who do estimate are often optimistic and full of unintentional bias, and some mis-estimate strategically.  While many of us spend time on model errors, the biggest source of estimation error usually comes from people, either by accident or strategically.  There is little we can do except provide reality checks via parametric models, analogy data, and process rigor.

This presentation discusses the issues of estimation bias and strategic mis-estimation, as well as how to mitigate these issues.  Both the results of Nobel Prize-winning work and subsequent discoveries by researchers and the author will be discussed.

Download Presentation – Why Can’t People Estimate?

The Influence of Poor Planning on Software Team ManPower and Productivity – C. Gencel, L. Buglione

In physics, potential energy corresponds to the ‘capacity of doing work’.  In software engineering, the potential energy of a development team corresponds to the team’s cumulative intellectual work capacity in a development environment for developing a piece of software during a period of time.  Hence, the efficiency (or, in commonly used terms, productivity) of software development can be denoted as the ratio of the amount of ‘output work’ produced to the team’s cumulative intellectual energy input to do this work.  So, any waste in development decreases productivity.
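In symbols, the ratio the authors describe is simply

    productivity = W_output / E_input

where W_output is the amount of output work produced (for example, functional size delivered) and E_input is the team’s cumulative intellectual energy input; any waste inflates E_input without increasing W_output, and so lowers the ratio.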

The focus of this presentation is on the input effort delivered by the team to do work, rather than on the work output and the factors affecting productivity.  We will first revisit and clarify some fundamental concepts such as Team Size and Team Power, and then investigate empirically the nature of the relationship between Average Team Power and Average Team Size, which we would normally expect to move together.  The results indicate that for a large number of projects, the Average Team ManPower increases up to a point and stays around this figure even though the reported Average Team Size increases further. These preliminary findings suggest poor planning, and hence inefficient utilization of people, in projects that might have resulted in longer durations or higher costs.

Download Presentation – Influence of Poor Planning on Teams & Productivity

 

Automotive – Tips for benchmarking a different software – G. Lami

This presentation discusses the particularities of software for the automotive sector, in order to point out what further data could be collected in a future version of the ISBSG repositories, in particular the Development and Enhancement (D&E) one; for example, starting from the safety and security issues that are extensively addressed by ISO standards such as the ISO 26262 family.

Download Presentation – Tips for Benchmarking

 

Software Data Collection Supporting Defensible Estimates – A. Minkiewicz

Cost modeling and estimation has a long and interesting history in industry and many models and methods have been applied to estimate projects. Over time more and more industry and government professionals are asking for models built or tuned with data that is very specific to their industry and their organization.  Unfortunately, many organizations do not have the infrastructure, processes or tools for collecting project data efficiently.  Among those who do, some still struggle to find the best way to use their data effectively.

This paper chronicles a journey that PRICE Systems has traveled with an Army customer to develop and institutionalize a process for the collection and application of historical cost data for software projects. It discusses the obstacles encountered and the lessons learned along the way.  Attendees will learn how this organization is now armed with the right tools and processes to deliver defensible, credible estimates to their program office.

Download Presentation – Software Data Collection: Supporting Defensible Estimates 

 

Using ISBSG Data to Calculate Agile Velocity – T. Fehlmann

Software developers can face difficulties in predicting the functional size of the finished product without a plan.  It is important for Agile teams to be able to calculate their velocity. Agile velocity is normally measured in story points.  However, story points are not a size but an effort metric, and velocity refers to size, not effort.

This talk explains how the ISBSG database can be used to determine velocity in terms of size, based on the story points assigned to story cards. No additional metrics are needed except some automated COSMIC counting.  The approach uses the powerful concept of transfer functions, and also the Buglione-Trudel matrix for determining story card priorities.
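A minimal Python sketch of the underlying idea, assuming a calibration factor (CFP per story point) is derived from story cards that were both pointed and COSMIC-counted; the crude averaging below is only a stand-in for the transfer-function approach the talk actually describes, and all numbers are invented:

    # Hypothetical history: (story points, automated COSMIC count in CFP) per story card.
    historical = [(5, 11.0), (3, 6.5), (8, 17.0), (2, 4.0)]
    cfp_per_point = sum(cfp for _, cfp in historical) / sum(pts for pts, _ in historical)

    # Express last sprint's velocity as size rather than effort.
    sprint_story_points = 34
    velocity_cfp = sprint_story_points * cfp_per_point
    print(f"~{cfp_per_point:.2f} CFP per story point; sprint velocity ~ {velocity_cfp:.0f} CFP")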

Download presentation – Agile Velocity.

2014 IT CONFERENCE

Analysis of the Factors that Affect Productivity of Enterprise Software – T. Furuyama

This presentation reports the results of an analysis clarifying the factors that affect the productivity of enterprise software projects, as follows:

(1) Productivity is inversely proportional to the fifth root of the test case density and of the fault density, respectively (stated compactly below this list).

(2) Projects requiring software with a high security or reliability level have low productivity, while projects where objectives and priorities are very clear, projects where documentation tools are used, and projects where sufficient work space is provided all have high productivity.

(3) Productivity of projects managed by a skillful project manager is low, because he or she tries to detect many faults.

(4) If the working conditions of a project requiring software with a high security, reliability, or performance and efficiency level are poor, for example if work space is cramped or role assignments and individual responsibilities are not clarified, the project has remarkably low productivity.
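Finding (1), stated compactly (notation ours):

    productivity ∝ (test case density)^(-1/5) and productivity ∝ (fault density)^(-1/5)

so, for example, a 32-fold increase in test case density would correspond to a halving of productivity, since 32^(1/5) = 2.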

Download a paper on this topic.  Download a presentation.  Download a Japanese presentation.

 

Why Can’t People Estimate: Estimation Bias and Strategic Mis-Estimation – D. Galorath

Many people view an estimate as a quick guess that no one believes anyhow.  But producing a viable estimate is core to project success as well as ROI determination and other decision making.

In decades of studying the art and science of estimating it has become apparent that:

  • most people don’t like to and/or don’t know how to estimate;
  • those who estimate are often wildly optimistic, full of unintentional bias;
  • strategic mis-estimating provides misleading estimates when it occurs.

However, it is also obvious that viable estimates can make projects successful, make outsourcing more cost-effective, and help businesses make the most informed decisions.

That is why metrics and models are essential to organizations: they provide tempering with that outside view of reality that is recommended by Daniel Kahneman in his Nobel Prize-winning work on estimation bias and strategic mis-estimation.

Download a paper on this topic.  Download a presentation.

Sizing for estimating, measurement and benchmarking – C. Green

This presentation talks about how sizing can be a normalising factor for estimating, measurement and benchmarking.  It introduces the need to utilise a size measure for both functional and non-functional size, using the IFPUG Function Point Analysis (FPA) method as well as the Software Non-functional Assessment Process (SNAP).

The presentation moves from estimating and measurement for projects to benchmarking for organisations, utilising industry data as the competitive comparison.

The presentation touches on issues with requirements and examines how FPA and SNAP can be used to recover from these.  It examines accuracy levels of size assessment for estimating, and gives a high-level view of the data other than size that should be collected; the focus, however, is on sizing as a measure, not on a full measurement program.

Download presentation – Sizing for Estimating, Measuring and Benchmarking

Measuring Tests using COSMIC – T. Fehlmann & E. Kranich

Information and Communication Technology (ICT) is not limited to software development, mobile apps and ICT service management, but percolates into all kinds of products with the so-called Internet of Things.

ICT depends on software, and software defects are common. Developing software is knowledge acquisition, not civil engineering; thus knowledge might be missing, leading to defects and failures to perform. In turn, operating ICT products involves connecting ICT services with human interaction, and is error-prone as well.

There is much value in delivering software without defects. However, up to now there has been no agreed method of measuring defects in ICT.  UML sequence diagrams are a software model that describes data movements between actors and objects and allows automated measurement using ISO/IEC 19761 COSMIC. Can we also use them for defect measurement, applying standard Six Sigma techniques to ICT by measuring both functional size and defect density in the same model? This allows sizing of functionality and defects even when no code is available. ISO/IEC 19761 measurements are linear, and thus fit sprints in agile development as well as the statistical tools of Six Sigma.
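For context, ISO/IEC 19761 COSMIC measures functional size by counting four kinds of data movements, one CFP each: Entry, Exit, Read and Write. A toy Python sketch of counting them from a sequence-diagram-like message list (the message encoding is invented for illustration):

    # Messages extracted from a UML sequence diagram, each tagged with its
    # COSMIC data-movement type: E(ntry), X (exit), R(ead), W(rite).
    messages = [
        ("user -> app: submit order", "E"),
        ("app -> db: load customer", "R"),
        ("app -> db: store order", "W"),
        ("app -> user: confirmation", "X"),
    ]

    # COSMIC functional size: one CFP per data movement.
    cfp = sum(1 for _, movement in messages if movement in {"E", "X", "R", "W"})
    print(f"Functional size: {cfp} CFP")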

Download presentation – Measuring Tests Using Cosmic

New topics of “IPA/SEC White Paper 2014-2015 on Software Development projects in Japan” – M. Saeki

By analyzing historical data from the software industry, it is possible to improve software productivity and quality. This is done through benchmarking and management decisions about software development practices.

The Software Reliability Enhancement Center (SEC) of the Information-technology Promotion Agency, Japan (IPA) continuously collects new data from software development projects.  This is done in co-operation with more than twenty companies, and the results are published periodically in the “IPA/SEC White Paper on Software Development Projects in Japan”.

The White Papers report the analysis of software development/maintenance projects in the Japanese IT industry, quantitatively demonstrating technological competence concerning software productivity and quality.  IPA/SEC will publish the “IPA/SEC White Paper 2014-2015 on Software Development Projects in Japan” and its addendum this autumn. Its quantitative analyses are backed by a data set of 3,541 projects, and it will contain more than 10 new analyses concerning software productivity and quality.

In this presentation, new analyses about the following topics will be shown:
(1) The relationship among function size, product size, and effort in each development phase.
(2) Productivity variation factors – Productivity (for example, development effort per function point) varies due to reliability requirement grades, number of pages of design documents per function point, and number of test cases per function point.
(3) Reliability variation factors – Reliability (for example, number of identified defects in service per function point) varies due to reliability requirement grades and maturity level of development organization (for example, quality assurance system).

Download presentation – Software Development Projects in Japan

Towards an Early Software Effort Estimation Based on the NESMA Method (Estimated FP) – S. Ohiwa, T. Oshino, S. Kusumoto & K. Matsumoto

The function point (FP) is a software size metric that is widely used in business application software development. Since FPs measure the functional requirements, the measured software size remains constant regardless of the programming language, design technology, or development skills involved. In addition, when planning development projects, FP measurement can be applied early in the development process. A number of FP methods have been proposed.

The International Function Point Users Group (IFPUG) method and the COSMIC method have been widely used in software organizations.

FP is considered one of the most promising approaches to software size measurement, but it has nevertheless not spread throughout the Japanese software industry. One of the reasons hindering the introduction of FPs into software organizations is that function point counting takes a lot of effort. According to the IPA/SEC White Paper on Software Development Projects in Japan 2010-2011, the penetration rate of FP in Japanese software development companies is only 43.8 percent. Also, the survey on Information System User Companies by JUAS disclosed that the penetration rate of FP in Japanese information system user companies is less than 30 percent.

NESMA provides some early function point counting methods. One of them is the estimated function point counting method (called NESMA EFP).  In the EFP, a counter first determines all functions of all function types (ILF, EIF, EI, EO, EQ) in the target specifications. Then the counter rates the complexity of every data function (ILF, EIF) as Low and every transactional function (EI, EO, EQ) as Average, and calculates the total unadjusted function point count. The counting effort is quite small in comparison with the IFPUG method, but there are not many articles that show the usefulness of the NESMA EFP based on actual software project data, especially for its application to software cost prediction.
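A minimal Python sketch of the EFP calculation as described above, using the standard IFPUG unadjusted weights (Low data functions: ILF = 7, EIF = 5; Average transactional functions: EI = 4, EO = 5, EQ = 4); the example counts are invented:

    # NESMA estimated FP: every data function rated Low, every transactional
    # function rated Average, then sum the unadjusted function points.
    WEIGHTS = {"ILF": 7, "EIF": 5, "EI": 4, "EO": 5, "EQ": 4}

    def nesma_efp(counts: dict) -> int:
        """Unadjusted EFP from the number of functions of each type."""
        return sum(WEIGHTS[t] * n for t, n in counts.items())

    print(nesma_efp({"ILF": 10, "EIF": 4, "EI": 25, "EO": 15, "EQ": 12}))  # 313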

This paper aims to evaluate the validity of using the NESMA EFP as an alternative to the IFPUG FP in the early estimation of software development effort. In the evaluation, we used the software development data of 36 projects extracted from a software repository that maintains 115 data items of 512 software development projects collected by the Economic Research Association from 2008 through 2012. Common characteristics of these 36 projects are as follows:
•  Software was newly developed.
•  Software development includes the following five software-specific low-level processes: architectural design, detailed design, construction, integration, and qualification testing.
•  Actual FP and total amount of effort are available.
•  Actual functional size of each function type in all functions is available.
•  The function types for each function have realistic functional sizes. For example, the average functional size of the ILFs of each function is from 7 to 15.

The main results of the empirical evaluation, and their contributions to software development, are as follows:
(1) There is an extremely high correlation between the IFPUG FP count and the NESMA EFP count
Figure 1 is a scatter plot showing the relationship between the IFPUG FP count and the NESMA EFP count in 36 software development projects. The coefficient of determination between these two FP counts is 0.970.

This result is not inconsistent with the previous empirical evaluation by NESMA reported in the document “Early Function Point Counting.” In the NESMA evaluation, the upper bound of the FP count was about 3,000; in this evaluation it is about 30,000. This implies that the NESMA EFP can be used as an alternative to the IFPUG FP more widely in software development projects in Japan than before. The NESMA EFP may also be useful for individuals and companies who are considering whether to use the IFPUG FP in their software development projects, as a way to evaluate the feasibility of applying the IFPUG FP.
(2) There is a high correlation between the NESMA EFP count and the software development effort
Figure 2 is a scatter plot showing the relationship between the NESMA EFP count and the total amount of software development effort in the 36 software development projects. The coefficient of determination between the EFP count and the development effort is 0.823. This implies that the NESMA EFP may be usable to predict software development effort in the early stages of a software development project.

Early software effort estimation is one of the most important issues in software project management, so this result also encourages the many individuals and companies who are considering whether to use the IFPUG FP in their software development projects. The coefficient is high, but we should continue further discussion and data analysis to eliminate or adjust some outliers and improve the accuracy of effort prediction by the NESMA EFP.
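As a reminder of what the reported coefficient of determination measures, a minimal Python sketch (with made-up data standing in for the 36 projects) of fitting effort against EFP and computing R²:

    import numpy as np

    efp = np.array([120, 450, 900, 2300, 5100, 12000])         # NESMA EFP counts
    effort = np.array([800, 3100, 6500, 15500, 36000, 90000])  # effort in hours

    slope, intercept = np.polyfit(efp, effort, 1)  # least-squares line
    predicted = slope * efp + intercept
    ss_res = np.sum((effort - predicted) ** 2)
    ss_tot = np.sum((effort - effort.mean()) ** 2)
    print(f"R^2 = {1 - ss_res / ss_tot:.3f}")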

Download presentation – Towards an Early Software Effort Estimation Based on Nesma Method

Software Rates vs Price of Function Points: A cost analysis – R.D. Fernández, R. De La Fuente & D. Castelo

Implementing productivity models helps in the understanding of Software Development Economics, which up to now is not entirely clear. Most organizations believe that the only way to achieve improvements is lowering software rates. With a background of three years of statistical data from large multinational clients, LEDAmc presented at the UKSMA 2012 Conference a study showing how the relationship between software rates and cost per function point differs from what might be expected, sometimes by far. The experience gained by LEDAmc through the implementation of software productivity models over the last two years brings new and updated insights to this study, which will be presented during the conference.

Download presentation – Software Rates vs FP Price

Beyond the Statistical Average: The KISIS Principle (Keeping it Simple is Stupid) – J. Ogilvie

Based on the speaker’s experience negotiating and managing many outsourcing contracts using Function Points as a Key Performance Indicator, this presentation describes the pitfalls that can be experienced if one takes too simplistic a view of the meaning and use of Function Point data and suggests ways in which they may be avoided.
Starting with a typical outsourcing scenario, and using ISBSG project data, techniques to improve the effectiveness of a Function Point program are demonstrated.
Particular emphasis is placed on the importance of setting baselines appropriate to the environment to be measured, and on deciding how to determine whether agreed performance targets are achieved.
The use of statistical analysis beyond just averages, to enable a more sophisticated and pragmatic interpretation of measurement data, is demonstrated. The view that a little statistical analysis can actually uncover “lies and damn lies” is offered.
Finally, a template for design of a successful Function Point Program is presented.

Download presentation – KISIS Principle

New Look at Project Management Triangle – P. Forselius

Almost every project management book introduces the project management triangle, and almost every certified Project Manager thinks that she or he understands the relationships between the elements of the triangle correctly: “The larger the scope, the more cost and time needed”. However, especially in the ICT industry, the majority of projects overrun both budget and schedule, and deliver less functionality than expected. In this presentation we take another look at the project management triangle, to learn how to get more outcomes while spending less money and time.

Download presentation – Project Management Triangle

2013 IT CONFERENCE

What is your quest for software analytics? – S. Woodward

One of the core considerations with data analytics is recognizing “what is your quest”.  Many options and approaches are used in data analytics, several of which are of interest to the software sector.

The world has changed culturally and technically; the need to be value-focused and innovative is more important today than ever before.  Steven shares several perspectives and analogies where data with analytics can alter future behaviour, lowering risk and optimizing the solution.  Several updates from cloud, government and academia will also be shared, covering current activities and how the metrics community can collaborate.

Download presentation – What is your logical quest for software analytics?

IT Data Collection, Analysis and Benchmarking: From Processes and Benchmarks Into Real Business Value – D. Galorath

In an IT context, companies struggle to increase profits and often view IT as a necessary evil: one that consumes resources rather than contributing to the bottom line. These organizations often don’t see value in data collection, analysis or benchmarking either. However, IT can be a significant contributor when IT decisions are made after measuring and estimating both cost and return.

IT data collection, analysis and benchmarking continue to improve the cost of IT systems and help make decisions about where to spend money to stop the bleeding. Repeatable processes for estimating cost, schedule and risk will be addressed, along with the “iron triangle” of software. The Iron Triangle looks at the issues of cost, schedule, scope and quality, and helps determine what must give when a client increases scope, reduces schedule or reduces budget.

Additionally, this presentation will address the risk-adjusted Total Cost of Ownership and the return on an IT investment, along with its consistency with the long-range investment and business strategy of the organization, measured against risk, key technical and performance parameters, and technical debt. Finally, the presentation will address the overriding business concerns: how much value does this software contribute to the business, and is this the best place to spend the money?

Download Presentation – IT Collection, Analysis & Benchmarking

Using Benchmarks to Accelerate Process Improvement – J. Schofield

Organizations are constantly pressured to prove their value to their leadership and customers. A relative comparison to “peer groups” is often seen as useful and objective, thus benchmarking becomes an apparent alternative.

Unfortunately, organizations new to benchmarking may have limited internal data for making valid comparisons. Feedback and subsequent “action” can quickly lead to the wrong results as organizations focus on improving their comparisons instead of improving their capability and consistency.

Adding to the challenge of improving results, software organizations may rely on more readily available schedule and financial data rather than KPIs for product quality and process consistency. This presentation provides measurement program lessons learned and insights to accelerate benchmark and quantification activities relevant to both new and mature measurement programs.

Download Presentation – Use Benchmarks to Accelerate Process Improvement

KPIs used in a 6,000 Function Points Program – M. Silveira

This presentation provides a walkthrough of the application development KPIs that were used to understand the performance of a 6,000 Function Point program. This program, composed of 18 modules/projects, was delivered in 20 months, consuming over 220,000 hours. Several analyses were performed during program execution, but the presentation focuses on the final results and lessons learned. The major metrics areas/KPIs covered are: sizing, duration, effort, staffing, change, productivity, defects and use cases.

Download presentation – KPIs Used in a 6,000 FP Program

Lessons Learned from the ISBSG Database – A. Minkiewicz 

As corporate subscribers and partners of the International Software Benchmarking Standards Group (ISBSG), PRICE has access to a wealth of data about software projects.

The ISBSG was formed in 1997 with the mission “To improve the management of IT resources by both business and government through the provision and exploitation of public repositories of software engineering knowledge that are standardized, verified, recent and representative of current technologies.” This database contains detailed information on close to 6,000 development and enhancement projects and more than 500 maintenance and support projects. To the best of this author’s knowledge, this database is the largest, most trusted source of publicly available software data that has been vetted and quality-checked.

The data covers many industry sectors and types of businesses, though it is weak on data from the aerospace and defense industries. Nevertheless, there are many things we can learn from analysis of this data.

The Development and Enhancement database contains 121 columns of project information for each project submitted. This includes information identifying the type of business and application, the programming language(s) used, the functional size of the project in one of the many functional measures available in the industry (IFPUG, COSMIC, NESMA, etc.), project effort normalized based on the project phases the report contains, Project Delivery Rate (PDR), elapsed project time, and so on.
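Project Delivery Rate, for instance, is simply effort per unit of functional size. A one-line Python sketch (figures invented):

    # PDR: hours of work effort per function point delivered.
    pdr = 5200.0 / 650.0  # normalised effort (h) / functional size (FP) = 8.0 h/FP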

PRICE Systems has recently partnered with ISBSG, with licenses to both data repositories. Although we cannot distribute the data to those without subscriptions, there is no reason we can’t use analysis of this data to provide guidance to users of our software estimation tool, TruePlanning.

One effort focused on developing calibrated software estimation templates for True S based on various scenarios across industry sector, application type, development type (new or enhancement) and language type (3GL, 4GL). This exercise combined data mining, statistical analysis, and expert judgment.

This paper discusses the methodology used to derive these templates and presents the findings of this research. While the actual analysis is focused on a particular software estimating model, the research, analysis and techniques should inform similar analyses that are tool agnostic.

Download presentation – Lessons Learned from the ISBSG Database

Software Estimation – The next level – T. Dekkers

The Total Cost Management (TCM) Framework of the Association for the Advancement of Cost Engineering (AACE) International is an integrated approach to portfolio, program and project management. It provides a structured, annotated process map that explains each practice area of the cost engineering field in the context of its relationship to the other practice areas, including allied professions. In other words, it is a process for applying the skills and knowledge of cost engineering.

A key feature of the TCM Framework is that it highlights and differentiates the main cost management application areas: project control and strategic asset management. In this paper the focus is on project control.

In the TCM Framework, the Basis of Estimate (BOE) is characterised as the one deliverable that defines the scope of the engagement and ultimately becomes the basis for change management. When prepared correctly, any person with (capital) project experience can use the BOE to understand and assess the estimate, independent of any other supporting documentation. A well-written BOE achieves those goals by clearly and concisely stating the purpose of the estimate being prepared (i.e. cost/ effort/duration study, project options, funding, etc.), the project scope, cost basis, allowances, assumptions, exclusions, cost risks and opportunities, contingencies, and any deviations from standard practices.

A BOE document is a required component of a cost estimate. Because of its relevance, a BOE template is included in the set of AACE International recommended practices (RPs). This template provides guidelines for the structure and content of a cost basis of estimate.

Although the Software Services Industry is not always happy with the opinion that it is different from other industries, analysis of the BOE shows that the structure is applicable but needs to be adapted to match practice in Software Services. In addition, the terminology used does not reflect the activities, components, items, issues, etc. of the Software Services Industry.

The tailored version, Basis of Estimate – As Applied for the Software Services Industries, provides guidelines for the structure and content of a cost basis of estimate specific to the software services industries (i.e. software development, maintenance & support, infrastructure, services, research & development, etc.).

With this BOE, a structure is provided for further standardisation of the estimation process, a more consistent use of metrics (sizing, effort, schedule, quality), transparent options for control (benchmark, audit, bid validation) and a common approach to assumptions and associated risks.

Download presentation – Software Estimation – The Next Level

Software Rates vs Cost per Function Point: a cost analysis of 10,000 software projects from 8 large clients – D. Castelo, R. De La Fuente

Implementing productivity models helps in the understanding of Software Development Economics, which up to now is not entirely clear. Most organizations believe that the only way to achieve improvements is lowering software rates.

With a background of three years of statistical data from large multinational clients, the presentation shows how the relationship between software rates and cost per function point differs from what might be expected, sometimes by far. Leaning on a statistical demonstration, the conclusion reached is that excessive pressure on software rates destroys the concept of software rates in outsourcing processes. The study results also lead us to some considerations regarding how software development activity is understood and managed, from both the client and the software provider perspective.
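The arithmetic behind this framing is simple (our notation, not the authors’):

    cost per FP = hourly rate (currency/h) × delivery rate (h/FP)

so pushing the hourly rate down reduces the cost per function point only if the hours needed per function point do not rise by more than the rate falls.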

Download presentation – Software Rates vs Cost per FP

Application Lifecycle Management and process monitoring through an integrated and low-cost solution, mainly based on Open Source Software products – N. Bertazzo, D. Gagliardi, S. Oltolina, G. Ruffatti

Requirements play a crucial role in defining the boundaries and identity of a project: their traceability, correct development, sharing with the stakeholders and validation determine the project’s failure or success. Moreover, Quality Assurance (QA) processes facilitate project management activities.

Proper measures and indicators are key information for knowing whether a project is on the right track. Engineering Group (www.eng.it) intends to show how its integrated solution, recognized as compliant with CMMI-DEV principles and based on SPAGO4Q (www.spago4q.org) and the QEST nD model (Buglione-Abran), made up of a set of open source and low-cost tools, makes it possible to:
• manage the application lifecycle in a complex, flexible and shared environment, enhancing communication among the project stakeholders;
• manage internal project assessment activities by the QA Department;
• monitor projects and measure performance, allowing information sharing among the stakeholders.

The use of an integrated, low-cost solution compliant with the CMMI requirements, which can easily be extended and integrated with other corporate tools, has been a key success factor at Engineering Group, fostering the adoption of well-defined ALM processes and effective software lifecycle management. This solution can integrate other applications developed by different divisions of the company, reducing the duplication of information and fostering the sharing of lessons learned.

Download presentation – Application Lifecycle Management

Analysis of ISBSG r12 Data for Understanding Testing Effort – A. Abran & K. Jayakumar

ISBSG R12 data was analysed to understand the effort expended on the testing phase of software development projects. The available data was filtered to retain web-based/client-server projects, in order to make the results applicable to current-generation projects. The data was further filtered to include projects whose size was measured using either IFPUG FPA or COSMIC.

Analysis of the data resulted in three homogeneous groups of projects:
(a) Projects consuming low test effort (up to 1 hour per function point),
(b) Projects consuming average test effort (above 1 but less than 3 hours per function point), and
(c) Projects consuming high test effort (above 3 hours per function point).
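A trivial Python classifier for these thresholds (values taken from the grouping above; the treatment of exactly 3 h/FP is an assumption, since the source leaves that boundary ambiguous):

    def test_effort_group(test_hours: float, function_points: float) -> str:
        """Classify a project by testing effort per function point."""
        hours_per_fp = test_hours / function_points
        if hours_per_fp <= 1:
            return "low"      # group (a): up to 1 h/FP
        if hours_per_fp < 3:
            return "average"  # group (b): above 1 but less than 3 h/FP
        return "high"         # group (c): 3 h/FP and above (assumed inclusive)

    print(test_effort_group(800, 1000))   # low
    print(test_effort_group(2500, 1000))  # average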

The presentation will provide key statistics computed for each of the three groups, including the size-to-effort relationship.  Further analysis of the three groups produced two key findings, which will be discussed in the presentation:
(1) More than 60% of the projects that consumed low testing effort (the first group) had specification reviews, design reviews or code reviews as part of their development phase, while only a small percentage of the projects in the second and third groups had such reviews prior to testing.
(2) Another interesting observation from the analysis relates to test automation: over 90% of the projects where tests were automated consumed low testing effort.
Even though both of the above observations are intuitive, the study with ISBSG data provides quantitative, objective support.

The presentation will also provide key test effort statistics on various subsets of the R12 data, such as development projects, enhancement projects, business applications, real-time applications, projects measured using IFPUG FPA and projects measured using COSMIC. As there are no ISBSG reports specifically on software testing effort, this information will be useful to the industry.

Download presentation – Analysis of ISBSG R12 Data

Are we really bad? A look at software estimation accuracy – P. Hill

Using data from completed software projects in the ISBSG repository, we will look at how people have gone about estimating their software projects and how well they did it. We will look at estimation techniques used, the accuracy of estimates and relationships between the estimates.
We will then offer practical tips and some steps you can take to determine how realistic your own estimates are.

Download presentation – Are we really that bad?