2013 IT CONFERENCE

What is your quest for software analytics? –  S. Woodward

One of the core considerations with data analytics is recognizing “what is your quest”.  Many options and approaches are used in data analytics, several of which are of interest to the software sector.

The world has changed culturally and technically; the need to be value-focused and innovative is more important today than ever before.  Steven shares several perspectives and analogies where data combined with analytics can alter future behaviour, lowering risk and optimizing the solution.  Several updates will also be shared from cloud, government and academia regarding current activities and how the metrics community can collaborate.

Download presentation – What is your logical quest for software analytics?

IT Data Collection, Analysis and Benchmarking: From Processes and Benchmarks Into Real Business Value – D. Galorath

In an IT context, companies struggle to increase profits and often view IT as a necessary evil: one that consumes resources rather than contributing to the bottom line. These organizations often don’t see value in data collection, analysis or benchmarking either. However, IT can be a significant contributor when IT decisions are made after measuring and estimating both cost and return.

IT data collection, analysis and benchmarking continue to help control the cost of IT systems and to guide decisions about where to spend money to stop the bleeding. As such, repeatable processes for estimating cost, schedule and risk will be addressed, along with the “iron triangle” of software. The Iron Triangle looks at the issues of cost, schedule, scope and quality and helps determine what must give when a client increases scope, reduces schedule or reduces budget.

Additionally, this presentation will address the risk-adjusted Total Cost of Ownership and return on an IT investment, along with its consistency with an organization’s long-range investment and business strategy, measured against risk, key technical and performance parameters, and technical debt. Finally, the presentation will address the overriding business concerns: how much value does this software contribute to the business, and is it the best place to spend the money?

Download Presentation – IT Collection, Analysis & Benchmarking

Using Benchmarks to Accelerate Process Improvement – J. Schofield

Organizations are constantly pressured to prove their value to their leadership and customers. A relative comparison to “peer groups” is often seen as useful and objective, so benchmarking becomes an obvious alternative.

Unfortunately, organizations new to benchmarking may have limited internal data for making valid comparisons. Feedback and subsequent “action” can quickly lead to the wrong results as organizations focus on improving their comparisons instead of improving their capability and consistency.

Adding to the challenge of improving results, software organizations may rely on more readily available schedule and financial data rather than KPIs for product quality and process consistency. This presentation provides measurement program lessons learned and insights to accelerate benchmark and quantification activities relevant to both new and mature measurement programs.

Download Presentation – Use Benchmarks to Accelerate Process Improvement

KPIs used in a 6,000 Function Points Program – M. Silveira

This presentation provides a walkthrough of the application development KPIs that were used to understand the performance of a 6,000 Function Points program. This program, composed of 18 modules/projects, was delivered in 20 months, consuming over 220,000 hours. Several analyses were performed during the program execution, but the presentation focuses on the final results and lessons learned. The major metrics areas/KPIs that will be covered are: Sizing, Duration, Effort, Staffing, Change, Productivity, Defect, Use Case
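The headline numbers in the abstract already allow two of the simplest program-level KPIs to be computed. The sketch below is only an illustration based on the quoted totals; the function names and the choice of KPIs are assumptions, not the presenter's actual model:

```python
# Illustrative KPI calculations from the totals quoted in the abstract
# (6,000 FP, 220,000 hours, 20 months). Function names are hypothetical.

def project_delivery_rate(effort_hours: float, size_fp: float) -> float:
    """Hours of effort per function point (lower means more productive)."""
    return effort_hours / size_fp

def delivery_speed(size_fp: float, duration_months: float) -> float:
    """Function points delivered per calendar month."""
    return size_fp / duration_months

pdr = project_delivery_rate(220_000, 6_000)   # ~36.7 h/FP
speed = delivery_speed(6_000, 20)             # 300 FP/month
print(f"PDR: {pdr:.1f} h/FP, speed: {speed:.0f} FP/month")
```

Comparing such program-level figures against the per-project KPIs listed above is one way the analyses described in the presentation could be framed.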

Download presentation – KPIs Used in a 6,000 FP Program

Lessons Learned from the ISBSG Database – A. Minkiewicz 

As corporate subscribers and partners of the International Software Benchmarking Standards Group (ISBSG), PRICE has access to a wealth of data about software projects.

The ISBSG was formed in 1997 with the mission “To improve the management of IT resources by both business and government through the provision and exploitation of public repositories of software engineering knowledge that are standardized, verified, recent and representative of current technologies.” This database contains detailed information on close to 6000 development and enhancement projects and more than 500 maintenance and support projects. To the best of this author’s knowledge, this database is the largest, most trusted source of publicly available software data that has been vetted and quality checked.

The data covers many industry sectors and types of businesses, though it is weak on data in the aerospace and defense industries. Nevertheless, there are many things we can learn from analysis of this data.

The Development and Enhancement database contains 121 columns of project information for each project submitted. This includes information identifying the type of business and application, the programming language(s) used, the Functional Size of the project in one of the many Functional Measures available in the industry (IFPUG, COSMIC, NESMA, etc.), project effort normalized based on the project phases the report contains, the Project Delivery Rate (PDR), elapsed project time, etc.
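To make the relationship between these fields concrete, the sketch below shows how a PDR-style figure could be derived from ISBSG-like records. The column names and values are hypothetical; the real repository uses its own field names and normalization rules:

```python
# Hypothetical ISBSG-style records: functional size plus normalized effort.
# Field names and values are assumptions for illustration only.
records = [
    {"app_type": "Banking", "size_fp": 450,  "normalised_effort_h": 3_600},
    {"app_type": "Telecom", "size_fp": 1_200, "normalised_effort_h": 14_400},
]

# Project Delivery Rate: normalized effort divided by functional size
# (hours per function point; lower is more productive).
for r in records:
    r["pdr_h_per_fp"] = r["normalised_effort_h"] / r["size_fp"]
    print(r["app_type"], round(r["pdr_h_per_fp"], 1), "h/FP")
```

Because effort is normalized to the phases each submission actually reports, PDR values computed this way are comparable across projects in a way that raw effort totals are not.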

PRICE Systems has recently partnered with ISBSG, with licenses to both data repositories. Although we cannot distribute the data to those without subscriptions, there is no reason we can’t use analysis of this data to provide guidance to users of our software estimation tool, TruePlanning.

One effort focused on developing calibrated software estimation templates for True S based on various scenarios across industry sector, application type, development type (new or enhancement) and language type (3GL, 4GL). This exercise combined data mining, statistical analysis, and expert judgment.

This paper discusses the methodology used to derive these templates and presents the findings of this research. While the actual analysis is focused on a particular software estimating model, the research, analysis and techniques should inform similar analyses that are tool agnostic.

Download presentation – Lessons Learned from the ISBSG Database

Software Estimation – The next level – T. Dekkers

The Total Cost Management (TCM) Framework of the Association for the Advancement of Cost Engineering (AACE) International is an integrated approach to portfolio, program and project management. It provides a structured, annotated process map that explains each practice area of the cost engineering field in the context of its relationship to the other practice areas, including allied professions. In other words, it is a process for applying the skills and knowledge of cost engineering.

A key feature of the TCM Framework is that it highlights and differentiates the main cost management application areas: project control and strategic asset management. In this paper the focus is on project control.

In the TCM Framework, the Basis of Estimate (BOE) is characterised as the one deliverable that defines the scope of the engagement and ultimately becomes the basis for change management. When prepared correctly, any person with (capital) project experience can use the BOE to understand and assess the estimate, independent of any other supporting documentation. A well-written BOE achieves those goals by clearly and concisely stating the purpose of the estimate being prepared (i.e. cost/ effort/duration study, project options, funding, etc.), the project scope, cost basis, allowances, assumptions, exclusions, cost risks and opportunities, contingencies, and any deviations from standard practices.

A BOE document is a required component of a cost estimate. Because of its relevance, a BOE document is included in the set of AACE International recommended practices (RPs). This template provides guidelines for the structure and content of a cost basis of estimate.

Although the opinion that the Software Services Industry is different from other industries is not universally welcomed, analysis of the BOE shows that its structure is applicable but needs to be adapted to match practice in Software Services. In addition, the terminology used does not reflect the activities, components, items, issues, etc. of the Software Services Industry.

The tailored version, Basis of Estimate – As Applied for the Software Services Industries, provides guidelines for the structure and content of a cost basis of estimate specific to the software services industries (i.e. software development, maintenance & support, infrastructure, services, research & development, etc.).

With this BOE a structure is provided for further standardisation of the Estimation Process, a more consistent use of metrics (sizing, effort, schedule, quality), transparent options for control (benchmark, audit, bid validation) and a common approach on assumptions and associated risks.

Download presentation – Software Estimation – The Next Level

Software Rates vs Cost per Function Point: a cost analysis of 10,000 software projects from 8 large clients – D. Castelo, R. De La Fuente

Implementing productivity models helps in the understanding of Software Development Economics, which up to now is not entirely clear. Most organizations believe that the only way to achieve improvements is lowering software rates.

With a background of three years of statistical data from large multinational clients, the presentation shows how the relationship between software rates and cost per function point differs from what could be expected, sometimes even far from expected. Leaning on a statistical demonstration, the reached conclusion is that excessive pressure on software rates destroys the concept of software rates in outsourcing processes. The study results also lead us to some considerations regarding how software development activity is understood and managed, both from the client and the software provider perspective.
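The core arithmetic behind the study's argument can be sketched simply: unit cost is the product of the hourly rate and the delivery rate, so a cheaper rate paired with lower productivity can still raise the cost per function point. The numbers below are hypothetical, chosen only to illustrate the mechanism the abstract describes:

```python
# Cost per function point = hourly rate x hours needed per function point.
# The rates and delivery rates below are invented for illustration.

def cost_per_fp(hourly_rate: float, hours_per_fp: float) -> float:
    """Unit cost of delivered functionality."""
    return hourly_rate * hours_per_fp

baseline = cost_per_fp(50, 10)   # 500 per FP at the original rate
squeezed = cost_per_fp(40, 14)   # 560 per FP: lower rate, but productivity
                                 # dropped, so the unit cost went UP
print(baseline, squeezed)
```

This is why, as the presentation argues, pressure on rates alone is not the same as pressure on cost.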

Download presentation – Software Rates vs Cost per FP

Application Lifecycle Management and process monitoring through an integrated and low-cost solution, mainly based on Open Source Software products – N.Bertazzo, D. Gagliardi, S. Oltolina, G. Ruffatti

Requirements play a crucial role in the definition of the boundaries and identity of a project: their traceability, correct development, sharing with the stakeholders and validation determine the project failure or success. Moreover, Quality Assurance (QA) processes facilitate project management activities.

Proper measures and indicators are key information for knowing whether a project is on the right track. Engineering Group (www.eng.it) intends to show how its integrated solution, recognized as compliant with CMMI-DEV principles and based on SPAGO4Q (www.spago4q.org) and the QEST nD model (Buglione-Abran), made of a set of open source and low-cost tools, makes it possible to:
• manage the application lifecycle in a complex, flexible and shared environment, enhancing communication among project stakeholders;
• manage internal project assessment activities by the QA Department;
• monitor projects and measure performance, allowing information sharing among the stakeholders.

The use of an integrated, low-cost solution compliant with CMMI requirements, which can be easily extended and integrated with other corporate tools, has been a key success factor at Engineering Group, fostering the adoption of well-defined ALM processes and effective software lifecycle management. This solution can integrate other applications developed by different divisions of the company, reducing the duplication of information and fostering the sharing of lessons learned.

Download presentation – Application Lifecycle Management

Analysis of ISBSG r12 Data for Understanding Testing Effort – A. Abran & K. Jayakumar

ISBSG R12 data was analysed to understand the effort expended on the testing phase of software development projects. The available data was filtered to retain web-based/client-server projects, in order to make the results applicable to current-generation projects. The data was further filtered to include only projects whose size was measured using either IFPUG FPA or COSMIC.

Analysis of the data resulted in three homogeneous groups of projects:
(a) projects consuming low test effort (up to 1 hour per function point),
(b) projects consuming average test effort (above 1 but less than 3 hours per function point), and
(c) projects consuming high test effort (above 3 hours per function point).

The presentation will provide key statistics computed for each of the three groups, including the size-to-effort relationship.  Further analysis of the three groups resulted in two key findings which will be discussed in the presentation:
(1) More than 60% of the projects that consumed low testing effort (the first group) had specification reviews, design reviews or code reviews as part of their development phase, while only a small percentage of projects in the second and third groups had such reviews prior to testing.
(2) Another interesting observation from the analysis relates to test automation: over 90% of projects where tests were automated consumed low testing effort.
Even though both of the above observations are intuitive, the study of the ISBSG data provides quantitative, objective support.

The presentation will also provide key test effort statistics on various subsets of the R12 data, such as Development Projects, Enhancement Projects, Business Applications, Real-time Applications, projects measured using IFPUG FPA and projects measured using COSMIC. As there are no ISBSG reports specifically on software testing effort, this information will be useful to the industry.

Download presentation – Analysis of ISBSG R12 Data

Are we really bad? A look at software estimation accuracy – P. Hill

Using data from completed software projects in the ISBSG repository, we will look at how people have gone about estimating their software projects and how well they did it. We will look at estimation techniques used, the accuracy of estimates and relationships between the estimates.
We will then offer practical tips and some steps you can take to determine how realistic your own estimates are.
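One common way to quantify the estimation accuracy this talk examines is the Magnitude of Relative Error (MRE), a standard measure in the estimation literature; whether the presenter uses this exact measure is not stated in the abstract, so the sketch below is only an illustration:

```python
def mre(actual: float, estimate: float) -> float:
    """Magnitude of Relative Error: |actual - estimate| / actual.

    0.0 means a perfect estimate; 0.2 means the estimate was off by
    20% of the actual value.
    """
    return abs(actual - estimate) / actual

# e.g. a project estimated at 800 hours that actually took 1000 hours:
print(mre(actual=1000, estimate=800))   # 0.2
```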

Download presentation – Are we really that bad?