Issues with IFPUG Counting Practices Version 4


Document Info.
First presented at: IFPUG Spring Conference, 1996
Reference Category: Tools and Techniques

by: M J Burn-Murdoch

The author entered the IT world as a programmer in 1967 and managed his first IT project in 1970. He started using function points as a solution to a management problem in 1984. Since then he has counted function points for a large number of clients in a wide range of industries throughout Western Europe, on systems ranging in size from a few function points to more than 50,000 function points. This length and breadth of practical application of the technique makes him one of the world's most experienced function point counters. He has served on the management committee of the United Kingdom Function Point User Group since 1990 and was responsible for organising its conferences from 1990 to 1995.

1. Introduction.

Version 4 of the Counting Practices Manual has now been in use for some two years. It is time to take stock and evaluate the greatly enhanced definition and explanation of the rules. In general the document has proved to be a considerable improvement on its predecessors, and the improved manual should reduce the opportunities for different interpretations within the function point community. However, there are a number of issues with Version 4 of the counting practices. Some of the issues are specific to Version 4; other issues date from earlier versions.

Function points are referred to as a measurement, but it is important to realise that they are a statistical measure. Function point counters are not measuring systems so much as statistically sampling them. The technique relies on definitions of categories, and on their consistent interpretation, to provide a comparable index of system size. Inevitably, at the boundary of a category, the step from one category to another is sudden. This gives rise to perceived injustices in the eyes of the "victims" of the counts (project development staff) and to genuine difficulties of interpretation with considerable consequences. A low complexity input equates to 3 function points and a medium complexity input to 4 function points: only 1 function point difference, but a 33% increase, which can be caused by adding a single field to a screen. This is not an issue, just a consequence of the technique.

A problem that follows from this consequence is that differing interpretations of the counting practices amongst counters can have a serious effect on counting consistency. One intention of this paper is to try to resolve areas where different interpretations can occur. The actual interpretation chosen is not as important as ensuring that every counter applies the same interpretation.

This paper also suggests possible solutions to the issues raised. Some of the solutions would require a change to the technique derived from Alan Albrecht's original research; other solutions merely seek to clarify existing rules. Some of the proposed solutions may require additional or revisited research to test the validity of the postulations. The function point community should expect the rules to evolve, both as the sophistication of the business problems to be solved increases over time and in order to extend the useful scope of the technique.

The General Systems Characteristics raise considerable questions, but it is not the intention of this paper to cover that area of Function Point Analysis.

2. Transaction Functions

2.1 External Inquiries and External Input/External Output Pairs

A typical insurance application requires the ability to examine payment histories for specific policy holders.

The input DET is the Policy Holder; the output DETs are Policy Number, Policy Holder, Policy Holder's Address, Payment Date, Payment Source, Payment Reference and Payment Amount. The ILFs referenced are Policy, Policy Holder and Payments.

 Score:  EQ   In 1/1   Out 7/3   Complexity medium   FPs = 4

A simple change to the scenario, adding a Total Payments To Date, either doubles the score or does not affect it at all:

 either:  EI 1/1   Complexity low      FPs = 3
          EO 8/3   Complexity medium   FPs = 5
 or:      the Payments ILF holds the total anyway, so there is no calculation and the original scoring stands.
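
As a cross-check, the following minimal Python sketch reproduces the two scorings. The band limits and weights are not given in this paper; they are assumed from the Release 4.0 complexity matrices, and the helper names are purely illustrative.

    # Assumed Release 4.0 complexity bands and weights (not stated in this paper).
    MATRIX = [["low", "low", "medium"],      # rows = FTR band, columns = DET band
              ["low", "medium", "high"],
              ["medium", "high", "high"]]
    RANK = {"low": 0, "medium": 1, "high": 2}

    def band(value, limits):
        low, high = limits
        return 0 if value <= low else (1 if value <= high else 2)

    def ei_complexity(dets, ftrs):
        # External Input: DET bands 1-4 / 5-15 / 16+, FTR bands 0-1 / 2 / 3+.
        return MATRIX[band(ftrs, (1, 2))][band(dets, (4, 15))]

    def eo_complexity(dets, ftrs):
        # External Output: DET bands 1-5 / 6-19 / 20+, FTR bands 0-1 / 2-3 / 4+.
        return MATRIX[band(ftrs, (1, 3))][band(dets, (5, 19))]

    EI_FPS = {"low": 3, "medium": 4, "high": 6}
    EO_FPS = {"low": 4, "medium": 5, "high": 7}
    EQ_FPS = {"low": 3, "medium": 4, "high": 6}

    # Counted as one EQ: input 1 DET / 1 FTR, output 7 DETs / 3 FTRs.
    eq = max(ei_complexity(1, 1), eo_complexity(7, 3), key=RANK.get)
    print("EQ:", eq, EQ_FPS[eq], "FPs")                   # medium, 4 FPs

    # Counted as an EI/EO pair once Total Payments To Date is added.
    ei, eo = ei_complexity(1, 1), eo_complexity(8, 3)
    print("EI + EO:", EI_FPS[ei] + EO_FPS[eo], "FPs")     # 3 + 5 = 8 FPs
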

A doubling of the score based upon the whim of the designer should not be acceptable.

The suggested solution to this problem is to have only one transaction function. For the proposed scorings see 2.3, Complexity Granularity.

2.2. External Outputs That Update Files

The only transaction function whose rules specifically mention updating files is the External Input. In a typical sales ledger system the process that prints an invoice also records that the invoice has been issued, its number and the date on which it was issued. Printing the invoice is definitely an External Output, and yet files are updated.

There are two possible solutions to this problem: either change the rules so that an External Output can update a file, counting the update simply as an additional file reference, or again remove the differences between the transaction functions.

2.3. Complexity Granularity

The Counting Practices have always had just three levels of complexity (currently low, medium and high). This means that a report with four FTRs and six DETs, and a report with the same number of FTRs but more than one hundred DETs, would both count as high complexity. Similarly, an input with a single DET and an input with four times that number of DETs would both count as low complexity.

There is at least a perceived problem with the granularity of complexity. This can either be written off as a characteristic of the particular statistical technique employed or, in combination with the single transaction function proposed in 2.1, addressed by moving to very low, low, medium, high and very high complexity.

To introduce the additional complexities, the complexity matrices could have their columns numbered 1 through 3 and their rows 1 through 3. Complexity 1/1 would then be very low and 3/3 very high; the remaining combinations would stay as they are now. This should tend to give a bell shaped distribution of complexity rather than the current square one.

The postulated scoring is very low = 3, low = 4, medium = 5, high = 6 and very high = 7. This would retain the five scoring values but may alter the distribution of the scores. Detailed research would be needed to study the effects over a large number of projects.
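
A minimal sketch of the postulated extension, assuming for illustration the current External Output band limits (the function and constant names are not part of the proposal):

    # Current three-level classification for an External Output.
    # Rows = FTR band 1..3 (0-1 / 2-3 / 4+), columns = DET band 1..3 (1-5 / 6-19 / 20+).
    CURRENT = [["low", "low", "medium"],
               ["low", "medium", "high"],
               ["medium", "high", "high"]]

    # Postulated five scoring values.
    POSTULATED_FPS = {"very low": 3, "low": 4, "medium": 5, "high": 6, "very high": 7}

    def extended_complexity(ftr_band, det_band):
        """Cells 1/1 and 3/3 change; every other cell keeps its current level."""
        if (ftr_band, det_band) == (1, 1):
            return "very low"
        if (ftr_band, det_band) == (3, 3):
            return "very high"
        return CURRENT[ftr_band - 1][det_band - 1]

    # The two reports from the example above: 4 FTRs with 6 DETs (band 3/2)
    # remains high, while 4 FTRs with over 100 DETs (band 3/3) becomes very high.
    print(POSTULATED_FPS[extended_complexity(3, 2)])   # high      -> 6 FPs
    print(POSTULATED_FPS[extended_complexity(3, 3)])   # very high -> 7 FPs
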

2.4. Internal Complexity

Recent experience with actuarial systems and with systems operating regulated pricing mechanisms in the United Kingdom's utility industry has once again raised the issue of internal complexity.

There are currently two alternative methods of dealing with this problem: either using General Systems Characteristic 9 and raising the function point count by 1%, or moving away from function points altogether and using feature points instead. A 1% increase in the system size is not satisfactory for small systems with a heavy algorithmic content. Feature points are not entirely satisfactory because, whilst similar, they are not the same as function points; a portfolio of algorithm-rich and algorithm-sparse systems would need to use a mixture of function and feature points.

Trials are in place at three client sites which accept that algorithm-rich systems will have lower productivity rates because outputs are more difficult to produce. The lower productivity is explained by using an algorithm density factor, which measures the average number of equations within each algorithm and compares this ratio to the number of function points. Correlations of around 0.8 are being achieved between productivity and size. No meaningful results have yet been produced for the algorithm density factor.
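
The paper gives only the definition of the factor; a minimal sketch of one possible form of the calculation (the function name, variable names and example figures are hypothetical) might be:

    def algorithm_density(equations_per_algorithm, function_points):
        """Average number of equations per algorithm, expressed per function point.
        One possible reading of the definition above; not a published formula."""
        if not equations_per_algorithm or function_points == 0:
            return 0.0
        average_equations = sum(equations_per_algorithm) / len(equations_per_algorithm)
        return average_equations / function_points

    # Hypothetical 250 FP pricing system with four algorithms of 12, 8, 20 and 4 equations.
    print(algorithm_density([12, 8, 20, 4], 250))   # 0.044
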

It should be noted that the fields of nuclear physics and chaos theory have been carrying out investigations into the area of internal complexity (see Murray Gell-Mann, "The Quark and the Jaguar"; James Gleick, "Chaos"; M. Mitchell Waldrop, "Complexity"; and Stuart A. Kauffman, "The Origins of Order"). These may prove relevant, but the author does not feel competent to comment.

2.5. Address DETs

Addresses are currently counted as one DET if they are defined as having n lines of m characters, or as individual DETs if the address is defined at a lower level (e.g. street, district, town, county, postcode, country). If a system inputs addresses at the lower level but outputs them as n lines, how should they be counted?

The current rules would indicate that the decision on counting rests on how the address is processed (i.e. using the above example, 6 DETs when input but 1 DET when output). Differences between counts of the same attributes based on programming convenience seem wrong. The suggested solution is to count the DETs of an address based on the DETs held on the ILF.

3. Logical/Physical Application Boundary

In the context of this paper logical applications are abstract requirements expressed by the customer whereas a physical application is the solution to those requirements.

There are a number of business functions which do not engender function points because nothing apparently crosses the application boundary, for example interest calculations on account balances at a specific point in time. Certainly the usual implementation is to monitor the system clock and, at the designated moment, trigger the calculations.

When counting function points many practitioners regard the system clock as being part of the application. However, the clock and the associated timing constraints exist in the absence of any computer-implemented solution, so they belong outside the application boundary. The time-triggered business functions should therefore be counted as External Inputs with one or more DETs (date?, time?, error message) and file references as necessary to perform and record the calculations.
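
For illustration, a hedged scoring of the interest-calculation trigger under this reading (the particular DETs and file references are assumptions, as is the use of the standard Release 4.0 EI matrix):

    EI_FPS = {"low": 3, "medium": 4, "high": 6}

    def ei_complexity(dets, ftrs):
        # External Input bands: DETs 1-4 / 5-15 / 16+, FTRs 0-1 / 2 / 3+.
        matrix = [["low", "low", "medium"],
                  ["low", "medium", "high"],
                  ["medium", "high", "high"]]
        det_band = 0 if dets <= 4 else (1 if dets <= 15 else 2)
        ftr_band = 0 if ftrs <= 1 else (1 if ftrs == 2 else 2)
        return matrix[ftr_band][det_band]

    # Assumed DETs: calculation date, error message.  Assumed FTRs: Account, Interest Rate.
    cx = ei_complexity(dets=2, ftrs=2)
    print(cx, EI_FPS[cx], "FPs")   # low, 3 FPs rather than nothing being counted
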

4. Data Functions

4.1. Mapping Entity Types to External Interface Files

External Interface Files present an anomaly when interpreting the counting rules. When it is decided to implement a requirement for a Product entity with DETs of product code and product name by using a Product ILF on another system, there is no anomaly provided the ILF on the other system has the same structure of 1 RET and 2 DETs. However, it is possible that the ILF on the other system has a different structure (e.g. 6 RETs and 99 DETs).

The proposed clarification of the rules is that the EIF should be counted as this application's requirements dictate. Use of the ILF on the other system, versus building this system's own product file, is a productivity and data integrity issue.

4.2 Link Entity Types

When counting the ILFs within a system, link entity types (i.e. entity types expressing a relationship between two other entity types but holding nothing about that relationship) are ignored. Therefore, when counting files referenced, they are rightly ignored again. Should a change be made to a link entity type that adds information about the relationship (e.g. an actioned date), the link entity type becomes an ILF. It is reasonable to count this new ILF as a referenced file in the function that necessitated the change to the link entity, but surely it is not reasonable to add complexity to all the other functions in the system that still access the entity only in its capacity as a link entity type.

The proposed solution is a rule change stating that attributes other than keys and foreign keys must be accessed before an ILF can be counted as a file referenced.

An alternative solution is a rule change to remove the necessity for the user to be aware of the ILF, and then to count all link entity types as ILFs. This would also overcome an internal complexity problem: whether an application has six unrelated entity types or six entity types joined by n link entity types, it still has only six ILFs, whatever the complexity of the relationships.

4.3. Attributive Entity Types

The examples in the Counting Practices Manual show that whether an attributive entity type (e.g. the order line entity in an order -< order line structure) is counted as an ILF or as a RET within an ILF depends on decisions made during the input/output dialogue design. These decisions are about the ease of use of the system, not about the logical structuring of the data stored within the system.

The proposed solution is to adopt the Codd/Date normalisation rules and to count all entity types at third normal form as either ILFs or EIFs, dependent upon the existing selection criteria. Investigations into the existing weights would be needed to deal with instances where 1 ILF with 6 RETs becomes 6 ILFs.
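
A hedged illustration of why the weights would need revisiting, assuming the Release 4.0 ILF matrix and weights (the DET counts used are hypothetical):

    ILF_FPS = {"low": 7, "medium": 10, "high": 15}

    def ilf_complexity(rets, dets):
        # Internal Logical File bands: RETs 1 / 2-5 / 6+, DETs 1-19 / 20-50 / 51+.
        matrix = [["low", "low", "medium"],
                  ["low", "medium", "high"],
                  ["medium", "high", "high"]]
        ret_band = 0 if rets == 1 else (1 if rets <= 5 else 2)
        det_band = 0 if dets <= 19 else (1 if dets <= 50 else 2)
        return matrix[ret_band][det_band]

    # One ILF of 6 RETs and 30 DETs versus six third-normal-form ILFs of 1 RET and 5 DETs each.
    as_one_ilf = ILF_FPS[ilf_complexity(6, 30)]        # high    -> 15 FPs
    as_six_ilfs = 6 * ILF_FPS[ilf_complexity(1, 5)]    # 6 x low -> 42 FPs
    print(as_one_ilf, as_six_ilfs)
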

5. Summary

A number of anomalies within the existing counting practices have been identified and solutions proposed. It is not contended that this is an exhaustive list of problems.

It is accepted that some of the suggestions are more radical than others and therefore require more experimentation and consultation before implementation.

The suggestions are offered in the spirit of trying to improve a well-tried and respected technique.

The overall objective of the suggestions is to make counts more consistent, a more accurate reflection of system size, easier to perform and easier to sell to a sceptical audience (e.g. the 90% of the IT community who do not measure size).
