  • IPC Maturity Model

    TBH - Project Delivery Experts
  • Introduction

    The following questions are a sample of our Integrated Project Controls Maturity Assessment (IPCMA) tool, which helps arrive at an overall score (1 being very low maturity and 5 being industry best practice) for schedule, cost, risk and contingency management.

    This tool quantifies the current state (As-Is) and measures it against the required or agreed level of maturity for each individual criterion, based on the required future state (To-Be). This sample gives you the opportunity to consider your current and desired future-state ratings for each included criterion.
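
    As a purely illustrative sketch (the criteria and ratings below are hypothetical, not taken from any real assessment), the gap analysis behind the tool reduces to comparing each criterion's As-Is rating with its agreed To-Be rating and summarising by criterion and overall:

        # Minimal sketch of the As-Is vs To-Be gap calculation (hypothetical ratings on the 1-5 scale).
        from statistics import mean

        ratings = {
            # (section, criterion): (as_is, to_be)
            ("Schedule", "Basis of Schedule"): (2, 4),
            ("Schedule", "Schedule Risk Integration"): (3, 5),
            ("Cost", "Historical Cost Information"): (2, 3),
            ("Risk", "Risk Data Analytics"): (1, 4),
        }

        for (section, criterion), (as_is, to_be) in ratings.items():
            print(f"{section}: {criterion} - As-Is {as_is}, To-Be {to_be}, gap {to_be - as_is}")

        print(f"Overall As-Is {mean(r[0] for r in ratings.values()):.1f} "
              f"vs To-Be {mean(r[1] for r in ratings.values()):.1f}")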

  • Instructions

    This sample assessment includes five sample questions per section, selected from the complete toolkit’s criteria. The form is divided into four sections:

    1. Schedule
    2. Cost
    3. Risk, and
    4. Contingency.

    You can choose which sections you wish to complete, and can select “N/A” if any of the criteria are not relevant to your situation.

    Each criterion has a description for a rating of 1 (least mature) through to 5 (Industry Best Practice). You will be asked to select your current and ideal future rating for each criterion. Keep in mind that the required future state may not always be a rating of 5.

    If viewing the form on a mobile device, we recommend using landscape orientation.

  • Contents: Schedule Maturity Metrics

  • 1a. Basis of Schedule

    1b. Schedule Basis of Estimate

    1c. Critical Path and Float Management

    1d. Schedule Risk Integration

    1e. Schedule Change Control and Management

  • 1a. Basis of Schedule

  • Level 1 - There is no basis of schedule developed.

    Level 2 - No formal Basis of Schedule has been developed. Information on the key elements and assumptions of the schedule is available in various other documents.

    Level 3 - A formal Basis of Schedule (BoS) has been developed but is not updated to reflect the latest schedule changes. Not all information is covered in the BoS.

    Level 4 - A formal BoS has been developed and is being maintained. The BoS covers the essential elements such as: scope of the work, deliverables, WBS, key project dates, basis of planning estimates, critical path and schedule contingency.

    Level 5 - A formal BoS has been developed covering everything in Level 4 and additionally covers: risks and opportunities, baseline changes, key project interfaces, near-critical path description, interface management and the integration process.

  •  
  • 1b. Schedule Basis of Estimate

  • Level 1 - Schedule estimates have been made without a basis of estimate or any methodologies or assumptions recorded.

    Level 2 - Schedule estimates have been made and partially recorded, but without a full record of methodologies or assumptions.

    Level 3 - A basic basis of estimate exists and all assumptions have been documented.

    Level 4 - A largely complete basis of estimate exists, with methodologies and assumptions fully recorded. The estimates are based on previous actual project data.

    Level 5 - A complete and concise basis of estimate exists which documents the entire project scope and considers risks and opportunities. Actual project data is used to inform future estimates and updates. All assumptions, methodologies and team members have been documented.

  •  
  • 1c. Critical Path and Float Management

    Level 1 - The schedule does not contain a clear critical path, and float is not calculated or considered in schedule analysis. The schedule may contain multiple constraints or activities with negative float.

    Level 2 - Some attempt has been made to identify the critical path of the schedule. Other activity floats have not been considered.

    Level 3 - The critical path of the schedule has been considered; however, the method has not been documented and near-critical paths have not been identified. Some activities may have negative float.

    Level 4 - The critical path of the schedule is identified and some attempt has been made to identify near critical paths.

    Level 5 - The schedule identifies critical and near critical paths. Float is correctly calculated and considered in schedule analysis to inform decisions. There are no activities with negative float. The method used to identify the critical path(s) is valid and documented.
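
    To make the float and critical-path terminology above concrete, the following minimal sketch runs a standard CPM forward/backward pass over a small, invented four-activity network (illustrative only; it is not TBH's scheduling tool):

        # Minimal CPM sketch: forward/backward pass, total float and critical path (hypothetical network).
        activities = {  # name: (duration, [predecessors]), listed in precedence order
            "A": (3, []),
            "B": (5, ["A"]),
            "C": (2, ["A"]),
            "D": (4, ["B", "C"]),
        }

        # Forward pass: early start / early finish
        es, ef = {}, {}
        for name, (dur, preds) in activities.items():
            es[name] = max((ef[p] for p in preds), default=0)
            ef[name] = es[name] + dur
        project_finish = max(ef.values())

        # Backward pass: late finish / late start (reverse of the precedence order above)
        lf, ls = {}, {}
        for name in reversed(list(activities)):
            dur, _ = activities[name]
            successors = [s for s, (_, preds) in activities.items() if name in preds]
            lf[name] = min((ls[s] for s in successors), default=project_finish)
            ls[name] = lf[name] - dur

        for name in activities:
            total_float = ls[name] - es[name]
            status = "critical" if total_float == 0 else f"float {total_float}"
            print(f"{name}: ES {es[name]}, EF {ef[name]}, LS {ls[name]}, LF {lf[name]} ({status})")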

  •  
  • 1d. Schedule Risk Integration

  • Level 1 - Schedule risks are not considered or integrated into the schedule.

    Level 2 - Schedule risks are considered and documented but confidence levels and likely quantitative impacts are not calculated or included in the schedule.

    Level 3 - Schedule risks are considered in a risk register including the likely impact and the response actions for each risk.

    Level 4 - Schedule risks are documented and integrated into the schedule along with their agreed controls. Schedule Risk Analysis techniques are applied to calculate schedule confidence which is applied as agreed contingency in the schedule.

    Level 5 - Contingent and inherent project risks, opportunities and controls are considered, documented and integrated in the schedule. Schedule Risk Analysis techniques are applied to calculate confidence in finish dates. Risks are updated in the schedule updating cycle. It is understood which schedule paths have the highest probability of influencing the schedule completion and which risks have the most influence on overall schedule variability.
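
    As a rough illustration of the Schedule Risk Analysis referred to in Levels 4 and 5, a Monte Carlo pass over a simple chain of activities can be sketched as below. The durations, risk event and probabilities are invented; in practice the analysis is run over the full schedule network in a dedicated SRA tool:

        # Minimal Monte Carlo sketch of schedule risk analysis (hypothetical durations and risk, in weeks).
        import random

        random.seed(1)
        N = 20_000
        finishes = []
        for _ in range(N):
            # Three sequential activities with three-point estimates: triangular(low, high, most likely)
            total = (random.triangular(4, 8, 5)
                     + random.triangular(6, 12, 8)
                     + random.triangular(3, 7, 4))
            if random.random() < 0.30:           # discrete risk: 30% chance of a 2-6 week delay
                total += random.triangular(2, 6, 3)
            finishes.append(total)

        finishes.sort()
        for p in (50, 80, 90):
            print(f"P{p} finish: {finishes[int(N * p / 100) - 1]:.1f} weeks")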

  •  
  • 1e. Schedule Change Control and Management

    Level 1 - No schedule change management plan has been developed. Schedule changes are ad-hoc and unrecorded. No record of changes to the baseline and scope of the project is available. Changes are made without formal approval through a change control process.

    Level 2 - Changes are identified and are being addressed in the schedule. No formal Change Control process is in place. Some level of documentation is available on baseline change management.

    Level 3 - A Schedule Change Control process has been developed and is being followed. Change Order Requests are issued, approved and recorded following the process.

    Level 4 - A formal Schedule Change Control process has been developed and is being followed. Change Order Requests are issued, approved and recorded in a timely manner following the process.

    Level 5 - A formal and comprehensive Schedule Change Control process has been developed and is being followed. Change Order Requests are issued, approved and recorded in a timely manner following the process. The Change Order Requests include a Time Impact Analysis in a contemporaneous manner. The process covers the entire life cycle of the project(s).

  •  
  • Contents: Cost Maturity Metrics

  • 2a. Historical Cost information

    2b. Cost variance causes and estimating mistakes are recorded after the end of each project

    2c. Risk Management (Cost)

    2d. The control process is integrated with production planning and planning process

    2e. Project Cost Accounting is integrated with performance measurement

  • 2a. Historical Cost information

  • Level 1 - Historical Cost information is not used in estimates.

    Level 2 - Historical actual cost information is used on an ad-hoc basis.

    Level 3 - Historical actual cost information is used in estimates.

    Level 4 - Historical information is interrogated for productivity and other factors on an ad-hoc basis before it is adopted into the cost estimate.

    Level 5 - Historical information is interrogated for productivity and other factors before it is adopted into the cost estimate.

  •  
  • 2b. Cost variance causes and estimating mistakes are recorded after the end of each project

  • Level 1 - Cost variances and estimating mistakes are not recorded after each project.

    Level 2 - Cost variances and estimating mistakes are recorded on an ad-hoc basis after each project.

    Level 3 - Cost variances and estimating mistakes are recorded after each project.

    Level 4 - Cost variances and estimating mistakes are recorded along with causes for variances after each project.

    Level 5 - Cost variances and estimating mistakes are recorded along with causes for variances after each project. The information is then used to update the benchmarks for future use.

  •  
  • 2c. Risk Management (Cost)

  • Level 1 - Risk management does not consider risk implications on cost.

    Level 2 - Risk management considers risk implications on cost on an ad-hoc basis.

    Level 3 - Risk management considers risk implications on cost.

    Level 4 - Risk management considers risk implications on cost and allows for P50 and P90 contingencies.

    Level 5 - Risk management considers risk implications on cost and allows for P50 and P90 contingencies. Governance is exercised over contingency draw-down. Where risks are no longer applicable to the project, the project contingency rolls back up into the program contingency for reallocation.
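
    For readers unfamiliar with the P50/P90 terminology in Levels 4 and 5, the sketch below shows how those contingency values fall out of a Monte Carlo cost model. The base estimate, ranges and risk event are invented; a real assessment would model the full risk register, escalation and correlations:

        # Minimal sketch of P50/P90 cost contingency from a Monte Carlo model (hypothetical figures, $m).
        import random

        random.seed(2)
        BASE_ESTIMATE = 100.0
        N = 20_000
        outturns = []
        for _ in range(N):
            cost = BASE_ESTIMATE * random.triangular(0.95, 1.15, 1.02)   # estimating uncertainty
            if random.random() < 0.25:                                   # 25% chance of a $5-20m risk event
                cost += random.triangular(5, 20, 8)
            outturns.append(cost)

        outturns.sort()
        p50 = outturns[int(N * 0.50) - 1]
        p90 = outturns[int(N * 0.90) - 1]
        print(f"P50 contingency: ${p50 - BASE_ESTIMATE:.1f}m, P90 contingency: ${p90 - BASE_ESTIMATE:.1f}m")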

  •  
  • 2d. The control process is integrated with production planning and planning process

    Level 1 - Cost planning process is not integrated with cost control process.

    Level 2 - Cost planning process is integrated with cost control on an ad-hoc basis.

    Level 3 - Cost planning process is integrated with cost control process only at the beginning and end of the project.

    Level 4 - Cost planning process is integrated with cost control process throughout the project.

    Level 5 - Cost planning process is integrated with cost control process throughout the project. An iterative process of cost data transfer between cost planning and cost control ensures that cost forecasting is accurate and benchmarks are updated in real project time.

  •  
  • 2e. Project Cost Accounting is integrated with performance measurement

  • Level 1 - Project cost accounting is not integrated with performance measurement.

    Level 2 - Project cost accounting is integrated with performance measurement on an ad-hoc basis.

    Level 3 - Project cost accounting is integrated with performance measurement at the highest level.

    Level 4 - Project cost accounting is integrated with performance measurement. All costs are charged against a CBS which is mapped to a WBS. Project performance in the schedule can be linked to costs.

    Level 5 - Project cost accounting is integrated with performance measurement. All costs are charged against a CBS which is mapped to a WBS. Project performance in the schedule can be linked to costs. Root cause analysis is in place and operational. Forecasting is accurate and operational.
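
    As an illustration of what linking CBS/WBS-coded costs to schedule performance enables (the WBS elements and values below are hypothetical), a minimal earned-value calculation looks like this:

        # Minimal earned-value sketch: WBS-coded budgets, progress and actuals (hypothetical data).
        wbs = {
            # code: (budget_at_completion, percent_complete, actual_cost)
            "1.1 Earthworks": (500_000, 1.00, 540_000),
            "1.2 Structures": (800_000, 0.60, 470_000),
            "1.3 Fit-out":    (300_000, 0.10, 45_000),
        }
        planned_complete = {"1.1 Earthworks": 1.00, "1.2 Structures": 0.75, "1.3 Fit-out": 0.20}

        ev = sum(bac * pc for bac, pc, _ in wbs.values())                            # earned value
        ac = sum(actual for _, _, actual in wbs.values())                            # actual cost
        pv = sum(bac * planned_complete[code] for code, (bac, _, _) in wbs.items())  # planned value

        print(f"CPI = {ev / ac:.2f} (cost performance), SPI = {ev / pv:.2f} (schedule performance)")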

  •  
  • Contents: Risk Maturity Metrics

  • 3a. Allocation of resources for facilitating risk management activities

    3b. Standardised protocols, i.e. tools, techniques and templates

    3c. Centralised and standardised risk register - as a tool

    3d. Risk portfolio view of risk interconnectivity

    3e. Risk Data Analytics

  • 3a. Allocation of resources for facilitating risk management activities

  • Level 1 - Absence of dedicated, competent resources for managing risk, e.g. Champions, Controls Team, etc.

    Level 2 - A process to allocate resources across units/projects exists, but is inactive or obsolete. Allocated resources lack the interest or the competency to undertake risk management activities.

    Level 3 - Resource allocation arrangements exist, but are not consistently followed across the board. Allocated resources have the interest and competency to undertake risk management activities, but not across the board.

    Level 4 - Resource allocation arrangements exist, with minor inconsistencies in practice across the different units/projects. Allocated resources have the interest and competency to undertake risk management activities, with minor inconsistencies.

    Level 5 - Dedicated, competent resources are allocated for managing risk, e.g. Champions, Controls Team, etc.

  •  
  • 3b. Standardised protocols, i.e. tools, techniques and templates

  • Level 1 - Absence of standardised systems, risk breakdown structure, risk reporting and risk assessment tools and techniques.

    Level 2 - Tools established, but vague or generic. Tools are not proportionate to the majority of projects/areas, and are discarded or replaced by individual initiatives.

    Level 3 - Tools established but inconsistently followed across the units/projects. Tools are not proportionate to some projects/areas, and are often discarded or replaced by individual initiatives.

    Level 4 - Tools established, but minor inconsistencies in practice are detected across the units/projects, or the requirements of a unit or project cannot be met.

    Level 5 - Standardised systems, risk breakdown structure, risk reporting and risk assessment tools and techniques. Historical records are collected.

  •  
  • 3c. Centralised and standardised risk register - as a tool

  • Level 1 - Absence of a centralised and standardised risk register that encompasses the minimum elements of the event description, causes and consequences, current controls, future treatments, scores, accountabilities and time frames for managing risks. The majority of areas/projects haven't populated a risk register.

    Level 2 - Multiple versions of registers exist that do not follow the same structure or format, e.g. RIMS, Excel spreadsheets. Registers are incomplete or suffer severe quality issues: missing controls, accountability for managing risks, or responsibility for treatment actions, etc. Multiple areas/projects have no access to a risk register.

    Level 3 - Multiple versions of registers exist that follow the same structure or format, e.g. RIMS, Excel spreadsheets. Registers are incomplete or suffer quality issues that do not affect their integrity. A few areas/projects have no access to a risk register.

    Level 4 - One version per unit/project with minor compliance or quality issues. A negligible number of areas/projects haven't populated a risk register.

    Level 5 - A centralised and standardised risk register that encompasses the minimum elements of the event description, causes and consequences, current controls, future treatments, scores, accountabilities and time frames for managing risks.

  •  
  • 3d. Risk portfolio view of risk interconnectivity

  • Level 1 - Absence of a standardised view of how risks in one part of the organisation can relate to risks occurring elsewhere; these links and relationships need to be managed just as much as individual risks in isolation.

    Level 2 - A process exists, but is abandoned, irrelevant or ineffective. Stakeholders are not involved in the analysis. A portfolio risk view is rarely taken into consideration when assessing risk.

    Level 3 - A process exists and is partially effective, but is not being consistently applied throughout the unit/projects. A portfolio risk view is often taken into consideration when assessing risk.

    Level 4 - A process exists that is effective and applicable to the majority of units/projects. The system does not apply to a few areas/projects due to their special requirements. A portfolio risk view is usually taken into consideration when assessing risk.

    Level 5 - A standardised view exists of how risks in one part of the organisation relate to risks occurring elsewhere; these links and relationships are managed just as much as individual risks in isolation.

  •  
  • 3e. Risk Data Analytics

  • Level 1 - Absence of a systematic and standardised protocol for analysing risk information, e.g. data reconciliation, correlated impact assessment (portfolio view), etc. Risk outputs are used for reporting purposes only, in a check-the-box style.

    Level 2 - A data analytics process exists, but is abandoned, irrelevant or ineffective. Risk analysis occurs on an ad-hoc basis. Risk analysis lacks deep insights into the information that supports decision making, e.g. risk profile, common/repeated risk causes, high-impact areas and areas lacking action.

    Level 3 - A data analytics process exists and is partially effective, but is not being consistently applied throughout the unit/projects. Risk analysis often occurs systematically. Risk analysis lacks deep insights into the information that supports decision making, e.g. risk profile, common/repeated risk causes, high-impact areas and areas lacking action.

    Level 4 - A data analytics process exists that is effective and applicable to the majority of areas/projects. The system does not apply to a few areas/projects due to their special requirements. Risk analysis often occurs systematically. Risk analysis lacks deep insights into the information that supports decision making, e.g. risk profile, common/repeated risk causes, high-impact areas and areas lacking action.

    Level 5 - A systematic and standardised protocol for analysing risk information, e.g. data reconciliation, correlated impact assessment (program/portfolio view), etc.
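
    As a small example of the deeper insight the higher levels call for, as opposed to check-the-box reporting (the register rows below are invented), even a simple aggregation of a risk register export can surface repeated causes and high-impact areas:

        # Minimal sketch of risk register analytics (hypothetical register rows).
        from collections import Counter

        register = [
            # (area, cause, impact_rating 1-5)
            ("Station A", "Late utility relocation", 4),
            ("Station B", "Late utility relocation", 5),
            ("Tunnel",    "Ground conditions",       5),
            ("Station A", "Design change",           3),
            ("Tunnel",    "Late utility relocation", 2),
        ]

        common_causes = Counter(cause for _, cause, _ in register)
        high_impact_areas = Counter(area for area, _, impact in register if impact >= 4)

        print("Most common causes:", common_causes.most_common(2))
        print("Areas with high-impact risks:", high_impact_areas.most_common())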

  •  
  • Contents: Contingency Maturity Metrics

  • 4a. QRA (Quantitative Risk Assessment) Exercise Planning

    4b. QRA (Quantitative Risk Assessment) Exercise Output Validations

    4c. Horizontal Allocation

    4d. Application of Cost Contingency to the Project Estimate

    4e. Contingency Reallocation/Transfer

  • 4a. QRA (Quantitative Risk Assessment) Exercise Planning

  • Level 1 - Absence of systematic and standardised protocols for planning for QRA, risk assessment workshops, collection of the base schedule and base estimate and any relevant historical data.

    Level 2 - Standard protocols for risk assessment workshops, collection and preparation of the base schedule and base estimate exist, but are abandoned, irrelevant or ineffective. The model rarely includes a sufficient level of detail and suffers from severe quality issues.

    Level 3 - Standard protocols for risk assessment workshops, collection of the base schedule and base estimate exist, but are not being systematically used across the unit/projects. The model usually lacks a sufficient level of detail and suffers from moderate quality issues.

    Level 4 - Standard protocols for risk assessment workshops, collection of the base schedule and base estimate exist and are being systematically used across the unit/ projects with minor outliers. The model often lacks a sufficient level of detail and suffers from minor quality issues.

    Level 5 - Systematic and standardised protocols for planning for QRA, risk assessment workshops, collection of the base schedule and base estimate and any relevant historical data exist and are being used with no quality issues.

  •  
  • 4b. QRA (Quantitative Risk Assessment) Exercise Output Validations

  • Level 1 - Absence of systematic and standardised protocols for validating the model outputs, including benchmarking, peer review and relevant stakeholder authorisation.

    Level 2 - Standardised protocols for validating the model outputs, including benchmarking, peer review and relevant stakeholder authorisation, exist but are irrelevant or ineffective.

    Level 3 - Standardised protocols for validating the model outputs, including benchmarking, peer review and relevant stakeholder authorisation, exist but are not being systematically used across the unit/projects.

    Level 4 - Standardised protocols for validating the model outputs, including benchmarking, peer review and relevant stakeholder authorisation, exist and are being systematically used across the unit/projects with minor exceptions.

    Level 5 - Systematic and standardised protocols for validating the model outputs, including benchmarking, peer review and relevant stakeholder authorisation, are being used effectively.
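
    One simple form the benchmarking step can take (the project types and ranges below are invented; real benchmarks come from an organisation's historical project data) is checking the modelled P90 contingency against a historical band for the project type:

        # Minimal sketch of benchmarking a QRA output against historical ranges (hypothetical benchmarks).
        historical_band = {           # project type: (low, high) P90 contingency as a share of base estimate
            "road": (0.10, 0.25),
            "rail": (0.15, 0.35),
            "building": (0.08, 0.20),
        }

        def check_against_benchmark(project_type, base_estimate, p90_contingency):
            """Flag a modelled P90 contingency that sits outside the historical band."""
            low, high = historical_band[project_type]
            share = p90_contingency / base_estimate
            if low <= share <= high:
                return f"{share:.0%} is within the historical {low:.0%}-{high:.0%} band"
            return f"{share:.0%} is outside the historical {low:.0%}-{high:.0%} band - review the model inputs"

        print(check_against_benchmark("rail", base_estimate=250.0, p90_contingency=45.0))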

  •  
  • 4c. Horizontal Allocation

  • Level 1 - There is an absence of standardised protocols for allocating the contingency horizontally to the project's different stages, i.e. Initiation, Strategic Assessment, Concept, Delivery Readiness, Delivery and Finalisation.

    Level 2 - Standardised arrangements for allocating contingency horizontally exist, but are irrelevant or ineffective.

    Level 3 - Standardised arrangements for allocating contingency horizontally exist, but are not being systematically used across the unit/projects.

    Level 4 - Standardised arrangements for allocating contingency horizontally exist, and are being systematically used across the unit/ projects with minor exceptions.

    Level 5 - Standardised protocols exist and are being used effectively for allocating the contingency horizontally to the project's different stages, i.e. Initiation, Strategic Assessment, Concept, Delivery Readiness, Delivery and Finalisation.

  •  
  • 4d. Application of Cost Contingency to the Project Estimate

  • Level 1 - There is an absence of systematic protocols for incorporating, approving and managing cost contingency activities in the project cost estimate.

    Level 2 - Standardised arrangements for incorporating, approving and managing cost contingency activities exist, but are irrelevant or ineffective.

    Level 3 - Standardised arrangements for incorporating, approving and managing cost contingency activities exist, but are not being systematically used across the unit/projects.

    Level 4 - Standardised arrangements for incorporating, approving and managing cost contingency activities exist, and are being systematically used across the unit/projects with minor exceptions.

    Level 5 - Systematic protocols exist and are being used effectively for incorporating, approving and managing cost contingency activities in the project cost estimate.

  •  
  • 4e. Contingency Reallocation/Transfer

  • Level 1 - There is an absence of a systematic protocol for reallocation/transfer of time and cost contingencies.

    Level 2 - Standardised arrangements for reallocation/transfer of time and cost contingencies exist, but are irrelevant or ineffective.

    Level 3 - Standardised arrangements for reallocation/transfer of time and cost contingencies exist, but are not being systematically used across the unit/ projects.

    Level 4 - Standardised arrangements for reallocation/transfer of time and cost contingencies exist, and are being systematically used across the unit/projects, with minor exceptions.

    Level 5 - Systematic protocols exist for reallocation/transfer of time and cost contingencies.

  •  
  • Conclusion

  • This was a brief sample of the criteria that TBH uses to conduct a full IPC Maturity Assessment. Using the link below, you can download a document that further explains TBH’s maturity assessment methodology and includes some common gaps and recommendations.

    Download Report Here

     

    If you would like a more detailed assessment including bespoke recommendations, please contact:

    General queries: ipc@tbhint.com

    TBH IPC Director: Rob Hammond (rob.hammond@tbhint.com)

    NSW: Ali Dibaj (ali.dibaj@tbhint.com)

    ACT: Travis Harvey (travis.harvey@tbhint.com)

    QLD: Moataz Mahmoud (moataz.mahmoud@tbhint.com)

    VIC and SA: Robbie Breschkin (robbie.breschkin@tbhint.com)

    WA: Mena Messiha (mena.messiha@tbhint.com)

     
