Evaluating Information Technology Investments
In fiscal year 1996, executive agencies expect to obligate more than $26 billion for information technology (IT) investments and operations. This IT spending represents a critical investment of public tax dollars affecting virtually every government function. Creating a government that works better and costs less demands high returns on IT investments and reduced systems development risks.
This guide sets out an analytical framework for linking IT investment decisions to strategic objectives and business plans in Federal organizations; it supplements existing OMB policies and procedures. OMB's objective is to provide information on 1) what OMB expects from agencies and 2) how agencies can reduce the risk and maximize the net benefits from their IT investments. The guide was written with assistance from GAO and is based on strategic information management practices in successful organizations. End Note 1, End Note 2
This guide describes the critical success elements and key phases that should be a part of a mature IT investment process. The IT investment process an agency designs should match the culture and organizational structure of the agency. The overriding objective is that senior managers be able to systematically maximize the benefits of IT investments through use of the IT investment process.
The investment process, depicted in Figure 1 below, consists of three phases: selection, control, and evaluation. As Figure 1 indicates, the three phases occur in a continuous cycle. Information flows freely among the phases, with one exception: the evaluation phase feeds information only to the selection phase, where it is used to verify or modify the criteria used during selection.
-- create a portfolio of IT project investments that maximizes mission performance, using a standard set of criteria for consistent comparison of projects.
-- measure ongoing IT projects against their projected costs, schedule, and benefits and take action to continue, modify, or cancel them.
-- 1) determine the actual return on an implemented investment against the agency's mission and 2) adapt the existing process to reflect "lessons learned".
The control and evaluation phases are conducted throughout the year and their results are fed into the selection phase, which in turn feeds back to the control and evaluation phases.
This guide begins by identifying three attributes that characterize successful investment processes in best practice organizations. The guide is then organized by the phases of the investment process. Within each phase, the guide describes: 1) the steps involved, 2) applicable management techniques and tools, 3) key questions to ask, and 4) examples from best practice organizations.
While each phase of the investment process has its own requirements for successful implementation, there are some overall organizational attributes that are critical to successful investment evaluation. These shared, critical attributes are: senior management attention, overall mission focus, and a comprehensive portfolio approach to IT investment.
Agency processes should include the following elements:
Senior program managers, with authority to make key business and funding decisions on IT projects, are continuously involved in the process.
A disciplined and structured management forum is used to make IT investment decisions, with the authority to approve, cancel, or delay projects, mitigate risks, and validate expected returns.
Program, Information Resource Management (IRM), and financial managers have clearly defined roles, responsibilities, and accountability for the success of IT projects. Mechanisms to achieve this include establishing service agreements between providers (IRM/Chief Financial Officer (CFO)) and consumers (line management) of information technology, incorporating IRM/CFO issues and requirements into program plans, and routinely involving the IRM/CFO offices in operational decisions.
Agency processes should:
link strategic planning to the agency's mission goals and customer needs as required by the Government Performance and Results Act (GPRA) of 1993 (Public Law 103-62). This includes developing long-term general goals, setting specific annual performance targets, and annually evaluating actual performance against these targets.
develop mission-related IT measures that link the IRM strategic plan with the agency strategic plan. End Note 3 For example, mission goals should be translated into objective, results-oriented measures of performance, both quantitative and qualitative, which can form the basis for measuring the impact of information technology investments.
determine whether the function to be supported by the investment should be performed in the private sector rather than by an agency of the Federal government.
determine whether the agency proposing to perform the function is the most appropriate agency.
examine the work processes involved to ensure they are efficient, effective, and will take full advantage of the proposed automation.
use mission benefit, not project completion on time and within budget, as an important measure of success for any IT project.
identify all major existing or planned information systems and define their relationship to one another and to the agency's mission.
define a portfolio that includes IT projects in every phase (initial concept, new, ongoing, or fully operational) End Note 4 and for every type (mission critical, cross-functional, infrastructure, administrative, and R&D) End Note 5 of IT system.
develop levels of review, documentation requirements, and selection criteria appropriate to the phase and type of IT system.
define dollar thresholds that can be used to channel projects to the appropriate agency decision levels to best accommodate organization-wide versus unit-specific impact. Most important is the use of a consistent set of investment decision practices throughout the agency. Some best practice organizations submit projects to thorough investment reviews when costs exceed between 0.5 and 2 percent of the organization's IT budget.
develop criteria for identifying projects of a critical nature that fall below the dollar threshold but should be included in the investment review process.
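The threshold mechanism described above can be sketched in a few lines; a minimal illustration, assuming a 1 percent fraction as the midpoint of the 0.5-2 percent range cited earlier (the function names and figures are hypothetical, not prescribed values):

```python
# Hypothetical threshold check; the 1% fraction is an assumed midpoint
# of the 0.5-2 percent range used by some best practice organizations.

def review_threshold(it_budget, fraction=0.01):
    """Dollar cost above which a project receives a thorough investment review."""
    return fraction * it_budget

def needs_full_review(project_cost, it_budget, critical=False):
    # Projects of a critical nature are reviewed even below the threshold.
    return critical or project_cost >= review_threshold(it_budget)

print(needs_full_review(600_000, 50_000_000))                  # above the 1% threshold
print(needs_full_review(100_000, 50_000_000))                  # below the threshold
print(needs_full_review(100_000, 50_000_000, critical=True))   # critical anyway
```

The `critical` flag corresponds to the criteria, described above, for pulling below-threshold projects into the investment review process.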
Each attribute contributes to properly implementing the three phases of the investment process. Senior managers and those helping to install the investment process in each agency should keep these elements in mind during review of the details of the selection, control, and evaluation phases.
The selection phase creates a portfolio of IT project investments designed to improve overall organizational performance. This phase combines rigorous technical evaluations of project proposals with executive management business knowledge, direction, and priorities. Key to this phase is the use of uniform, consistent decision criteria that will allow agency executives to make comparisons of costs, benefits, risks, and returns across project proposals. The four step selection process is:
Step 1 -- screen IT project proposals;
Step 2 -- analyze risks, benefits, and costs;
Step 3 -- prioritize projects based on risk and return; and
Step 4 -- determine the right mix of projects and make the final cut.
Executive management team that makes funding decisions based upon comparisons and tradeoffs among competing project proposals, especially for those projects expected to have organization-wide impact.
Documented and defined decision criteria that examine expected return on investment (ROI), technical risks, improvement to program effectiveness, customer impact, project size and scope.
Pre-defined thresholds and authority levels that recognize the need to channel project evaluations and decisions to appropriate management levels to accommodate unit-specific versus agency level needs.
Minimum acceptable ROI hurdle rates for project approvals -- applicable to all organizational levels -- to minimize risks and increase returns.
Risk assessments that expose potential technical and managerial weaknesses that could impair project success.
IT proposals should be screened to determine the appropriate level of review as well as their relevance and feasibility.
A mature investment screening process should prescribe the amount of documentation and the level of analytical rigor required, depending on the project's type (i.e., mission critical, infrastructure, etc.) and phase (i.e., initial concept, new, ongoing, and operational). For instance, the questions and documentation senior managers apply to an initial concept proposal would differ from those required for a project that is ready to be awarded and implemented.
Example: One best practice company required more documentation and greater analytical rigor if a proposal would replace or change an operational system vital to keeping the company running or if the concept matched a company-wide strategic goal. Lower-impact proposals that would only affect an office or had a non-strategic objective were not scrutinized in as much detail.
If a project proposal does not meet all the essential requirements for its type and phase, it should be returned to the originating business unit sponsor with an indication of the problems, issues, or documentation that need further work or clarification.
Following are some of the questions that can be used to screen projects for relevance to the agency's mission and for technical and organizational feasibility. If the answer to any of these questions is no, the project should not receive further consideration and should be returned to the originating unit. Projects that meet these criteria continue to Step 2, where more rigorous analysis is performed.
At this point, the proposals should be reduced to those with the highest potential to support the agency's critical mission and/or operations.
A detailed evaluation of each proposal's supporting analyses should be conducted and summarized so that senior management can begin examining tradeoffs among competing proposals that are to occur in the next step. At this stage, a technical review team should evaluate the soundness of the project's benefit-cost and risk analyses. In particular, the review team should examine how the project is expected to improve program or operational performance and the performance measures that will be used to monitor expected versus actual results.
Example: One best practices organization required the project team to present not only the estimated return on investment (ROI), but also the specific assumptions underlying their analysis, why such assumptions are appropriate under these circumstances, and any differences from assumptions used to calculate ROI for comparable projects in the past.
Example: In another best practices organization, qualified staff reviewed and scored all projects using risk criteria before the projects were reviewed for approval by top managers. The top managers considered these risk scores in their decision making process. Risk elements were reported in five categories: security, user and customer impact, system (project) impact, dollar impact, and complexity. Within each category, applicable elements were given a numeric score from 1 (lowest risk) to 5 (highest risk). Under security, for example, elements included the classification levels of information to be processed, how programs and files were protected by security software, and what access controls were to be in place. Total scores from the individual elements in each category were weighted, based upon an agreed upon formula, to reflect the organization's priorities. Weighted scores were included in the top managers' review packages.
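The weighting scheme in the example above can be sketched as follows. This is a minimal illustration only: the category weights and element scores are hypothetical, since the organization's actual weighting formula is not given.

```python
# Hypothetical weighted risk-scoring sketch; the weights and scores are
# illustrative, not the organization's actual formula.

RISK_WEIGHTS = {            # weights reflect the organization's priorities
    "security": 0.30,
    "user_customer_impact": 0.25,
    "system_impact": 0.20,
    "dollar_impact": 0.15,
    "complexity": 0.10,
}

def weighted_risk_score(element_scores):
    """Sum each category's 1 (lowest risk) to 5 (highest risk) element
    scores, then weight the category totals for the review package."""
    return sum(weight * sum(element_scores.get(category, []))
               for category, weight in RISK_WEIGHTS.items())

project = {
    "security": [3, 2, 4],          # e.g. classification, software protection, access controls
    "user_customer_impact": [2, 3],
    "system_impact": [4],
    "dollar_impact": [5],
    "complexity": [3, 3],
}
print(weighted_risk_score(project))   # weighted total risk score
```

The weighted total, computed the same way for every proposal, is what would appear in the top managers' review packages.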
During this step, IT projects are rigorously compared against one another to create a prioritized list of all investments under consideration.
After completing analysis, the agency should develop a ranked listing of information technology projects. This listing should use expected risks and benefits to identify candidate projects with the greatest chances of effectively and efficiently supporting key mission objectives within given budget constraints. End Note 7
One approach to devising a ranked listing of projects is to use a scoring mechanism that provides a range of values associated with project strengths and weaknesses for risk and return issues. Table 1, below, shows an example of how individual risk and return factors might be scored. This example is a hybrid table drawn from multiple best practices organizations. Higher scores are given to projects that meet or exceed positive aspects of the decision criteria. Additionally, in this example, weights have been attached to criteria to reflect their relative importance in the decision process. In order to ensure consistency, each of the decision criteria should have operational definitions based on quantitative or qualitative measures.
A scoring and ranking process such as the one depicted in Table 1 may be used more than once and in more than just this step to "winnow" the number of projects that will be considered by an executive decision-making body down to the best possible choice.
Such a ranking process might produce three groups of projects:
At the end of this step, senior managers should have a prioritized list of IT projects and proposals with supporting documentation and analysis.
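The scoring-and-ranking mechanism described in this step can be sketched as follows. The project names, scores, and weights are hypothetical; an agency's actual criteria, weights, and operational definitions would come from its own equivalent of Table 1.

```python
# Hypothetical risk/return scoring and ranking sketch; all figures are
# illustrative stand-ins for an agency's Table 1 criteria.

projects = [
    {"name": "Project A", "return_score": 8, "risk_score": 2},
    {"name": "Project B", "return_score": 9, "risk_score": 7},
    {"name": "Project C", "return_score": 6, "risk_score": 1},
]

def priority(project, return_weight=0.6, risk_weight=0.4):
    """Higher expected return raises priority; higher risk lowers it.
    The weights reflect the criteria's relative importance."""
    return (return_weight * project["return_score"]
            - risk_weight * project["risk_score"])

# The ranked listing used to "winnow" candidates for executive review.
ranked = sorted(projects, key=priority, reverse=True)
for p in ranked:
    print(p["name"], round(priority(p), 2))
```

Note that the highest raw return (Project B) does not rank first once its risk is weighed in, which is the point of scoring both dimensions before the executive body makes its final cut.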
During this step, an executive level decision-making body determines which projects will be funded based on the analyses completed in the previous steps.
Determining the right mix of projects to fund is ultimately a management decision that considers the technical soundness of projects, their contribution to mission needs, performance improvement priorities, and overall funding levels that will be allocated to information technology.
Senior management should consider the following balancing factors when arriving at a final resource allocation and project mix.
After consideration of all of the factors mentioned above, senior management should have enough information to make knowledgeable investment decisions. Senior management should also designate how often a project is to be reviewed, based on its level of risk, and any steps that the project team must take to mitigate that risk. For example, one best practices organization requires that senior management approve projects only after a review schedule has been established (e.g., once a month for high risk, once a quarter for lower risk) and specific requirements have been given to the project team to ensure that risks are mitigated (e.g., develop a risk management plan).
Project review schedules, risk mitigation plans and the cost-benefit plans from prior steps all feed directly into the next section of the investment process -- control.
While agencies select proposals once a year, the control phase is an ongoing activity to review new and ongoing projects, as well as operational systems. During the control phase, senior management regularly monitors the progress of ongoing IT projects against projected cost, schedule, performance, and delivered benefits. The frequency of the reviews may vary, but reviews should not wait for the annual budget preparation and deliberation process. How often and to what extent individual projects should be reviewed should have been established as the last step in the Selection phase. Rather than sidestepping the problems and concerns that emerge from unexpected risks, this phase reinforces management accountability by creating pre-arranged checkpoints for projects and forcing corrective action when necessary. If a project is late, over budget, or not being developed according to expectations, then senior management must decide whether to continue, modify, or cancel it. The steps in this phase are to:
Step 1 -- monitor projects/systems against projected costs, schedule, and performance; and
Step 2 -- take action to correct any deficiencies.
Before an organization can fully implement the control steps, uniform mechanisms for collecting, automating, and processing data on expected versus actual costs, schedules, and returns should be in place for all projects. End Note 8
Senior managers need to compare the preliminary results being achieved by a project against its projected costs, benefits and risks, and to identify actual or potential managerial, organizational, or technical problems.
Senior management should be able to judge whether a project is on track to achieve its projected mission benefits. The key is to use a set of performance measures consistently so that senior program managers are provided early warning of potential or actual problems. It is essential to refresh these measures as costs, benefits, and risks become better known to ensure the continued viability of an information system prior to and during implementation.
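As a minimal illustration of consistently comparing expected versus actual results, the sketch below flags projects whose cost or schedule variance exceeds a tolerance. The 10 percent tolerance and the field names are assumptions, not prescribed values.

```python
# Illustrative control-phase check; the tolerance and field names are
# hypothetical, not values prescribed by this guide.

def variance(actual, projected):
    """Fractional deviation of actual from projected (0.10 = 10% over)."""
    return (actual - projected) / projected

def review_flags(project, tolerance=0.10):
    """Return the warning flags a control review would raise."""
    flags = []
    if variance(project["actual_cost"], project["projected_cost"]) > tolerance:
        flags.append("cost overrun")
    if variance(project["actual_months"], project["projected_months"]) > tolerance:
        flags.append("schedule slip")
    return flags

status = review_flags({
    "projected_cost": 1_000_000, "actual_cost": 1_200_000,   # 20% over cost
    "projected_months": 12, "actual_months": 12,             # on schedule
})
print(status)
```

Applying the same check to every project at each pre-arranged checkpoint is what gives senior managers the early warning the text calls for.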
Examples of problems that could affect a project or system include 1) lack of input by program management into the requirements phase of a project, 2) a project that was intended to be cross-functional becomes stove-piped because other offices in the agency do not support it, 3) new requirements have been added, and 4) it is more difficult to use the technology than was anticipated.
Senior program managers in federal agencies often pay most of their attention to new projects and carry ongoing projects as necessary budget items. In best practice organizations, however, ongoing projects are reviewed continually along with new projects, and go/no-go decisions are made. No project should be allowed to continue indefinitely on a path to failure; project continuance should be periodically challenged.
Based on a schedule developed during the selection phase, each project/system should be reviewed with at least the following considerations in mind:
Senior program management should be able to develop a well-informed picture of current and potential problems for each ongoing IT project.
This step should result in the deliberate continuation, modification, or cancellation of each project.
The prior monitoring step should pinpoint the projects on which senior management needs to make decisions. What action to take is a management decision.
Senior management should ensure that:
For example, many federal agencies prototype IT projects before moving into the implementation stage. Monitoring the mission results achieved by the prototype allows senior program management to make an informed decision about whether to stop or modify a project at this stage, rather than letting it continue automatically into implementation.
Proper control of IT investments enables senior management to mitigate the risks of schedule slips, cost overruns, and development of a product that does not meet its originally intended goals. This process depends heavily on facts provided through continual measurement of new and ongoing projects. The data fed from the Selection process to the Control process supports this requirement, as do the measurements taken throughout the life of a project.
Evaluation is conducted after a system has been implemented and is an assessment of the project's success or failure. Using post implementation reviews, data are collected, recorded, and analyzed to compare expected results against actual benefits and returns. Figure 1, shown previously, depicts the evaluation phase in relation to the other two phases. Evaluation is used to 1) decide whether future changes are necessary to address serious performance gaps, and 2) make decisions about modifications to the organization's existing evaluation process and selection criteria. This phase comprises three steps:
Step 1 -- Conduct Post Implementation Reviews
Step 2 -- Decide on Adjustments
Step 3 -- Lessons Learned
Conduct and review the results of post implementation reviews, focusing on anticipated versus actual results in terms of cost, schedule, performance, and mission improvement outcomes. Determine the causes of major differences between plans and end results.
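The anticipated-versus-actual comparison at the heart of a post implementation review can be sketched as a simple percent-difference computation; the measures and figures below are hypothetical.

```python
# Hypothetical anticipated-versus-actual comparison for a post
# implementation review; the measures and figures are illustrative.

def pir_deltas(expected, actual):
    """Fractional difference of actual from expected for each measure."""
    return {measure: (actual[measure] - expected[measure]) / expected[measure]
            for measure in expected}

expected = {"cost_millions": 4.0, "schedule_months": 18, "benefit_score": 6.0}
actual   = {"cost_millions": 5.0, "schedule_months": 24, "benefit_score": 5.4}

deltas = pir_deltas(expected, actual)
for measure, delta in deltas.items():
    print(f"{measure}: {delta:+.0%}")
```

Large deltas are the "major differences between plans and end results" whose causes the review team should then determine.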
Most federal agencies treat recently implemented systems as a fait accompli and move on from there. This point of view is contrary to the investment management philosophy of managing the entire IT portfolio. The primary tool best practice organizations use to assess a project is the post-implementation review. Questions to ask include:
The post-implementation review should inform senior management's decision whether to continue, modify, or cancel operational systems.
Using the results of the post implementation review as a baseline, decide whether to continue without adjustment, to modify the system to improve performance or, if necessary, to consider alternatives to the implemented system.
Even with the best system development process, it is quite possible that a new system will have problems or even major flaws that must be taken care of in order for the agency to get the full benefit of its investment. The post implementation review should provide executive management with useful information on how best to modify a system, or to work around the flaws in a system, in order to improve performance and to bring the system into closer alignment with the needs of its customers.
Using the collective results of post implementation reviews across completed systems, modify the organization's existing investment selection and control processes based on lessons learned.
The information from post implementation reviews helps senior management develop better decision criteria during the Selection process and improve the evaluation of ongoing projects during the Control process.
A mature investment process will help ensure that taxpayer dollars spent on information technology effectively support the agency's mission objectives. Dwindling resources and higher public demand for service mean that a project must be worth doing from a mission perspective, possible to accomplish at reasonable time and cost, and supportive of the strategic direction of the agency.
A mature investment process requires discipline, executive management involvement, accountability, and focus on risks and returns using quantifiable measures. Senior program managers, those with the programmatic responsibility in key business areas, should be involved directly in prioritizing and selecting the IT projects their organization will pursue. Their decisions should be well-informed, based on analytical rigor and robust measures. Furthermore, a mature investment process is a year-round activity, not just a process to be done near budget time. Senior program managers should be involved in devising and enforcing solutions to the problems that inevitably arise. Finally, the mature investment process is a learning process. The real-world results of IT projects and mission programs should be continuously fed back to senior managers as they make decisions on new projects and operational systems.