Evaluating Information Technology Investments
Introduction
In fiscal year 1996, executive agencies expect to obligate more than $26 billion for information technology (IT) investments and operations. This IT spending represents a critical investment of public tax dollars affecting virtually every government function. Creating a government that works better and costs less demands high returns on IT investments and reduced systems development risks.
This guide sets out an analytical framework for linking IT investment decisions to strategic objectives and business plans in Federal organizations; it supplements existing OMB policies and procedures. OMB's objective is to provide information on 1) what OMB expects from agencies and 2) how agencies can reduce the risk of, and maximize the net benefits from, their IT investments. The guide was written with assistance from GAO and is based on strategic information management practices in successful organizations. (End Note 1, End Note 2)
This guide describes the critical success elements and key phases that should be part of a mature IT investment process. The IT investment process an agency designs should match the culture and organizational structure of the agency. The overriding objective is that senior managers be able to systematically maximize the benefits of IT investments through use of the IT investment process.
The investment process, depicted in Figure 1 below, consists of three phases: selection, control, and evaluation. As Figure 1 indicates, the three phases occur in a continuous cycle. Information from each phase flows freely among the other phases, with the exception of evaluation: the evaluation component has a unidirectional information flow to the selection component, where it is used to verify or modify the criteria used during selection.
Through these phases, an agency should:
-- create a portfolio of IT project investments that maximizes mission performance, using a standard set of criteria for consistent comparison of projects;
-- measure ongoing IT projects against their projected costs, schedules, and benefits, and take action to continue, modify, or cancel them; and
-- 1) determine the actual return on investment of an implemented investment against the agency's mission and 2) adapt the existing process to reflect "lessons learned."
The control and evaluation phases are conducted throughout the year, and their results are fed into the selection phase, which in turn feeds back to the control and evaluation phases.
This guide begins by identifying three attributes that characterize successful investment processes in best practice organizations. The guide is then organized by the phases of the investment process. Within each phase, the guide describes: 1) the steps involved, 2) applicable management techniques and tools, 3) key questions to ask, and 4) examples from best practice organizations.
ORGANIZATIONAL ATTRIBUTES FOR SUCCESSFUL INVESTMENT PROCESSES
Table 1 -- Scoring and ranking decision criteria (applied to each IT project, 1 thru n)

OVERALL RISK FACTORS (criterion weights sum to 100%)

Investment Size (weight: 40%)
  How large is the proposed technology investment, especially in comparison to the overall IT budget?
  Scoring: 1 (Large) ... 10 (Small)

Project Longevity (weight: 30%)
  Do projects adopt a modular approach that combines controlled systems development with rapid prototyping techniques? Are projects as narrow in scope and brief in duration as possible, to reduce risk by identifying problems early and focusing on projected versus realized results?
  Scoring: 1 (Non-modular) ... 10 (Modular)

Technical Risk (weight: 30%)
  How will the proposed technology be integrated into existing systems? Will the proposed investment take advantage of Commercial Off-The-Shelf (COTS) software and systems? How will the complexity of the systems architecture and software design affect the development of the project?
  Scoring: 1 (Experimental/Custom) ... 10 (Established/Industry Standard)

OVERALL RETURN FACTORS (criterion weights sum to 100%)

Business Impact or Mission Effectiveness (weight: 25%)
  How will the technology investment contribute toward improvement in organizational performance in specific outcome-oriented terms?
  Scoring: 1 (Low) ... 10 (High)

Customer Needs (weight: 15%)
  How well does the technology investment address identified internal and/or external customer needs and demands for increased service quality and timeliness or reductions in costs?
  Scoring: 1 (Low) ... 10 (High)

Return on Investment (weight: 20%)
  Are the return on investment figures, using benefit-cost analysis thresholds, reliable and technically sound?
  Scoring: 1 (Risky estimates) ... 10 (Known benefit)

Organizational Impact (weight: 25%)
  How broadly will the technology investment affect the organization (i.e., the number of offices, users, work processes, and other systems)?
  Scoring: 1 (Low) ... 10 (High)

Expected Improvement (weight: 15%)
  Is the proposed investment being used to support, maintain, or enhance existing operational systems and processes (tactical), or is it designed to improve future capability (strategic)? Are any projects required by law, court ruling, Presidential directive, etc.? Is the project required to maintain critical operations--payroll, beneficiary checks, human safety, etc.--at a minimal operating level? What is the expected magnitude of the performance improvement from the technology investment?
  Scoring: 1 (Tactical: improves existing process) ... 10 (Strategic: provides new capability)

Total Risk Adjusted Score = Weighted Sum of Overall Risk Factors + Weighted Sum of Overall Return Factors
A scoring and ranking process such as the one depicted in Table 1 may be used more than once and in more than just this step to "winnow" the number of projects that will be considered by an executive decision-making body down to the best possible choice.
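The total in Table 1 is a weighted sum; a minimal sketch of that arithmetic follows. The criterion keys and weights mirror the table, while the project scores are purely illustrative assumptions.

```python
# Sketch of the Table 1 risk-adjusted scoring scheme.
# Each criterion receives a score from 1 to 10; the weights within the
# risk group and within the return group each sum to 100 percent.

RISK_WEIGHTS = {"investment_size": 0.40, "project_longevity": 0.30,
                "technical_risk": 0.30}
RETURN_WEIGHTS = {"business_impact": 0.25, "customer_needs": 0.15,
                  "return_on_investment": 0.20, "organizational_impact": 0.25,
                  "expected_improvement": 0.15}

def risk_adjusted_score(scores):
    """Total Risk Adjusted Score = weighted sum of risk factors
    plus weighted sum of return factors."""
    risk = sum(RISK_WEIGHTS[k] * scores[k] for k in RISK_WEIGHTS)
    ret = sum(RETURN_WEIGHTS[k] * scores[k] for k in RETURN_WEIGHTS)
    return risk + ret

# Hypothetical criterion scores for one project (illustration only).
project = {"investment_size": 7, "project_longevity": 8, "technical_risk": 6,
           "business_impact": 9, "customer_needs": 7, "return_on_investment": 6,
           "organizational_impact": 8, "expected_improvement": 5}
print(round(risk_adjusted_score(project), 2))  # → 14.25
```

Scoring every proposal with the same function makes project totals directly comparable, which is what allows the winnowing described above.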
An outcome of such a ranking process might produce three groups of projects:
Likely winners -- One group, typically small, is a set of projects with high returns and low risk that are likely "winners."
Likely drop-outs -- At the opposite end of the spectrum is a group of high risk, low return projects that have little chance of making the final cut.
Projects that warrant a closer look -- In the middle is usually the largest group. These projects have either a high return/high risk or a low return/low risk profile. Analytical and decision-making energy should be focused on prioritizing these projects in the middle group, where decisions will be more difficult to make.
At the end of this step, senior managers should have a prioritized list of IT projects and proposals with supporting documentation and analysis.
During this phase, an executive-level decision-making body determines which projects will be funded, based on the analyses completed in the previous steps.
Determining the right mix of projects to fund is ultimately a management decision that considers the technical soundness of projects, their contribution to mission needs, performance improvement priorities, and overall funding levels that will be allocated to information technology.
Senior management should consider the following balancing factors when arriving at a final resource allocation and project mix.
Strategic improvements vs. maintenance of current operations
Efforts to modernize programs and improve their mission performance may require significant investments in new information systems. Agencies also have operational systems on which the agencies depend to operate their programs as currently structured. These older systems may need to be maintained. A balance should be struck between continuing to invest in older systems and modernizing or replacing them. It may be helpful to track over time the percentage of funding spent on strategic/development vs. maintenance/operations projects.
New projects vs. ongoing projects
The senior managers who choose the final mix of projects to be funded must periodically re-examine projects that have already been approved to ensure that they should still be supported. There may be concerns about a project's implementation, such as greater-than-expected delays, cost overruns, or failures to provide promised benefits. If new projects are more consistent with an agency's strategic initiatives, offer greater benefits for equivalent cost, or present fewer risks, the old projects may need to be canceled.
High risk vs. low risk projects
If a portfolio is managed only to minimize risk, senior management may unnecessarily constrain an agency's ability to achieve results. High risk, high return projects can significantly enhance the value to the public of an agency's IT spending, provided the agency has the capability and carefully manages the risks. Most organizations, however, can handle only a limited number of such projects. As a result, senior management must consciously balance the amount of risk in the portfolio against the agency's capabilities and ability to manage risk.
Impact of one project on another
Now that federal agencies are trying to integrate their systems, every new project proposal is likely to affect, or be affected by, other project proposals, ongoing projects, or current systems. Senior management must recognize the context in which the new project will be placed and make decisions accordingly. For example, one best practice company treats the number of dependencies between a new project and other projects or systems as a risk factor.
Other complicating factors can heavily influence how senior management makes a final cut for approved IT projects.
Consider the impact on long range investment opportunities if all of the current projects are funded. Will large current costs preclude or delay better future opportunities? Will large current capital expenditures create even larger maintenance costs in the future?
IT projects sometimes rely on funding and resources from outside agencies or private organizations. If any project under consideration requires critical components from outside the agency, then the value of the agency's investment may be lost if the commitment by the outside party later shifts.
How much does the agency have available for IT investments for this budget year and for the next several years? Besides budget year spending levels and out-year estimates for the agency, the analysis should examine if there are other sources of funding for the projects. The agency should identify these other sources in its investment proposals.
What projects will fit under the spending levels this budget year and in out-years? Senior management can take the final list of projects with their associated costs and determine which projects fit within the spending parameters this budget year and/or in out-years. A project may have a relatively high priority, but resource constraints may preclude funding it this budget year. Senior management can then decide that the project be approved, but that its start date be delayed until funds are available, assuming it still matches the agency priority needs in the coming years.
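Fitting a prioritized list under a spending level, while deferring rather than rejecting projects that do not fit this budget year, can be sketched as follows. The project names, costs, and budget figure are illustrative assumptions.

```python
def fit_to_budget(prioritized, budget):
    """Walk the prioritized project list in order, funding each project
    that fits under the remaining budget and deferring the rest.
    `prioritized` is a list of (name, cost) pairs, highest priority first."""
    funded, deferred, remaining = [], [], budget
    for name, cost in prioritized:
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
        else:
            # Approved in principle, but start date delayed until funds
            # are available (as described in the text).
            deferred.append(name)
    return funded, deferred

# Hypothetical projects and costs in $ millions (illustration only).
projects = [("case tracking", 4.0), ("payroll upgrade", 6.0),
            ("data warehouse", 3.5)]
funded, deferred = fit_to_budget(projects, budget=8.0)
print(funded, deferred)  # → ['case tracking', 'data warehouse'] ['payroll upgrade']
```

Note that a high-priority project can still be deferred when its cost alone exceeds the remaining funds, which matches the text's point that resource constraints, not priority, may delay a start date.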
After considering all of the factors mentioned above, senior management should have enough information to make knowledgeable investment decisions. Senior management should also designate how often a project is to be reviewed, based on its level of risk, and any steps that the project team must take to mitigate that risk. For example, one best practice organization requires that senior management approve projects only after a review schedule has been established (e.g., monthly reviews for high risk projects, quarterly for lower risk) and specific requirements have been given to the project team to ensure that risks are mitigated (e.g., develop a risk management plan).
Project review schedules, risk mitigation plans and the cost-benefit plans from prior steps all feed directly into the next section of the investment process -- control.
While agencies select proposals once a year, the control phase is an ongoing activity to review new and ongoing projects, as well as operational systems. During the control phase, senior management regularly monitors the progress of ongoing IT projects against projected cost, schedule, performance and delivered benefits. The frequency of the reviews may vary, but should not wait until the annual budget preparation and deliberation process. How often and to what extent individual projects should be reviewed should have been established as the last step in the Selection phase. Rather than avoiding problems and concerns emerging from unexpected risks, this phase accentuates the need for management accountability by creating pre-arranged checkpoints for projects and forcing corrective action when necessary. If a project is late, over cost, or not being developed according to expectations, then senior management must decide whether to continue, modify, or cancel it. The steps in this phase are to:
Step 1 -- monitor projects/systems against projected costs, schedule, and performance; and
Step 2 -- take action to correct any deficiencies.
Establish processes to involve senior management in ongoing reviews and force decisive action steps to solve problems early in the project.
Define explicit measures and data used to monitor expected versus actual project outcomes on cost, schedule, and performance which are consistently maintained throughout the organization and readily accessible via automated management information systems.
Create positive incentives for raising real and potential project problems for management attention and action.
Before an organization can fully implement the control steps, uniform mechanisms for collecting, automating, and processing data on expected versus actual costs, schedules, and returns should be in place for all projects. End Note 8
Example: One best practice company has developed a database that stores risk-based assessment data about ongoing IT projects. The company uses RED, YELLOW, and GREEN symbols to evaluate each project on several dimensions, including quality of deliverables, conformance with company project development processes, and technical feasibility. For example, a YELLOW symbol on the deliverables dimension would mean that the company is concerned the expected deliverable will not meet needs and that minor improvement is required. An overall assessment symbol is applied to each project as well: an overall RED symbol means that the project has at least one RED symbol, or three YELLOW symbols, attached to it. The database provides executive management with an easily accessible tool for identifying risks by type or severity.
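The roll-up rule in that example can be sketched directly. The RED rule (at least one RED, or three YELLOWs) is stated in the example; treating any other YELLOW as an overall YELLOW, and the dimension ratings below, are our assumptions.

```python
def overall_status(dimension_ratings):
    """Roll per-dimension RED/YELLOW/GREEN ratings into one overall symbol.
    Per the example's rule: overall RED if at least one dimension is RED
    or three or more are YELLOW. (Mapping any remaining YELLOW to an
    overall YELLOW is an assumption; the source defines only the RED rule.)"""
    reds = sum(1 for r in dimension_ratings.values() if r == "RED")
    yellows = sum(1 for r in dimension_ratings.values() if r == "YELLOW")
    if reds >= 1 or yellows >= 3:
        return "RED"
    return "YELLOW" if yellows else "GREEN"

# Hypothetical project ratings (illustration only).
ratings = {"deliverables": "YELLOW", "process conformance": "GREEN",
           "technical feasibility": "GREEN"}
print(overall_status(ratings))  # → YELLOW
```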
Senior managers need to compare the preliminary results being achieved by a project against its projected costs, benefits and risks, and to identify actual or potential managerial, organizational, or technical problems.
Senior management should be able to judge whether a project is on track to achieve its projected mission benefits. The key is to use a set of performance measures consistently, so that senior program managers are given early warning of potential or actual problems. It is essential to refresh these measures as costs, benefits, and risks become better known, to ensure the continued viability of an information system prior to and during implementation.
Examples of problems that could affect a project or system include 1) lack of input by program management into the requirements phase of a project, 2) a project that was intended to be cross-functional becomes stove-piped because other offices in the agency do not support it, 3) new requirements have been added, and 4) it is more difficult to use the technology than was anticipated.
Senior program managers in federal agencies often pay most of their attention to new projects and carry ongoing projects as necessary budget items. In best practice organizations, however, ongoing projects are reviewed continually along with new projects, and go/no-go decisions are made. No project should be allowed to continue indefinitely in the face of failure; project continuance should be periodically challenged.
Based on a schedule developed during the selection phase, each project/system should be reviewed with at least the following considerations in mind:
How do current costs compare against projected costs?
How does the current schedule compare against the projected schedule?
How does the current performance of the deliverables compare against projected measures?
If we were starting over, would we fund this proposal today?
Have new requirements "crept" into the project?
Have business conditions changed significantly since the project was approved?
Is the project still technically feasible?
Is the project dependent on other projects? Are they late?
Does the project still support the architecture?
Is the project necessary for the successful completion of other projects?
Senior program management should be able to develop a well-informed picture of current and potential problems for each ongoing IT project.
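The cost and schedule comparisons in the review questions above amount to a simple variance check that flags projects for management attention. A minimal sketch follows; the 10 percent tolerance and the project figures are illustrative assumptions, not prescribed thresholds.

```python
def flag_variances(projects, tolerance=0.10):
    """Flag projects whose actual cost or elapsed schedule exceeds the
    projection by more than `tolerance` (10 percent here is an assumed
    threshold; an agency would set its own during the selection phase).
    Each project is a dict of projected and actual cost and months."""
    flagged = []
    for p in projects:
        cost_var = (p["actual_cost"] - p["projected_cost"]) / p["projected_cost"]
        sched_var = (p["actual_months"] - p["projected_months"]) / p["projected_months"]
        if cost_var > tolerance or sched_var > tolerance:
            flagged.append((p["name"], round(cost_var, 2), round(sched_var, 2)))
    return flagged

# Hypothetical status data (illustration only; costs in $ millions).
status = [
    {"name": "benefits system", "projected_cost": 10.0, "actual_cost": 12.0,
     "projected_months": 18, "actual_months": 18},
    {"name": "records imaging", "projected_cost": 5.0, "actual_cost": 5.1,
     "projected_months": 12, "actual_months": 12},
]
print(flag_variances(status))  # → [('benefits system', 0.2, 0.0)]
```

Flagging is only the trigger; as the text stresses, deciding whether to continue, modify, or cancel a flagged project remains a management judgment.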
The action should result in the deliberate continuation, modification, or cancellation of each project.
The prior step, monitoring of projects, should pinpoint the projects on which senior management needs to make decisions. What action to take is a management decision.
Senior management should ensure that:
Problem solving is not the sole province of the IRM organization. Even when senior management is aware of problems with projects or systems, the solution is too often left with the information systems organization. Senior managers should ensure that program officials are involved, since in many instances it is the business side of the organization that provides the solution.
All management decisions are documented along with data supporting the required changes. Common problems and their solutions, which are applicable to one IT project, should be evaluated as to how they apply to other IT projects under management's purview. To avoid replication of effort for analysis, documentation of management decisions is critical. Federal agencies often treat each budget year as isolated and provide funding for whatever can be supported each year rather than evaluating the IT projects with a historical perspective. By contrast, leading organizations revise their selection processes and IT funding decisions based upon the outcomes produced from the previous year.
To use an example, many federal agencies are prototyping IT projects before moving into the implementation stage. Monitoring the mission results gained by the prototype allows senior program management to make informed decisions about whether to stop or modify a project at this stage, rather than letting the project continue on into implementation automatically.
Proper control of IT investments enables senior management to mitigate risk of schedule, cost overruns, and development of a product that does not meet the goals originally intended. This process is highly dependent on facts provided through continual measurement of new and ongoing projects. The data fed from the Selection process to the Control process supports this requirement, as do the measurements taken throughout the life of a project.
Evaluation is conducted after a system has been implemented and is an assessment of the project's success or failure. Using post implementation reviews, data are collected, recorded, and analyzed to compare expected results against actual benefits and returns. Figure 1, shown previously, depicts the evaluation phase in relation to the other two phases. Evaluation is used to 1) decide whether future changes are necessary to address serious performance gaps, and 2) make decisions about modifications to the organization's existing evaluation process and selection criteria. This phase comprises three steps:
Step 1 -- Conduct Post Implementation Reviews
Step 2 -- Decide on Adjustments
Step 3 -- Lessons Learned
Post implementation reviews to determine actual project cost, benefits, risks, and returns.
Maintaining accountability for project performance and success based on quantifiable measures to create incentives for strong project management and senior management ownership.
Modification of selection decision criteria and investment control processes as needed to ensure continual improvement based on lessons learned.
Conduct and review the results of post implementation reviews, focusing on anticipated versus actual results in terms of cost, schedule, performance, and mission improvement outcomes. Determine the causes of major differences between plans and end results.
Most federal agencies accept recently implemented systems as a fait accompli and move on from there. This point of view is contrary to the investment management philosophy of managing the entire IT portfolio. The primary tool used to assess a project in best practice organizations is the post-implementation review. Questions to ask include:
How effective was the project in meeting the original objectives?
How well did the project meet the planned implementation dates?
What mission benefits has the project achieved, and do they match the benefits projected? If not, why not?
Were the original business assumptions that justified the system valid?
What lessons did the team learn from this project?
The post-implementation review should inform senior management's decision whether to continue, modify, or cancel operational systems.
Using the results of the post implementation review as a baseline, decide whether to continue without adjustment, to modify the system to improve performance or, if necessary, to consider alternatives to the implemented system.
Even with the best system development process, it is quite possible that a new system will have problems, or even major flaws, that must be addressed in order for the agency to get the full benefit of its investment. The post implementation review should provide executive management with useful information on how best to modify a system, or to work around its flaws, in order to improve performance and bring the system further into alignment with the needs of its customers.
Using the collective results of post implementation reviews across completed systems, modify the organization's existing investment selection and control processes based on lessons learned.
The information from post implementation reviews helps senior management develop better decision criteria during the Selection process and improve the evaluation of ongoing projects during the Control process.
Example: After several post implementation reviews of several completed projects, one best practice organization found that it was only realizing a 9 percent return on the projected benefits of its information systems investments. This focused senior management attention on more rigorous and realistic assessments of benefits projections presented during the selection cycle of their investment decision making process. Cost and benefit estimation techniques were improved, based upon quantitative data associated with past systems development efforts. Low value and high risk projects became more readily identifiable in the investment selection and control processes. Within two years, this company saw IT benefits exceed initial projections by some 33 percent.
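An aggregate benefits-realization figure like the 9 percent in that example can be computed directly from post implementation review data. A minimal sketch follows; the projects and dollar figures are illustrative assumptions, not the company's data.

```python
def benefits_realized_pct(reviews):
    """Share of projected benefits actually realized across completed
    projects. `reviews` is a list of (projected_benefit, actual_benefit)
    pairs gathered from post implementation reviews."""
    projected = sum(p for p, _ in reviews)
    actual = sum(a for _, a in reviews)
    return 100.0 * actual / projected

# Hypothetical review results in $ millions (illustration only), showing
# a heavy shortfall against projections.
pirs = [(10.0, 2.0), (5.0, 0.5), (5.0, 0.3)]
print(round(benefits_realized_pct(pirs), 1))  # → 14.0
```

Tracking this single percentage over successive review cycles gives senior management a concrete signal of whether benefit projections made during selection are becoming more realistic.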
A mature investment process will help ensure that taxpayer dollars spent on information technology effectively support the agency's mission objectives. Dwindling resources and higher public demand for service mean that a project must be worth doing from a mission perspective, must be possible to accomplish at reasonable time and cost, and must support the strategic direction of the agency.
A mature investment process requires discipline, executive management involvement, accountability, and focus on risks and returns using quantifiable measures. Senior program managers, those with the programmatic responsibility in key business areas, should be involved directly in prioritizing and selecting the IT projects their organization will pursue. Their decisions should be well-informed, based on analytical rigor and robust measures. Furthermore, a mature investment process is a year-round activity, not just a process to be done near budget time. Senior program managers should be involved in devising and enforcing solutions to the problems that inevitably arise. Finally, the mature investment process is a learning process. The real-world results of IT projects and mission programs should be continuously fed back to senior managers as they make decisions on new projects and operational systems.