The Defense Department has largely met the minimum performance metric requirements on its major information technology investments, according to a new report, but more than half of the 25 programs studied did not fully disclose whether they achieved their intended goals.
The Government Accountability Office’s annual assessment of DOD’s IT systems, released Tuesday, examined the nearly $9 billion budgeted for the 25 IT programs between fiscal 2021 and 2023 — alongside $31 billion for more than 720 standard IT infrastructure investments — and the operational performance metrics submitted by the department to the Federal IT Dashboard.
The report noted that while 22 of the 25 programs met the minimum number of metrics required by Office of Management and Budget guidance, three did not — and two of those didn’t submit metrics at all.
OMB capital planning guidance requires IT programs to submit a minimum of five program operational performance metrics to the Federal IT Dashboard consistent with categories measuring customer satisfaction, strategic and business results, financial performance and innovation.
GAO noted that the Army Contract Writing System submitted only four of the five required metrics, while the Joint Operational Medicine Information Systems and the Air Force’s Maintenance Repair and Overhaul Initiative did not identify any metrics data.
The OMB guidance also requires IT programs to use the metrics to track operational performance goals, and they must note performance targets for the current fiscal year in their submissions.
But the report found that 13 DOD programs did not fully report on the extent to which they achieved their targets: 11 of the programs submitted incomplete data, while two — the Joint Operational Medicine Information Systems and Air Force’s Maintenance Repair and Overhaul Initiative — didn’t report any data.
In the report, officials from the department’s office of the chief information officer acknowledged that the programs should be reporting the performance metrics.
“The officials stated that DOD CIO put checks in place that should improve program reporting and make sure the data are up to date with programs’ operational performance metrics data, but that some programs still had incomplete reporting because those checks had been made incrementally and had only been partially implemented,” the report said.
DOD officials told the GAO that they expected the performance metrics checks to be in place before the department’s next Dashboard submission in June, but “as of March 2023, DOD was unable to confirm that the checks were currently in place to ensure that all programs identify and report complete operational performance metrics for their FY 2024 submission.”
The report also noted that 11 of the 25 programs studied lacked a capability implementation plan that outlines how the programs put in place new system functionality and enhancements.
Two of the programs reported they were in the process of developing plans, while eight others said “their systems had entered a late stage of development, were nearing retirement, or predated the requirement.”
Finally, the report said that six of the 25 programs did not have approved cybersecurity strategies in place.
“These strategies are to include information such as cybersecurity and resilience requirements and key system documentation for cybersecurity testing and evaluation analysis and planning,” the report said. “Such information is intended to ensure that program staff plan for and document cybersecurity risk management efforts, which begin early in the programs’ life cycle.”
Five of the programs told the GAO that they were planning to develop their strategies, while the sixth did not report having any plans to develop one.
“Although DOD has shown improvement, until the department ensures that all of the programs develop approved cybersecurity strategies, it lacks assurance that programs are positioned to effectively manage cybersecurity risks and mitigate threats. As a result, DOD programs are at increased risk of adverse impacts on cost, schedule and performance,” the report said.
GAO offered two recommendations: that the Defense Secretary direct the CIO to ensure that major IT business programs identify at least the minimum required number of operational performance metrics in their Federal IT Dashboard submissions, and that the DOD CIO ensure that major IT business programs develop capability implementation plans that address user training and deployment.
DOD officials said they agreed “with the overall content of the report,” but did not concur with the GAO recommendations.
On the first recommendation, DOD officials said they had put in place an audit check in April to ensure that operational metrics were reported, but GAO said the department offered no evidence that the three programs cited had reported the required metrics for their next Federal IT Dashboard submission.
Regarding the second recommendation, department officials cited DOD instruction 5000.75 — a business systems requirements and acquisition rule that calls for involving users in the IT systems process, from development to sustainment — and said the rule already codified the requirement sought by GAO’s auditors.
While DOD officials acknowledged the need to periodically mature user training and deployment plans throughout the system lifecycle, GAO noted that the 11 programs did not have capability implementation plans for user training and deployment, and said it had not yet seen evidence from the department that such plans have since been implemented.