Disrupting Acquisition Blog
What Have You Delivered?
Last year, the Air Force Secretary and Acquisition Executive published a great Air Force Acquisition Annual Report for 2018. It covered a wide range of content — profiling programs and enterprise-level statistics — with great charts and graphs in a visually appealing format. The only criticism is that it focused too much on the acquisition business and not enough on operational impact.
Delivering capabilities that improve operations is the ultimate measure of success for an acquisition program and organization.
Every PEO and SAE should publish an annual report that specifies what systems and solutions were delivered. How many aircraft, ships, and ground vehicles did you deliver last year? How many satellites did you launch? How many new services delivering warfighting capabilities did you field? What major capability upgrades did you deliver to existing systems and services?
As with a corporate annual report, how can you assess whether your organization had a great, good, or poor year? While individual programs operate on longer delivery timelines, at a portfolio or agency level there should be a steady delivery of capabilities every year. How much did you spend last year, and what did you deliver? Answering that publicly could further drive PEOs and SAEs to focus on accelerating deliveries to demonstrate success.
The annual reports should also highlight the mission impact of the capabilities delivered, perhaps split into a classified version and an unclassified version for the public. Annual budget submissions, which are publicly available, already state the intended impacts when requesting funding, so publishing the end results of deliveries communicates how effectively you spent millions or billions of taxpayer dollars to achieve national security objectives. How quickly and effectively did you respond to changing operations and emerging threats, or exploit leading technologies for military advantage? Are you increasing warfighter lethality?
To drive the right behaviors, these reports should also highlight the number of prototypes, experiments, and minimum viable products that were delivered. How many engagements did you have with end users? What key lessons from these efforts shaped the scope, requirements, and designs?
A summary table of the timelines from idea to IOC for each program in the portfolio would be a really good way to assess and report the year's progress. Are there programs with long timelines that leaders would be embarrassed to report publicly? Are there programs with short timelines that innovative leaders would be proud of? Making such data visible and available could go a long way toward encouraging the right behaviors and producing the desired outcomes — delivering capabilities that support operations, now and in the future.
Disclaimer: The opinions expressed here are those of the authors only and do not represent the positions of the MITRE Corporation or its sponsors.