Program Management

Analysis of Alternatives (AoA)

The operational command, often with support from the program office, develops and assesses a range of potential alternatives to meet the needs stated in the Initial Capabilities Document (ICD). The analysis shall include:

  • A Status Quo Alternative (As-Is)
  • Proposed alternatives to include:
    • New developments
    • Tailoring or integrating COTS and GOTS products
    • Acquiring capabilities as a service
    • Maturing the legacy system(s)
    • Various systems, system-of-systems, network, and data architecture configurations
    • Hybrids of any above alternatives
  • Estimates of lifecycle costs or total ownership costs
  • Affordability goals and budget constraints
  • Measures of Effectiveness (MOEs) that are operationally relevant and measurable
  • Measures of Performance (MOPs) – Measurable technical characteristics required to satisfy the MOEs, employed as operational test criteria
  • Enterprise impacts beyond a program-centric solution
  • Risk and sensitivity analysis covering mission, technology, and programmatic factors, including funding
  • List of assumptions – mission, technology, and programmatic
  • Schedule estimates with associated deliverables
  • Suitability
  • Benchmarking and business process reengineering studies (where applicable)
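A common way to make the lifecycle cost or total ownership cost estimates above comparable across alternatives is to discount each alternative's yearly cost stream to a present value. The sketch below is purely illustrative: the alternative names, yearly figures, and the 3% discount rate are hypothetical placeholders, not from any program.

```python
# Illustrative sketch: comparing alternatives on discounted lifecycle
# (total ownership) cost. All dollar figures and the discount rate are
# hypothetical placeholders.

def lifecycle_cost(annual_costs, discount_rate=0.03):
    """Present value of a stream of yearly costs (year 0 undiscounted)."""
    return sum(c / (1 + discount_rate) ** t for t, c in enumerate(annual_costs))

# Costs in $M per year: acquisition up front, then operations & support.
alternatives = {
    "status_quo":  [0, 12, 12, 12, 12, 12],   # no buy, higher sustainment
    "new_dev":     [40, 5, 5, 5, 5, 5],       # large up-front investment
    "cots_tailor": [15, 8, 8, 8, 8, 8],       # moderate buy and sustainment
}

for name, costs in alternatives.items():
    print(f"{name}: ${lifecycle_cost(costs):.1f}M present value")
```

A flat total would hide the timing of spending; discounting makes a heavy up-front buy and a long sustainment tail directly comparable.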

Both the effectiveness analysis and the cost analysis should address the risks and uncertainties for the alternatives, and present appropriate sensitivity analysis that describes how such uncertainties can influence the cost-effectiveness comparison of the alternatives.
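As a toy illustration of the sensitivity analysis described above, the sketch below sweeps one alternative's cost estimate and reports where the cost-effectiveness ranking flips. The alternative names, MOE scores, and cost figures are all hypothetical.

```python
# Hypothetical cost sensitivity excursion: sweep one alternative's cost
# estimate and note where the cost-effectiveness ranking (effectiveness
# per $M) changes. All numbers are illustrative placeholders.

baseline = {"alt_a": (80, 100.0), "alt_b": (60, 70.0)}  # (MOE score, cost $M)

def cost_effectiveness(effectiveness, cost):
    return effectiveness / cost

def preferred(cost_a):
    """Which alternative wins if alt_a's cost estimate moves to cost_a?"""
    ce_a = cost_effectiveness(baseline["alt_a"][0], cost_a)
    ce_b = cost_effectiveness(*baseline["alt_b"])
    return "alt_a" if ce_a > ce_b else "alt_b"

# Sweep alt_a's cost from $90M to $120M in $5M steps.
for cost in range(90, 125, 5):
    print(f"alt_a at ${cost}M -> preferred: {preferred(cost)}")
```

In this toy case the preference flips near $93M, which is exactly the kind of crossover a decision-maker needs to see when cost estimates carry uncertainty.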

The effectiveness analysis should be tied to the organizational missions, functions, and objectives that are directly supported by the implementation of the system being considered. In some cases, it may be possible to express the assessment of effectiveness across the alternatives in monetary terms, and so effectiveness could be assessed as benefits in the analysis framework. In other cases, the effectiveness might be related to measurable improvements to business capabilities or better or timelier management information (leading to improved decision-making, where it can be difficult or impossible to quantify the benefits). In these cases, a common approach is to portray effectiveness using one or more surrogate metrics. Examples of such metrics might be report generation timeliness, customer satisfaction, or supplier responsiveness. In addition to management information, the effectiveness analysis also should consider information assurance and interoperability issues.
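Where effectiveness cannot be monetized, surrogate metrics like those named above are often rolled into a single weighted score per alternative. A minimal sketch follows; the metric names, weights, and 0-100 scores are hypothetical placeholders, and real AoAs would tie each metric to a documented MOE.

```python
# Hypothetical weighted-score rollup of surrogate effectiveness metrics.
# Metric names, weights, and 0-100 scores are illustrative placeholders.

weights = {
    "report_timeliness": 0.40,
    "customer_satisfaction": 0.35,
    "supplier_responsiveness": 0.25,
}

def effectiveness_score(scores, weights):
    """Weighted sum of normalized (0-100) surrogate metric scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[m] * scores[m] for m in weights)

alt_scores = {
    "report_timeliness": 90,
    "customer_satisfaction": 70,
    "supplier_responsiveness": 80,
}
print(effectiveness_score(alt_scores, weights))
```

The weights encode stakeholder priorities explicitly, which makes the effectiveness comparison auditable rather than impressionistic.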

The Director of CAPE (or the equivalent at the Component level) provides AoA Guidance at the Materiel Development Decision (MDD), and the Component responds with an AoA Study Plan structured to address that guidance. Operational sponsors are encouraged to work with CAPE in conducting an AoA. While most AoA guidance is directed at MDAPs, the Defense Acquisition Guidebook offers AoA considerations for MAIS programs, including a tailored outline of an AoA Study Plan.

Programs embracing the Agile methodology must apply due diligence in an AoA to properly consider the vast solution space of the dynamic IT environment. At the same time, they should not become bogged down in extended, overly detailed analysis, given the rapid pace of change in operations and technologies. Although AoA updates are required at subsequent milestones, AoAs are often treated as a one-time, major effort during the MSA phase. Agile programs may instead iterate elements of the analysis throughout the subsequent phases as their understanding of the technical, cost, and operational considerations improves. This will help shape release planning throughout the program's development and ensure the program pursues the right solution.

Best Practices for Conducting Successful AoAs:

  • Establish a well-structured, objective plan – Programs cannot conduct an AoA with a predefined solution in mind. Exploring all viable alternatives against objective criteria is essential.
  • Use a small AoA team – Embrace the Agile benefits of small empowered teams over the coordination and consensus building of large teams. A small core team can reach out to a wide array of stakeholders for inputs.
  • Actively involve stakeholders – Ensure the user community is actively represented along with stakeholders from across all of the functional groups.
  • Incorporate risk – Examine the operational, technical, and business risks to the mission need and potential materiel solution alternatives.
