Research Reports
Test and Evaluation Methods for Middle-Tier Acquisition
PUBLIC RELEASE: March 2024
COMPLETED: September 2023
AUTHORS: Dr. Laura Freeman¹, Mr. Geoff Kerr¹
¹VIRGINIA TECH
The Department of Defense (DoD) National Defense Strategy recognizes the need for advanced technology, and for more rapid development and fielding of that technology, to sustain dominance against peer and near-peer threats. In support of these objectives, the Director, Operational Test and Evaluation (DOT&E) engaged the Acquisition Innovation Research Center (AIRC) in a multi-year contract to advance test and evaluation (T&E) methods within the DoD. Under advisement from DOT&E technical leadership, the multi-university AIRC team focused its research over the past year on three of the five key pillars of the DOT&E Implementation Plan: Pillar 1 – Test the Way We Fight; Pillar 2 – Accelerate the Delivery of Weapons that Work; and Pillar 4 – Pioneer T&E of Weapon Systems Built to Change Over Time.
In support of Pillar 1, the AIRC team focused on maturing evaluation methodologies for joint warfighting concepts through the development of a fundamental joint test concept (JTC). The team established a broad community of interest (COI) to explore the many aspects of, and challenges in, the test and evaluation of joint warfighting concepts. The team conducted a three-phase research effort, each phase concluding in a workshop: the first shaped and bounded the focus of the research team, the second developed initial JTC priorities, and the third concluded with a tabletop simulation of a joint operation assessment.
In support of Pillar 2, the research team explored current threats to DoD data and how these threats can be mitigated, and responded with a data security paper that expands upon its findings. Beyond data security, the research team developed Bayesian approaches that leverage contractor testing, developmental testing, live fire testing, modeling and simulation results, and operational testing to accelerate the test and evaluation of weapon systems and expedite the fielding of warfighting capabilities. In future efforts, the research team will develop a model-based test and evaluation master plan (MB TEMP) and support the advancement of the integrated decision support key (IDSK) beyond the initial efforts described in the digital engineering for T&E section of this report.
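The Bayesian data-combination idea can be illustrated with a minimal conjugate Beta-Binomial sketch that pools pass/fail results from successive test phases into a single posterior on system reliability. This is a sketch of the general technique only, not the team's actual models: the phase names mirror the text, but the prior and all test counts are hypothetical.

```python
# Illustrative Beta-Binomial updating: pooling pass/fail results from
# sequential test phases into one posterior on system reliability.
# All counts are hypothetical.

def update_beta(alpha, beta, successes, failures):
    """Conjugate update of a Beta(alpha, beta) prior with binomial data."""
    return alpha + successes, beta + failures

# Weakly informative Beta(1, 1) prior on reliability.
alpha, beta = 1.0, 1.0

# (phase, successes, failures) -- hypothetical counts per test phase.
phases = [
    ("contractor test", 18, 2),
    ("developmental test", 27, 3),
    ("live fire test", 9, 1),
    ("operational test", 14, 1),
]

for name, s, f in phases:
    alpha, beta = update_beta(alpha, beta, s, f)
    mean = alpha / (alpha + beta)
    print(f"after {name}: posterior mean reliability = {mean:.3f}")
```

A production analysis would rarely pool phases at full weight as shown here; a power prior or hierarchical model is commonly used to discount earlier, less operationally representative data when configurations change between phases.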
In support of Pillar 4, the research team investigated improved methods for the test and evaluation of artificial intelligence/machine learning (AI/ML)-enabled DoD systems. The team developed recommendations on how to ensure AI/ML systems have proper training data and how to continue evaluating and informing system performance while the systems are in use. The team further explored methods to evaluate AI/ML systems when the T&E community has limited or no detailed understanding of the underlying AI/ML algorithms within a system. Also in support of Pillar 4, the multi-university team helped further the application of digital engineering practices to test and evaluation. With a realistic look at current tooling and cultural challenges, the team made practical recommendations on the use of model-based systems engineering (MBSE) methods for test planning and execution, and on establishing digital linkage from mission requirements through operational assessment. Lastly, Pillar 4 efforts included maturing a framework to automate security penetration testing. After conducting cybersecurity research, the team developed early test software methods that enable cyber-physical system penetration testing to be automated within a continuous integration, continuous deployment (CI/CD) product development environment. In the next year, the team intends to test these capabilities against case study DoD products.
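One concrete way to monitor an AI/ML-enabled system in use without access to its internals is to track drift in the distribution of its inputs, for example with a population stability index (PSI). The report does not specify the team's evaluation methods, so the sketch below is purely illustrative, with hypothetical data.

```python
import math

def psi(expected, actual, bins=5):
    """Population Stability Index between two samples of a scalar feature.
    Values below roughly 0.1 are conventionally read as negligible drift."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def bin_fractions(data):
        counts = [0] * bins
        for x in data:
            counts[sum(x > e for e in edges)] += 1
        n = len(data)
        # small floor avoids log(0) for empty bins
        return [max(c / n, 1e-4) for c in counts]

    p, q = bin_fractions(expected), bin_fractions(actual)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical scalar feature values seen at test time vs. in operation.
reference = [0.1 * i for i in range(100)]        # test-era inputs
shifted = [0.1 * i + 3.0 for i in range(100)]    # drifted operational inputs

print(f"PSI vs. self:    {psi(reference, reference):.4f}")
print(f"PSI vs. shifted: {psi(reference, shifted):.4f}")
```

Because the check compares only input distributions, it requires no knowledge of the underlying model, which fits the black-box evaluation setting described above; a flagged drift would trigger deeper (and likely manual) re-evaluation.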