Defense Data Grand Prix I: Final Technical Report

PUBLIC RELEASE
March 2024

COMPLETED
June 2023

AUTHORS: Dr. Stoney Trent1, Dr. Hoong Yan See Tao2
VIRGINIA TECH1, STEVENS INSTITUTE OF TECHNOLOGY2

This final technical report summarizes Year 1 of the Defense Data Grand Prix (DDGP), an Acquisition Innovation Research Center (AIRC) competition in which faculty-led teams collaborate with government stakeholders to solve real-world problems. The DDGP was established in 2021, in line with the Department of Defense’s (DoD) Data Strategy, to (a) reduce the two largest barriers to scaled data analytics in the DoD (access and knowledge of operational problems), (b) incentivize innovations and new perspectives to create unanticipated findings, and (c) increase awareness of defense acquisition system challenges, decisions, and processes.

In the first iteration of the DDGP, 15 teams from seven universities competed across three semester-long heats. The Defense Logistics Agency (DLA) was the primary operational sponsor. As such, DLA’s Chief Data and Analytics Office orchestrated collaborations with 12 problem owners and worked to pilot the governance processes necessary for DDGP competitors to utilize controlled unclassified information (CUI).

Three different universities each won a heat: Stevens Institute of Technology (Heat 1), Virginia Tech (Heat 2), and Texas Tech University (Heat 3). The overall results illustrate the promise of the DDGP and the accomplishment of two key objectives: (1) providing data-driven analysis to inform DoD operational and policy decisions related to data and operations; and (2) allowing researchers to use real-world data to tackle real-world problems in ways that integrate with academic courses and research seminars.

The second Defense Data Grand Prix commenced in January 2024.

Improving the Process for Developing Capability Requirements for Department of Defense Acquisition Programs

PUBLIC RELEASE
March 2024

COMPLETED
September 2023

AUTHORS: Dr. Michael McGrath1, Dr. Donald Schlomer1, Dr. Mo Mansouri1
STEVENS INSTITUTE OF TECHNOLOGY 1

The Joint Capabilities Integration and Development System (JCIDS) is the Department of Defense’s (DoD) formal requirements approval process. Its purpose is to develop and validate joint warfighting capability needs as the basis for acquisition programs. However, for capabilities that need to keep pace with evolving technologies, process delays in requirements validation can cause commensurate delays in the delivery of capabilities to the warfighter. The JCIDS deliberate path (as opposed to the JCIDS urgent path) is designed to balance speed and thoroughness but is often slow in practice.

In the FY 21 National Defense Authorization Act (NDAA), Congress expressed concern that JCIDS is too slow to keep pace with threats and technology, directing the DoD to develop recommendations for streamlining JCIDS. In support of the DoD’s response, in 2022, the Acquisition Innovation Research Center (AIRC) modeled the JCIDS process and used the model to assess the effects of proposed process improvements [AIRC (2022)]. The 2022 AIRC study found that for a sample of 20 Navy programs, JCIDS staffing of a Capability Development Document (CDD) took an average of 336 days. The 2022 study also found that the Special Operations Command (SOCOM) had developed a streamlined requirements process that could reduce requirements review and approval times by more than 50%. The research team recommended that a SOCOM-like process be quantitatively assessed for speed, piloted in the Military Services, and, if successful, adopted for all but the largest acquisition programs as an alternative to JCIDS.

This report summarizes the follow-on analysis of the SOCOM process, verifying that it is indeed significantly faster than the JCIDS process. The 15 Special Operations Rapid Requirements Documents (SORRDs) examined took an average of 85 days to validate compared to the 157 days on average for SOCOM to approve a full CDD (N=5). Thus, the SORRD process is faster (about half as long) than its counterpart CDD process at SOCOM—and much faster (about a fourth as long) than the average of 336 days for 20 Navy CDDs examined in the prior study.
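The ratios quoted above ("about half" and "about a fourth") follow directly from the reported averages, as a quick arithmetic check shows:

```python
sorrd_days = 85       # avg. days to validate a SORRD (N=15)
socom_cdd_days = 157  # avg. days for SOCOM to approve a full CDD (N=5)
navy_cdd_days = 336   # avg. JCIDS staffing days for 20 Navy CDDs (2022 study)

print(f"SORRD vs. SOCOM CDD: {sorrd_days / socom_cdd_days:.2f}")  # 0.54, about half
print(f"SORRD vs. Navy CDD:  {sorrd_days / navy_cdd_days:.2f}")   # 0.25, about a fourth
```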

Test and Evaluation Methods for Middle-Tier Acquisition

PUBLIC RELEASE
March 2024

COMPLETED
September 2023

AUTHORS: Dr. Laura Freeman1, Mr. Geoff Kerr1
VIRGINIA TECH1

The Department of Defense (DoD) National Defense Strategy recognizes the need for advanced technology, and for more rapid development and fielding of that technology, to sustain dominance against peer and near-peer threats. In support of these objectives, the Director, Operational Test and Evaluation (DOT&E) engaged the Acquisition Innovation Research Center in a multi-year contract to advance test and evaluation (T&E) methods within the DoD. The multi-university AIRC team has focused its research efforts over the past year on supporting the DOT&E Implementation Plan. Under advisement from DOT&E technical leadership, the team concentrated on three of the five key pillars within the Implementation Plan: Pillar 1 – Test the Way We Fight; Pillar 2 – Accelerate the Delivery of Weapons that Work; and Pillar 4 – Pioneer T&E of Weapon Systems Built to Change Over Time.

In support of Pillar 1, the AIRC team focused on maturing evaluation methodologies for joint warfighting concepts through the development of a fundamental joint test concept (JTC). The team established a broad community of interest (COI) to explore the many aspects and challenges of testing and evaluating joint warfighting concepts. The team conducted a three-phase research approach, with each phase culminating in a workshop: the first shaped and bounded the research focus, the second developed initial JTC priorities, and the third concluded with a tabletop simulation of a joint operation assessment.

In support of Pillar 2, the research team explored current threats to DoD data and how these threats can be mitigated, responding with a data security paper that expands upon the findings. In addition to the data security aspects of accelerating the delivery of weapons that work, the research team developed Bayesian-based approaches that leverage contractor testing, development testing, live fire testing, modeling and simulation results, and operational testing to enable more rapid test and evaluation of weapon systems and expedite the fielding of warfighting capabilities. In future efforts, the research team will work on a model-based test and evaluation master plan (MB TEMP) and support the advancement of the integrated decision support key (IDSK) beyond the initial efforts described in the digital engineering for T&E section of this report.
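As a hedged illustration of the Bayesian idea (not the report's actual model), a conjugate beta-binomial update can pool pass/fail results from successive test phases into a single posterior on system reliability. The phase names and counts below are invented for the sketch:

```python
# Invented pass/fail counts for each test phase: (successes, failures).
phases = {
    "contractor_test":  (18, 2),
    "development_test": (14, 1),
    "live_fire_test":   (4, 1),
    "operational_test": (9, 0),
}

a, b = 1.0, 1.0  # Beta(1, 1) uniform prior on reliability
for successes, failures in phases.values():
    a, b = a + successes, b + failures  # conjugate beta-binomial update

posterior_mean = a / (a + b)
print(f"Posterior: Beta({a:.0f}, {b:.0f}), mean reliability = {posterior_mean:.3f}")
```

Because each phase simply adds its counts to the running posterior, evidence from earlier, cheaper test events carries forward into operational test, which is the mechanism by which such pooling can shorten dedicated test time.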

In support of Pillar 4, the research team investigated improved methods for test and evaluation of artificial intelligence/machine learning (AI/ML)-enabled DoD systems. The team developed recommendations on how to ensure AI/ML systems have proper training data and how to continue evaluating and informing system performance while those systems are in use. The team further explored methods to evaluate AI/ML systems when the T&E community has limited or no detailed understanding of the underlying AI/ML algorithms within a system. Also in support of Pillar 4, the multi-university team helped further the application of digital engineering practices in support of test and evaluation. With a realistic look at tooling and current cultural challenges, the team made practical recommendations on the use of model-based systems engineering (MBSE) methods for test planning and execution, in addition to digital linkage from mission requirements through operational assessment. Lastly, Pillar 4 efforts also included maturing a framework to automate security penetration testing. After conducting cybersecurity research, the team developed early test software methods that enable cyber-physical system penetration testing to be automated in concert with a continuous integration, continuous deployment (CI/CD) product development environment. In the next year, the team intends to test these capabilities with case-study DoD products.
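One approach named above, evaluating an AI/ML system with little insight into its internals, amounts to black-box scoring against held-out labeled data. A minimal sketch, in which the stand-in classifier, inputs, and labels are all hypothetical:

```python
def opaque_system(score):
    """Stand-in for a fielded AI/ML component whose internals are unavailable."""
    return 1 if score >= 0.5 else 0

# Held-out evaluation set: (input, ground-truth label) pairs, invented here.
held_out = [(0.9, 1), (0.2, 0), (0.7, 1), (0.4, 0), (0.6, 0)]

correct = sum(opaque_system(x) == y for x, y in held_out)
accuracy = correct / len(held_out)
print(f"black-box accuracy on held-out set: {accuracy:.2f}")  # 0.80
```

The point of the sketch is that nothing about the system's internals is needed: only curated inputs with trusted labels, which is why data curation dominates black-box T&E of AI/ML systems.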

Curricula for Startup Business Operations, Financing, and Intellectual Property

PUBLIC RELEASE
March 2024

COMPLETED
December 2023

AUTHORS: Dr. Jose Ramirez-Marquez1
STEVENS INSTITUTE OF TECHNOLOGY 1

The following report is a response to the congressional mandate in Section (Sec.) 834(a)(1) of the 2023 National Defense Authorization Act to “make recommendations on one or more curricula for members of the acquisition workforce on financing and operations of [startup] businesses.” The mandate across elements of Sec. 834 emphasizes the need for innovative approaches to negotiating intellectual property (IP) and data rights in agreements with startup businesses. This research employed a variety of web-based resources to collect information, identifying a diverse portfolio of over 210 existing online courses and other educational materials.

The proposed curriculum, Financing and Operations of Startups and Negotiating and Establishing Intellectual Property and Data Rights, is intended to serve as a comprehensive guide for the Acquisition Innovation Research Center to formulate recommendations and enhance the existing educational offerings for the defense acquisition workforce, contributing to the overarching goal of fostering innovation within the government procurement landscape.

The search for a comprehensive curriculum on financing and operations involved a meticulous review of academic resources, leading to a compilation of diverse materials. The report’s analysis includes global perspectives and industry-specific training resources. The proposed curriculum in Table 1 and Table 2 of the report encompasses key elements across these resources on startup financing and operations, addressing fundamental aspects crucial for the acquisition workforce and undergraduate students.

In response to the directive for innovative approaches to negotiating IP and data rights, additional relevant courses were identified, although these were limited in number and largely specific to software and software-embedded systems. The courses cover foundational aspects of IP and its applications across industries, providing a comprehensive foundation for navigating IP rights in startup agreements. Table 3 of the report lists key elements across these resources on negotiating and establishing IP and data rights.

Quarterly Research Forums: AIRC and the Defense Acquisition University

PUBLIC RELEASE
February 2024

COMPLETED
October 2023

AUTHORS: Ms. Kara Pepe1
STEVENS INSTITUTE OF TECHNOLOGY 1

AIRC collaborated with the Defense Acquisition University to host quarterly research forums (QRFs) featuring presentations, discussions, and Q&A sessions on research and training opportunities. AIRC experts presented relevant research and received feedback from faculty and practitioners in the field, while also providing DAU with insightful views on its curricula initiatives and challenges in support of acquisition missions of the Department of Defense (DoD).

AIRC established a team of leading academic experts from disciplines such as law, policy, business, management, education, engineering, and data science, reflecting the breadth of acquisition functions. All QRFs were held virtually, with continuous learning points (CLPs) provided.

These quarterly engagements were so well received by the defense acquisition community that DAU is contracting with AIRC to collaborate on an additional set of briefings in the 2023-2024 academic year focused on mega-projects. This area of future research focuses on data visualization, artificial intelligence (AI), and machine learning (ML), as well as other topics to aid in the management of large DoD programs.

Portfolio Performance Analysis and Visualization

PUBLIC RELEASE
December 2023

COMPLETED
September 2023

AUTHORS: Mr. John D. Driessnack1, Mr. John Johnson1
UNIVERSITY OF MARYLAND 1

According to researchers from the University of Maryland, the Department of Defense needs more efficient, data-driven approaches to improve analytic insights into performance and risk at the program and portfolio levels. This report describes their efforts to expand the use of portfolio-level data, analysis, and visualization across Program Executive Offices (PEOs), Capabilities, and Missions to inform Integrated Acquisition Portfolio Reviews (IAPRs) and other portfolio decisions.

The researchers grouped their recommendations into three areas: expand quantitative performance management data, define a multidimensional system of systems for portfolios in which to collect that data, and pilot the effort to develop the data and create decisional tools that implement the designs.

Mission-Aware Integrated Digital Transformation for Operational Advantage

PUBLIC RELEASE
December 2023

COMPLETED
September 2023

AUTHORS: Dr. Jitesh Panchal1, Dr. Waterloo Tsutsui1, Dr. Mikhail Atallah1, Nathan Hartman1, Daniel DeLaurentis1, Dr. Richard Malak2
PURDUE UNIVERSITY 1, TEXAS A&M UNIVERSITY 2

The U.S. Army recognizes the potential of digital modeling to advance ground vehicle capabilities. However, practical challenges emerge in acquiring comprehensive digital data for various vehicle platforms. For instance, older platforms often lack up-to-date digital data, necessitating reverse engineering to create accurate digital replicas. This process faces challenges as it may overlook subtle characteristics and manufacturing discrepancies in the digital models. Furthermore, the lack of standardized digital data practice (e.g., data repositories) complicates the establishment of a cohesive digital modeling infrastructure.

This report, completed by AIRC researchers from Purdue and Texas A&M, presents valuable insights drawn from in-depth conversations with Department of Defense stakeholders, focusing on critical aspects of digital modeling, data utilization, and data-driven decision-making for ground vehicles. The report addresses challenges and opportunities in these domains and offers strategic considerations for optimizing the DoD’s operational advantage.

The report introduces the Intelligent Front-End (IFE) framework, which optimizes data management, integration, and utilization. The IFE serves as a bridge between existing systems and modern data needs, enhancing user interactions with data. Implementing IFE involves phases such as learning, dual deployment, and full deployment, capturing user interactions to contribute to institutional memory and decision-making.
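The learning-phase idea of capturing user interactions for institutional memory can be sketched in a few lines. The class, field names, and backend stub below are illustrative assumptions, not the report's actual IFE design:

```python
import time

class InteractionLogger:
    """Wraps calls to an existing data system and records each user interaction."""

    def __init__(self):
        self.log = []  # accumulated interaction records ("institutional memory")

    def query(self, user, request, backend):
        """Forward a request to the legacy backend and capture who asked for what."""
        result = backend(request)
        self.log.append({"user": user, "request": request, "time": time.time()})
        return result

# Usage: the lambda stands in for an existing system of record.
ife = InteractionLogger()
mass = ife.query("analyst_1", "vehicle_mass_kg", lambda req: 14_500)
print(mass, len(ife.log))  # 14500 1
```

Because the wrapper sits between users and the legacy system, it can be deployed alongside existing tools first (the "dual deployment" phase) before replacing direct access entirely.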

Future research aims to develop a versatile decision/reasoning tool framework tailored to various cases, providing high-level guidance to sponsors based on crucial decision-making factors. This framework will enable modeling efforts to be prioritized against specific decision needs. Collaborations with DoD units are planned to gain insights into their decision-making processes, potentially focusing on airframes and sea platforms. The overarching goal is to continually refine the approach, leveraging digital modeling and data-driven decision-making to meet evolving sponsor requirements and enhance the DoD’s operational advantage.

Cognitive Assistant for Training Cost Estimators

PUBLIC RELEASE
November 2023

COMPLETED
September 2023

AUTHORS: Dr. Daniel Selva1, Dr. Theodora Chaspari1, Dr. Alejandro Salado2
TEXAS A&M UNIVERSITY 1, UNIVERSITY OF ARIZONA 2

The goal of this research project is to develop a cognitive assistant to support training of new cost estimators in the Department of Defense (DoD). A cognitive assistant (CA) is defined here as an artificial intelligence (AI) tool, usually with a natural language interface, that augments human intellect in a specific task by retrieving and processing relevant information from multiple information sources and providing it to the user at the right time. It also has the capability to learn and adapt to the user and problem at hand.

Cost estimation is a complex iterative process consisting of various steps: gathering the required information, selecting an overall strategy and one or more existing models, developing new models if needed (including calibration and validation), performing the estimate, and conducting sensitivity analyses as appropriate. There are challenges for beginner cost estimators in each of those steps, including dealing with incomplete datasets, appropriately assessing the performance of new models, projecting beyond historical ranges of validity, adequately reporting the level of uncertainty around a point estimate, understanding how to use joint cost-schedule distributions, etc.
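One challenge listed above, adequately reporting the uncertainty around a point estimate, can be illustrated with a toy Monte Carlo run over a parametric cost model. The model form and every coefficient below are invented for illustration:

```python
import random
import statistics

random.seed(0)  # reproducible illustration

a, b = 2.5, 0.9   # assumed calibration coefficients of a toy model: cost = a * weight^b
weight_kg = 1500  # assumed cost-driver value

# Propagate multiplicative (lognormal) estimating error through the model.
samples = sorted(a * weight_kg**b * random.lognormvariate(0, 0.2)
                 for _ in range(10_000))

point = statistics.median(samples)
lo, hi = samples[len(samples) // 10], samples[9 * len(samples) // 10]  # ~80% interval
print(f"point estimate ~ {point:,.0f}; 80% interval ~ [{lo:,.0f}, {hi:,.0f}]")
```

Reporting the interval alongside the point estimate is exactly the habit the training aims to instill; a trainee who reports only the median discards the information a decision-maker needs about downside risk.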

Currently, new cost estimators are trained primarily through traditional instruction in live classrooms, which is time-consuming. Traditional instruction typically offers fewer opportunities for hands-on learning, which is known to improve learning outcomes. This type of instruction is also not tailored to each individual, so the pace can be too fast for some trainees and too slow for others. CAs can allow more interactive instruction tailored to each individual and area, as demonstrated by intelligent tutoring systems in other areas of education (Corbett et al., 1997).

Loud and Clear: The Negotiation Game

PUBLIC RELEASE
November 2023

COMPLETED
October 2023

AUTHORS: Dr. Daniel J. Finkenstadt1, Dr. Robert Handfield2
NAVAL POSTGRADUATE SCHOOL 1, NORTH CAROLINA STATE UNIVERSITY 2

The research objective of this project was to investigate whether, and how, gamified training approaches could improve acquisition workforce training, especially on new acquisition concepts and approaches. Acquisition outcomes are heavily dependent on the learning and currency of the Department of Defense (DoD) workforce in the ever-evolving acquisition ecosystem. Given learning time constraints and workforce turnover, new approaches are needed to improve training speed, retention, and interest.

AIRC researchers from the Naval Postgraduate School and North Carolina State University produced a set of negotiation scenarios and incorporated them into an interactive player platform that allowed teams to take on various roles within a negotiation team on either the government or industry side. Players competed against each other and tried to reach the optimal solution for their team given their tasks, constraints, and goals. Researchers assessed how teams interacted given various complex negotiation trades, variations of constraints, and asymmetric information.

Analysis of participant feedback showed the exercise was enjoyable, promoted creative problem solving, and had potential benefits for acquisition professionals. However, participants desired more time, structure, clarity in expectations, and accessibility. The positive feedback exhibited a learning orientation, while the negatives reflected a performance focus. Overall, the gamified approach shows promise for enhancing negotiation skills vital for acquisition professionals. This research provides an initial methodology and prototype for gamified negotiation training. Further refinement and testing are needed to optimize game design, player experiences, and learning outcomes. Gamified methods can promote engagement and real-world skills, but careful implementation is required for success.

Acquisition with Digital Engineering

PUBLIC RELEASE
October 2023

COMPLETED
July 2023

AUTHORS: Mr. Tom McDermott1, Mr. Geoff Kerr2
STEVENS INSTITUTE OF TECHNOLOGY 1, VIRGINIA TECH 2

In June 2018, the Under Secretary of Defense for Research and Engineering published the Department of Defense (DoD) Digital Engineering (DE) Strategy. Since then, the DoD’s engineering and technical communities have acknowledged and are adopting DE as a transformative, value-added approach to improving weapon system development, capability integration, testing, and sustainment. However, successful DE implementation in acquisition and sustainment must involve all acquisition functions—not just technical ones. 

In other words, acquisition with DE support (a.k.a. DE-enabled acquisition) cannot succeed as an engineering initiative pushed by engineers. It must be pulled into acquisition and sustainment by acquisition and sustainment functionals and fully integrated across all of their activities, including those that are not seen as technical. This report explores some of the methods, processes, and tools in acquisition and sustainment functions beyond engineering that must implement DE to realize its benefits, ultimately for our warfighters and taxpayers.