Defense Data Grand Prix 2022-2023

Faculty-Led Applied Data Science Opportunity

The Department of Defense (DoD) is seeking faculty-led student teams to tackle compelling, real-world problems with data science. These challenges offer ready-made practicums that can be integrated with graduate courses and research seminars.

The Acquisition Innovation Research Center (AIRC), a partnership of 22 U.S. universities, has established the Defense Data Grand Prix, a prize competition that gives you and your students access to these data and problems in semester-long competitions.

The Defense Data Grand Prix is a challenge to solve some of the most difficult data science issues faced within our government today, spanning both technology and non-technology barriers to enterprise analytics. For example, the Defense Logistics Agency (DLA) handles over $42B in goods and services annually, from food and equipment to fuel and armaments, generating vast amounts of data in its supply-chain tracking systems. DLA needs to fuse, optimize, and analyze this data through cutting-edge applications. To bring modern data science approaches to bear, DLA must grapple with system and data architectures and data governance policies. DLA is soliciting innovative solutions to these and other data science barriers.

In Heat 1, teams worked with DLA analysts to recommend ways to improve access to applicable data. Heat 2 focused on implementing approaches to making data accessible for analysis.

In Heat 3, competitors will apply advanced data visualization techniques to findings from the defense acquisition data. Teams may investigate problems such as:

• Aviation Supplier Predictions
• Manufacturing Stores and Materiel Shortages
• Balancing Freedom of Information with Operations Security
• Industrial Capability Program Material Identification


CHALLENGE STRUCTURE OVERVIEW

The Defense Data Grand Prix is a three-heat, 18-month competition to encourage collaboration among academic teams, government sponsors, and corporate sponsors. It is designed to afford maximum interaction between sponsors and competitors, and to reward innovation and shareable findings. Awards will be made to the top teams through their universities after each heat for authoring and submitting full reports and supporting materials from their efforts. Submissions will be shared among all competitors to promulgate best practices and concepts in subsequent heats. Competitor teams may participate in any or all heats.

HEAT 1: PLANNING

FALL 2021

Competitors will propose data science objectives and approaches to creating data access and analytics methods. The final submission will be a white paper.

HEAT 2: ACCESS

SPRING 2022

Competitors will demonstrate scalable access and sharing of real, transformed, or synthetic defense acquisition data. The final submission will be the relevant accessible data sets, plus a briefing and/or white paper describing the data set, a data dictionary, and information security, access, and sharing guidelines.

HEAT 3: ANALYTICS

FALL 2022

Competitors will apply advanced analytics and visualize findings from defense acquisition data.  The final submission will be a white paper and/or briefing, or demonstration of analytic approach(es) and findings.

Within each heat, competition divisions will be established that align with the priorities of government data owners/sponsors. Competitors will self-select the division in which they intend to compete. Judging will be conducted, and awards administered, separately for each division.

Example problem areas include:

• Foreign Military Sales (FMS) Order Processing Levels
• Predicting Lead-Time Variability
• Purchase Request Workload Management Tool
• AI Technologies, Freedom of Information Act (FOIA), and Operations Security

At the beginning of each heat, competitors will be provided an overview of operational problems, data characteristics, and other information relevant to their challenge. This overview will be delivered via remote video conference, and competitors will select their division after the seminar.

Unlike other data competitions in which problems are well defined and sample data are provided, this competition will begin with problem identification and data inspection.  Teams will be encouraged to work with government sponsors to understand their problems, data, and constraints.  This collaboration will begin with an informational seminar arranged by AIRC at the beginning of each heat.  Subsequent interactions between government sponsors and competitors will be determined and coordinated by the sponsors.  Teams will be expected to address ethics, leadership, and project management.  Successful teams in this competition will:

  1. Participate in government sponsor seminars with other competitors to learn about data, problems, and organizations;
  2. Document data and information system architectures to determine curation and sharing needs and constraints;
  3. Apply ethical and legal considerations;
  4. Identify high impact approaches and rationally select among them;
  5. Implement mathematical approaches based on sponsor needs;
  6. Exercise project management skills and effectively contribute to a team;
  7. Interact with a client and deliver the project’s outcomes; and
  8. Effectively provide or present findings to government sponsors.

The Defense Data Grand Prix is a partnership of multiple DoD components.  The principal sponsors are:

DoD Chief Data Officer (CDO) is responsible for strengthening data management across the DoD and accelerating the transition to a data-centric culture.  The CDO oversees the implementation of the DoD Data Strategy, the data governance process, data standards and policies, and the promotion of data acumen across the DoD workforce.

Office of the Under Secretary of Defense for Acquisition and Sustainment (OUSD (A&S)) is responsible for all matters relating to acquisition, including the Defense Acquisition System; system design and development; production; logistics and distribution; installation maintenance, management, and resilience; military construction; procurement of goods and services; material readiness; and maintenance.

Defense Logistics Agency (DLA): Headquartered in Fort Belvoir, VA, DLA manages the end-to-end global defense supply chain for the five military services, 11 combatant commands, other federal, state, and local agencies, and partner and allied nations. DLA provides more than $40B in goods and services annually. It supplies 86 percent of the military's spare parts and nearly 100 percent of fuel and troop support consumables, and manages the reutilization of military equipment.

Heat 1 – Fall 2021

23 August – 26 September 2021: Registration

29 September 2021: Orientation Seminar

√ 30 January 2022: Submissions Due

√ 1–4 February 2022: Judging

√ 14 February 2022: Award Ceremony / Submissions Published

Heat 2 – Spring 2022

√ 13 December 2021 – 14 February 2022: Registration

√ 28 March 2022: Orientation Seminar

√ 30 June 2022: Submissions Due

√ 1–8 July 2022: Judging

√ 11 July 2022: Award Ceremony / Submissions Published

Heat 3 – Fall 2022

√ 1 July – 9 September 2022: Registration

√ 13 September 2022: Heat 3 Begins (Orientation Seminar)

√ 15 December 2022: Submissions Due

√ 19–23 December 2022: Judging

√ January 2023: Award Ceremony / Submissions Published

Based on selection by the DoD, awards will be made to the top teams through their universities in compensation for authoring and submitting full reports and supporting materials from their efforts.

Heat 1: $100k ($40k first place, $30k second place, $20k third place, $10k fourth place)

Heat 2: $100k ($40k first place, $30k second place, $20k third place, $10k fourth place)

Heat 3: $100k ($40k first place, $30k second place, $20k third place, $10k fourth place)

Winning teams will receive award certificates from a senior DoD official in recognition of their accomplishments.

  1. Teams must be coached or led by an eligible university faculty member. A faculty member may coach more than one team but will be limited to one award.
  2. There are no restrictions on the number of teams that can participate from any eligible academic institution.
  3. Teams will be required to sign non-disclosure and data sharing agreements with government sponsors.
  4. All findings will be open source.
  5. Competitors may be sponsored by extramural entities. Any such relationships are the responsibility of the individual teams.
  6. Competitors are subject to the rules and policies of their academic institution.
  7. Federal employees who represent eligible academic institutions may compete but are ineligible for cash prizes.

TEAM SUBMISSIONS

1. Heat 1: The final submission will be a white paper describing the team's effort and results.

The best teams will be selected by the DoD to receive an award through their universities to develop and provide a full report, data, and algorithms on their effort.

2. Heat 2: Competitors will demonstrate scalable access and sharing of real, transformed, or synthetic defense acquisition data. The final submission will be the relevant accessible data sets, a briefing and/or white paper describing their effort, data, and results.

The best teams will be selected by the DoD to receive an award through their universities to develop and provide a full report, data, and algorithms on their effort, including the data set with a data dictionary and information security, access, and sharing guidelines.

3. Heat 3: Competitors will apply advanced analytics and visualize findings from defense acquisition data. The final submission will be a white paper and/or briefing, or a demonstration of analytic approach(es) and findings. The use of multimedia is permitted here, but no voice-over should be used. Presentations (when applicable) should be created as either a PowerPoint file or a PDF document. Multimedia files may be included as .mp4 files inside a zipped folder with the presentation, along with a readme coversheet.

The best teams will be selected by the DoD to receive an award through their universities to develop and provide a full report, data, and algorithms on their effort.


Information on submissions for all heats:

Multiple files will be submitted in a zip file. A coversheet that acts as a “readme” should be included and should clearly explain each file included in the zipped folder. Since the data set may be very large, it may be shared using an online platform (Google Drive, Dropbox, etc.), in which case teams will provide a shareable download link. This link must remain accessible for at least six months beyond the duration of the competition. A minimal packaging sketch follows.
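
For illustration only, the following Python sketch packages a submission zip with a readme coversheet in the spirit described above. The file names and descriptions are hypothetical placeholders, not competition requirements.

```python
# Illustrative sketch: package a submission zip with a "readme" coversheet.
# File names below are hypothetical placeholders, not competition requirements,
# and the listed files are assumed to exist in the working directory.
import zipfile
from pathlib import Path

def package_submission(team_zip: str, files: dict[str, str]) -> None:
    """Write a submission zip containing a README coversheet plus the listed
    files; `files` maps each file path to a one-line description that is
    echoed into the coversheet."""
    readme_lines = ["Submission coversheet (README)", ""]
    readme_lines += [f"{Path(path).name}: {desc}" for path, desc in files.items()]
    with zipfile.ZipFile(team_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("README.txt", "\n".join(readme_lines))
        for path in files:
            zf.write(path, arcname=Path(path).name)

if __name__ == "__main__":
    package_submission("TeamName_Heat3.zip", {
        "white_paper.pdf": "Final white paper describing approach and findings",
        "briefing.pdf": "Slide deck exported to PDF (no voice-over)",
        "data_dictionary.csv": "Field-level description of the shared data set",
    })
```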

To enable double-blind judging, submissions shall not include any personally identifiable information (PII), including implied ownership of cited works (you can cite prior work, but you cannot allude to it being your own prior work). Any such PII will be redacted prior to judging. The name of the team and its members may be included on a cover page, preferably submitted as a separate document along with the other products in the zip file and labeled “Team Members.” The zip file itself may contain the team name used at the time of registration.

Submissions shall not contain any hyperlinks to outside webpages or other documents, unless allowed for explicitly in a specific heat.

Submissions in Heats 1 and 2 shall not contain any multimedia (e.g., videos, animations).

Participants in Heats 2 and 3 are not required to continue using the methodology they proposed, or the data they acquired, in a previous heat. All products from previous heats are open source and are therefore fair game for anyone to use in later heats.

Heat 2 of the Defense Data Grand Prix is open to teams of participants from the universities in the SERC/AIRC collaborator network[1] and from HBCUs/MSIs.[2] Future heats are intended to be open to more institutions, subject to approval by the Contracting Officer. All team members must be from the same academic institution. All members of participating teams must be U.S. citizens or U.S. permanent residents and be 18 years of age or older as of 30 August 2022.

Violation of the rules contained herein or intentional or consistent activity that undermines the spirit of the Challenge may result in disqualification. The Challenge is void wherever restricted or prohibited by law.

[1]  The SERC/AIRC collaborator universities are: Auburn, Carnegie Mellon, Georgetown, Georgia Tech, Massachusetts Institute of Technology (MIT), Old Dominion, Penn State, Purdue, Stevens Institute of Technology (lead), Texas A&M, University of Alabama in Huntsville, University of Massachusetts Amherst, University of Maryland, University of Southern California (USC), University of Virginia, and Virginia Tech. Collaborating military universities include the Air Force Institute of Technology and the Naval Postgraduate School.

[2] Historically Black Colleges and Universities (HBCUs) and Minority Serving Institutions (MSIs) are encouraged to participate in this Challenge.

The following terms and conditions apply to all participants in this Challenge.

Publication: Teams are encouraged to publish their activities and findings from this competition. Participants agree to confer and consult with the Government, and to acquire its consent, prior to the publication or presentation of any Challenge materials, materials associated with the Challenge, or data derived from the Challenge, to ensure that no PROPRIETARY INFORMATION or RESTRICTED-ACCESS INFORMATION is released, that patent rights are protected, that accuracy is ensured, and that no claims are made on behalf of the Government. Publication and/or presentation may be delayed for a reasonable time to afford needed protection.

Costs: The Government and AIRC are not responsible for any costs incurred by challenge participants, including the development of white papers, quad charts, presentation materials, the model, travel, technology, demonstrations, and any other associated costs. All costs incurred throughout the execution of the Challenge are the responsibility of the participants.

Results of the Challenge: Winners will be announced at the conclusion of each heat. AIRC will also announce the winners on the AIRC website and social media channels.

Release of Claims: The participant agrees to release and forever discharge any and all manner of claims, equitable adjustments, actions, suits, debts, appeals, and all other obligations of any kind, whether past or present, known or unknown, that have or may arise from, are related to or are in connection with, directly or indirectly, this challenge or the participant’s submission.

Compliance with Laws: The participant agrees to follow and comply with all applicable federal, state, and local laws, regulations, and policies.

Governing Law: This challenge is subject to all applicable federal laws and regulations. ALL CLAIMS ARISING OUT OF OR RELATING TO THESE TERMS WILL BE GOVERNED BY THE FEDERAL LAWS AND REGULATIONS OF THE UNITED STATES OF AMERICA.

Indemnification: Because of the number of anticipated challenge entries, AIRC cannot and will not make determinations on whether third-party materials in the challenge submissions have protectable intellectual property interests. By participating in this challenge, each participant (whether participating individually, as a team, or as a commercial entity) warrants and assures the Government that any data, analytic approaches, systems, algorithms, or other intellectual property (IP) used for the purpose of submitting an entry for this challenge, were obtained legally and through authorized access to such data, tools, or IP. By entering the challenge and submitting the challenge materials, the participant agrees to indemnify and hold the Government and SERC/AIRC universities harmless against any claim, loss, or risk of loss for patent or copyright infringement with respect to such third-party interests.

Publicity: The Participants may be featured on Federal and SERC/AIRC websites, in newsletters, social media, and in other print and electronic outreach materials. Except where prohibited, participation in this Challenge constitutes the consent to the DoD or AIRC’s use of each Participant’s name, likeness, photograph, logo, voice, opinions, public summary, and hometown and state information for promotional purposes through any form of media, worldwide, without further permission, payment, or consideration.

Due to the nature and objectives of this competition, all findings submitted for judging will be open source. The DoD and SERC/AIRC are granted the right to publicize Participant names and, as applicable, the names of Team members and legal entities that participated in the submission following the conclusion of the Challenge. By participating in the Challenge, each Participant represents and warrants that they are the sole author or owner of IP in connection with the submission, or otherwise has the necessary rights to use the submission for purposes of the Challenge, including having any and all rights necessary to grant the license rights identified in this section. Each Participant further represents and warrants that the submission does not infringe any copyright or any other rights of any third party of which the Participant is aware. If open-source code is used in this Challenge, Participants must only use open-source code licensed under an Open Source Initiative-approved license (see www.opensource.org) that in no event limits commercial use of such code or of a model containing or depending on such code.

Judging Panel

The judging panel will consist of AIRC staff members and representatives from government sponsor agencies. Final award selection will be made by DoD officials.

Judging Criteria

Each team will produce a final product that will be evaluated by multiple judges from various partnering institutions and sponsors. The product specifications are subject to change for the different heats, as detailed previously in the Challenge Structure Overview and Team Submissions sections.

Each submission will be judged on the following criteria, with each judge independently scoring each solution on a scale of 1–4 per category. Rankings within each division will be determined by summing all judges' scores; the highest-scoring solution wins. A scoring sketch follows the criteria below.

Impact:  To what degree will the approach positively impact the sponsor’s mission?

Acceptability:  How broadly can this approach be implemented?  Is the approach aligned with DoD and Federal equities?

Suitability: To what degree does the approach suit the needs of the sponsor?

Feasibility: To what degree do technical or workforce hurdles to scaled implementation exist?  Are associated costs affordable and commensurate with the expected benefits?
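
As a minimal sketch of this aggregation (assuming equal weighting across criteria and judges, which the description implies but does not state explicitly), the following Python snippet sums each judge's 1–4 scores and ranks a division's submissions. Team names and scores are invented for illustration.

```python
# Minimal sketch of the scoring scheme described above: each judge scores each
# submission 1-4 on each criterion, and division rankings come from the summed
# totals. Equal weighting is an assumption; team names and scores are invented.
CRITERIA = ["impact", "acceptability", "suitability", "feasibility"]

def total_score(judge_scores: list[dict[str, int]]) -> int:
    """Sum every judge's 1-4 scores across all criteria for one submission."""
    assert all(1 <= s[c] <= 4 for s in judge_scores for c in CRITERIA)
    return sum(s[c] for s in judge_scores for c in CRITERIA)

def rank_division(division: dict[str, list[dict[str, int]]]) -> list[tuple[str, int]]:
    """Rank a division's submissions by total score, highest first."""
    totals = {team: total_score(scores) for team, scores in division.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    division = {
        "Team A": [
            {"impact": 4, "acceptability": 3, "suitability": 4, "feasibility": 3},
            {"impact": 3, "acceptability": 4, "suitability": 3, "feasibility": 4},
        ],
        "Team B": [
            {"impact": 2, "acceptability": 3, "suitability": 3, "feasibility": 2},
            {"impact": 3, "acceptability": 2, "suitability": 3, "feasibility": 3},
        ],
    }
    print(rank_division(division))  # [('Team A', 28), ('Team B', 21)]
```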

Judging will be performed in a “double-blind” fashion to avoid bias. This scheme also requires that participants not include any language that may identify them or their previous works (including work done during previous heats or efforts from outside the competition that can be used to identify them).