Armed·Forces·Jam
noun
1. an event in which individuals come together in a physical environment and attempt to rapidly prototype a game (video or otherwise) focused on some element of the challenges faced by the various branches of the Armed Forces in a limited amount of time;
An “armored” hackathon: She was on the team that made that awesome game I love at the Armed Forces Jam!
THROUGH THE CHAOS OF DEVELOPMENT COMES SOMETHING BEAUTIFUL
There are so many areas of military simulation that need help! Choose a category of military simulation (see below) and take on the challenge to create an innovative solution using gaming and emerging technology. Speak to experts, take their advice and incorporate your own ideas into your solution. Consider that boundaries often spark the best creativity!
Don’t have an existing development team? No problem! We will form teams at the event. All skills and skill levels welcome. Also remember to bring your own development equipment (desktop, monitor, laptop, VR headsets, musical instruments, etc.).
Developing software, games or experiences in 48 hours is hard, exhausting, and laborious work that is also collaborative, thought provoking, exciting, rewarding, and fun.
Have a great idea, or know someone who does? Everyone is welcome to pitch their concept using gaming tech to the group. The best ideas will be used to make innovative solutions for military simulation! If you just want to use your skills, then join one of the ideas pitched instead!
Get ready for an action packed 48 Hours!
PITCHES AND TEAMS
The night you chart a course for fun & excitement. Have an idea? Pitch it. Heard a fantastic idea? Join a team. As with most jams, remember to bring your development gear.
Venue: Central Florida Tech Grove
6:00 pm – Doors Open
6:30 pm – Opening Ceremonies
7:00 pm – Game Pitches
8:00 pm – Jam Begins
11:00 pm – Go Home!
THE GRIND
The non-stop game-building action remains non-stop, except for the times you have to stop for food and ask for help.
Venue: Central Florida Tech Grove
8:00 am – Doors Open, Breakfast/Coffee
9:00 am – Hot Breakfast Station Starts
10:00 am – Hot Breakfast Station Ends
12:00 pm – Lunch / Call for Help
6:00 pm – Dinner
11:00 pm – Go Home!
FINAL SHOWCASE
The hard work hopefully looks like it is forming into something recognizable as fun. Success or failure, hopefully, you learned something. Tonight, you will present your team’s game concept for everyone to enjoy!
Venue: Central Florida Tech Grove
8:00 am – Doors Open
9:00 am – Hot Breakfast Station Starts
10:00 am – Hot Breakfast Station Ends
11:00 am – Lunch / Call for Help
4:00 pm – Projects Due
4:30 pm – Dinner & Jam Fair
6:00 pm – Final Presentations
9:00 pm – Go Home!
F E A T U R E S
Have you got a game? In your head? About military sim? And you just need to get it out, with the support of a strong team? Join one of ours!
Are your algorithms Karplus-Strong? Take this opportunity to show everyone the strength of your Programming Kung Fu by creating an epically optimized galactic adventure!
Great music and sound effects have the capacity to truly transform a game into an experience. Jams aren’t just for programmers; musicians wanted!
Quickly generate excitement for your team’s game with an awesome, funny, inspiring, silly, educational, or all-of-the-above trailer; here’s an example from a previous game jam!
Games can require a lot of interaction & integration with various technologies: PC, Mac, mobile phones, tablets, SDKs, APIs, plugins, consoles, and kitchen sinks. If you can leap these hurdles in a single bound, then we need you!
Artists wanted: 2D or 3D and must have skills (middling to mad; negotiable) and the ability to have fun while creating serious beauty. Come get your art on!
If a tree falls in the forest and no one is there to hear it, did it make a sound? You get the idea: our teams need exposure! If you are a mad social scientist, bring your promotional knowledge to the table. Without advertising, a terrible thing happens: nothing!
Is the rest of your team so worried about finishing that they haven’t even considered usability? Did they forget the astronaut needs access to a restroom? Perhaps… It is your time to shine.
You certainly couldn’t have an Armed Forces Jam without some bona fide professionals! If you’re a veteran, an active-duty member of the military, or a professional within a defense industry contractor, then we need your insights, guidance, and support. Join a team or consult with them all!
BELOW ARE THE CATEGORIES THAT TEAMS CAN FORM SOLUTIONS ON
SUMMARY
PROBLEM STATEMENT:
What concepts or technical approaches can be taken by leveraging low-cost COTS products integrated with developer-produced source code to enable Embedded Appended Training as a portable “drop-in kit” for different Army vehicle platforms?
What existing low-cost COTS concepts can be integrated via code to leverage emerging technologies like screen projection onto the windshield and door windows, Helmet Enabled Displays, augmented reality wearables, and sensors for hand/foot tracking to enable a lash-up of live vehicle controls to virtual action within the simulation? Can the Army leverage COTS or GOTS motion sensors placed within the crew cab to facilitate similar actions for accelerating, braking, steering, radio operations, and other simulated functions embedded within the real vehicle platform? Can low-cost COTS hardware and software be tailored so the Gunner is immersed via virtual reality wearables and experiences training realism in an immersive virtual environment? Can this capability be made portable to the point of need, meaning a “drop-in kit” that can be stored in the vehicle, that allows the Driver, Commander, and Gunner to train from their real positions within their vehicle, and that applies the same “drop-in kit” concept to their own crew-served weapon?
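The sensor lash-up described above can be sketched in miniature: the snippet below maps hypothetical crew-cab sensor channels to simulated vehicle functions. The channel names, function names, and values are all invented stand-ins for whatever a real COTS sensor SDK would expose.

```python
# Hypothetical sketch: routing COTS motion-sensor events from the crew cab
# into simulated vehicle controls. All names and values are illustrative.

# Map of sensor channels to simulated vehicle functions (assumed names).
SENSOR_MAP = {
    "pedal_right": "accelerate",
    "pedal_left": "brake",
    "wheel_angle": "steer",
    "hand_switch": "radio",
}

def apply_sensor_event(state, channel, value):
    """Update the simulated vehicle state from one sensor reading."""
    function = SENSOR_MAP.get(channel)
    if function is None:
        return state  # ignore unmapped channels
    new_state = dict(state)
    new_state[function] = value
    return new_state

state = {"accelerate": 0.0, "brake": 0.0, "steer": 0.0, "radio": 0}
state = apply_sensor_event(state, "pedal_right", 0.6)
state = apply_sensor_event(state, "wheel_angle", -15.0)
```

A real drop-in kit would replace the dictionary lookups with hardware event callbacks, but the routing layer would look much the same.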
WHY IS THIS IMPORTANT
REQUIRED TECH
USE-CASES
#1 Precision Gunnery Training for an Un-stabilized Platform
Using a HMMWV equipped with an M2, a crew (Commander, Gunner & Driver) conducts simulated light wheeled gunnery on a local training range. With an Augmented Reality solution, the crew would maneuver their vehicle on the actual range, engaging targets both from a stationary battle position and on the move within a maneuver firing box. Targets are a combination of stationary and moving trucks, dismounted teams, and individual dismounts ranging from 300 to 1,200 meters. With a Virtual Reality solution, the crew would remain stationary throughout the simulation, and the scenario would leverage the Synthetic Training Environment software as the “game engine” and scenario feed.
#2 Convoy Live Fire Training for an Un-stabilized Platform
Using a HMMWV equipped with an M2, a crew (Commander, Gunner & Driver) conducts convoy live fire training as a single vehicle or as part of a larger convoy in which they must move along a designated route, either live using an AR solution or virtually using a VR solution. They are expected to gain and maintain contact with enemy forces along the route in order to find, fix, and destroy enemy elements consisting of stationary and moving trucks, dismounted teams, and individual dismounts located in the open, in defensive positions, or within built-up areas such as buildings. These targets range from 300 to 1,200 meters.
RESOURCES/LINKS
SUMMARY
Modify the existing VR version of the Tech Grove to become a virtual escape room game
USE-CASES
WHY IT IS IMPORTANT?
The Tech Grove is at the intersection of the military modeling & simulation epicenter and Orlando's claim to being the MetaCenter. The Tech Grove is uniquely positioned to connect the great work already established by both communities, advancing the successful growth and utilization of the metaverse for all.
RESOURCES/LINKS
SUMMARY
Create strategy gameplay (single- and/or multiplayer) that reinforces effective information exchange (input and output), task delegation, and timeliness of outcome in a data-driven, information-saturated context.
USE-CASES
WHY IT IS IMPORTANT
Data drives decision making. The amount of data available to decision makers is increasing; however, the usefulness and validity of that data can often be questionable. Individuals and teams must develop effective strategies to exchange, reduce, and act on the most important and impactful data in a timely manner. This is especially true in high-risk decision tasks such as those faced by our warfighters. The increasing use of automation and AI within these information pipelines is also adding to the complexity and speed at which these decisions and team actions take place.
The challenge is to create engaging, strategy-based gameplay for one or more players that reinforces desired behaviors in the effective exchange and use of large amounts of information for decision making in time-critical tasks. Information may take many different forms, can be useful or noise, and might often be ambiguous or misleading. The challenge for the player(s) should be to quickly filter, order, and seek additional information as needed, delegate and share tasks, and make outcome-based decisions. Information sources and assistants might include automated/artificially intelligent agent characters in the gameplay; however, there is no expectation of creating functional AI NPCs to meet this part of the challenge statement. The rules and cadence of the gameplay are open to the team's creativity, but the preferred deployment target for the resulting application is a mobile device.
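One way to make "filter signal from noise under time pressure" concrete is a per-turn scoring rule. The sketch below is an assumption, not a spec: message fields, point values, and the timeliness bonus are all invented for illustration.

```python
# Illustrative sketch: score one decision turn in which the player marks
# each incoming message as "act" (True) or "ignore" (False).
# Scoring weights and the time bonus are assumptions for this example.

def score_turn(messages, decisions, seconds_used, time_limit=30):
    """+2 per useful message acted on, -1 per noise message acted on,
    plus a bonus for finishing under the time limit."""
    points = 0
    for msg, acted in zip(messages, decisions):
        if acted:
            points += 2 if msg["useful"] else -1
    bonus = max(0, time_limit - seconds_used)  # faster play scores higher
    return points + bonus

messages = [
    {"text": "Convoy spotted grid 41S", "useful": True},
    {"text": "Cafeteria menu updated", "useful": False},
    {"text": "UAV feed lost sector 7", "useful": True},
]
turn_score = score_turn(messages, [True, False, True], seconds_used=12)
```

A game built on this loop can then vary message ambiguity and volume to tune difficulty.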
RESOURCES/LINKS
Tyson Griffin
Dr. James Pharmer
Generative AI: Creates content that does not yet exist, mimicking human creativity. It starts with existing data, code, images, videos, and text and creates new data, code, images, video, and text that did not previously exist.
Radar: #1 NLP and #10 Machine Learning and Deep Reinforcement Learning
Category: AI / Machine Learning, Creator Tools and Data Interpretation
Generative AI: There are many use cases; here are two focused on M&S capabilities.
Create generative synthetic environments using the code-generation capabilities of Large Language Models (LLMs), such as Microsoft/OpenAI (CodeGPT, Codex, GitHub Copilot), Tabnine, Google CodeT5, Polycoder, Cogram, AWS CodeWhisperer, Captain Stack, Second Mate, Red Hat Project Wisdom... Synthetic environments are programs and should be able to benefit from code-generating AI. In the future, airmen, guardians, and pilots might be able to ask for a simulation to train them on “X,” and generative AI would create the simulation while they wait.
Create non-sensitive (unclassified, releasable, …) data to stimulate simulations where it is not available. This data can be used in synthetic environments without concern of data spillage because it is created outside of sensitive environments. The data can be tagged and labeled with metadata as it is created to facilitate its use in analytics environments as well.
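As a minimal sketch of the second use case, the snippet below generates synthetic flight-style records and attaches metadata labels at creation time so downstream analytics can recognize the data as synthetic. All field names and the generator label are invented for illustration.

```python
import datetime
import random

# Sketch only: non-sensitive synthetic sensor records, tagged with
# metadata as they are created (not after the fact). Field names are
# illustrative assumptions.

def make_synthetic_record(rng, source="jam-generator-v0"):
    record = {
        "altitude_ft": rng.uniform(1000, 40000),
        "speed_kts": rng.uniform(120, 600),
        "heading_deg": rng.uniform(0, 360),
    }
    # Metadata label applied at creation, so the record can be used in
    # analytics environments without concern of data spillage.
    record["_meta"] = {
        "synthetic": True,
        "generator": source,
        "created": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return record

rng = random.Random(42)  # seeded so runs are reproducible
records = [make_synthetic_record(rng) for _ in range(5)]
```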
Helpful links/resources
5 AI Tools That Can Generate Code To Help Programmers (forbes.com)
https://blogs.nvidia.com/blog/2023/01/26/what-are-large-language-models-used-for/
https://gamedevacademy.org/using-large-language-models-llm-in-game-development-tutorial-list/
https://arxiv.org/pdf/2208.10264
GeNN: a code generation framework for accelerated brain simulations | Scientific Reports (nature.com)
Embedded Code Generation - Flight Code Generation for Aerospace Systems - MATLAB & Simulink (mathworks.com)
Explainable Debriefer Digital Twin
Explainable AI: Take AI out of the black box. Explain how the AI developed its content as well as what data it used. Prove that the AI is not biased, uses sound logic (critical analysis), and can be tested against observable reality. Digital twin instructor evaluator.
Radar: #10 Machine Learning and Deep Reinforcement Learning
- Push all AI-created synthetic environments to align with physical twins. In other words, all AI synthetic environments must serve as digital twins. We can use the digital thread to verify, validate, and accredit the AI's fidelity, concurrency, and interoperability.
- Combine Knowledge Graphs (KG) with Machine Learning (ML) to make AI systems transparent and interpretable, as ML models can extract relationships, features, and entities as well as infer new concepts. KGs can be used to answer questions, understand images, and retrieve information, which are relevant aspects of M&S analytics.
The usage of AI will always be gated by whether its analysis can be trusted. Digital twins and knowledge graphs have been postulated as possible paths to AI explainability. AI utilized in a digital twin can be verified against its physical twin. If safety allows, the digital twin's (sensor) data can be pushed backwards in the digital thread to stimulate the physical twin and verify the AI's directives. Further, when a knowledge graph sits on top of many years of test data, ML/DL models can be developed that verify an AI assessment at an acceptable confidence level.
Can you create statistics on a game to show the winner? Mary won the game; why?
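The "Mary won, why?" question can be sketched as per-player statistics plus a plain-language explanation derived from them. The stat names, weights, and players below are invented for illustration.

```python
# Minimal explainability sketch: pick the winner by score, then report the
# statistic where the winner's lead over the field was largest.

def explain_winner(stats):
    """Return the winner and a one-line explanation based on their stats."""
    winner = max(stats, key=lambda p: stats[p]["score"])
    others = [p for p in stats if p != winner]
    best_stat, best_margin = None, float("-inf")
    for stat in ("accuracy", "objectives", "assists"):
        margin = stats[winner][stat] - max(stats[p][stat] for p in others)
        if margin > best_margin:
            best_stat, best_margin = stat, margin
    return winner, f"{winner} won mainly on {best_stat} (+{best_margin} over the field)"

stats = {
    "Mary": {"score": 90, "accuracy": 80, "objectives": 5, "assists": 2},
    "Ann":  {"score": 70, "accuracy": 75, "objectives": 1, "assists": 4},
}
winner, reason = explain_winner(stats)
```

The same shape generalizes: log the per-turn stats during play, then generate the explanation from the logged data rather than from the AI's internals.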
[2301.06676] Explainable, Interpretable & Trustworthy AI for Intelligent Digital Twin: Case Study on Remaining Useful Life (arxiv.org)
[1910.13520] Digital Twin approach to Clinical DSS with Explainable AI (arxiv.org)
[2207.09106] Explainable Human-in-the-loop Dynamic Data-Driven Digital Twins (arxiv.org)
Digital Twin and Artificial Intelligence: benefits & key learnings | Silo AI
[2004.14843] Knowledge Graph Embeddings and Explainable AI (arxiv.org)
Knowledge-graph-based explainable AI: A systematic review - Enayat Rajabi, Kobra Etminani, 2022 (sagepub.com)
Knowledge graphs as tools for explainable machine learning: A survey - ScienceDirect
Knowledge Graphs For eXplainable AI | by Giuseppe Futia | Towards Data Science
How AI could help new Air Force pilots avoid costly mistakes
Augmentation AI: Make AI usable in day-to-day work life. AR goggles, robots, and headset hardware interface with humans to add capabilities that increase work value. Dashboarded data and analytics help decision makers by adding data points based on neural networks (machine learning/deep learning). Computer vision AI can identify aircraft and weapons systems. Natural Language Processing can interpret language in real time (e.g., translate Chinese). Facilitate OODA Loop/Kill Chain decisions using more available data to make better (more informed) decisions.
Arguably the most scalable, flexible, and high-fidelity content that can stimulate AR/VR. Goggles and visors can be used for higher-level training. On-boarding the AI at the edge seems pragmatic for smaller datasets, but AI delivered via API might expand the capability. The costly part of AI is often CPU/TPU access and availability. When the AI content is created where processing is available, it can be more complex and might be of higher value. For instance, AI that generates weather or cyber effects could be made available via API to the AR/VR device. This interfacing could lead to next-level training and more inclusive readiness.
Can you integrate the OpenWeatherMap API (or other AI API) into an existing AR capability?
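One low-risk way to approach that question is to separate the API fetch from the effect mapping. The sketch below assumes OpenWeatherMap's public current-weather endpoint and invents placeholder AR effect names; the demo runs offline against a canned response shaped like the API's JSON, so the live call is left unexercised.

```python
import json
from urllib.request import urlopen

# Sketch, assuming the OpenWeatherMap "current weather" endpoint
# (https://api.openweathermap.org/data/2.5/weather). The AR effect names
# below are invented placeholders for whatever the AR engine exposes.

def weather_to_ar_effect(payload):
    """Map an OpenWeatherMap-style response to a hypothetical AR effect."""
    condition = payload["weather"][0]["main"]  # e.g. "Rain", "Clear"
    return {"Rain": "rain_particles", "Snow": "snow_particles",
            "Clear": "sun_glare"}.get(condition, "overcast_skybox")

def fetch_weather(city, api_key):
    """Live fetch (requires a free API key); not called in the demo."""
    url = (f"https://api.openweathermap.org/data/2.5/weather"
           f"?q={city}&appid={api_key}")
    return json.load(urlopen(url))

# Offline demo with a canned response shaped like the real API's JSON:
sample = {"weather": [{"main": "Rain"}], "main": {"temp": 288.1}}
effect = weather_to_ar_effect(sample)
```

Keeping the mapping pure makes it trivial to test before wiring it into the AR device.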
Use Cases:
Augmented AI: Augmentation opens the door to the most valuable training.
- AI can take AR goggles training to new levels. Some low-hanging fruit: based on past recorded operations, determine the most likely next step; wrench-turn-by-turn directions for fixing equipment, aircraft, and weapon systems; an AI agent to answer questions on maintenance, parts, and metric requirements.
- On-visor AI (Red6 AR) can facilitate pilot training in the aircraft based on preset scenarios and pilot interactions. AI builds training to augment the pilot's experience while flying an aircraft.
Resources:
Helpful links/resources
Using AI to create better virtual reality experiences | Stanford News
Augmented reality and virtual reality displays: emerging technologies and future perspectives | Light: Science & Applications (nature.com)
U.S. army’s new night-vision goggles use augmented reality - The Washington Post
AI Glasses You Can Try On And Try Out With AR (forbes.com)
Weather Information Extension - MIT App Inventor Help - MIT App Inventor Community
Weather forecasting is having an AI moment | MIT Technology Review
Weather API - OpenWeatherMap
Weather app with openweather - General Discussion - MIT App Inventor Community
Incorporate Natural Language Processing (NLP) to assist in creating/modifying Pilot Training simulations. Use NLP to enhance flight simulations by generating dynamic and diverse training scenarios. Potentially analyze real-world aviation data, weather conditions, and pilot adversarial interactions to create realistic and challenging scenarios. Provide pilots with a broader range of experiences and help them develop decision-making skills in various situations.
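A full NLP pipeline is out of scope for a sketch, but the scenario-diversity idea can be illustrated with a seeded template generator; a real build might swap the templates for LLM prompts conditioned on aviation and weather data. Every scenario fragment below is invented.

```python
import random

# Hedged sketch: a template-based stand-in for NLP-driven scenario
# generation. Varying weather, traffic, and adversary behavior yields
# diverse training vignettes. All fragments are illustrative assumptions.

WEATHER = ["clear skies", "low ceilings", "gusting crosswinds"]
TRAFFIC = ["light traffic", "a congested arrival corridor"]
ADVERSARY = ["no threats", "a non-responsive intruder aircraft"]

def generate_scenario(rng):
    return (f"Depart runway 27 under {rng.choice(WEATHER)}, "
            f"navigate {rng.choice(TRAFFIC)}, "
            f"and respond to {rng.choice(ADVERSARY)}.")

rng = random.Random(7)  # seeded for repeatable scenario sets
scenarios = [generate_scenario(rng) for _ in range(3)]
```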
Hardware & Software Required:
Utilize AFAMS's Strategic Lightweight Exploratory Devices (SLEDS) to showcase the Game Challenge results.
Utilize Battle Space Simulations Mace and Armor software platform to augment the traditional software buttonology with NLP commands.
Why is this important:
Incorporating NLP into flight simulation trainers is vital because it optimizes training effectiveness, enhances communication skills, prepares pilots for a wide variety of scenarios, and leverages technology to provide dynamic and adaptive training experiences (see Section 6).
Use Cases:
Utilize NLP commands while the instructor, trainee, or user is immersed in real time in Virtual or Extended Reality, allowing commands to be administered hands-free and allowing the pilot to maintain complete flight control during the simulation. Follow-on technology can then be incorporated beyond simulation into operational flight.
Awakening AICE (Artificial Intelligence Cognitive Expert) would serve the flight simulation community much as Google Assistant and Alexa do for consumers.
Helpful resources:
Natural Language Processing (NLP) focuses on understanding and generating human language; it does not typically involve transferring data over a network protocol such as a Distributed Interactive Simulation (DIS) packet. DIS is a protocol used in military simulations to exchange data between simulation entities, such as aircraft, ground vehicles, and sensors.
NLP is more concerned with tasks like language understanding, sentiment analysis, language generation, machine translation, and chatbot interactions. It doesn't directly deal with the technical aspects of network communication or packet transmission.
If you want to transfer data along a DIS packet or any other network protocol, you need to work with the appropriate network and protocol mechanisms. This involves encoding and decoding data according to the protocol specifications, managing the packet structure, addressing, and ensuring reliable network data transmission.
If you're interested in combining language understanding with network communication, consider creating a system where NLP is used to interpret or generate commands or instructions that can be included in the network packets. For example, a command from a user such as "Increase altitude to 10,000 feet" could be interpreted by an NLP component and translated into the appropriate data and format to be included in a DIS packet for a flight simulation.
In summary, NLP and network packet transmission are distinct domains, but there can be scenarios where they interact indirectly by interpreting and generating commands and instructions.
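That indirect interaction might look like the following sketch: a regular expression stands in for the NLP component, and the packed record is a simplified (entity_id, field_id, value) struct invented for demonstration, not the real IEEE 1278.1 DIS PDU layout.

```python
import re
import struct

# Sketch: parse a spoken-style command, then pack the result into a
# simplified binary record. NOT the real DIS PDU format; the >HHf layout
# (entity id, field id, float value) is an assumption for illustration.

def parse_altitude_command(text):
    """Extract a target altitude in feet from a command like
    'Increase altitude to 10,000 feet'."""
    match = re.search(r"altitude to ([\d,]+)\s*feet", text, re.IGNORECASE)
    if not match:
        raise ValueError("no altitude command recognized")
    return int(match.group(1).replace(",", ""))

def pack_command(entity_id, altitude_ft):
    # >HHf: big-endian entity id, field id (1 = altitude), float value
    return struct.pack(">HHf", entity_id, 1, float(altitude_ft))

altitude = parse_altitude_command("Increase altitude to 10,000 feet")
packet = pack_command(entity_id=42, altitude_ft=altitude)
```

In a real system the NLP front end would be an intent parser or LLM, and the back end would emit a proper Entity State or Set Data PDU.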
NLP-aided Creation of Flight Simulator Vignettes for Air Policing Using ADS-B Data. The sovereign airspace safeguarding mission is a collective task that involves the continuous presence of fighter aircraft and crews, ready to react quickly to possible airspace violations. The goal of this challenge is to create the capability for simulation creators to quickly generate short-duration military fighter aircraft flight simulator training vignettes of airspace violations using NLP and ADS-B data.
Example short vignette: loss of comms with an interceptor in commercial airspace, requiring a quick reaction to a possible airspace violation.
Extract real-time data using NLP from commercially available Internet sources and interface it with existing aircraft games. Give the trainee the ability to interact with real-time data within the gaming system they use for flight training.
Demonstrate real-time air traffic within an existing game.
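As a starting point, teams could map ADS-B state vectors onto game entities. The sketch below assumes the OpenSky Network /api/states/all field ordering (icao24 at index 0, callsign at 1, longitude at 5, latitude at 6, barometric altitude at 7); verify against the live API before relying on it. The demo uses a canned sample rather than a live fetch.

```python
# Sketch: turning ADS-B state vectors into simple game entities.
# Index layout assumes the OpenSky Network /api/states/all response;
# confirm against the API documentation before shipping.

def state_to_entity(state):
    return {
        "id": state[0],
        "callsign": (state[1] or "").strip(),  # callsigns are space-padded
        "lon": state[5],
        "lat": state[6],
        "altitude_m": state[7],
    }

# Canned sample shaped like one OpenSky state vector (values illustrative):
sample_states = [
    ["abc123", "DAL202  ", "United States", None, None,
     -81.3, 28.5, 10668.0, False, 230.0, 90.0],
]
entities = [state_to_entity(s) for s in sample_states]
```

A game loop would poll the API on an interval, convert each state vector this way, and interpolate entity positions between updates.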
Helpful links/resources
https://flightaware.com/live/
https://www.radarbox.com/
https://www.flightradar24.com/data/statistics
https://www.flightradar24.com/
https://aviation.stackexchange.com/questions/3052/is-there-an-api-to-get-real-time-faa-flight-data
https://www.geo-fs.com/
https://discussions.flightaware.com/t/visualizing-ads-b-data-in-msfs2020/68253
https://flightaware.com/news/article/Announcing-FlightAware-and-Microsoft-Flight-Simulator-Partnership/1477
https://geekflare.com/flight-data-api/
Overview:
Building trust in a synthetic environment is based on the belief that it will work as expected. This is key to UX (user experience) and UI (user interface) and to the developer's ability to build trust. Users must feel the capability is intuitive and reliable, and that they are empowered to control a complex system (when they move the stick, the aircraft reacts as expected). The gold standard for trust in simulated training systems (legacy or digital twin) is that the virtual world matches the physical world. Mismatches between environments must be viewed as learning experiences, opportunities, and data points for continuous trust development, not dismissed or ignored. Long term, seek out edge cases (corner cases) that the simulation is missing and make changes as required.
A digital twin is a virtual model that is created to accurately reflect an existing physical object. The physical object is fitted with sensors that produce data about different aspects of the object’s performance, for example on a wind turbine. This data is then relayed to a processing system and applied to the digital model. This digital model, or twin, can then be used to run simulations, study current performance and generate potential improvements that can then be applied back to the actual physical asset. A digital twin can also be created for non-physical processes and systems, mirroring the actual process or system and allowing simulations to be run based on real-time data.
Task
Can you create a simple digital twin and show how it can support trust?
Create a digital twin simulation so the game player can trust the AI in a synthetic training environment: digital twin technology allows the comparison of virtual and physical simulations by moving data back and forth along the digital thread, thus proving functional mapping and matching of functionality.
Demonstrate a digital twin simulation that establishes trust in its ability to match its physical twin. Or demonstrate a game where players become successful by training their AI agents and trusting them to accomplish various tasks.
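The "simple digital twin that supports trust" task can be sketched in a few lines: a twin holds a predictive model, ingests (simulated) physical sensor readings, and reports a trust score from how often prediction and reality agree. The linear turbine model, the readings, and the tolerance are all assumptions for illustration.

```python
# Minimal digital-twin sketch: trust grows as the twin's predictions
# keep matching the physical asset's measurements.

class TurbineTwin:
    """Virtual twin of a wind turbine: predicts power from wind speed."""
    def __init__(self, coeff=0.5):
        self.coeff = coeff   # assumed power-per-wind-speed model
        self.matches = 0
        self.total = 0

    def predict_power(self, wind_speed):
        return self.coeff * wind_speed

    def ingest(self, wind_speed, measured_power, tolerance=0.1):
        """Compare prediction against a physical reading; track agreement."""
        self.total += 1
        if abs(self.predict_power(wind_speed) - measured_power) <= tolerance:
            self.matches += 1

    def trust(self):
        return self.matches / self.total if self.total else 0.0

twin = TurbineTwin()
# Simulated physical readings; the third deliberately mismatches the model,
# which is exactly the kind of data point the twin should learn from.
for wind, power in [(10, 5.0), (12, 6.05), (8, 9.0)]:
    twin.ingest(wind, power)
```

Mismatched readings lower the score rather than being discarded, mirroring the "treat mismatches as data points" principle above.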
SUMMARY:
Move an existing DoD simulation to the metaverse (Omniverse, Hadean, or StellarX). The problem is to take an existing simulation and move it into a metaverse platform and demonstrate its functionality. Each month gaming capabilities are being revectored to the new platforms. The DoD must start moving in that direction to realize its promised advantages in training all services.
REQUIRED TECH:
Whatever is required to demonstrate functionality on the chosen simulation and platform. A Strategic Lightweight Exploratory Devices (SLEDS) Air Force flight simulator is available in Tech Grove for this challenge if desired, and Air Force SMEs will be available during the weekend to assist any team interested in using the SLEDS.
TASK:
Move an existing flight simulator on to a metaverse and demonstrate its functionality
WHY IT IS IMPORTANT:
All roads seem to lead to the metaverse.
https://stealthoptional.com/metaverse/us-air-force-spaceverse-training-cadets-metaverse/
RESOURCES/LINKS
https://www.nvidia.com/en-us/omniverse/
https://www.stellarx.ai/
https://hadean.com/defence/
https://warontherocks.com/2022/02/the-full-potential-of-a-military-metaverse/
SUMMARY:
A simulated experience for M&S 101 training that immerses the user through engaging gameplay and UX. We have an existing training deck that primarily consists of PowerPoint slides. We are looking for an immersive, more engaging user experience that provides an introduction to modeling and simulation for the Air Force.
REQUIRED TECH:
Any Computer-based or AR/VR options
e.g., Quest, Magic Leap 2 headset
WHY IT IS IMPORTANT:
A more engaging way to deliver this training content is needed
RESOURCES/LINKS:
BELOW ARE THE CATEGORIES THAT TEAMS CAN FORM SOLUTIONS ON
SUMMARY
Create a metaverse experience utilizing the existing Tech Grove digital twin that supports the goals of the Tech Grove to 1) Grow the Defense Industrial Base 2) Transfer technology between government and industry and 3) Solve hard problems.
The ability to populate and use the existing baseline digital twin of Tech Grove would allow demonstration of technologies and solutions that are highly relevant to the mission and needs of Team Orlando. It would provide the opportunity for simulation & training solutions to be available for engagement on an ongoing basis for needs such as user feedback, broad awareness of available solutions, and creating greater visibility of what Team Orlando is all about.
USE-CASES
WHY IT IS IMPORTANT?
The Tech Grove is at the intersection of the military modeling & simulation epicenter and Orlando's claim to being the MetaCenter. The Tech Grove is uniquely positioned to connect the great work already established by both communities, advancing the successful growth and utilization of the metaverse for all.
RESOURCES/LINKS
SUMMARY
The NavalX Tech Bridges are a connected network that enhances collaboration between Naval Labs, industry, academia, and other military branches. A NavalX Tech Bridge offers a collaboration space in a commercial business space, rather than on base. An off-base location offers a more easily accessible landing spot to foster a collaboration ecosystem, build productive partnerships, and accelerate delivery of dual-use solutions to the warfighter.
Create an interconnected metaverse experience connecting all the Tech Bridges together in one experience.
USE-CASES
WHY IT IS IMPORTANT
The value of the Tech Bridges is the interconnected network. Anything that would enhance and facilitate those connections brings greater value to the organization and its partners.
RESOURCES/LINKS
SUMMARY
Move an existing DoD simulation to the metaverse (Omniverse, Hadean, or StellarX). The problem is to take an existing simulation and move it into a metaverse platform and demonstrate its functionality. Each month gaming capabilities are being revectored to the new platforms. The DoD must start moving in that direction to realize its promised advantages in training all services.
REQUIRED TECH
Whatever is required to demonstrate functionality on the chosen simulation and platform. A Pilot Training Next (PTN) Air Force flight simulator is available in Tech Grove for this challenge if desired, and Air Force SMEs will be available during the weekend to assist any team interested in using the PTN.
USE-CASES
WHY IT IS IMPORTANT
All roads seem to lead to the metaverse.
https://stealthoptional.com/metaverse/us-air-force-spaceverse-training-cadets-metaverse/
RESOURCES/LINKS
SUMMARY
Real-Time Strategy (RTS) game in which forces must be utilized to provide Integrated Base Defense. Forces and resources must be used to provide base defense in the different situations that can arise. Provide various facilities with different security priorities. Designate the amount of available forces (security personnel), resources, and capabilities (cameras, intrusion detection systems, vehicles to transport personnel, drones, robotic dogs, autonomous boats, etc.).
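The resource-allocation core of such a game can be sketched as a greedy, priority-ordered assignment of a limited personnel pool. The facility names, priorities, and staffing numbers below are invented for illustration; a real game would add capabilities, timing, and events on top.

```python
# Sketch: staff the highest-priority facilities first from a finite pool
# of security personnel. All data below is illustrative.

def assign_forces(facilities, available):
    """Greedy allocation: highest-priority facilities are staffed first.
    Returns per-facility assignments and any unassigned personnel."""
    assignments = {}
    for name, priority, needed in sorted(facilities, key=lambda f: -f[1]):
        staffed = min(needed, available)
        assignments[name] = staffed
        available -= staffed
    return assignments, available

facilities = [
    ("flight line", 3, 6),   # (name, priority, personnel needed)
    ("armory", 5, 4),
    ("front gate", 4, 2),
]
assignments, spare = assign_forces(facilities, available=10)
```

The interesting gameplay emerges when the pool is too small to cover every need, forcing the trade-offs the challenge statement describes.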
USE-CASES
WHY IT IS IMPORTANT?
Integrated Base Defense is of high importance. Forces and resources are not infinite, and appropriate planning and training are crucial to maintain installation integrity and Integrated Base Defense effectiveness.
SUMMARY
How can we use gaming and technology to recruit and retain the best talent? Develop a means of educating potential recruits about the Armed Forces, its various career fields, and opportunities for serving. Provide recruits with a better understanding of what it means to serve as a future military member.
REQUIRED TECH
Computer, phone, tablet, or VR options
USE-CASES
WHY IT IS IMPORTANT
Every branch of the U.S. military is struggling to meet its fiscal year 2022 recruiting goals. Services must find a way to connect with newer generations and find a way to recruit top talent across a wide variety of skillsets.
SUMMARY
With the establishment of the United States Space Force, the U.S. is interested in creating spaceports (airports for space shuttles). SpacePort Tycoon would be used to map out the installation of facilities and capabilities in a given location (similar to RollerCoaster Tycoon or SimCity). Develop a game that provides an introduction to modeling and simulation for the Space Force and Air Force. Plot vertical and horizontal launch facilities, maintenance bays, viewing sites, servicing infrastructure (restrooms, admissions offices, food vendors), etc.
USE-CASES
WHY IT IS IMPORTANT
Proper prior planning prevents poor performance, safety incidents, and breaches of security.
RESOURCES/LINKS
SUMMARY
A simulated experience for M&S 101 training that immerses the user through engaging gameplay and UX. We have an existing training deck that primarily consists of PowerPoint slides. We are looking for an immersive, more engaging user experience that provides an introduction to modeling and simulation for the Air Force.
REQUIRED TECH
Computer-based or AR/VR options
USE-CASES
WHY IT IS IMPORTANT
A more engaging way to deliver this training content is needed
RESOURCES/LINKS
SUMMARY
Space Launch Delta 45 is currently prototyping digital twins in order to provide first responders virtual blueprints of buildings. We would like to further this capability by leveraging the digital twins and providing a virtual environment for Close Quarters Combat.
REQUIRED TECH
VR HUD/Headset
USE-CASES
WHY IT IS IMPORTANT
It is important to familiarize our forces with high-value target buildings and infrastructure in order to maximize effectiveness in high-stress situations.
RESOURCES/LINKS
SUMMARY
Create a digital twin simulation based on available sensor data. The creation of JADC2 brings sensor data grids into existence; failing to use this data to create simulations would be a wasted opportunity. Digital twins are key to developing the military metaverse. By starting with sensor data and developing functional digital twins that train required functionality, the DoD grows its synthetic training environment and adds value.
REQUIRED TECH
Required to build a digital twin
USE-CASES
WHY IT IS IMPORTANT
The highest-fidelity, most capable simulations come from digital twins. Training built on these simulations will be far easier to sustain, verify, validate, and accredit because it can quickly be compared to the physical counterpart. Changes in the physical device can be replicated quickly to the digital twin via sensor data deltas. All synthetic training environments can be augmented with digital twin technology, so this effort is foundational to future synthetic environments across the DoD. There is currently a large shortage of digital twin developers, so students with this experience will be very marketable.
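The delta-replication idea above can be sketched in a few lines: the twin keeps a state dictionary, applies only the fields that changed on the physical device, and can report where it has drifted from a fresh reading. The field names and values are illustrative assumptions, not a real JADC2 schema.

```python
# Minimal sketch of syncing a digital twin with its physical counterpart
# via sensor-data deltas. Field names/values are illustrative assumptions.
class DigitalTwin:
    def __init__(self, asset_id, initial_state):
        self.asset_id = asset_id
        self.state = dict(initial_state)
        self.history = []  # applied deltas, useful for verification/accreditation

    def apply_delta(self, delta):
        """Apply only the fields that changed on the physical device."""
        self.history.append(delta)
        self.state.update(delta)

    def drift(self, physical_reading):
        """Fields where the twin disagrees with a fresh physical reading."""
        return {
            k: (self.state.get(k), v)
            for k, v in physical_reading.items()
            if self.state.get(k) != v
        }

twin = DigitalTwin("radar-07", {"temp_c": 21.0, "rpm": 1200, "status": "ok"})
twin.apply_delta({"temp_c": 35.5})  # sensor reports a single changed field
print(twin.state["temp_c"])                       # 35.5
print(twin.drift({"temp_c": 35.5, "rpm": 1150}))  # {'rpm': (1200, 1150)}
```

Comparing twin state against live readings is exactly what makes these simulations quick to validate against their physical counterparts.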
RESOURCES/LINKS
DATA SETS
SUMMARY
1. DoD is seeking a mechanism to certify, database, and track Electromagnetic Spectrum Enterprise (EMSE) expertise. (Resource Management)
2. DoD is seeking to test the warfighter's ability to characterize an environment, assign and coordinate resources, and manage those resources in light of an evolving environment. (Test and train)
3. DoD is seeking to integrate joint force exercises, rehearsals, and war games under simulated realistic operational conditions (live, virtual and constructive models) that incorporate all EMS capabilities and challenges. (Wargaming)
USE-CASES
WHY IT IS IMPORTANT
The DoD must ensure all personnel are indoctrinated and trained at the appropriate level on spectrum core concepts. They should understand the spectrum's impact on their capabilities, operations, and plans. Training will be tailored to meet the needs of personnel at each level of the DoD structure, from technicians and requirements personnel to operators and top-level commanders. The DoD will ensure all identified members of the spectrum workforce are appropriately trained and retain relevant EMS skills and expertise.
RESOURCES/LINKS
SUMMARY
Design a gaming tool to train external new hires, internal new hires, and potential DoD/non-DoD users on the basics of the PCTE. These rudimentary principles, including agile principles, ruthless collaboration, and technical and programmatic activities, are the backbone for eventually understanding the complexity of PCTE.
PCTE is a very complex program, and a new person can take months or even years to absorb its underlying principles. With game-based training, we can not only make PCTE training fun but also dramatically shorten the learning curve.
Your mission is to review the 6 areas of rudimentary data shown above, as well as the documentation to support these areas, and design the best game-based solution to make PCTE training FUN.
REQUIRED TECH
The ability to be accessed/used on a government computer.
USE-CASES
WHY IT IS IMPORTANT
The PCTE environment is very complex and can be overwhelming for a new hire or a new user/customer. Automation is key – and it needs to be fun! This will help us quickly educate future PCTE new hires, both internal and external to PEO STRI, as well as upcoming DoD and non-DoD persons who will be part of the PCTE program.
RESOURCES/LINKS
1st Place - $1500
2nd Place - $1000
3rd Place - $500
$1000 each
Challenge: Central Florida Tech Grove Metaverse
Each Bounty is $100
1st Place - $1000
2nd Place - $750
3rd Place - $500
$500 each
Challenge: Central Florida Tech Grove Escape Room
Each Bounty is $100
CAPT. DAN COVELLI - ARMED FORCES JAM INTERVIEW
PAUL SOHL ON GAME JAMS IN OUR COMMUNITY
Be a part of this event. With your help we can discover innovative and sometimes unconventional concepts that can help our Armed Forces, and Orlando is the place to do it!
We’d also love for you to participate in our Innovation & Diversifier Challenges: you provide a set of goals or initiatives, then watch as some of our development teams take them on, building games that incorporate the elements you are interested in. Who knows?