NAVAIR issued a challenge to develop a visualization tool and algorithms to determine which aircraft parts need immediate attention and which should be added to a watch list. The inaugural NAVAIR Data Challenge accomplished four parallel goals; however, only the first two were planned:
- Use data to address an important NAVAIR issue – readiness;
- Understand the current data state and data science expertise within the command;
- Validate the NAVAIR Data Strategy; and
- Prove the challenge construct in solving complex NAVAIR problems.
PROBLEM: NAVAIR has vast amounts of readiness data but has struggled to translate that data into usable information that accelerates decision making. When the challenge began, Naval Aviation Type/Model/Series (TMS) aircraft were below their Ready Basic Aircraft (RBA) goals. Too many aircraft were not mission-capable because parts were unavailable. This problem became the foundation for the challenge question below.
“Using historic data (e.g., parts reliability, aircraft usage rates, repair rates, and sparing data), develop a visualization tool and algorithms to determine which parts need immediate attention and which should be added to a watch list.”
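The challenge data and team solutions are not public, so as a purely hypothetical illustration of the kind of triage logic the question calls for, a minimal sketch might score each part by combining failure rate, repair turnaround, and spares posture, then bucket parts into "immediate attention" and "watch list" groups. All part fields, names, and thresholds below are invented for illustration, not drawn from any NAVAIR solution.

```python
# Hypothetical sketch only: fields, names, and thresholds are invented,
# not taken from the NAVAIR Data Challenge data or any team's solution.
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    failure_rate: float   # failures per 1,000 flight hours
    repair_days: float    # average repair turnaround time
    spares_on_hand: int   # spare units currently available
    monthly_demand: int   # units consumed per month

def readiness_score(p: Part) -> float:
    """Higher score = bigger readiness risk (illustrative heuristic)."""
    months_of_supply = p.spares_on_hand / max(p.monthly_demand, 1)
    # Frequent failures with long repairs and thin spares drive the score up.
    return p.failure_rate * p.repair_days / max(months_of_supply, 0.1)

def triage(parts, immediate_threshold=50.0, watch_threshold=10.0):
    """Split parts into 'immediate attention' and 'watch list' buckets."""
    immediate, watch = [], []
    for p in sorted(parts, key=readiness_score, reverse=True):
        score = readiness_score(p)
        if score >= immediate_threshold:
            immediate.append(p.name)
        elif score >= watch_threshold:
            watch.append(p.name)
    return immediate, watch
```

A real solution would of course replace this heuristic with models fitted to the eight years of maintenance data the teams received, and pair the ranking with a visualization layer.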
At the time, NAVAIR was uncertain of the depth and breadth of its data science expertise. The ability to acquire, analyze, and enhance information, and to effectively prepare it for decision-making, presented opportunities to characterize NAVAIR’s current state of skills, technology and infrastructure, all key areas for the NAVAIR Data Strategy. Whether the infrastructure could sufficiently support data science needs in a Navy Marine Corps Intranet (NMCI) environment was also a concern.
ACTIONS: NAVAIR’s Integrated Business Capabilities (IBC) Enterprise Team (ET) developed the Data Challenge construct and obtained flag support. The initial goal was four to six teams participating over six months, culminating in a winner and a summit to support collaborative learning. Minimal rules and guidance encouraged innovation. Teams received eight years of aircraft maintenance data for all type, model, and series aircraft. Two checkpoints prior to the final submission were planned to assess team progress.
MAGNITUDE: Response to the challenge was expected to yield four to six teams and around 50 participants. Instead, 178 participants across nine functional areas, representing all NAVAIR sites, voluntarily formed 33 teams to tackle the Data Challenge. All aspects of the challenge had to pivot to support the overwhelming response.
OUTCOMES: Key outcomes, beyond selection of a winner, are listed below:
• Delivered Solutions – Five finalists delivered solutions at a summit, which provided a venue to cross-feed knowledge, select a winner, and gain external perspectives. These solutions provided analysts and leaders with the trends and magnitude of various readiness degraders affecting a platform or system, enabling informed decisions. Insights from all teams will improve data validation methods and enhance future readiness analytics and tools, helping to identify and manage readiness and cost drivers.
• Demonstrated the Power of Data – Demonstrated the ability of NAVAIR teammates to use information to understand causes, highlight areas requiring action, and depict the information. Collectively, participants voluntarily invested over 7,000 hours in improving aircraft readiness.
• Validated NAVAIR Data Strategy – Conducting the NAVAIR Data Challenge tested objectives contained in the strategy. Such events are changing the way the DON discusses data so its value is recognized, just as the DON values its people, products and technology.
Firsts and experiments associated with NAVAIR’s Data Challenge are highlighted below:
• Quick Wins – At Checkpoint Two, challenge administrators gathered potential enhancements to tooling, data quality improvement recommendations, and data rules. One team provided a data cleansing template to address gaps and invalid entries. Leadership received and appreciated these innovations even as the challenge continued. The developed approaches have potential broad application across the command. The five finalist teams demonstrated their solutions to leadership, and implementation plans are underway. Four teams now use their solutions in support of program tasking. One team uses component data to help a program office resolve readiness issues, while another team provided critical parts distribution information to address a specific readiness degrader.
• Self-Forming and Self-Directed Teams – Team composition crossed competencies and site boundaries. As needs emerged, teams brought in expertise and course-corrected based on the issues they encountered. The challenge construct focused on several clear goals and outcome measures. Communication was a central theme, with learning captured and shared during the Challenge and barriers eliminated once identified. At Checkpoint Two, it was incredibly difficult to down-select from 19 teams to five finalists to align with the Naval Postgraduate School (NPS) recommended summit size.
• Accelerated Learning – The Challenge actively encouraged participants to recognize gaps in customer expectations, solve problems, and share lessons learned so successful approaches could be quickly and broadly replicated. Participants expanded knowledge beyond current tasking while others transferred acquired knowledge to their regular assignments.
• Enabling Collaboration – Challenge leaders established a series of web applications and pages as the primary method of communication and coordination. Pages were visual, dynamic and updated several times weekly. They contained Subject Matter Expert contact information, team submissions, blogs, and discussion board functions.
• Brown Bag Sessions – Administrators launched one-hour tutorials on data science topics to help participants navigate the data science process. The Brown Bags have continued post-Challenge, with topics already booked for the coming months and growing participation rates.
• Grow Data Science Community – The Challenge site was used to grow NAVAIR’s data science community. The community is larger than just Challenge participants; it started with 297 members and has grown to nearly 600 problem solvers in the first ten months.
• External Partners – Participants fostered relationships beyond NAVAIR to develop a diverse understanding of data science as it applies to problem solving. The Deputy Undersecretary of the Navy for Management, Director of Strategy and Innovation served as the summit’s keynote speaker. NPS provided summit backdrop and contributed data science expertise. Industry supported NAVAIR efforts via brown bag sessions and summit presentations.
• Data Science Tools – Administrators incorporated data science technology during the Challenge to better support participants’ data science needs. It was identified, obtained, implemented, and made available to the teams within eight weeks of need identification.
• Commander’s Award – Based on the success of the inaugural Data Challenge, the IBC ET received a NAVAIR National Commander’s Award for Business Innovation.
The NAVAIR Data Challenge demonstrated the power of data and collaboration when it comes to solving real-world issues. Events like the NAVAIR Data Challenge are successfully changing the way the DON both examines and values data.
NAVAIR Data Challenge team members were recognized with an honorable mention for the 2016 Secretary of the Navy (SECNAV) Innovation Award in the Data Analytics category.
The SECNAV Innovation Awards recognize the top innovators within the Department of the Navy (DON). Their accomplishments are remarkable and serve as inspiration for the Navy and Marine Corps to think boldly and solve the fleet and force’s most challenging problems.
Join DON Innovation on https://www.facebook.com/NavalInnovation or @DON_Innovation or visit the DON Innovation website at http://www.secnav.navy.mil/innovation/Pages/Home.aspx. Email DON Innovation: DON_Innovation@navy.mil