Rear Adm. Danelle Barrett, Navy Cyber Security Division Director
The data-driven Department of the Navy
By CHIPS Magazine - January-March 2018
Rear Admiral Danelle Barrett is the Navy Cyber Security Division Director on the staff of the Deputy Chief of Naval Operations for Information Warfare (OPNAV N2N6). She has more than 28 years of service, including operational tours at U.S. Naval Forces Central Command/U.S. 5th Fleet; Commander, 2nd Fleet; Carrier Strike Group 2; Multi-National Forces Iraq; Carrier Strike Group 12, which included deployments in support of Operations Enduring Freedom in Afghanistan and Unified Response in Haiti; Standing Joint Force Headquarters, U.S. Pacific Command; and as deputy director of current operations at U.S. Cyber Command.

CHIPS asked Admiral Barrett to talk about the Navy’s strategy for data sharing and her responsibilities as the Navy Cyber Security Division Director. The admiral responded to questions in writing in early January.

Q: Data experts say the sweet spot in maximizing an organization’s data is the convergence of high-speed computing, artificial intelligence, machine learning and data analytics — and cloud computing for easier access to data and data sharing. Is the Navy headed in this direction? How will the Navy move those information capabilities to the tactical edge, particularly in the afloat environment?

A: Yes, absolutely. It’s not just one of those technologies; it’s the combination of them that will produce the warfighting effects we want and enable us to extend those effects to the tactical edge. For our afloat and expeditionary forces, there is added complexity when we are disconnected from satellites, and we must continue to improve technology and processes to overcome challenges in that information environment. Capabilities being developed in commercial industry have military application to process, move and secure data so we have confidence in what we see to execute operations.

But remember this is not just about technology; it’s about processes and how we use our data and information. We need to better leverage artificial intelligence and data analytics being developed by commercial industry and academia to discover and present information in the context needed to make quick and accurate warfighting decisions.

We are overwhelmed with data now, data without context in many cases. Our brains need machines to do the heavy lifting and make sense of the data so we can act on it in an agile manner with confidence that the data are unaltered. The human-machine interface piece is equally important — ensuring we have systems that can help us make sense of the data, present information in a user-friendly, secure manner and that those systems learn from user behavior to become predictive.

To get that right it takes a combination of people, processes and technology. It requires operators to understand their requirements for what information they need to make decisions and for futurists/technology experts to understand capabilities that are available or are emerging and to show operators the “art of the possible” in terms of information usage.

In an age of accelerating technology where Moore’s Law is in our rearview mirror, this synergy between operators and futurists will become even more important. The challenge is how we distinguish and recommend technology that is not just a “bright shiny object,” but that brings a real warfighting advantage and adds value in the context of the overall information environment. It’s also important to avoid “possibility-myopia,” which limits innovative thought about how to quickly leverage new technology that could help us. This is where the military and civilians in the Information Warfare community can help articulate the “art of the possible” to operators in terms of capabilities to leverage for warfighting returns.

We don’t want to just pave the cow path by applying new technology to an old process, but look at how to truly change processes to leverage technology for those advantages. The Navy’s Digital Warfare Office was established to start looking at this convergence and has several pilots underway exploring it for broader application across Navy.

To move these capabilities to the tactical edge, we need to fundamentally change how we create and deliver content. We are at the point where we need revolution, not evolution, in those operational and technical architectures. This includes increased standardization and becoming more agile, so we can field capability quickly and transport data more securely and efficiently. Think of how effortlessly you upgrade applications and move content on your smartphone; we want the same for our tactical edge afloat environment.

To get there we have to transform how we construct content delivery, access to data, and presentation, which involves radically changing our existing application models. By moving to a web services environment where micro-web services (similar to “apps” on your smartphone) are the norm, vice highly integrated, legacy applications or web servers ashore that users on ships have to reach back to for two-way transactions, we can provide content and field capability upgrades quickly.
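
As a rough illustration of the micro-web service idea, the sketch below uses only Python’s standard library; the endpoint name and the readiness data are invented for this example, not drawn from any Navy system. The point is that a small, self-contained service answers queries from a locally held copy of the data instead of reaching back ashore for every transaction.

    # A minimal, hypothetical shipboard micro-web service. The endpoint and
    # data are illustrative; a real service would sit behind authentication
    # and serve authoritative, synchronized data.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Locally synchronized copy of an authoritative data source (made up).
    LOCAL_DATA = {"fuel-status": {"tank_1_pct": 87, "tank_2_pct": 64}}

    class MicroService(BaseHTTPRequestHandler):
        def do_GET(self):
            key = self.path.strip("/")
            if key in LOCAL_DATA:
                body = json.dumps(LOCAL_DATA[key]).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)          # answered locally, no reach-back
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), MicroService).serve_forever()

Because each service is this small and independent, one can be upgraded or replaced without touching the others, which is what makes rapid fielding plausible.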

The days of huge monolithic applications that are tightly integrated or tied to specific hardware are in our rearview mirror, and simply fielding “low graphics/low bandwidth” versions of web reach-back services is ineffective; it will never get us to the next level of data superiority afloat. Ideally, in the future, existing applications and web content would be decomposed and redesigned as web services that access authoritative shared data sources. The code and processing power to execute those functions would be on the ships, and data needed afloat would be compressed and synchronized between the ashore and afloat environments, then stored locally in the “cloud” afloat on a shared hardware platform. This would also enable us to send organically created data from afloat sensors and systems back ashore in a structured manner.
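
A hedged sketch of that compress-and-synchronize step, assuming a narrow satellite link: the ashore side publishes a digest of the authoritative copy, and the ship pulls a compressed payload only when its local digest no longer matches. All names here are illustrative.

    # Illustrative ashore/afloat synchronization: compare digests first, and
    # move bulk data only when something actually changed.
    import gzip
    import hashlib

    def digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def needs_sync(local_copy: bytes, ashore_digest: str) -> bool:
        """True only when the ashore copy differs from the local copy."""
        return digest(local_copy) != ashore_digest

    def pack_for_transit(ashore_copy: bytes) -> bytes:
        """Compress the authoritative copy before it crosses the link."""
        return gzip.compress(ashore_copy)

    def store_afloat(payload: bytes) -> bytes:
        """Decompress and hold locally in the afloat 'cloud'."""
        return gzip.decompress(payload)

    ashore = b"track-data " * 1000                  # stand-in authoritative data
    if needs_sync(b"stale local copy", digest(ashore)):
        local = store_afloat(pack_for_transit(ashore))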

Most of the data created afloat are currently not shared back to the larger data cloud ashore and exposed for potential re-use by others, but they could and should be. By standardizing data wherever possible, we gain the ability to apply compression algorithms, use artificial intelligence to help us metatag the data, and create ontologies that provide much-needed context, allowing for machine discovery and improved cybersecurity and data protection. To test this construct for the tactical edge, we have a pilot planned for execution by the end of March 2018 called “Compile to Combat in 24 Hours.” This pilot is a holistic test of the end-to-end technical data and transport architecture, focused on four key elements that will enable an improved information environment for operations afloat (a brief sketch of two of these elements follows the list):

  • Data Standardization using eXtensible Markup Language (XML).
  • Use of shared infrastructure afloat (separation of the data, presentation and “logic” layers and using common Consolidated Afloat Network and Enterprise Services (CANES) hardware to host the web services and data). This aligns with SPAWAR’s goal to have content providers load code on shared shipboard infrastructure, not add more system unique hardware.
  • Automated testing of web service functionality and Risk Management Framework (RMF) requirements, to field micro-web service capabilities quickly.
  • Use of the commercial cloud as the data repository for hosting, efficient XML compression, and synchronization of authoritative data between the ashore and afloat environments. Designing micro-web services to re-use authoritative, open-standards-based data sources will improve the speed and accuracy of decision making.
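
The sketch below illustrates the flavor of elements one and three under stated assumptions: a record standardized as XML, and the kind of automated functional test a pipeline could run before fielding a micro-web service. The track schema, field names, and values are invented for illustration.

    # A standardized XML record and an automated test of the web service
    # function that consumes it. Schema and values are hypothetical.
    import unittest
    import xml.etree.ElementTree as ET

    def parse_track(xml_text: str) -> dict:
        """Parse a standardized XML track record into a dictionary."""
        elem = ET.fromstring(xml_text)
        return {"id": elem.get("id"),
                "lat": float(elem.findtext("lat")),
                "lon": float(elem.findtext("lon"))}

    class TrackServiceTest(unittest.TestCase):
        def test_parse_track(self):
            record = '<track id="T-0042"><lat>36.85</lat><lon>-76.29</lon></track>'
            track = parse_track(record)
            self.assertEqual(track["id"], "T-0042")
            self.assertAlmostEqual(track["lat"], 36.85)

    if __name__ == "__main__":
        unittest.main()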

Q: There are challenges to data optimization, including intellectual property rights, security and legal considerations; in addition, language and organizational hierarchies vary by naval community, making it hard to find and communicate with the right people. Can you talk about some of the ways the Navy is working to break down barriers to data sharing, and how you are addressing data standardization to help with that sharing and security?

A: Probably the hardest barrier to overcome in a more open data environment is the old problem of institutional resistance to data exposure and sharing, where everyone wants to own and control their own data. That is a purely human problem and has nothing to do with technology. It is not that people don’t want to share; I believe they truly do, as they see the possibility of more accurate decision-making and a return on investment from reducing data infrastructure. However, they are concerned that by sharing “their data,” someone could change or affect it in a manner that would make their own use of it more difficult.

In a true web service environment, which the Navy identified over 17 years ago as where we want to be (we have been making progress toward it, albeit too slowly), you identify authoritative sources of data and then expose them for discovery and re-use by others, as opposed to creating and maintaining duplicative data sources. That implies that by exposing your data you would also allow others read/write access to those data, which requires data brokering and negotiation to ensure a “do no harm” approach to sharing. Standardization of data formats is extremely important: the more natively open-standards-compliant your data are, the less you have to re-code applications and web service interfaces every time something changes in the broader data environment.
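
The “do no harm” brokering idea can be sketched simply, under the assumption that reads are open while writes are negotiated per consumer. The class and names below are illustrative, not any actual Navy data service.

    # An authoritative source that is open for discovery and read, with
    # write access brokered per consumer. Purely illustrative.
    class AuthoritativeSource:
        def __init__(self, name: str, data: dict):
            self.name = name
            self._data = data
            self._writers: set = set()      # consumers granted write access

        def grant_write(self, consumer: str) -> None:
            """Record a negotiated agreement allowing this consumer to write."""
            self._writers.add(consumer)

        def read(self, key: str):
            return self._data[key]          # discovery and re-use stay open

        def write(self, consumer: str, key: str, value) -> None:
            if consumer not in self._writers:
                raise PermissionError(f"{consumer} has no brokered write access")
            self._data[key] = value

    source = AuthoritativeSource("ship-positions", {"CVN-77": (36.9, -76.3)})
    source.grant_write("fleet-ops")
    source.write("fleet-ops", "CVN-77", (37.0, -76.2))   # allowed by agreement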

Additionally, there are security concerns with making sure data are properly protected both at rest and in transit, so there is confidence that data consumed have not been changed and are reliable. Standardizing data on XML, a leading industry standard, for example, will allow improved data protection using Security Assertion Markup Language (SAML) down to the data element level. This will become even more important with machine-to-machine interactions. The Navy has made significant improvements in defense-in-depth across its networks to better protect our data at all levels, and those efforts continue as we look to expand partnerships with commercial industry for use of the commercial cloud to host Navy data.
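
The SAML machinery itself is too large for a short example, but the underlying idea of protection at the data element level can be sketched with a keyed digest over a single XML element: a consumer holding the shared key can verify that this particular element, not just the whole document, is unaltered. The key and element below are made up.

    # Element-level integrity sketch using an HMAC; real deployments would
    # use XML canonicalization and a standards-based assertion instead.
    import hashlib
    import hmac
    import xml.etree.ElementTree as ET

    SHARED_KEY = b"illustrative-key-only"

    def sign_element(elem: ET.Element) -> str:
        serialized = ET.tostring(elem)      # stand-in for canonicalization
        return hmac.new(SHARED_KEY, serialized, hashlib.sha256).hexdigest()

    def verify_element(elem: ET.Element, signature: str) -> bool:
        return hmac.compare_digest(sign_element(elem), signature)

    elem = ET.fromstring("<position><lat>36.85</lat></position>")
    tag = sign_element(elem)
    print(verify_element(elem, tag))        # True until the element is altered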

By using the commercial cloud, protection of data will be a shared responsibility between the government and the service provider. It will also be an opportunity for the Navy to take advantage of the speed at which cybersecurity protection and analytics capabilities can be added in the commercial cloud environment. There have also been successful efforts by the Navy’s Functional Area Managers to rationalize Navy databases and applications by eliminating duplication and identifying authoritative data sources for others to use. Those efforts will continue as migration to the commercial cloud progresses and will help us further flush out and eliminate duplication.

Q: Some data experts say the biggest stumbling block to data sharing is the attitude of data ownership/protection, which limits access even when policy allows. How do you convince people that data sharing is the right thing for the Navy?

A: As noted earlier, I think this is the biggest barrier, and it’s a cultural/human problem, not a technology problem. We can, however, use technology to help us overcome that legacy mindset of mistrust and convince data owners that their fears are unfounded. First and foremost, people need to see that sharing will actually result in better decision-making and improved operations, and secondly, that there will be cost-saving efficiencies. If they see that we can expose and share authoritative data without compromising their reliability, integrity, or availability, then they will be more inclined to move to that web services data environment where authoritative data are easily discovered, shared and re-used. The best way to provide that reassurance is to demonstrate how it can work effectively, through pilots like those the Digital Warfare Office is running and the “Compile to Combat in 24 Hours” web service afloat pilot being run by SPAWAR.

Q: Navy workforce capability in areas like data science, machine learning, and artificial intelligence exists on a small scale. Should the Navy commit to building a community of experts with these skills?

A: Yes, and the military and civilians of the Information Warfare Community will be at the forefront, with others having roles as well, particularly where we have system experts in our non-traditional IT systems (Hull, Mechanical and Electrical (HM&E) systems, Combat Systems, Industrial Control Systems (ICS), etc.). The focus will not just be on the technology but on how this cadre, with expertise in operations, will be agile, adaptive thinkers and enablers of change in an age of rapidly accelerating technology. It is as much about how to be a change agent in this dynamic information environment as it is about understanding the technology itself. How do we get operators and others to understand and embrace the “art of the possible” with the technology for a warfighting advantage? This means we need to inculcate this kind of agile and adaptive thinking into the training we do in all our warfighting and warfighting support domains.

This extends beyond IT and business systems. The areas you mention are as much needed in HM&E, ICS and Combat Systems as they are in IT and business systems, so we need levels of expertise in the communities of personnel who design, build and operate those systems. As you note, some skillsets will need more focus in the future, and for our military personnel it is important that we combine those skills with operational tours that maintain their expertise and relevance. We will face challenges not just in developing these experts but in retaining them, as commercial industry will compete heavily for the same talent; however, I think the nature of the work we do and service to the nation will continue to be driving factors keeping great people on our team.

Q: Can you talk about the Navy organizations involved in working to build the Navy’s data strategy and infrastructure for data optimization?

A: There has been work done on Navy data strategy for years. Good work done by the DON CIO 15 years ago on data standardization, and by other joint and interagency organizations, can be leveraged, as some of it is still very much germane. Any Navy data strategy needs to ensure alignment across our enterprise. It should be driven by overarching tenets (i.e., commitment to open standards, requirements for data exposure and sharing, etc.) and supported by an operational-view architecture of how the strategy will be executed for improved operations, defining the warfighting effects we want to realize in the future information environment.

Defining our outcome, the information environment we want to achieve, will then drive the standards we employ; the identification of the trained and skilled manpower needed to operate and sustain it; the systems we build, buy and field; and the processes and policies we put in place to operate in that information environment. As experts in technology, our propensity is to go straight to talking about technology and systems, because that is the easier part to define. But we need to start by identifying the information environment of the future, that higher-level view of the information needed for improved warfighting effects in all domains, both offensive and defensive. Development of the Navy data strategy has to include operations personnel, operations support personnel, and joint/coalition/interagency partners (particularly our naval brethren in the Marine Corps) to ensure alignment of our strategy with the broader DoD and U.S. government plan. Getting that data strategy right is an all-hands effort across the enterprise.

The infrastructure piece is about how we execute that strategy across a broad and complex enterprise, both afloat and ashore. We face unique challenges in the afloat/expeditionary environment, and the “Compile to Combat in 24 Hours” and other DWO pilots will hopefully help inform where we need to make changes to bring us to that next level of information in warfare, and as warfare.

Q: As the Navy Cyber Security Division Director on the OPNAV staff, what are some of your priorities in relation to the Navy’s data strategy?

A: My priorities are to ensure the Navy develops and implements a strategy that moves us forward, not one looking in the rearview mirror or making incremental “tweaks” to what we have today. We are at a point of accelerated technology change where tweaks won’t get us to the next level anymore; we need revolutionary change, not evolutionary change. This means we need to take advantage of capabilities, processes and skills being employed in commercial industry much faster than we do today, without a lot of customization to fit some old process that may no longer be the best way forward.

We have to be willing to let go of old processes that just don’t allow us to move forward at the pace today’s data environment demands. An industrial-age mindset won’t work in today’s environment, and speed is key to achieving a competitive advantage. The Navy’s strategy, along with the policies put in place and the resource decisions we make, must support implementation of a flexible and resilient information warfare platform. The strategy should be based on the agile, secure and reliable information environment we need, one where we can quickly apply new technologies and capabilities as they become available. The strategy also needs to consider not just how we leverage those capabilities, but how we deny or disrupt our adversaries’ use of them against us, for a competitive warfighting advantage.

Q: You have talked about the steps the Navy is taking to protect both traditional and non-traditional information technology. Can you describe what non-traditional IT is and how the Navy is protecting it?

A: We have many systems on ships and ashore that are considered Industrial Control Systems, along with Combat Systems, that in the past were not given the same cybersecurity consideration as traditional IT and business systems. Many of these systems rely on old technology and operating systems, and they present different sets of challenges when it comes to cybersecurity; they also have their own vulnerabilities which, if exploited by an adversary, could severely disrupt operations.

In the past several years we have implemented programs like CYBERSAFE to identify and implement measures to better protect those non-IT types of systems. Significant resources, several hundred million dollars, have been spent to address cybersecurity of these types of systems. This has been an all hands effort with support from the systems commands and operational forces across the Navy.

We will never have an impenetrable network, and it’s a fool’s errand to even try, so our focus has been on cyber resiliency and being able to “fight through the hurt” on both our IT and non-IT systems. This means looking at the architecture and our cybersecurity processes holistically and implementing defense-in-depth measures across the enterprise. It also entails understanding our networks: how they are connected, how they are monitored, and what “normal” looks like, so that when there is a deviation, we can quickly react and restore capabilities.
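
The “know what normal is” idea reduces, at its simplest, to learning a baseline for a monitored metric and flagging sharp deviations. Real network monitoring is far richer; the metric, numbers, and threshold below are invented for illustration.

    # A toy baseline-and-deviation check for one monitored metric.
    import statistics

    def build_baseline(history):
        """Mean and standard deviation of an observed metric, e.g. logins/hour."""
        return statistics.mean(history), statistics.stdev(history)

    def is_anomalous(value, mean, stdev, k=3.0):
        """Flag values more than k standard deviations from the baseline."""
        return abs(value - mean) > k * stdev

    history = [102, 98, 110, 95, 105, 99, 101, 97]   # normal hourly counts
    mean, stdev = build_baseline(history)
    print(is_anomalous(240, mean, stdev))            # True: react and restore

Detection is only half of resiliency; the same monitoring that flags the deviation has to feed the procedures for reacting and restoring capability quickly.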
