The Cloud and its Implications to Naval Warfare
By George F. Hurlburt - July-September 2010
STEMMING THE TIDE OF RUNAWAY IT

The realities of the past decades, fueled by Moore's law of exponentially advancing computing power, affirmed the need to stem the tide of runaway information technology (IT) within the Department of the Navy (DON). In essence, the DON's networks, hardware and software had to be brought under tight centralized governance to standardize the computing environment and ensure security. This very real need gave rise to innovations such as the Navy Marine Corps Intranet (NMCI) and the Functional Area Management (FAM) program to establish the Navy's application and hardware baselines. Insofar as these initiatives defined the DON's IT infrastructure and served to bridge the warfighter and the shore establishment, they served the department well.

These initiatives defined the realms of acceptable application support, provided the required network elements culminating in a set of consolidated Network Operations Centers, defined acceptable server and storage management principles, established a homogeneous database environment, and made provisions for security measures in accordance with the Federal Information Security Management Act (FISMA). They also paved the way for integrated procedures to support the management of the DON's overall configuration, providing a centralized approach to the distribution of IT resources that extends from localized server farms, through the network, to desktop computers, complete with a comprehensive technology refreshment and service desk approach.

The one commodity to escape scrutiny, however, was information itself, which became a byproduct of consolidating an infrastructure characterized largely by its physical environment and connectivity. The information resource took a necessary backseat to the requisite infrastructure management approach.

THE VIRTUAL "CLOUD" ENVIRONMENT

The latest trend is the virtualization of servers, databases and applications for access by remote users via network assets. This virtualization represents the next logical step in the ever increasing efficiency required to support an enterprise as large and far ranging as the Department of the Navy. The virtual environment, commonly known as the cloud, changes the governance paradigm significantly: it represents a consolidated approach to distribution through virtualization. In such an environment, however, the term "distribution" can no longer apply solely to infrastructure, because the real distributed commodity clearly becomes information.

Virtualization supports shared services, which can significantly ease the security and system management burden through agile and improved federation, effective user management, and efficient access control through direct authentication, authorization and instant provisioning of both services and information upon demand. Moreover, this can all be done at a scale heretofore only imagined.
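
To make the shared-services idea concrete, the sketch below models it in a few lines of Python: a single broker checks a user's role once and provisions a service instance on demand. The broker, roles and service names are hypothetical illustrations, not DON systems.

```python
# A minimal sketch of shared services, not a DON system: one broker
# authorizes a user against a catalog and provisions a service instance
# only on demand. All names (SharedServiceBroker, the roles, the
# services) are hypothetical.

from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    roles: set = field(default_factory=set)

class SharedServiceBroker:
    def __init__(self):
        self._catalog = {}   # service name -> role required to use it
        self._live = {}      # (user, service) -> provisioned instance

    def register(self, service: str, required_role: str):
        self._catalog[service] = required_role

    def request(self, user: User, service: str):
        # Authorization: one central policy check replaces per-system logins.
        if self._catalog.get(service) not in user.roles:
            raise PermissionError(f"{user.name} lacks access to {service}")
        # Instant provisioning: the instance exists only once it is asked for.
        return self._live.setdefault((user.name, service),
                                     f"instance of {service} for {user.name}")

broker = SharedServiceBroker()
broker.register("logistics-db", "analyst")
print(broker.request(User("ltjg.smith", {"analyst"}), "logistics-db"))
```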

CHARACTERISTICS OF THE CLOUD

The cloud promises to deliver a dynamic intranet-optimized infrastructure for hosting all manner of applications at the edge. A number of vendors are already marketing cloud solutions involving platforms, storage, remote hosting, load balancing and other service-based features. The nature of these offerings reinforces the salient point that, for the first time, information and information-based resources may logically be billed on a per-usage basis.
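
A minimal sketch of what per-usage billing of information services might look like appears below; the rates and service names are invented for illustration, and real cloud pricing schedules are far more elaborate.

```python
# A toy usage meter: consumption of information services, not hardware,
# drives the bill. RATES and the service names are invented assumptions.

from collections import defaultdict

RATES = {"storage_gb_hour": 0.002, "query": 0.0001}   # hypothetical rates

usage = defaultdict(float)

def meter(service: str, quantity: float):
    """Record consumption against a metered service."""
    usage[service] += quantity

meter("query", 15_000)          # 15,000 catalog queries
meter("storage_gb_hour", 720)   # 1 GB held for 30 days

bill = sum(RATES[s] * q for s, q in usage.items())
print(f"monthly charge: ${bill:.2f}")
```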

The cloud, however, poses a different form of threat from a security perspective. No longer can the physical network, hardware and software infrastructure be viewed as the focal point for exploitation of vulnerabilities. Rather, the policies and enterprise precepts governing the infrastructure come into sharp focus, because all applications are now consolidated and the role of management policy becomes acute.

As external events occur, new threats mature, the environment changes and the technology itself continues to evolve, the way the virtual community operates will migrate accordingly. Policy and procedures must evolve with it. This migration is dynamic over time as different environmental vectors come and go. This important principle extends to the management of information and its use as a means of regulating "need to know."

WEB SERVICES VERSUS SOA

Services will also migrate over time. It is important to note that there is a distinct difference between the concept of "Web services" and the notion of a service oriented architecture (SOA). SOA is an enterprise infrastructure enabler, while Web services deal with service calls over the Internet. These lines will likely blur in the coming years. One attribute that both forms of service share is that they are liable to proliferate. As they do so, the cataloging and discovery of dynamically composable services will become an increasingly difficult problem. This discovery issue transcends existing metadata markup standards.
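
To illustrate why discovery is hard, the sketch below implements the naive baseline that existing markup standards roughly correspond to: a registry matched purely on metadata tags. The services and tags are invented; keyword overlap alone cannot capture meaning, which is the gap the semantic techniques discussed next aim to close.

```python
# A toy service registry matched on metadata tags alone. Service names
# and tags are hypothetical; the point is that tag overlap is a crude
# proxy for what a service actually does.

registry = [
    {"name": "TrackFusion", "tags": {"sensor", "fusion", "track"}},
    {"name": "WxForecast",  "tags": {"weather", "forecast"}},
    {"name": "GeoPlot",     "tags": {"map", "plot", "track"}},
]

def discover(wanted: set):
    """Rank registered services by tag overlap with the request."""
    scored = [(len(s["tags"] & wanted), s["name"]) for s in registry]
    return [name for score, name in sorted(scored, reverse=True) if score]

print(discover({"track", "plot"}))   # ['GeoPlot', 'TrackFusion']
```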

This dilemma can be addressed using semantic techniques. Such techniques treat service-based semantics as a small world phenomenon: words are reduced to valueless tokens, and the resulting matrix defines the association of any given token to every other token.

These relationships among tokens determine which tokens link as hubs to other tokens to build meaningful context. Through these self-generating relationships, it is feasible to create a living ontology, complete with self-generating metadata. Such an ontology permits SOA discovery where services must be allocated among various functional systems, and this kind of dynamic SOA composition will become common in the cloud.
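
The sketch below shows these small world mechanics in miniature, using invented service descriptions: words become valueless tokens, a co-occurrence structure links each token to every token it appears alongside, and the most connected tokens surface as the hubs that could seed such a living ontology.

```python
# A minimal sketch of token co-occurrence and hub detection. The three
# service descriptions are invented; a real corpus would be far larger.

from collections import defaultdict
from itertools import combinations

descriptions = [
    "track fusion service for radar sensor data",
    "radar sensor calibration service",
    "weather data feed for track prediction",
]

links = defaultdict(set)   # token -> set of tokens it co-occurs with
for text in descriptions:
    for a, b in combinations(set(text.split()), 2):
        links[a].add(b)
        links[b].add(a)

# Hubs are the most-connected tokens; they anchor the emergent context.
hubs = sorted(links, key=lambda t: len(links[t]), reverse=True)[:3]
print(hubs)
```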

CLOUD DYNAMICS

Not all clouds are warm and fuzzy. The cloud requires a new form of governance to maintain a benign cloud-like operation; otherwise, it could become the perfect storm.

Virtualization can be orderly, cost effective and manageable, but it is far more than mere hardware consolidation. It must be capable of dynamic resource allocation and continual reprioritization.

As the cloud pushes capability to the edge, the definition of the edge itself is changing dramatically. Open source software is nearly normative for Web-based operating systems, servers and applications. Edge devices become smaller and more powerful by the day. As the edge pushes outward, new nontraditional players will become part of the mainstream. This, in turn, will initiate new forms of working relationships, giving whole new meanings to net-centricity. People will definitely be in the loop and have influence over what would otherwise be a software-driven battlefield lacking human safeguards. The natural tensions these elements create will call further attention to murky legal issues; service and application ownership will also come under close scrutiny.

Composable services and advanced application program interfaces (APIs) will allow today's staid applications to be treated as dynamic mash-ups. Data portability will become far more widespread, and the phenomenon of open social networking will continue to promote individual skills and abilities.
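
A minimal sketch of a mash-up follows, with both service calls stubbed out as hypothetical functions standing in for real APIs: two independent feeds are composed into a view that neither service offers on its own.

```python
# A toy mash-up: two unrelated services joined through their APIs.
# Both functions are stand-ins for hypothetical remote endpoints, and
# the data is invented for illustration.

def ship_positions():
    # Stand-in for a hypothetical position-reporting service call.
    return [{"ship": "DDG-51", "lat": 36.9, "lon": -76.0}]

def weather_at(lat: float, lon: float):
    # Stand-in for a second, independent weather service call.
    return {"wind_kts": 18, "sea_state": 3}

# The mash-up: a combined picture neither service provides alone.
for p in ship_positions():
    wx = weather_at(p["lat"], p["lon"])
    print(f"{p['ship']}: sea state {wx['sea_state']}, wind {wx['wind_kts']} kts")
```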

The notion of open source software is here to stay, but it is being absorbed into the mainstream in interesting ways. The old juxtaposition of the cathedral (monolithic corporations) and the bazaar (independent, sharing artisans) is giving way to a hybrid model that serves to indemnify quasi-open software in ways that were impossible until recently.

INTERNET USAGE TRENDS AND THE CLOUD

Usage trends show that individuals of all age groups are involved with the Internet to varying degrees. E-mail has become essential as a universal tool. Many activities, such as gaming, instant messaging, social networking, online profiles and blogging, tend to cluster among younger age groups. These trends suggest that such emergent behaviors will be second nature for users over the next 20 years. Clearly, they are well suited to cloud computing and offer both danger and a rare opportunity: managing the enterprise to optimize for users' interests and innate skills. Moreover, the use of social networks for professional purposes is growing significantly, according to recent surveys. This suggests that individuals will become part of the fabric of the cloud, and relationships will be generated by affinity.

These trends strongly suggest that new reputation-based mechanisms will come into play as a means of conferring trust among users. This peer-based approval approach brings the field of human interoperability enterprise (HIE) into the equation of network and security management, making the field multidisciplinary and expanding it beyond its traditional roots in infrastructure engineering.
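
One toy version of such a reputation mechanism is sketched below: trust is conferred by endorsements from already trusted peers, with the 0.5 decay factor and the recursion depth chosen purely for illustration.

```python
# A minimal peer-based reputation sketch. The decay factor (0.5), the
# depth limit and the user names are illustrative assumptions only.

from collections import defaultdict

endorsements = defaultdict(list)   # user -> list of (endorser, weight)

def endorse(endorser: str, user: str, weight: float = 1.0):
    endorsements[user].append((endorser, weight))

def reputation(user: str, depth: int = 2) -> float:
    """Direct endorsements plus a discounted share of each endorser's own score."""
    if depth == 0:
        return 0.0
    return sum(w + 0.5 * reputation(e, depth - 1)
               for e, w in endorsements[user])

endorse("chief.jones", "seaman.doe")
endorse("lt.brown", "seaman.doe")
endorse("capt.reed", "chief.jones", 2.0)

print(reputation("seaman.doe"))    # 3.0: two direct endorsements plus a discounted chain
```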

INTERNET USAGE PATTERNS

Both the social network and Internet phenomena reveal that usage and human behavior tend to operate in a seemingly random fashion that actually reveals patterns when viewed holistically. These patterns reveal linkages between participants, between communities, within an enterprise, across networks and among the edges. Similar patterns can appear among data, and certainly within the semantics of the written and spoken word. A sophisticated security algorithm could be developed to detect subtle nuances in these patterns, identifying suspicious activity and signaling potential state changes that require management awareness. This form of digital forensics is gaining acceptance.
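
A deliberately simple version of such a detection algorithm is sketched below: a user's activity today is compared against that user's own history with a z-score, flagging deviations beyond an assumed three-sigma threshold. A fielded algorithm would be far more sophisticated.

```python
# A toy pattern-based detector: flag activity that deviates sharply
# from a user's own history. The 3-sigma threshold and the sample
# counts are assumptions, not a fielded security algorithm.

import statistics

def is_anomalous(history: list[float], today: float, threshold: float = 3.0) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0   # guard against zero variance
    return abs(today - mean) / stdev > threshold

logins = [12, 9, 11, 10, 13, 12, 11]   # typical daily access counts
print(is_anomalous(logins, 11))        # False: within normal variation
print(is_anomalous(logins, 85))        # True: signal for management awareness
```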

UNSTRUCTURED DATA

This observation becomes more important as data continues to become far less pristine. Fueled by new computational immersion through Web 2.0 and similar collaborative applications, the amount of unstructured data in any given enterprise will surpass the amount of structured data within this decade, according to International Data Corporation. This suggests that social networking is here to stay, but it also tends to diminish the role and importance of the traditional database management system.

Ultimately, the prevalence of unstructured data calls for advanced visualization techniques to make sense of mountains of data that can best be resolved visually. This also includes the growing realm of sensor fusion, particularly as sensors of various bandwidths, types and acuity continue to proliferate at the edges. Mash-up APIs will help facilitate this important visual push aspect of cloud computing.
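
As a small illustration of sensor fusion feeding such a visual picture, the sketch below combines bearing estimates of differing acuity by inverse-variance weighting; the sensors and their variances are invented for the example.

```python
# A minimal sensor-fusion sketch: estimates of different acuity are
# combined by inverse-variance weighting, so the sharper sensor counts
# more. Sensor names, bearings and variances are invented.

readings = [
    ("radar", 102.0, 4.0),   # (sensor, bearing estimate, variance)
    ("eo/ir", 100.5, 1.0),
    ("sonar", 104.0, 9.0),
]

# Weight each estimate by 1/variance: trust the lower-variance sensor more.
weights = [1.0 / var for _, _, var in readings]
fused = sum(w * est for w, (_, est, _) in zip(weights, readings)) / sum(weights)

print(f"fused bearing: {fused:.1f} deg")   # dominated by the eo/ir estimate
```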

KNOWLEDGE MANAGEMENT AND THE CLOUD ENVIRONMENT

All of the foregoing observations serve to predict the coming state of knowledge management (KM). KM can no longer be viewed in isolation in the cloud environment. Knowledge may be defined as information applied in the context of the user. Unlike information, however, knowledge is not a specific commodity; rather, it is instantly synthesized from vast and growing reservoirs of semi-structured information. Of necessity, KM entails applied human resources management practices if the organization is to optimize its workforce in a highly competitive environment.

Clearly, providing the appropriate context for heightened situational awareness involves information management and the application of IT and communications systems. The cloud becomes the information broker that enables contextually sensitive KM, driven by natural affinities among users.

As human beings become increasingly integrated into the computational environment, the user becomes a vital component in the infrastructure. As users deal in the commodity of information to build contextually on their individual and organizational knowledge, the mandate to manage information overtakes the need to control the infrastructure necessary to provide rote services to users.

MANAGING AN INTEGRATED INFRASTRUCTURE

Given the cloud as a backdrop, the "Navy after Next" represents a significant departure from the state of affairs in IT as it is currently known. The challenge will no longer be the management and governance of the IT infrastructure alone, but rather the management of an integrated infrastructure composed of all the elements of the Department of the Navy.

If the net-centric precept of power to the edge is to be brought to fruition, the decision making process must extend to the edge, with the requisite knowledge applied when and where it is needed. This means that IT must be part of the mainstream and not a separate entity to be managed as an exception. This changes the notion of systems engineering to encompass a more holistic, multidisciplinary approach extending to the human interoperability enterprise. To accomplish this end, the science and technology community has much to do to help develop and articulate the requisite integrated capabilities effectively.

George F. Hurlburt supports the enterprise architect of Naval Air Systems Command. He has an abiding interest in applied complexity analysis, particularly latent semantic analysis. He retired after 38 years as a Navy civilian, during which he pioneered collaborative network architectures for the Department of Defense test and evaluation community. For IT news and policy information, go to the Department of the Navy Chief Information Officer website.
