An R9X Hellfire missile can strike one individual in a car without hurting the other passengers. Digital technology enables such precision—and will only become more advanced with 5G networks powering artificial intelligence. Mike Tsukamoto/staff; USAF; Sisar1/Twitter; Google Earth

5G Netcentricity

The coming revolution in networking will redefine the concept of sensor fusion.

The U.S. government now has the ability to deliver munitions so precisely that it is using explosive-free Hellfire missiles fitted with blades to kill a specific passenger inside a vehicle without collateral damage. As the physical means of killing enemies becomes ever more automated, the identification, tracking, and targeting of those individuals become the critical components of the kill chain.

Prompted by the digital revolution and corresponding advances in data transfer, manipulation, and storage, information has taken on increased value in warfare. With the advent of Network Centric Warfare (NCW) in the 1990s, the U.S. military sought to apply commercial digital information sharing to warfare. NCW became the cornerstone of military information infrastructure development.

Today, information warfare dominates discussions of military operations and technology acquisition. Just as in the 1990s, civilian technological developments signal new uses in information warfare. Fifth-generation (5G) cellular network technology stands to revolutionize battlespace sensing and the way militaries approach data. The rapid growth of networking technologies is driven by small, powerful microelectronics; automated data manipulation and artificial intelligence; and advanced wireless connectivity. These technologies will converge in 5G networking to enable future capabilities and point toward a new theory of information dominance.

Network-Centric Warfare

Today’s networking technologies are evolving quickly, moving more data, more rapidly, than ever before. The emergence of 5G networks makes it possible to add far more sensors to our networks and will change the very nature of civilian communications. Likewise, 5G will present new opportunities to change how we conduct military intelligence and targeting.

5G does not represent a single technology, but a confluence of them. Its cellular antennas are smaller and consume less energy than those of older generations, and their ability to exploit beamforming allows them to send signals only where necessary. 5G’s massive multiple-input, multiple-output (massive MIMO) architecture will support far more traffic than existing technologies. Just as these capabilities support an “Internet of Everything” in the civilian world, they can likewise support an “Internet of Military Things” to enable the “combat cloud.” The network will have lower power requirements, lower latency (that is, minimal signal delay), and greater data capacity.

5G technology will enable the network-centric future envisioned years ago, where each person, vehicle, drone, or other system is interconnected, sharing sensor data among systems. Additionally, the reduced size and cost of 5G sensors could increase the number of simple sensors across the battlespace. Aerial assets could drop these small, low-power sensors into a conflict zone. 

With thousands of sensors working in concert, combat sensor networks could collect data at a scale rivaling Google, Facebook, and Amazon, providing an ever clearer picture of the battlespace.

Instead of searching for the glint of a needle in the haystack, analysts will soon more rapidly remove the hay.

From the earliest days of the internet, it has been clear that the rapid connectivity and information flow of digital communications would revolutionize military operations. Early advocates cited Metcalfe’s Law to illustrate their point: The value of a network grows with the square of the number of interconnected nodes. The law’s shortcomings are immediately obvious. First, some nodes are inherently more valuable than others, especially those that provide greater information. For example, in a missile early warning network composed of land- and ship-based radars, the ship-based radar is often weaker and may be of less value in terms of total detection capability. Additionally, the smaller the network, the more valuable the addition of another node is to its overall power. In very large networks, like the interconnected users on Facebook, the addition of one more user does not meaningfully increase the value of the platform. Yet Metcalfe’s Law remains germane to network discussions precisely because of its simplicity. It is still used today to discuss large interconnected environments, such as those involving Bitcoin.
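
A minimal back-of-the-envelope sketch (in Python, with purely notional numbers) illustrates the point about very large networks: under Metcalfe’s pairwise-connection valuation, the gain from one additional node becomes a vanishing fraction of the total as the network grows.

```python
# Minimal sketch of Metcalfe's Law: if network value is proportional to the
# number of unique node-to-node links, n(n-1)/2, the absolute gain from one
# more node keeps growing, but the *relative* gain shrinks as the network
# grows. All numbers are notional.

def metcalfe_value(n: int) -> float:
    """Value proportional to the number of unique pairwise connections."""
    return n * (n - 1) / 2

for n in (10, 1_000, 1_000_000):
    gain = metcalfe_value(n + 1) - metcalfe_value(n)
    share = gain / metcalfe_value(n + 1)
    print(f"n={n:>9,}: value={metcalfe_value(n):>18,.0f}  "
          f"one more node adds {gain:>12,.0f} ({share:.4%} of the new total)")
```

For a 10-node network the extra node contributes roughly a fifth of the new total; for a million-node network it contributes a few millionths, which is the diminishing-returns behavior described above.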

Analyzing Battlespace Data

Between 1965 and 1972, the U.S. flew 871 sorties against the Thanh Hoa Bridge in Vietnam. Only after the introduction of laser-guided bombs in 1972 was the bridge finally destroyed. Other conflicts, such as the first Gulf War and Bosnia in the 1990s, showed that destroying the target is no longer the imperative to military success; finding the right target is what is paramount. Battlespace data is now the most important component in defeating an enemy, and the first step in developing a theory for the next evolution of military sensing is observing the way the military uses data in the battlespace. In its simplest form, data is used to locate, identify, and track the enemy.

Here is where the future of battlespace networking gets interesting. The solution to this deluge of data may not be better, more powerful sensors, nor the ability to select the most appropriate and accurate sensor at a given time and place. Counterintuitively, the solution to too much sensor data may be more sensors.

Data Saturation

The underlying assumption necessary for a new theory of battlespace data collection is that emerging technologies will revolutionize battlespace sensing. Moore’s Law observes that computing power doubles roughly every 18 to 24 months. Extending that trend to the electronics underpinning 5G networking, we can anticipate a continued decline in the cost and size of advanced sensors and communication equipment, corresponding advances in data processing and storage, and the maturation of automated data analytics, including artificial intelligence and machine learning. The confluence of these technologies should allow a massive network of small, inexpensive, low-power electro-optical, sonic, and thermal sensors to be placed or air-dropped by the thousands into an operational theater. They would supplement the ever-expanding list of networked sensor sources, both manned and unmanned, in every domain.
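
Stated as a rough rule of thumb (the doubling period T of 18 to 24 months is the conventional figure, not a precise constant, and C stands loosely for capability per unit cost or size), the trend the argument relies on looks like this:

```latex
C(t) = C_0 \cdot 2^{\,t/T}, \qquad T \approx 1.5\text{--}2\ \text{years}
\quad\Longrightarrow\quad
C(10\ \text{yr}) \approx 2^{5}\,C_0 = 32\,C_0 \quad (\text{for } T = 2\ \text{yr})
```

A decade of that trend implies roughly a thirtyfold improvement, which is the scale of change the sensor-saturation argument assumes.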

By effectively saturating the operational environment with sensors, we can produce a more complete picture of the conflict zone. Analysts and operators will no longer search for the enemy so much as remove benign data. In other words, instead of searching for the glint of the needle in the haystack, they will rapidly remove the hay. Battlespace data collection will shift from hunting for isolated signals to continuously analyzing and correlating data against a “baseline” of the battlespace.
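
As a hypothetical sketch of what “removing the hay” could look like in software, the snippet below learns how often each simple on/off sensor is normally active and then flags only the sensors whose current behavior departs from that baseline. The sensor IDs, activation history, and threshold are invented for illustration.

```python
# Hypothetical sketch: compare current activations from a field of simple
# on/off sensors against a learned baseline activation rate, and keep only
# the sensors whose behavior deviates from that baseline.
from collections import Counter

def baseline_rates(history: list[set[str]]) -> dict[str, float]:
    """Fraction of past observation windows in which each sensor was active."""
    counts = Counter(s for window in history for s in window)
    return {sensor: n / len(history) for sensor, n in counts.items()}

def anomalies(current: set[str], baseline: dict[str, float],
              threshold: float = 0.05) -> set[str]:
    """Sensors active now that are almost never active in the baseline."""
    return {s for s in current if baseline.get(s, 0.0) <= threshold}

history = [{"S1", "S2"}, {"S1"}, {"S1", "S2"}, {"S1", "S2"}]  # benign pattern
current = {"S1", "S2", "S7"}                                   # S7 is new
print(anomalies(current, baseline_rates(history)))             # {'S7'}
```

The interesting output is not what any one sensor reports, but which sensors light up when they normally would not.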

Impact on Prediction

The application of this theory has two major ramifications. First, sensor saturation will reduce the necessity for and increase the accuracy of battlespace prediction. Greater knowledge of the battlespace will make it easier to discern the location and description of nefarious actors. Additionally, knowledge gaps will be more readily filled; the new network will increase certainty about known enemies and help predict enemy actions. 

Here we can look to the stock market for insight. In financial markets, data and analysts drive the models investors use to anticipate price movements for stocks, bonds, and indices. Knowledge is power, and more information is usually better. Even so, few can beat index investing with any regularity. Stock picking remains a “random walk.” Why? 

First, the data inputs (or sensors) used for stock picking do not directly predict price movements, nor do they account for the additional inputs that come with market fluctuation. Take the price of corn futures, for example. Sensor inputs to the price of corn futures could include satellite imagery showing the relative condition of local and global corn fields, as well as applicable weather predictions. Yet even armed with the principle of supply and demand, total knowledge of the future global corn-crop yield, and the total future demand for corn, analysts still could not accurately determine the future price. Perhaps the most influential reason for this predictive shortcoming is what John Maynard Keynes called “animal spirits,” the human emotional factor in trading. Other factors may also influence price movements, including unpredictable industry expansions, regulatory decisions, and natural disasters. It is generally accepted, however, that the greater the amount of specific information influencing a market sector (e.g., corn-crop yield), the greater the probability of predictive success. 

Now, let us consider the central objective of military data collection: to identify and locate specific entities across the battlespace, a very specific objective with a well-defined end state. Unlike the large number of stock-market inputs, sensor inputs in the battlespace contribute more directly to that objective. These sensors discover the identity and location of a target at a discrete time; detection and identification of the target are the primary goals of the collection. 

If financial analysts have been unable to predict stock prices given the amount of information, analysis, and raw resources available, then how will militaries be able to predict enemy actions, movements, or intents? The answer is that, increasingly, they will not have to predict so much as observe. Given enough sensors and the proper “infostructure,” battlespace intelligence will transform from a system that detects sensor inputs to one that detects environmental abnormalities, develops useful correlations in the data, and provides a holistic analysis of the operating environment. 

More Sensors, More Data 

Currently, making battlespace inferences is complicated by the volume of data. As data from satellite, aerial, and other sensors has grown, analysts have been overwhelmed. The result is that signals begin to become noise. Analysts are forced to determine which sensors provide the most valuable data and the most useful information, and which supporting data is necessary to make predictions. In effect, they must predict which supporting data they need, turning battlespace intelligence into a system of probabilities in series. This makes for an increasingly unstable intelligence cycle. 
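
The compounding at work in a “system of probabilities in series” can be written directly; the 0.9 confidence figure and the five-step chain below are illustrative only:

```latex
P(\text{chain holds}) \;=\; \prod_{i=1}^{k} p_i,
\qquad p_i = 0.9,\; k = 5
\;\Longrightarrow\; P \approx 0.9^{5} \approx 0.59
```

Each additional judgment about which data to trust multiplies in another chance of being wrong, which is why the cycle grows less stable as the chain lengthens.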

Today, the U.S. military seemingly has a surplus of battlespace data, so much that adding more raises fears of distraction, a detriment rather than an improvement. The value of the additional data is effectively offset by a corresponding increase in noise caused by the excess. If the system itself begins to become the noise, then the information in the system is reduced, as is the value of the network. 

There is a refinement of Metcalfe’s Law based on Zipf’s Law, the “long tail” principle describing the value of the lower-tiered contributors to a network. It posits that each additional node on a network contributes less relative value than the one before. In terms of the battlespace sensor mosaic, an analogy is the television pixel. In the extreme case of a one-pixel network, that single pixel would be extremely valuable, perhaps indicating on or off, day or night. As we add pixels to this imaginary battlespace TV, each additional pixel helps describe and form the picture, and the value of the system increases rapidly, as Metcalfe predicted. Yet as we reach the fidelity of modern televisions, with millions of pixels, the value of each individual pixel is reduced. In terms of discerning the actual picture, the value of any one pixel effectively goes to zero; it barely contributes to the overall picture.
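
A small illustrative calculation of the long-tail point (the 1/k weighting for the k-th most useful sensor is an assumption borrowed from Zipf’s Law, not a measurement): the total value of the network keeps rising as sensors are added, while the marginal value of the newest sensor falls toward zero.

```python
# Illustrative only: assume the k-th most useful sensor (or pixel) contributes
# value proportional to 1/k. Total value keeps growing (roughly like ln n),
# but the newest sensor's individual contribution approaches zero.

def zipf_total_value(n: int) -> float:
    """Harmonic sum 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 10_000, 1_000_000):
    print(f"n={n:>9,}: total value ≈ {zipf_total_value(n):5.2f}, "
          f"marginal value of sensor n ≈ {1.0 / n:.6f}")
```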

Likewise, in extremely large battlespace data-collection networks, the value of the average individual sensor approaches zero. In these networks, the “message internals,” or the actual data the average sensor is transmitting, are of ever-decreasing value. Conversely, the “message externals,” or the parts of the message that describe the message itself, such as date, time, and location, become more important. The network becomes Boolean, with each sensor simply on or off (detecting or idle).

What this sensor-saturation theory describes for future battlespace sensing is a television picture with thousands of pixels (sensors). Each of these sensors is simply on or off, transmitting message externals, at a given time. It is the activation pattern of these sensors that allows for detection, by removing the hay to find the needle, or for predictive analysis. Predictive analysis on these thousands of data points is similar to the big-data analysis Amazon.com performs to predict which products shoppers may want to see at a given time.

Just as adding pixels improves the resolution of an image (as illustrated here), adding sensors will yield a clearer picture of the battlespace in the future. Metcalfe’s Law posits that the value of such a system grows with the square of the number of nodes (pixels). Mike Tsukamoto/staff; photo: Staff Sgt. Sean Carnes

High-Fidelity Targeting

One of the fundamental data fusion problems, configuring the data to be compatible across a network, becomes less formidable as the speed and power of 5G networking become available. Additionally, the data that comprise the message externals are minute compared to the internals. Thus, this future network may have the added bonus of fewer total bits transmitted per sensor. The military may then become less reliant on individual complex sensors and instead enable large sensor networks to complete a higher-resolution picture of the battlespace, much as adding pixels to a television improves the definition of the picture.
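
To make the “externals versus internals” size argument concrete, here is a hypothetical, minimal report format; the field layout, sensor ID, and coordinates are invented for illustration and do not reflect any real messaging standard.

```python
# Hypothetical comparison of "message externals" vs. "message internals" for
# a single sensor report. All fields and values are notional.
import struct

# Externals only: sensor ID, UTC timestamp, latitude, longitude, on/off flag.
externals = struct.pack(
    "!I d d d ?",        # network byte order: uint32, three doubles, bool
    4217,                 # sensor ID (hypothetical)
    1_700_000_000.0,      # Unix timestamp
    36.168, 44.009,       # lat/lon in decimal degrees (notional)
    True,                 # detecting / idle
)
print(f"externals-only report: {len(externals)} bytes")  # a few tens of bytes

# By contrast, a single raw image or full-motion-video frame carried as
# "internals" would run to megabytes; the externals are minute by comparison.
```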

Thousands of miniature interconnected sensors could provide new fidelity in the targeting cycle; faster data processing enabled by modern networks would likewise counteract the growing problem of wasted intelligence data by enabling faster and more thorough analysis. By flooding the battlespace with sensors, the network is strengthened even as each individual sensor is devalued; the message externals become more valuable than the internals. With these emerging technologies, analysts will approach data holistically, reducing the need to rely on intelligence spikes in the operating environment. Moving ahead, however, they face a twofold problem. First, the military will very likely lag behind the commercial sector in developing sufficient “infostructure” to take advantage of massive sensor data, including shortfalls in data storage and AI computational power. Second, military organizations lack the necessary skills and tools to successfully manipulate large data sets, AI learning, and prediction models. The military and the Intelligence Community can begin remedying the latter problem today by accelerating the application, testing, and acceptance of universal data models.

The sooner the U.S. military acts on this networking eventuality, the better it will be positioned relative to its rivals in the future. Success will require a holistic approach focused not only on acquisition and research and development, but also on systemic changes, including those to doctrine, organization, and tactics. An underlying theory of data collection and analysis that accounts for future technologies will guide the development of the next evolution in NCW.

Anthony Tingle was the concepts evaluation branch chief at U.S. Army Space and Missile Defense Command and writes on research, development, and the application of technology at the Department of Defense. Download the entire paper at 
www.mitchellaerospacepower.org.