Next-Gen computers will soon transform battlefield intelligence.

Author: Canton, James
Position: Viewpoint

Imagine that 30 days prior to the invasion of a city the size of Mosul, Iraq, the U.S. military deploys a network of micro drones overhead and sensors embedded in the city infrastructure.

The sensors begin gathering geo-intelligence data: infrared and multi-spectral images, signals intelligence, full-motion video, mobile data, pictures, emails and social media, along with the movement of people, vehicles and supply chains. The goal is to build a 3D interactive map not only of structures and streets, but of people, systems, life and traffic patterns, power usage, commerce, and the location of weapons and ordnance.
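
In rough outline, the fusion step this scenario implies can be sketched in a few lines of code. The snippet below is a minimal illustration only, not anything the scenario specifies: the field names, sources and coordinates are invented, and it simply snaps hypothetical sensor reports to a coarse latitude/longitude grid so that observations from different sources about the same place accumulate in the same map cell.

    from collections import defaultdict

    # Hypothetical raw reports from different collection sources.
    reports = [
        {"lat": 36.3402, "lon": 43.1189, "source": "infrared", "payload": "heat bloom"},
        {"lat": 36.3404, "lon": 43.1191, "source": "sigint", "payload": "cell ping"},
        {"lat": 36.3110, "lon": 43.0954, "source": "drone_video", "payload": "truck convoy"},
    ]

    def grid_cell(lat, lon, cell_deg=0.001):
        """Snap a coordinate to a coarse grid cell (about 100 m of latitude)."""
        return (round(lat / cell_deg), round(lon / cell_deg))

    # Fuse: each map cell accumulates every report that falls inside it.
    map_layer = defaultdict(list)
    for r in reports:
        map_layer[grid_cell(r["lat"], r["lon"])].append((r["source"], r["payload"]))

    for cell, observations in map_layer.items():
        print(cell, observations)

Here the infrared and signals reports land in the same cell and fuse together. A real system would index by geohash or military grid reference and fuse across time as well, but the shape of the problem is the same: many feeds reduced to one queryable map.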

By the time forces are ready to enter the city, battlefield commanders have a recent historical record of events there as well as a real-time, mobile big-data map displayed as holographic imagery through virtual and augmented reality. Meta-tagged individuals and locations are updated in real time with GPS, sensor and drone-video feeds to keep the map fresh and current.

The massive amounts of data collected could be crunched for predictive analysis: determining where the enemy is, resolving the identities of the people and entities operating in and around the city, and building an identity database that shows where they have been and, perhaps, where they are going.
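
To make the "where they have been and where they are going" idea concrete, here is a minimal hypothetical sketch. It assumes identities have already been resolved into an entity identifier (the hard part, glossed over here), groups sightings into per-entity tracks, and projects the last observed motion forward. Real predictive analysis would use far richer models; every name and value below is invented for illustration.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Sighting:
        entity_id: str  # identity assumed resolved upstream (hypothetical)
        t: float        # timestamp in seconds
        lat: float
        lon: float
        source: str     # e.g. "drone_video", "sigint"

    def build_identity_database(sightings):
        """Group time-ordered sightings into one track per resolved entity."""
        tracks = defaultdict(list)
        for s in sorted(sightings, key=lambda s: s.t):
            tracks[s.entity_id].append(s)
        return tracks

    def predict_next_position(track, horizon_s=60.0):
        """Linearly extrapolate the last two sightings horizon_s seconds ahead."""
        if len(track) < 2:
            return None  # not enough history to project movement
        a, b = track[-2], track[-1]
        dt = b.t - a.t
        if dt <= 0:
            return None
        return (b.lat + (b.lat - a.lat) / dt * horizon_s,
                b.lon + (b.lon - a.lon) / dt * horizon_s)

    # Hypothetical feed: two sources report the same vehicle moving north.
    feed = [
        Sighting("veh-042", 0.0, 36.3350, 43.1180, "drone_video"),
        Sighting("veh-042", 30.0, 36.3356, 43.1180, "sigint"),
    ]
    tracks = build_identity_database(feed)
    print(predict_next_position(tracks["veh-042"]))  # position projected 60 s out

Even this toy version hints at the computational load: the real problem multiplies such tracks by millions of entities and thousands of simultaneous feeds.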

That kind of knowledge, and scenarios like these, could save warfighters' lives and protect civilians, but they are not possible today.

The sensors and the drones are readily available, but not the computing power necessary to analyze the huge volumes of data from so many diverse sources. However, the day when this scenario will be possible is coming. And it will be made possible by a wave of new computing architectures.

These exponentially faster computers will marry with artificial intelligence, big data analytics and cloud computing to help crack some of the military and intelligence communities' toughest problems--if they can take advantage of these technologies, which are largely being developed in the private sector.

Computing hasn't fundamentally changed in 60-plus years. Hardware and silicon still process data as zeros and ones. And we will soon reach the physical and processing limits of today's computers.

Moore's Law, the prediction that the number of transistors on a chip doubles roughly every two years, is going to reach an end. What we need is processing power that improves faster than Moore's Law allows. There are a few major trends leading computer engineers in this direction.

First are recent breakthroughs in artificial intelligence, or AI. The past three years have seen a transition from rule-based...
