‘Killer’ mil app could allow remote targeting via smartphone
Research is under way that could turn the ubiquitous smartphone into a critical link in the kill chain of the modern battlefield.
Engineers from the University of Missouri (MU) College of Engineering, with funding from the U.S. Army/Leonard Wood Institute (LWI), are developing applications that could allow soldiers to reach into a pocket, pull out an Android-based smartphone or iPhone® and determine the exact location of a remote target via sight or sound.
Dr. Yi Shang, professor of computer science in the MU College of Engineering, is leading the research with a team of fellow MU colleagues and students. The effort, which culminates early next year, began when Shang, an expert in wireless sensor networks, realized that smartphones could be used to detect and image targets, and be treated as powerful wireless sensor nodes.
“Two years ago I came up with the idea of using a set of smartphones to find the location of remote targets based on either sight or sound,” Shang says. “Using traditional wireless sensor networks doesn’t work because they don’t have the proper capabilities. But smartphones have cameras and microphones as integral, widely available sensors. Last year there was an RFP [request for proposal] from the LWI [LWI funds research projects on behalf of the Army Research Laboratory/Human Research and Engineering Directorate]. I got together with colleagues, wrote a proposal, and it was funded.”
Shang describes a sight-based scenario in which a small team of soldiers in Afghanistan is monitoring a remote target, a vehicle for example. Using a laser to determine the exact location of the target is one common technique. But lasers have a limited range and are not covert: the targeted entity may detect the beam aimed at it.
Smartphones equipped with the applications the researchers are working on could passively determine the location of the target, allowing the soldiers to go undetected. Their integral cameras would be used to take pictures of the target and together with other sensors already resident in the phones (compass, GPS, accelerometer, etc.), determine the target’s exact position.
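The kind of sensor fusion described above can be sketched in a few lines: given the phone's own GPS fix, a compass bearing to the target, and an estimated range, a flat-earth approximation projects the target's coordinates. This is an illustrative sketch, not the MU team's code; the function name and the small-angle approximation are assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def project_target(lat, lon, bearing_deg, range_m):
    """Offset a GPS fix by range_m along a compass bearing
    (flat-earth approximation, adequate for short ranges)."""
    b = math.radians(bearing_deg)
    dnorth = range_m * math.cos(b)  # metres toward north
    deast = range_m * math.sin(b)   # metres toward east
    dlat = math.degrees(dnorth / EARTH_RADIUS_M)
    dlon = math.degrees(deast / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# A target sighted 500 m away on a bearing of 45 degrees
print(project_target(34.5, 69.2, 45.0, 500))
```

In practice the bearing would come from the phone's compass and accelerometer, and the range from an image-based estimate rather than a hand-entered number.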
Targeting can also be accomplished in sound-based scenarios, utilizing applications written to exploit the microphones in smartphones.
“If a vehicle is in a dark environment or if it’s in an urban setting where you don’t have line of sight but you can still hear the sound of a vehicle over the sound of gunshots for example, then you use the microphone-based application,” Shang says.
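One plausible way to turn microphone data into a position, hinted at by the scenario above, is time-difference-of-arrival (TDOA): phones at known positions timestamp the same sound, and the differences in arrival time constrain where the source can be. The brute-force grid search below is a minimal sketch under that assumption; it is not the MU team's published algorithm, and it presumes the phones' clocks are synchronized.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def locate_by_tdoa(phones, arrival_times, area=100, step=1.0):
    """Brute-force grid search for a sound source position.

    phones        -- list of (x, y) phone positions in metres
    arrival_times -- sound arrival time at each phone, in seconds
    Returns the grid point whose predicted time differences best
    match the measured ones (least-squares residual).
    """
    # Use phone 0 as the reference for time differences
    measured = [t - arrival_times[0] for t in arrival_times]
    best, best_err = None, float("inf")
    x = -float(area)
    while x <= area:
        y = -float(area)
        while y <= area:
            dists = [math.hypot(x - px, y - py) for px, py in phones]
            predicted = [(d - dists[0]) / SPEED_OF_SOUND for d in dists]
            err = sum((m - p) ** 2 for m, p in zip(measured, predicted))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

# Simulated example: three phones hear a source at (40, 25)
phones = [(0, 0), (100, 0), (0, 100)]
source = (40.0, 25.0)
times = [math.hypot(source[0] - px, source[1] - py) / SPEED_OF_SOUND
         for px, py in phones]
print(locate_by_tdoa(phones, times))  # → (40.0, 25.0)
```

A real implementation would estimate the time differences by cross-correlating the audio streams rather than reading ideal timestamps.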
So far the applications are being written primarily for Android-based smartphones because they are easier to program, Shang says. The MU team is developing single-image, multiple-image, and video-based technologies for different scenarios, along with single-phone and multiple-phone methods of determining a target's location.
“For a single image-based application you need some idea of the size of the object you are trying to locate,” Shang explains. “For a two-phone image-based application, we are essentially using triangulation. In addition to the image recognition and the signal processing [sound] applications, we’re also developing ad hoc networking support so that the phones will be able to talk to each other directly via Bluetooth or Wi-Fi to send the data to one location to compute.”
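The two approaches Shang describes can be illustrated with toy geometry: the single-image method needs a known object size (a pinhole-camera range estimate), while the two-phone method intersects two compass bearings. Both functions below are hypothetical sketches in a local flat metre grid, not project code.

```python
import math

def distance_from_image(real_height_m, image_height_px, focal_length_px):
    """Pinhole-camera range estimate: a target of known real height
    spanning image_height_px pixels lies at roughly this distance."""
    return real_height_m * focal_length_px / image_height_px

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two compass bearings (degrees clockwise from north)
    taken from known positions p1 and p2 in a local metre grid."""
    b1, b2 = math.radians(bearing1_deg), math.radians(bearing2_deg)
    # Direction vectors: north = +y, east = +x
    d1 = (math.sin(b1), math.cos(b1))
    d2 = (math.sin(b2), math.cos(b2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom  # distance along ray 1
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# A vehicle ~1.5 m tall filling 50 px on a camera with a 1000 px focal length
print(distance_from_image(1.5, 50, 1000))        # → 30.0 (metres)

# Two phones 100 m apart both sighting the same target
print(triangulate((0, 0), 45.0, (100, 0), 315.0))  # ≈ (50.0, 50.0)
```

The triangulation only works when the two bearings are well separated; nearly parallel sight lines make the fix numerically unstable, which is why the function rejects them.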
Cellular communication networks upon which phones typically depend in a civilian setting aren’t present in a conflict area like Afghanistan. So to calculate and relay target location information, Shang and his colleagues are working on applications that will allow soldiers’ smartphones to talk directly to one another in a local area.
“That’s the third part of the program,” Shang says. “Improvised ad hoc networking allows peer-to-peer communication between the phones directly without going through any cellular network. You can do Bluetooth or Wi-Fi communications directly in ad hoc mode.”
Doing so requires some adaptation of current smartphone capabilities, Shang says. For instance, Bluetooth currently supports only single-hop, phone-to-phone communication, not multi-hop networks. Multi-hop networks could extend the distances at which target location information could be determined or relayed.
“If you have two phones a great distance apart, you cannot currently provide a third phone in the middle as a forwarding node,” he explains. “We’re implementing that part for Bluetooth.”
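The forwarding-node idea can be sketched as a shortest-hop search over which phones are in radio range of each other. The range constant and node layout below are illustrative assumptions; real Bluetooth relaying would sit on top of actual RFCOMM or L2CAP links rather than a position table.

```python
from collections import deque
import math

BT_RANGE_M = 10.0  # assumed effective range for a class-2 Bluetooth radio

def multihop_route(positions, src, dst, max_range=BT_RANGE_M):
    """Breadth-first search for a phone-to-phone relay path.

    positions -- dict of node name -> (x, y) position in metres
    Returns the shortest hop path from src to dst, or None.
    """
    def in_range(a, b):
        (ax, ay), (bx, by) = positions[a], positions[b]
        return math.hypot(ax - bx, ay - by) <= max_range

    queue = deque([[src]])
    visited = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for node in positions:
            if node not in visited and in_range(path[-1], node):
                visited.add(node)
                queue.append(path + [node])
    return None

# Phones A and C are 16 m apart (out of direct range); B sits between them
phones = {"A": (0, 0), "B": (8, 0), "C": (16, 0)}
print(multihop_route(phones, "A", "C"))  # → ['A', 'B', 'C']
```

The breadth-first search guarantees the fewest hops, which matters here because each relay hop adds latency and another battery-powered phone to the chain.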
Shang’s team is also developing a Wi-Fi-based ad hoc network that will allow multiple app-equipped smartphones to talk to each other at extended range.
“Usually for Wi-Fi you need to have an access point. Interestingly, Wi-Fi also has direct node-to-node communication called ad hoc mode. On a regular Android phone, you cannot do node-to-node communications. You have to root the phone and change the system a little bit. We’re experimenting with it. Wi-Fi actually has a longer range than Bluetooth and is more efficient, so we’re employing the Wi-Fi ad hoc network in our project too.”
Shang’s team is employing encryption mechanisms so that the signals sent between the app-equipped smartphones can’t be deciphered.
“Since they essentially act as a radio, you cannot avoid detection of the signals they send. But the content of the signals can be protected,” he says.
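Encrypt-then-MAC is one standard way to protect content while accepting that the transmission itself is detectable. The sketch below, using only Python's standard library, is purely illustrative: the hash-counter keystream stands in for a vetted cipher such as AES-GCM, which any fielded system would use instead.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Hash-counter keystream (illustrative only, not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-MAC: XOR with the keystream, then append an HMAC tag."""
    nonce = secrets.token_bytes(16)  # fresh per message
    ct = bytes(p ^ k for p, k in
               zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_sealed(key: bytes, blob: bytes) -> bytes:
    """Verify the tag before decrypting; reject tampered messages."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered message")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)  # would be pre-shared among the squad's phones
msg = b"target bearing 045, range 300 m"
blob = seal(key, msg)
print(open_sealed(key, blob))  # → b'target bearing 045, range 300 m'
```

Verifying the authentication tag before decrypting means a relayed message that has been altered in transit is rejected outright, which matters when intermediate hops cannot be trusted.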
The killer app technology is still in early stages of development, but, as Shang notes, “it holds a lot of promise. If we can achieve our goals, the application will be a very useful tool for our soldiers.”