Team helps build the ultimate surveillance tool

December 30, 2001

Something moves, and what looks like a dime-sized pebble “wakes up” in a vast desert landscape. The pebble sends a signal to another small stone just 20 yards away. It too is awake. It detects carbon, nitrogen and sulfur dioxides. The chemical-sensing stone sends its information, along with data from the seismic-sensing pebble, to a node stone. The node collects data from hundreds of similarly disguised wireless sensors and relays it to an unmanned aircraft that pieces together the information to identify a tank.

Advances in smart, low-cost integrated devices containing many different types of sensors, wireless transceivers and processors with significant computing capabilities could make the above scenario a reality in as few as five years, says professor Parameswaran Ramanathan. He recently received a $725,000 five-year grant to investigate issues in establishing and maintaining communication between sensor devices in wireless ad hoc surveillance networks.

The project builds on work accomplished by professors Ramanathan, Kewal Saluja and Yu Hen Hu, and assistant professor Akbar Sayeed under a Defense Advanced Research Projects Agency grant.

“Aircraft could sprinkle large numbers of these integrated devices over an area to construct a surveillance network capable of monitoring, detecting and tracking threats from a variety of sources, including vehicles, persons and biochemical agents,” says Ramanathan. “Of course, there are dozens of non-military applications as well. Sensor networks could be deployed to help geologists, limnologists and others study the environment in ways that currently aren’t possible.”

But before such networks can be put to wide use, researchers have to solve some difficult problems. With thousands of sensors gathering and disseminating information, some are bound to deliver false readings. Not only do the devices need to be robust, but the network must have strategies to sort out errors and pass on correct information. In addition, because the sensors will have limited power supplies and communication capabilities, the team must devise protocols and communications strategies that deliver the most information while using the least amount of power.
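One simple way to picture the error-handling problem, sketched below as a hypothetical illustration rather than anything drawn from the team's work, is to have neighboring sensors vote before a detection is relayed, so a single faulty device cannot push a false reading through the network.

```python
# Hypothetical sketch: neighboring sensors vote on whether a target is
# present so that an isolated false reading from one faulty device is
# outvoted before it is relayed onward. Not the project's algorithm.
from collections import Counter

def fuse_detections(readings):
    """Return the majority detection among neighboring sensors.

    readings -- list of booleans, one per sensor; True means
    "target detected".
    """
    votes = Counter(readings)
    return votes[True] > votes[False]

# Example: one faulty sensor reports a detection, four do not.
print(fuse_detections([True, False, False, False, False]))  # False
```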

“If I use a lot of power, I can shout in some sense and reach maybe 20 devices,” says Ramanathan. “But if I whisper and use less power, I can only talk to two. Then those two devices can rebroadcast whatever they heard and get the whole network going. So the challenge there is of figuring out if it is best to shout and have everybody hear or talk a little and have everybody communicate. But then, of course, everybody is communicating and everybody has to handle not only his or her own thing, but also relay something someone else said, and that consumes more power. So there is a balance to strike.”
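The balance Ramanathan describes can be roughed out with a standard radio energy model, assumed here for illustration only: transmit energy grows with a power of distance (a path-loss exponent typically between 2 and 4), while every relay hop also pays a fixed cost to receive and retransmit the message. The parameter values below are illustrative, not measurements from the project.

```python
# Back-of-the-envelope model of the "shout vs. whisper" tradeoff.
# Shouting pays distance**alpha once; whispering splits the distance
# into short hops but pays a fixed relay overhead at every hop.

def shout_energy(distance, alpha=3.0):
    """One long-range broadcast that reaches everyone directly."""
    return distance ** alpha

def whisper_energy(distance, hops, alpha=3.0, relay_cost=50.0):
    """The same distance covered in short hops, each relay repeating
    the message and paying a fixed receive/retransmit overhead."""
    hop_len = distance / hops
    return hops * (hop_len ** alpha + relay_cost)

print("single shout:", shout_energy(100.0))
for hops in (2, 5, 20, 100, 1000):
    print(hops, "hops:", round(whisper_energy(100.0, hops), 1))
```

With these illustrative numbers, the total energy first drops sharply as the message is whispered over more, shorter hops, then climbs again once the fixed cost of every relay dominates, which is exactly the balance the protocols must strike.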

In some cases, the end user may not want the devices to talk at all; it depends on the question being asked of the network. Sayeed is working on signal processing strategies that most efficiently combine information from many kinds of sensors. He says it’s very much like bringing a blurry picture into focus: speed is gained and energy conserved by finding answers while the image is still soft.
