Exploiting Data Fusion To Improve the Coverage of Wireless Sensor Networks in Dotnet

Abstract:

Wireless sensor networks (WSNs) have been increasingly available for critical applications such as security surveillance and environmental monitoring. An important performance measure of such applications is sensing coverage, which characterizes how well a sensing field is monitored by a network. Although advanced collaborative signal processing algorithms have been adopted by many existing WSNs, most previous analytical studies on sensing coverage are based on overly simplistic sensing models (e.g., the disc model) that do not capture the stochastic nature of sensing. In this paper, we attempt to bridge this gap by exploring the fundamental limits of coverage based on stochastic data fusion models that fuse noisy measurements from multiple sensors. We derive the scaling laws between coverage, network density, and signal-to-noise ratio (SNR). We show that data fusion can significantly improve sensing coverage by exploiting the collaboration among sensors when several physical properties of the target signal are known. In particular, for a signal path loss exponent k (typically between 2.0 and 5.0), ρ_f = O(ρ_d^(1-1/k)), where ρ_f and ρ_d are the densities of uniformly deployed sensors that achieve full coverage under the fusion and disc models, respectively. Moreover, data fusion can also reduce the network density required for regularly deployed networks and for mobile networks in which mobile sensors relocate to fill coverage holes. Our results help explain the limitations of previous analytical results based on the disc model and provide key insights into the design of WSNs that adopt data fusion algorithms. Our analyses are verified through extensive simulations based on both synthetic data sets and data traces collected in a real deployment for vehicle detection.
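
Illustrative sketch (not code from the paper): the short C# program below contrasts single-sensor detection under the disc model with value fusion of noisy readings, under the stochastic signal model described in the abstract, where a target's signal attenuates with distance as S0/d^k and each measurement carries additive Gaussian noise. The signal constants, noise level, thresholds, and fusion radius are hypothetical parameters chosen only to make the example runnable; the actual fusion rule and analysis are those of the paper.

using System;

// Sketch only: compares disc-model (single-sensor) detection with value fusion
// of noisy measurements. Signal model assumed here: reading = S0 / d^k + Gaussian noise.
class FusionCoverageDemo
{
    static readonly Random Rng = new Random(1);

    // Hypothetical parameters, for illustration only.
    const double S0 = 100.0;          // source signal energy at reference distance
    const double K = 3.0;             // path loss exponent, typically 2.0 to 5.0
    const double NoiseSigma = 1.0;    // std. dev. of additive Gaussian noise
    const double FusionRadius = 20.0; // sensors within this range fuse their readings

    static double Gaussian(double sigma)
    {
        // Box-Muller transform for a zero-mean Gaussian sample.
        double u1 = 1.0 - Rng.NextDouble();
        double u2 = Rng.NextDouble();
        return sigma * Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Cos(2.0 * Math.PI * u2);
    }

    // Noisy measurement taken by a sensor at distance d from the target.
    static double Measure(double d) =>
        S0 / Math.Pow(Math.Max(d, 1.0), K) + Gaussian(NoiseSigma);

    static void Main()
    {
        // Deploy n sensors uniformly at random in a square field; target at the center.
        int n = 50;
        double field = 100.0;
        var dist = new double[n];
        for (int i = 0; i < n; i++)
        {
            double x = Rng.NextDouble() * field, y = Rng.NextDouble() * field;
            dist[i] = Math.Sqrt(Math.Pow(x - field / 2, 2) + Math.Pow(y - field / 2, 2));
        }

        // Disc model: the target counts as detected only if some single sensor's
        // own reading exceeds a fixed threshold.
        double singleThreshold = 3.0 * NoiseSigma;
        bool discDetect = false;
        foreach (double d in dist)
            if (Measure(d) > singleThreshold) { discDetect = true; break; }

        // Fusion model: sensors within the fusion radius report noisy readings and the
        // fusion center compares their sum against a threshold scaled by sqrt(m),
        // since summing m independent readings grows the noise only as sqrt(m).
        double sum = 0; int m = 0;
        foreach (double d in dist)
            if (d <= FusionRadius) { sum += Measure(d); m++; }
        bool fusionDetect = m > 0 && sum > 3.0 * NoiseSigma * Math.Sqrt(m);

        Console.WriteLine($"Disc-model detection:   {discDetect}");
        Console.WriteLine($"Fusion-based detection: {fusionDetect} (fused {m} sensors)");

        // Scaling law quoted in the abstract: rho_f = O(rho_d^(1 - 1/k)).
        double rhoD = n / (field * field);
        Console.WriteLine($"Density scaling example: rho_d = {rhoD:G4}, rho_d^(1-1/k) = {Math.Pow(rhoD, 1.0 - 1.0 / K):G4}");
    }
}

The sqrt(m)-scaled threshold in the fusion branch is the point of the example: fusing m noisy readings raises the aggregate signal faster than the aggregate noise, which is the collaboration gain that lets the fusion model reach full coverage at a lower sensor density than the disc model.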