Geometry of a Sensor Network
Germano Resconi, Robin Braun and Zenon Chaczko
Catholic University of Brescia, Italy and University of Technology, Sydney, Australia
Abstract. Given a set of sensors, or clusters of sensors, S located at different points (nodes) in ordinary space, each sensor measures one or more values, such as temperature. We assume that the information from all sensors at different positions in space is transmitted to a gateway node as a probabilistic phenomenon, not in a deterministic way. The measured value X at the gateway node is a random value: noise in the network randomly changes the original measurements, so the information at the gateway is given by a probability distribution. We show that, given the values at the sensor nodes, the probability distribution at the gateway changes; the sensor measurements are therefore parameters that define the distribution of the values at the gateway. The probability at the gateway is conditioned by the original measures at the sensor nodes. The probabilistic approach cannot capture the topology of the network, only the conditional probability at the gateway given the sensors. We then compute the derivative of the conditional Boltzmann entropy with respect to each sensor value and each value X at the gateway. This matrix describes the sensor situation, so we can compute the Fisher information of the sensors: it is the Hessian of the average entropy function in the space of the sensors S. The Fisher information gives us the geometry, or form, of the sensor space S. Sensor information is very important for obtaining the form of the phenomenon that we want to measure with the different sensors. Networks of sensors, with their geometry, go beyond the individual sensor, which measures only one value and cannot discover the field or form of the physical phenomenon.
1 Sensors and Information
There is a need for a formal understanding of the underlying function of wireless sensor networks. Up to now, the emphasis has been on the functionality of the sensors, their interconnection, and such things as power consumption and programming [1, 2, 3, 4, 5, 9, 10]. However, with the advent of so-called ad-hoc networks, and the ability to deploy many hundreds of sensors in the field, there is a need to optimize this deployment for both energy consumption and efficacy in collecting the data. For example, the easiest way to collect information from a remotely located sensor may be a direct wireless link; however, energy consumption for propagation is proportional to the square of the distance, so a multi-hop solution may be more appropriate. The underlying assumption of this work is that the sensors, the information sources, are attached to wireless devices that transfer the sensor information back to an information sink. The additional assumption is that these wireless devices are able to forward the information of other sensors until it ultimately reaches the information sink. This paper is a work in progress. It attempts to arrive at a formal description of the Observed Field and its Information Space. It suggests a way to develop a formal description of the Transform Space, based on the notion of conditional probability and Fisher information. Ultimately, these formal descriptions will allow the application of optimization methods such as minimum Fisher information. In addition, we are thinking of this formal description as a way of incorporating the work we have been doing on the Distributed Active Information Model, for the management of complex distributed electronic environments such as Wireless Sensor Networks [4]. In Figure 1 we show the information transformation from the sensors to the sink.
Fig. 1. The Information to Sink field
In Figure 2 we show, in a more schematic way, the transport of the information from the sensors S to the gateway (sink), where we detect the conditional probability

    p = p(x | S_1, S_2, ..., S_p)    (1)
Fig. 2. Image of the conditional probability for one sensor S and one variable x at the gateway
where x are the values observed at the gateway (Figure 3) and the S_j are the values that we measure. When we use the binomial conditional probability

    p(x|S) = C(N, x) S^x (1 − S)^(N−x)    (2)

where C(N, x) is the binomial coefficient, Figure 2 is a graph of the conditional probability for one sensor variable S and one variable x at the gateway.
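As a concrete illustration of Eq. (2), the following minimal Python sketch (our own, not part of the paper) builds the binomial conditional probability p(x|S) and simulates noisy gateway readings whose distribution is parameterized by the sensor value S; the names `binom_pmf`, the value of N, and the sample counts are illustrative assumptions.

```python
import math
import random

def binom_pmf(x, N, S):
    """Conditional probability p(x|S) of observing x successes at the
    gateway out of N transmissions, given sensor value S in (0, 1)."""
    return math.comb(N, x) * S**x * (1.0 - S)**(N - x)

# The pmf sums to 1 over x = 0..N for any sensor value S.
N, S = 10, 0.3
total = sum(binom_pmf(x, N, S) for x in range(N + 1))

# Simulate noisy gateway readings: each reading is a binomial draw
# whose distribution is conditioned by the sensor value S.
random.seed(0)
readings = [sum(random.random() < S for _ in range(N)) for _ in range(5000)]
mean_reading = sum(readings) / len(readings)   # concentrates near N*S
```

The simulation makes the paper's point tangible: a single reading at the gateway is random, but the distribution of many readings is pinned down by the sensor value S.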
Figure 3 shows the scheme of the sensor network and the sink (gateway) where we collect the information. The values x at the gateway are random values, so measurement alone cannot give the wanted information. At the sink we detect the conditional probability of the values x under the influence of the sensors S. Given a learning process, and the form of the conditional probability at the gateway, we can extract the wanted information from the sensors by calculating the entropy.
Fig. 3. Scheme of the Sensor network and sink or gateway
    E_1 = ln[p(x_1 | S_1, S_2, ..., S_p)]
    E_2 = ln[p(x_2 | S_1, S_2, ..., S_p)]
    E_3 = ln[p(x_3 | S_1, S_2, ..., S_p)]
    ...
    E_q = ln[p(x_q | S_1, S_2, ..., S_p)]    (3)

where E is the conditional Boltzmann entropy. From the system in Figure 3 we compute the Jacobian
    J_{i,j} = ∂E_i/∂S_j =
    | ∂E_1/∂S_1  ∂E_1/∂S_2  ···  ∂E_1/∂S_p |
    | ∂E_2/∂S_1  ∂E_2/∂S_2  ···  ∂E_2/∂S_p |
    |    ···        ···     ···     ···    |
    | ∂E_q/∂S_1  ∂E_q/∂S_2  ···  ∂E_q/∂S_p |    (4)
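To make Eqs. (3)-(4) concrete, here is a small sketch (our own illustration, assuming the binomial model of Eq. (2) with a single sensor, so the Jacobian reduces to a q×1 column) that evaluates E_j = ln p(x_j|S) and its derivative by central differences; `log_p`, `jacobian`, and the sample values are hypothetical.

```python
import math

def log_p(x, N, S):
    """Conditional Boltzmann entropy E = ln p(x|S) for the binomial model."""
    return math.log(math.comb(N, x)) + x * math.log(S) + (N - x) * math.log(1.0 - S)

def jacobian(xs, N, S, h=1e-6):
    """Numerical Jacobian J_j = dE_j/dS via central differences (one sensor)."""
    return [(log_p(x, N, S + h) - log_p(x, N, S - h)) / (2 * h) for x in xs]

xs = [2, 3, 4]                 # observed gateway values x_1..x_q (illustrative)
J = jacobian(xs, N=10, S=0.3)
# Analytically dE/dS = x/S - (N - x)/(1 - S), which vanishes when x = N*S.
```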
The sensor values for which

    ∂E_i/∂S_j = 0    (5)

are the estimated values at which the probability assumes its maximum. To study the geometry of the sensors, we compute the Fisher information matrix
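For the binomial model of Eq. (2), the condition of Eq. (5) can be checked in closed form: dE/dS = x/S − (N−x)/(1−S) vanishes at S = x/N, the maximum-likelihood estimate. A quick sketch (illustrative names, our own):

```python
def dE_dS(x, N, S):
    """Derivative of E = ln p(x|S) with respect to S for the binomial model."""
    return x / S - (N - x) / (1.0 - S)

x, N = 3, 10
S_hat = x / N                 # the value where the derivative vanishes (Eq. 5)
g_at_hat = dE_dS(x, N, S_hat)
g_below = dE_dS(x, N, 0.1)    # positive: probability still increasing in S
g_above = dE_dS(x, N, 0.9)    # negative: probability decreasing in S
```

The sign change around S_hat confirms it is a maximum of the conditional probability, not merely a stationary point.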
    G = J^T J =
    | Σ_j (∂E_j/∂S_1)(∂E_j/∂S_1)  Σ_j (∂E_j/∂S_1)(∂E_j/∂S_2)  ···  Σ_j (∂E_j/∂S_1)(∂E_j/∂S_p) |
    | Σ_j (∂E_j/∂S_2)(∂E_j/∂S_1)  Σ_j (∂E_j/∂S_2)(∂E_j/∂S_2)  ···  Σ_j (∂E_j/∂S_2)(∂E_j/∂S_p) |
    |            ···                          ···              ···             ···             |
    | Σ_j (∂E_j/∂S_p)(∂E_j/∂S_1)  Σ_j (∂E_j/∂S_p)(∂E_j/∂S_2)  ···  Σ_j (∂E_j/∂S_p)(∂E_j/∂S_p) |    (6)
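A numerical sketch of Eq. (6) (our own illustration): for two independent binomial sensors, E_j = ln p(x1_j|S_1) + ln p(x2_j|S_2), so each row of J holds the two partial derivatives and G = J^T J is the 2×2 Fisher matrix. The observation pairs and parameter values are hypothetical.

```python
def dE_dS(x, N, S):
    """Partial derivative of ln p(x|S) for one binomial sensor."""
    return x / S - (N - x) / (1.0 - S)

N, S1, S2 = 10, 0.3, 0.7
obs = [(2, 7), (3, 6), (4, 8), (3, 7)]   # (x1_j, x2_j) gateway readings (illustrative)

# Jacobian of Eq. (4): row j = (dE_j/dS1, dE_j/dS2)
J = [[dE_dS(x1, N, S1), dE_dS(x2, N, S2)] for x1, x2 in obs]

# Fisher information matrix of Eq. (6): G = J^T J
G = [[sum(row[h] * row[k] for row in J) for k in range(2)] for h in range(2)]
```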
and, for the second derivative of the log-probability,

    p ∂²ln(p)/(∂S_h ∂S_k) = p ∂/∂S_h [∂ln(p)/∂S_k]
                          = p ∂/∂S_h [(1/p) ∂p/∂S_k]
                          = p [−(1/p²)(∂p/∂S_h)(∂p/∂S_k) + (1/p) ∂²p/(∂S_h ∂S_k)]
                          = ∂²p/(∂S_h ∂S_k) − (1/p)(∂p/∂S_h)(∂p/∂S_k)    (7)
Now we have

    Σ_j (∂E_j/∂S_h)(∂E_j/∂S_k) ≈ ∫ p (∂ln[p(x|S)]/∂S_h)(∂ln[p(x|S)]/∂S_k) dx
                               = ∫ (1/p)(∂p(x|S)/∂S_h)(∂p(x|S)/∂S_k) dx
                               = −∫ p ∂²ln[p(x|S)]/(∂S_h ∂S_k) dx    (8)
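The approximation in Eq. (8) can be checked numerically for the binomial model, where the Fisher information has the closed form N/(S(1−S)). In this sketch (our own, with illustrative sample sizes) the integral becomes a sum over the discrete x, and a Monte Carlo average of score products over samples x_j drawn from p(x|S) converges to the same value.

```python
import math
import random

N, S = 10, 0.3
fisher_exact = N / (S * (1.0 - S))           # closed form for the binomial model

def pmf(x):
    return math.comb(N, x) * S**x * (1.0 - S)**(N - x)

def score(x):
    return x / S - (N - x) / (1.0 - S)       # d ln p(x|S) / dS

# Integral form of Eq. (8) (a sum over the discrete values x = 0..N)
fisher_int = sum(pmf(x) * score(x)**2 for x in range(N + 1))

# Monte Carlo form: average the score products over gateway samples x_j ~ p(x|S)
random.seed(1)
samples = [sum(random.random() < S for _ in range(N)) for _ in range(20000)]
fisher_mc = sum(score(x)**2 for x in samples) / len(samples)
```

Note that the sum over the q observed values in Eq. (8) matches the integral when the x_j are drawn from p(x|S) and the sum is read as a sample average.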
because

    ∫ ∂²p(x|S)/(∂S_h ∂S_k) dx = ∂²/(∂S_h ∂S_k) ∫ p(x|S) dx = ∂²(1)/(∂S_h ∂S_k) = 0    (9)

So the matrix G can be written as the negative Hessian of the entropy in this way:

    G = J^T J = −
    | Σ_j ∂²E_j/(∂S_1 ∂S_1)  Σ_j ∂²E_j/(∂S_1 ∂S_2)  ···  Σ_j ∂²E_j/(∂S_1 ∂S_p) |
    | Σ_j ∂²E_j/(∂S_2 ∂S_1)  Σ_j ∂²E_j/(∂S_2 ∂S_2)  ···  Σ_j ∂²E_j/(∂S_2 ∂S_p) |
    |          ···                     ···           ···           ···          |
    | Σ_j ∂²E_j/(∂S_p ∂S_1)  Σ_j ∂²E_j/(∂S_p ∂S_2)  ···  Σ_j ∂²E_j/(∂S_p ∂S_p) |    (10)
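Eqs. (9)-(10) say that the Fisher information equals the negative expected Hessian of the log-likelihood; for the binomial model this identity can be verified exactly (a sketch with illustrative names, our own):

```python
import math

N, S = 10, 0.3

def pmf(x):
    return math.comb(N, x) * S**x * (1.0 - S)**(N - x)

def d2_logp(x):
    """Second derivative of ln p(x|S) with respect to S (binomial model)."""
    return -x / S**2 - (N - x) / (1.0 - S)**2

# Negative expected Hessian (Eq. 10) vs. the known Fisher information
neg_hessian = -sum(pmf(x) * d2_logp(x) for x in range(N + 1))
fisher = N / (S * (1.0 - S))
```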
The matrix G gives the geometry of the sensors. If G is a diagonal matrix, the sensors are independent of one another; when G has non-zero off-diagonal (cross) elements, there is a correlation between the sensors. So we can detect the sensors whose values are similar to those of others. With active control at the gateway we can control the sensor activity so as to select only the independent sensors, or to create the form of the physical environment that we want to measure. The distance in the general coordinates of the sensors is given by the expression
    ds² = Σ_{i,j} G_{i,j} dS_i dS_j    (11)

When the sensors are independent we have
    ds² = G_{1,1} dS_1² + G_{2,2} dS_2² + ··· + G_{p,p} dS_p²    (12)

In conclusion, given p sensors it is possible to define a distance (information geometry [1]) between one set of measures and another. The Fisher Information gives us the deformation of the space of the sensors due to noise in the transmission of the information. Morphotronic theory [9] gives us new information about the probabilistic behaviour of the sensors.
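The distance of Eqs. (11)-(12) can be illustrated for two independent binomial sensors, where G is diagonal with G_ii = N/(S_i(1−S_i)); this sketch (our own, with hypothetical sensor values) measures how far apart two nearby sets of measures are in the sensor geometry.

```python
N = 10
S_a = [0.30, 0.70]            # one set of sensor values (hypothetical)
S_b = [0.32, 0.68]            # a second, nearby set

# Diagonal Fisher metric for independent binomial sensors: G_ii = N/(S_i(1-S_i))
G_diag = [N / (s * (1.0 - s)) for s in S_a]

# Eq. (12): ds^2 = sum_i G_ii * dS_i^2
ds2 = sum(g * (b - a) ** 2 for g, a, b in zip(G_diag, S_a, S_b))
```

Because the metric weight N/(S(1−S)) grows near S = 0 and S = 1, the same nominal change in a sensor value counts as a larger information distance at the extremes than in the middle of the range.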
Bibliography

[1] S. I. Amari. Information geometry on hierarchy of probability distributions. IEEE Transactions on Information Theory, 47(5):1701-1711, 2001.
[2] Robin Braun and Zenon Chaczko. Multi dimensional information space view of wireless sensor networks with optimization applications. Computer Aided Systems Theory - EUROCAST 2011, 2011.
[3] Robin Braun, Zenon Chaczko, and Frank Chiang. Towards a new information centric view of wireless sensor networks. 4th International Conference on Broadband Communication, Information Technology & Biomedical Applications, 15th to 18th July 2009, Wroclaw, Poland, pages 1-4, 2009.
[4] Frank Chiang, Robin Braun, and John Hughes. A Biologically Inspired Multi-Agent Architecture for Autonomic Service Management. Journal of Pervasive Computing and Communications, 2(3):261-275, 2006.
[5] George Haidar, Shima Ghassempour, and Robin Braun. Nature-inspired routing algorithm for wireless sensor networks. Australian Journal of Electrical & Electronics Engineering, 9(3):1-8, 2012.
[6] Simon Haykin. Communication Systems, 4th Edition. John Wiley & Sons, Inc., 2000.
[7] Luis Lopes, Francisco Martins, Miguel S. Silva, and Joao Barros. A Formal Model for Programming Wireless Sensor Networks. Technical report.
[8] Sule Nair and Rachel Cardell-Oliver. Formal specification and analysis of performance variation in sensor network diffusion protocols. Proceedings of the 7th ACM International Symposium on Modeling, Analysis and Simulation of Wireless and Mobile Systems - MSWiM '04, page 170, 2004.
[9] Germano Resconi and Zenon Chaczko. Morphotronic System (Theory). Computer Aided Systems Theory - EUROCAST 2009, Lecture Notes in Computer Science, 5717:9-16, 2009.
[10] Yeling Zhang, M. Ramkumar, and N. Memon. Information flow based routing algorithms for wireless sensor networks. 2:742-747 Vol. 2, 2004.