2013 International Conference on Selected Topics in Mobile and Wireless Networking (MoWNeT)

Wireless Infrastructure for Remote Environmental Monitoring: Deployment and Evaluation

Fan Yang∗, Vamsi Gondi†, Jason O. Hallstrom†, Kuang-Ching Wang∗, Gene Eidson‡, Christopher J. Post§
∗Dept. of Electrical and Computer Eng., †School of Computing, ‡Biological Sciences, §Forestry and Natural Resources
Clemson University, Clemson, SC
Email: {fany, vgondi, jasonoh, kwang, geidson, cpost}@clemson.edu

Abstract—The world is witnessing tremendous innovation in wireless sensing technology, which creates opportunities to benefit environmental monitoring in significant ways. However, several challenges in designing the wireless network infrastructure for these applications are not thoroughly discussed by current studies. In general, the monitoring units should provide reliable data communication across diverse environments. The wireless system should also meet requirements that most current platforms fail to satisfy, such as scalable coverage, reliability, robustness, and low operating cost. Finally, the network architecture should be kept simple to aid in the management of infrastructure deployments across disparate environments. With the aim of addressing these challenges, a two-tier network architecture for remote environmental monitoring is described in this study: (i) at the first tier, a sensor gateway is designed to transmit observation data from local sensors; (ii) at the second tier, the infrastructure reliably relays this data to a remote server for analysis via a wireless mesh network. A wide range of environments is supported, from open fields and dense forests, to Wi-Fi areas and cellular-only zones. In this study, a wireless backhaul network comprising wireless sensor gateways was deployed in Aiken, South Carolina. The network performance was systematically evaluated through experimental trials. Results demonstrate the infrastructure's ability to support effective data collection and reliable data transmission.

I. INTRODUCTION

Environmental monitoring is focused on characterizing the key parameters of a target environment. Target environments can include rivers, agricultural fields, forests, oceans, and other areas. Manual data collection would be too costly, and even dangerous, in many of these monitoring scenarios. Instead, wireless sensors that are usually small in size are utilized to monitor the target environment. Wireless sensors equipped with low-cost and low-power radios are suitable for communicating wirelessly over short distances. While innovation in wireless sensor technologies creates opportunities for environmental monitoring, several challenges in designing the supporting wireless network infrastructure required for long-haul transmissions must be addressed. First, the network architecture must provide reliable data communication across diverse environments. Most current solutions are designed to meet project-specific requirements, and hence cannot be broadly applied. Second, the wireless system should provide scalable coverage, robustness, and low operating cost. Finally, the network architecture should be kept simple to aid in the management of infrastructure deployments across geographically distributed environments.

The main thrust of this manuscript is to present a two-tier network architecture that is adaptable to environmental requirements, and at the same time provides reliable transmission, simplified management, and low operating costs. The most important components of the architecture are: (i) the sensor gateway, (ii) the wireless mesh network, and (iii) the Internet gateway. Multiple sensor gateways serve as the first tier of the infrastructure; each is responsible for acquiring data from local sensors and reliably relaying that data to the mesh network. The data is then transmitted by the mesh network to the Internet gateway, which has IP-based access to a remote server. The mesh network serves as the second tier of the infrastructure and offers coverage scalability due to the extensible nature of wireless mesh networks. The remote server provides secure access for data management and analysis.

The rest of the paper is organized as follows: A review of existing remote monitoring systems is conducted in Section II. System requirements, infrastructure design, and component features are presented in Section III. Section IV describes a physical deployment of the network architecture in a city area; a systematic performance evaluation of the deployed network is also included. Conclusions and future work are presented in Section V.

978-1-4799-0506-5/13/$31.00 ©2013 IEEE

II. BACKGROUND AND RELATED WORK

Remote environmental monitoring networks generally consist of three key services: data acquisition [1], data transmission [2], and data analysis [3]. Data acquisition is usually accomplished via wireless sensors, which convert analog signals from environmental sensors into digital signals for local storage, processing, and wireless transmission. Data transmission involves relaying data to remote servers across a wireless transit system. Finally, data analysis involves analyzing, processing, and presenting the acquired environmental data. The focus of this study is on data transmission, an area where there has been significant research in the context of wireless sensor networks. Existing systems often use a star topology with a single base station; the base station in turn provides communication with multiple sensors over long-range data links [2], [4], [5]. More detailed reviews of wireless data transmission networks for remote environmental monitoring can be found in [6], [7]. These networks share a similar architecture, comprising a data acquisition unit that collects information from the environment and a server unit that receives data over a communication link.

One of the disadvantages of existing link designs is that these solutions focus on uplink traffic, often neglecting downlink traffic. While uplink traffic is fundamental to the transmission of environmental data, downlink configuration and control traffic is vital for real-world deployments, as sensor platforms often experience unexpected conditions during long-term operation [8], [9], [10]. The sensor gateway described in this study provides such management capabilities, including the ability to diagnose downlink performance problems during long-term deployments.

Coverage scalability is another issue for many monitoring systems. One of the most important characteristics that differentiates existing systems is the distance between the data acquisition units and the server unit. Several studies describe monitoring systems focused on indoor, short-range monitoring [11], [12]; others describe monitoring and control systems for civil telemetry applications [13], [14], still relatively small in scale compared to large-scale, wide-area environmental monitoring programs. The designs used in these systems are not readily applied in remote environmental monitoring systems, as the strategies for designing long-range communication systems are significantly different [15]. Most existing long-range monitoring systems adopt satellite communication systems for reliable packet transmission [16], [17]. A detailed review of such systems can be found in [15]. While satellite communication is a suitable strategy due to the large coverage area it supports and the reliable communication performance it provides, deploying satellite base stations is not scalable; the base stations are cumbersome and costly.

The main contribution of this paper is the design and evaluation of a network architecture that avoids the complexity of developing and managing wireless infrastructures for disparate environments. We show that it is possible to keep the architecture simple and create a robust, scalable, adaptable, and manageable network that is suited to the intended application. The proposed system simplifies the deployment of the network through auto-configuration, where sensor nodes adapt to changes in network topology in different deployment scenarios.

III. WIRELESS INFRASTRUCTURE DESIGN

In this section, the key design requirements and system architecture are presented.

A. Design requirements

To operate in diverse environments under adverse conditions, the network infrastructure must fulfill the following requirements.

Adaptability to diverse sensing environments: The system must adapt to a range of terrains, including mountains, trees, urban environments, open fields, and dense forests. These environments present fundamentally different signal propagation characteristics.

Scalability: The system must scale to very large areas with minimal configuration and operating effort.

Fault-tolerance: Reliable data transfer must be assured from the sensing end-points to a high-performance computing backbone, even in the presence of transient faults.

Maintenance: The system must be maintainable with minimal effort, supporting remote re-imaging and re-configuration of deployed nodes.

High bandwidth: The system must deliver high bandwidth to accommodate new applications and services, supporting large sensor deployments.

Low latency: To support critical missions, the system must be capable of acquiring real-time data from the sensor fabric when needed. To achieve this, the system must provide low round-trip times between computational servers and sensor end-points.

B. System architecture and implementation choices

The architecture includes sensor gateways, Wi-Fi-based mesh nodes, and Internet gateways. Fig. 1 illustrates the network architecture for remote environmental monitoring. Data acquired by sensors is collected by nearby sensor gateways and forwarded by Wi-Fi mesh nodes to an Internet gateway. The architecture is designed for sensors with radio capabilities described in the IEEE 802.15.4 standard. The standard defines the physical and link layers of a low-cost, low data rate radio suitable for short-range communication in a wireless personal area network. The sensor gateway uses such an interface to receive data from wireless sensors; at the same time, it uses a Wi-Fi interface to transmit data to Wi-Fi mesh nodes. The Internet gateway has a Wi-Fi interface to receive data from the Wi-Fi mesh nodes, while it has a separate interface for Internet communication. To guarantee reliable communication between each gateway and the computational servers, a reliable messaging protocol such as the Advanced Message Queuing Protocol (AMQP) is used. The following implementation choices were made:

Cellular/Ethernet compatibility and failover: The deployment system supports Ethernet at the Internet gateway, as well as a cellular link when this primary connection fails. Ethernet provides higher throughput, bandwidth, and reliability, but may not always be available. A cellular gateway may be adopted, but has the disadvantage of low bandwidth and a high packet drop rate when the cellular signal is weak. The failover technique of preferring Ethernet over the cellular connection guarantees the fastest data transmission, while providing robustness when Ethernet connectivity is not available.

IEEE 802.15.4 compatibility: As shown in Fig. 1, the targeted wireless sensors adopt IEEE 802.15.4 based radios. Hence, an IEEE 802.15.4 receiver is installed in each sensor gateway for data acquisition.

IEEE 802.11 compatibility: In remote environmental monitoring networks, deployment sites typically cover a large area, requiring long-range transmission of data. Consequently, a Wi-Fi mesh network based on the IEEE 802.11 standard is used to relay sensor data.

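The Ethernet-to-cellular failover described above amounts to a simple preference rule. The following is a minimal sketch, assuming a boolean link-status probe per interface; the function and interface labels are ours, not part of the deployed software:

```python
# Hypothetical uplink-selection rule for the Internet gateway:
# always prefer Ethernet, fall back to cellular, otherwise report no uplink.

def select_uplink(ethernet_up, cellular_up):
    """Return the uplink the Internet gateway should use, or None."""
    if ethernet_up:
        return "ethernet"  # highest throughput, lowest loss
    if cellular_up:
        return "cellular"  # fallback: lower bandwidth, higher drop rate
    return None

# Failover: Ethernet lost, cellular available.
assert select_uplink(False, True) == "cellular"
# Failback: once Ethernet returns, it is preferred again.
assert select_uplink(True, True) == "ethernet"
```

A real gateway would evaluate this rule periodically and re-route traffic whenever the selected uplink changes, which matches the observed behavior of switching back to Ethernet as soon as it becomes available.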

Coverage diversity and scalability: The wireless infrastructure must cover a wide range of environments, from open fields and dense forests, to Wi-Fi areas and cellular-only zones. By changing the number of wireless mesh network nodes and the corresponding sensor gateways, the monitored area can vary from a city watershed to a long river bank.

Remote system management: Remote system management is vital for networks deployed in remote areas where site visits are difficult. In our infrastructure, the open-source VPN tool OpenVPN is used due to its secure, reliable, and scalable design. Secure tunnels are created between a centralized VPN server and each gateway client to support traditional SSH-based management. Permanent IP addresses are assigned to the sensor gateways by the VPN server. For example, each has an IP address of 10.9.8.*, where the last number is unique to the device. When many gateways are deployed, it is helpful to maintain a hash table mapping each device number/name to its associated IP address; this makes it simple to look up a device and use its IP address to update its configuration.

Wireless security: Wireless security is always a concern for remote monitoring systems. In this system, each sensor gateway is associated with the wireless mesh nodes through Wi-Fi Protected Access II (WPA-2) [18] and uses AES (Advanced Encryption Standard) for encryption. AES is stronger than the RC4 encryption scheme shared by WEP (Wired Equivalent Privacy) and WPA.

C. System components

In this subsection, the system components used in a specific installation of the architecture are presented. The installation is deployed in downtown Aiken, SC to support a stormwater management application to control the quantity of stormwater runoff and resultant flooding.

1) 1st tier components: The 1st tier consists of wireless sensors and sensor gateways. The function of the sensor gateway is to acquire data from regional sensors and to transmit that data to the backhaul network. An embedded X86 Linux device is used to implement the sensor gateway due to its extensible memory, radio adaptability via PCI cards, (relatively) low power consumption, and low cost. The major hardware components of the sensor gateway are listed in Table I. The embedded Linux device provides two mini-PCI interfaces for installing wireless network interfaces, as well as two standard USB ports for installing IEEE 802.15.4 (and other) transceivers. The gateway uses one network interface for client-level device access, and one network interface for mesh connectivity. In our system setup, an IEEE 802.15.4 radio (with integrated antenna) is connected to the embedded Linux device via USB. The assembled device is placed inside a waterproof enclosure. Data is received through the IEEE 802.15.4 channel, while the gateway relays the data to the remote server using an AMQP service over the Wi-Fi mesh network. Fig. 2 shows the deployed sensor gateway within an enclosure. WPA-2, OpenVPN, and configuration management tools are installed as Linux system libraries. A custom AMQP client is used to communicate with the remote server.

TABLE I
MAJOR HARDWARE COMPONENTS OF A SENSOR GATEWAY
Part Name                      | Note
Motherboard                    | Embedded X86
Mini PCI Wi-Fi card, antennas  | Provides Wi-Fi connectivity
IEEE 802.15.4 radio, antennas  | R/W
4GB CF card                    | Store working device image
Enclosure                      | Waterproof enclosure for X86 and radios

Fig. 1. Multi-tier network architecture

Fig. 2. Embedded Linux gateway in an enclosure

2) 2nd tier components: The 2nd tier consists of Wi-Fi wireless mesh nodes and an Internet gateway. The optimal number of mesh nodes and their locations can be determined both theoretically and empirically so that the operating expense is minimal, while maintaining the required network coverage. Anaptyx mesh nodes were adopted due to their reliable performance, low cost, and web-based configuration tools for remotely managing the deployed mesh nodes [19]. A broadband gateway connects the Wi-Fi mesh to the Internet, allowing data from the deployment site to reach the remote server for analysis. The terminal gateway accesses the Internet through an Ethernet connection, or through an attached cellular USB modem if the primary connection fails.

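The device-to-address bookkeeping suggested under remote system management can be kept as a plain hash table. A minimal sketch follows; the device names are hypothetical, while the 10.9.8.* addressing follows the VPN scheme described in the text:

```python
# Hypothetical gateway registry: map each device name/number to the permanent
# VPN address assigned from the 10.9.8.* range. Only the addressing scheme
# comes from the deployment; the names are illustrative.

def vpn_address(device_number):
    """Derive the permanent 10.9.8.* VPN address from a device number."""
    if not 1 <= device_number <= 254:
        raise ValueError("device number must fit in the last octet")
    return f"10.9.8.{device_number}"

# Hash table mapping device name -> VPN IP, used to look up a gateway
# before pushing configuration updates over SSH through the tunnel.
gateways = {f"sensor-gw-{n}": vpn_address(n) for n in (1, 2, 3, 5)}

assert gateways["sensor-gw-5"] == "10.9.8.5"
```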

Fig. 3. Deployment map of downtown Aiken, SC

IV. DEPLOYMENT AND PERFORMANCE EVALUATION

A. Physical deployment

While similar monitoring systems have been extensively studied in the past few years, most results are of a theoretical nature and were obtained outside of a practical context. This paper examines a physical site deployment of the wireless infrastructure. The remote server is located on campus at Clemson University in Clemson, South Carolina. The network system is deployed in downtown Aiken, South Carolina, as part of an effort to monitor natural stormwater treatment systems. Five wireless mesh nodes are deployed at downtown locations labeled 1-5 in Fig. 3. Four sensor gateways are deployed near locations 1, 2, 3, and 5. The mesh node deployed at location 4 serves as an additional hop due to the long distance between mesh nodes at locations 3 and 5. Up to 6 wireless sensor platforms report environmental data to each sensor gateway every few minutes. The longest distance between adjacent mesh nodes is 100 meters. The Ethernet/cellular broadband Internet gateway is located inside Aiken City Hall, labeled R in Fig. 3, where Ethernet and cellular connections are provided. This site deployment is part of the Intelligent River® project, which provides real-time monitoring, analysis, and management of water resources in South Carolina.

B. Performance evaluation

We now summarize our performance evaluation results. First, we consider overall system performance, including performance in the presence of faults. We then consider studies of latency and throughput. The focus is on the infrastructure design; performance of the local sensors and the back-end server are not considered. Table II summarizes the system performance characteristics discussed in the following subsection.

1) Overall system performance and test cases: The observed maximum throughput from the sensor gateways in Aiken to the Clemson campus network was recorded as 28 Mbps for Ethernet, and 6.5 Mbps for cellular. The average latency was recorded as 91 ms and 330 ms for Ethernet and cellular connections, respectively.

Sensor gateway failure: This test case considers gateway behavior after a power outage, which temporarily prevents the gateway from sending data to the remote server. After power was restored, the sensor gateway successfully restarted, re-established a link with the network, and started a new session with the server. Each sensor gateway is configured to require at least 1 minute for network connection re-establishment to minimize network overhead when all sensor gateways are reconnected. The network reconnection time after power supply disruption was recorded as 1.5 minutes. Maximum recovery time was not recorded, since this value depends on the factors that caused the power outage, as well as how long the power outage persisted.

VPN server failure: This test case considers VPN server behavior after a loss in connectivity or a power outage, which causes the gateway VPN tunnel to fail; data cannot be sent to the remote server. Upon reboot of the VPN server, clients re-established the VPN tunneling interfaces with the sensor gateways and started transmitting data after a minimum time of 2 minutes and 48 seconds.

Mesh node failure: This test case considers mesh node behavior when a power outage occurs, resulting in Wi-Fi connectivity loss at the sensor gateways. Upon restoration of power, the mesh nodes received the IP address assigned by the Internet gateway and provided this address to the sensor gateways to re-establish their Wi-Fi links after 2 minutes and 21 seconds.

Internet gateway failure: This test case considers Internet gateway behavior when a power outage occurs, causing the sensor gateways to halt transmission of data to the remote server. Upon restoration of power, the Internet gateway re-established connections to the sensor gateways, and they re-initiated data transmission after 2 minutes and 10 seconds.

Ethernet/Cellular connection failover: This test case considers the behavior of the Internet gateway when its primary (Ethernet) connection is lost. The Internet gateway successfully switched from Ethernet to cellular to restore Internet connectivity after 2 minutes and 4 seconds. It switched back to Ethernet connectivity when it became available, guaranteeing the fastest data communication option.

2) Latency and throughput evaluations: Fig. 4 illustrates the simplified network topology of the sensor gateways, the mesh nodes, the Internet gateway, and the remote server. A solid circle represents a sensor gateway and mesh node pair; an empty circle represents an independent mesh node. The dotted line shows the data path in the wireless mesh network. As seen in Fig. 4, sensor gateway 1 is the closest node to the Internet gateway. Sensor gateways 2 and 3 are one hop away from the Internet gateway, and sensor gateway 4 is three hops away. The solid line represents the Internet backhaul network, which can be either an Ethernet or cellular connection.

TCP throughput: TCP throughput was measured using iPerf (version 2.0) [20], a widely-used tool to measure the bandwidth and quality of network links. iPerf clients were installed at individual sensor gateways, and an iPerf server was installed at the remote server. Throughput performance was evaluated under two different connection scenarios, Ethernet and cellular.

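The measurement loop performed by an iPerf client can be approximated with ordinary TCP sockets. The following is an illustrative stand-in, not the tool used in the study: it pushes data over a connection for a fixed duration and reports the achieved rate in Mbps, here against a local sink so the sketch is self-contained.

```python
# Simplified iPerf-style TCP throughput measurement: send data for a fixed
# duration and compute Mbps from bytes sent. Illustrative sketch only.
import socket
import threading
import time

def sink(server_sock):
    """Accept one connection and discard everything it sends."""
    conn, _ = server_sock.accept()
    while conn.recv(65536):
        pass
    conn.close()

def measure_tcp_throughput_mbps(host, port, duration_s=1.0):
    """Send data for duration_s seconds; return the achieved rate in Mbps."""
    payload = b"\x00" * 65536
    sent = 0
    with socket.create_connection((host, port)) as s:
        start = time.monotonic()
        while time.monotonic() - start < duration_s:
            s.sendall(payload)
            sent += len(payload)
        elapsed = time.monotonic() - start
    return (sent * 8) / (elapsed * 1e6)

# Local sink standing in for the iPerf server at the remote end.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port
server.listen(1)
threading.Thread(target=sink, args=(server,), daemon=True).start()

mbps = measure_tcp_throughput_mbps("127.0.0.1", server.getsockname()[1])
print(f"{mbps:.1f} Mbps")  # loopback rate; real links are limited by the path
```

In the actual study, the client end would run on a sensor gateway and the sink on the remote server, so the reported rate reflects the mesh and backhaul path rather than loopback.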
For each scenario, throughput was measured every second for a duration of 20 seconds, and the measurements were repeated five times. Default iPerf parameters were utilized for all measurements. Measured average throughput and standard deviation values are shown in Fig. 5. It can be seen from Fig. 5 that the links have much higher throughput and lower variance when Ethernet is used. As Ethernet was the primary interface for the Internet gateway, we focus on evaluation cases where Ethernet was available for the rest of the section. Following the same procedure, we measured the throughput performance of the link between sensor gateway 3 and the remote server, the link between sensor gateway 4 and the remote server, and the link between sensor gateways 1 and 3. Measured results are shown in Fig. 5 and Fig. 6. As there was throughput variation across the tests at each link, throughput is sorted in ascending order, and here we focus on average throughput values. It should be noted that each throughput measurement was conducted independently. Sensor gateways that were deployed at least one hop away had significantly lower throughput measurements (less than 3.5 Mbps) than sensor gateway 1 (more than 25 Mbps, as shown in Fig. 5). Sensor gateway 4 had the lowest throughput, as it was deployed three hops away. It is interesting to see that the link throughput of the wireless mesh network was in the range of 2 to 3.5 Mbps (top curve in Fig. 6), indicating that the network throughput of sensor gateways in the mesh network will be limited by the wireless mesh links. For our deployed system, the TCP throughput bottleneck lies in the Wi-Fi based wireless mesh network. How to improve the throughput bottleneck remains future work.

TABLE II
SYSTEM PERFORMANCE SUMMARY
Performance Index                                                                                  | Measurement
Observed maximum throughput from sensor gateways to Clemson University campus network using Ethernet (Aiken city public network) | 28 Mbps
Observed maximum throughput from sensor gateways to Clemson University campus network using ATT 3G network                       | 6.5 Mbps
Average latency from sensor gateways to Clemson University campus network using Ethernet (Aiken city public network)             | 91 ms
Average latency from sensor gateways to Clemson University campus network using ATT 3G network                                   | 330 ms
Recovery time from sensor gateway failure                                                          | 1.5 minutes
Recovery time from VPN server failure                                                              | 2 minutes and 48 seconds
Recovery time from mesh node failure                                                               | 2 minutes and 21 seconds
Recovery time from Internet gateway failure                                                        | 2 minutes and 10 seconds
Switching time from Ethernet to cellular modem                                                     | 2 minutes and 4 seconds

Fig. 4. Network topology of the deployed system

Fig. 5. Throughput between sensor gateway 1 and remote server

Fig. 6. Throughput measurements on individual wireless mesh links

Latency: In this study, end-to-end latency is calculated, reflecting the packet transmission time from the sensor gateway to the remote server, which can be modeled by the following equation:

t_total = t_gateway + t_mesh + t_backhaul = t_gateway + t_hop * N_hop + t_backhaul    (1)

where t_gateway is the time between a sensor gateway receiving a packet from a local wireless sensor and relaying that packet to the mesh network. t_gateway is mainly determined by the sensor gateway's on-board processor (normally less than 1 ms) and is assumed to be negligible due to the large Internet backhaul delay. t_mesh is the packet transmission time across the mesh network; it is a function of the number of hops N_hop between the mesh node where the packet is sent and the backhaul Internet gateway. It is assumed that the per-hop packet transmission time t_hop is approximately constant. t_backhaul is the packet transmission time between the Internet gateway and the remote server. t_backhaul should be modeled separately, depending on which access port (Ethernet or cellular modem) is used. Fig. 7 shows the schematic of the proposed model.

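Equation (1) is easy to check against the measurements. The following small sketch plugs in the constants fitted in the text (t_gateway = 0 ms, t_hop = 3 ms, t_backhaul = 90 ms for the Ethernet backhaul); the per-gateway hop counts are our reading of the topology in Fig. 4, not values stated explicitly:

```python
# Latency model of Eq. (1): t_total = t_gateway + t_hop * N_hop + t_backhaul.
# Constants are the values fitted in the text for the Ethernet backhaul.

T_GATEWAY_MS = 0    # gateway processing, assumed negligible
T_HOP_MS = 3        # per-hop mesh transmission time
T_BACKHAUL_MS = 90  # Internet backhaul (Ethernet); cellular is modeled separately

def t_total_ms(n_hops, t_backhaul_ms=T_BACKHAUL_MS):
    """End-to-end latency predicted by Eq. (1), in milliseconds."""
    return T_GATEWAY_MS + T_HOP_MS * n_hops + t_backhaul_ms

# Predictions track the measured averages in Table III
# (assumed hop counts: gateway 1 -> 0, gateway 3 -> 1, gateway 4 -> 3):
assert t_total_ms(0) == 90  # sensor gateway 1, measured 91 ms
assert t_total_ms(1) == 93  # sensor gateway 3, measured 94 ms
assert t_total_ms(3) == 99  # sensor gateway 4, measured 99 ms
```

The close agreement, within about 1 ms per link, is consistent with the observation that the Internet backhaul dominates end-to-end latency.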

Fig. 7. Schematic of end-to-end latency model

TABLE III
LATENCY PERFORMANCE OF INDIVIDUAL LINKS
Link                               | Average latency and deviation
Sensor gateway 1 and remote server | 91 ms with 10 ms deviation
Sensor gateway 3 and remote server | 94 ms with 10 ms deviation
Sensor gateway 4 and remote server | 99 ms with 10 ms deviation
Sensor gateways 1 and 3            | 3 ms with 1 ms deviation

Network latency was measured using the Linux ping utility. Following the same procedure used to measure throughput, we summarize the latency performance of individual links within the wireless mesh network in Table III. In our latency model, t_gateway = 0 ms, t_hop = 3 ms, and t_backhaul = 90 ms. Latency between mesh nodes is low compared to the latency at the Internet backhaul, indicating that for our system, the end-to-end latency bottleneck lies in the Internet backhaul. To improve network throughput, increasing the bandwidth of the wireless mesh network is likely to be an effective strategy. As the Internet backhaul network between the Internet gateway and the remote server has already been deployed, it is not easy to further decrease the end-to-end latency in our system, unless better Internet routes are discovered.

V. CONCLUSION

Due to the complexity of remote environmental monitoring, there are many factors to consider when deploying a supporting sensor network. The network architecture presented in this paper provides a base platform that can be adapted to various deployment environments. The infrastructure provides a secure, scalable, and reliable solution for environmental monitoring. The site deployment in Aiken, SC provides a strong real-world evaluation platform. Unit failure tests demonstrate that the system is able to recover from major unit failures within a reasonable amount of time. Network performance measurements identified TCP throughput and latency bottlenecks in the wireless mesh network and the Internet backhaul, respectively. Improving network performance in terms of higher throughput and lower latency will be our future focus.

ACKNOWLEDGMENTS

This work was supported through awards from the NSF (CNS-1126344, CNS-0745846) and the City of Aiken. The authors gratefully acknowledge the support of the Aiken City Council and the Clemson Computing and Information Technology group in installing the monitoring stations.

REFERENCES

[1] C.L. Tseng, J.A. Jiang, R.G. Lee, F.M. Lu, C.S. Ouyang, Y.S. Chen, and C.H. Chang, "Feasibility study on application of GSM-SMS technology to field data acquisition," Comput. Electron. Agric., vol. 53, pp. 45-59, 2006.
[2] T. Wark, P. Corke, P. Sikka, L. Klingbeil, Y. Guo, C. Crossman, P. Valencia, D. Swain, and G. Bishop-Hurley, "Transforming agriculture through pervasive wireless sensor networks," IEEE Pervasive Comput., vol. 6, no. 2, pp. 50-57, Apr.-Jun. 2007.
[3] S. Madden, M. J. Franklin, J. M. Hellerstein, and W. Hong, "TinyDB: An acquisitional query processing system for sensor networks," ACM TODS, 2005.
[4] A. Mainwaring, D. Culler, J. Polastre, R. Szewczyk, and J. Anderson, "Wireless sensor networks for habitat monitoring," in Proc. ACM Int. Workshop on Wireless Sensor Networks and Applic., 2002, pp. 88-97.
[5] G. Werner-Allen, J. Johnson, M. Ruiz, J. Lees, and M. Welsh, "Monitoring volcanic eruptions with a wireless sensor network," in Proc. Wireless Sensor Networks, 2005, pp. 108-120.
[6] J. Rogan and D.M. Chen, "Remote sensing technology for mapping and monitoring land-cover and land-use change," Progress in Planning, vol. 61, pp. 301-325, 2004.
[7] H.B. Glasgow, J.M. Burkholder, R.E. Reed, A.J. Lewitus, and J.E. Kleinman, "Real-time remote monitoring of water quality: a review of current applications, and advancements in sensor, telemetry, and computing technologies," Journal of Experimental Marine Biology and Ecology, vol. 300, pp. 409-448, 2004.
[8] J. Valverde, V. Rosello, G. Mujica, J. Portilla, A. Uriarte, and T. Riesgo, "Wireless sensor network for environmental monitoring: Application in a coffee factory," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 638067.
[9] Z. Chen, P. Prandoni, G. Barrenetxea, and M. Vetterli, "SensorCam: An energy-efficient smart wireless camera for environmental monitoring," in Proc. IPSN '12, ACM, 2012.
[10] W. Liu, X. Fei, T. Tang, R. Li, P. Wang, H. Luo, K. Li, L. Zhang, B. Deng, and H. Yang, "Design and implementation of a hybrid sensor network for Milu deer monitoring," in 14th International Conference on Advanced Communication Technology (ICACT), Beijing, China, 2012, pp. 52-56.
[11] S.Q. Zhou, W. Ling, and Z.X. Peng, "An RFID-based remote monitoring system for enterprise internal production management," International Journal of Advanced Manufacturing Technology, vol. 33, no. 7-8, pp. 837-844.
[12] S. Junnila, H. Kailanto, J. Merilahti, A.-M. Vainio, A. Zakrzewski, M. Vehkaoja, and J. Hyttinen, "Wireless, multipurpose in-home health monitoring platform: Two case trials," IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 447-455, 2010.
[13] B. Ciubotaru-Petrescu, D. Chiciudean, R. Cioarga, and D. Stanescu, "Wireless solutions for telemetry in civil equipment and infrastructure monitoring," in 3rd Romanian-Hungarian Joint Symposium on Applied Computational Intelligence (SACI), May 25-26, 2006.
[14] D.J. Pines and P.A. Lovell, "Conceptual framework of a remote wireless health monitoring system for large civil structures," Smart Materials and Structures, vol. 7, no. 5, pp. 627-636, 1998.
[15] N. Celandroni et al., "A survey of architectures and scenarios in satellite-based wireless sensor networks: system design aspects," International Journal of Satellite Communications and Networking, vol. 31, no. 1, pp. 1-38, 2013.
[16] Advanced Message Queuing Protocol. Website: http://www.amqp.org/
[17] R. Cardell-Oliver, K. Smettem, M. Kranz, and K. Mayer, "Field testing a wireless sensor network for reactive environmental monitoring," in Proceedings of the International Conference on Intelligent Sensors, 2004, pp. 7-12.
[18] Y. Xue, B. Ramamurthy, and M. Burbach, "A two-tier wireless sensor network infrastructure for large-scale real-time groundwater monitoring," in 5th IEEE International Workshop on Practical Issues in Building Sensor Network Applications, Denver, Colorado, 2010.
[19] WPA2. Website: http://en.wikipedia.org/wiki/Wi-Fi_Protected_Access
[20] Anaptyx: Wi-Fi solutions for any size. Website: http://anaptyx.com/
[21] iPerf. Website: http://en.wikipedia.org/wiki/Iperf
