International Conference on Information and Convergence Technology for Smart Society Jan. 21-24, 2015 in Bangkok, Thailand

Camera Trap Synchronization for Wildlife Monitoring System

Chayathorn Simasathien1, Aphirak Jansang1, Chaiporn Jaikaeo1, Nont Kheawwan1, Anan Phonphoem1*, Somphot Duangchantrasiri2

1 Intelligent Wireless Network Group, Department of Computer Engineering, Kasetsart University, Bangkok, Thailand
2 Department of National Parks, Wildlife and Plant Conservation, Thailand
{[email protected]}*

Abstract— In practical wildlife conservation tasks, such as tiger monitoring and density estimation, camera traps are widely deployed, and more than one camera is typically installed at each observation site. However, each camera works independently and is not capable of communicating with the others. In this paper, a Camera Trap Synchronization for Wildlife Monitoring System is proposed. The system has been tested in a real environment at Huai Kha Khaeng Wildlife Sanctuary, Thailand. The results show that the proposed system performs better than the original system.

Keywords— Camera trap; Wildlife Monitoring System; Camera Trap Synchronization

I. INTRODUCTION

Camera traps [1-3] have been widely used in wildlife conservation. The major goals of using camera traps are population size estimation and individual identification. For example, tigers are threatened with extinction in the wild. To correctly classify and identify a tiger, images of its stripe patterns from both sides are needed. Therefore, more than one camera trap is typically deployed at an observation site to capture multi-view images of the wildlife.

However, each camera trap's capturing function depends only on its own sensing unit, a passive infrared (PIR) sensor, installed in its camera set. Once the PIR senses an object in the coverage area, it signals the camera module to start shooting pictures. Since all installed camera traps are designed to work independently, the number of false positive and false negative captures depends on the sensitivity and accuracy of each camera's sensing unit. In practical deployments, such as tiger conservation, researchers cannot confidently conclude that all images retrieved from the cameras at a capturing site come from the same tiger, due to the differing sensing abilities and unsynchronized timing of the cameras.

In this research, a Camera Trap Synchronization for Wildlife Monitoring System is proposed. The system connects all cameras in the same working area and synchronizes the firing signal so that different cameras shoot pictures simultaneously. The system has been deployed and tested in a real forest at Huai Kha Khaeng Wildlife Sanctuary, Thailand.

The paper is organized as follows. The next section describes related and previous work. Sections III and IV present the details of the proposed system and its evaluation, respectively. Section V provides the discussion and conclusion.

II. RELATED WORK

From the researchers' and users' point of view, current camera trap products do not support required features such as a synchronization mechanism. They are commercial products that are not open for modification and enhancement, and no schematics or detailed designs are provided.

In [4], the researchers designed their own camera trap, called TigerCENSE, for capturing images of tigers. The device is powered by a rechargeable battery integrated with solar energy harvesting. However, the experiments were done in a zoo, not in a real forest environment. In [8], our team designed a new camera trap and capturing system for taking wildlife pictures, in which captured images are transferred in real time from the monitoring site to a remote researcher camp (3-5 km away) using long-range Wi-Fi technology. However, that system was only evaluated in lab tests, not in the real environment.

Neither [4] nor [8] fully addresses harsh environments such as a tropical rain forest. To deploy a camera trap in the real environment, device durability, operating temperature, battery life, lens fogging, and humidity must be major design concerns. Our team therefore decided to modify a commercial camera trap designed for harsh environments, instead of developing a new system. In [7], our team modified a commercial camera to perform the synchronization function. However, that prototype was only tested in the lab. We also found that the additional microcontroller board, powered by the shared camera battery, quickly drained the battery and shortened the camera's operating life.

The camera trap in [7] was modified by adding a Jennic microcontroller that detects the capture signal from the PIR sensor and sends it to the other cameras via IEEE 802.15.4 radios. An SD-card Wi-Fi module was inserted into the camera to back up the stored pictures to external Wi-Fi storage. The experiments were conducted in the lab.

III. PROPOSED SYSTEM

Based on the previous work [7], we redesigned the system. The new printed circuit board for the microcontroller is more compact, so it fits inside the camera case. A power save mode has also been added to extend battery life. Most importantly, the new camera system has been tested in the real environment.

A. Target commercial camera trap for modification

Around 150 camera traps are currently deployed at the wildlife monitoring site at Huai Kha Khaeng Wildlife Sanctuary, Thailand. They are Bushnell Trophy Cameras [5] (cost US$200) with a PIR sensor, weatherproof housing, SD-card slot, 1-3 images per trigger, and 6-megapixel full-color resolution. The target commercial camera trap is shown in Fig. 1.

Figure 2. System block diagram


Figure 1. Bushnell Trophy Camera: (a) front, (b) inside

B. Hardware Modification

The Bushnell Trophy Camera is a commercial product for which no schematic or detailed design is provided. Therefore, we modified the existing camera as follows. Fig. 2 shows the system block diagram. The system is divided into three parts: the Sensing, Firing, and Signal Transmitting parts.

In the Sensing part, after locating the source of the PIR sensing signal with an oscilloscope, we hooked a wire from the output port of the PIR controller module and connected it to the microcontroller board. In the Firing part, a cable is wired from the microcontroller board to the output port of the PIR sensor, which is the input of the PIR controller module. The proposed mechanism is to fake the PIR controller module with the Firing signal instead of the real signal from its own PIR sensor; the output signal from the PIR controller module then triggers the camera to start taking pictures. Once a trigger signal has been captured, it is transmitted to the neighboring cameras so that they start taking pictures simultaneously.

The Signal Transmitting part broadcasts the Firing signal to neighboring cameras in the coverage area through the Tx module. The coverage area is around 15-25 meters. Meanwhile, the Rx module on the microcontroller board regularly wakes up to listen for Firing signals from other cameras in the group. The microcontroller hardware was first implemented in [7] using the Jennic Wireless Microcontroller JN5148 [6], a 32-bit RISC processor with a 2.4 GHz IEEE 802.15.4 transceiver, 128 kB ROM, and 128 kB RAM. In this paper, the new microcontroller board, shown in Fig. 3, replaces the original Jennic with the JN5168, a trimmed-down version that consumes less power at a lower cost.
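As a rough, host-runnable illustration of this signal path, the sketch below models the decision logic with stubbed I/O. The helper names, the group identifier used to ignore cameras from other sites, and the rule of not rebroadcasting a received Firing signal (to avoid cameras re-triggering each other) are our own assumptions for illustration; they are not taken from the paper's firmware or the Jennic SDK.

```c
/* Host-side sketch of the modified camera's firing logic.  All helper
 * names, the group id, and the no-rebroadcast rule are hypothetical
 * illustrations, not the actual firmware or the Jennic SDK API. */
#include <stdint.h>
#include <stdio.h>

#define MY_GROUP 0x0001   /* cameras at the same observation site */

/* Firing part: pulse the wire feeding the PIR controller module input,
 * so the unmodified camera electronics believe the PIR fired and
 * trigger a picture. */
static void pulse_fake_pir_line(void)
{
    printf("  -> pulse fake PIR line, camera shoots\n");
}

/* Signal Transmitting part: broadcast the Firing signal over 802.15.4. */
static void broadcast_firing(uint16_t group)
{
    printf("  -> broadcast Firing signal (group %u)\n", (unsigned)group);
}

/* Sensing part: called when the tap on the PIR controller module output
 * goes high, i.e. this camera's own PIR has already triggered a shot. */
static void on_local_pir_detect(void)
{
    /* the local camera is already shooting through its own PIR chain,
     * so only the neighbours need to be told */
    broadcast_firing(MY_GROUP);
}

/* Called when a Firing signal is received from a neighbour camera. */
static void on_remote_firing(uint16_t group)
{
    if (group != MY_GROUP)
        return;                /* ignore cameras from other sites */
    pulse_fake_pir_line();     /* shoot even though our own PIR saw nothing */
    /* no rebroadcast here, so two cameras cannot keep re-triggering
     * each other */
}

int main(void)
{
    printf("local PIR event:\n");
    on_local_pir_detect();
    printf("remote Firing signal:\n");
    on_remote_firing(MY_GROUP);
    return 0;
}
```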

Figure 3. Modified camera trap with the microcontroller board: (a) front, (b) inside

C. Power Saving Mechanism

The Bushnell camera itself operates at very low power: it consumes less than 1 mA in idle mode and 100 mA while taking pictures. The Jennic controller board consumes 15 mA in transmitting mode, approximately 17 mA in doze mode with the receiving function enabled, and 0.7 µA in sleep mode. Normally the Jennic controller board runs in fully active mode; however, in the proposed system it shares power from the camera battery. To prolong the camera's operating lifetime, we therefore implemented the simple power saving mechanism shown in Fig. 4. In Rx mode, the Jennic controller board stays awake for 10 milliseconds (ms), followed by 250 ms of sleep. In Tx mode, the Jennic controller board repeatedly transmits the Firing command for a 270 ms period, to ensure that the Jennic modules installed in the neighboring cameras wake up and detect the Firing signal from the camera whose PIR sensor detected the wildlife or object.
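A quick sanity check of this timing: a receiver that is awake for 10 ms out of every 260 ms cycle will always have at least one complete listening window inside any transmit burst of 270 ms or more, whatever the phase offset between the two boards. The brute-force check below (plain C on a host, using the figures quoted above) confirms this; the assumption that one full 10 ms window is needed for successful reception is ours and is conservative. The comments also estimate the average receive-side current under the stated 17 mA / 0.7 µA figures.

```c
/* Brute-force check of the wake-up timing: with a 10 ms listen window
 * every 260 ms (10 ms awake + 250 ms sleep), does a 270 ms firing burst
 * always contain at least one complete listen window, for every possible
 * phase offset?  Figures come from the text; requiring a *complete*
 * window for successful reception is our conservative assumption.
 *
 * Average Rx-side current with these numbers:
 *   (10/260) * 17 mA + (250/260) * 0.0007 mA  ~= 0.65 mA,
 * far below keeping the radio in receive mode continuously.
 */
#include <stdbool.h>
#include <stdio.h>

#define AWAKE_MS  10
#define SLEEP_MS  250
#define PERIOD_MS (AWAKE_MS + SLEEP_MS)   /* 260 ms */
#define BURST_MS  270

/* true if a burst starting at 'offset' ms (relative to the receiver's
 * cycle) fully contains some listen window [k*PERIOD, k*PERIOD+AWAKE) */
static bool burst_covers_window(int offset)
{
    for (int k = 0; k * PERIOD_MS < offset + BURST_MS; k++) {
        int w_start = k * PERIOD_MS;
        if (w_start >= offset && w_start + AWAKE_MS <= offset + BURST_MS)
            return true;
    }
    return false;
}

int main(void)
{
    for (int offset = 0; offset < PERIOD_MS; offset++) {
        if (!burst_covers_window(offset)) {
            printf("missed at offset %d ms\n", offset);
            return 1;
        }
    }
    printf("a %d ms burst always covers a full %d ms listen window\n",
           BURST_MS, AWAKE_MS);
    return 0;
}
```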

Figure 6. Captured images from modified cameras: (a) left side, (b) right side


Figure 4. Power saving mode implemented on microcontroller device

IV. EVALUATION

The proposed system has been installed and tested by both researchers and local forestry officers at Huai Kha Khaeng Wildlife Sanctuary, Thailand.

A. System Verification

To verify that the proposed system works in the real environment, two modified cameras were installed and left running for 5 days. Fig. 5 shows the installation site. The cameras were located 6 meters apart and 80 cm above the ground. The coverage areas of the two cameras' PIR sensors do not overlap, so the two PIR sensors never sense an object at the same time; nevertheless, both cameras need to take a picture simultaneously, even when one camera's own PIR sensor does not sense any object. The results show that both cameras work properly: a red bull was captured by both, as shown in Fig. 6. Note that the time stamps of the two cameras are around 3 seconds apart, due to human error during the initial time setup, which was done separately and manually for each camera.

B. System Comparison

To test the correctness of the proposed system, the modified cameras were benchmarked against the original cameras. We deployed the cameras on a real tiger trail in Huai Kha Khaeng Wildlife Sanctuary for 12 days. As shown in Fig. 7(a), the left-side and right-side cameras are 5 meters apart. The original cameras (camera 2, shown in Fig. 7(b)) are located 1 meter above the ground, while the modified cameras (camera 1) are set up 0.17 meters above the original cameras. One example result, Fig. 8, shows that both cameras of the proposed system fired simultaneously and captured both-side pictures of a leopard during the daytime, while only one of the original cameras was triggered and captured only a one-side image. In another example result, Fig. 9, both systems correctly took both-side images of a tiger at night.

Figure 7. Camera trap setup for system comparison: (a) camera setup, (b) camera installation

However, after investigating the whole set of captured images from both systems during the system comparison, the overall results are as shown in Table I.

Figure 5. Camera trap setup for system verification (right-side and left-side cameras)

In Table I, for example, event 0:1 means that an object was present in front of the cameras, but neither camera of the original system correctly took a picture, while one camera of the proposed system correctly captured the object. Event 0:1 occurred 5 times. The interesting events are 2:0, 2:1, and 2:2, which occurred 11, 2, and 48 times, respectively; in these events, both cameras of the original system correctly fired and captured both-side images, for a total of 61 out of 102 events, or 59.80%. Similarly, events 0:2, 1:2, and 2:2 occurred 7, 11, and 48 times, respectively; in these events, both cameras of the proposed system correctly fired and captured both-side images, for a total of 66 out of 102 events, or 64.71%.


TABLE I. THE NUMBER OF FIRING CAMERAS

# of firing cameras     # of events       # of events with object(s) taken by both firing cameras
(original : proposed)   with object(s)    Original system      Proposed system
0:1                     5                 0                    0
0:2                     7                 0                    7
1:0                     5                 0                    0
1:1                     13                0                    0
1:2                     11                0                    11
2:0                     11                11                   0
2:1                     2                 2                    0
2:2                     48                48                   48
Total                   102               61 (59.80%)          66 (64.71%)

Note: 0 means no picture taken (the camera has not been triggered).
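As a consistency check, the short program below (plain C, with the counts copied from Table I) recomputes the totals: 61 of 102 events (59.80%) where the original system captured both sides, and 66 of 102 events (64.71%) for the proposed system.

```c
/* Recompute the Table I totals: for each firing pattern
 * "original:proposed", count how often each system captured both sides. */
#include <stdio.h>

int main(void)
{
    /* rows: {original cameras fired, proposed cameras fired, events} */
    const int rows[][3] = {
        {0, 1, 5}, {0, 2, 7}, {1, 0, 5}, {1, 1, 13},
        {1, 2, 11}, {2, 0, 11}, {2, 1, 2}, {2, 2, 48},
    };
    int total = 0, orig_both = 0, prop_both = 0;

    for (unsigned i = 0; i < sizeof rows / sizeof rows[0]; i++) {
        total += rows[i][2];
        if (rows[i][0] == 2) orig_both += rows[i][2];  /* 2:x rows */
        if (rows[i][1] == 2) prop_both += rows[i][2];  /* x:2 rows */
    }
    printf("total events: %d\n", total);                    /* 102   */
    printf("original both-side: %d (%.2f%%)\n", orig_both,
           100.0 * orig_both / total);                      /* 59.80 */
    printf("proposed both-side: %d (%.2f%%)\n", prop_both,
           100.0 * prop_both / total);                      /* 64.71 */
    return 0;
}
```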

Another observation is that the proposed system, on average, shows around 31% false positives (the camera fired but no object appears in the image), while the original system, on average, shows around 23% false positives.

Figure 8. Example I results from the system comparison: (a) left side modified camera, (b) right side modified camera, (c) left side original camera, (d) right side original camera

The false positive cases might be caused by the environment, the speed of the object (disappearing from the sensing area very quickly), or the PIR sensors themselves. Regarding battery consumption, no conclusion can be drawn yet, because the many false positives and false negatives of both systems need to be investigated further.

Figure 9. Example II results from the system comparison: (a) left side modified camera, (b) right side modified camera, (c) left side original camera, (d) right side original camera

V. DISCUSSION AND CONCLUSIONS

The system verification shows that the proposed system works correctly. The system comparison between the proposed and original systems reveals that the proposed system performs better than the original system by around 5%: there are more chances for both-side images to be taken, which increases the correctness of wildlife identification and population size estimation. Although 5% may not seem like a big improvement, the proposed system increases the chance of capturing rare wildlife or important events that may not occur frequently.

ACKNOWLEDGMENT

Many thanks to the local forestry officers at Huai Kha Khaeng Wildlife Sanctuary, Thailand.

REFERENCES

[1] A. F. O'Connell, J. D. Nichols, and K. U. Karanth, Camera Traps in Animal Ecology: Methods and Analyses. Springer, 2010.
[2] J. M. Rowcliffe and C. Carbone, "Surveys using camera traps: are we looking to a brighter future?," Animal Conservation, vol. 11, pp. 185-186, 2008.
[3] J. M. Rowcliffe, J. Field, S. T. Turvey, and C. Carbone, "Estimating animal density using camera traps without the need for individual recognition," Journal of Applied Ecology, vol. 45, pp. 1228-1236, 2008.
[4] R. Bagree, V. Jain, A. Kumar, and P. Ranjan, "TigerCENSE: Wireless Image Sensor Network to Monitor Tiger Movement," in Real-World Wireless Sensor Networks, vol. 6511, P. Marron, T. Voigt, P. Corke, and L. Mottola, Eds. Springer Berlin Heidelberg, 2010, pp. 13-24.
[5] Bushnell.
[6] Jennic Wireless Microcontrollers, http://www.jennic.com/
[7] C. Simasathien, A. Phonphoem, C. Jaikaeo, N. Kheawwan, and A. Jansang, "Multi-camera Synchronization Real-time Tiger Capturing System through Wireless Controller," in 6th ECTI Conference on Application Research and Development (ECTI-CARD 2014), Chiangmai, Thailand, 2014. (in Thai)
[8] W. Tangtrongpairoj, A. Jansang, C. Jaikaeo, and A. Phonphoem, "Real-time Wildlife Monitoring over Wireless Network," in 3rd ECTI Conference on Application Research and Development (ECTI-CARD 2011), Bangkok, Thailand, 2011, pp. 136-140. (in Thai)
