Embedded Wireless Web Server Camera

Carlos Massa
Cephis, Universitat Autònoma de Barcelona
ETS d'Enginyeria, 08193 Bellaterra
[email protected]

Eloi Ramon
Dept. d'Enginyeria Electrònica, Cephis, Universitat Autònoma de Barcelona
ETS d'Enginyeria, 08193 Bellaterra
[email protected]

Jordi Carrabina, Borja Martínez, Enric Pons
Dept. de Microelectrònica i Sistemes Electrònics, Cephis, Universitat Autònoma de Barcelona
ETS d'Enginyeria, 08193 Bellaterra
{Jordi.Carrabina, Borja.Martinez, Enric.Pons}@uab.es

Abstract
Approval of the IEEE 802.11 standard for wireless local area networking (WLAN) and the rapid progress toward higher data rates have turned the promise of truly mobile computing into reality. While wired LANs have been mainstream technology for at least fifteen years, WLANs are still unknown territory for most applications. A great deal of work on wireless sensor technologies has been done in the past few years. In this paper we present a new approach that uses small processors with limited resources as transmission bridges, carrying small packets of sensor data over already installed high-data-rate WLAN networks.

1. Introduction
A wireless LAN (WLAN) is a data transmission system designed to provide location-independent network access between computing devices using radio waves. In the corporate enterprise, WLANs are usually implemented as the link between the existing wired network and a group of client computers, giving these users wireless access to network resources across a building or campus. The Institute of Electrical and Electronics Engineers (IEEE) ratified the original 802.11 specification in 1997 as the standard for wireless LANs. This first version of 802.11 provides 1 Mbps and 2 Mbps data rates and a set of fundamental specifications and services. Recognizing the critical need for higher data-transmission rates, the IEEE ratified the 802.11b standard for transmissions of up to 11 Mbps. Other standards introduced later allow transmissions of up to 54 Mbps. Like all IEEE 802 standards, the 802.11 standards focus on the bottom two layers of the ISO model, the physical layer and the data link layer. Any LAN application, network operating system, or protocol, including TCP/IP and UDP, will run on an 802.11-compliant WLAN as easily as over wired Ethernet. 802.11b WLANs communicate using radio waves because these waves penetrate many indoor structures or can be reflected off obstacles. WLAN throughput depends on several factors, including the number of users, coverage range, interference and multipath propagation, among others. 802.11 defines two pieces of equipment: a wireless station and an access point (AP), which acts as a bridge between the wireless and wired networks. The 802.11 standard also defines two modes: infrastructure mode and ad hoc mode. In infrastructure mode, the wireless network consists of at least one access point connected to the wired network infrastructure and a set of wireless end stations [1]. Ad hoc mode (also called peer-to-peer mode) is a set of 802.11 wireless stations that communicate directly with one another without using an access point or any connection to a wired network.

The paper is structured as follows. The next section describes the architecture of the system. Implementation issues are presented afterwards. In the concluding sections we outline the main features of the designed system and discuss its application scope and future work.
2. System Architecture
Figure 1 shows the complete system block diagram. The ATmega128 microcontroller has its data and address buses routed to an expansion connector, which makes the architecture expandable and the design modular. First, the WLAN Expansion Board was attached to the system and its correct operation verified. The microcontroller interfaces with the WLAN Expansion Board interface circuit, which sends and receives data to and from the PCMCIA card [2]. Then, the CMOS Image Sensor Control Board was added to the system. In the same way, the microcontroller requests frames from the CMOS Image Sensor Control Board interface circuit, which grabs the first complete frame coming from the CMOS image sensor and stores it in the SRAM. This SRAM acts as a buffer, because the microcontroller is not fast enough to read all the pixels of a frame on the fly and would lose data. After the interface circuit has stored the entire frame in the SRAM, the microcontroller reads the whole frame back from the memory through the same interface circuit.

Figure 1. System Block Diagram (Ethernut 2.0 Rev. A with ATmega128 microcontroller; expansion connector carrying the data and address buses; WLAN Expansion Board with XC95108 CPLD interface circuit and PCMCIA PRISM 2 chipset card; CMOS Image Sensor Control Board with XC95216 CPLD interface circuit, OV6630 CMOS image sensor and IDT71024 128K x 8 bit SRAM)

To sum up, the complete data flow from the moment a frame is requested until it reaches its destination is as follows:
• The microcontroller detects an HTTP connection through the PCMCIA card.
• The microcontroller then asks the CMOS Image Sensor Control Board for a new frame, and this board stores the first frame that comes from the CMOS image sensor in the SRAM.
• After this, the microcontroller reads the whole frame pixel by pixel from the SRAM through the interface circuit and packages the data in TCP packets.
• These data packets, together with control commands, are sent to the PCMCIA card through the WLAN Expansion Board interface circuit.
• Finally, the PCMCIA card sends the packets to the destination wirelessly, using the 802.11b protocol.

Figure 3. WLAN Expansion Board
3. Implementation Issues
To develop the wireless web server camera, the design has been based on the Ethernut board. Ethernut is an open source hardware and software project for building tiny embedded Ethernet devices.

3.1. Hardware
The design platform hardware includes a small board, called Ethernut [3], equipped with an Atmel ATmega128 CPU and a Realtek RTL8019AS (Ethernut 1) or LAN91C111 (Ethernut 2) Ethernet controller. The latter offers fast 100 Mbps wired Ethernet, but it is not used in this design.
Figure 2. Ethernut 2.0 Rev. A
Specifically, this system uses the Ethernut 2.0 Rev. A PCB, together with two expressly designed add-on boards attached to its expansion connector. The first of these boards implements the physical layer of the 802.11b protocol, while the second manages a color CIF (352 x 288) CMOS image sensor from OmniVision [4], the OV6630. Figure 2 shows the Ethernut 2.0 Rev. A board, while figures 3 and 4 show the board that provides wireless connectivity and the one that controls the CMOS image sensor, respectively.
As can be seen in figure 3, a commercial PCMCIA wireless card has been used; for a quick prototype, this was the fastest way to get the design running. The only requirement was that the card use a PRISM II, Lucent or Hermes chipset, because the wireless driver is based on FreeBSD source code written for these chipsets. The card used in this board has a PRISM II chipset [5]. In a future design, the PRISM II chipset could be used instead of the whole card, since the card merely routes its pins to the chipset pads, with no extra logic or passive elements. It is also important to note that the wireless card must contain running firmware in order to be driven. In a first draft, a low-power Type-1 CompactFlash wireless LAN card from Socket was used. Despite its remarkably small size and its low power consumption in idle mode, we abandoned it because this card, like the Symbol Spectrum24 High Rate card and the Intel PRO/Wireless 2011B, does not contain firmware, so the driver has to download a firmware image into the card before it can be used, which is not trivial.

Furthermore, the main problem when driving PCMCIA and CompactFlash cards from an AVR microcontroller is the IORDY line. If the card activates this signal, the microcontroller has to extend its access cycle. There are three possible solutions:
1. Using I/O ports instead of memory-mapped access. This occupies many port bits and is typically slow.
2. Inserting delay loops in the software at certain points. This requires some trial and error, may fail when changing cards, and is also slower.
3. Building hardware to emulate cycle extension.
Solution 3 has been adopted, implementing a state machine in the XC95108 Xilinx CPLD.
The board that manages the CMOS image sensor is shown in figure 4. It consists of an XC95216 Xilinx CPLD and a 1 Mb (128K x 8 bit), 12 ns access time SRAM from IDT (IDT71024). When the ATmega128 microcontroller on the Ethernut board asks for a new frame, the CPLD receives data from the CMOS image sensor and saves it in the SRAM. The SRAM acts as a buffer because the microcontroller is not fast enough to pick up all the data directly from the image sensor. The ATmega128 then reads the frame pixel by pixel from the SRAM through the CPLD, which synchronizes the whole frame-grabbing process.
Figure 4. CMOS Image Sensor Control Board

3.2. Software
The Ethernut project provides well-documented software: an open source implementation of a real-time operating system called Nut/OS and a TCP/IP protocol suite named Nut/Net. Nut/OS is an intentionally simple RTOS for the ATmega128 that provides the minimum set of services needed to run Nut/Net, the TCP/IP stack. Its features include:
• Non-preemptive multithreading.
• Events.
• Periodic and one-shot timers.
• Dynamic heap memory allocation.
• Interrupt-driven streaming I/O.
The main features of the TCP/IP stack are:
• Base protocols ARP, IP, ICMP, UDP and TCP.
• User protocols DHCP, DNS and HTTP.
• Socket API.
• Host, net and default routing.
• Interrupt-driven Ethernet driver.

Additionally, to increase the prototype's versatility and to make system tests handier, the CMOS image sensor has not been soldered directly to the CMOS Image Sensor Control Board, but connected through a flat cable to a tiny new board equipped with the needed passives (figure 5).

Figure 5. CMOS Image Sensor With and Without Lens
The system is highly configurable through the Nut/OS configuration manager, and there is extensive online API documentation [6]. The actual application has been written on top of this RTOS. The main routine is a simple loop waiting for a connection on port 80 (HTTP). Once a connection is established, the Ethernut serves the files required by the client, which are stored in a 512 kB flash memory and managed by a file system called urom. In addition, a FreeBSD WLAN driver [7] [8] has been adapted for this microcontroller. To make things work, some software structures had to be written so that the Nut/Net functions point to the new input, output and control driver functions.

The firmware in the CPLDs has been developed using both schematic capture and VHDL, with the free design tool ISE WebPACK 6.3 from Xilinx [9], and has been tested with the ModelSim verification and debugging tool [10].

The OV6630 CMOS image sensor supplies raw data. From this data we can build a bitmap image, but the necessary BMP headers and color palette have to be created by software [11]. This is not difficult if we assume a fixed picture resolution of 320 x 240 pixels in black and white.

Figure 6. The Whole System
4. Results
Figure 6 shows the whole mounted system. On the right of the picture, a small part of the Ethernut board can be seen at the bottom of the stack of three boards. Above it sit, in order, the WLAN Expansion Board and, on top, the CMOS Image Sensor Control Board. The CMOS image sensor with its lens can also be seen, connected to the CMOS Image Sensor Control Board through a flat cable.
The picture resolution used in this system is 320 x 240 pixels in black and white. Figure 7 shows a frame received with the OV6630 sensor. Experimental tests show that the bit rate achieved with this solution is approximately 70 kB/s. At this bit rate, an image can be obtained approximately every second:

1 frame = 320 x 240 pixels
1 pixel = 8 bits
1 frame = 320 x 240 bytes = 75 kB

Real-time video cannot yet be achieved at this bit rate, but in any case a versatile wireless web camera has been built that can be accessed from any web browser via HTTP. Moreover, it is not only a remotely controlled camera but also a web server, with all the possibilities that implies.

Figure 7. 320 x 240 Bitmap received with the CMOS Image Sensor OV6630

4.1. Real Time Video Considerations
Considering that a minimum of 20 frames per second is needed for acceptable real-time video quality, the system should be about 20 times faster to offer this feature. This could be achieved either by increasing the microcontroller clock frequency or by reducing the image size. To reduce the image size, a lower resolution could be sent, but this is not the best solution. Maintaining the 320 x 240 resolution, the image could instead be compressed in JPEG format. This compression achieves a ratio of about 1:20, which would lead to the desired goal. The compression could be performed by an ASIC, or by an IP core [12] implemented in an FPGA, which would be cheaper and provide the same functionality.
5. Future Work
As future work, both CPLDs could be merged into a single FPGA, which would be cheaper and occupy less space. The system throughput could also be improved by adding an IP core implementing a JPEG hardware compressor, so that real-time video could be received from a standard web browser instead of one image every second or so. A pan-and-tilt kit [13] could also be added and controlled by the microcontroller, allowing the remotely connected user to control where the camera is aiming.
6. Conclusions
The overall conclusion is that a preliminary prototype of an embedded wireless camera with an integrated web server has been built, with a rather good bit rate considering the limited 8-bit microcontroller used, running at 14 MHz. Its most remarkable feature is the flexibility offered by a system that needs no data cables and speaks the HTTP protocol. A new design platform has thus been created, upon which multiple new applications can be developed, adding new functionality and adapting it to different demands, ranging from security cameras to artificial vision in autonomous mobile robots.
References
[1] F. Ohrtman and K. Roeder, Wi-Fi Handbook: Building 802.11b Wireless Networks. McGraw-Hill, 2003.
[2] D. Anderson, PCMCIA System Architecture: 16-bit PC Cards, 2nd Edition. Addison-Wesley, 1995.
[3] Ethernut web page: http://www.ethernut.de/
[4] OV6630 Single-chip CMOS CIF Color Digital Camera. Advanced Information Preliminary. OmniVision, version 1.0, March 4, 2000.
[5] PRISM Driver Programmers Manual. Intersil, version 2.3, June 2002.
[6] Ethernut online API documentation: http://www.ethernut.de/api/index.html
[7] Lucent WaveLAN/IEEE 802.11 PCMCIA driver: http://fxr.watson.org/fxr/source/dev/wi/if_wi.c
[8] B. Martínez, I. Tarruella, D. Castells, J. Carrabina, "Adaptación de un driver WLAN Linux a una arquitectura NIOS-Avalon", Proceedings JCRA, pp. 297-307, September 2004.
[9] Xilinx web page: http://www.xilinx.com
[10] ModelSim web page: http://www.model.com/
[11] BMP format: http://atlc.sourceforge.net/bmp.html
[12] JPEG hardware compressor IP core: http://www.opencores.org/projects.cgi/web/jpeg/overview
[13] Pan and tilt kit: http://www.lynxmotion.com/Product.aspx?productID=162&CategoryID=61