Article

A Solar Position Sensor Based on Image Vision

Adolfo Ruelas 1,*, Nicolás Velázquez 2, Carlos Villa-Angulo 2, Alexis Acuña 1, Pedro Rosales 1 and José Suastegui 1

1 Facultad de Ingeniería, Universidad Autónoma de Baja California, Blvd. Benito Juárez S/N, Mexicali 21280, Mexico; [email protected] (A.A.); [email protected] (P.R.); [email protected] (J.S.)
2 Instituto de Ingeniería, Universidad Autónoma de Baja California, Calle de la Normal S/N, Mexicali 21280, Mexico; [email protected] (N.V.); [email protected] (C.V.-A.)
* Correspondence: [email protected]; Tel.: +52-686-172-4821

Sensors 2017, 17, 1742; doi:10.3390/s17081742

Received: 26 June 2017; Accepted: 26 July 2017; Published: 29 July 2017
Abstract: Solar collector technologies perform better when the direction of the Sun's beam is normal to the capturing surface, and for that to happen despite the relative movement of the Sun, solar tracking systems are used; consequently, there are rules and standards that require a minimum accuracy from the tracking systems used in the evaluation of solar collectors. Achieving that accuracy is not an easy task, hence this document presents the design, construction and characterization of a sensor based on a vision system that finds the relative azimuth and elevation error of the Sun with respect to the surface of interest. With these characteristics, the sensor can be used as a reference in control systems and in their evaluation. The proposed sensor is based on a microcontroller with a real-time clock, inertial measurement sensors, geolocation and a vision sensor, and it obtains the angle of incidence from the direction of the Sun's rays as well as the tilt and position of the sensor. The sensor's characterization proved that a measurement of the focus error or of the Sun's position can be made with an accuracy of 0.0426° and an uncertainty of 0.986%, which can be modified to reach an accuracy under 0.01°. The sensor was validated by determining the focus error of one of the best commercial solar tracking systems, a Kipp & Zonen SOLYS 2. To conclude, the solar tracking sensor based on a vision system meets the Sun-detection requirements and has components that meet the accuracy conditions to be used in solar tracking systems and their evaluation, or as a tracking and orientation tool in photovoltaic installations and solar collectors.

Keywords: tracking; vision; solar position
1. Introduction

The increase in fossil fuel and electric energy generation costs has driven the development of technologies that take advantage of renewable energy sources. Solar energy is one of them: it is abundant and can be used in diverse applications, for instance water heaters, solar kitchens, process heat generation in industry, and electric energy generation. Solar capture technologies perform better when the direction of the Sun's rays is normal to the capture surface; however, this is not always possible due to the continuous apparent movement of the Sun. To deal with this, there has been a great variety of developments in solar tracking systems [1–3], where it has been found that photovoltaic systems with tracking on one or two axes can capture 30 to 40% more energy in comparison with a fixed one [4–7]. When talking about concentrating solar power collectors, the tracking system needs to be precise, for the Sun's rays to fall within the collector's acceptance angle and ensure a satisfactory thermal behavior [8,9].

In the literature, different algorithms for control and automation of solar tracking have been reported; they can be classified into open-loop, closed-loop and hybrid control systems. The first one
uses astronomic equations to determine the location of the Sun and position the collection surface area [10–13]. For closed-loop systems, differential light sensors based on phototransistors [14–16], photoresistors [9,17,18], photovoltaic cells [19,20] and image-processing-based systems [21–25] are used. For the third strategy, a hybrid is used, in which the Sun's position is calculated to bring it to a nearby focus point, and from then on, the sensor itself makes a precise adjustment [26–29].

In papers about tracking systems, their design, construction, development and functionality tests are regularly mentioned. Despite that being the main objective when different control systems are tested to determine whether they meet the policies and standards required to test solar tracking systems, a specific process that can determine the solar focus error, or that makes the evaluation of the tracking system independent of the capture system on which it is used, has not been reported.

Therefore, it has become necessary to develop a sensor to evaluate the accuracy of a solar tracking system, regardless of its application. Inspired by visual systems with image processing [23–25], which have proven useful for determining the focus error in pixels, this document describes the design, construction, characterization and evaluation of a vision-based sensor that determines the focus error of a surface or solar collector, with the objective of evaluating solar tracking systems or of serving as a sensor for solar tracking systems, without limiting its use to any particular orientation or to level surfaces in its installation. The sensor is designed to be used on any surface, such as radiometric stations (for example the SOLYS 2 used to validate the sensor), parabolic trough collectors, parabolic dishes and photovoltaic panels; the robust sensor can withstand the severe weather conditions under which solar collectors already work.
2. Materials and Methods

The solar position sensor has a cylindrical geometry, as seen in Figure 1a, where one can find the assembly structure and the electronic devices that make up the sensor; Figure 1b shows the upper tempered darkened glass with a thickness of 3 mm, which acts as a filter for the ultraviolet and infrared wavelengths of the solar radiation, the part of the spectrum that is not visible to the sensor and that can cause heating and deterioration of the vision sensor. The filter is made from a shade 14 lens-protection lens of the type used in welding helmets. This lens blocks artificial light and diffuse solar radiation, so that the detection of the Sun is not affected. The structure has connectors and a holding base with neodymium magnets and grips to hold it, occasionally or permanently, to the base to be evaluated.
Figure 1. Mechanical design and geometry of the solar position sensor: (a) View of the assembled sensor; (b) Exploded view of the sensor.
The electronic circuit of the solar position sensor (see Figure 2) consists of the 8-bit ATmega2560 microcontroller with 256 KB of flash memory and a working voltage of 5 V, at temperatures ranging from −10 °C to 85 °C. For this to work, basic electronics are added, consisting of a 16 MHz oscillator, a 5 V regulator, a capacitor network to eliminate electrical noise and a voltage-leveling stage from 5 V to 3.3 V for the pressure and inertial sensors. The inclination and orientation are obtained with the MPU9250 9-axis inertial measurement sensor, which internally contains an accelerometer, a gyroscope and a magnetometer, with working temperatures of −40 °C to 85 °C at 3.3 V. Pressure, temperature and humidity readings are obtained with the integrated BME280, which can work from −40 °C to 85 °C at 3.3 V.
Figure 2. Block diagram of the hardware architecture of the solar position sensor.
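The paper does not list the firmware equations used to derive inclination from the MPU9250, so the following is only a minimal illustrative sketch of the usual static-tilt computation from a calibrated accelerometer triplet; the readAccel() helper and its placeholder values are assumptions, not the authors' driver.

```cpp
// Illustrative only (not the authors' firmware): static tilt from a calibrated
// accelerometer reading, as commonly done with an IMU such as the MPU9250.
#include <cmath>
#include <cstdio>

struct Accel { float x, y, z; };            // accelerations in g

Accel readAccel() { return {0.00f, 0.10f, 0.99f}; }  // placeholder reading

int main() {
  const float kRadToDeg = 57.29578f;
  Accel a = readAccel();
  // Standard static-tilt relations, valid while the sensor is not accelerating.
  float pitch = std::atan2(-a.x, std::sqrt(a.y * a.y + a.z * a.z)) * kRadToDeg;
  float roll  = std::atan2(a.y, a.z) * kRadToDeg;
  std::printf("pitch = %.2f deg, roll = %.2f deg\n", pitch, roll);
  return 0;
}
```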
The position of the Sun is obtained by image processing through the Pixy vision sensor (CMUcam5), based on the NXP LPC4330 ARM Cortex processor, with working temperatures from −40 °C to 85 °C at 5 V. This sensor has the ability to detect colors with a 320 × 200 pixel resolution. The detection strategy is based on capturing the image of the Sun through the filter. In Figure 3a, we can see the real image of the Sun, while in Figure 3b, taken by the vision sensor through the filter, it is observed that the light of the Sun is identified as a clear object, which is why the Pixy sensor is configured independently through the PixyMon software with the parameters of Figure 4, with which it is able to identify the object and send its position in pixels through the I2C protocol towards the microcontroller.
Figure 3. Image of the Sun viewed from the sensor: (a) Real view of the Sun; (b) Sun viewed through the filtered visual sensor.
Figure 4. Interface and parameters of the vision sensor in the PixyMon environment.
The acquisition of data, or telemetry, is done through serial communication and the XBee Pro S1 device. The microcontroller sends and receives data through the serial port and transmits it through the previously mentioned module. The receiving computer runs a Java user interface made with the help of the Processing 3.0 platform. Figure 5 presents a screenshot showing the inclination, orientation, temperature, pressure, humidity, latitude, longitude and the focus error of the Sun. The information is displayed in real time and is simultaneously stored in a text file. In addition, the interface contains options to calibrate the magnetometer and adjust the time from the GPS.
Figure 5. User interface of the solar position sensor.
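A hedged sketch of how such a telemetry record could be framed is shown below. The exact field order and format of the authors' frame are not given in the paper, so this CSV layout is only an assumption; the XBee Pro S1 in transparent mode simply forwards whatever is written to the serial port, one line per sample, for the Java (Processing 3.0) interface to display and log to a text file.

```cpp
// Assumed CSV framing of one telemetry sample; Serial is wired to the XBee Pro S1.
void sendTelemetry(float pitch, float roll, float heading,
                   float tempC, float press_hPa, float humid,
                   float lat, float lon, float errAz, float errZen) {
  Serial.print(pitch, 2);     Serial.print(',');
  Serial.print(roll, 2);      Serial.print(',');
  Serial.print(heading, 2);   Serial.print(',');
  Serial.print(tempC, 1);     Serial.print(',');
  Serial.print(press_hPa, 1); Serial.print(',');
  Serial.print(humid, 1);     Serial.print(',');
  Serial.print(lat, 6);       Serial.print(',');
  Serial.print(lon, 6);       Serial.print(',');
  Serial.print(errAz, 4);     Serial.print(',');
  Serial.println(errZen, 4);  // newline terminates the frame
}

void setup() { Serial.begin(9600); }       // serial port feeding the XBee

void loop() {
  // Placeholder values; in the real firmware these come from the sensors.
  sendTelemetry(0.5, -0.3, 182.0, 31.2, 1009.4, 28.0,
                32.6245, -115.4523, 0.0426, 0.0852);
  delay(1000);                             // one record per second (illustrative rate)
}
```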
The algorithm to determine the focus error, shown in Figure 6 and implemented in the ATmega microcontroller, starts by loading the calibration values and initializing the inertial measurement sensor (MPU9250), the temperature, humidity and altitude sensor (BME280), the GPS, the telemetry and the Pixy vision sensor. If any of the sensors fails at initialization, an error message is sent to the interface and the cycle is terminated. When the sensors are initialized correctly, the readings of the vision, inclination, level, azimuth, temperature, humidity, latitude and longitude sensors are obtained. In the case that the vision sensor does not detect an object, null values are assigned to the positions so that the user interface can distinguish when an object is detected; to finish, the readings are sent through the serial port, which is wirelessly connected. The algorithm shown indicates the process performed in the first iteration; in continuous operation it repeats from the data acquisition of the sensors.
Figure 6. Flow diagram implemented in the ATmega microcontroller.
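For orientation, a compressed structural sketch of the Figure 6 flow is given below. Every init*/read* function is a hypothetical stub standing in for the real MPU9250, BME280, GPS, XBee and Pixy drivers; only the control flow (abort on failed initialization, null focus error when no object is seen, then transmit) is taken from the text.

```cpp
// Structural sketch of the Figure 6 flow; all helpers below are placeholders.
#include <math.h>

struct Sample { float errAz, errZen, pitch, roll, heading, tempC, humid; bool sunSeen; };

bool initIMU()       { return true; }      // stub: real driver init goes here
bool initBME280()    { return true; }
bool initGPS()       { return true; }
bool initTelemetry() { return true; }
bool initPixy()      { return true; }
Sample readAllSensors() { Sample s = {0, 0, 0, 0, 0, 25.0f, 30.0f, true}; return s; }
void transmit(const Sample &s) { /* write one CSV line to the XBee serial port */ }

bool sensorsOk = false;

void setup() {
  sensorsOk = initIMU() && initBME280() && initGPS() && initTelemetry() && initPixy();
  // On failure, an error message is sent to the interface and the cycle ends.
}

void loop() {
  if (!sensorsOk) return;                  // cycle terminated on init failure
  Sample s = readAllSensors();             // inclination, level, azimuth, T, RH, GPS
  if (!s.sunSeen) { s.errAz = NAN; s.errZen = NAN; }  // null values for the UI
  transmit(s);                             // out through the serial port / XBee
}
```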
3. Results

In order to characterize the position sensor and determine its degree of error, tests were performed under different conditions to record its behavior and whether it was affected. One of the most common events occurs when the solar position the sensor faces is clouded; Figure 7 illustrates the actual image and the view from the sensor in three different cloudy cases. In Figure 7a the Sun is partially seen, blocked by a cloud that causes the Sun's rays to disperse; Figure 7b shows how the filter attenuates the low-intensity radiation that forms the diffuse radiation aureole, keeping the view from the vision sensor clear and allowing an identification of the object without disturbances.
Figure 7. Cloudy real images and images viewed through the vision sensor: (a) Real view of the Sun partially cloudy; (b) Sensor view of the Sun partially cloudy; (c) Real view of the Sun nebulous cloudy; (d) Sensor view of the Sun nebulous cloudy; (e) Real view of the Sun cloudy; (f) Sensor view of the Sun cloudy.
Figure 7c illustrates a completely overcast, low-intensity nebulous day; the Sun is distorted by the diffuse radiation of the clouds, and even a change in its shape is seen due to the non-homogeneity of the clouds. Despite this, the filter manages to block the imperfections caused by the clouds (see Figure 7d), so that the sensor identifies the object without disturbances. The last case, Figure 7e, illustrates that the circumference of the Sun is affected by the intense overcast, but despite this, in the next shot (Figure 7f) we can see that the image from the image sensor is again clear, as if it were not cloudy. With these tests we can verify that the filter does an excellent job of attenuating perturbations in this type of application [29]. The type of lens plays an important role in the capture of the image in vision sensors, as demonstrated in the work of Lee et al. [24], which showed that the aperture of the lens helps improve the accuracy of the detection of the Sun. In this sense, the Pixy vision sensor was tested with different types of lenses, choosing four types of M12 lenses with the most common vision angles, namely 12° (focal length 25 mm, F2.0), 21° (focal length 16 mm, F2.0), 42° (focal length 8.0 mm, F2.0) and 75° (focal length 2.8 mm, F2.0). The pixel/degree equivalence conversion factor was characterized. For this, the sensor was exposed directly to the Sun, displacements were performed on the "x" (azimuth angle) and "y" (zenith angle) axes, and the corresponding angle measurements were recorded for each defocused pixel. Figure 8 shows the relation of pixels to degrees with the 12° (Figure 8a), 21° (Figure 8b), 42° (Figure 8c) and 75° (Figure 8d) lenses. The data show a linear behavior with some points of dispersion, which happens because the angle measurement is done with an inertial sensor, and in a sudden movement a change in the acceleration occurs, which is corrected once the sensor stabilizes.
Figure 8. Characterization and adjustment of pixels to degrees with different lenses: (a) With lens of 12°; (b) With lens of 21°; (c) With lens of 42°; (d) With lens of 75°.
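The characterization in Figure 8 amounts to a first-order (linear) fit of the measured defocus angle against the pixel offset. As a hedged illustration of how the slope and intercept reported in Table 1 can be obtained, the sketch below performs an ordinary least-squares fit on (pixel, degree) pairs; the sample data are placeholders chosen near the 21° lens fit, not the authors' measurements.

```cpp
// Illustrative least-squares fit of angle (degrees) versus pixel offset, the
// kind of first-order adjustment summarized in Table 1. Placeholder data only.
#include <cstdio>
#include <utility>
#include <vector>

int main() {
  // (pixel offset, measured angle in degrees): placeholder pairs.
  std::vector<std::pair<double, double>> pts = {
      {0, 0.28}, {10, 0.70}, {20, 1.13}, {40, 1.99}, {80, 3.69}};

  double sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (auto &p : pts) {
    sx += p.first; sy += p.second;
    sxx += p.first * p.first; sxy += p.first * p.second;
  }
  const double n = pts.size();
  double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);   // degrees per pixel
  double intercept = (sy - slope * sx) / n;
  std::printf("angle = %.5f * pixel + %.5f\n", slope, intercept);
  return 0;
}
```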
Table 1 illustrates the first-order adjustment equations and the accuracy for each lens. It is observed that the detection of the Sun improves as the viewing angle decreases; however, the adjustment accuracy decreases because the resolution of the inertial sensor reaches its limit, making it more difficult to read small changes. The selection of the lens depends on the precision and accuracy required for each application, and because of this, the lens with a 21-degree viewing angle was selected, since it is a midpoint between accuracy and precision, with a resolution of 0.04260° and an adjustment of 99.014%.

In the adjustment equations of Table 1 the intercept is different from zero in all cases, because at the beginning of the experiment the position of the Sun did not match the zero pixel, since this is a scenario that cannot be controlled.

Table 1. Accuracy and coefficients of the first-order adjustment equation for the different types of lenses.

Vision Angle    Slope      Interception    Adjustment
12              0.02583     0.24189        98.448%
21              0.04260     0.28362        99.014%
40              0.08420    −1.35669        99.518%
75              0.23962    −0.84907        99.721%
However, it is known that this is a constant displacement, and to obtain the experimental resolution (Rex) the interception can be set to zero, or the resolution can be obtained by a difference, as indicated by Equation (1):

Rex = angle(n + 1) − angle(n).   (1)
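For reference, the sketch below converts a pixel offset reported by the vision sensor into degrees with the first-order coefficients of Table 1 (the 21° lens is used here) and shows how the experimental resolution of Equation (1) follows as the difference between two consecutive pixel steps; the helper names are illustrative, not the authors' firmware.

```cpp
// Pixel-to-degree conversion using the Table 1 fit for the 21-degree lens and
// the experimental resolution of Equation (1).
#include <cstdio>

const double SLOPE_21DEG = 0.04260;      // degrees per pixel (Table 1)
const double INTERCEPT_21DEG = 0.28362;  // constant offset of the fit (Table 1)

double pixelsToDegrees(int pixelOffset) {
  return SLOPE_21DEG * pixelOffset + INTERCEPT_21DEG;
}

int main() {
  // Equation (1): Rex = angle(n + 1) - angle(n); the constant intercept cancels,
  // so the experimental resolution equals the slope, about 0.0426 degrees.
  double rex = pixelsToDegrees(6) - pixelsToDegrees(5);
  std::printf("focus error at 5 px: %.4f deg, resolution: %.4f deg/px\n",
              pixelsToDegrees(5), rex);
  return 0;
}
```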
The validation of the solar position sensor was performed by determining the focus error of one of the best commercial solar tracking systems, the Kipp & Zonen SOLYS 2, which ensures a solar tracking accuracy of 0.02° in active mode. Figure 9 shows the configuration of the experiment. The white mechanism is the SOLYS 2 with its sensor and pyrheliometer. On the right side, the base on which the solar position sensor is placed can be seen. The SOLYS 2 solar tracker works independently of the solar position sensor. In the experimental study, the variables of azimuthal and zenith approach, temperature inside the sensor, ambient temperature and solar radiation were recorded every 1 s.
Figure 9. Experimental setup: SOLYS 2 and, on its surface, the solar position sensor.
Figures 10–13 show the results of the experiments during a full day, where the first three graphs show the focus error in the azimuth, zenith and incidence angles, and the fourth shows the solar radiation, ambient temperature and internal temperature of the solar position sensor. Around 6:35 a.m. the sensor begins to detect the Sun with a radiation of 60 W/m²; the azimuth angle stays perfectly focused at 0° until around 10:00 a.m., when it presents small deviations of 0.042°. It is worth mentioning that this is the first value of the discrete ramp, that is, the resolution of the solar position sensor. The focus on the azimuth angle of the SOLYS had an excellent performance by staying within the minimum resolution of the solar position sensor. In the experiment it can be seen that from 4:00 p.m. onwards, shadows appear in the area where the SOLYS was placed, which causes disturbances in the measurement; therefore, that data was excluded and is not considered in the analysis. However, it is still shown in the graphs with the purpose of presenting the experiment throughout the whole day.
Figure 10. Focus error in the zenith angle during a first day of operation.
Figure 11. Focus error in the azimuth angle during a first day of operation.
Figure 12. Focus error in the incidence angle during a first day of operation.
Figure 13. Ambient temperature and temperature inside the sensor during its operation throughout the first day.
The zenith begins by detecting the Sun at the same time, maintaining an error of 0.08°; the focus error changes approximately every hour, which could be translated into an error of the SOLYS solar tracker sensor or a deviation in its alignment (base, sensor or structure). Despite this, its performance is very close to that reported by the equipment manufacturer.

Figure 13 shows the solar radiation during the test period. It can be seen that the measurement of the position of the Sun is not affected by the change in solar radiation levels. Not until 4:30 p.m., when a tree is interposed between the solar position sensor and the Sun, is the circumference detected by the sensor altered; the centroid of the object then changes its position and affects the measurement, although this disturbance, which is not a normal operating condition, keeps the focus error below 0.3°.

The overall behavior of the solar tracker can be evaluated with the combined result of the azimuthal and zenith angle errors, called the angle of incidence, which can be visualized in Figure 12 during normal operation from 7:00 a.m. to 5:30 p.m. It is observed that the error remained below 0.044°, mainly due to the error in the zenith angle, which could have been corrected with an exact leveling or orientation, which is difficult to perform manually. On the other hand, it is seen that the change in ambient temperature does not affect the operation and performance of the solar position sensor. Figure 13 illustrates that the heat generated by the devices raises the inside temperature of the sensor by an average of 9.3 °C when there is no heat input from solar radiation. During the time of solar radiation, and considering the thermal inertia, it is seen that the interior temperature rises from 25.5 °C as the radiation increases up to 839 W/m², with the interior of the solar position sensor warming by 1 °C for every 28.91 W/m²; with these data, a maximum working operation temperature of 47 °C is estimated.

During the registration of the first day of operation of the sensor, there was an average focus error of 0.0065°, 0.04196° and 0.04588° for the azimuth, zenith and incidence angles, with modes of 0°, 0.0426° and 0.0426°, the largest error being obtained for the zenith angle. After explaining a day of operation in detail, and in order to demonstrate the consistency of the sensor, the error of the incidence angle in two further days of operation is shown in Figures 14 and 15. The error of the second day indicates 0.02039°, 0.04021° and 0.04304° for the azimuth, zenith and incidence angles, with modes of 0°, 0.0426° and 0.0426°. On the third day we obtained 0.04885°, 0.04172° and 0.07455° for the azimuth, zenith and incidence angles, with modes of 0.0426°, 0.0426° and 0.0426°.
Figure 14. Focus error in the incidence angle during a second day of operation.
With this it is possible to validate a sensor precision of between 0° and 0.0426° for the azimuth, zenith and incidence angles, just one step in the overall resolution of the sensor, as indicated in Table 1 for the lens with a 21° vision angle. The resolution of 0.239° per pixel with a viewing angle of 75° is enough for solar tracking systems applied in photovoltaic systems or in some solar collectors, such as a parabolic trough, whose efficiency starts to decrease at 0.5° [9]. According to the results, it is proposed that for applications that require greater accuracy, such as the evaluation of tracking systems, the lens should be exchanged for one with a lower viewing angle, or for magnifying lenses, with which resolutions of up to 0.0017° per pixel have been reached [24].
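To make the lens trade-off concrete, a small helper can pick, from the characterized per-pixel resolutions of Table 1, the widest field of view that still resolves a required tracking accuracy (for example the 0.5° figure quoted for a parabolic trough). This selection rule is an illustration of the reasoning above, not a procedure given in the paper.

```cpp
// Illustrative lens selection from Table 1: choose the widest vision angle whose
// per-pixel resolution is still finer than the required accuracy.
#include <cstdio>

struct Lens { int visionAngleDeg; double degPerPixel; };

// Characterized resolutions taken from Table 1.
const Lens LENSES[] = {{75, 0.23962}, {40, 0.08420}, {21, 0.04260}, {12, 0.02583}};

const Lens *selectLens(double requiredDeg) {
  for (const Lens &l : LENSES)             // widest field of view first
    if (l.degPerPixel <= requiredDeg) return &l;
  return nullptr;                          // no lens meets the requirement
}

int main() {
  if (const Lens *l = selectLens(0.5))     // e.g., parabolic trough acceptance [9]
    std::printf("use the %d-degree lens (%.5f deg/pixel)\n",
                l->visionAngleDeg, l->degPerPixel);
  return 0;
}
```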
Figure 15. Focus error in the incidence angle during a third day of operation.
4. Conclusions

With the development of a solar position sensor based on a vision system and the tests performed, it is concluded that a measurement of the focus error or position of the Sun can be made from a reference, with accuracies of 0.0258° to 0.2396° for viewing angles from 12° to 75°, with an uncertainty of 0.2 to 1.55%, which can be modified to achieve accuracies of less than 0.01°. The vision system proved to be a very good choice for processing and determining the position of the Sun, without its accuracy being affected by changes in ambient temperature or solar radiation levels. In addition, it is recommended to use different lenses to find the right balance between the accuracy and the angle of vision required by the application.

The darkened glass used, and the tests performed with it, demonstrate that it is an excellent technique to attenuate disturbances caused by refraction or dispersion of sunlight, helping to obtain better measurements of the location of the Sun, in addition to filtering excessive light and unnecessary wavelengths away from the vision sensor and decreasing its temperature, resulting in an increase in sensor life.

The solar position sensor based on a vision sensor gave the expected behavior, maintaining a precision in the computation of the focus error of the azimuth, zenith and incidence angles of 0° to 0.0426°. Therefore, the proposed sensor complies with the requirements for detection of the Sun, and its components withstand the environmental conditions and meet the accuracy requirements, so it can be used as a sensor to obtain the focus error and thereby evaluate solar tracking systems, as well as a feedback sensor for solar tracking systems or for orientation, leveling and positioning devices in the installation of photovoltaic plants and solar collectors.

Acknowledgments: The authors acknowledge the financial support received from the Programa para el Desarrollo Profesional Docente, through the call "Cuerpos Académicos 2016".

Author Contributions: Adolfo Ruelas wrote the manuscript. He also designed and programmed the software and firmware. Carlos Villa-Angulo supervised the electronic design. Alexis Acuña analyzed the experimental results. Pedro Rosales designed the experimental test and collected the data. José Suastegui assisted in the construction of the prototype. Nicolás Velázquez reviewed the paper.

Conflicts of Interest: The authors declare no conflict of interest.
References

1. Lee, C.-Y.; Chou, P.-C.; Chiang, C.-M.; Lin, C.-F. Sun Tracking Systems: A Review. Sensors 2009, 9, 3875–3890.
2. Mousazadeh, H.; Keyhani, A.; Javadi, A.; Mobli, H.; Abrinia, K.; Sharifi, A. A review of principle and sun-tracking methods for maximizing solar systems output. Renew. Sustain. Energy Rev. 2009, 13, 1800–1818.
3. Prinsloo, G.J.; Dobson, R.T. Solar Tracking, iBook ed.; SolarBooks: Stellenbosch, South Africa, 2015.
4. Abdallah, S. The effect of using sun tracking systems on the voltage-current characteristics and power generation of flat plate photovoltaics. Energy Convers. Manag. 2004, 45, 1671–1680.
5. Abdallah, S.; Nijmeh, S. Two axes sun tracking system with PLC control. Energy Convers. Manag. 2004, 45, 1931–1939.
6. Chen, F.; Feng, J. Analogue sun sensor based on the optical nonlinear compensation measuring principle. Meas. Sci. Technol. 2007, 18, 2111–2115.
7. Sungur, C. Multi-axes sun-tracking system with PLC control for photovoltaic panels in Turkey. Renew. Energy 2009, 34, 1119–1125.
8. Cope, A.W.G.; Tully, N. Simple tracking strategies for solar concentrations. Sol. Energy 1981, 27, 361–365.
9. Valan Arasu, A.; Sornakumar, T. Design, development and performance studies of embedded electronic controlled one axis solar tracking system. Asian J. Control 2007, 9, 163–169.
10. Zhao, D.; Xu, E.; Wang, Z.; Yu, Q.; Xu, L.; Zhu, L. Influences of installation and tracking errors on the optical performance of a solar parabolic trough collector. Renew. Energy 2016, 94, 197–212.
11. Alata, M.; Al-Nimr, M.A.; Qaroush, Y. Developing a multipurpose sun tracking system using fuzzy control. Energy Convers. Manag. 2005, 45, 1229–1245.
12. Mazen, M.; Abu-Khader, O.; Badran, O.; Abdallah, S. Evaluating multi-axes sun-tracking system at different modes of operation in Jordan. Renew. Sustain. Energy Rev. 2008, 12, 864–873.
13. Chong, K.-K.; Wong, C.-W.; Siaw, F.-L.; Yew, T.-K.; Ng, S.-S.; Liang, M.-S.; Lim, Y.-S.; Lau, S.-L. Integration of an On-Axis General Sun-Tracking Formula in the Algorithm of an Open-Loop Sun-Tracking System. Sensors 2009, 9, 7849–7865.
14. Sthepen, D.; Neale, S.-T. Control for Solar Collectors. U.S. Patent 4146785, 27 March 1979.
15. David, J.P.; Floret, F.; Guerin, J.; Paiva, J.C.; Aiache, L. Autonomous Photovoltaic Converter with Linear Focusing Concentrator. Sol. Cells 1981, 4, 61–70.
16. Roth, P.; Georgiev, A.; Boudinov, H. Design and construction of a system for sun-tracking. Renew. Energy 2004, 29, 393–402.
17. Parthipan, J.; Nagalingeswara, R.B.; Senthilkumar, S. Design of one axis three position solar tracking system for paraboloidal dish solar collector. Mater. Today 2016, 3, 2493–2500.
18. Wang, J.-M.; Lu, C.-L. Design and Implementation of a Sun Tracker with a Dual-Axis Single Motor for an Optical Sensor-Based Photovoltaic System. Sensors 2013, 13, 3157–3168.
19. Karimov, K.S.; Saqib, M.A.; Akhter, P.; Ahmed, M.M.; Chattha, J.A.; Yousafzai, S.A. A simple photo-voltaic tracking system. Sol. Energy Mater. Sol. Cells 2005, 87, 49–59.
20. Chen, J.-H.; Yau, H.-T.; Hung, T.-H. Design and implementation of FPGA-based Taguchi-chaos-PSO sun tracking systems. Mechatronics 2015, 25, 55–64.
21. Stone, K.W. Automatic Heliostat Track Alignment Method. U.S. Patent 4564275, 14 January 1986.
22. Berenguel, M.; Rubio, F.R.; Valverde, A.; Lara, P.J.; Arahal, M.R.; Camacho, E.F.; Lopez, M. An artificial vision-based control system for automatic heliostat positioning offset correction in a central receiver solar power plant. Sol. Energy 2004, 76, 563–575.
23. Arbab, H.; Jazi, B.; Rezagholizadeh, M. A computer tracking system of solar dish with two-axis degree freedoms based on picture processing of bar shadow. Renew. Energy 2009, 34, 1114–1118.
24. Lee, C.-D.; Huang, H.-C.; Yeh, H.-Y. The Development of Sun-Tracking System Using Image Processing. Sensors 2013, 13, 172–178.
25. Ruelas, A.; Velázquez, N.; González, L.; Villa-Angulo, C.; García, O. Design, Implementation and Evaluation of a Solar Tracking System Based on a Video Processing Sensor. Int. J. Adv. Res. Comp. Sci. Softw. Eng. 2013, 3, 5448–5459.
26. Rubio, F.R.; Ortega, M.G.; Gordillo, F.; López-Martínez, M. Application of new control strategy for sun tracking. Energy Convers. Manag. 2007, 48, 4174–4184.
27. Merlaud, A.; De Maziere, M.; Hermans, C.; Cornet, A. Equations for Solar Tracking. Sensors 2012, 12, 4074–4090.
28. Skouri, S.; Ali, A.B.H.; Bouadila, S.; Salah, M.B.; Nasrallah, S.B. Design and construction of sun tracking systems for solar parabolic concentrator displacement. Renew. Sustain. Energy Rev. 2016, 60, 1419–1429.
29. Wei, C.C.; Song, Y.C.; Chang, C.C.; Lin, C.B. Design of a Solar Tracking System Using the Brightest Region in the Sky Image Sensor. Sensors 2016, 16, 1995.

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).