Quantifying Fodder Quality Assessments using Machine Vision

by Prof John Billingsley and Mark Dunn

September 2008 RIRDC Publication No 08/154 RIRDC Project No PRJ-000785

© 2008 Rural Industries Research and Development Corporation. All rights reserved.

ISBN 1 74151 743 5
ISSN 1440-6845
Quantifying Fodder Quality Assessments using Machine Vision
Publication No. 08/154
Project No. USQ-4A

The information contained in this publication is intended for general use to assist public knowledge and discussion and to help improve the development of sustainable regions. You must not rely on any information contained in this publication without taking specialist advice relevant to your particular circumstances.

While reasonable care has been taken in preparing this publication to ensure that information is true and correct, the Commonwealth of Australia gives no assurance as to the accuracy of any information in this publication.

The Commonwealth of Australia, the Rural Industries Research and Development Corporation (RIRDC), the authors or contributors expressly disclaim, to the maximum extent permitted by law, all responsibility and liability to any person, arising directly or indirectly from any act or omission, or for any consequences of any such act or omission, made in reliance on the contents of this publication, whether or not caused by any negligence on the part of the Commonwealth of Australia, RIRDC, the authors or contributors. The Commonwealth of Australia does not necessarily endorse the views in this publication.

This publication is copyright. Apart from any use as permitted under the Copyright Act 1968, all other rights are reserved. However, wide dissemination is encouraged. Requests and inquiries concerning reproduction and rights should be addressed to the RIRDC Publications Manager on phone 02 6271 4165.

Researcher Contact Details

Prof John Billingsley
NCEA, University of Southern Qld
Toowoomba Q 4350
Phone: 07 46312513
Fax: 07 46311870
Email: [email protected]

In submitting this report, the researcher has agreed to RIRDC publishing this material in its edited form.

RIRDC Contact Details

Rural Industries Research and Development Corporation
Level 2, 15 National Circuit
BARTON ACT 2600
PO Box 4776
KINGSTON ACT 2604
Phone: 02 6271 4100
Fax: 02 6271 4199
Email: [email protected]
Web: http://www.rirdc.gov.au

Published electronically in September 2008

Foreword

Australian fodder exporters and the domestic fodder industry are focussing strongly on developing systems for quantitatively describing fodder quality in relation to the needs of specific markets. Many fodder characteristics important to buyers and to the value of hays for specific uses, such as colour, water and fungal damage, weed contamination, grain content, leaf to stem ratio and stem thickness, are currently assessed subjectively. The subjective nature of these assessments of fodder quality adds to confusion in the market place and reduces the competitiveness of the Australian industry. There is a clear need for a convenient, ‘instant’ method of quantifying these characteristics.

AFIA (the Australian Fodder Industry Association) has recognised a need for improved understanding of fodder quality, leading to better measurement standards accepted by all Australian testing laboratories. The proposed machine vision systems would provide an effective addition to an industry-based quality assurance system.

Presentations were given to the AFIA Fodder Grading Committee meeting in Melbourne in November 2005. At this meeting the importance of accurate methods for describing hay quality was recognised. The quality of hay traded can be highly variable, and disputes between sellers and buyers are commonly related to quality. The potential for a common standard methodology to characterise visual characteristics of hay was discussed. Improved grading systems have been called for by AFIA, and there is a need for uniform and objective measures that can meet the needs of the end user. At the AFIA grading meeting it was agreed that there was a need to fund a project to assess the potential for using image analysis technology for quantifying a range of hay visual characteristics including overall colour, leaf and stem colour, water/fungal damage, leaf to stem ratio and stem thickness.

The project will develop and evaluate a low cost device that could provide an assessment of fodder quality based on those properties that can be determined from image analysis techniques. The overall outcome will be an understanding of the characteristics of fodder crops that can be assessed using this technology. Such technologies, when combined with existing RIRDC fodder projects, would help the Australian fodder industry increase its competitive advantage in the export market and provide a rational basis for the purchase and sale of hay within Australia based on accurate measurement of quality characteristics. The project would lead to better measurement standards accepted by all Australian testing laboratories, which would provide an effective base for industry-based quality assurance systems.

The importance of this report is that it provides an initial prototype machine vision system for evaluating fodder. The methodology will provide a platform that may be extended in future projects to investigate further characteristics, using visible or other wavelengths.

This report, an addition to RIRDC’s diverse range of over 1800 research publications, forms part of our Fodder Crops R&D Program, which aims to facilitate the development and maintenance of a viable fodder crops industry.

Most of our publications are available for viewing, downloading or purchasing online through our website:
• downloads at www.rirdc.gov.au/fullreports/index.html
• purchases at www.rirdc.gov.au/eshop

Peter O’Brien
Managing Director
Rural Industries Research and Development Corporation

Acknowledgments

The project team would like to thank the researchers involved in project UA-64A for their assistance with fodder samples and manual measurement data. The project team also acknowledges Balco Australia for their assistance in evaluating the prototype device created by this project.

Abbreviations

AFIA – Australian Fodder Industry Association
NIR – Near Infrared

Contents

Foreword
Acknowledgments
Abbreviations
Contents
Figures
Tables
Executive Summary
1. Introduction
2. Objectives
3. Methods
3.1 System development (hardware)
3.2 System development (software)
4. Results
5. Discussion
6. Implications
7. References
Appendix A – Prototype Installation Instructions
  To set up the included software
  Connecting the box to the PC
  Installing DirectX
  Testing the Hay

Figures

Figure 1. Solar light emissions
Figure 2. Reflectance spectrum for oaten hay samples – visible and NIR range only
Figure 3. Reflectance spectrum for lucerne hay samples – visible and NIR range only
Figure 4. Reflectance average for Lucerne and Oaten Hays
Figure 5. Illustration of the sample enclosure layout
Figure 6. Example image using the machine vision platform software
Figure 7. Example calibration images: a) thin (7 pixels), b) thick (22 pixels)
Figure 8. Results of the Stem Width Algorithm on rotations of a fixed image
Figure 9. Results of the Stem Width Algorithm on rotations of a fixed image with thicker lines
Figure 10. Sample lucerne hay images with resultant stem width histograms
Figure 11. Colour triangle example input image and resulting chromaticity representation
Figure 12. Example lucerne hay image
Figure 13. Example Lucerne hay image in X channel of the XYZ colour space
Figure 14. Stem width histogram for 10 subsamples
Figure 15. Histogram weighted average vs manual stem width measurement
Figure 16. Stem Percentage vs Manual Leaf:Stem

Tables

Table 1. Colour and Chromaticity measurements for 10 independent subsamples of one type of Lucerne hay
Table 2. Repeatability measurements for stem histogram
Table 3. Repeatability measurements for stem to leaf ratio
Table 4. Automatic Results vs Manual Results
Table 5. Stem width histograms

Executive Summary

What the report is about

At present fodder is assessed subjectively. The evaluation depends greatly on someone’s opinion and there can be large variations in assessments. We seek to use machine vision in several ways, to provide measures of fodder quality that will be objective and independent of the assessor. Growers will be able to quote a quality measure that buyers can trust. We also seek the possibility of discerning colour differences that are beyond the capability of the human eye, while still using equipment of relatively modest cost.

Who is the report targeted at?

The report is aimed at potential manufacturers and vendors of measuring equipment as well as at growers and traders of fodder. Together, these groups can encourage the development and marketing of a commercial system.

Background

The purchaser of fodder is concerned primarily with nutritional value and palatability. In transactions with the vendor, these must be related to factors that can be measured, such as colour, leaf-to-stem ratio and the appearance of damage such as fungal contamination due to moisture. To ensure agreement between the parties, the quality assessment must be objective.

Aims/Objectives

To discover and validate principles that can be used to create a low-cost field instrument to be used by purchasers, traders and growers of fodder. This project will develop a prototype machine vision system to quantify fodder quality. The low cost camera based device would give an assessment of those properties that could be determined from image analysis techniques for specifying visual aspects of fodder quality.

Methods used

Our expertise is in the area of machine vision; therefore avenues of chromatography and chemical analysis were not pursued. We investigated the potential of detecting micro-spectral differences between samples, using a precision spectrophotometer. This gave readings of the reflectance of each sample all the way from the ultra-violet to 2.5 µm, deep into the infrared. Wherever differences in spectra could be discerned, it would be possible to use a monochrome camera with appropriate colour filters to perceive the same effect, provided the difference lay in a band that could be detected by the camera. The camera would have the added advantage of inspecting small image features, rather than taking a measure of colour over the whole image. We also applied image analysis software to determine features such as leaf-to-stem ratio. This automatic process is much more exhaustive than manual measurement, in which only a few samples would be taken out and measured.

Results/Key findings

This project has demonstrated the feasibility of machine vision technology for the measurement of fodder characteristics. The prototype developed is a fully functional system which can provide and record the information as fast as the human operator can load the samples.

Objective measurement of colour will be most beneficial to the follow-on project to determine rules for a grading system. This colour may also be correlated with the Balco TrueGrade grading system for oaten hay.

Delivering a histogram of stem widths, rather than a single numerical average, provides valuable additional information for the fodder grading process. The histogram is built by identifying every edge component in the image. The stem to leaf ratio is identified using a combination of colour and shape information.

Implications for relevant stakeholders

At the right price and with the right ‘user-friendliness’, the instrument will have appeal for growers, traders and consumers. It will also have appeal for the manufacturers of such an instrument. The availability of an objective assessment should go far to quell any disagreements between parties. It is, of course, necessary for policymakers to determine how the quoted grades should be related to the features that can be measured.

Recommendations

Field testing must be performed to ensure that the laboratory results can truly extend to practical use. Further research is needed in relation to palatability. The flavour of the sample might be affected by small amounts of a contaminant that would not influence the overall colour. With a system based on imaging, it is possible, though by no means certain, that ‘specks’ of such material could be perceived. It is envisaged that expert classification advice will be required to determine a conversion scale from the automatically measured physical characteristics to a grading scheme.

1. Introduction

Fodder crops cover a wide range of crop and pasture species that are grown, harvested and processed to facilitate both on-farm use and domestic and export trade. The fodder industry is large, with an estimated 20,000 producers on 46,000 properties across all States producing between five and six million tonnes of hay and around two million tonnes of silage per year. This production is traded as a wide range of fodder including lucerne, clover, pasture, cereal and others. The gross value of production at the farm gate is estimated to be about $900 million a year, which represents a 50% increase over the past ten years. About 25-30% of fodder production is traded off-farm and this share has increased substantially during the last few years. Fodder production is concentrated in Victoria and New South Wales, although Western Australia and South Australia are the major exporting states. The largest domestic market users are the dairy industry (40%), the horse industry (25%), the feedlot industry (20%) and others (15%). In recent times there has been a growing trend for the dairy industry to rely more on off-farm purchases, with recent estimates suggesting that more than 55% of fodder is purchased off-farm.

The animal feeds industry in East Asia is estimated to be valued at US$10 billion, and it is perceived that a large untapped demand will enable the industry to develop many new opportunities. The fodder industry has been taking advantage of this market, with exports increasing significantly in recent years to over half a million tonnes. The largest market is currently cereal hay into Japan, but other markets such as Korea and the Middle East are growing.¹

The subjective nature of classifying fodder quality characteristics leads to confusion in the market place and reduces the competitiveness of the Australian industry. Disputes between buyers and sellers over the quality of hay traded frequently occur. The AFIA has recognised these issues and the potential for image analysis technology to be used to quantify current visual assessments of hay. The AFIA fodder grading meeting of 7 November 2005 recommended support for funding a project to assess the potential for using image analysis technology for quantifying a range of hay characteristics including overall colour, leaf and stem colour, water/fungal damage, leaf:stem ratio and stem thickness. Such technologies would improve the accuracy with which fodder crops could be characterised and graded. Objective measures of hay quality that meet the needs of all end users and sellers, as well as methods for making these measurements which are uniform across Australia, would be important in this regard.

This project leans more towards development than research. It will draw heavily on other research experience concerning palatability and nutritional content and, if relevant, on any micro-spectral analysis of fodder that can be found. Published items such as “Accounting for co-extractable compounds in spectrophotometric measurement of extractable and total-bound proanthocyanidin in Leucaena spp” [1] are concerned with the acquisition of knowledge. This project is concerned with the exploitation of any such knowledge in the design of an instrument or instruments that are handy for a farmer to use.

We have a substantial body of experience in the acquisition and analysis of images, including the hardware design of embedded cameras and processors. This can be applied to the analysis of texture, shape, colour and spectral analysis of any properties that have been shown to reveal attributes of the fodder of interest to animal nutritionists and farmers. Results of the practical experiments into palatability at the University of Adelaide will be of great value in this work. We will also be grateful to receive advice from commercial enterprises with experience in this area, wherever such advice is made available.

¹ http://www.rirdc.gov.au/programs/fca.html#Latest

2. Objectives

The outcome would be an automated, repeatable methodology for classifying fodder quality which would provide competitive advantage to the Australian industry through better quality control of fodder products for domestic and export markets. This project aimed to deliver a low-cost bench prototype system to automatically measure the following characteristics of fodder samples:

• stem width
• colour
• leaf to stem ratio

Using these measurements, other classification information may be interpolated or generated. Other characteristics, such as water damage, weed content and grain content, were not included in this research due to a lack of manually assessed samples with a wide spread of assessed levels. These characteristics may be investigated in the future with further input from an expert knowledge owner such as an exporting company.

3. Methods

Light emission

Each light radiating body (incandescent lights, fluorescent lights, the Sun) emits different intensities of light at every wavelength. Figure 1 below is a representation of solar emission as received at the surface of the Earth, having been filtered by the atmosphere.

Figure 1. Solar light emissions

Note that the peak values inside the atmosphere correlate with the wavelengths to which human eyes are adapted. The colour of an object is determined by the reflective efficiency of that object at each of the various wavelengths. So, for example, an object that appears red will reflect a substantial proportion of light with wavelengths around 740nm, but will absorb most of the other light.

Properties of vision

The image seen by a human eye is simply light reflected by objects and focussed on the retina. This light falling on the retina is transformed into neural pulses that allow the brain to ‘see’. The amount of light at each point sends a corresponding number of neural pulses, to differentiate between bright and dark. Colour is differentiated by three different types of cone photoreceptors in the retina: Blue (435nm peak), Green (535nm peak) and Red (575nm peak). Each of these detectors responds more strongly to light near its peak value. Using just these three types of detectors, the human brain is capable of fine colour differentiation, based on mapping the incoming spectrum into just three numbers representing red, green and blue.

In fact there can be colour details that the human eye cannot determine. A mixture of red light and green light will be perceived as yellow, which the eye could not separate from monochromatic sodium light (as used to light motorways), in which a green object would look grey. There is thus a possibility that machine vision can go beyond the perception of the human eye, even within the visible range.

The extents of the visible range of light wavelengths are from around 380nm (violet) to 740nm (red). Wavelengths outside this region are still transmitted and reflected, but human eyes do not register any effect.

Digital imaging sensors, such as the Logitech Ultravision camera used in this project, are in many ways equivalent to the human eye. The images are focussed by a lens onto the image plane, where an array of photoreceptors measures the intensity of the light. At each point, or pixel, of the image, a single band of wavelengths is filtered through. The imaging array consists of alternating red, green and blue sensitive pixels. The mixture of these colour channels then provides the information for the complete image. It is possible that by using alternative bands to split up the light, differences in fodder could be seen that did not strike the eye.

Fodder reflectance

Using a spectral analysis device (ASD Devices FieldSpec® 3) we have recorded the reflectance of 58 samples of lucerne and oaten hay under solar radiation. The spectral analysis device was calibrated according to the manufacturer’s specification with the white calibration panel provided. Each sample was then held in direct sunlight, and the reflectance at every wavelength between 350nm and 2500nm was recorded. Figures 2 and 3 display the measured sample data in the visual and NIR wavelength range.

Figure 2. Reflectance spectrum for oaten hay samples – visible and NIR range only

Figure 3. Reflectance spectrum for lucerne hay samples – visible and NIR range only

Figure 4 displays the averaged reflectance over all the samples, grouped by either oaten or lucerne hay.

Figure 4. Reflectance average for Lucerne and Oaten Hays

The noisy signal between 1350-1380nm and 1800-1915nm corresponds to water absorption bands, where no real information remains, and must be disregarded. The graph above clearly displays that oaten hay is significantly more reflective than lucerne hay. There is also a significantly different slope in the visible and near infrared (NIR) regions.
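As an illustration of the processing applied to these curves, a minimal sketch that averages the FieldSpec readings by hay type and masks out the water-absorption bands is shown below. It assumes the 58 spectra have been exported to a hypothetical CSV file with columns wavelength_nm, sample_id, hay_type and reflectance; the file name and column names are not part of the original project.

```python
import numpy as np
import pandas as pd

# Hypothetical export of the FieldSpec readings: one row per wavelength per sample.
spectra = pd.read_csv("fieldspec_reflectance.csv")  # columns: wavelength_nm, sample_id, hay_type, reflectance

# Mask the water absorption bands (1350-1380 nm and 1800-1915 nm) where no real information remains.
wl = spectra["wavelength_nm"]
water_bands = ((wl >= 1350) & (wl <= 1380)) | ((wl >= 1800) & (wl <= 1915))
spectra.loc[water_bands, "reflectance"] = np.nan

# Average reflectance at each wavelength, grouped by hay type (lucerne vs oaten), as in Figure 4.
mean_curves = (spectra
               .groupby(["hay_type", "wavelength_nm"])["reflectance"]
               .mean()
               .unstack(level=0))
print(mean_curves.head())
```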

3.1 System development (hardware)

A hardware system has been devised and assembled to perform the relevant measurements by machine vision. The system comprises a laptop computer attached to a webcam and a lightproof sampling enclosure. Figure 5 illustrates the physical layout of the sampling area. The system is provided with constant illumination to ensure colour constancy throughout the testing procedures.

Figure 5. Illustration of the sample enclosure layout

3.2 System development (software)

Using the machine vision software platform developed at NCEA, based upon Microsoft™ DirectShow technology, a custom application has been developed for this project. This application consists of a user interface and an image processing core, for the application of the developed algorithms in real time. This software (Figure 6) can process saved images or streaming video from the camera.

Figure 6. Example image using the machine vision platform software
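The DirectShow-based platform shown in Figure 6 is not reproduced here; purely as a rough analogue, the same "saved image or live camera" behaviour can be sketched with OpenCV (the file name and camera index below are assumptions):

```python
import cv2

def grab_frame(source=0):
    """Return one frame, either from a saved image (string path) or the webcam (integer index)."""
    if isinstance(source, str):
        return cv2.imread(source)       # pre-recorded image file
    cap = cv2.VideoCapture(source)      # streaming video from the camera
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None

frame = grab_frame(0)                   # or grab_frame("sample.bmp")
```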

Stem Width

A single number representing stem width was presented in the manual measurements from project UA-64A; the manual number was averaged from a sub-sample of 10 individual stems. The project team has determined that a histogram of all stem widths present in a sub-sample may be a better reflection of the characteristics of that sample, and an algorithm has been created to accumulate this data.

To commence, each sub-sample must be made independent of colour and lighting effects. The algorithm therefore thresholds the image at a level where exactly 35% of the pixels in the image are ‘target’ pixels, which ensures that similar levels of stem edges are detected between samples. The algorithm then identifies each stem edge using a standard edge detection algorithm (a Sobel filter). The opposite edge of the stem is then detected by searching perpendicular to the direction of the edge until an equal and opposite boundary is found. The physical distance between the edges is then accumulated into the histogram.

Calibration

The calibration of this algorithm was performed using a set of templates. The templates contained stripes of known width at various orientations. An example set of images is displayed in Figure 7 below.

Figure 7. Example calibration images: a) thin (7 pixels), b) thick (22 pixels)

The results are displayed in the following graphs. Figure 8 displays the results of the four rotations of the thin lines. The oblique lines give rise to quantisation errors (pixelation). This causes a spread in the histogram of ±1 pixel.

Figure 8. Results of the Stem Width Algorithm on rotations of a fixed image

Figure 9 displays the results of the thicker lines. Note that the pixelation errors again cause only a ±1 pixel spread in the histogram.

Figure 9. Results of the Stem Width Algorithm on rotations of a fixed image with thicker lines

Figure 10 below displays sample images processed by the stem width algorithm along with the resulting stem width histograms. It is immediately and visibly obvious that the second image, with a histogram skewed further to the right, has a distribution of wider stems.

Figure 10. Sample lucerne hay images with resultant stem width histograms.
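A simplified sketch of the width-measurement idea is given below: the image is thresholded so that 35% of its pixels are ‘target’ pixels, and the length of every contiguous run of target pixels is accumulated into a histogram. The prototype itself detects stem edges with a Sobel filter and searches perpendicular to each edge, so this row-wise version is an approximation only; the file name is illustrative.

```python
import numpy as np
import cv2

def stem_width_histogram(gray, target_fraction=0.35, max_width=40):
    """Histogram of run lengths across bright 'stem' pixels (a row-wise simplification)."""
    # Threshold so that ~35% of the pixels are 'target' pixels, which makes the
    # measurement largely independent of overall brightness and colour.
    level = np.percentile(gray, 100 * (1.0 - target_fraction))
    target = (gray >= level).astype(np.int8)

    hist = np.zeros(max_width + 1, dtype=int)
    for row in target:
        # Start and end positions of each consecutive run of target pixels in this row.
        padded = np.concatenate(([0], row, [0]))
        changes = np.diff(padded)
        starts = np.where(changes == 1)[0]
        ends = np.where(changes == -1)[0]
        for width in ends - starts:
            if 1 <= width <= max_width:
                hist[width] += 1
    return hist

gray = cv2.cvtColor(cv2.imread("L36-1.bmp"), cv2.COLOR_BGR2GRAY)
print(stem_width_histogram(gray))
```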

Colour

Oaten hay colour was manually assessed by Balco Australia, using a TrueGrade scanner. This scanner combines the green channel and the percentage total green area into a score from 0 to 60. The lucerne hay was subjectively graded into one of 5 arbitrary colour groups. The project team has decided that a deterministic measure of the average colour in the sub-sample is a better representation of colour. This average chromaticity in each of the red, green and blue channels will be repeatable and may be used objectively to grade samples. The algorithms to convert to the 60 point TrueGrade scale are not available at this time, but may be implemented at a future time.

As the testing box is a closed system, the only light is supplied by the fluorescent tubes mounted inside the box. This ensures constant illumination for the image sensor. The algorithm samples every pixel in the active centre part of the image. The chromaticity values are summed over the region and the average determined. Figure 11 below displays the chromaticity values as white dots overlaid onto the image. The colour triangle representation has increasing red vertically, increasing blue to the bottom right corner, and increasing green to the bottom left corner. In this example image, it can be seen that the average is low in blue and high in green.

Figure 11. Colour triangle example input image and resulting chromaticity representation

The colour is reported as a chromaticity unit vector in the red, green and blue channels, that is, the proportion of each colour in the total intensity of the pixel:

r = R / (R + G + B),  g = G / (R + G + B),  b = B / (R + G + B)

The average chromaticity values for the sample are then the means of r, g and b taken over the N sampled pixels:

r_avg = (1/N) Σ r_i,  g_avg = (1/N) Σ g_i,  b_avg = (1/N) Σ b_i
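A minimal sketch of this averaging over the central part of a captured frame is shown below; the fraction of the frame treated as the ‘active centre part’ and the file name are assumptions.

```python
import numpy as np
import cv2

def mean_chromaticity(image_bgr, centre_fraction=0.6):
    """Average r, g, b chromaticity over the central region of the image."""
    h, w = image_bgr.shape[:2]
    dy = int(h * (1 - centre_fraction) / 2)
    dx = int(w * (1 - centre_fraction) / 2)
    roi = image_bgr[dy:h - dy, dx:w - dx].astype(np.float64)

    b, g, r = roi[..., 0], roi[..., 1], roi[..., 2]   # OpenCV stores channels as B, G, R
    total = r + g + b
    total[total == 0] = 1.0                           # avoid division by zero on black pixels
    return np.mean(r / total), np.mean(g / total), np.mean(b / total)

print(mean_chromaticity(cv2.imread("L36-1.bmp")))
```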

Stem to leaf ratio

The visual difference between stem and leaf is related to both the colour and texture of elements of the image. Stems are generally straighter and brighter than leaf areas. Converting the image from the standard red, green, blue colour space to the XYZ colour space, using the standard linear RGB-to-XYZ transformation, provides a discrimination function between stem and leaf areas in the X channel.
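A sketch of one standard form of this conversion is given below, using the CIE coefficients for linear RGB; the exact constants used by the prototype are assumed rather than documented here.

```python
import numpy as np
import cv2

# Standard CIE RGB -> XYZ coefficients (an assumption; the prototype's exact constants are not given).
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],   # X
                       [0.2126, 0.7152, 0.0722],   # Y
                       [0.0193, 0.1192, 0.9505]])  # Z

def x_channel(image_bgr):
    """Return the X channel of the XYZ colour space, scaled to 0-255 for display."""
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB).astype(np.float64) / 255.0
    x = rgb @ RGB_TO_XYZ[0]                         # X = 0.4124 R + 0.3576 G + 0.1805 B
    return (255.0 * x / max(x.max(), 1e-6)).astype(np.uint8)

cv2.imwrite("lucerne_x_channel.png", x_channel(cv2.imread("L36-1.bmp")))
```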

Using this colour space, stems appear as bright areas in the image and leaves appear as darker areas. Figure 12 below displays an example image of lucerne hay in the prototype testing unit. Figure 13 displays the grey-scaled X colour channel of the converted XYZ colour space representation of the image.

Figure 12. Example lucerne hay image

Figure 13. Example Lucerne hay image in X channel of the XYZ colour space

The algorithm then identifies bright areas in the image as possible stems. The next step is to check the texture of the areas. This is done by tracing the edges of each bright area using s-psi graphs [2, 3]. Areas that are bounded by straight edges are classified as stems; all other areas are classified as leaf. The total area in the image thus classified determines the stem to leaf ratio for the sub-sample.
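The s-psi boundary tracing of [2, 3] is not reproduced here. The sketch below captures the same intent with a simpler test: bright regions in the X channel whose boundaries are well explained by a thin, elongated rectangle are counted as stem. The thresholds are illustrative assumptions, not the prototype's values.

```python
import numpy as np
import cv2

def stem_percentage(x_channel, bright_thresh=150, min_area=50,
                    min_aspect=4.0, min_rect_fill=0.6):
    """Rough stem/leaf split: bright, elongated, straight-sided regions count as stem."""
    bright = (x_channel >= bright_thresh).astype(np.uint8)
    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    stem_area = 0.0
    total_area = float(np.count_nonzero(bright))
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < min_area:
            continue
        (_, _), (w, h), _ = cv2.minAreaRect(contour)
        aspect = max(w, h) / max(min(w, h), 1.0)
        rect_fill = area / (w * h) if w * h > 0 else 0.0
        # Straight, elongated regions fill their minimum-area bounding rectangle well.
        if aspect >= min_aspect and rect_fill >= min_rect_fill:
            stem_area += area
    return 100.0 * stem_area / total_area if total_area else 0.0
```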

4. Results

The University of Adelaide provided 145 samples of fodder from Project UA-64A. The details provided were as follows:

• Sample #
• Colour Score (/5 Lucerne, /60 Oaten)
• Average Stem Diameter (mm)
• Min Stem Diameter (mm)
• Max Stem Diameter (mm)
• Leaf:stem
• Stem %
• Leaf %
• NIR Predicted Shear (kJ/m2)

Colour

Under invariant lighting, the average colour of a sub-sample of each type of fodder was determined. While the average intensity of each of the three colour channels was found to vary substantially between repetitions of the measurement, the average chromaticity (the percentage contribution to the total made by each of the three individual channels) was found to be more consistent. Table 1 illustrates this point with 10 sub-samples of a single type of lucerne hay. In this experiment, a sub-sample bin was filled from the bag and then placed inside the camera box. The software processed the image and the sub-sample was removed. The sub-sample was then replaced in the bag, mixed, and a different sub-sample taken. This process was replicated 10 times. The standard error for the raw colour measurements is demonstrated to be 4 to 10 times higher than the standard error for chromaticity.

Table 1. Colour and Chromaticity measurements for 10 independent subsamples of one type of Lucerne hay

Raw Colour
| File | Red | Green | Blue |
| --- | --- | --- | --- |
| L36-1.bmp | 123 | 114 | 97 |
| L36-10.bmp | 125 | 120 | 99 |
| L36-2.bmp | 116 | 108 | 87 |
| L36-3.bmp | 111 | 103 | 84 |
| L36-4.bmp | 130 | 122 | 105 |
| L36-5.bmp | 106 | 100 | 88 |
| L36-6.bmp | 118 | 109 | 89 |
| L36-7.bmp | 130 | 121 | 104 |
| L36-8.bmp | 138 | 129 | 109 |
| L36-9.bmp | 127 | 120 | 104 |
| Average | 122.4 | 114.6 | 96.6 |
| Std Deviation | 9.697651 | 9.335714 | 8.959167 |
| Std Error | 0.079229 | 0.081463 | 0.092745 |

Chromaticity
| File | Red | Green | Blue |
| --- | --- | --- | --- |
| L36-1.bmp | 0.368263 | 0.341317 | 0.290419 |
| L36-10.bmp | 0.363372 | 0.348837 | 0.287791 |
| L36-2.bmp | 0.37299 | 0.347267 | 0.279743 |
| L36-3.bmp | 0.372483 | 0.345638 | 0.281879 |
| L36-4.bmp | 0.364146 | 0.341737 | 0.294118 |
| L36-5.bmp | 0.360544 | 0.340136 | 0.29932 |
| L36-6.bmp | 0.373418 | 0.344937 | 0.281646 |
| L36-7.bmp | 0.366197 | 0.340845 | 0.292958 |
| L36-8.bmp | 0.367021 | 0.343085 | 0.289894 |
| L36-9.bmp | 0.361823 | 0.34188 | 0.296296 |
| Average | 0.367026 | 0.343568 | 0.289406 |
| Std Deviation | 0.004704 | 0.002949 | 0.006622 |
| Std Error | 0.012817 | 0.008585 | 0.022883 |
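The repeatability comparison quoted above can be reproduced from the ten subsample values in Table 1; the sketch below computes the spread (standard deviation relative to the mean) for the raw red channel and for red chromaticity.

```python
import numpy as np

# Ten subsamples of one lucerne hay (red values from Table 1).
raw_red = np.array([123, 125, 116, 111, 130, 106, 118, 130, 138, 127], dtype=float)
chrom_red = np.array([0.368263, 0.363372, 0.372990, 0.372483, 0.364146,
                      0.360544, 0.373418, 0.366197, 0.367021, 0.361823])

def relative_spread(values):
    """Sample standard deviation expressed as a fraction of the mean."""
    return np.std(values, ddof=1) / np.mean(values)

print(f"raw red:          {relative_spread(raw_red):.4f}")      # ~0.079
print(f"red chromaticity: {relative_spread(chrom_red):.4f}")    # ~0.013, several times smaller
```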

Stem width

The histograms for the 10 sub-samples of a single type of lucerne hay are shown below in Figure 14. It is clear that there are some minor variations, but the shape of the histogram is similar in each case. The software provided records the complete histogram, so there may be other statistical methods to extract more meaningful information from this histogram for further classification projects.

Figure 14. Stem width histogram for 10 subsamples

Table 2 provides the summary statistics for these histograms.

Table 2. Repeatability measurements for stem histogram
| File | Median | Skew | Max | Weighted Av |
| --- | --- | --- | --- | --- |
| L36_1.bmp | 0.016672 | 1.0082 | 8 | 14.67055685 |
| L36_2.bmp | 0.016456 | 0.8876 | 8.5 | 15.20826751 |
| L36_3.bmp | 0.021459 | 0.7969 | 8 | 15.3929721 |
| L36_4.bmp | 0.01323 | 1.278 | 9 | 13.54571653 |
| L36_5.bmp | 0.01323 | 1.278 | 9 | 13.54571653 |
| L36_6.bmp | 0.018755 | 1.0164 | 8 | 15.01685241 |
| L36_7.bmp | 0.020125 | 0.9702 | 8 | 15.35953223 |
| L36_8.bmp | 0.013946 | 1.2825 | 10 | 13.03305785 |
| L36_9.bmp | 0.017512 | 1.0856 | 9 | 14.224993 |
| L36_10.bmp | 0.015378 | 1.079 | 9 | 14.10242664 |
| Average | 0.016676 | 1.0682 | 8.65 | 14.41000916 |
| St Dev | 0.002843 | 0.1688 | 0.6687468 | 0.847218115 |
| Std Error | 0.17046 | 0.158 | 0.0773118 | 0.058793725 |

Stem to leaf ratio

The samples provided to this project were in small (20cm × 20cm) bags. As these sub-samples have been transported and stored for a long period of time, as well as being taken from a mixed bale, it has proven difficult to correlate the manual measurement of stem:leaf ratio with the machine vision analysis results. This is because the leaves have mainly fallen from the stems and crushed into fines. The repeatability measure is within 5%, as displayed in Table 3.

Table 3. Repeatability measurements for stem to leaf ratio
| File | Stem to Leaf Ratio |
| --- | --- |
| L36-1.bmp | 85.1 |
| L36-10.bmp | 90.6 |
| L36-2.bmp | 85.2 |
| L36-3.bmp | 89.6 |
| L36-4.bmp | 89.6 |
| L36-5.bmp | 90.8 |
| L36-6.bmp | 94.1 |
| L36-7.bmp | 92.6 |
| L36-8.bmp | 89.4 |
| L36-9.bmp | 87.3 |
| Average | 89.43 |
| Std Deviation | 2.912445 |
| Std Error | 0.032567 |

Comparison Results

The full process was undertaken with single sub-samples for 10 each of lucerne and oaten hays randomly selected from the hay provided by the UA project. The results are tabulated in Table 4 and Table 5 below, along with the manually measured data (where relevant).

Table 4. Automatic Results vs Manual Results

Manual columns: Colour Score (/60), Ave Stem Dia (mm), Min Stem Dia (mm), Max Stem Dia (mm), Leaf:stem. Automatic columns: Red, Green, Blue, Stem to Leaf Ratio, Shadow Percent, Histogram Peak, Weighted Av.

| Name | Colour Score (/60) | Ave Stem Dia (mm) | Min Stem Dia (mm) | Max Stem Dia (mm) | Leaf:stem | Red | Green | Blue | Stem to Leaf Ratio | Shadow Percent | Histogram Peak | Weighted Av |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| O_19 | 23 | 5.08 | 3.50 | 7.80 | 0.471 | 0.354 | 0.362 | 0.283 | 61.4 | 8.81 | 8 | 13.84 |
| O_17 | 27 | 6.26 | 3.50 | 10.30 | 0.250 | 0.364 | 0.36 | 0.275 | 73.5 | 6.81 | 5 | 15.75 |
| O_STDE | 22 | 4.68 | 3.10 | 6.20 | 0.538 | 0.359 | 0.355 | 0.286 | 69.1 | 5.21 | 17 | 14.75 |
| O_31 | 32 | 5.40 | 3.90 | 8.00 | 0.695 | 0.301 | 0.38 | 0.319 | 34.4 | 1.95 | 6 | 13.64 |
| O_34 | 27 | 4.94 | 2.50 | 6.70 | 0.639 | 0.375 | 0.355 | 0.271 | 71.3 | 12.7 | 4 | 19.36 |
| O_45 | 24 | 4.95 | 3.90 | 5.70 | 0.724 | 0.361 | 0.35 | 0.289 | 73.5 | 5.9 | 13 | 13.58 |
| O_21 | 33 | 5.21 | 3.90 | 7.20 | 0.316 | 0.369 | 0.354 | 0.278 | 81.2 | 4.39 | 14 | 15.96 |
| O_STDBC | 22 | 5.79 | 4.90 | 8.40 | 0.515 | 0.348 | 0.358 | 0.294 | 69.9 | 8.48 | 4 | 16.60 |
| O_48 | 34 | 5.20 | 3.00 | 7.80 | 0.124 | 0.332 | 0.354 | 0.314 | 52.6 | 6.99 | 6 | 15.13 |
| O_37 | 31 | 4.97 | 3.30 | 6.20 | 0.724 | 0.358 | 0.35 | 0.292 | 65.6 | 7.68 | 4 | 17.27 |
| O_13 | 27 | 5.51 | 3.10 | 8.90 | 0.515 | 0.376 | 0.345 | 0.279 | 80.5 | 5.81 | 6 | 16.01 |
| O_STDD | 29 | 5.55 | 3.70 | 7.20 | 0.266 | 0.356 | 0.353 | 0.291 | 80.9 | 5 | 10 | 15.94 |
| O_22 | 28 | 6.07 | 4.10 | 7.90 | 0.506 | 0.375 | 0.351 | 0.275 | 78.4 | 8.31 | 7 | 17.56 |
| O_17 | 27 | 6.26 | 3.50 | 10.30 | 0.250 | 0.359 | 0.345 | 0.295 | 86 | 4.02 | 4 | 15.18 |
| L_LP | 4 | 1.80 | 0.80 | 2.60 | 0.351 | 0.425 | 0.354 | 0.222 | 73.4 | 24 | 6 | 13.32 |
| L_53 | 5 | 1.73 | 0.90 | 3.70 | 0.042 | 0.425 | 0.346 | 0.228 | 78.5 | 16 | 5 | 11.37 |
| L_52 | 4 | 1.76 | 0.50 | 3.00 | 0.190 | 0.395 | 0.346 | 0.259 | 74.9 | 14.7 | 4 | 12.02 |
| L_57 | 4 | 2.06 | 1.30 | 2.90 | 0.299 | 0.384 | 0.352 | 0.263 | 75 | 14.4 | 5 | 15.91 |
| L_58 | 5 | 3.05 | 2.30 | 3.70 | 0.053 | 0.393 | 0.352 | 0.255 | 65.2 | 23.9 | 7 | 14.88 |
| L_42 | 3 | 1.47 | 0.70 | 2.10 | 1.326 | 0.416 | 0.357 | 0.226 | 72 | 18.2 | 7 | 11.06 |
| L_50 | 5 | 1.64 | 0.40 | 2.60 | 0.282 | 0.404 | 0.35 | 0.246 | 66.9 | 26.5 | 5 | 17.23 |
| L_56 | 5 | 1.85 | 0.50 | 3.20 | 0.205 | 0.382 | 0.349 | 0.269 | 61.2 | 24 | 6 | 11.14 |
| L_44 | 4 | 1.50 | 0.50 | 2.00 | 0.370 | 0.447 | 0.346 | 0.206 | 84.4 | 13.2 | 5 | 11.34 |
| L_68 | 2 | 1.36 | 0.70 | 2.10 | 0.923 | 0.361 | 0.369 | 0.27 | 43.9 | 22 | 7 | 11.78 |

Table 5. Stem width histograms: counts in each stem width bin (1 to 40 pixels) for each of the samples listed in Table 4.

Using these results, the data can be correlated against the manual measurements. Figure 15 displays the correlation of the manual stem width measurement with the weighted average of the histogram. Even though the r² value is less than 0.5, it is unclear at this stage how much of the variation is due to the repeatability of the manual measurement method (10 individual measurements). It is anticipated by the project team that using statistical measurements from the stem width histogram will prove more reliable for classification purposes.
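Summary statistics such as those in Table 2 can be computed directly from a stem width histogram. A sketch of the weighted average and a simple moment-based skew (which may not be the exact skew measure used in Table 2) is:

```python
import numpy as np

def histogram_stats(counts):
    """Weighted average, spread and skew of a stem width histogram.

    counts[i] is the number of width measurements that fell into bin i+1 (pixels)."""
    counts = np.asarray(counts, dtype=float)
    widths = np.arange(1, len(counts) + 1)
    total = counts.sum()
    mean = (widths * counts).sum() / total                       # weighted average width
    var = (((widths - mean) ** 2) * counts).sum() / total
    skew = (((widths - mean) ** 3) * counts).sum() / (total * var ** 1.5)
    return mean, np.sqrt(var), skew

# Illustrative counts only (not taken from the report's data).
print(histogram_stats([10, 40, 80, 60, 30, 10, 5]))
```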

Figure 15. Histogram weighted average vs manual stem width measurement

The comparison between the manually measured Leaf:stem ratio and the automatic Stem % is displayed in Figure 16. As discussed previously, this correlation is poor due to disintegration and settling of the bagged samples since the manual measurements were made.

Figure 16. Stem Percentage vs Manual Leaf:Stem

The colour representations also produce no meaningful correlation, due to the manual scales used. It is envisaged that expert classification advice will be required to split the 3-dimensional RGB colour space into the required categorisations.
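Once expert-assigned grades are available, one simple way of carrying them across to the measured chromaticity would be a nearest-centroid rule. Everything in the sketch below (grade labels and reference chromaticities) is purely illustrative, not an agreed grading scale.

```python
import numpy as np

# Purely illustrative reference chromaticities (r, g, b) for hypothetical grades.
GRADE_CENTROIDS = {
    "Grade 1": np.array([0.36, 0.36, 0.28]),
    "Grade 2": np.array([0.38, 0.35, 0.27]),
    "Grade 3": np.array([0.41, 0.35, 0.24]),
}

def nearest_grade(chromaticity):
    """Assign the grade whose reference chromaticity is closest to the measurement."""
    c = np.asarray(chromaticity, dtype=float)
    return min(GRADE_CENTROIDS, key=lambda g: np.linalg.norm(c - GRADE_CENTROIDS[g]))

print(nearest_grade([0.367, 0.344, 0.289]))   # the Table 1 average falls nearest to "Grade 1" here
```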

5. Discussion

This project has demonstrated the feasibility of machine vision technology for the measurement of fodder characteristics. The prototype developed is a fully functional system which can provide and record the information as fast as the human operator can load the samples.

The average colour of the sample input into the system is provided in chromaticity components, which is the percentage of the total intensity provided by each colour channel. This chromaticity remains constant under most lighting conditions; however, it is recommended to use the invariant lighting box provided to ensure the best results are attained. The colour of the sample will be most beneficial to the follow-on project to determine a grading system given the objectively measured characteristics. This colour may also be correlated to the Balco TrueGrade grading system for oaten hay if the manual grading algorithm is known.

The stem width histogram will provide additional information to the fodder grading process. The histogram is built by identifying every edge component in the image and measuring the distance (in pixels) across the stem to the opposite edge. The mean and skew of the histogram will probably be the starting point for using this information effectively.

The stem to leaf ratio is identified using a combination of colour and shape information. The image is first segmented into possible stems using the X colour channel of the XYZ colour space. The outlines of the areas identified are then traced and, if they are bounded by mainly straight edges, the area is counted as stem; otherwise it is counted as leaf.

The prototype unit was evaluated externally by Balco Australia, a hay and grain exporting company. The unit was installed successfully and some initial testing was undertaken. The tester indicated that the simplicity and useability of the system was a major advantage. Some minor software issues with the data displayed and with workflows were raised. These issues were resolved with software upgrades sent via email. The system is still under external testing for large scale data correlation against standard grading procedures. This information will be reported to project stakeholders in due course.

6. Implications

The objective, repeatable measurement of the physical characteristics of fodder provides a valuable tool for growers, exporters and researchers. At the right price and with the right ‘user-friendliness’, the instrument will have appeal for growers, traders and consumers. It will also have appeal for the manufacturers of such an instrument. The availability of an objective assessment should go far to quell any disagreements between parties. It is, of course, necessary for policymakers to determine how the quoted grades should be related to the features that can be measured.

This project has developed an important methodology for quantifying fodder quality assessment using machine vision. It is recommended that the system developed be evaluated on a broader scale within the industry by those currently assessing fodder samples using manual methods. This will ensure that the laboratory results can be extended to practical use. Opportunities to work with commercial companies in the industry who may wish to commercialise the technologies should be sought.

7. References

[1] S. A. Dalzell and G. L. Kerven, "Accounting for co-extractable compounds in spectrophotometric measurement of extractable and total-bound proanthocyanidin in Leucaena spp," Journal of the Science of Food and Agriculture, vol. 82, pp. 860-868.

[2] M. Dunn, J. Billingsley, S. Raine, and A. Piper, "Using Machine Vision for Objective Evaluation of Ground Cover on Sporting Fields," presented at Proceedings 11th IEEE Conference on Mechatronics and Machine Vision in Practice, Macau, 2004.

[3] M. Dunn, J. Billingsley, and N. Finch, "Machine Vision Classification of Animals," in Mechatronics and Machine Vision 2003: Future Trends. Baldock, UK: Research Studies Press Ltd, 2003, pp. 157-163.

Appendix A – Prototype Installation Instructions

This software is designed as a prototype machine vision system to quantify fodder quality. The low cost camera based device gives an assessment of those properties that can be determined from image analysis techniques for specifying visual aspects of fodder quality.

Under invariant lighting, the average colour of a subsample of each type of fodder can be determined. While the average intensity of each of the three colour channels was found to vary substantially between repetitions of the measurement, the average chromaticity (the percentage of contribution to the total made by each of the three individual channels) was found to be more consistent. It has been determined that a histogram of the stem widths present in a subsample can be a better measurement than a single averaged number, and an algorithm has been included to accumulate this data. The stem:leaf ratio is an indicator of the relative percentage of stem found in the image.

The steps covered in this appendix are:
• Setting up the software
• Connecting the box to the PC
• Setting up DirectX
• Testing the Hay

To set up the included software:

Insert the CD into the PC. The installer should start automatically. If it does not, open up Windows Explorer and navigate to the CD/DVD drive. Double-click on "Install.exe" in the base directory of that drive.

The installer will ask for a destination directory. If you wish, you can select a more favourable directory to install to. Once selected, click the "Install" button in the lower right corner of the window to continue.


A progress bar should display the installation progress. After the bar has reached the end, the "close" button at the lower right corner of the window will be enabled. Click it to complete installation.

This software requires DirectX 8.0 or higher libraries. If they are not installed on the computer, the installation software will inform you to install DirectX before continuing. If this is the case, follow the instructions in the section labelled "Installing DirectX", then restart the installation process from the beginning. Once the software has been installed, an icon on the desktop and in the start menu should appear.


Connecting the box to the PC: Insert the CD labelled "QuickCam" into the PC's CD-Rom drive.

The installation software will automatically start. It will commence with an install program, then will present you with a menu.

Ensure the USB plug is not connected to the PC and virus scanner software is turned off. Click "Custom Installation (Advanced)" and press "next".


If you selected "Get the Latest Software", you will be prompted whether you want the software to use the internet to check for the latest version of camera drivers. If you do not want to do this, or no internet connection is available, press "Cancel". Otherwise, press "OK" and follow the update instructions (Those steps are not covered here).

If the update software alerts you that it cannot connect to the internet, press "Cancel", as this update is unnecessary.

Click "Logitech QuickCam drivers" and press "next".

The next step will see you presented with a License Agreement. Read the agreement and click on "I accept the terms in the license agreement". Press "Next" to continue.


You will then be presented with update options. It is recommended you click "I do not want to activate this service". Once chosen, press "next" to continue.

The software will then install. This may take some time, so be patient.


You will then be prompted whether you wish to restart the PC. It is recommended to do so, however ensure you don't have any unsaved or unfinished documents etc. running. To restart, press "Restart Now". Alternatively you can press "Restart Later" if you wish to finish the steps later. If so, you will need to restart the PC before continuing.

Plug in the power plug to a 12V supply and the USB plug to the PC. Virus scanner software may now be turned back on (if the restart did not do so already).


Windows should detect and install the device. If this does not happen, ensure the USB plug is correctly inserted.

Installing DirectX:

This is only necessary if you don't have version 8.0 or later of DirectX installed on the PC. To check, run the installer above; it will indicate if DirectX needs to be installed on the PC. Note that it is OK to reinstall DirectX from any version if the software detects the incorrect version or if problems are encountered.

To install DirectX, open up Windows Explorer and navigate to the CD/DVD drive. Double-click on "DirectX.exe" in the base directory.

Press "Yes" to accept the license agreement and continue.

Choose an appropriate location for the temporary files. If unsure, type "c:\temp" in the text area. Press "OK" to continue.


The program will then copy many files to that location. This may take some time, so be patient. Once this is done, the program will exit (and the window will disappear).

Open up Windows Explorer again (or use the same window if you wish), and navigate to the directory you specified (such as "c:\temp" in our example). Double-click on "DXSETUP.exe".

Click on the radio button next to "I accept the agreement" to indicate you do accept the agreement, and press "Next" to continue.


Press "next" to start the DirectX install process.

The DirectX installer will now install onto the PC. This may take a long time, so be patient.


The program will then display a message indicating that it installed correctly. If it displays different text asking to save your work first, this indicates that it will automatically restart the PC when the program closes. Press "Finish" to close the program.


Testing the Hay: First, ensure the box is plugged in to both the 12V power supply and the USB port on the PC.

To start up the hay analysis program, run "Hay Analysis" in the "Hay" section from the Start menu.

The main window will then be displayed. The menu is along the top.

If you wish to use a pre-recorded image file, click on the "File" menu, and select "Open Media File". Then, navigate to the correct file, and press "open".


If instead you wish to use a set of pre-recorded images, click on the "File" menu, and select "Open Picture File Set". Then, navigate to a file in the correct directory, and press "open". The program will then process all picture files in the directory.

To instead use images directly from the light box, press "Start Camera".

If you are finished with images directly from the box (such as if you wish to use pre-recorded images instead), simply click "Stop Camera".

If you wish to use an alternate camera or if you have several cameras plugged in, click on the "File" menu, and select "Select Capture Device".

A menu of available cameras will be presented. Select which one you wish to use and press "OK".


If you wish to change the camera properties (such as the brightness), click on the "Options" menu, and select "Show property page". This is not recommended, as the default settings are optimal.

Once you have an image you wish to capture, click on "snap". It will save that image to the hard-drive. The default location is "C:\VTCapture" followed by the current date.

If you click on "Show Colour Triangle", it will display colour information of the image. A triangle with colour borders will be displayed over the image. Each dot of the image is sampled, and the colour value is plotted in white. If a white dot is near a corner of the triangle, this indicates a pixel is mostly of the colour opposite to the corner.

In this example, a large number of image dots are white (or grey), while others are more green and red (and less blue). The spread of white dots leans toward the blue/red corner, which indicates that the image has a large amount of green (the border opposite that corner is green).


While an image is being displayed, you can click on "Sample". This will process the current image. The program will also capture the image in the same way as the "snap" menu item does.

Information about the image is then displayed on the right side of the image.

By clicking on the "Save Information" button, all the data displayed, along with the time and date, is written into a .csv file. The program will automatically append to this file if more than one save is performed.

The location of this file is displayed at the bottom of the data. To change this, press the "Browse" button, and select a different name or location.
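Because each press of "Save Information" appends a row, the accumulated log can be read back into a spreadsheet or script for later analysis. A sketch using pandas is shown below; the file path is an assumption and the column headers should be taken from the actual file.

```python
import pandas as pd

# The path below is an assumption; use the location shown at the bottom of the data panel.
log = pd.read_csv(r"C:\VTCapture\hay_results.csv")
print(log.describe())   # quick summary of every numeric column that was logged
```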

Near the top, three items labelled "Red:", "Green:" and "Blue:" are displayed. The numbers associated with them indicate the relative amount of that colour in the image. This is an indication of the proportion of each colour component.

Below the colour values is a histogram of the stem widths. With the left being the smallest stem width, this is an indication of how much of the image contains stems of particular widths. Thus a bump in a particular area indicates more of the image is filled with stems of this width.

The Stem to Leaf Ratio is an indication of the percent of the image that contains stems as opposed to leaves. This is useful for comparing different samples.

The Shadow Percent is an indication of the percentage of the image that contains shadows. These areas are not used in the preceding calculations.

