This article was downloaded by: [Michigan State University] On: 10 January 2013, At: 04:30 Publisher: Routledge Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Human Dimensions of Wildlife: An International Journal Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/uhdw20

Guidelines for Creating a Bat-Specific Citizen Science Acoustic Monitoring Program

Trixi A. Beeker (a), Kelly F. Millenbah (a), Meredith L. Gore (a), & Barbara L. Lundrigan (b)

(a) Department of Fisheries and Wildlife, Michigan State University, East Lansing, Michigan, USA
(b) Department of Zoology, Michigan State University, East Lansing, Michigan, USA

Version of record first published: 10 Jan 2013.

To cite this article: Trixi A. Beeker, Kelly F. Millenbah, Meredith L. Gore & Barbara L. Lundrigan (2013): Guidelines for Creating a Bat-Specific Citizen Science Acoustic Monitoring Program, Human Dimensions of Wildlife: An International Journal, 18:1, 58–67

To link to this article: http://dx.doi.org/10.1080/10871209.2012.686147


Human Dimensions of Wildlife, 18:58–67, 2013 Copyright © Taylor & Francis Group, LLC ISSN: 1087-1209 print / 1533-158X online DOI: 10.1080/10871209.2012.686147

Guidelines for Creating a Bat-Specific Citizen Science Acoustic Monitoring Program TRIXI A. BEEKER,1 KELLY F. MILLENBAH,1 MEREDITH L. GORE,1 AND BARBARA L. LUNDRIGAN2

1 Department of Fisheries and Wildlife, Michigan State University, East Lansing, Michigan, USA
2 Department of Zoology, Michigan State University, East Lansing, Michigan, USA

Bats are challenging study subjects, and effective conservation depends on the collection of high-quality data to monitor changing abundances and distributions. Compilation of such data could be achieved through acoustic monitoring programs that engage community stewards or citizen scientists. Currently, no guidelines exist for the creation of a bat-specific citizen science acoustic monitoring program. To address this problem, we compared program development guidelines relevant to other citizen science and acoustic monitoring programs to create a set of recommendations for developing such a program for bats. Our process included reviewing five U.S. citizen science program development guidelines and 12 U.S. and European acoustic monitoring programs to identify the elements most frequently applied. Twenty-five essential program elements, encompassing five categories, and five additional unique elements were identified. Our recommendations provide a guide for developing a citizen science acoustic bat monitoring program locally, regionally, or nationally.

Keywords: citizen science, program evaluation, Chiroptera, wildlife monitoring

Address correspondence to Kelly F. Millenbah, Department of Fisheries and Wildlife, 480 Wilson Rd., 13 Natural Resources Building, Michigan State University, East Lansing, MI 48824-1222, USA. E-mail: [email protected]

Introduction

Around the world, bats play a vital role in maintaining healthy ecosystems. They serve, in particular, as primary consumers of nocturnal insects, many of which damage forest resources or are pests to crops and people. Bats thus provide an indispensable ecological service—that of balancing insect populations in urban, wilderness, and agricultural areas (Bat Conservation International, 2010; Pierson, 1998). In addition, bat guano supports unique microorganisms that have the potential to benefit humans in a variety of contexts, such as the detoxification of industrial waste (Fascione, 2010).

Unfortunately, many bat species are now threatened or endangered. Of the 45 bat species in the United States, 11 are listed by the International Union for Conservation of Nature as endangered, vulnerable, or near threatened (Hutson, Mickelburgh, & Racey, 2001), and six are designated as endangered under the Endangered Species Act. Effective conservation is complicated by nocturnal behavior and flight, both of which make bats difficult study subjects. Conservation efforts are also challenged by a poor public image, propagated in part by negative portrayals in media and literature, and by the recent emergence of the highly contagious and fatal white-nose syndrome (WNS) in North America. Obtaining biological data on bats, through acoustic monitoring or other methods, is critical for informing conservation decisions (Fascione, 2010) and evaluating the efficacy of those decisions.

Citizen science has the potential to contribute timely and meaningful solutions to problems in bat conservation by providing data of high quality and quantity as well as public education. Citizen science engages members of the general public in data collection using carefully designed protocols for research projects that have typically been established by a professional research scientist, who in turn analyzes and interprets the data (Bonney et al., 2009; Droege, 2007; Weckel, Mack, Nagy, Christie, & Wincorn, 2010). Benefits associated with citizen science programs include increased cost-effectiveness of data collection (e.g., Cohn, 2008; Silvertown, 2009), increased capacity for wide-ranging and long-term population monitoring (e.g., Bonney et al., 2009; Droege, 2007), public education in science and conservation (e.g., Silvertown, 2009), and development of a constituent base for the creation of bat conservation legislation (e.g., Silvertown, 2009). Given these benefits and the conservation crisis currently facing bats, the popularity of, and need for, bat-related citizen science programs using acoustic monitoring is likely to increase.

Unfortunately, recommendations for developing, implementing, and evaluating bat-specific citizen science acoustic monitoring programs are nonexistent. Stakeholders wishing to develop such a program must draw on knowledge and practice gleaned from studies of other species. These non-bat programs likely have some applicable and appropriate elements, but none are focused directly on bat-specific issues and challenges; the limited transferability of these elements to bats may inhibit efficacy.
We advance a set of best practices for developing and implementing a bat-specific citizen science program using acoustic monitoring (i.e., identifying animals by sound transmission). Our objectives were to identify general best-practice guidelines for bat-specific citizen science and acoustic monitoring programs and to recommend a set of guidelines for designing and implementing a bat-specific citizen science acoustic monitoring program.

Methods

We used an iterative, five-step inductive qualitative approach to address our objectives (Miles & Huberman, 1994; Patton, 1980). First, we identified five sets of citizen science program development guidelines based on the following criteria: the guideline had to (a) be relevant to some aspect of an acoustic animal monitoring program; (b) have been recommended by an expert in environmental education, wildlife monitoring, and/or citizen science; and (c) be based on the classic definition of citizen science, in which volunteers collect and submit data to a scientist and are not involved in research project design.

Second, we created a cross-case display matrix (Miles & Huberman, 1994) of elements across guidelines to synthesize the extent to which programs shared certain features and configurations (Denzin, 1989; Noblit & Hare, 1983). The elements in our matrix were derived from the "Director's Guide to Best Practices Programming—Citizen Science" by the Association of Nature Center Administrators (Prysby & Super, 2007) and included six additional elements drawn from the literature to account for the extent to which programs increased scientific validity, improved protocol use, strengthened citizen-scientist volunteer contributions, and increased program longevity. Any element used by three or more guidelines (i.e., ≥60%) advanced to the next phase of analysis. We chose 60% as our cutoff because it indicates convergence (i.e., a majority) among guidelines (Patton, 1980) and offered an opportunity to build a general interpretation grounded in the findings of separate programs (Noblit & Hare, 1983).


Our third step was to identify 12 acoustic animal monitoring programs based on the following criteria: the program had to (a) incorporate volunteers collecting data for a scientist or team of scientists (i.e., the citizen scientist needed to have an active role in the scientific process); (b) aim for statistically reliable and biologically relevant data collection (via training and testing/quality control of volunteer performance); (c) make at least some attempt to evaluate the educational experience of the volunteers; (d) have been in operation for at least 2 years, including 2010, the year during which our study commenced; (e) encompass a spatial area large enough to provide statistically valid data about population changes for a given species that are useful to natural resource managers; and (f) make information about the monitoring program available online for public access.

Our fourth step was to generate a new matrix that combined the 12 acoustic animal monitoring programs with the elements advanced from the evaluation of the citizen science guidelines. Any element used by nine or more programs (i.e., ≥75%) advanced to the next phase. We chose this more conservative convergence threshold at this phase of analysis to improve confidence in our categorization system (Patton, 1980).

Finally, we conducted semi-structured exploratory interviews (Yow, 1994) via e-mail or telephone with acoustic animal monitoring program directors or management staff (n = 12; see Appendix A for the interview guide) and reviewed online grey literature to assess the completeness of our list of elements (Patton, 1980). We then synthesized information across all data sources (Table 1, program director interviews, and the grey literature review) to create a set of best practices for developing and implementing a bat-specific citizen science acoustic monitoring program.
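The two-stage convergence filter described above reduces to a simple pair of threshold tests. A minimal sketch (not code from the study) follows; the element names and guideline counts echo a few rows of Table 1, while the zero program counts merely stand in for elements dropped at stage 1 and therefore never scored against the 12 programs:

```python
# Two-stage convergence filter from the Methods (illustrative sketch).
N_GUIDELINES = 5   # citizen science program guidelines reviewed
N_PROGRAMS = 12    # acoustic animal monitoring programs reviewed

# element -> (count of guidelines using it, count of programs using it)
presence = {
    "defines research question, goals, and objectives": (4, 12),
    "identifies scope": (5, 12),
    "explains/justifies choice of species monitored": (2, 0),  # dropped at stage 1
    "tests volunteer skills": (1, 0),                          # dropped at stage 1
}

def best_practices(presence, guideline_cut=0.60, program_cut=0.75):
    """Apply the >=60% guideline filter, then the >=75% program filter."""
    advanced = {name: (g, p) for name, (g, p) in presence.items()
                if g / N_GUIDELINES >= guideline_cut}   # stage 1: 3 or more of 5
    return [name for name, (g, p) in advanced.items()
            if p / N_PROGRAMS >= program_cut]           # stage 2: 9 or more of 12

print(best_practices(presence))
```

With these four sample elements, only the two used by three or more guidelines and nine or more programs survive both filters.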

Results

A matrix comparing presence/absence of the 39 initial program elements across the five sets of citizen science program guidelines was developed. Twenty-five of the 39 elements advanced to the next phase of analysis because they were used in ≥60% of the guidelines (Table 1). Those included 75% (n = 6) of the elements from the scientific validity category, 67% (n = 4) from the educational quality category, 57% (n = 4) from the protocols category, 50% (n = 4) from the citizen scientist category, and 70% (n = 7) from the sustainability category.

A second matrix comparing presence/absence of the 25 program elements identified from the citizen science program guidelines to the 12 acoustic animal monitoring programs was developed. All 25 of these elements were identified as best practice elements for a bat-specific citizen science acoustic monitoring program because they were used in ≥75% of the programs evaluated (Table 1).

In addition to the 25 best practice elements identified in Table 1, we included five additional elements based on program director interviews and a review of the grey literature: (a) using a mobile full-spectrum continuous recording system (via a laptop computer and global positioning system [GPS] software linked to an external bat detector device) when collecting field data on bats; (b) walking transects with heterodyne detectors when collecting field data on bats; (c) creating a publicly accessible website on which to share information and data; (d) employing an inclusive volunteer recruiting program to recruit all levels of citizen scientists for field data collection; and, when possible, (e) having a unifying legal mandate or legislative foundation to make data collection for the species an imperative. Together these 30 best practice elements embody our recommendations for developing a bat-specific citizen science acoustic monitoring program.
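For concreteness, the category-level percentages reported above are simply the ratio of advanced elements to total elements in each category, rounded to the nearest whole percent. A quick sketch of that arithmetic, with the counts transcribed from the Results:

```python
# Category-level convergence from the Results:
# category -> (elements advanced, elements in category)
advanced_per_category = {
    "scientific validity": (6, 8),
    "educational quality": (4, 6),
    "protocols": (4, 7),
    "citizen scientists": (4, 8),
    "sustainability": (7, 10),
}

for category, (advanced, total) in advanced_per_category.items():
    print(f"{category}: {advanced}/{total} = {round(100 * advanced / total)}%")

# Total best practice elements identified across all categories:
print(sum(a for a, _ in advanced_per_category.values()))  # 25
```

Running this reproduces the 75%, 67%, 57%, 50%, and 70% figures and the total of 25 advanced elements.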

Table 1
Matrix comparing the presence/absence of 39 program elements across 5 citizen science program guidelines and 12 acoustic animal monitoring programs to identify best practice elements for a bat-specific citizen science acoustic monitoring program. Column A gives the number of citizen science program guidelines (out of 5) in which an element was present; elements present in ≥60% of guidelines (i.e., 3 or more) were then evaluated in Column B. Column B gives the number of acoustic animal monitoring programs (out of 12) in which an element was present; elements present in ≥75% of programs (i.e., 9 or more) were advanced as best practice elements. A dash (—) indicates an element that did not advance past Column A.

| Category | Element | A: guidelines (of 5) | B: programs (of 12) | Best practice? |
|---|---|---|---|---|
| Scientific validity | Defines research question, goals, and objectives | 4 | 12 | Y |
| | Defines stakeholders (end users) and purpose (uses) | 3 | 12 | Y |
| | Identifies scope (data quantity, time-frame, geographic area) | 5 | 12 | Y |
| | Involves experts (professional scientists create or advise) | 4 | 12 | Y |
| | Explains/justifies choice of species monitored | 2 | — | |
| | Describes data analysis and statistical technique | 3 | 11 | Y |
| | Plans for data distribution, publication, and publicity | 2 | — | |
| | Develops evaluation and/or quality assurance/quality control plan | 3 | 11 | Y |
| Educational quality | Identifies educational objectives | 3 | 12 | Y |
| | Identifies specific audience | 2 | — | |
| | Ensures objectives and audience match | 2 | — | |
| | Assesses needs/skills of volunteers | 4 | 10 | Y |
| | Develops curriculum | 3 | 12 | Y |
| | Describes, applies, and assesses evaluation strategies | 3 | 12 | Y |
| Protocols | Aligns field protocols, data sheets, and statistical tools | 4 | 12 | Y |
| | Tests protocols and data sheets | 4 | 12 | Y |
| | Describes and justifies monitoring method(s) | 3 | 12 | Y |
| | Provides multiple roles/participation levels | 2 | — | |
| | Assesses and addresses equipment and field site needs | 3 | 12 | Y |
| | Develops data submission plan | 2 | — | |
| | Uses international monitoring and data collecting/sharing methods | 2 | — | |
| Citizen scientists | Considers citizen science level of participation | 3 | 12 | Y |
| | Develops training program | 5 | 12 | Y |
| | Tests volunteer skills | 1 | — | |
| | Develops recruitment plan and job description | 3 | 12 | Y |
| | Creates/follows safety plan and risk management procedures | 4 | 12 | Y |
| | Develops advancement system and "train the trainer" for volunteers | 2 | — | |
| | Offers various levels of tasks for various levels of skill | 1 | — | |
| | Aims to attract diversity in volunteers | 2 | — | |
| Sustainability | Develops funding plan | 4 | 12 | Y |
| | Develops partnership and communication strategies | 4 | 12 | Y |
| | Develops and applies volunteer retention plan | 2 | — | |
| | Maintains institutional and staff support (facilities, supplies, equipment) | 2 | — | |
| | Documents all steps of the project | 5 | 10 | Y |
| | Applies adaptive management to entire program | 2 | — | |
| | Makes guidelines available online | 5 | 10 | Y |
| | Evaluates/measures effects | 3 | 12 | Y |
| | Ensures results inform public policy, add to scientific knowledge, and connect with the media | 4 | 12 | Y |
| | Markets program | 3 | 12 | Y |

Column A sources (5 guidelines): The Director's Guide to Best Practices Programming—Citizen Science (Prysby & Super, 2007); The Citizen Science Toolkit (Cornell Lab of Ornithology, 2007); The Managers' Monitoring Manual (U.S. Geological Survey, 2009a); Non-formal Environmental Education Programs: Guidelines for Excellence (North American Association for Environmental Education, 2004); EUROBATS—Guidelines for Surveillance and Monitoring of European Bats (Battersby, 2010).
Column B sources (12 programs): Indicator Bats Program (Jones, 2009); United Kingdom's National Bat Monitoring Programme (Bat Conservation Trust, 2009); the Pacific Northwest Bat Grid (Ormsbee, 2008); Southeastern Bat Blitz (Southeastern Bat Diversity Network, 2010); Wisconsin's Citizen Science Acoustic Bat Monitoring Program (Citizen-based Monitoring Network of Wisconsin, 2009); New York's Mobile, Citizen-Based Acoustic Bat Monitoring Program (Britzke & Herzog, 2010); FrogWatch USA (Association of Zoos and Aquariums, 2009); North American Amphibian Monitoring Program (U.S. Geological Survey, 2009b); Breeding Bird Survey (U.S. Geological Survey, 2008); Birds in Forested Landscapes (Cornell Lab of Ornithology, 2010); Bird Atlas Survey (British Trust for Ornithology, 2010); eBird (eBird, 2010).


Discussion

The best practices identified in this project could be applied at a local level or scaled up to a national level depending on conservation needs and program resources. There is considerable economic incentive for leveraging citizen scientists to inform bat conservation. For example, in 2009 scholars estimated that in excess of $45 billion USD was needed to study the effects of WNS (Fascione, 2010); citizen scientists can help offset this financial burden. In addition, such programs facilitate the rapid collection of large data sets, thus providing information to decision-makers more quickly than if the task were left entirely to scientists. The critical conservation crisis surrounding bats, however, necessitates the development and implementation of bat-specific citizen science programs to ensure optimal allocation of resources and to maximize the chances for effective conservation policy. Not only can the best practices discussed herein be used to inform the development and implementation of bat-specific citizen science programs, they can also be quantitatively or qualitatively measured as indicators of success and contribute to outcome evaluation.

It is noteworthy that only four elements were employed across all of the guidelines included in our assessment: an identified program scope, a training program for participants, documented program steps, and online availability. Although our assessment did not explore why these elements are so common, such information would be valuable to program planners and conservationists. The most commonly employed elements may represent requisites for new bat-specific citizen science acoustic monitoring programs.

In analyzing data to generate the set of bat-focused best practices, a number of elements were dropped from consideration because the majority of sampled programs did not include them. This lack of convergence does not mean that those elements are trivial, nor does it mean that they cannot be valuable performance indicators. Variance across programs, however, suggests that these elements may be peripheral and therefore less essential for program development and less central to program implementation. Future research could explore why some elements appear peripheral in some programs but not in others. For example, testing volunteer skills and offering tasks suited to various skill levels were each present in only one program guideline and so were not included in the best practices matrix. Yet acoustic monitoring involves a number of technical skills, and it would seem important to assess whether volunteers have the abilities needed to validly and reliably collect data, both before and during data collection activities. Information on volunteer skill levels would allow program leaders to better match volunteers to tasks, potentially resulting in more efficient and effective use of volunteers' time and program resources. However, this element was not common in the program guidelines assessed herein. Determining why this and other elements are peripheral (e.g., Is time limited? Are volunteers turned off by testing?) could inform researchers about potential barriers to implementing successful bat-specific citizen science acoustic monitoring programs.

Finally, we recognize that using citizen scientists in any animal monitoring project presents special challenges. Research projects must be designed with the possible limitations of citizen scientist volunteers in mind. Specific protocols for citizen scientists need to be designed and tested to determine whether the data collected by volunteers are reliable, and protocols need to be carefully limited so that volunteers are asked to collect only data within the range of what can be taught to a non-expert (Cohn, 2008).
Ormsbee (2008) also notes that the nocturnal nature of bats and the high-technology acoustic equipment used to study them can present an additional challenge for citizen scientists. Our research attempted to identify the minimum set of best practice elements for a bat-specific citizen science acoustic monitoring program, based on elements commonly identified across citizen science and acoustic animal monitoring programs. The true efficacy and validity of our recommended best practices will only be confirmed once they are applied and evaluated through new program creation and adoption. Conservation decisions, funding allocations, and conservation action priorities should all be based on the best available information, of which evaluation is a key component. The full potential of bat-specific citizen science acoustic monitoring programs for resolving the conservation crisis surrounding bats will be realized only with purposeful and deliberate evaluation. The best practices proposed herein will ideally serve as a baseline upon which such evaluation can occur.


References

Association of Zoos and Aquariums. (2009). FrogWatch USA. Retrieved from www.aza.org/frogwatch/
Bat Conservation International. (2010). Benefits of bats—Natural history of bats: Ecological and economic value. Retrieved from www.batcon.org/index.php/all-about-bats/intro-to-bats/subcategory/18.html
Bat Conservation Trust. (2009). National bat monitoring programme. Retrieved from www.bats.org.uk/pages/nbmp_reports.html
Battersby, J. (2010). Guidelines for surveillance and monitoring of European bats. Retrieved from www.eurobats.org/publications/publication_series.htm
Bonney, R., Ballard, H., Jordan, R., McCallie, I., Phillips, T., Shirk, J., & Wilderman, C. (2009). Public participation in scientific research: Defining the field and assessing its potential for informal science education (CAISE Inquiry Group Report). Washington, DC.
British Trust for Ornithology. (2010). Bird atlas United Kingdom. Retrieved from www.bto.org/birdatlas/
Britzke, E., & Herzog, C. (2010). Using acoustic transects to monitor bat population trends in the eastern United States. 2010 White-nose Syndrome Symposium, May 25–27, Pittsburgh, Pennsylvania.
Citizen-based Monitoring Network of Wisconsin. (2009). Wisconsin bat monitoring program. Retrieved from www.wiatri.net/inventory/bats/index.cfm
Cohn, J. (2008). Citizen science: Can volunteers do real research? BioScience, 58, 192–197.
Cornell Lab of Ornithology. (2007). Citizen science toolkit. Retrieved from www.birds.cornell.edu/citscitoolkit/toolkit
Cornell Lab of Ornithology. (2010). Birds in forested landscapes. Retrieved from www.birds.cornell.edu/bfl/easternbirds3.html
Denzin, N. K. (1989). Interpretive interactionism (Applied Social Research Methods Series, Vol. 16). Thousand Oaks, CA: Sage Publications.
Droege, S. (2007). Just because you paid them doesn't mean their data are any better. Retrieved from www.birds.cornell.edu/citscitoolkit/conference
eBird. (2010). Retrieved from http://ebird.org/content/ebird/about
Fascione, N. (2010). Testimony regarding FY 2011 funding to address the bat disease white-nose syndrome. Testimony to U.S. Senate Committee on Appropriations, Subcommittee on Interior, Environment, and Related Agencies, May 14, 2010. Retrieved from www.batcon.org/pdfs/whitenose/Fascione_Senate_WNS_testimony.pdf
Hutson, A. M., Mickelburgh, S. P., & Racey, P. A. (2001). Microchiropteran bats: Global status survey and conservation plan. Gland, Switzerland: International Union for the Conservation of Nature and Natural Resources.
Jones, K. (2009). Monitoring bat biodiversity: Indicators of sustainable development in Eastern Europe: Indicator bats program (Darwin 15/033 Final Report, May 2006–2009).
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage Publications.


Noblit, G. W., & Hare, R. D. (1983). Meta-ethnography: Issues in the synthesis and replication of qualitative research. Paper presented at the annual meeting of the American Educational Research Association, Montreal.
North American Association for Environmental Education. (2004). Non-formal environmental education programs: Guidelines for excellence. Retrieved from www.naaee.org/programs-and-initiatives/guidelines-for-excellence/materials-guidelines/nonformal-guidelines
Ormsbee, P. C. (2008). The bat grid: A unique approach to reliable data. BATS Magazine, 26, 1.
Patton, M. Q. (1980). Qualitative evaluation methods. Thousand Oaks, CA: Sage Publications.
Pierson, E. D. (1998). Tall trees, deep holes, and scarred landscapes: Conservation biology of North American bats. In T. H. Kunz & P. A. Racey (Eds.), Bat biology and conservation (pp. 309–325). Washington, DC: Smithsonian Institution Press.
Prysby, M., & Super, P. (2007). Director's guide to best practices: Programming citizen science. Logan, UT: Association of Nature Center Administrators.
Silvertown, J. (2009). A new dawn for citizen science. Trends in Ecology and Evolution, 24, 467–471.
Southeastern Bat Diversity Network. (2010). Bat blitz. Retrieved from www.sbdn.org/hosting_blitz.html
U.S. Geological Survey. (2008). North American breeding bird survey. Retrieved from www.mbr-pwrc.usgs.gov/bbs/
U.S. Geological Survey. (2009a). Managers' monitoring guide: How to design a wildlife monitoring program. Retrieved from www.pwrc.usgs.gov/monmanual/
U.S. Geological Survey. (2009b). North American amphibian monitoring. Retrieved from www.pwrc.usgs.gov/naamp/
Weckel, M. E., Mack, D., Nagy, C., Christie, R., & Wincorn, A. (2010). Using citizen science to map human–coyote interaction in suburban New York, USA. Journal of Wildlife Management, 74, 1163–1171.
Yow, V. R. (1994). Recording oral history: A practical guide for social scientists. Thousand Oaks, CA: Sage Publications.


Appendix A

Questions posed to program directors or management staff (n = 12) in 2010 to confirm and clarify best practices associated with citizen science programs and/or acoustic monitoring programs.


Questions:
What elements of [program name] have been most effective and why?
Are there any "lessons learned" from [program name] that may have led to design improvements in [program name]?
Are there any documents about the development process for [program name] that you would be willing to share? In particular, I am interested in statistical design, data management, and volunteer recruitment, training, and retention.
What are you most proud of in the [program name]?
Is the [program name] linked to the Indicator Bats Program in Europe? If so, what kind of changes have you made to adapt to the USA?
What's contributed most to the success of the [program name]?
Any memorable disasters associated with implementation of [program name]?
Has anything happened that merited significant redesign of [program name]?
What would you NOT do again if you redesigned [program name]?
How has the challenge of any alleged "unreliability of volunteer data" been met in [program name]?
How has data standardization been built in to [program name]?
How are volunteers recruited and retained in [program name]?
Are volunteers tested on knowledge/improvements in [program name]?
What kind of bat detector do you use in [program name] and why?
What kind of software do you use for sound analysis in [program name] and why?
