Extension Education Society (EES)

Vol. 24    No. 3    July - September 2012

Constraints in the Adoption of Ber Production Technology
Perception of Agripreneurs towards Nodal Training Institute
Adoption of IPM Practices
Drip Irrigation - A Waterwise Approach
Impact of IPM Training Centre
Marketing Behaviour of Tribal Women

Published by
Extension Education Society (EES)
Tamil Nadu Agricultural University Campus
Coimbatore - 641 003, Tamil Nadu

Journal of Extension Education Vol. 24 No. 3, 2012

CONTENTS

RESEARCH ARTICLE

1. Constraints in Adoption of Ber Production Technology in Arid Zone of Rajasthan
   M.I. Meena and Dheeraj Singh ... 4857

2. Socio Economic Correlate with the Perceived Effectiveness of Indigenous Tribal Agricultural Practices
   P. Venkatesan and M. Sundaramari ... 4864

3. Discrimination Problems Faced by Women Workers in Unorganized Sector of Ludhiana District of Punjab
   Paramdeep Kaur and Kanwaljit Kaur ... 4868

4. Drip Irrigation - A Waterwise Approach
   T. Rathakrishnan and S.R. Padma ... 4875

5. Perception of Agripreneurs towards Nodal Training Institutes
   R. Venkattakumar, P. Chandarshekara and B.S. Sontakki ... 4880

6. Assessing the Impact of an Integrated Pest Management Training Course in Jharkhand
   S.K. Kriesemer, R. Srinivasan, M. Ravishankar and B. Bhushan ... 4886

7. Adoption of IPM Practices by Vegetable Growers in Karnataka
   B. Narayanswamy, Saju George and M.R. Hegde ... 4892

8. Marketing Behaviour of Tribal Women on NWFPs
   S. Devika, P. Balasubramaniam and N. Suganthi ... 4897

RESEARCH NOTE

1. Level of Aspiration of the Goat Farmers
   Braj Mohan, Ramji Lal Sagar and Khushyal Singh ... 4903

2. Socio-Economic Characteristics of Goat Farmers and Awareness of Goat Production Practices
   Braj Mohan, Ramji Lal Sagar and Khushyal Singh ... 4905


Assessing the Impact of an Integrated Pest Management Training Course in Jharkhand

S.K. Kriesemer1, R. Srinivasan2, M. Ravishankar3 and B. Bhushan4

1- Food Security Center (FSC), University of Hohenheim, Germany; 2- AVRDC - The World Vegetable Center, Shanhua, Tainan 74199, Taiwan; 3- AVRDC - The World Vegetable Center South Asia, Birsa Agricultural University, Ranchi 834 006, Jharkhand, India; 4- Vigyan Prasar, Noida 201 309, Uttar Pradesh, India

ABSTRACT

The learning that takes place during the implementation of projects, particularly during training courses, is comparatively easy to measure, and is at the same time a necessary precondition for change in behavior. To assess the immediate impact of a training course held in Khunti, Jharkhand, the participants' gain in knowledge and skills was evaluated at the end of the course, and their level of retention was assessed after six months. Eighteen trainees were evaluated for learning, and 13 trainees were assessed for retention of learned knowledge. Respondents' answers were rated with one or minus one point for correct and incorrect answers, respectively, and the overall results were computed as the average achievement of trainees, expressed as a percentage. Fourteen trainees were farmers, of whom seven also acted as community service providers for an NGO; four trainees were NGO staff. Regarding their knowledge before and after the training course, correctly answered questions increased by 18 percentage points and incorrectly answered questions decreased by two percentage points. At the follow-up workshop, 53% of questions were answered correctly, but the share of incorrect responses increased by 15 percentage points to 39%. Results for knowledge retention after six months showed that correct answers increased by 16 percentage points compared to before the training. In future training programs, efforts should be made to carefully address incorrect knowledge to avoid the spread of incorrect information.

Evaluating learning is important because without learning, no change can occur (Kirkpatrick and Kirkpatrick 2008). Kirkpatrick first published his often-cited four-level evaluation model in 1959; the levels are reaction, learning, behavior, and result (Kirkpatrick and Kirkpatrick 2008). In agricultural development, it is difficult to evaluate the impact of a project or program after completion, due to the attribution gap (GTZ 2004) and due to the multitude of research methodologies to choose from (Maredia et al. 2000). Impact research often aims to quantify changes at the behavior and/or result levels. The learning that takes place (or not) during the implementation of projects, particularly during training courses, is comparatively easy to measure and is at the same time a necessary precondition for any change in behavior (e.g., adoption of new technologies, change in practice, change in attitude) to take place.




A modification of Kirkpatrick's model is Scriven's (2010) Training Evaluation Checklist, which adds the needs assessment before the intervention and the retention of knowledge after it as two aspects (among others) to be considered when planning and evaluating training courses. Knowing the initial level of knowledge of trainees is important, not only to better address the participants' specific training needs, but also to compare it with their level of knowledge after the training. The difference in knowledge before and after the training is what was learned. The knowledge present at a later stage (some months after the training course) is knowledge that has been retained and has the potential to be used in practice.

AVRDC - The World Vegetable Center aims to contribute to safer and more sustainable vegetable production by developing and promoting integrated pest management (IPM) solutions in rural areas of Jharkhand. A training course on IPM practices was conducted in August 2010, with a follow-up workshop in February 2011. To assess the immediate impact, the gain in knowledge and skills of participants at the end of the course and their level of retention after six months were evaluated.

METHODOLOGY

Twenty-nine participants joined the first day of the training course and took part in the pre-training evaluation of knowledge and practices. Of these participants, 18 followed the entire training until the end of the second day and completed the post-training evaluation; due to time constraints, not all participants were able to attend the full two-day course. Thirteen trainees participated in a follow-up workshop about six months later. The results presented in this report refer to the sub-group of 18 participants for the evaluation of learning, and to the sub-group of 13 participants for the evaluation of retention of learned knowledge and skills.

The pre-training evaluation questionnaire contained questions on the demographic characteristics of the respondents (7 questions), on respondents' knowledge (41) and on pest control practices (9). Respondents' knowledge was assessed using multiple-choice questions with four answer options and the option "I do not know". Respondents' answers were rated with one or minus one point for correct and incorrect answers, respectively. Questions asking for a set of answers (e.g., "List 4 methods ...") were rated with one point for each sub-answer. The response "I do not know" and questions left blank were rated with zero. Trainees had no time constraint to complete the test; it can therefore be assumed that questions left blank were ones to which trainees did not know the answer. The possible range of points achievable in the pre- and post-training test reached from -54 points for only wrong answers to +54 points for 100% correct answers. The follow-up test contained identical but fewer questions and ranged from -35 to +35 points; some questions were deleted from the original set in order to save time. For easy comparison of the three rounds of tests, the overall results were computed as a percentage of the average achievements of participants.
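A minimal sketch of this scoring scheme is given below in Python. It is illustrative only; the response category labels, data structures and function names are assumptions, not taken from the paper.

```python
from collections import Counter

# Illustrative sketch of the scoring scheme described above: +1 for a correct
# answer, -1 for an incorrect answer, 0 for "I do not know" or a blank question.
# Category labels and helper functions are assumptions made for illustration.

MAX_POINTS = 54  # 54 ratable (sub-)answers in the pre- and post-training test

def score_test(responses):
    """Score one trainee's test; responses are labels such as 'correct',
    'incorrect', 'unknown' or 'blank', one per ratable (sub-)answer."""
    counts = Counter(responses)
    points = counts["correct"] - counts["incorrect"]
    shares = {k: round(100 * counts[k] / len(responses), 1)
              for k in ("correct", "incorrect", "unknown", "blank")}
    return points, shares

def overall_result(all_tests, max_points=MAX_POINTS):
    """Overall result of a test round: the trainees' average score expressed
    as a percentage of the maximum achievable points."""
    mean_score = sum(score_test(t)[0] for t in all_tests) / len(all_tests)
    return round(100 * mean_score / max_points, 1)

# Example: a trainee with 30 correct, 13 incorrect, 6 "I do not know" and
# 5 blank answers scores 30 - 13 = 17 points out of a possible 54.
example = ["correct"] * 30 + ["incorrect"] * 13 + ["unknown"] * 6 + ["blank"] * 5
print(score_test(example))  # (17, {'correct': 55.6, 'incorrect': 24.1, ...})
```

Under this scheme, the reported pre-training mean of 6.2 points corresponds to an overall result of about 11.5% of the 54 achievable points, which is how the percentages in Table 2 are obtained.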



FINDINGS AND DISCUSSION

Trainee Characteristics

The training group was composed of 16 men and two women between 20 and 38 years of age, with an average age of 28 years. Most respondents (11) had a matriculate or intermediate level of education. Six had a bachelor's, master's or professional degree or held a diploma. One participant indicated being "non-matric". Seven trainees were farmers, seven were farmers as well as community service providers for a local NGO, and four trainees were NGO staff (three field executives and one lab assistant). Only four trainees had previously attended a training course on integrated pest management, and three had received prior training on the safe use of pesticides. These former training courses had been held between seven months and three years earlier.

Pest control practices before the training

Fourteen respondents indicated that they used synthetic pesticides on the vegetables they grew, while three did not use synthetic pesticides and one trainee did not respond. All except two trainees responded that they usually followed the instructions given on the pesticide label. The stages at which trainees applied pesticides are given in Table 1.

Table 1. Pesticide Application Stages Chosen by Trainees (n=18)

Stage of pesticide application                                                   No. of trainees
Upon noticing few insects, when they start damaging the plants                        15
Upon noticing few insects, even without damage or minor disease symptoms               9
Upon noticing many insects or more disease symptoms                                    8
Preventive, as a routine, well before the insects or disease symptoms are seen         7

Most respondents (14) judged that the pesticides they used were effective, whereas four thought that they were only moderately effective. No trainee judged pesticides as ineffective.

Ten respondents did not use any crop protection method other than synthetic pesticides, and mentioned the lack of knowledge about alternatives as a reason. Seven trainees indicated using other methods, and listed the use of ashes (3), pongamia leaves (2), a mixture of jaggery, cow dung and cow urine (2), neem leaves, pheromone traps, cow dung, and mosquito nets (one each). One trainee even indicated burning tires close to the field.

The most commonly consulted sources of information about alternative crop protection methods are extension agents from NGOs (6), family members or relatives (3), pesticide shopkeepers (3), and government extension agents or staff of the agricultural college (one each).


Other sources mentioned are the AVRDC - The World Vegetable Center project office, books, radio, TV (Krishi Darshan, an agricultural telecast) and an organic agricultural scientist (one each).

Almost all trainees (15) indicated visiting their fields regularly to monitor the crops: one trainee visited the field daily, three every second day, seven every three days or twice a week, and four once a week. The most commonly cited monitoring practice is visual observation of crop damage (11), followed by visual observation of insects (4) and counting damaged plants or plant parts (4); three respondents combine the latter practice with visual observation.

After a pest control intervention, eight trainees indicated they were satisfied when they saw a reduction in crop damage a few days after the intervention. Four trainees indicated they wanted to see dead insects within a few hours of the intervention, while three trainees each were satisfied when they saw dead insects on the following day or a few days after the intervention. As a follow-up, nine trainees continued monitoring, nine others used another intervention when necessary, and five indicated repeating the same intervention but using a different product. Only one trainee would repeat the intervention using the same product.

Evaluation of learning: level of knowledge before and after the training

Overall, the group of trainees achieved a mean of 6.2 points (SD 7.16) in the pre-training test, with a minimum of -7.5 points for the poorest test and a maximum of 18 out of 54 points for the best test. Tests contained an average of 38% correct answers and 26% wrong answers (Table 2). On average, for 14% of the questions respondents indicated that they did not know the answer, and 22% of the questions remained unanswered (blank). After the training, the average test result was 16.6 points (SD 7.18). The poorest test achieved two points and the best test achieved 30 points out of 54. On average, 55% of the questions were answered correctly and 24% were answered incorrectly. Respondents indicated being uncertain about the answer for 8% of the questions and did not respond to 12% of the questions.

Table 2. Comparison of Overall Test Results (in %), Before and After the Training, and at the Follow-up Workshop (n=18)

                        Before     After     Follow-up
Overall end result        11.5      30.7          13.7
Correct answers           37.7      55.3          53.2
Incorrect answers         26.0      24.2          38.7
"I do not know"           14.0      12.4           5.3
Blank questions           22.3       8.0           2.9



The gain between the pre- and post-training knowledge tests is 10.4 points on average. Correctly answered questions increased by 18 percentage points and unanswered questions reduced by 14 percentage points. At the same time, incorrectly answered questions and cases where respondents indicated uncertainty reduced by only two percentage points each.

Evaluation of retention: level of knowledge six months after the training

At the follow-up workshop, the overall average test result of trainees was 4.8 points (SD 3.6) out of 35. Even though 53% of questions were still answered correctly, the share of incorrect responses increased by 15 percentage points to 39%. Five and three percent of responses were unknown or not answered (remained blank), respectively. Looking at the knowledge status six months after the training, results showed that correct answers increased by 16 percentage points compared to before the training, but decreased by 2 percentage points compared to immediately after the training. While unknown responses and blank questions reduced by 9 and 19 percentage points, respectively, the share of incorrect answers increased by 13 percentage points six months after the training compared to before the training.
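The percentage-point changes reported above can be recomputed directly from Table 2. The following Python sketch is illustrative only; the dictionary layout and helper function are assumptions, while the numerical values are copied from Table 2.

```python
# Recomputing the reported percentage-point changes from Table 2 (values copied
# from the table; the data structure and helper are illustrative assumptions).

table2 = {
    "correct":   {"before": 37.7, "after": 55.3, "follow_up": 53.2},
    "incorrect": {"before": 26.0, "after": 24.2, "follow_up": 38.7},
    "unknown":   {"before": 14.0, "after": 12.4, "follow_up": 5.3},
    "blank":     {"before": 22.3, "after": 8.0,  "follow_up": 2.9},
}

def pp_change(category, start, end):
    """Change of one answer category between two test rounds, in percentage points."""
    return table2[category][end] - table2[category][start]

print(f"{pp_change('correct', 'before', 'after'):+.1f}")      # +17.6, reported as +18
print(f"{pp_change('blank', 'before', 'after'):+.1f}")        # -14.3, reported as -14
print(f"{pp_change('correct', 'before', 'follow_up'):+.1f}")  # +15.5, reported as +16
print(f"{pp_change('incorrect', 'after', 'follow_up'):+.1f}") # +14.5, reported as +15
```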

CONCLUSION

Overall, trainees' knowledge about IPM was enhanced by the training course. The increase in correctly answered questions immediately after the training was largely at the expense of unanswered questions; this clearly shows that knowledge gaps were closed by the training. Even though the share of correctly answered questions decreased slightly six months after the training, one could still conclude that learning and retention had taken place. The change in questions answered incorrectly, or where trainees expressed uncertainty, was smaller than the change in correct answers before and after the training. These results confirm that it is difficult to change and correct already existing incorrect knowledge (Chi 2008).

The change in the share of incorrect answers between the training and the follow-up workshop was striking and influenced the overall end result substantially. After a reduction of incorrect answers at the end of the training course, the number of wrong answers increased substantially at the follow-up workshop, while unknown responses and questions left blank reduced consistently. The reasons for this development are unknown and require further research on the local circumstances (origin of the knowledge, other training courses) and on respondents' confidence in their IPM knowledge.

Although the increase in correct answers by 16 percentage points (before vs. six months after the training) is impressive, the change in incorrect answers proved that the more conservative approach of evaluating the overall training impact followed in this study (wrong answers count negatively) was appropriate. In future training programs, efforts should be made to carefully address and discuss incorrect knowledge among farmers, community service providers, and NGO field executives to clarify any misunderstandings and to avoid the spread of incorrect information. Whether trainees applied their knowledge in practice and changed their crop protection practices still needs to be assessed.


The systematic use of training evaluation for all courses offered by AVRDC - The World Vegetable Center will show whether the results are situation- and site-specific or whether similar patterns can be found elsewhere.

REFERENCES

Chi MTH. 2008. Three Types of Conceptual Change: Belief Revision, Mental Model Transformation, and Categorical Shift. In: Vosniadou S (Ed), International Handbook of Research on Conceptual Change. pp 61-82. Routledge, New York, USA.

GTZ 2000. Results-based Monitoring: Guidelines for Technical Cooperation Projects and Programmes. Gesellschaft fuer Technische Zusammenarbeit (GTZ) GmbH, Eschborn, Germany. 23 p.

Kirkpatrick DL and Kirkpatrick JD. 2008. Evaluating Training Programs: The Four Levels. Berrett-Koehler Publishers, San Francisco, USA. 274 p.

Maredia M, Byerlee D and Anderson J. 2000. Ex Post Evaluation of Economic Impacts of Agricultural Research Programs: A Tour of Good Practice. Paper presented to the Workshop on "The Future of Impact Assessment in CGIAR: Needs, Constraints, and Options", Standing Panel on Impact Assessment (SPIA) of the Technical Advisory Committee, May 3-5, Rome, Italy. 39 p.

Scriven M. 2010. The Evaluation of Training. p 14. http://michaelscriven.info/images/EVAL_of_TRAINING2F.PDF (accessed on July 28, 2011)