Journal of Emerging Trends in Engineering and Applied Sciences (JETEAS) 3(5): 779-785 © Scholarlink Research Institute Journals, 2012 (ISSN: 2141-7016) jeteas.scholarlinkresearch.org

Development and Validation of Instrument for Assessing Practical Skills in Building Electronics Systems in Nigerian Technical Colleges

1Okwelle P. Chijioke and 2K. R. E. Okoye

1Department of Science and Technical Education, Rivers University of Science and Technology, Port Harcourt, Nigeria.
2Department of Vocational Education, Nnamdi Azikiwe University, Awka, Nigeria.
Corresponding Author: Okwelle P. Chijioke
___________________________________________________________________________
Abstract
An electronics systems skill assessment scale (ESSAS), aimed at improving the assessment of students' performance in the construction of simple electronics systems, was developed and validated in this study. The study answered one research question and tested two hypotheses. Electronics teachers from technical colleges in the South-South zone of Nigeria were the sample used for item validation of the ESSAS, which was then tried out on students of technical colleges. Data collected were analyzed using the statistical mean, Cronbach's alpha, the t-test and one-way analysis of variance (ANOVA). The results showed that 61 practical skills were found appropriate for the ESSAS, and the instrument was found to possess a high reliability of 0.87. It was therefore recommended, amongst others, that electronics teachers in technical colleges and similar institutions in Nigeria be made aware of, and use, the ESSAS for assessing students' performance in the construction of simple electronics systems.
__________________________________________________________________________________________
Keywords: development; validation; assessment; practical skill; electronics systems.
__________________________________________________________________________________________
INTRODUCTION
In the field of technical and vocational education, practical skill activities form a major part of performance-based instruction. Denga (1987) underscored this view and stated that most activities in technical and vocational education require practical skills to carry them out; as such, assessing whether students have acquired those practical skills requires laboratory-based performance tests. According to Khan (2007), a performance test requires examinees to perform a task instead of answering questions. Therefore, the major feature of a performance test is that a criterion situation, such as a job, is simulated in workshops or laboratories to a relatively high degree (Okwelle, 2003).

Assessment of practical skills may involve process evaluation, product evaluation, or some combination of both (Okeke, 2004). Process assessment requires attentive and consistent teacher observation and rating of student performance. On the other hand, product or outcome assessment involves the teacher objectively judging the quality of the finished product. Both forms of assessment are accomplished by the teacher through some type of observational technique, such as checklists, rating scales and recording sheets used for rating the students' performances (Ojoko, 2000). Since assessment ranks highly among the activities of the teaching-learning process, it is the task of the teacher to construct valid and reliable tests for appraising practical skills in technical and vocational subjects.

Technical colleges in Nigeria offer technical and vocational education programmes for the purpose of producing the middle-level skilled manpower required for the nation's economic and technological development (Federal Republic of Nigeria [FRN], 2004). The National Technical Certificate (NTC) is awarded by the National Business and Technical Examinations Board (NABTEB) to students who have completed their post-primary education at technical colleges (NABTEB, 2004). The NTC curriculum for the radio, television and electronics work trade, among others, is aimed at training skilled technical manpower equipped with the necessary technical knowledge and practical skills for the design and construction of simple electronic systems. To this end, students are required to carry out practical activities following logical steps before arriving at the final stage of the task. Consequently, each step and the final stage or finished product is to be assessed comprehensively and systematically by the teacher if the objective of the training is to be achieved. Evidence from research studies (Bukar, 2006; Chejile, 2006; Garba, 1993; Okwelle, 2003) indicates that the popular method of assessing students'

practical skills in technical and vocational education programmes, including the Radio, Television and Electronics Work trade in Nigerian technical colleges, is mere inspection of the students' finished products by the examiners, with little or no attention to the processes involved in carrying out the practical work. Marks are then awarded based on what the teacher or examiner feels the student deserves. This practice is considered biased and subsumes the award of grades that reflect individual examiners' feelings (Okwelle, 2011). NABTEB does use a marking-scheme checklist to assess students' performance in the practical components of NTC examinations, but this scheme merely highlights the major skills to be rated and lacks details of the various stages of the specific skills involved in carrying out the given task. The implication is that the scores and grades examiners assign to students' practical work may not be truly representative of their performances.

MATERIALS AND METHODS
Instrumentation research was employed in this study. The instrumentation design is appropriate when introducing new procedures, technologies or instruments for educational practice (Gay, 1996). The area of the study was the South-South zone of Nigeria, comprising Akwa Ibom, Bayelsa, Cross River, Delta, Edo and Rivers States. There were two target populations: 41 electronics teachers and 287 final year students identified from the departments of Radio and Television in all 20 technical colleges accredited by the National Board for Technical Education (NBTE) to run the NABTEB programme with specialization in Radio, Television and Electronics Works in the South-South zone. All 41 Radio and Television teachers were used in the study for the purpose of validating the ESSAS; no sampling was done because the number was small and manageable. Of the 41 teachers, nine were from federal technical colleges and 32 from 16 state technical colleges. Nine of the teachers were from schools in Akwa Ibom State, two from Bayelsa State, nine from Delta State, four from Edo State, eight from Cross River State and nine from Rivers State. Also, as part of the sample, 38 final year students of the departments of Radio and Television were purposively sampled from two of the 20 colleges; these two schools were chosen because they had all the models of equipment, materials and tools necessary for implementing the test. This sample was used for the try-out of the validated ESSAS to ascertain its initial reliability.
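As a quick arithmetic check, the sample breakdown above can be verified for internal consistency: the ownership split (federal versus state) and the per-state counts should each cover the same 41 teachers. A minimal sketch:

```python
# Sanity-check the teacher-sample breakdown reported in the study.
by_ownership = {"federal": 9, "state": 32}
by_state = {"Akwa Ibom": 9, "Bayelsa": 2, "Delta": 9,
            "Edo": 4, "Cross River": 8, "Rivers": 9}

# Both breakdowns should account for all 41 teachers in the census.
print(sum(by_ownership.values()) == 41 == sum(by_state.values()))  # → True
```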

In this context, there is a need to improve the standard of assessment in the Radio, Television and Electronics Work trade by using valid and reliable assessment instruments that take account of the processes of the practical activities leading to completion of the final practical products. However, the literature available to the researchers indicates that no such instrument is in use in Nigerian technical colleges. This scarcity of standard instruments for assessing practical skills in the construction of electronics systems at technical colleges impelled this study. The purpose of this study, therefore, was to develop and validate an instrument for assessing students' practical skills in the design and construction of simple electronics systems at the technical college level.

The instrument developed in this study is the "Electronics System Skill Assessment Scale" (ESSAS). Based on the suggestions of Benson and Clark (1982), Cluzeau (2002), Orion et al. (1996) and Samarakkody et al. (2010), the procedure adopted in developing the ESSAS was a multi-stage approach, shown in Figure 1.

RESEARCH QUESTION
The following research question guided the study:
1. What practical skills are considered appropriate for inclusion in the Electronics System Skill Assessment Scale (ESSAS) for assessing students' practical skills in building electronics systems?

The instrument developed will assess students' practical skill performance in building electronics systems in technical colleges in Nigeria. Following a detailed review of the NABTEB curriculum for the award of the National Technical Certificate (NTC) in Radio, Television and Electronics Works, building simple electronics systems was identified as a major practical skill area for assessment in the NABTEB curriculum. Next, six performance objectives relating to this skill area were isolated from the curriculum. Based on a critical review of relevant literature, these objectives were transformed into six basic task statements. To ensure that all basic task areas and the various levels of behavioural objectives were adequately covered, a two-way table of specifications was developed: the horizontal axis lists the six basic task statements (content areas) and the vertical axis lists the six levels of the psychomotor domain of Padelford (1984).

HYPOTHESES
The following null hypotheses were tested at the five percent level of significance:
1. There is no significant difference between electronics teachers of federal and state government technical colleges on appropriate skills for inclusion in the Electronics System Skill Assessment Scale (ESSAS) (P < 0.05).
2. There are no significant differences among electronics teachers across the six states regarding the appropriateness of practical skills for inclusion in the Electronics System Skill Assessment Scale (ESSAS) (P < 0.05).

Figure 1: Flowchart for ESSAS Development
State the purpose of the instrument → Review related literature, including the NABTEB curriculum → Identify major practical skill areas for assessment → Isolate performance objectives → Derive basic task statements from the objectives → Develop table of specifications → Select Likert five-point rating scale → Generate practical skill test items → Write test form → Preliminary face and content validation of test form → Revise test form to produce ESSAS → Pilot-test ESSAS → Revise test items in ESSAS → Administer revised ESSAS for content validation → Assemble final ESSAS → Try out final ESSAS assembly

The six basic task areas were further analyzed, based on available literature, to generate 65 practical skill items (specific task statements), which were matched with the appropriate cells of the table of specifications. At this stage, the test items were written by expressing the extent of appropriateness of performing each of the 65 specific task statements on a five-point Likert rating scale with response options of Highly Appropriate, Appropriate, Moderately Appropriate, Inappropriate and Highly Inappropriate, assigned values of 5, 4, 3, 2 and 1 respectively, to form the initial copy of the assessment instrument. Preliminary face and content validation of the draft instrument was carried out by a panel of seven experts: three lecturers in Technical Education, two lecturers in Measurement and Evaluation, and two college electronics teachers. Following the comments of these experts, an instrument consisting of six basic task areas and 61 practical skill items, named the "Electronics System Skill Assessment Scale" (ESSAS), was assembled. The draft ESSAS was pilot-tested on 15 final year students who were not part of the study sample, for the purpose of estimating initial reliability. The internal consistency of the instrument was determined by calculating Cronbach's alpha, which yielded an overall instrument reliability of 0.81. This value exceeded the criterion of 0.70 accepted for statistical consideration (Nunnally, 1978). Cronbach's alpha was chosen for determining the reliability coefficient because the instrument had many test items arranged in clusters (Trochim, 2006).

Based on the results of the pilot study, the wording of the draft ESSAS was revised to produce the test form for further validation. To determine the practical skills appropriate for inclusion in the final test, copies of the preliminarily validated ESSAS were administered to the 41 technical teachers in the Radio and Television departments of the 20 technical colleges running the NABTEB programme across the South-South states of Nigeria. The instrument was arranged in two parts: Part I sought demographic information about the respondents, while Part II had 61 items dealing with skills in building electronics systems. A five-point scale of Highly Appropriate (HA), Appropriate (A), Moderately Appropriate (MA), Inappropriate (IA) and Highly Inappropriate (HI), with corresponding assigned values of 5, 4, 3, 2 and 1, was maintained against each of the practical skill statements.

All 41 administered copies of the draft ESSAS were returned, found to be valid, and used in the study. None of the six basic tasks or 61 practical skill items was dropped, as all were rated above "Moderately Appropriate". The result of this exercise was used to assemble the final form of the ESSAS, with new rating options of Excellent, Very Good, Good, Fair and Poor, assigned values of 5, 4, 3, 2 and 1 respectively. This final version of the ESSAS was tried out on 38 RTV students from one technical college each in Bayelsa and Rivers States. The internal consistency reliability of the instrument was ascertained using Cronbach's alpha, giving an overall reliability coefficient of 0.87. Data for answering the research question were analyzed using the mean and standard deviation. To select the appropriate tasks and practical skills for inclusion in the ESSAS, a mean cut-off of 3.00 (Moderately Appropriate) was chosen: any practical skill with a mean score of 3.00 and above was considered appropriate, while any with a score below 3.00 was considered inappropriate.

Null hypothesis one was tested at the five percent level of significance using the Student's t-test, and one-way analysis of variance (ANOVA) was used to test the second null hypothesis. For null hypothesis one, if the calculated t-value was greater than or equal to the critical t-value at the five percent level of significance, the null hypothesis was rejected; otherwise it was accepted. Similarly, if the calculated F-value was equal to or greater than the F-table value at the five percent level of significance, null hypothesis two was rejected; if the calculated F-value was less than the F-table value, it was accepted. All statistical analyses were performed with the Statistical Package for the Social Sciences (SPSS).
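The internal-consistency estimates quoted above (0.81 for the pilot, 0.87 for the try-out) come from Cronbach's alpha, which can be computed directly from a respondents-by-items matrix of ratings. A minimal sketch in Python; the rating data shown are illustrative, not the study's:

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of ratings.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(scores[0])                      # number of items
    items = list(zip(*scores))              # transpose: one tuple per item
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Illustrative: 4 respondents x 3 items on the study's 5-point scale.
ratings = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [4, 3, 4],
]
print(round(cronbach_alpha(ratings), 3))  # → 0.675
```

As in the paper, a coefficient of 0.70 or higher is commonly taken as acceptable (Nunnally, 1978); the same formula applied to the actual 61-item response matrix would reproduce the reported 0.81 and 0.87.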

RESULTS AND DISCUSSIONS
Research Question
What practical skills are considered appropriate for inclusion in the Electronics System Skill Assessment Scale (ESSAS) for assessing students' practical skills in building electronics systems?

Table 1: Mean Ratings of Practical Skills for Building Electronics Systems

S/n | Practical skill | Mean | SD | Remark

Designing Skills in Building Electronics Systems
1 | Block diagram of the project is correct | 4.48 | 0.51 | Appropriate
2 | Function of each block diagram is clearly stated | 4.21 | 0.72 | Appropriate
3 | Schematic diagram is correct | 4.43 | 0.67 | Appropriate
4 | Schematic diagram fully labeled with part numbers | 4.29 | 0.78 | Appropriate
5 | Design is adequately explained with relevant electronics principles | 4.07 | 0.75 | Appropriate
6 | Design is feasible based on principles to suit functionality | 3.53 | 0.67 | Appropriate
7 | Design of casing is suitable for the project | 3.82 | 0.66 | Appropriate
8 | Design is feasible with respect to materials | 3.63 | 0.85 | Appropriate
9 | Working drawing of the casing is correct | 4.00 | 0.94 | Appropriate
10 | Important dimensions of the casing are clearly included on the working drawing | 3.80 | 0.71 | Appropriate
11 | Functions of components for the design are clearly stated | 3.65 | 1.08 | Appropriate
12 | Constructional problems are clearly stated | 3.36 | 1.29 | Appropriate

Planning Skills in Building Electronics Systems
13 | Total cost estimate of project provided | 4.14 | 1.03 | Appropriate
14 | Components obtained are of correct specifications | 4.46 | 0.71 | Appropriate
15 | Right materials for the casing are obtained | 3.61 | 0.48 | Appropriate
16 | Materials for the casing are well prepared | 3.61 | 0.80 | Appropriate
17 | Right tools for the work are identified | 4.02 | 0.47 | Appropriate
18 | Right measuring instruments for the work are identified | 4.24 | 0.69 | Appropriate
19 | Vero board provided is of correct size | 4.12 | 0.84 | Appropriate
20 | Correct circuit layout is produced on Vero board | 4.36 | 0.62 | Appropriate
21 | Provisions for component substitutes are made | 3.70 | 0.92 | Appropriate
22 | Relevant safety measures are clearly identified | 4.04 | 0.80 | Appropriate

Constructing Skills in Building Electronics Systems
23 | All active components are correctly fixed on Vero board | 4.05 | 0.74 | Appropriate
24 | All passive components are correctly fixed on Vero board | 3.97 | 0.65 | Appropriate
25 | Soldering of parts is neat and bright | 4.24 | 0.69 | Appropriate
26 | Jump wires used are correctly fixed | 4.00 | 0.44 | Appropriate
27 | Protective devices are correctly fixed | 4.41 | 0.54 | Appropriate
28 | Provision for heat outlet is made | 4.14 | 0.92 | Appropriate
29 | Appropriate holes for fixing external fittings (e.g. switches, control indicators) are provided | 4.21 | 0.65 | Appropriate
30 | Circuit is well grounded | 3.73 | 0.61 | Appropriate
31 | Component layout on the Vero board is well spaced | 3.73 | 1.02 | Appropriate
32 | Underside layout of Vero board is neat | 3.61 | 0.73 | Appropriate
33 | Materials are well economized | 3.36 | 0.88 | Appropriate
34 | Tools handled skillfully | 4.00 | 0.63 | Appropriate
35 | Equipment operated skillfully | 4.07 | 0.84 | Appropriate
36 | Student worked independently with little or no assistance from the teacher/instructor | 3.19 | 0.95 | Appropriate
37 | Safety measures are strictly adhered to during construction | 4.14 | 1.03 | Appropriate
38 | Project is completed within reasonable time | 4.09 | 0.43 | Appropriate

Assembling Skills in Building Electronics Systems
39 | All component parts are properly fixed on the base board or plate | 4.04 | 0.44 | Appropriate
40 | Base board and circuit are well grounded | 3.92 | 0.81 | Appropriate
41 | Link wires between circuits and component parts are of correct length | 3.78 | 0.52 | Appropriate
42 | All external fittings are properly fixed | 4.09 | 0.30 | Appropriate
43 | All joints are firm | 4.36 | 0.53 | Appropriate
44 | Appropriate jack socket outlets are provided | 4.09 | 0.70 | Appropriate
45 | Power supply cable is correctly fixed | 4.07 | 0.93 | Appropriate
46 | All external fittings are properly labeled | 4.04 | 0.83 | Appropriate

Finishing Skills in Building Electronics Systems
47 | Joints brushed for cleanliness | 4.17 | 0.70 | Appropriate
48 | Finished project is comparable to those made in industry | 3.90 | 0.88 | Appropriate
49 | Construction is free from errors | 3.53 | 0.26 | Appropriate
50 | Good attitude exhibited during project execution | 3.51 | 0.89 | Appropriate
51 | Confidence exhibited during project execution | 3.73 | 0.59 | Appropriate
52 | Project limitations are clearly explained | 3.68 | 0.68 | Appropriate

Testing Skills in Building Electronics Systems
53 | Right test instruments provided | 4.31 | 0.63 | Appropriate
54 | Appropriate tests are carried out | 4.26 | 0.55 | Appropriate
55 | Test results reflect results of the planning stage | 4.04 | 0.56 | Appropriate
56 | All functions operated well | 4.80 | 0.63 | Appropriate
57 | No open circuit identified | 3.95 | 0.68 | Appropriate
58 | No short circuit identified | 4.95 | 0.78 | Appropriate
59 | Project operated over a long period of time | 4.75 | 0.58 | Appropriate
60 | Project was executed safely | 4.00 | 0.59 | Appropriate
61 | Project can be dismantled and reassembled easily | 4.90 | 0.72 | Appropriate

Table 1 indicates that the mean ratings of the items ranged from 3.19 to 4.95. All items had mean scores above the cut-off point of 3.00, which qualifies all 61 practical skills as appropriate for inclusion in the ESSAS. The standard deviations (SD) of the items ranged from 0.26 to 1.29, implying that the raters were very close in their ratings.
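The retention rule described above is a simple threshold decision over the item means. A minimal sketch applying it to a couple of the reported values (the lowest and highest means in Table 1):

```python
CUTOFF = 3.00  # "Moderately Appropriate" on the 5-point scale

def classify(mean_rating, cutoff=CUTOFF):
    """Apply the study's retention rule to one item's mean rating."""
    return "Appropriate" if mean_rating >= cutoff else "Inappropriate"

# Two item means reported in Table 1 (extremes of the observed range).
item_means = {
    "Student worked independently": 3.19,  # lowest mean in Table 1
    "No short circuit identified": 4.95,   # highest mean in Table 1
}
for item, m in item_means.items():
    print(f"{item}: {classify(m)}")  # both print as Appropriate
```

Since even the lowest observed mean (3.19) clears the 3.00 cut-off, the rule retains all 61 items, matching the result reported above.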

Hypothesis 1
There is no significant difference between electronics teachers of federal and state government technical colleges on appropriate skills for inclusion in the Electronics System Skill Assessment Scale (ESSAS) (P < 0.05).

Table 2: T-test Analysis of Mean Responses of Federal and State Teachers on Practical Skills for Inclusion in ESSAS
Group of Teachers | Mean | SD | df | t-cal | t-tab | Remark
Federal | 1066.82 | 73.06 | 39 | -0.076 | 2.042 | *Accepted
State | 1068.70 | 69.81 | | | |
n = 41; df = 39; p < 0.05; *Accepted

Table 2 shows that state electronics teachers recorded a slightly higher mean score of 1068.70 (SD = 69.81) than federal teachers, whose mean score was 1066.82 (SD = 73.06). At 39 degrees of freedom (df), the calculated t-value of -0.076 is lower than the tabulated t-value of 2.042, indicating that there is no significant difference between federal and state electronics teachers on the appropriate skills for inclusion in the ESSAS. The first hypothesis was therefore accepted.

Hypothesis 2
There are no significant differences among electronics teachers across the six states regarding the appropriateness of practical skills for inclusion in the Electronics System Skill Assessment Scale (ESSAS) (P < 0.05).

Table 3: ANOVA of Teachers' Ratings across Six States on Practical Skills for Inclusion in ESSAS
Source of Variation | Sum of Squares | df | Mean Square | F-cal | F-tab | Remark
Between Groups | 13879.481 | 5 | 2775.896 | 0.537 | 2.53 | *Accepted
Within Groups | 180874.96 | 35 | 5167.856 | | |
Total | 194754.44 | 40 | | | |
n = 41; df = (5, 35); p < 0.05; *Accepted

The data in Table 3 show that the calculated F-value of 0.537 is less than the F-table value of 2.53. This implies that there are no significant differences among the electronics teachers across the six states regarding the appropriateness of practical skills for inclusion in the ESSAS. The second hypothesis was therefore accepted.

DISCUSSIONS
The successful development of a valid and reliable test instrument to assess the practical skills of students in building electronics systems in technical colleges is the main contribution of this study. The findings on the research question revealed that all 61 practical skills (i.e. the 61 items of the ESSAS) were considered appropriate for inclusion in the Electronics System Skill Assessment Scale (ESSAS). This signifies that the electronics teachers in technical colleges considered the 61 test items appropriate for use in assessing students' performance in the practical areas of building electronics systems. This finding is consistent with Garba (1993) and Iji (2007), whose respondents likewise considered all the items of the test instruments they developed appropriate for use in assessing students' performance.

The results of the present study also indicate that the developed ESSAS possesses high content validity. The process of content validation of the ESSAS involved the construction of a table of specifications based on the six levels of Padelford's (1984) model of the psychomotor domain, resulting in a balanced distribution of the practical skills across the six levels. This is in line with the views of Agbaegbu (2004) and Khan (2007), who maintained that to ensure that a test has content validity, a table of specifications covering both the content areas and the various levels of educational objectives should be prepared. The result also agrees with the assertions of Anastasi and Urbina (1997) and Denga (1987) that the fairer the distribution of test items, the better the representation of the behavioural domain and the higher the content validity of a test.

The analysis of the first null hypothesis revealed no significant difference between the opinions of federal and state teachers regarding the appropriate skills for inclusion in the ESSAS. This signifies that the type of employer (federal or state) did not affect the teachers' ratings in this study. Since both federal and state colleges use the same curriculum, it is a welcome development that the teachers, irrespective of their employers, reached the same consensus on the practical skills appropriate for assessing students for the National Technical Certificate (NTC) based on the NABTEB curriculum.

The analysis of the second null hypothesis indicated that there were no significant differences among the electronics teachers across the six states (Akwa Ibom, Bayelsa, Cross River, Delta, Edo

and Rivers States) regarding the appropriateness of practical skills for inclusion in the ESSAS. This implies that all raters, irrespective of their locality, reached the same consensus on the practical skills selected for inclusion in the ESSAS.
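Both significance decisions can be reproduced from the summary statistics in Tables 2 and 3. The sketch below recomputes a pooled two-sample t statistic from the group means, SDs and sizes (n = 9 federal, 32 state, matching df = 39), and the F ratio from the reported sums of squares; small rounding differences from the published t of -0.076 are expected, since the inputs are themselves rounded:

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with a pooled variance estimate."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothesis 1: federal vs state teachers (Table 2 summary statistics).
t = pooled_t(1066.82, 73.06, 9, 1068.70, 69.81, 32)
print(round(t, 3))     # close to the published -0.076
print(abs(t) < 2.042)  # below t-critical at df = 39 → accept H0

# Hypothesis 2: six states (Table 3). F = MS_between / MS_within.
f = (13879.481 / 5) / (180874.96 / 35)
print(round(f, 3))     # → 0.537, below F-critical 2.53 → accept H0
```

In both cases the calculated statistic falls well short of its critical value, reproducing the paper's decision to accept both null hypotheses.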

Chiejile, L. C. 2006. Development and validation of a test instrument for assessing students’ practical performance in electrical installations. Unpublished doctoral dissertation, Nnamdi Azikiwe University, Awka, Nigeria.

CONCLUSION
Preliminary data obtained in this study indicate that the "Electronics System Skill Assessment Scale" (ESSAS) is a valid and reliable rating instrument that could be used in assessing students' practical skill performance in building electronics systems in technical colleges. It is expected that electronics teachers in technical colleges in Nigeria may now be able to use an objective, comprehensive and systematic instrument to assess students' performance in practical work effectively, and in so doing will be able to justify the scores and grades they award. Furthermore, it is believed that students' performance in the radio, television and electronics work trade, especially its practical aspect, will improve.

Cluzeau, F. 2002. Development and validation of an international appraisal instrument for assessing the quality of clinical practice guidelines: the AGREE project. Retrieved October 15, 2010 from http://gshc.bmj.com/content/12/1/18

Denga, D. I. 1987. Educational measurement, continuous assessment and psychological testing. Rapid Educational Publishers Limited, Calabar, Nigeria.

Federal Republic of Nigeria. 2004. National policy on education (4th Edition). NERDC Press, Lagos, Nigeria.

Garba, L. N. 1993. Development of an instrument for evaluating practical project in woodwork. Unpublished doctoral dissertation, University of Nigeria, Nsukka, Nigeria.

RECOMMENDATIONS
Based on the findings of this study, the following recommendations are made:
1. Electronics teachers in technical colleges and similar skill acquisition institutions should be made aware of, and make use of, the ESSAS for assessing practical work involving the building of simple electronics circuits and systems.
2. The National Business and Technical Examinations Board (NABTEB) and similar examination bodies could consider using the ESSAS in assessing the practical components of students' performance at NTC level and in other examinations at the post-primary level.

Gay, L. R. 1996. Educational research: competencies for analysis and application (5th Edition). Prentice Hall, Merrill, NJ, USA.

Iji, C. O. 2007. An instrument for evaluating higher degree thesis in education. Journal of Research in Education, 4(2), 76-80.

Khan, M. S. 2007. School evaluation. A.P.H. Publishing Corporation, New Delhi.

National Business and Technical Examinations Board (NABTEB). 2004. Syllabus for engineering trades for the national technical certificate examinations. Yuwa Printing Press, Benin City.

REFERENCES
Agbaegbu, C. N. 2004. Psychometric qualities of a test. In C. N. Agbaegbu, S. A. Ezeudu, & M. N. V. Agwagah (Eds.), Educational measurement & evaluation for colleges and universities (pp. 108-121). Cape Publishers Int'l Ltd, Owerri.

Nunnally, J. C. 1978. Psychometric theory. McGraw-Hill, New York.

Anastasi, A. and Urbina, S. 1997. Psychological testing (7th Edition). Prentice Hall, Upper Saddle River, NJ, USA.

Ojoko, S. S. 2000. Measurement and evaluation in teacher education. Springfield Publishers, Owerri.

Okeke, B. C. 2004. Standardization of an instrument for assessing practical work technology in technical colleges. Journal of Vocational and Adult Education, 3(1), 42-52.

Benson, J. and Clark, F. 1982. A guide for instrument development and validation. The American Journal of Occupational Therapy, 36(12), 789-800. Retrieved October 13, from http://www.Scinf.umontreal.

Okwelle, P. C. 2003. Construction of valid evaluation instruments in technology education. Journal of Technical and Science Education. 12 (1&2) 72 – 81.

Bukar, B. 2006. Development and validation of laboratory-based tests for assessing practical skills of HND students in electronic maintenance and repair. Unpublished doctoral dissertation, University of Nigeria, Nsukka, Nigeria.

Okwelle, P. C. 2011. Development and validation of instrument for assessing practical skills in radio and television systems in technical colleges. Unpublished doctoral dissertation, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.

Orion, N., Hofstein, A., Tamir, P. & Giddings, G. J. 1997. Development and validation of an instrument for assessing the learning environment of outdoor science activities. Retrieved October 15, 2010 from http://stwww.weizman.ac.il/g-earth/geogroup/whole_articles/a7-whole.pdf

Padelford, H. E. 1984. Acquiring psychomotor skills. Journal of Epsilon Pi Tau, X(2), 35-39.

Samarakkody, D. C., Fernando, D. N., Perera, H., McClure, R. J. and Silva, H. D. 2010. The child behaviour assessment instrument: development and validation of a measure to screen for externalizing child behavioural problems in community settings. International Journal of Mental Health Systems, 4, 13. Retrieved October 15, 2010 from http://www.ijmhs.com/content/4/1/13

Trochim, W. M. K. 2006. Types of reliability. Retrieved December 10, 2009 from http://www.socialresearchmethods.net/kb/reltypes.plp
