Evaluation of a project to train Science educators equipped with micro-science kits: summary of findings
Duration of Project: 2003 - 2006
Geographical base for project: Butterworth, Eastern Cape Province
Evaluation conducted by: Bayview Consortium
The main focus of the project was to train 90 Grade 4 - 12 educators from 60 schools in Science and in the use of Micro-Science Kits that were provided to schools. The project also focused on enrolling qualifying teachers on accredited Advanced Certificate in Education (ACE) Science modules from 2003 to 2006.
The project schools were located in the disadvantaged, rural area of Butterworth in the Eastern Cape Province.
The project addressed the following problems:
- poorly trained educators, resulting in poor content knowledge and an inability to plan effectively; and
- poor resourcing of schools with equipment for practical work, resulting in learners' lack of exposure to practical work and inability to use scientific equipment.
The Zenex Foundation commissioned an external evaluation to ascertain whether the identified problems were being effectively addressed.
The evaluation took place six months after the project had completed its final activities with teachers and schools.
The evaluator found that only 50% of the teachers could plan lessons that incorporated the apparatus and materials. In more than half of the schools visited, learners indicated that they had used the apparatus only once, or never, during the course of the year.
Learners rarely used the apparatus or materials: in only 15% of the lessons observed were they given the opportunity to do so. It came as no surprise, then, that about 80% of the learners observed showed neither signs of critical thinking nor understanding of the concepts being taught.
Grade 8 learners were tested on chemistry in 2003 and again in 2004, when they were in Grade 9, with the same test used as pre- and post-test. The results showed no significant change in learner performance; the mean percentage score on the test was 24%. For 85% of the learners there was no change in performance, and for 15% performance actually declined over the two years.
In the internal evaluation, Grade 10 and 11 learners were tested on electricity in 2003 and tested again in 2004, in Grades 11 and 12 respectively, using the same pre- and post-test. The mean score was 33%. For 88% of the learners there was no change in performance, 6% showed an improvement, and 6% showed a decline.
The external evaluators used the same chemistry tests to test Grade 10 learners in the study. One group of learners had been taught by teachers on the project for the last three years of their schooling (Grades 8, 9 and 10). The control group had no prior exposure to teachers on the project. For the group exposed to project teachers for three years, the mean score was 25.3% - not very different from the scores in the internal evaluation. The mean score for the control group was 26.4%.
The ACE course offering that was expected to attract teachers to the programme became a secondary objective because of project implementation challenges. Only 16 of the 36 teachers who enrolled for an ACE had passed their Science modules by the end of 2004.
Supply of apparatus and materials
The evaluator observed that more than 50% of the project budget was allocated to the provision of apparatus and materials, yet these remained largely unused. The evaluator therefore recommended that fewer items of apparatus and materials be supplied to schools than was then the case - but with enough replications for all learners to take part in hands-on activities - so that learners could participate in more focused Science investigations.
Local management of the programme
The evaluator identified the project's management in the province as an apparent weakness. Although sufficient funding (12% of the budget) was allocated for the appointment of a local manager to administer the project, challenges in terms of travel, accommodation and the manager's expertise resulted in underperformance in the project. The evaluator recommended that the appointment of a suitable local administrator/manager be a priority objective of the project, since it was run at a distance from the service provider's home base.
Clustering of schools
The evaluator recommended that care be exercised when clustering schools, to ensure that feeder schools were in fact feeding a substantial number of learners into the identified recipient higher-level schools. A further recommendation was that links be established between feeder schools and their identified recipient higher-level schools, so that they could understand each other and strive together towards a common goal.
It was also recommended that most of the project budget be allocated to classroom support activities, as these were vital to the success of any teacher intervention project.
The evaluator recommended that curriculum developers be sensitive, in the design of the training programme, to issues critical to teaching and learning in a second language. Training tasks should engage teachers and press them to take part in discussion and self-reflection, and to combine subject, pedagogical and contextual knowledge.
Teachers’ conceptual knowledge
The evaluator suggested that the curriculum designers should take into consideration the pedagogical content knowledge of teachers when developing in-service programmes for teachers.
Learner performance was regarded as a key indicator of development, and indicators of learner performance should therefore include class work, homework, tests, exam scripts and records of practical work. Learners' written work was seen as the most reliable means of verifying what they could do.
Inclusion of accredited modules
The evaluator recommended that the accredited (ACE) modules offered be carefully chosen to match both the level at which the teacher was teaching and his or her personal needs.