Samuel Berlinski and Matías Busso
Technology has increased productivity in the workplace and shaped the way we use our free time and the way we socialize. So why is technology not at the core of classroom learning?
Funding to equip schools and students with technology is usually considered one of the biggest challenges, especially in the developing world. But finding the right approach to use technology effectively in the classroom is also a major challenge, according to an evaluation by the IDB.
The Costa Rican government and the IDB recently conducted an experimental evaluation of seventh-grade students to assess the effectiveness of technology in improving students’ ability to reason, debate, and communicate using mathematics.
The results showed that the use of technology had negative effects on student learning—a surprising outcome, since various studies have shown that technology, when guided, offers students opportunities to explore and understand mathematics in all of its dimensions.
The Costa Rican experiment aimed to foster students’ active participation in geometry classes by reallocating classroom time across different tasks and devoting more time to exploration.
Unlike the traditional lecture-style teaching of math in Costa Rica, the classes gave students an active role and teachers a less controlling one.
The evaluation was carried out in 85 schools, involving 18,000 seventh-grade students and 190 teachers. In Costa Rica, the seventh grade is the first year of secondary school. The country has a long tradition of technology in its schools, and the students involved in the experiment were also accustomed to using technology at home.
The students were divided into five groups. A control group continued with the traditional teaching method and received no new materials or technologies.
Of the four treatment groups, three used the new teaching approach and a new set of class materials designed by Costa Rican and international experts, along with a new technology (an interactive whiteboard, a computer lab, or a laptop for every child); the fourth used the new approach and materials but received no new technology.
Surprisingly, the control group learned significantly more than any of the four treatment groups. The treatment group that did not use the new technologies learned about 17 percent of a standard deviation less than the control group.
The treatment groups that used the new technologies performed on average 25 percent of a standard deviation below the control group. In other words, an average student exposed to the new pedagogy and technology would end up ranked about 10 percentile points lower.
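Assuming test scores are approximately normally distributed, the percentile shift implied by a 0.25 standard-deviation drop can be sketched with a quick back-of-the-envelope calculation (this illustration is ours, not part of the original analysis):

```python
import math

def normal_cdf(z: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# An average student starts at the 50th percentile (z = 0).
# A drop of 0.25 standard deviations moves them to z = -0.25.
percentile_after = 100 * normal_cdf(-0.25)
drop = 50 - percentile_after

print(f"New percentile: {percentile_after:.1f}")  # ~40.1
print(f"Percentile drop: {drop:.1f} points")      # ~9.9, i.e. roughly 10
```

The ~10-point drop matches the article’s claim; the smaller 0.17 SD effect for the no-technology treatment group works out to a drop of roughly 7 percentile points by the same logic.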
Students in the treatment groups performed worse than those in the control group in learning both basic concepts and higher-order geometry skills. In addition, the evaluation found that students in the treatment groups were less willing to try the new learning strategies, more disengaged from the class, less inclined to make an effort, and liked math less.
Performance declined most among students who previously had been the best in their classes: their behavior deteriorated and they were less engaged with learning.
Why did this happen? There are several possible reasons. Students may have been better equipped to learn under the traditional method, or the new approach may have given children more opportunity to become distracted.
Another possible contributing factor may have been that student-teacher interaction suffered during the intervention. The role of teachers is decisive: their ability and creativity in leading, motivating, and engaging students can make all the difference.
For this evaluation, teachers used the training they received but did not master the innovation in a way that maximized the benefit for their students. The findings suggest that teachers found it difficult to adapt to the new teaching style with the new materials, and some of them may have rushed through the material.
The study in Costa Rica shows the importance of evaluating small pilot projects before embarking on large-scale interventions. The IDB has made several efforts to analyze the use of computers in schools and give policy recommendations on strategies to implement information and communications technology in education.
It should be noted that the results of the Costa Rica evaluation reflect student performance in the short term. Perhaps additional training, more finely-tuned materials, and a better blending of active learning techniques with technology could produce significant improvements in math performance.
The evaluation does show, however, that educational reforms involving new technologies and teaching approaches may not bring immediate results.
Policy makers should monitor the performance of such reforms carefully and also consider alternatives for strengthening short-term performance, such as tutoring programs.
This story is one of the project stories included in the Development Effectiveness Overview, an annual publication that highlights the lessons learned from IDB projects and evaluations.
Download the evaluation, “Challenges in Educational Reform: An Experiment on Active Learning in Mathematics,” here.
About the authors:
Samuel Berlinski is a principal research economist at IDB headquarters in Washington, DC.
Matías Busso is a lead research economist at IDB headquarters in Washington, DC.