In 2008, a couple of economists from the IDB visited the Peruvian Ministry of Education to meet with the Education Technology Director. Our purpose was to evaluate an older IDB project that had provided computer labs to middle schools.
We were welcomed and given full support to carry out that evaluation. But we were told that the Government's real interest was in conducting the most rigorous evaluation possible of another project: the Peruvian One Laptop per Child program (Una Laptop por Niño).
Although that program was not financed by the Bank, we formed a group with experts from various units of the Bank, and in early 2009 worked with the Government's team to design the first rigorous impact evaluation of the program. The details of the evaluation are available in a recent working paper; here I just want to mention some salient features.
This is the first evaluation of OLPC that focuses on educational outcomes, namely student learning. There are many stories about implementation of OLPC, and many opinions for and against the program (see for example OLPC News and the Technology Salon), but no evaluations that focus on learning, and none using a rigorous evaluation approach.
We used a randomized controlled trial: out of 320 schools, 210 were randomly selected to receive XOs (the laptops), while the remaining 110 served as controls. Because of the random assignment, the two groups were statistically comparable before the program, and apart from the computers nothing differed between them, so we are fairly confident that any difference after the program can be attributed to OLPC.
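The assignment step in a school-level randomization like this one can be sketched in a few lines. This is only an illustration of the basic idea (the school IDs and seed below are hypothetical, and the actual study may have used a more elaborate, e.g. stratified, procedure):

```python
import random

def assign_schools(school_ids, n_treatment, seed=2009):
    """Randomly split schools into treatment and control groups."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    treatment = set(shuffled[:n_treatment])
    return {sid: ("treatment" if sid in treatment else "control")
            for sid in school_ids}

# 320 schools, 210 assigned to receive laptops (IDs are hypothetical)
assignment = assign_schools(range(1, 321), n_treatment=210)
print(sum(1 for g in assignment.values() if g == "treatment"))  # 210
```

Because every school has the same chance of ending up in either group, pre-existing differences between groups are due to chance alone, which is what licenses the causal interpretation of post-program differences.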
Our main focus was on academic achievement in Math and Language, which were the declared objectives of the program. We also looked at cognitive skills, as measured by Raven's Progressive Matrices, a verbal fluency test, and a Coding test.
The results are striking.
Our results indicate that the program dramatically increased access to computers. There were 1.18 computers per student in the treatment group, compared with 0.12 in control schools at follow-up. This massive rise in access explains substantial differences in use.
Eighty-two percent of treatment students reported using a computer at school in the previous week compared with 26 percent in the control group. Effects on home computer use are also large: 42 percent of treatment students report using a computer at home in the previous week versus 4 percent in the control group. Internet use was limited because hardly any schools in the study sample had access.
However, we find no evidence that the program increased learning in Math or Language. This is not surprising: the program did not include specific interventions to integrate the laptops into the curriculum, nor did the computers include specific math or language software.
The program did not affect attendance or the time allocated to homework, nor did it increase motivation or reading habits, and it does not appear to have affected the quality of instruction in class.
On the positive side, the results indicate some benefits on cognitive skills: in all three measured dimensions, students in the treatment group outperformed those in the control group. A back-of-the-envelope calculation suggests that the estimated impact on the verbal fluency measure represents about six months of expected progression for a child; the estimated impacts on the Coding and Raven tests represent roughly five and four months of expected progression, respectively.
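The back-of-the-envelope conversion works by dividing the estimated treatment effect (expressed in standard deviations of the test score) by the typical progression per month of schooling. The sketch below shows the arithmetic; the numbers are purely illustrative, not the paper's actual estimates:

```python
def effect_in_months(effect_sd, annual_gain_sd, months_per_year=10):
    """Convert a treatment effect measured in standard deviations into
    months of expected progression, given the typical yearly gain on
    the test (also in standard deviations)."""
    gain_per_month = annual_gain_sd / months_per_year  # assumes a 10-month school year
    return effect_sd / gain_per_month

# Illustrative: a 0.12 SD effect, where children typically gain
# 0.20 SD over a 10-month school year, equals 6 months of progression.
print(round(effect_in_months(0.12, 0.20), 1))  # -> 6.0
```

Expressing effects in months of schooling rather than raw standard deviations is a common way to make effect sizes interpretable for a general audience, though the conversion depends on how fast children typically progress on the specific test.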
To sum up: as implemented, the program dramatically expanded access to computers and produced some cognitive gains, but it did not increase learning in Math or Language. Is this surprising? What is next for OLPC, in Peru and elsewhere? Let us know your opinion…