CfEE Forum round table, 13th February 2018

The Programme for International Student Assessment (PISA) is a major cross-national study of 15-year-olds’ academic skills run by the OECD. The Programme has increasingly come to dominate education-policy discussions worldwide since it was first carried out in 2000. In recent years, policymakers have come to benchmark the success or otherwise of their policies by the changes in PISA performance over time. Yet there are many challenges to measuring trends using large-scale international assessments such as PISA. For example, administration and analysis procedures can change between survey rounds, which could potentially contaminate the results and make them difficult to interpret.

From 2000 until 2012, PISA was taken via paper and pencil. In 2015 that changed, with all but 14 countries taking the assessment on computer. The use of computers in large-scale educational studies has several attractions, including the introduction of more interactive questions, efficiencies in processing and marking, and greater insight into test-taking behaviour. Yet, in the short term, the change poses challenges, including the potential for so-called ‘mode effects’ to influence the comparability of PISA scores over time.

At a recent CfEE round table, Professor John Jerrim discussed his recent Digital Divide Research Brief and shared analysis demonstrating that the computer-based test was harder. It should be noted that the sample size was small, so the margins of error are quite wide; nonetheless, the average effect size is high. The OECD reports that it accounts for this ‘mode effect’ by calibrating the scoring scale only on questions that were unaffected by the administration method. However, Jerrim found that although this approach reduces the differences, a significant mode effect remains.

Discussants at the event recognised the value of PISA and the important data it provides. Furthermore, they welcomed the OECD’s use of field trials to investigate changes to the tests’ composition and administration. However, more research and analysis of PISA data is needed. Overly simplistic interpretations of PISA need to be challenged, and the OECD should be encouraged to clarify how comparable the 2015 data are, both between countries and over time.
