
Teacher professional development around the world: the gap between evidence and practice

by Anna Popova, David Evans, Mary E. Breeding, and Violeta Arancibia

Working paper version (free)

Commentary by Celeste Carano and Nomtha Sithole

Governments and donors invest in teacher training with the expectation that it will produce better teachers and higher student achievement. But is that spending effective? In low-resource countries like Liberia, where the government has less than $50 per child to spend on education, up to $5,000 may be spent training a single teacher. That’s a high cost for a programme without a sure outcome. Policymakers and donors can also choose from a variety of approaches, ranging from mass training programmes, to peer-to-peer support, to coaches, to tech-based resource libraries. But which will best support teachers and improve results in the classroom?

In this paper, the authors ambitiously tackle two questions: which characteristics define successful teacher training programmes, and how those characteristics can be measured. They set out to identify what aspects of teacher training programmes are linked with better student learning outcomes through a tool to measure and compare teacher professional development programmes across countries. The In-Service Teacher Training Survey Instrument (ITTSI) aims to capture the details of teacher training programmes and is piloted in this paper.

The ITTSI is wide-reaching, capturing indicators including programme design and implementation, scale, cost, who is targeted, what knowledge and skills the programme intends to develop in teachers, how it is delivered to them, and whether it was positively or negatively received by participants. The authors first use the tool to analyse 33 programmes with existing impact evaluations, sorting out those that were effective from those that were not. Importantly, after coding information from the papers, the authors made considerable effort to obtain additional data not shared in the papers, contacting authors and implementers directly to ensure a high proportion of the ITTSI was completed for each programme. The authors then analysed which aspects of the professional development programmes were associated with the largest improvements in student learning.

Building on that quantitative and qualitative analysis, they worked with World Bank regional teams to identify countries for phase two of the analysis of at-scale programmes, ultimately collecting 48 ITTSI surveys of at-scale programmes, in addition to 91 short versions of the survey (the aptly named ‘BITTSI’).

Ranking the 33 previously evaluated programmes by their standardised impact on student test scores, the researchers selected the top 16 programmes as the ‘best’. They then compared the average values of the ‘best’ programmes’ indicators to those of the at-scale, government-led programmes to investigate the difference in practices between the successful professional development programmes and government programmes.

So what are the features of training programmes that proved successful in impact evaluations? Most notably, programmes that took an intensive, practical ‘hands-on’ approach stood out. Teacher practice in the classroom, consecutive days of face-to-face training, and follow-up visits to review material from the training were all associated with gains in student performance. In qualitative interviews with programme implementers, the researchers found that teachers most enjoyed the interactive, fun programmes that enabled more active learning. Perhaps unsurprisingly, more active learning for teachers seems to mean more active learning for their students as well. Other successful features included linking career opportunities to training, and targeting training to teachers based on their years of service (although only 2 of the 33 programmes took these approaches).

Government programmes, in contrast, were less likely to be linked to career opportunities, suggesting they lack incentives for teachers to improve after the training. In addition, fewer at-scale government programmes provide reading materials and books – a feature linked with student learning gains – than do the 16 high-performing programmes. Most strikingly, the hands-on and in-person aspects of successful programmes were all less likely to be found in the government programmes, or found to a lesser degree: teachers had fewer days of face-to-face training, fewer follow-ups from the trainers, and less time to practise what they learned in an applied way.

Readers may come away questioning exactly why the government programmes differ in these ways, and cost comes to mind immediately as a potential factor. But while the authors gathered information on programme cost with the ITTSI, analysis of it was not shared in the paper. This leaves an open question of how the ‘best’ professional development programmes and the at-scale, government-implemented programmes compare on cost per teacher and on overall cost-effectiveness. While readers, and the authors, assume that at-scale programmes are lower cost, the size of the gap would be interesting to explore further and is particularly relevant for policymakers considering replicating the (presumably) more expensive aspects of the successful programmes. While in-person follow-up and continued practice with peers may translate to greater student gains, governments would need to consider not only the higher cost of adding that staff-intensive time to training, but also which human resources could be allocated to enable the follow-up – likely at additional financial cost.

The differences in intent and follow-through in the design of government programmes are also touched on, but only briefly. The authors acknowledge that, given the scope of the study, their inability to track programmes from design through to classroom practice makes it difficult to identify impact and to pinpoint where the weaker programmes fail to deliver. Our own observation is that professional development programmes tend to be developed in reaction to pressures to improve school results – in the hope that they will produce a quick fix – and are often slow to adapt to actual needs. In Liberia, an emergency primary school teacher training programme developed as a temporary solution after the war remains the default training programme fifteen years later. With this slow pace of innovation and adaptation to evolving education systems, developing countries continue to channel money into programmes with low or no returns on student learning outcomes.

Similarly, politics poses a challenge when considering how to integrate teacher incentives into professional development programmes; this may be why nearly half of the government programmes in the study did not include this feature. The role of unions in policy, and in demanding incentives for teachers, often heightens tensions in decisions around the use of resources. Professional development, however effective or ineffective, is popular; programmes that measure, monitor, and require teachers to meet explicit standards or face consequences may be received with less enthusiasm.

But this only makes the paper more valuable for policymakers (and donors). The most effective programmes have qualities that may (we presume) make them more expensive, more time-consuming, and more difficult to deliver. But with evidence that these qualities translate into impact for students, perhaps the argument for funding them can be more easily made. In addition, the ITTSI itself holds value for policymakers and programme managers, and will become more valuable the more it is applied to future teacher training programmes. If used at the World Bank alone, that would build a stronger comparison base for other programmes in the future. The ITTSI could ultimately help strengthen the impact of professional development programmes by probing their design and anticipating points of impact prior to roll-out.

While to date a relatively small number of programmes have been evaluated using the ITTSI, as the authors rightly point out, a key purpose of the study was to design and test the tool for further use in the future – increasing its value over time.

References

Popova, A., Evans, D., Breeding, M. E., and Arancibia, V. (2018). ‘Teacher Professional Development around the World: The Gap between Evidence and Practice.’ Policy Research Working Paper No. WPS 8572. World Bank Group, Washington, D.C.



This commentary was originally published as part of the CfEE Annual Research Digest 2017-2018, in September 2018.

The volume is edited by Lee Crawfurd, Strategic Advisor with the Ministry of Education in Rwanda and the Tony Blair Institute for Global Change, and a CfEE Fellow.



Celeste Carano is a Governance Advisor to the Ministry of Education in Liberia with the Tony Blair Institute for Global Change. She previously led fundraising and strategy for More Than Me, a Partnership Schools provider in Liberia, and worked on network growth at Teach For All.
 
Nomthandazo Sithole advises the Ministry of Education in Liberia on the partnership schools initiative with the Tony Blair Institute for Global Change. She previously worked on youth and business growth initiatives in Liberia, her permanent home since 2011.