The problem with the EEF's way of converting effect sizes to months of progress is that it rests on an inappropriate basis: Glass, McGaw and Smith's 1981 book "Meta-Analysis in Social Research". Glass et al. (and I checked this with both Glass and McGaw separately) drew on norms from standardized tests and concluded that one year's growth was one standard deviation. That is roughly true for younger children, but for children aged ten, one year's growth, as measured by standardized assessments, is around 0.4 standard deviations, and for 15-year-olds it is only around 0.2. The same effect size therefore represents far more months of progress for older students than the EEF's conversion suggests.

Consider the EEF's evaluation of the Embedded Formative Assessment programme (which I co-authored with Siobhan Leahy). It found that students in schools where teachers were developing their practice of formative assessment had Attainment 8 scores 0.13 standard deviations higher than students in the control schools (which were simply given extra cash). Given the annual growth rates above, this works out to roughly a 25% increase in the rate of progress, or an extra three months of progress in each of the two years of KS4.

When one considers that the intervention cost around £1.20 per pupil and changed how teachers spent only 1 to 2% of their time, it is clear that CPD can make a big difference, provided it is continuing, job-embedded, and focused on the things that make the biggest difference to learners...
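The arithmetic behind the conversion can be sketched as follows. This is an illustrative calculation, not the EEF's own method: the annual growth figure of 0.26 standard deviations is an assumed value for KS4 students (not stated above; it is chosen to be consistent with the quoted 25% figure, and sits between the 0.2 cited for 15-year-olds and the 0.4 cited for ten-year-olds).

```python
def months_of_progress(effect_size_sd, annual_growth_sd, years):
    """Convert an effect size (in SD units) into a proportional increase
    in the rate of progress, and into extra months of progress per year,
    given the comparison group's annual growth in SD units."""
    baseline_growth = annual_growth_sd * years        # total growth in SDs
    rate_increase = effect_size_sd / baseline_growth  # proportional gain in rate
    extra_months_per_year = rate_increase * 12        # fraction of a school year
    return rate_increase, extra_months_per_year

# 0.13 SD over the two years of KS4, with an assumed annual growth of 0.26 SD:
rate, extra_months = months_of_progress(0.13, 0.26, 2)
print(round(rate, 2), round(extra_months, 1))  # 0.25 3.0

# Under the EEF's one-year-equals-one-SD assumption, the same effect size
# would look much smaller:
rate_eef, months_eef = months_of_progress(0.13, 1.0, 2)
print(round(months_eef, 1))  # 0.8
```

The point of the parameterisation is that the months-of-progress figure is only as good as the annual growth assumption: halving the assumed growth doubles the apparent months of progress.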