This section describes the methodology used for the Raising achievement in primary schools investigation. ERO’s Education Review reports for the schools in the sample were also analysed to investigate the factors that influenced school effectiveness.

ERO evaluated the extent to which schools had undertaken deliberate actions that led to more students achieving at or above National Standards. ERO’s judgement was based on the:

  • proportion of students who had accelerated progress, in relation to the number of students underachieving and the total number in the school
  • deliberateness and coherence of actions associated with accelerating progress
  • depth of knowledge about how to extend the reach so more students were achieving success than before.

Accelerating progress

ERO focused on individual students’ accelerated progress, rather than the overall increase in the proportion of students achieving at a school. Improvement in the progress of individual students contributes to the overall goal of all students achieving.

The evaluation considered both short and long‑term acceleration. Progress was considered accelerated when a student’s achievement moved from well below to below, at or above a National Standard, or from below to at or above. This meant the student made more than one year’s progress over a year.

Progress was also considered accelerated when a student’s progress was noticeably faster than might otherwise have been expected from their own past learning, when measured using norm-referenced tools that assessed the breadth of reading, writing or mathematics. The student’s rate of progress also needed to be faster than that of classmates progressing at expected rates. These considerations acknowledged the need for equitable outcomes, and took into account acceleration over periods of less than one year.

Deliberateness and depth

If leaders and teachers do not know how they have accelerated some students’ progress, they will not be able to apply this knowledge to scale up, spread and extend their reach to more students. The investigation considered the deliberateness of teacher and leader actions to improve outcomes and evaluate impact. It also considered the depth of teacher and leader knowledge about particular students’ learning, interests and needs, and about curriculum progression, so they knew what and how to teach for students’ learning to progress at expected or accelerated rates.

Evaluation questions

ERO evaluated schools’ capability to do something different for students achieving below expectation. The initial questions focused on the best practice in each school. This provided a strength-based framework for reporting the findings.

In schools that had taken deliberate actions and improved student outcomes, ERO explored what triggered each school to identify a particular group of students, and the deliberate actions it took. Some schools used ALiM or ALL examples. ERO also evaluated how each school sustained its focus on improving outcomes for students achieving below or well below the expectation for their year group. The investigative questions for schools that had an innovative response to underachievement were:

  1. What triggered the need to do something different?
  2. How did the school know what to do differently?
  3. How did the school know what worked, when, why and for whom?
  4. How is the school ensuring it has learnt from this focus on acceleration so outcomes are improved for more students?

In schools that had a more‑of‑the‑same response to underachievement, ERO explored the following:

  1. What can be built on to facilitate acceleration?
  2. What needs to be done differently?
  3. How can the capability to do this be built?

The framework in Figure 1 highlights these questions. The evaluation prompts are in Appendix 1 of the main report. This framework was also used to describe the findings.

This image shows the framework for the evaluation and its questions. At the centre is a cycle that begins with a description of the students achieving below or well below the National Standards for their year group, followed by identification of learning strengths and needs and setting of priorities in relation to school goals, then responding with innovations that accelerate learning, then responding to the impact of innovations that accelerated and improved student outcomes, and finally refocusing. The questions that arise from this cycle are: What can be built on to facilitate acceleration? What needs to be done differently? How can the capability to do this be built? What triggered the need to do something different? How did the school know what to do differently? How did the school know what worked, when, why and for whom? How is the school ensuring it has learnt from this focus on acceleration so outcomes are improved for more students?

Schools’ involvement in ALiM and ALL

Ninety-three of the schools reviewed in Terms 2 and 3, 2013 had participated in Accelerating Learning in Mathematics (ALiM), Accelerating Literacy Learning (ALL) or the Mathematics Support Teacher (MST) initiative.[5]

Some schools had been involved in an initiative more than once, and some had been involved in more than one initiative. Twenty-one schools had participated in both ALiM and ALL, and five of these schools had participated in ALL twice. Most schools were involved in ALiM or ALL in 2012 or 2013. The difference in numbers between these two programmes for these years reflects the schools that accessed MST support.

The complexity of school involvement is shown in Figures 2 and 3. Over the years, the Ministry has evaluated the initiatives and there have been considerable changes to their design and implementation. For example, from 2013 schools have been expected to undergo two cycles of inquiry.

Figure 2: Number of times schools have participated

This is a bar graph. The y axis shows the number of schools involved each year, ranging from 0 to 40 at intervals of 10. The x axis has four labels (ALiM, ALL, MST 1 and MST 2), each with bars representing the years 2010, 2011, 2012 and 2013. The values for ALiM are 3, 10, 33 and 27 schools; for ALL, 36 schools in 2012 and 34 in 2013; for MST 1, 2, 0, 5 and 5 schools; and for MST 2, 6 schools in 2013 only.

Figure 3: Number of schools participating each year

This graph shows the number of participating schools each year. ALiM: 3 schools in 2010, 10 in 2011, 33 in 2012 and 27 in 2013. ALL: 36 schools in 2012 and 34 in 2013. MST 1: 2 schools in 2010, none in 2011, 5 in 2012 and 5 in 2013. MST 2: 6 schools in 2013 only.

Information about the types of schools, roll size, school locality (urban or rural), and decile range is in Appendix 1.