Part A: Reviewing the mathematics curriculum

Context for the findings

The New Zealand Curriculum is a statement of official policy about teaching and learning in English medium New Zealand schools. Its function is to ‘set the direction for student learning and to provide guidance for schools as they design and review their own [local] curriculum’. [4]

From the beginning of 2010, all schools were expected to develop and implement a curriculum for students in Years 1 to 13 that was consistent with the principles, values and key competencies outlined in The New Zealand Curriculum. The process of design and review is ongoing and responsive to each school’s context.

The New Zealand Curriculum empowers schools to exercise ‘the scope, flexibility, and authority they need to design and shape their curriculum so that teaching and learning is meaningful and beneficial for their particular communities of students.’ [5]

Building into the curriculum aspects which have particular significance for school communities ensures that learning has meaning for students, and is supported by their families and the wider community. For this reason it should be reviewed regularly to ensure it adequately reflects the priorities for learners, and the vision and values of the communities in which they live.

The strands of the mathematics and statistics learning area of The New Zealand Curriculum provide a structure for the mathematics standards. The weighting given to each strand changes according to year level. For example, the ‘number’ strand should be the focus of 60 to 80 percent of mathematics teaching time for students in Year 4, but only 40 to 60 percent of teaching time for students in Year 7. The weighting given to each mathematics strand should also be informed by what each school knows about the achievement of its students.

What did ERO ask?

How effectively is the school’s mathematics curriculum designed, enacted and reviewed to respond to the strengths and needs of all students and accelerate their progress and raise achievement?

What did ERO find?

ERO found variation in the extent to which schools effectively designed and reviewed their mathematics curriculum to respond to the strengths and needs of all students.

Table 1 outlines ERO’s findings about the effectiveness of schools’ curriculum review and design processes:

  • Schools with the most effective curriculum review and design processes were those where assessment, curriculum design and teaching practices were highly integrated and connected.
  • The schools with partially effective processes were focused on developing guidance for teachers and assessing student learning without the high level integration evident in the schools with effective processes.
  • The third category of schools was those with minimally effective processes. For these schools the focus was on programme organisation.
  • In the schools where processes were not effective, there was little evidence of any curriculum review and design of mathematics programmes.

Table 1: Effectiveness of curriculum review and design

Highly effective: 11 percent of schools

In schools with a highly integrated approach to assessment, curriculum review and design, and teaching strategies:

  • leaders collected, analysed and interpreted achievement information and what they knew about teaching practice in mathematics
  • leaders used the findings to make decisions about curriculum priorities, and about professional development for teachers
  • a collaborative approach to review sought input from teachers, parents and trustees and, in a few schools, students were also consulted
  • review of mathematics as a learning area was recent and linked to ongoing professional learning and development for teachers
  • leaders encouraged teachers to think about what would engage and extend students, and it was expected that the curriculum would be designed and adapted to achieve this
  • mathematics programmes integrated mathematics with other learning areas
  • mathematics contexts were relevant to students
  • students had opportunities to apply their mathematics knowledge in a range of tasks
  • students were helped to make connections between aspects of mathematics by teachers who taught the number strand through other mathematics strands
  • students’ problem solving skills were developed through meaningful tasks.

Partially effective: 51 percent of schools

In schools where the focus was on developing guidance for teachers and assessing student learning:

  • leaders concentrated on building teachers’ knowledge about specific assessment practices such as moderation or how to make overall teacher judgements (OTJs)
  • review focused largely on the development of guidelines for teaching programmes and ensuring strand coverage and expected time allocations for topics
  • there was less of an emphasis on exploring best practice in teaching and assessing mathematics, particularly in the context of the mathematics standards.

Minimally effective: 32 percent of schools

In schools where the focus was on the organisation of the mathematics programme:

  • some steps had been taken to review the mathematics learning area
  • the focus of review was on the mechanics of implementation, such as stipulating the time allocation for mathematics and ensuring all teachers covered the same content
  • few had developed guidelines to help teachers implement their mathematics programme
  • consultation was limited and often only involved teachers
  • few had accessed professional learning and development to build capacity, and pedagogical and assessment practices were poor.

Not effective: 6 percent of schools

In schools where there was minimal curriculum review:

  • there was a high turnover of school leaders and/or teachers
  • information about student achievement was limited
  • there was little evidence of self review
  • curriculum leadership for mathematics was lacking.

ERO found that in most schools aspects of self review needed improvement. Often the inquiry neglected aspects of teaching practice that might have affected achievement outcomes. Leaders often addressed the “what” (content) of the curriculum that should be taught, without considering the “how” (teaching approaches and strategies) or the “so what” (outcomes for learners). Self review should ensure that both learning and teaching come under scrutiny, along with their impact on student outcomes. [6]

What are the implications of ERO’s findings for priority learners?

In the 11 percent of schools with highly effective curriculum review and design processes:

  • Leaders and teachers made decisions about content and teaching and learning approaches.
  • Teachers interpreted the curriculum in light of what they knew about the prior learning, emerging needs and the strengths and interests of students.
  • They adapted the curriculum so students experienced success and were fully engaged in their learning.
  • Leaders gave teachers permission to exercise their discretion about both content and pedagogy.
  • The school curriculum operated as a guide to what teachers could do, without restricting them in their responsiveness to students.
  • In these schools, leaders and teachers successfully worked together to ensure mathematics programmes were relevant and tailored to all learners.

In the 51 percent of schools with partially effective curriculum review and design processes:

  • Leaders and teachers had developed guidelines for mathematics programmes and were assessing students’ achievement and progress.
  • Curriculum design focused mainly on coverage of, and time allocations for, the mathematics strands rather than on what leaders and teachers knew about students’ progress and achievement, particularly for those learners below or well below the mathematics standards.
  • These schools had all the assessment data they needed to develop a curriculum that responded to the needs of priority learners.
  • They did not necessarily have the confidence to move away from using predetermined long-term curriculum plans.

Many students will achieve and progress in schools that use the same programme outline each year. However, priority learners benefit from a more responsive curriculum.

Just over one-third of the schools had considerable work to do in reviewing and enacting their mathematics curriculum to respond to students, particularly learners who were achieving below or well below the standards in mathematics.

For the schools (six percent) that had made little or no progress, fundamental barriers had to be addressed. These included retaining staff, building capacity to engage in curriculum design and review, and implementing high quality mathematics programmes for all students. A lack of effective curriculum leadership was also an impeding factor. Until these issues are addressed, students in these schools are unlikely to make the necessary progress.

In many schools, leaders and teachers focused largely on the mechanics of the curriculum such as the coverage of mathematics, the percentage of time spent on number and other strands, and on developing planning formats. Such activity precluded teachers from focusing on the broader issue of what constitutes powerful and responsive teaching in mathematics. Most teachers followed their school’s guidelines for teaching mathematics that often described the topics and/or strands that were to be taught. This predetermined or prescriptive curriculum did not always match the identified strengths, interests and learning needs of the current group of learners.

The implications of this for priority learners are profound. These students are already at risk because their achievement and progress are behind those of their peers. An unresponsive curriculum places them at even greater risk of failure and disengagement from school.

To improve current practice, schools must take a much more responsive and innovative approach to designing and enacting the curriculum. This means including rich content relevant to the diverse range of students in their schools. Teaching approaches need to be sufficiently attuned to the learners.

This reiterates ERO’s recommendation in its August 2010 report Working with the National Standards within The New Zealand Curriculum which stated:

Ongoing review and design of each school’s curriculum in relation to The New Zealand Curriculum is crucial to ensuring that teaching and learning programmes are responsive to what schools know about students’ progress and achievement against the National Standards. Schools need support to implement robust self review that enables them to make their curriculum responsive to all students.

The findings of this current evaluation show that schools continue to need considerable support in this area.

Part B: Use of achievement information

Context for the findings

The New Zealand Curriculum states: ‘The primary purpose of assessment is to improve students’ learning and teachers’ teaching as both student and teacher respond to the information that it provides. With this in mind, schools need to consider how they will gather, analyse, and use assessment information so that it is effective in meeting this purpose.’ [7]

Collecting, analysing and using achievement information should be part of a well-considered school strategy to improve learning and teaching. The following diagram illustrates how information can be used to contribute to improving outcomes for learners at all levels of the system.

Figure 2: The use of achievement information [8]

Figure 2 is a diagram showing outlines of four people, representing teachers and students; a box with an arrow pointing to them reads ‘How well were teachers and students using achievement information?’ They stand on a tower of five circular stacks which are, from top to bottom: information for learning (student and teacher); information for future learning and partnership (next teacher, employer, parent, family and whānau); information for school review and development (leadership team); information for governance (board of trustees); and information for stewardship (Ministry of Education). A box pointing to ‘information for governance’ reads ‘How well were trustees using achievement information?’ and a box pointing to ‘leadership team’ reads ‘How well were school leaders using achievement information?’

In this evaluation, ERO focused on how students, teachers, leaders and trustees used information about students’ achievement in mathematics (based on the mathematics standards) as part of their specific learning, teaching, management and governance roles and responsibilities.

Students’ involvement in goal setting, talking about their learning and knowing about their next steps are crucial to their success as learners.

The deliberate use of achievement information by teachers enables them to respond through their planning, make decisions about teaching strategies to use and determine how they involve students in their learning.

Leaders, on the other hand, need systems and processes to gather, collate, analyse and use information about students’ progress and achievement, and about teacher capability, in order to improve teaching and learning.

Boards of trustees rely on timely, relevant, well analysed progress and achievement information, including National Standards information, which enables them to identify needs, trends and patterns that can inform decisions for future planning.

What did ERO ask?

To what extent is achievement information in relation to the mathematics standards used by:

  • trustees to inform governance decisions
  • school leaders to inform curriculum decisions
  • teachers to inform their teaching
  • students to inform their next steps in learning?

ERO’s findings on the use of achievement information to report to parents have been published in a separate report. [9]

What did ERO find?

The use of achievement information in relation to the mathematics standards by trustees, leaders, teachers and students was variable.

Figure 3 shows that in about one-quarter of schools achievement information was well used by trustees, leaders and teachers. Students’ use of information was weaker with achievement information being well used by students in only seven percent of schools.

Figure 3: Use of achievement information by trustees, leaders, teachers and students

Figure 3 is a bar graph titled ‘Use of achievement information by trustees, leaders, teachers and students’. The x-axis has four labels: Trustees, Leaders, Teachers and Students. The y-axis ranges from 0 to 100 percent at intervals of 20. Each bar is split into five sections denoting ‘well used’, ‘some use’, ‘limited use’, ‘not used’ and ‘mathematics standards information was not available for use’. The figures are, respectively: trustees 25%, 24%, 21%, 9% and 21%; leaders 27%, 25%, 24%, 8% and 16%; teachers 27%, 25%, 23%, 7% and 18%; and students 7%, 21%, 31%, 12% and 29%.

ERO’s findings about the extent to which achievement information was used by students, teachers, leaders and trustees are shown in Table 2 as a ‘continuum of use’.

Table 2: Use of mathematics standards achievement information by trustees, leaders, teachers and students


Trustees

Well Used

  • Boards received good quality information regularly from school leaders, and were active and engaged – independently questioning the data and seeking to further their own understanding.
  • They used the data to inform resourcing decisions, which were targeted and responsive to areas of need.
  • Boards also used the information to set appropriate targets to raise achievement and to align these with strategic goals.
  • Robust self-review processes were evident.

Some Use

  • Boards received information of varying quality from school leaders.
  • Generally they were not proactive in their approach to understanding the data, and were more reliant on leaders to guide them.
  • Trustees used the data to inform resourcing decisions and to set targets, but these were not as well developed as in boards in the Well Used category. For example, targets were not challenging enough and resourcing was not specific to priority learners.
  • Some self review was evident.

Limited Use

  • The information boards received from leaders was often of poor quality.
  • Generally trustees did not question the information or show a high level of understanding of what it was saying.
  • Boards in this category showed a less sophisticated understanding of the National Standards.
  • They responded to data largely through resourcing decisions, which varied in their efficacy.
  • Data was not used to inform strategic planning, and many boards did not use data for target setting.
  • There was limited evidence of self review.

Not Used

  • Boards received some information from school leaders, but this was not analysed and, in some cases, ERO had concerns about the validity of the data.
  • Boards in this category showed no evidence of considering the information in depth, or of using it to inform resourcing decisions, strategic planning or target setting. This was sometimes due to a paucity of information, and sometimes due to a lack of board capability.
  • There was no evidence of self review.

School leaders

Well Used

  • Leaders regularly collected and presented comprehensive student achievement information across all strands of mathematics.
  • Information was analysed to show progress over time and to assess the efficacy of interventions.
  • The information was used to inform decisions about professional learning and development (PLD) and the curriculum, to allocate additional staffing, and to set targets.
  • The information was used as part of school self review.

Some Use

  • Leaders collected and presented student achievement information, but this was more variable in quality, regularity and comprehensiveness.
  • Not all leaders tracked progress over time.
  • The progress of priority learners was not always reported on.
  • Self review was limited.

Limited Use

  • Leaders collected information, but this was often of poor quality.
  • There was limited use of information to inform PLD or to review assessment practices.
  • Many leaders in this category were still developing their understanding of the mathematics standards.
  • Information was mostly used to identify students who needed support.
  • Some targets were set, but often they were too general to measure or not challenging enough.
  • Very little evidence of self review was found.

Not Used

  • Most leaders had not collected and analysed the information.
  • In many cases ERO had concerns about the validity of the data, or the robustness of the overall teacher judgements (OTJs).
  • Data was not used to inform target setting or to identify professional learning and development priorities.

Teachers

Well Used

  • Teachers collected high quality data from a range of sources to inform their OTJs.
  • This information was used to plan programmes and identify teaching strategies.
  • They focused on the learners requiring additional support.
  • Teachers showed a commitment to, and understanding of, teaching as inquiry.
  • They provided regular opportunities for students and their parents, whānau and aiga to take part in learning conferences and goal setting in relation to the mathematics standards.

Some Use

  • Teachers were making OTJs based on data, but these were less reliable than those in the Well Used category because they were based on fewer sources, or related only to number and not to all strands of mathematics.
  • There were some responsive practices, but the quality of practice varied among teachers within each school.
  • Some teachers were using teaching as inquiry processes.
  • There was some regular reporting to parents, whānau and aiga, but less evidence of their involvement in their child’s goal setting and in supporting their learning.

Limited Use

  • Teachers had less understanding of the mathematics standards and were less confident making OTJs.
  • Achievement information often related to the number strand only.
  • Schools in this category also showed wide variability in the extent to which teachers in each classroom used information.
  • Although achievement information was used to group students, it was not used to identify deliberate teaching strategies for each of these groups.

Not Used

  • Teachers were either making minimal use of assessment information, with no clear link to the mathematics standards, or making no use of assessment information to inform their planning and practice.

Students

Well Used

  • Teachers had explained the mathematics standards to students.
  • Students were therefore able to use assessment information to reflect on their own learning.
  • Students could talk about where they were in relation to the standards and about their next steps.
  • Students took an active role in goal setting and participated fully in learning conferences along with teachers and parents, whānau and aiga.
  • Students were well supported by teachers to understand their achievement.

Some Use

  • These schools had some of the characteristics of the Well Used category, but with marked variability in the extent to which they shared achievement information with learners.
  • In particular, students were less likely to understand their next steps, be engaged in effective goal setting, or have a clear understanding of the mathematics standards.
  • Older students were more likely than younger students to use assessment information to understand what they had learnt and what they needed to focus on next.

Limited Use

  • In addition to the factors identified in the Some Use category, these students were less likely to be able to talk about their learning.
  • They were also less likely to find out about their progress and achievement directly from their teachers; such information was often sourced entirely from their school reports.

Not Used

  • Students were not aware of how well they were achieving in relation to the mathematics standards, or informed about their next steps for learning.
  • They had limited or no knowledge of the standards.
  • In some cases teachers did not share information with students.
  • Students were not involved in activities such as goal setting or learning conferences.

Schools where achievement information in relation to the mathematics standards was not available for use

As shown in Figure 3, in up to 29 percent of schools, achievement information in relation to the mathematics standards was not available for use by one or more of these groups (trustees, leaders, teachers and students).

Of the 29 percent of schools:

  • Achievement information in relation to the mathematics standards was not available for use by trustees, leaders, teachers or students in 13 percent.
  • In the remaining 16 percent, availability of achievement information in relation to the mathematics standards varied. For example, in some of these schools, teachers and/or leaders had information about achievement in relation to the mathematics standards, but it was not being reported to the board of trustees. In other schools, teachers were making OTJs in relation to the mathematics standards that were available for leaders to use, but they were not using the information in their teaching or sharing it with students.
  • Overall, students were the largest group (29 percent) to not be using information about their achievement in relation to the mathematics standards. In some of these schools, students were aware of and using achievement information, but it was not in relation to the mathematics standards.

What are the implications of ERO’s findings for priority learners?

The variability in use of achievement information identified in this evaluation raises questions about how well trustees, leaders and teachers understand the purposes of assessment and the principles of assessment for learning.


Trustees

Trustees rely heavily on school leaders providing timely and useful information that can be used to make evidence-based resourcing decisions. In almost three-quarters of the schools, boards were receiving information about achievement in relation to the mathematics standards. However, the quality of the data and the extent to which trustees used the data varied.

Trustees need to be able to understand what the information is telling them and to ask relevant questions that help set targets and resource actions to accelerate the progress of learners who are not achieving well. The variability in the use of data by trustees means that in some schools boards do not know whether they are allocating funds to the right programmes and initiatives, or to the most useful teacher professional development. Further, when trustees do not receive information about the progress of identified learners, they are unable to determine whether allocated funding has resulted in the desired gains for the learners who needed to make the most progress.


School leaders

School leaders play a critical role in supporting teachers, trustees, students and their parents to use achievement information to improve learning. Leaders establish school-wide guidelines for how assessment information will be collected and used.

In the schools ERO identified as using achievement information well, leaders put in place systems and clear expectations so that data collected was used by teachers, trustees and students. In many of the other schools, teachers collected data but it was not used to its full potential. For example, data was used to identify learners’ achievement but was not used to review and develop the school’s mathematics curriculum or to identify the most successful teaching practices. Teachers often invested considerable time and energy into assessment activities. School leaders need to ensure that the information gained from such activities is used to the fullest extent to benefit learners.


Teachers

The use of achievement information by teachers to inquire into their practice and inform their teaching decisions is essential to effective teaching. Some teachers were using the achievement information they collected to modify their programmes and to discuss progress and possible goals with individual students.

ERO’s findings suggest that teachers need to move beyond using achievement information for grouping students and put more of a focus on inquiring into the effectiveness of their teaching strategies in terms of what works and what does not.


Students

Directions for Assessment in New Zealand highlights the importance of students being at the centre and notes ‘all our young people should be educated in ways that develop their capability to assess their own learning.’ [10] ERO’s findings in both this report and previous evaluations suggest that this remains a challenge for many schools.

Not all students get the opportunity to develop both the capability and motivation to assess, interpret and use information in ways that affirm and further their own learning. Students rely on their teachers to encourage and support them to take an active role in assessing their learning.

Part C: Accelerating the progress of learners below and well below the mathematics standards

Context for the findings

The literature on schooling improvement provides useful insight into how students’ progress and achievement can be accelerated.

According to Lai et al [11] acceleration is:

  • a rate of progress faster than the cohort to whom [an] individual belong[s]
  • faster than the expected/normal rate of progress so that the [resultant] changed distribution comes to match an expected distribution [a]
  • made similarly by different subgroups within the total [target] group
  • sustained for at least two to three years.

Alton-Lee notes that ‘accelerated improvement requires a whole system to function as a collaborative learning community that is advancing progress on the four areas of leverage: pedagogy, educationally powerful connections, professional learning and leadership.’ [12]

McNaughton and Lai (2009) [13] assert that teachers who successfully improve students’ literacy learning are knowledgeable and responsive in their approach to accelerating the progress of priority learners. Successful teachers draw on deep content knowledge, pedagogical content knowledge, and knowledge of students to “selectively and strategically apply known instructional procedures [and] are constantly refining and changing to be more effective.” Along with the deep content knowledge, there is a disposition amongst expert teachers to be innovative and to problem solve in the pursuit of better teaching and learning.

Neill, Fisher and Dingle (2010) [14] identified factors that contributed to the accelerated progress of low performing students in the pilot of the Accelerating Learning in Mathematics (ALiM) programme. These included:

  • regular lessons that focused on gaps in knowledge, and on building students’ use of mathematics strategies, language and memory
  • opportunities for collaborative and peer supported learning that built students’ confidence and sense of self efficacy
  • activities that engaged students and were pitched at an appropriate level of challenge
  • teachers with the necessary pedagogical content knowledge and disposition to be reflective and responsive to students
  • coherence between ALiM and the overall school curriculum
  • high levels of support from students’ families.

As part of its Better Public Services programme the Government has set 10 targets to be achieved over the next five years. One of these is that, by 2017, 85 percent of 18-year-olds will have achieved the National Certificate of Educational Achievement (NCEA) Level 2 or an equivalent qualification. For this to be achieved, a concerted effort is needed at both a system and a school level to provide the necessary interventions and support to accelerate the progress of those students who are currently working below or well below the National Standards.

What did ERO ask?

How are teachers accelerating the progress of learners who are below or well below the mathematics standards?

What did ERO find?

Generally, schools were very good at identifying learners in Years 4 to 8 who were achieving below or well below the mathematics standards. Most schools did this through school-level collation and analysis of data, and through teachers’ classroom-focused analysis of achievement information. The exceptions were a few schools that had no achievement data, had data that did not relate to the mathematics standards, or identified only students below, and not well below, the mathematics standards.

What happened for these students once they were identified did not necessarily accelerate their progress. This was partly because of a lack of understanding by leaders and teachers about what ‘accelerated progress’ actually means and partly because leaders and teachers did not know how to accelerate progress in mathematics.

ERO’s findings indicate that schools were tending to adopt a ‘business as usual’ approach to accelerating the progress of identified learners. There was more of a focus on giving learners some support than coming to grips with what it meant to accelerate progress, how to do this in relation to mathematics, and how to gather evidence about what does and does not work.

The most common approach to supporting learners who were below or well below the mathematics standards was grouping students for teaching. In many schools, teachers grouped students as part of the regular classroom mathematics programme. Some used cross grouping between classes and across year levels, based on data about students’ achievement in mathematics. Other responses included differentiating planning for individual students or groups of students, and using commercially available resources.

In at least half of the schools, teacher aides were working with learners who were below or well below the mathematics standards. They worked with individual students or small groups. Support was provided either in class or by withdrawing students, or a combination of both. Generally teacher aide support was undertaken under the guidance of the classroom teacher. The exception to this was where teacher aides had oversight of students engaged in independent learning activities, while the teacher worked with individual students or groups needing support.

A few schools took a more systematic and deliberate approach to accelerating progress. In these schools, leaders were questioning their previous use of resources such as teacher aides and exploring alternative solutions. Some were also exploring different teaching strategies, drawing on research and their own evidence of what works to better support learners.

ERO found minimal evidence of schools using robust self-review processes to inquire into and evaluate the impact of support programmes, initiatives and strategies, particularly where the intent was to accelerate learner progress.

What are the implications of ERO’s findings for priority learners?

ERO is concerned that schools are continuing to use a range of programmes and initiatives with little or no evaluation of the impact for students involved in them. These findings reiterate those in ERO’s 2008 national evaluation report: Schools’ Provision for Students at Risk of Not Achieving which stated:

ERO found that the majority of schools could adequately identify students at risk of not achieving, particularly in the areas of literacy and numeracy. There was a much wider variation in the quality and effectiveness of how schools addressed the specific needs of students, and monitored, reviewed and reported on the progress and impact of their provision. In particular, nearly half the schools in this evaluation needed to improve the way that they monitored and evaluated their initiatives or interventions. [15]

As noted in 2008, leaders and teachers generally know which learners need additional support and take steps to provide this. However, the issue lies with the nature of the support, the sense of urgency with which it is provided, and its effectiveness at accelerating learner progress.

Many trustees, leaders and teachers do not have a clear understanding of what ‘accelerated progress’ means for learners in their school. The expectation that something different may need to happen for identified students to ‘accelerate’ their progress is not widely held or understood.

The prevalence of teacher aides working with identified learners (particularly those below or well below the mathematics standards), either through in-class support or in withdrawal programmes, is highlighted in this evaluation. ERO found that the use of the least qualified adults to work with the learners who need the most expert teaching is accepted practice in many schools. The findings also highlight practices whereby learners are grouped according to ability, in class or across classes, as a strategy for accelerating the progress of identified students.

Alton-Lee [16] refers to research showing that ‘business as usual’ teaching, among other things, can do harm in education. She notes that “resources can be allocated in ways that exacerbate disadvantage through practices such as fixed ability grouping, streaming, grade repetition and the allocation of the least qualified teachers or teacher aides to work with the lowest achievers or students with special needs.”

Compounding this issue is the finding that schools do not have evidence that practices such as cross grouping and the use of teacher aides actually lead to accelerated progress. ERO’s 2008 report about what was happening for students at risk of underachieving noted that:

Given the significant investment that many boards of trustees make when employing staff such as teacher aides and other additional personnel, schools need to be clear about why they choose particular options. Trustees need regular information about the use of additional staffing, and the impact that resources and programmes have on students at risk of not achieving. Boards need this information to determine the effectiveness of their investment to make decisions about the future resourcing. [17]

Four years on from the 2008 report, this remains the case. The findings of this evaluation highlight the need for improved monitoring, reporting and evaluation of the ways in which schools are using resources to accelerate the progress of identified learners. Given the significant investment which schools are making to raise achievement for priority learners, there needs to be more robust self-evaluation of the effectiveness of resourcing decisions.