Section Two: Evaluation processes and evaluative reasoning

To achieve equity and promote excellence for all learners, internal evaluation must involve both good processes and good evaluative discussion. In schools with effective internal evaluation, there were different points of view about what the data was saying, about issues and successes that affected students' learning, and about what teachers might do next. Leaders, teachers and trustees did not simply go through the evaluation process as a series of discrete steps. They asked good questions; collected, analysed and made sense of good data; and reasoned clearly and robustly about why and how their chosen response would result in the changes necessary for improvement.

This diagram is a flow chart. In the middle is a circle which reads: Learner focused evaluation processes - We can do better. Surrounding this are five circles joined by a continuous line, with two-way arrows linking the centre circle to each of them. From the top, going clockwise, the circles read: Monitoring and evaluating impact; Noticing; Investigating; Collaborative sense making; and Prioritising to take action.

Learners are at the heart of these processes, providing a lens through which schools:

  • investigated and scrutinised practice
  • analysed data and used it to identify priorities for improvement
  • monitored and evaluated their improvement actions, and
  • generated timely and useful information about progress towards goals and the impact and outcomes of actions taken for all learners in their school.

Placing learners at the heart of review and decision-making means: 

  • trustees scrutinising the work of their school in achieving valued outcomes for learners
  • inviting student participation in improvement efforts by talking to them, responding to their concerns and seeking their input into the decisions that affect them
  • leaders and teachers developing learner-centred relationships to engage and involve the school community
  • checking that students have effective, sufficient and equitable opportunities to learn.


Noticing

Often the catalyst for internal evaluation, especially evaluation that was emergent rather than planned or strategic, was 'noticing' what was happening for learners. Leaders, teachers or trustees noticed something that caused them to pause and think. Often this was accompanied by questions such as:

  • What is happening here?
  • Is this what we expected?
  • Should we be concerned?
  • Do we need to take a closer look?

In these schools, there were always many eyes scanning for potential issues for students, and a variety of ways in which teachers, leaders and trustees knew further investigation was needed.

The most commonly cited catalyst was student achievement data such as NCEA, National Standards, or other assessment information regularly gathered by teachers and leaders.

Other formally collected data provided catalysts too. Sources included:

  • observations of, and reflections on, teaching in classrooms
  • regularly scheduled surveys of students, staff or parents and whanau
  • teacher reflection, either individually or as part of learning groups
  • regular focus groups with students
  • meetings between the principal and teachers, or between the principal and parents
  • parental complaints, and data from pastoral systems (e.g. restorative sessions).

Alternatively, the catalyst came from a more informal source, such as:

  • conversations 'at the school gate'
  • hunches or gut feelings
  • anecdotal evidence
  • informal feedback from other schools.


Investigating

School leaders and teachers sought to obtain a more complete picture of what was happening and why before making any decisions about what and how to improve. The investigation focused on finding out what was currently happening in the school, and on examining relevant research evidence and good practice guidelines about what effective practice looks like.

By investigating together, leaders and teachers had shared understandings and owned the process and the findings. 

Data was collected over and above what was routinely collected. Leaders and teachers were clear about what data would provide sufficient evidence to understand the issue or problem. Schools asked questions such as: 

  • What do we already know about this?
  • What do we need to find out?
  • How might we do this?

Trustees, leaders, teachers, students and whanau had knowledge, beliefs and attitudes they could apply to understanding the issue.

It was important not to assume what these were ahead of time. Internal evaluation took into account the different ways in which participants could contribute, and tailored data collection methods to suit. 

Leaders and teachers used a wide range of data collection approaches. Methods included focus groups, interviews, planning checks, classroom observations and reflecting on samples of student work. The perspectives of students, parents and teachers were often sought through questionnaires or discussion opportunities.

Some of the schools found that video was a useful way to collect data. Having video evidence made it possible to repeat observations and notice things that were not initially obvious or to look for change. The use of video also allowed teachers to share their teaching strategies and approaches with one another in a professional learning context where capability building was a key focus.

Investigating what 'good' looks like was also part of the process. Teachers and leaders pulled together what they already knew about what they were investigating. This enabled them to identify gaps in their knowledge.

Further sources of evidence included research literature, external experts, other schools, Ministry publications like the Best Evidence Syntheses and ERO's School Evaluation Indicators. Sound evaluative reasoning helped to ensure a match between the school's context and the kinds of evidence that they drew on to identify what 'good' looks like. They did this by investigating the kinds of practices that were likely to make the most difference for all the learners in their school. They also investigated whether the improvements achieved were good enough in terms of the school's vision, strategic direction and their priorities for equity and excellence. Leaders and teachers could then make defensible judgements about valued student outcomes.

Collaborative sense making 

To make sense of the data gathered, leaders and teachers went from asking "what is happening here?" or "what is so?" to asking "so what?" Investigating and sense making were not totally separate processes. The process of analysis began when the first data was collected. Sense making could and did inform the direction of further data collection or research. Investigating and sense making were iterative and interwoven.

In these schools leaders, teachers and trustees understood that data often provided an incomplete representation of a more complex underlying reality. They were able to evaluate the quality of the data they had collected, and analyse and scrutinise it well. Some data were quantitative, like test scores; and some were qualitative, like classroom observations or survey responses. Both forms of data were valuable, and leaders and teachers understood the strengths and limitations of each. Many of the schools had a staff member with expertise in data collection and analysis. That person was working on building the capability of others at the school to understand and use data.

Leaders and teachers worked together to interpret the data and often reported what they had found to other staff and trustees, sharing their insights and testing to check the adequacy of the interpretations that they had made. Making sense of the data involved asking questions such as: 

  • What is our data telling us?
  • What insights does it provide?
  • Is this good enough?
  • What might we need to explore further?

After investigating and making sense of the issue or problem, schools were clear about where their strengths were, and where they needed to improve. This understanding usefully informed their response.

Prioritising to take action

Leaders and teachers carefully prioritised actions in order to plan for change in practice. Any response incurs a resource cost of some kind, so at this stage leaders asked:

  • What do we need to do and why?
  • How big is the change we are planning?
  • What strengths do we have to draw on?
  • What support might we need?

Leaders were clear about what capability and capacity they had, and what support they would need. They recognised that in an environment of limited resources, not all avenues can be explored at once. They were able to draw on relevant expertise to support the change they wanted to make and rejected any professional learning and development opportunities that would distract them from the agreed changes.

Most improvement actions focused on providing well-targeted, timely professional learning and development opportunities to support improved teaching practice. In many cases, collaborative professional learning groups that had participated in the evaluation process were continued.

Other responses included changes to curriculum design, assessment practices and expectations, or performance management processes.

Planning for how change would be managed was closely linked to the evaluation findings. Leaders and teachers were clear about what success for students would look like and how they would know whether or not the actions taken were working.

Monitoring and evaluating impact 

Internal evaluation did not end with the implementation of improvement actions. Monitoring the impact of any changes made was crucial. This stage focused on questions such as:

  • What is happening as a result of our improvement actions?
  • What evidence do we have of progress?
  • Is this good enough?
  • Do we need to adjust what we are doing?
  • What are we learning here?

Ongoing noticing, investigation and sense making enabled leaders and teachers to see whether what they were doing was having the desired result. Adjustments or further changes were sometimes needed. Where things were working well, ongoing monitoring provided opportunities to recognise and celebrate success.