Findings

This section presents the findings from the key evaluative questions and from services’ self reporting. The findings take into account the ways in which early childhood services support educators to undertake assessment of children’s learning. Examples of evaluative comments from review officers (in boxes) are included to provide further information on effective practice for early childhood services.

Self reporting: services’ support for assessment

ERO initially gathered self-reported information from services about the support provided to educators to undertake assessment, their registration and qualifications, and the volunteers involved with the service. This information provides a background to the key findings of the overall evaluation.

Professional development

Almost two‑thirds of services (64 percent) reported that educators had undertaken professional development in relation to assessment practices in the previous three years. The most common areas were professional development in Kei Tua o te Pae (32 percent), learning stories (18 percent), and in-house professional development specific to the service philosophy or type (12 percent).

Almost half the kindergartens had participated in professional development for Kei Tua o te Pae, compared to less than a third of education and care services, and only 10 percent of playcentres. Over a third of playcentres and education and care services had had no professional development in assessment. Two‑thirds (66 percent) of rural services had not received professional development in Kei Tua o te Pae. Among rural services, playcentres were the least likely to have had this professional development.

ERO found that services that had participated in professional development to support the implementation of Kei Tua o te Pae were more likely to be effective across all of the five evaluative questions than services that had not participated in this particular professional development. These findings were statistically significant. [12]

Registration and qualifications

Table 1 shows that:

  • Nearly three‑quarters of kindergartens had educators who were either all fully registered with an early childhood education (ECE) qualification, or a mix of fully and provisionally registered, ECE qualified educators.
  • Almost all education and care services had a mix of ECE qualified educators (registered or not registered) and educators with a non-ECE teaching qualification.
  • Almost all rural services (91 percent) had a mix of ECE qualified educators (registered or not registered), educators with a non-ECE teaching qualification, or educators with playcentre-based qualifications.
  • Only nine percent of rural services had educators who were all fully registered, or a mix of fully and provisionally registered, ECE qualified educators.
  • All playcentres had some educators with playcentre-based qualifications.

Table 1: Educators’ registration and qualifications

Registration and qualifications | % of kindergartens | % of playcentres | % of education and care services
All fully registered and ECE qualified | 46 | 0 | 1
All fully or provisionally registered and ECE qualified | 26 | 0 | 10
Mix of ECE qualified and registered, ECE qualified but not registered, other teaching qualification | 28 | 35 | 89
Playcentre qualifications | 0 | 65 [13] | 0
Total | 100 | 100 | 100

Table 2 shows that two‑thirds of education and care services had educators who were currently undertaking an ECE qualification, compared to 18 percent of playcentres and nine percent of kindergartens.

Table 2: Services with educators undertaking ECE qualifications

  | % of kindergartens | % of playcentres | % of education and care services
Educators undertaking ECE qualifications | 9 | 18 | 66

ERO found that services with fully or provisionally registered ECE qualified teachers were more likely to have good quality assessment practices across all five of the evaluative questions than services where educators had a mix of registrations and qualifications or playcentre qualifications. However, services with a mix of registrations and qualifications were more likely to have good quality assessment practices across all five of the evaluative areas than services where educators had playcentre qualifications. These findings were statistically significant. [14]

Time available for assessment, planning, and evaluation

Almost all services (89 percent) provided time and/or support for educators to assess children’s learning, plan the programme, and evaluate its effectiveness. The extent of this time and support varied greatly among services. Educators at many services had regular meetings; some of these were weekly, others fortnightly or monthly. Similarly, in many services educators had regular non‑contact time, but this too varied from less than two hours per week to two afternoons per week. A small number of services (three percent) had informal non‑contact time if the ratio allowed for it, and four percent had no non‑contact time. Educators in over half of playcentres reported that they used their own time at home to assess, plan, and evaluate. A fifth of services provided further time for planning, and 16 percent provided professional support (mentoring and guidance) to help educators plan and assess.

ERO found that kindergartens and education and care services were the most likely to provide time for meetings and regular non‑contact time. Kindergartens were the most likely to provide professional support for their educators.

Resources dedicated to assessment

Almost all services (96 percent) had resources dedicated to supporting assessment practices. These included:

  • computers, laptops, and printers;
  • digital cameras;
  • other Information and Communications Technologies (ICT) equipment, for example data projector, dictaphone;
  • early childhood exemplars (Kei Tua o te Pae) and other Ministry of Education documents; and
  • portfolios, profiles, and templates.

Playcentres were less likely to have computers, laptops, digital cameras, or other ICT. Kindergartens were more likely to have an administrative person available to help with assessment, for example, monitoring how regularly assessment was undertaken for individual children.

Volunteers

About three‑quarters of services (73 percent) had volunteers or other non-teaching staff regularly involved in the day-to-day activities of the service. Education and care services were the least likely to have any volunteers or other staff involved (36 percent) or parent help (19 percent). Kindergartens were the most likely to have a teacher aide (37 percent) and administrative support staff (24 percent). Sixty‑one percent of kindergartens also had parent help in their service.

Key evaluation questions

The key evaluation questions were investigated during on-site reviews in early childhood services.

Assessment policies and practice

How well do educators develop and implement assessment policies and practice for the service?

Each early childhood service is required to have a philosophy statement that expresses the beliefs, values and ideals that guide the practice of the service. [15] Although there will be common elements, services may have different approaches to children’s learning and assessment that reflect their philosophy.

Sound policies and practice guide early childhood educators in undertaking assessment of children’s learning and development that reflects the service’s philosophy. Assessment is used to support the provision of good quality learning experiences.

Open communication between early childhood services and parents and whānau ensures that information is shared which can enhance assessment and learning. Discussions between educators and parents can make children’s learning more apparent to parents, and can also explain the purpose of assessment activities.

ERO evaluated how well educators developed and implemented assessment policies and practice for their service in relation to the evidence that:

  • the service’s philosophy was reflected in the assessment practice;
  • there was a shared understanding of the purposes and intent of assessment;
  • assessment practice was based on sound research;
  • assessment practice incorporated input from appropriate people; and
  • effective strategies in the service supported assessment practice.

Philosophy and assessment practice

Assessment practice in early childhood services should be aligned with the individual service’s philosophy.

ERO investigated how well each service’s philosophy was reflected in its assessment practice, and the extent to which educators’ beliefs about learning reflected the service’s philosophy.

In about two‑thirds of services, the focus of the philosophy was strongly reflected in assessment practices. In these services both philosophy and assessment practice emphasised educators’ beliefs about learning, including:

  • learning through play;
  • interactions;
  • parent participation;
  • valuing children’s interests and knowledge;
  • a child-centred approach; and
  • increasing the child’s voice.

In many of these services, the philosophy made direct reference to the importance of educators noticing, recognising, and responding to children’s learning and development. Services’ philosophies recognised that children were actively involved in their own learning and development. Educators responded to children’s interests, strengths, experiences, and conversations, and sought to increase parent participation in assessment. Where this was a particular strength, parents were involved, alongside educators, in reviews of philosophy and assessment practice.

The philosophy stated that children would learn through play, that their interests would be extended and that children would be treated as competent and confident learners. The assessment practices reflected the philosophy, with observations and anecdotal notes of children at play documented and shared by all members of the teaching team each day, in order to challenge and provide ongoing opportunities and experiences for learning. The philosophy of a partnership approach to learning with parents was also evident. Portfolios were sent home as each learning story was completed. Information about children’s interests, strengths, likes and dislikes at the service was shared by the educators and in return parents shared anecdotal information from home, which together created a holistic view of the child’s knowledge, skills and understanding. [16]

For the remaining third of services, assessment practice did not reflect or support their philosophy.

Although most of these services had a stated philosophy that focused on children’s holistic development, learning through play, and partnerships with parents, this philosophy was not always evident in assessment practice. Educators’ observation and assessment of children’s learning was informal, lacked rigour, and did not meaningfully show children’s interests, abilities, and skills. Assessment lacked knowledgeable analysis, and educators’ perception of how and what children learnt did not clearly link to the service’s stated philosophy.

Where ERO found very poor practice, the service’s philosophy did not guide assessment practice in any way. In some services, even the programme, when in action, did not reflect the philosophy. This was the case, particularly, in services where external facilitators, an umbrella organisation, or senior management had developed the philosophy without consultation, and this philosophy was not embedded in the educators’ understanding or practice.

Shared understanding of assessment

When educators have a shared understanding of the purposes and intent of assessment, practice is more likely to be well understood, consistent, and result in positive outcomes for children. ERO investigated the extent to which educators within each service had a shared understanding of, and discussed and reflected on, assessment of children’s learning.

Educators in over half of the services had a shared understanding of the purpose and intent of assessment. In these services there were clear expectations for assessment, including a documented assessment process that was recognised and implemented. Where this was a particular strength, services had an ongoing process for reviewing their planning, assessment, and evaluation practices.

These services provided educators with support such as professional development in assessment, as well as time to discuss and reflect on children’s learning. These meeting times were both formal (regular meetings) and informal (for example, during children’s sleep time). Educators discussed what information they had gathered about children’s learning, and why. They also reflected upon how to achieve positive learning outcomes for children as a response to assessment.

The service had indepth professional development with an external facilitator, which had resulted in changes to its assessment and planning. Assessment practices were meaningful, manageable, and child focused. Analysis of learning was recorded alongside extension ideas. As a result of professional development, educators were developing a collective understanding of assessment and had systems to continue to develop this understanding.

In just under half the services, educators lacked a shared understanding of the purposes and intent of assessment and there was little collaboration on assessment and children’s learning.

Many of these services experienced high staff turnover and had many new or unqualified educators on the team. This meant there was little consistency in assessment. In some services only one or two educators had any knowledge of the purpose of assessment and this was often not shared with the rest of their team.

In other services, educators could articulate some understanding of the purpose and intent of assessment, but this was not demonstrated in assessment records, reflective journals or minutes of meetings. In some services, while an understanding was apparent amongst educators, this was not supported by service-based expectations, assessment policies, and clear guidelines for assessment. A lack of professional development meant that educators were not given help to increase their knowledge and the quality of their own and others’ assessment practice.

Research-informed assessment

A knowledgeable educator in an early childhood education setting is able to assess children’s learning in an informed and reflective way. ERO investigated the extent to which assessment was based on current early childhood theory, using key guiding documents, such as Te Whāriki, the Revised Statement of Desirable Objectives and Practices (the DOPs), and exemplars from Kei Tua o te Pae.

Almost two‑thirds of services had based their assessment practice on the key guiding documents. The intent of these guiding documents was reflected in assessment practice, through making children’s learning visible, acknowledging children’s dispositions, and reflecting the holistic nature of children’s learning and development. [17]

The DOPs and Te Whāriki underpinned the programme, and local and international research was linked to each aspect of the philosophy. Narrative assessment described children’s learning and their developing dispositions. Teachers’ own reflective research was guiding the development of sound assessment practice.

In about three‑quarters of these services, educators had undertaken professional development in assessment that had raised their levels of understanding of the theories and practice inherent in these guiding documents. Where ERO found very good practice, educators had regular and whole-centre professional development. This helped them to stay informed of current theories about assessment, and adjust their practice accordingly.

Just over a third of services had not based, or were only beginning to base, their assessment practice on current theories about assessment.

In most of these services, educators were beginning to use Te Whāriki, the DOPs, and, to a lesser extent, Kei Tua o te Pae exemplars to inform assessment. Although narrative assessment had been implemented this did not consistently illustrate children’s learning. Such narrative often described what teachers did, rather than reflecting on children’s learning. Educators’ perspectives of learning did not adequately recognise children’s learning dispositions, experiences, and interests. Some educators in these services had undertaken professional development in assessment, but this new learning had not yet resulted in effective assessment practice.

In a small number of these services, there was no meaningful link between key guiding documents and assessment practice. There was little theoretical understanding and any references to Te Whāriki were shallow and superficial. Educators made no use of the DOPs and Kei Tua o te Pae exemplars, and made no reference to children’s dispositions. Although some of these services were attempting narrative assessment, often directed by their association or management, there was no professional development to support this, and hence there was little or no understanding of current theories. Assessments were poorly written, mostly describing participation and activities. There was little analysis of children’s learning; instead this was mostly anecdotal comment that did not provide a basis for future learning.

Input from a diversity of people

The socio-cultural approach to teaching and learning recognises and takes into consideration the wider world in which children learn and develop. Educators consider the child as part of a family and community, and acknowledge the influence of society and its cultural values on children’s learning and development. Including the perspectives of children, peers, educators, families and whānau in assessment enhances children’s learning, and establishes links between the service and the home. ERO investigated how well services incorporated input from a diversity of people into assessment practice.

Input from children, parents and whānau, and all educators was well incorporated into assessment practice in just over half the services. Where ERO found particularly good practice, assessment also included the perspectives of other people involved in the children’s lives. There was celebration of children’s cultural background and recognition of whānau aspirations and values.

The voices of children were included in assessment. Educators recorded children’s own narratives, conversations, and explanations about their learning experiences, and those of their peers, supported by photographs and art work. Educators asked children about their learning and recorded this information, and allowed children to select what went into their portfolio or profile. Children developed awareness of their own learning.

Educators had also implemented strategies to include parents’ voices in assessment. These included guiding parents through questions, encouraging them to reflect on their child’s learning, and participating in discussions. Parents were also encouraged to share useful information about language and activities from home. In such ways parents became actively involved and were able to extend and support their child’s learning.

Children, teachers and parents had input into the learning stories. The service had an area for comments where appropriate people could contribute as learning developed. There were also areas for parents’ learning stories. Many parents contributed to these. When children travelled away from the service, parents and children were encouraged to record their learning with other family members, in other geographical areas, or with other cultures.

In most of these services, many educators contributed to assessment. Some children’s portfolios were the responsibility of one educator, but others also contributed their observations to many portfolios and profiles. In a few services, the voices of other people were visible in assessment. This included other children, educators in training, local iwi, teachers from the local school, visitors from the wider community such as dental nurses, fire fighters, police, and the children’s whānau such as grandparents and siblings. These contributions enriched and extended the recording and understanding of children’s learning experiences.

In just under half the services assessment practice did not include contributions from a range of people. The voice of the educators dominated assessment information. Some parent and child voices were captured, but this was limited and not useful enough to contribute to children’s learning or teaching practice. In most of these services, parents were asked to complete an introduction page about the child’s background and personal information. In some services educators had tried to include parents’ contributions, but often educators had not been able to convey an understanding of assessment so that the parents could understand the importance of their contribution, or provide useful input to learning. Any comments from children were often very descriptive and focused on the enjoyment of activities rather than recording their emerging learning.

Where ERO found very poor practice, the educator ‘voice’ was visible in assessment records, but very rarely did more than one educator comment on a child’s learning. Educators in these services either did not take up the opportunity to contribute to all assessments, or strategies such as non-contact time or meetings, to enable a range of contributions, were limited. Parents and children’s contributions were either limited or not apparent. Assessments were sometimes shared with parents, but there was no expectation that parents or children would contribute.

Strategies for assessment practices

Strategies for regular and inclusive assessment help educators implement and undertake assessment practice. ERO investigated the extent to which services had strategies and systems to support worthwhile assessment practice.

Almost two‑thirds of services had implemented strategies and systems that supported effective assessment practice. These services had expectations for assessment that were reflected in written guidelines for assessment practice. Educators in almost all the services had regular non-contact time, meetings about assessment, and ICT resources to support assessment practice. Services had guidelines to ensure that children’s learning was assessed regularly and that the content reflected the holistic nature of children’s learning and development. Strong professional leadership in these services gave educators robust feedback on their assessment practices.

Meetings allowed educators to reflect and discuss children’s learning. The coordinator encouraged educators to develop their own styles within certain criteria. This had resulted in more personalised learning stories and indepth observations of children’s learning. The reading, sharing, and discussion of learning stories were recorded in the planning journal. A set of guidelines and questions focused these discussions.

Systems to share assessment information amongst educators and with parents were highly evident and implemented effectively. Regular meetings and daily discussions gave educators opportunities to share observations and reflect on assessment. Where ERO found very good practice, services had folders that included examples of good assessment as guides. Profiles and portfolios were accessible to parents and they were able to take these home. Many services had daily communication notebooks in which educators and parents regularly entered information and feedback. Some services held presentations and information evenings to inform parents about children’s learning.

Conversely, over a third of services lacked strategies and systems to support assessment practice.

In most of these services, systems to guide educators were informal or, if written, lacked clarity. Although a few services did some recording of children’s learning, their assessment guidelines were not based on current good practice. Children were assessed as a group rather than as individuals, and assessment was not undertaken regularly. A few of these services had informal systems to share assessment information amongst educators and with parents, but these systems were often ineffective or not followed.

ERO found poor leadership in many of these services and a lack of higher‑level professional discussion. A few of these services did not have non-contact time or meetings for educators to discuss assessment, and thus relied on educators to record assessments of children’s learning in their own time. The services did not have effective strategies to ensure the regularity, content, format, or sharing of assessment information.

Overall quality of assessment policies and practice

Figure 1 shows that, overall, assessment policies and practice in a fifth of services (20 percent) were well developed and implemented. Assessment policies and practice were developed and implemented in 41 percent of services. In 34 percent of services assessment policies and practice were partially developed and implemented, and in five percent they were not developed.

Figure 1: Assessment policies and practice

This is a bar graph. The y axis shows the percentage of services and ranges from 0 to 100 at intervals of 20. The x axis has four labels: Well developed and implemented 20%, Developed and implemented 41%, Partially developed and implemented 34%, and Not developed 5%.

ERO found that regular and ongoing professional development and low staff turnover were key factors in educators’ development and implementation of assessment policies and practices. Where educators had participated in whole-staff professional development about assessment they were more likely to have an understanding of assessment of children’s learning. In services where educators had not undertaken professional development, or only one or two educators had, there was often a lack of shared understanding of assessment. This led to poor practice and limited strategies for assessing children’s learning and development. Low staff turnover contributed positively to consistency and understanding of assessment practice.

Figure 2 shows that 60 percent of education and care services, 37 percent of playcentres, and 76 percent of kindergartens had developed and implemented sound assessment policies and practices.

Figure 2: Assessment policies and practice by service type

This is a side bar graph. The y axis has three labels, and each bar is divided into four parts: Well developed and implemented, Developed and implemented, Partially developed and implemented, and Not developed. The first bar, Education and care services, shows 17%, 43%, 35% and 5% respectively. The second, Playcentre, shows 4%, 33%, 47% and 16%. The third, Kindergarten, shows 34%, 42%, 22% and 2%. The x axis ranges from 0 to 100 at intervals of 20.

Reflecting the four principles of Te Whāriki

To what extent does assessment practice reflect the four principles of Te Whāriki?

The valued outcomes of early childhood education vary from family to family depending on their cultural, educational, and religious beliefs, as well as their views on early learning. In New Zealand the early childhood curriculum, Te Whāriki, is underpinned by the concept of nurturing and promoting each individual child’s growing competence to communicate, participate, and learn about the world.

Socio‑cultural assessment is recognised in New Zealand as a collaborative enterprise, including children, parents, whānau, and educators. [18] Educators are expected to contribute to the development of children’s competencies by working in partnership with each child’s family. Feedback tells children what outcomes are valued and how they are doing. It also acknowledges the goals children set for themselves.

ERO evaluated the extent to which assessment practice reflected the four principles of Te Whāriki in relation to the evidence that:

  • children’s holistic development was reflected in assessment practice;
  • children and their families were involved in assessment practice;
  • children were given feedback on their learning; and
  • children’s learning was captured in the context of their relationships with people, places and things.

Holistic development - kotahitanga

A holistic approach to learning and assessment takes account of all the dimensions of children’s learning and development and recognises that these are interrelated and interconnected. Early childhood educators therefore regard each child in the cultural context of their whānau and community. Underpinning this holistic view of the child is educators’ knowledge of learning theory and their understanding of child development, including cognitive, physical, social, emotional, and spiritual dimensions.

Combinations of children’s emerging knowledge, skills, and attitudes to learning are described as dispositions for learning. Positive dispositions for learning include courage and curiosity, trust and playfulness, perseverance, confidence and responsibility. Dispositions for learning also include the way children approach learning, for example, taking an interest, being involved, persisting with difficulty, challenge and uncertainty, and expressing a point of view. Children’s dispositions are noticed, recognised and responded to by competent educators in early childhood settings.

ERO investigated how well services reflected children’s holistic development in their assessment practice.

Nearly two‑thirds of services clearly reflected children’s holistic development in their assessment practice. In these services assessment included information about children’s knowledge, skills, dispositions, and attitudes. There was good analysis of assessment information that incorporated all aspects of children’s learning and development. This information was used to plan in advance to support children’s interests or dispositions, and to extend their learning and development in a range of contexts, activities, and experiences. In services with very good practice, assessment also reflected children’s cultural dimensions such as their own and their whānau’s aspirations, language, practices, and traditions.

Assessment included information about the whole child. When gathering information about children’s learning and development, the educators took into consideration each child’s knowledge, the skills being developed, the dispositions being followed, and their attitudes and aspirations. This holistic development of the child was central to the service’s philosophy.

Just over a third of services did not reflect multiple aspects of children’s learning and development in assessment information.

ERO found variable practice in many of these services. For example, some assessments within a service reflected the holistic nature of children’s learning and development, while others did not. Children’s knowledge, skills and, to a certain extent, dispositions may have been included in assessments, but there was little focus on attitudes and cultural dimensions.

Although some educators were beginning to understand the concept of holistic development, this was not reflected in their assessment of children’s learning and development. Some assessments were still highly descriptive of children’s activities at a certain time and place, and lacked higher-level analysis of children’s learning over time and in a range of situations, reflecting educators’ limited understanding of Te Whāriki.

In a small number of these services, ERO found little or no evidence that assessment was holistic. There was little understanding of Te Whāriki and learning programmes were educator-directed rather than being driven by children’s interests. Assessment was mostly a description of children’s involvement in activities.

Parents and families – whānau tangata

The involvement of parents and whānau in assessment acknowledges and values the interconnection between home and the early childhood service. Parents and whānau have a wealth of information and understanding about their children, particularly about their participation in the world outside the early childhood service. ERO investigated how well services involved parents and whānau in assessment practice.

About half of the services involved parents and whānau in assessment activities. These services were proactive in seeking parents’ input about their child’s interests, strengths, and aspirations, as well as the family’s cultural background, values and beliefs. Services used enrolment sheets, asked reflective questions, and recorded parent conversations and learning stories accompanied by photographs of their children to plan possible learning experiences. Some parents also contributed stories about their family, culture, language, and events such as holidays. In most of these services, parents were easily able to access assessment records such as portfolios or profiles.

Learning stories were well displayed to make children’s learning visible. Educators had developed a template for parents to contribute information about their child and their aspirations for their learning when they began at the service. Educators provided parents with a small notice to indicate when a new learning story had been placed in their child’s portfolio. Families could take portfolios home and a useful format for encouraging families to make a written contribution had been developed. Many families used this, or their own format, to record stories from home.

Where ERO found especially good practice, services had established effective systems to encourage parent involvement in assessment of their child’s learning. For example, services shared assessment information not only through portfolios and profiles, but also through email diaries and learning stories, daily notebooks, information and whānau evenings, wall and slideshow displays, and parent interviews. Parents were well informed and actively involved in their child’s learning and development.

Just under half of the services had difficulty involving parents and whānau in assessment, and the contributions of parents and whānau were limited.

Many of these services asked parents for information about their family and their child’s interests at enrolment and, less often, at intervals throughout the child’s attendance. However, this was frequently the only consideration of the child’s family, cultural background, values and beliefs. ERO found little evidence that educators used this information in planning or to reflect on children’s learning.

Although parents in most of these services had access to assessment records such as profiles and portfolios, the usefulness of this to parents was limited. Services often reported that many parents declined to participate in assessment activities.

Feedback to children – whakamana

Feedback to children about their learning and development enhances their sense of themselves as confident and capable learners. ERO investigated how well services gave children feedback on their learning.

Just over half of services were enhancing children’s sense of themselves through feedback about their learning. Children in these services revisited past and current learning experiences and could talk about their learning. They were able to revisit their learning through portfolios, wall displays, DVDs, and computer presentations of digital photographs. Educators used language and questions that encouraged children to discuss and think further. Where ERO found very good practice, educators valued children’s resourcefulness, curiosity, creativity and problem solving.

Children were constantly looking at their portfolios and any comments they made were added. This captured children’s perspectives on what they were thinking at the time, indicated change over time, and helped children to revisit and reflect on past experiences and learning. Educators were skilled at making links with past learning while talking with children and being explicit about children’s progress.

Documentation of emerging interests, including children’s work, was collated into planning folders and children and parents were able to revisit these rich learning experiences. Displays of learning stories and photographs were carefully placed throughout the service at a suitable height so children could return to these, discuss them, and recall past learning and progress.

Almost half the services were not giving children feedback about their learning. Children in these services had limited access to records of learning experiences such as portfolios, wall displays, and photographs. When educators did make opportunities to revisit experiences, children were not encouraged to reflect on, or build on, their learning. Most feedback given to children affirmed or directed behaviour rather than encouraged reflective strategies such as problem solving or curiosity.

Children’s learning in context – ngā hononga

Children’s learning and development are influenced by their relationships with people, places, and things. Assessment of this learning and development should be captured within the context of these relationships. ERO investigated how well services were assessing children’s learning in context.

Two‑thirds of services assessed children’s learning in context. Assessment of children’s learning reflected the social contexts in which the children learnt, and included meaningful descriptions of the environment and the people in it that influenced their learning. ERO found that where parents were very involved in their child’s learning, the parents made links with home experiences and the cultural context of the family, for example, aspirations, language, practices, and traditions.

In services with very good practice, educators included other people such as friends, educators and parents, and used descriptions, photographs and captions to capture the context of learning in a meaningful way as well as to show children’s progress and learning over time. Educators used their observations and analysis of learning to plan programmes and activities that would allow children to follow their current and emerging interests in a child-initiated context. In many of these services, the cultural context of children was an important feature of assessments.

Assessment was individualised, and drew on knowledge gained from the service or home context. Educators and parents, who came from a wide range of ethnicities, incorporated cultural contexts into assessment. Educators noticed and responded to children’s initiatives and recognised their individual strengths and abilities.

A third of services did not assess children’s learning in context. Few educators acknowledged social interaction and children’s strengths and abilities, and few incorporated cultural contexts. Most observation was descriptive and did not make any links to an analysis of what learning was occurring or what might happen next.

In a small number of these services, the assessment of children’s learning did not occur in a meaningful context. Rather, educators assessed children undertaking set tasks, as opposed to assessing learning occurring during child-initiated play.

Overall reflection of Te Whāriki in assessment practices

Figure 3 shows that in 64 percent of services ERO found that assessment practices were highly reflective or reflective of the four principles of Te Whāriki. Assessment practices at over a third of services (36 percent) were only partially reflective or not reflective of the four principles of Te Whāriki.

Figure 3: Reflecting the four principles of Te Whāriki

This is a bar graph. The y axis shows the percentage of services and ranges from 0 to 100 at intervals of 20. The x axis has four labels: Highly reflective 15%, Reflective 49%, Partially reflective 30%, and Not reflective 6%.

ERO found that educators’ understanding of Te Whāriki and socio‑cultural assessment were key factors in how well assessment practice reflected the four principles. In services where practice was partially or not reflective, parents, children, and the educators themselves were not able to use assessment information to support children’s learning and development.

Figure 4 shows that in 63 percent of education and care services, 39 percent of playcentres, and 78 percent of kindergartens assessment practices were highly reflective or reflective of the four principles of Te Whāriki.

Figure 4: Reflecting the four principles of Te Whāriki by service type

This is a side bar graph. The y axis has three labels, and each bar is divided into four parts: Highly reflective, Reflective, Partially reflective, and Not reflective. The first bar, Education and care services, shows 11%, 52%, 32% and 5% respectively. The second, Playcentre, shows 6%, 33%, 41% and 20%. The third, Kindergarten, shows 28%, 50%, 21% and 1%. The x axis ranges from 0 to 100 at intervals of 20.

Reflecting children's learning and development

How well is children’s learning and development reflected in assessment?

Children are better able to learn when educators observe children, use this information to challenge their own thinking, and provide learning opportunities that extend children’s abilities. Assessment information reflects the complexity of learning and development, and the context of interactions with people, places, and things. [19]

ERO evaluated how well children’s learning and development was reflected in assessment in relation to the evidence that:

  • assessment information demonstrated the breadth of children’s learning and development;
  • assessment information showed an increasing complexity in children’s learning and development; and
  • assessment information included appropriate analysis to reveal learning.

Breadth of children’s learning and development

Assessment that captures the breadth of children’s learning and development, including skills, dispositions, parents’ aspirations, and children’s interests provides a picture of the whole child. ERO investigated how well assessment information demonstrated the breadth of children’s learning and development.

In just over half the services assessment information demonstrated this breadth. Educators in these services gathered a range of assessment information that included many aspects of children’s learning and development. Children’s dispositions were referred to in assessment, as were parents’ aspirations, through the use of photographs and written comments. Combined, these aspects helped show children’s learning and development, and informed planning and next steps for learning. Educators invited children to comment on their own learning and accurately reflected this in assessment records to build a picture of the whole child.

Educators effectively included all aspects of the child’s development in assessment information. The children’s profiles clearly showed the extent of learning by the comments made about interests and needs. The inclusion of children’s voice in the portfolios was a strength. Families were well informed of the breadth of children’s learning and the development that had occurred. Parents’ goals for their child were recognised and responded to on a regular basis. Children were listened to and their ideas and opinions were valued.

Just under half the services were not demonstrating the breadth of children’s learning and development in their assessment information. While assessment records covered a wide range of experiences and activities, many of these services lacked strategies to ensure that children’s progress and breadth of learning and development could be demonstrated. In some of these services educators’ analysis showed connections between the narratives and children’s learning, but in others the analysis discussed only children’s participation in activities.

Increasing complexity

Assessment that acknowledges the complexity of children’s learning and development shows the progress of each child as they develop competence and confidence over time. ERO investigated how well assessment information showed the increasing complexity in children’s learning and development.

In just under half the services educators were writing narratives and including photographs and children’s art work in portfolios that showed the progress of individual children over time. These educators were also able to show children’s skills and learning dispositions in ways that demonstrated the complexity of their learning and development. Educators supported children and encouraged them to revisit previous learning experiences, building on children’s interests. They did this through effective noticing, recognising, and responding, and were able to build on prior learning.

Comprehensive records showed what educators were noticing and recognising about children’s learning and how they responded to this knowledge to increase the complexity of the child’s experiences and understanding. Immediate responses to develop children’s interests and knowledge through the daily reflective diary made learning meaningful and increased the complexity of experiences. All entries in portfolios were specific for that child and their learning. Building on prior knowledge and revisiting past learning was an established practice.

In just over half the services educators did not demonstrate understanding of the complexity of children’s learning and development in assessment information.

Educators in many of these services were only just beginning to make links between stories about children’s learning and recognising significant learning moments for children. ERO found evidence that, although some children’s learning was increasing in complexity, not all educators were able to recognise this and thereby add challenge to children’s learning or help them to revisit past learning.

Where ERO found very poor practice, educators failed to see opportunities to increase the complexity of children’s learning through their play and current interests. There were very few connections between learning stories to show children’s progress, and where this did occur educators often misinterpreted the nature of the complexity and the child’s interest.

Analysis to reveal learning

Analysis of educators’ observations of children makes children’s learning visible. This analysis transforms what educators notice into the recognition of learning. ERO investigated how well assessment information included appropriate analysis to make learning visible.

In just under half the services educators were analysing assessment information appropriately in order to understand children’s learning better. Educators’ recognition of learning, and their short‑term reviews, made children’s learning visible in assessment information. Educators within a service worked together in meetings to analyse observations, record children’s learning, and identify next steps, possibilities and opportunities. In services where ERO found very good practice, both parents and children were involved in analysis through learning conversations.

Each afternoon the educators had a reflection meeting to discuss daily stories about children. They downloaded the day’s photographs and talked about the photographs while someone recorded the information. Resources and environment were discussed to ensure that these were set up for the following day to continue to stimulate interests in learning.

In just over half of services, educators did not identify children’s learning through their analysis. Although educators in a few of these services did not undertake any analysis, most educators were beginning to do so, but were at a very early stage of understanding and development. Educators’ analysis of children’s learning was variable – focusing on activities and groups of children rather than recognising learning occurring for individual children. The identification of next steps and possible directions was missing in most narratives or short-term reviews. Although some of these educators were analysing children’s learning informally, this was not shared with other educators or parents.

Overall quality of reflective practice

Figure 5 shows that in less than half of services (48 percent) the reflection of children’s learning and development through assessment was highly evident or evident. The reflection of children’s learning and development in assessment was only partially evident in a third of services, and not evident in 19 percent of services.

Figure 5: Reflecting children’s learning and development

This is a bar graph. The y axis shows the percentage of services and ranges from 0 to 100 at intervals of 20. The x axis has four labels: Highly evident 17%, Evident 31%, Partially evident 33%, and Not evident 19%.

ERO found that in services where children’s learning and development was partially evident or not evident through assessment, educators did not understand or practise socio-cultural assessment. In some of these services, educators had only recently undertaken professional development in assessment, and they lacked confidence and experience to analyse and reflect upon children’s learning and development through assessment.

Figure 6 shows that in 53 percent of education and care services, 25 percent of playcentres, and 71 percent of kindergartens, reflection of children’s learning and development through assessment was evident or highly evident.

Figure 6: Reflecting children’s learning and development by service type

This is a side bar graph with three labels, each divided into four parts: Highly evident, Evident, Partially evident, and Not evident. The first bar, Education and care services, shows 12%, 41%, 39% and 8% respectively. The second, Playcentre, shows 5%, 20%, 51% and 24%. The third, Kindergarten, shows 31%, 40%, 24% and 5%. The x axis ranges from 0 to 100 at intervals of 20.

Assessment informing learning

How well does assessment information inform learning in the service?

Good quality assessment practice contributes to positive outcomes for children. Assessment helps educators to provide learning opportunities that enrich children’s experiences, learning, and abilities. The complexity of children’s learning increases when they participate in learning experiences that are connected and relevant to their own family and community.

Assessment involves the observation of children by experienced and knowledgeable educators who use that information to improve their programmes and outcomes for children. Educators who assess well embrace the concept of “ako”: that the child and the educator are on a learning journey together, and that teachers are also learners.

ERO evaluated how well assessment information informed learning in the service in relation to the evidence that:

  • links between assessment and planning demonstrated the educators’ response to children’s learning;
  • children participated in meaningful experiences as a result of assessment practice; and
  • children contributed to the assessment process.

Links between assessment and planning

The use of assessment information to plan for future programmes helps to create meaningful and increasingly complex learning experiences for children. ERO investigated how links between assessment and planning demonstrated that educators responded to children’s learning.

In just over half the services educators were using assessment to plan for, and respond to, children’s learning. Educators participated in team planning sessions to develop programmes and next learning steps for children. These sessions focused on what educators had noticed and recognised during observations, how they had responded to children’s current and emerging interests, and how they planned to do so in the future. Educators’ analysis of children’s learning and opportunities for further learning were also documented in reflective journals, learning stories, and portfolios.

In services where ERO found very good practice, educators had developed useful strategies to provide links between planning and assessment. For example, in one centre an ongoing team reflective journal was used at formal meetings to promote thinking and analysis amongst the team.

Educators met to discuss learning stories and next steps for children. The ‘where to next’ was documented in a shared planning book. The service had theme books - folders of learning stories that had grown into a unit developed from several children’s interests. Educators wrote reflective narratives for these books. The ‘where to next’ in learning stories was written in a broad context, as educators wanted the child to drive their own learning.

In just under half the services, educators were not making useful links between assessment and planning. Educators did not regularly participate in reflective discussions and there was little sharing of observations and analysis of children’s learning. Some educators were beginning to notice and recognise children’s emerging interests and needs. However, subsequent experiences provided by educators lacked depth and continuity, and consisted mainly of changes to activities and resources, rather than responding to what children knew and were interested in, and exploring how their learning could be developed and enhanced.

Participation in meaningful experiences

Children participate in meaningful experiences when they are engaged in, and challenged by, the learning occurring, and when that learning is enhanced by good quality assessment. ERO investigated whether children were participating in meaningful experiences informed by assessment.

In about half the services children did participate in meaningful experiences informed by assessment. Educators planned activities that were based on identified interests, strengths, and needs, and that were meaningful to children’s home life. Children arrived at the service with a sense of anticipation and excitement about the challenges and experiences ahead. They were engaged in activities they had chosen themselves that were stimulating and appropriate to their age. They could easily access resources to support their play, and educators interacted with the children rather than directing play. The children had a sense of themselves as capable learners, and could share their learning with each other and with educators.

Assessment practice enabled educators to recognise activities and experiences likely to engage particular children, and to respond both immediately and long term to their interest. Children were able to choose freely; the environment was organised to take account of their emerging interests, and educators responded from their indepth knowledge of children.

In services where ERO found particularly good practice, educators were quick to respond to children’s learning by introducing new resources and extending children’s thinking through open‑ended questions and sustained conversations. Educators often made adjustments to the programme immediately, as well as planning for extension and development of children’s interests and ideas over time.

In the remaining services assessment had little or no influence on the provision of meaningful learning experiences for children. In some of these services children actively participated in, and enjoyed, experiences that reflected the service’s philosophy. However, these experiences were not informed by assessment of children’s interests and learning. Many of the activities were educator-directed and they often lacked challenge and opportunities for decision‑making, particularly for older children.

Children’s contribution to assessment

When children contribute to the assessment of their own learning, they are able to discuss and choose the direction of their learning experiences. By having opportunities to make decisions about what is important and should be included in their assessment records, they are able to identify themselves as competent and as experts. ERO investigated the extent to which children were contributing to assessment practice.

A third of services provided children with opportunities to participate in assessment of their own learning. These educators included children’s voices in assessment in a variety of ways, such as speech bubbles of children’s comments about the learning experience, and participation in decisions about which photographs and art work to include. Children were able to revisit their learning through portfolios and through planned discussions such as mat time. Educators recognised children’s aspirations and goals and this informed both spontaneous and formal planning. Educators also encouraged children to evaluate their own learning through conversations that required children to think about how they might develop an idea or skills. Recording these conversations showed how children’s thinking and learning developed over time.

In services where ERO found very good practice, educators acknowledged children’s voices and perspectives. Many of these services were using ICT effectively to help children revisit their learning and participate in their own assessment.

ICT was used as a tool for documenting the programme in action, assessment, and children’s publications. Children were learning how to use ICT tools to support their learning and to make it visible. They revisited and reflected on past learning through portfolios and planning stories. Children discussed their own play and learning with their peers and educators. They developed their own criteria for assessing achievement, mostly using ICT, and there were opportunities for children to take on the role of educator. Children made decisions about what they would do next and about entries into their portfolios.

Children in two‑thirds of services had limited opportunities to contribute to assessment of their learning. While educators in some of these services were beginning to record children’s voices and to encourage them to revisit their learning, many educators did not actively seek out children’s self evaluation or give them opportunities to further plan or extend their own experiences. The educators did not have adequate knowledge and understanding of current assessment theory and practice to respond meaningfully to children’s perspectives, plans, and interests.

Overall quality of assessment informing learning

Figure 7 shows that learning was well informed or informed by assessment information in half of the services. Learning in 38 percent of services was inadequately informed by assessment information, and in 12 percent it was not informed at all.

Figure 7: Assessment informing learning

This is a bar graph. The Y axis is called percent of services and is ranged from 0-100 at intervals of 20. The x axis has four labels which are Well informed 14%, Informed 36%, Inadequately informed 38% and Not informed 12%.

ERO found that in services where learning was inadequately informed or not informed by assessment information, a variety of factors contributed to this situation, including staff turnover and inadequate planning, analysis, and non-contact time. A lack of professional development and strategic direction meant that educators did not have a shared understanding of assessment, or that services did not have the policies and procedures to drive good quality assessment practices.

Figure 8 shows that learning in the service was either well informed or informed by assessment information in 48 percent of education and care services, 22 percent of playcentres, and 71 percent of kindergartens.

Figure 8: Assessment informing learning

This is a side bar graph. There are three labels and each label is divided into four parts. The four parts are Well informed, Informed, Inadequately informed and Not informed. The first label is Education and care services, its four parts are 10%, 38%, 39% and 13% respectively. The second label is Playcentre 4%, 18%, 51% and 27%. The third is Kindergarten 31%, 40%, 26% and 3%. The x axis is ranged 0-100 at intervals of 20.

Contributing to self review

To what extent does assessment practice contribute to ongoing self review?

Effective self review allows educators to review their programmes, physical environment, and interactions in light of assessment information about children’s learning and development.

ERO evaluated the extent to which assessment practice contributed to ongoing self review in relation to the evidence that:

  • educators use assessment information about children’s learning and development to inform the service’s programme development;
  • educators use assessment information to improve the service’s physical environment; and
  • educators use assessment information to improve interactions between educators and children, and amongst children.

Programme development

When services gather and analyse assessment information about children’s learning they are able to use that information to identify new directions for the learning programme and the professional development requirements of educators. ERO investigated how well services used assessment information to inform programme development.

About half the services were using assessment information about children’s learning to inform programme development. Educators in these services undertook both spontaneous and planned reviews and made changes to the programme in response to their assessment of children’s learning and development. Reflective discussions, guided by key questions about learning and development, and the documenting of changes to activities and planning, ensured that planned experiences to further challenge children and extend their learning were ongoing and purposeful. Educators were also using self-review processes to identify professional development opportunities.

Where ERO found very good practice, services had a comprehensive framework to guide self review. The strategic direction and philosophy of these services matched their policies and procedures for planning, assessment, evaluation, and consultation. These frameworks ensured regular self review and the effective implementation of changes to the programme.

Planned and spontaneous reviews added significantly to the development of the learning environment and to educators’ practice. Planned reviews focused on the evaluation of the projects undertaken with children and provided educators with useful information. Educators followed a predetermined plan in which they developed and answered key questions, such as how they encouraged complexity of learning, included literacy, and involved the community. The format included a section for identifying ‘possibilities and opportunities’ or next teaching and learning steps.

Half the services were not using assessment information about children’s learning and development to inform their programme.

Some of these services were developing self-review processes, but their educators had varying levels of understanding about the purpose of self review. While some educators could articulate changes made to the programme in response to assessment information, few had written records that documented these changes and provided a record for future staff reflection.

ERO found that in some of these services educators were unable to change long-term programme planning in response to assessment information, as strategic and annual plans were inflexible or not sufficiently focused on teaching and learning. Strategic planning was largely about property development and assets, rather than also focusing on programme and professional development. Nationally-based management guided many of these services, and strategic plans were not localised and did not adequately reflect the context of each service, the children attending, and their parents and whānau.

Physical environment

Appropriate changes to the physical environment of an early childhood service reflect the current interests and strengths of, and next learning steps for, the children in the service. ERO investigated how well services used assessment information as part of their self review to improve the physical environment.

Educators in less than half the services used assessment information to guide development of the service’s physical environment. These educators changed the environment regularly so children had access to learning resources and spaces that supported their current interests and strengths, and promoted inquiry and exploration. Educators also used next steps identified in learning stories to contribute to short-term planning and long-term strategy on budgets and changes to the environment.

Where ERO found very good practice, educators adapted the environment so children developed a wide range of skills and dispositions. They consulted children about the physical environment to make sure that it matched their interests and gave them challenge.

Educators changed the activities in the environment according to children’s interests. The exterior space was small and educators were limited in what they could change in this area, but they discussed how best to accommodate changing needs within the confines of the space available. The service used a community centre and had to put all its equipment away at the end of each session, so there were daily opportunities to alter the layout of the equipment and areas of interest.

Over half the services did not use assessment information as part of their self review to improve the physical environment. While changes to the environment were made, these were unlikely to be the result of analysis of the ‘what next’ steps identified in assessment, or of children’s use of resources and outdoor and indoor spaces. In many services, changes to the environment were driven by management decisions based on budgets, new resources available for purchase, and health and safety matters, rather than by educators’ analysis of children’s current interests and strengths.

Interactions between and amongst educators and children

When assessment information contributes to interactions in the service, educators are able to show that they are reflecting upon their interactions with children, and considering how to extend and improve the quality of these interactions. ‘What next’ steps in assessment focus on ensuring that all children positively participate in the social and educational environment of the service. ERO investigated how well services used assessment information to extend and improve interactions between educators and children, and amongst children.

In the services (less than half) that used assessment information to inform interactions, assessment records provided evidence of educators reflecting on interactions, and recorded subsequent changes or improvements made to enhance them. Assessment information contributed to educators’ knowledge about children, and helped them engage children in interactions that supported their sense of belonging and their learning and development.

Where ERO found very good practice, educators used assessment information in their promotion of children’s problem solving, negotiation, leadership, cooperation, and sharing of ideas and views. These services also used the assessment process to build and improve on their interactions and relationships with parents and whānau.

Respectful and responsive relationships were formed between children, their peers and educators. Educators were responsive to children and engaged in professional dialogue with parents and other educators. Children interacted alongside and with others as part of their engagement in learning and during social times at the centre. They developed confidence in communicating and working cooperatively. Educators used effective questioning skills and sustained dialogue with children to promote problem solving, and encouraged them to share their views and theories of the wider world. Children also supported the learning of their peers. They developed dispositions that would support them throughout their education. A range of strategies for active exploration, thinking and reasoning supported children’s growing confidence to engage with and understand the world around them.

More than half the services did not use assessment information to inform the interactions at the service. While in many of these services interactions were positive, they were not very constructive. Interactions between educators and children were instructional and educator‑directed, and educators lacked the ability to listen carefully to children’s responses and respond appropriately. Educators had poor questioning and prompting skills, and did not give children time to think and respond. In many services, the effect of assessment on interactions was not documented or analysed in learning stories and next steps for learning. Where ERO found very poor practice, there was very little evidence that interactions between educators and children extended and supported the development of children’s language, understanding, and thinking and other interpersonal skills.

Overall quality of assessment contributing to self review

Figure 9 shows that in just under half the services (49 percent) assessment information made a ‘significant contribution’ or ‘contribution’ to ongoing self review. In a third of services (34 percent) assessment information made a ‘limited contribution’ to ongoing self review, and made no contribution in 17 percent of services.

Figure 9: Contributing to self review

This is a bar graph. The Y axis is called Percent of services and is ranged from 0-100 at intervals of 20. The x axis has four labels which are Significant contribution 12%, Contribution 37%, Limited contribution 34% and No contribution 17%.

ERO found that in services where there was limited or no contribution of assessment information to ongoing self review, a lack of professional development opportunities hindered educators’ abilities to participate in discussions and to use assessment information to reflect on their practice.

Figure 10 shows that, in 46 percent of education and care services, 24 percent of playcentres, and 69 percent of kindergartens, assessment information made a ‘significant contribution’ or a ‘contribution’ to ongoing self review.

Figure 10: Contributing to self review

This is a side bar graph. There are three labels and each is divided into four parts. The four parts are Significant contribution, Contribution, Limited contribution and No contribution. The first label is Education and care services its four parts are 8%, 38%, 37% and 17% respectively. The second is Playcentre 2%, 22%, 43% and 33% and the third is Kindergarten 29%, 40%, 24% and 7%. The x axis is ranged 0-100 at intervals of 20.

The overall quality of assessment

The quality of assessment in early childhood services was reviewed against five key evaluation areas.

ERO found that 38 percent of services were implementing good quality assessment practices across all five key evaluative areas.

Thirty percent of services were implementing good quality assessment practices in some areas of assessment, but not in others.

Thirty-two percent of services were not implementing good quality assessment practices in any of the five evaluative areas.

For each of the five areas, ERO compared the quality of assessment practice by service type and locality. The following findings were statistically significant. [20] ERO found that:

  • kindergartens were more likely to have good quality assessment practice than education and care services and playcentres, and education and care services were more likely to have good quality assessment practice than playcentres, across all five of the evaluative areas;
  • urban services were more likely to have good quality assessment practice than rural services across all five of the evaluative areas. Sixty‑three percent of rural services were playcentres. An urban playcentre was more likely to have good quality assessment practice than a rural playcentre; and
  • within a particular service type, locality did not influence the quality of assessment practice in the five areas, unless the service was a playcentre.

ERO found variable quality of assessment practice across the early childhood education sector, as well as in particular service types and by locality.

Overall quality between service types

Figure 11 shows a fairly even spread of all services across the three groupings of ‘good’, ‘variable’, and ‘poor’ quality. This evenness is mirrored by education and care services, with about a third of these services in each grouping. Figure 11 also shows that although kindergartens had greater representation in the good quality group (56 percent), 30 percent were in the poor quality group. Conversely, only 20 percent of playcentres were in the good quality group, with 57 percent in the poor quality group.

Figure 11: Overall quality by institution type

This is a multi bar graph. The Y axis is called percent of services and is ranged from 0-100 at intervals of 20. The x axis is called Overall quality and has three labels: Good quality, Variable quality and Poor quality. Each label has four percentages which relate to All services, Kindergarten, Playcentre and Education and care services. For the first label the percentages are 38%, 56%, 20% and 33% respectively. For the second label it is 30%, 14%, 22% and 38%. For the third label it is 32%, 30%, 57% and 28%.

ERO found that in the playcentres in the good quality group (10 of 49 playcentres overall) parent educators were engaged in ongoing professional development in assessment and also had good support from a Centre Support Person who modelled high quality practice. Many parent educators were experienced and valued the importance of assessment. They supported new parents in noticing, recognising, and responding to children’s learning. There were systems and documentation to ensure continuity of assessment practice. Regular planning meetings and effective self review ensured assessment information was used to reflect children’s interests and strengths in the programme.

Of the playcentres that were in the poor quality group (28 of the 49 playcentres), most had had little or no professional development in assessment. Where there had been any professional development, usually only one or two educators at the service had taken part. Where professional development had been undertaken for Kei Tua o te Pae, it was provided by a Centre Support Person who had attended a workshop or seminar funded by the Ministry of Education. Unfortunately, this model of training appeared to result in a lack of shared understanding amongst all educators about the purpose and practice of assessment for children’s learning and development.

In most of the kindergartens in the poor quality group (30 of 101 kindergartens overall) educators had not participated in any recent professional development in assessment. Although educators at most of these kindergartens had non‑contact time and access to ICT to assist with assessment, they did not have good professional support and leadership. There was a lack of management systems and frameworks to provide guidance and support for good quality assessment.

Overall quality between rural and urban services

Figure 12 shows that half the rural services were not implementing good quality assessment practice in any of the five key evaluative areas. Nearly two‑thirds of rural services in this evaluation were playcentres, and ERO found that playcentres were less likely to have received professional development for Kei Tua o te Pae, to have access to ICT resources or non‑contact and meeting times, or to have ECE registered educators. Rural playcentres were less likely than urban playcentres to have good quality assessment practices across the five evaluative areas.

Figure 12: Overall quality by locality

This is a multi bar graph. The Y axis is called percent of services and is ranged from 0-100 at intervals of 20. The x axis is called Overall quality and has three labels: Good quality, Variable quality and Poor quality. Each label has three percentages which relate to All services, Urban and Rural. For the first label the percentages are 38%, 39% and 25% respectively. For the second label it is 30%, 33% and 25%. For the third label it is 32%, 28% and 50%.