Rata Street School - 'summer effect' in writing

The school has been involved in several schooling improvement initiatives over many years. 

These initiatives have shaped the school's approach to internal evaluation and built organisational capacity for inquiry over a long period. The Building Evaluation Capacity for Schooling Improvement project and the Literacy Professional Development Programme have helped extend its review and development strategies. More recently, the school has been part of a Learning and Change Network with five local schools. This involves challenge, critique and collaboration at every level: students, teachers, leaders and whānau.

Student transience at Rata Street School means that each year about 30 percent of children arrive from, or leave for, other schools. Rather than take a deficit approach, the school focuses on the progress it can make with students while they are at the school. Students are taught how to recognise what level they are at in their learning and what they need to do to move forward.

Internal evaluation is not seen as a discrete activity or process - it is embedded as an everyday activity. It starts with teachers working in year groups to analyse student achievement, identify priority curriculum areas for their year level, and set targets for those levels. The areas chosen for development are those with the potential to have a significant, positive impact on the students who need to make the most progress. Information is collated to identify common needs and school-wide priority areas.

Student, teacher and leader goals are co-constructed to help everyone understand what they are trying to achieve and who they are intending to reach. Once students needing additional support are identified, teachers reflect on what is and is not working for those students. The needs of students and teachers are identified several times throughout the year. Practice analysis conversations, including class observations and formal pre- and post-observation discussion and reflection, contribute to the school's extensive review and improvement practices. This approach has resulted in a shift from deficit thinking to a focus on what teachers can do despite contextual challenges. It has also led to more specific actions that in turn link to teachers' appraisal goals and reflections.

The evaluation outlined below is an example of a recent evaluation and change activity at the school.

 

In 2009 leaders and teachers noticed a drop in achievement levels in the start-of-year writing samples: 42 percent of students in Years 4 to 6 were achieving at lower levels in February 2009 than they had been in November 2008. An initial analysis of this data by ethnicity, year level and teacher showed no trends, so no further action was taken at that time.

Noticing

What’s going on here?

Is this what we expected?

At the end of November 2009 achievement data was gathered and compared with the February data. The analysis showed that 68 percent of students had made a gain of two or more sub-levels on the asTTle writing levels. A further 30 percent of students had made a one sub-level shift.

However, when the same students' November 2008 data was compared with their November 2009 data, the achievement picture was very different. This comparison showed students were not making progress, and in some cases had dropped from where they had been at the same time the previous year. Only 10 percent of students made a gain of two or more sub-levels from 2008 to 2009, 39 percent made a one sub-level gain, 40 percent made no gain at all, and 10 percent went down.

Investigating

What if we have another look at our data to check again if there are any trends or patterns that would explain what we noticed?

What does it look like if we use a different time period?

Leaders recognised a need to do something different. They wondered if the drop might be attributed to the ‘summer effect’ – the decline sometimes seen after children return from the six weeks of summer holidays. Leaders and teachers sought to find out more about how to prevent this. Many staff read around the effect and found research about its impact on students’ reading progress, but little about its impact on writing progress. They decided it was best to focus their efforts on things they could influence. Because research on good practices to remedy the issue was lacking, teachers had to devise and closely monitor their own strategy.

Collaborative sense making

What is this data telling us?

Why might this be so?

The strategy they decided to trial involved teachers pasting an example of each student’s writing from the end of the previous year into the front of the student’s new exercise book. This was intended to help the child as they transitioned to a new classroom. Because students started the year with the same learning intention they had finished the previous year with, it was hoped they would be reminded of their progress and achievement. It was also intended to show students that their new teacher understood their strengths and what they needed to do to progress further.

Instead of waiting for students to buy books, teachers provided each student with an exercise book on the first day of term, ready to go, with the example of their writing on the first page.

This practice also meant that clear expectations were set with the child. Teachers and students referred back to the writing sample, and teachers talked to students about the quality of work expected of them, reminding them of what they were capable of. This established joint responsibility for ensuring that momentum in learning is maintained from year to year.

Leaders also ensured that teachers had the students’ data from the previous year and time to set groups and learning strategies for the start of the year. On the first day of Term 1, teachers could start where the previous teacher had left off, with no need to assess students again. They could also target instruction at a particular level from the first day and be explicit about what each student needed to learn.

Prioritising to take action

What can we do differently?

What school practices can we change?

How do we make sure this practice becomes embedded as part of the practice of all teachers?

In the first year, the senior leadership team rigorously monitored writing activities in every class, making sure there were daily writing lessons. These strategies worked well, with fewer than 10 percent of students dropping their level of achievement over the summer. The practice is now used successfully at the beginning of each year.

Throughout the year, ongoing monitoring of all students highlighted those who were not making progress. Teachers reflected on their own practice, identified what they could do differently for these students and implemented those changes.

More recently, leaders and teachers have focused on extending the ways they work with parents to increase students’ progress. The interim mid-year report on a child’s progress towards meeting the National Standards previously included a section, completed by the teacher alone, explaining what parents could do to help their child at home. This section is now completed in consultation with parents. During parent/teacher conferences, information is shared about the child’s goals and parents discuss how they will support these goals at home.

Other activities to enhance the ways the school works with and values parents included designing homework that children and parents could do together, and identifying parent, whānau and wider community expertise and how it could be used to benefit many students.

As a school community (students, whānau, teachers and leaders) we need to constantly review what we can do differently to improve the learning.

– Leaders

Outcomes for students

From 2008 to 2014 the percentage of students dropping levels in writing over summer fell from over 40 percent to under 5 percent. The percentage of students increasing a level early in the year also rose, from 15 percent in 2008 to almost 40 percent in 2014.

Monitoring and evaluating impact

Are we getting the intended results?

How well is this strategy working?