In late June, Chatter Matters convened a seminar, Colleagues @ The Heart of Literacy. Its focus was on cross-disciplinary, cross-sector sharing of collaborative practice for advancing oral language and literacy in school-aged students and young people. Two practitioners, a speech pathologist in a primary school and high school (Pam Thuan, Vic.), and a high school principal (Cez Green, S.A.), spoke with passion about their work and about the value of their collaboration with each other’s professions.
One piece of feedback from the day asked about the evidence for the material presented. It’s an important question. It is always an important question. So ‘thank you’ to the participant who asked:
“The term ‘evidence-based’ was used a lot but there was no information shared about what evidence – who did the research? Why? How much research? We need to be strategic not have pockets doing different things.”
It’s easy in the reflective, listening context of a seminar to miss some of the information presented, so in responding I have run back through Pam’s slides (provided to participants prior to the event) in order to direct attention to the evidence she discussed in her presentation.
Though randomised controlled trials (RCTs) are the gold standard, not all evidence need come from university-led RCT studies. Evidence is also produced through systematic, careful recording of observations and data in the iterative process of going about the work at hand: measuring, making adjustments based on the results of those measurements, going about the work again, measuring, comparing, and so on. This is action research, in which practice and theory inform each other and through which we become better at achieving our stated goals – and, indeed, become better practitioners.
One of Pam’s powerful slides (pictured above) was about the natural control situation that emerged from comparing the results of the differently-done ‘work at hand’ at Mahogany Rise Primary School (MRPS) and a comparison school. In only two years, a significant difference was measured in student outcome data between the schools – MRPS had improved markedly. This is evidence that the processes ‘done differently’ had yielded efficacious positive change.
This is important. But the evidence case is not closed here.
Pam also provided a link to a collaborative write-up of the work done at MRPS. The link is in her slides (and reproduced here for readers’ convenience). Her slides also included an MRPS data link which shares evidence of progress on NAPLAN scores, amongst much else. In providing this information, Pam has invited us all to engage with this data and take the opportunity to think on it and learn from it.
The openness and problem-solving intention of the MRPS collaboration is made very clear in the above dot point on one of the data slides.
Measurements showed that when students left the MRPS oral language program and returned to regular classroom processes, the language growth rates witnessed while they were on the program ceased. In choosing to follow this evidence, the MRPS teams expanded the oral language program to the whole school – and later, continued it into the secondary school that MRPS feeds into. That is, they measured, and then responded with considered action to what the results of those measurements showed. This is iteratively-gained, evidence-based practice at its best.
It can be done anywhere.
In her presentation, Pam also referenced the university-led research and reviews that informed the choices she and her teams made about exactly what it was that they would do differently in their school. That is, she showed the sources of evidence that her collaboration had drawn upon to inform their change-processes and their measurement choices. The reports which Pam referenced (her slide is pictured above) include important public documents written with high levels of rigour – and produced in the scientifically accepted way of evidence production.
So to go back to the questions asked by one participant:
What information was shared about what evidence?
Pam was open and generous in sharing evidence and links. They can all be sourced through the material she supplied. If the participant who asked this question, or any others, have missed receiving Pam’s slides (we are aware that some servers may have blocked delivery), please reach out to Rosie here.
Who did the research?
Pam and her cross-disciplinary collaboration did some of the research. But they also drew on high-level, peer-reviewed sources. They implemented, measured and adjusted in repeating cycles.
How much research?
Over 11 years, through devoted, attentive work, these teams have produced a sound base of evidence for at-the-coal-face practices of direct instruction in oral language and literacy that have yielded positive change in student learning outcome data.
They did it because they wanted to see the small people in their classes and corridors achieve their potential in the agency-producing skills of oral language and literacy. So that those small people might grow into big people who have choices about their lives and are not hampered by the disadvantages of lower-level language.
Though the MRPS teams did this work in a pocket of our nation, their leadership and problem-solving steadiness in the care of the children in their charge, and of each other, has produced a pathway that the rest of us can use to reach our goals for our students. Because of such strong localised evidence, this pathway invites others to walk it with greater confidence. But following the path of this work doesn’t absolve the rest of us from steadily solving our own localised problems as we care for the children in our charge and care for each other. We too must measure, adjust according to the results, implement again, measure, and keep doing so. And we benefit from having MRPS’s clear, documented information (aka evidence), and their generosity in sharing an open story about the highs and lows of implementation. Place-based implementation is important. It is sensitive and dignified.
“Ours is not the task of fixing the entire world all at once, but of stretching out to mend the part of the world that is within our reach.”
From “Letter to a Young Activist During Troubled Times” by Clarissa Pinkola Estés, Ph.D.
If the part of the world that is within our reach is the face of the child and the activities of the day stretching out ahead, the implementation research from the MRPS collaboration, and the rigour of the sources which informed it, will very practically help us in our task. If the part of the world that is within our reach is the policy needed to guide others to respond to the face of the child and the activities of the day stretching out ahead, the implementation research from the MRPS collaboration, and the rigour of the sources which informed it, will also very practically help us in our task.
In like manner, this richness of evidence lies behind all that Cezanne Green presented. And it is emergent in the Tasmanian local stories presented on the day of the seminar. It is also emergent in many other place-based activities happening right across Tasmania, where well-documented, open sources demonstrating positive change (aka evidence) are being drawn upon and created.
The participant who gave this feedback also wrote: ‘While I acknowledge the expertise of speech pathologists, there is also a great deal of expertise amongst the teaching profession and leadership. In some ways I felt that the work of teachers was being undervalued.’
I would like to personally apologise to you that you left the day with this feeling. Our intention, and our desire, was the opposite of this. As a speech pathologist who has worked in perhaps not hundreds, but certainly tens, of schools, I see how much expertise and generosity there is within the teaching profession and its leadership. Our wish is to expand enriched and intentional sharing of our sets of skills across our professions.