An Action Research post by Caroline Hill (SENCO)
The purpose of this study is to demonstrate the prime importance of testing reading comprehension at secondary school level and of implementing interventions based on the results. The success of the intervention is measured by retesting the pupils after the targeted intervention. The aim of this testing and subsequent intervention is for pupils to become independent readers so that they can access the curriculum more easily. In September 2015 I started as SENCO, and over the last 18 months I have moved away from 1:1 support for students with Special Educational Needs/Disability (SEN/D) towards the Wider Pedagogical Role for Teaching Assistants (TAs) recommended by Bosanquet et al (2016). This model of support incorporates class support and intervention based on each individual need. When deciding upon the testing regime, I drew on my successful use of the New Group Reading Test (NGRT) in my previous school, and on the fact that it is one of the most widely used tests nationally and its results are reliable and nationally recognised; the decision was therefore made to use this system in preference to alternatives. In consultation with the Senior Leadership Team (SLT) a trial using KS3 was agreed. Having tested all of KS3, I used the data to identify students who were below their chronological age for reading comprehension. Targeted, personalised interventions were then put into place to help close the gap. Research by Brooks (2013) showed significant improvements with most learners, and I concur as my results were very encouraging. Overall, 70% of students receiving intervention made accelerated progress of 3 months or more during an 8 week programme. The intervention also revealed a rise in the learners’ confidence and self-esteem. Brandon (2001) suggests that:
“A person with a healthy self-esteem is often less afraid of new ideas and perspectives, and therefore is more likely to explore new concepts.” (Brandon 2001:5)
This was true for the vast majority of the students, which could lead to healthier life choices later in their lives.
Why this study?
Is testing reading comprehension important at secondary school level? In my opinion the answer has to be yes, as it is an indicator of a pupil’s ability to read independently and to access the curriculum successfully (DFE 2011). Do all secondary schools offer testing and intervention? No. Lafontaine et al (2009:4) found that, because reading is not taught as a specific subject in secondary schools, reading comprehension testing can differ from school to school, and ‘socioeconomic and cultural status can also have an impact’. In my setting I believe testing and implementing targeted intervention is vital to enable students to achieve their full potential and independence.
Testing reading and reading comprehension is integral to the NGRT approach, and both are equally important. If you want to be successful in the workplace and compete effectively with your peers all over the world, then your comprehension skills need to be well developed (Lafontaine et al 2009, Paul and Clarke 2016, Literacy Trust 2016). The difficulty arises when you look at the skills needed to read well and the additional skills needed to comprehend what you are reading. Paul and Clarke (2016) believe that mastery in both is required to be successful. The National Literacy Trust would agree, as they explain:
“reading in its fullest sense involves weaving together word recognition and comprehension in a fluent manner.” (The National Literacy Trust 2012:6)
Letter and word recognition is the first step to reading. Recognising and pronouncing the sound each letter makes leads on to decoding, where students recognise groups of letters that make a sound and then a word. Comprehending what is being read requires a great deal of background knowledge about a wide variety of topics. Students should know how to use inference and be able to explain why they have reached a particular conclusion. Mastery and fluency in reading are achieved when all these skills have been learnt and are applied automatically (Leipzig 2001).
With only ‘3% improvement of reading comprehension ability in 8 to 14 year olds since 2005’ (Literacy Trust 2016:2), and ‘5% of pupils aged 11 unable to read better than would be expected for a 7 year old’ (DFE 2012:13), it is clear that very little progress is being made. The DFE (2015) report that ‘This poor performance is the legacy of a decade of stagnation’ (DFE 2015:8), and while reading is important, the need to understand what we are reading is equally so (Tarchi 2015).
The importance of testing reading comprehension at secondary level is paramount as it impacts on all subject areas. Government (DFE 2011) and research papers (Tarchi 2015, Paul and Clarke 2016, Rennie 2016, Johns and Lenski 2001, and the BBC 2012) all argue that reading comprehension ability shapes our future and that poor reading skills also affect students emotionally. Adelman and Taylor (2011) agree, but also emphasise how poor reading can affect students’ confidence and self-esteem.
The effects of falling behind in reading and feeling like a failure can take a large toll on young people. Children can lose all desire to learn to read or go to school (Hansel and Pondiscio 2016:2).
Reading comprehension at secondary level has a profound impact on individuals, and many students who are struggling feel incapable and unintelligent (IOE 2012). Furthermore, there is a very strong link between poor reading skills and unemployment (Save The Children 2014). With all these factors affecting young people’s reading comprehension today, it would be completely unacceptable not to test and intervene appropriately. Testing provides the data needed to support students and helps to give them an ‘I can’ attitude (Rennie 2016). Some would argue that testing lowers self-esteem. However, done in the right way, it can negate this, giving teachers the necessary information about the gaps in learning (Brandon 2001). Catching students early and providing quality interventions gives them the best chance of ‘catching up’ and securing good outcomes at GCSE (Brooks 2013). By providing a succinct and measurable reading comprehension test, the school will be able to identify struggling readers early on, intervene promptly and monitor progress, so that students have a greater chance of success in school and in future employment. If this approach were adopted nationally it could impact positively on the national statistics.
The 1943 White Paper, Educational Reconstruction, written for and discussed with the Board of Education, referred to the 3 Rs: Reading, Writing and Arithmetic. The paper challenged parents to take responsibility for teaching their children to read; in fact it said it was ‘their duty to do so’ (Board of Education 1943:3). There was no testing available to see whether parents adhered to this responsibility, and it wasn’t until 1974 that the Department of Education and Science (DES) brought in the Assessment of Performance Unit (APU), which required teachers to assess and monitor the achievement of students in their care. This made accountability a high priority for teachers for the first time (Gillard 2011). In 1975 The Bullock Report emphasised the need for every school to devise a ‘systematic policy for the development of reading and comprehension competence in pupils of all ages’ (DES 1975:514) and encouraged close communication between schools to ensure consistency and continuity. This gave schools a measure of how their students were doing compared with other schools in the local area.
Smith’s (1976) journal article, ‘Teaching Reading in the Secondary School’, focuses on reading comprehension. His findings were so similar to my present experience that I was amazed to find that the paper was dated 1976. It speaks of teachers’ concerns about secondary age students not having the comprehension skills needed to sit exams or access texts. It highlights students’ reluctance and disengagement from learning because they cannot comprehend what they are reading. It recommends modification of lessons, intervention where appropriate, and the need for all teachers to take responsibility for the improvement of reading in their classrooms, much like a recent study by Adelman and Taylor (2011), who state:
Unfortunately, without re-engagement in classroom learning, there will be no long and lasting gains in achievement test scores. Unwanted behaviour is very likely to reappear, and many will continue to be left behind. (Adelman and Taylor 2011:18)
Why then, if this was the case 40 years ago, are we still seeing the same problems today?
When Mary Warnock wrote her report on Special Educational Needs (SEN) in 1978, schools were still not completely inclusive places. Castella (2011:21) describes how parents were so disillusioned by mainstream schooling that they looked to set up ‘Free Schools’ in areas where they felt schools were not providing the expected outcomes. With a demand for greater integration of students with SEN into mainstream settings, schools needed to know the starting points of all students to make sure that SEN students had the same chances to progress as their peers. Baseline testing in Maths, Literacy and Science became ‘the norm’, and the ability to read well was pivotal to how successful students were (Galton, Simon and Croll 1980). This was consolidated in the 1981 Education Act, which required schools to introduce assessment for identifying the needs of children with learning difficulties as well as mainstream students. Procedures for ascertaining those needs, and a statement as to how those needs would be met, were also required.
With the introduction of the new National Curriculum in 1987, all schools, both primary and secondary, offering a ‘broad and balanced’ programme of learning had the opportunity to bring the importance of reading comprehension into their curriculum. However, only primary schools made this an intrinsic part of the curriculum. At secondary level there was a push for consistent progress in all areas, which would give students the knowledge and understanding to pave the way to adulthood and employment (HMI 1985). Testing and assessing became central to measuring progress, but also to proving or disproving attainment; this applied not just to academic subject areas or reading and spelling groups but also to the holistic development of the child, which includes behaviour and emotional wellbeing. The report states that:
Raising achievement and standards through assessment is deemed to develop capacity to adapt and respond flexibly to a changing world. (HMI 1985:3)
The 1996 Education Act marked the pivotal point at which schools were held to account for the outcomes of students regardless of need or starting point. This meant that schools needed to rethink how to deliver lessons to those who found school challenging, and at the same time stretch and challenge gifted and talented students. Jackson (1993) states that what was needed was
“A reconceptualization of the regular classroom (and perhaps special programs) to accommodate the diverse needs of all learners, including the gifted” (Jackson 1993:5).
By the 1990s, when national testing was introduced, the reading and comprehension problem became even more apparent. The Literacy Hour began and new national policy was introduced with the ‘Every child a reader’ (2005) paper, and yet this was directed at primary school children rather than secondary schools, as ‘reading’ is still not part of the existing secondary curriculum (Lafontaine et al 2009).
Ofsted (2011), in its report ‘Removing Barriers to Literacy’, highlights the importance of testing reading comprehension in the first two years of secondary school. It claims that if this is done, then early interventions can be put in place. Ofsted leaders recognised the fundamental importance of students gaining reading skills as an essential tool in securing their life chances. However, this process is still not offered to all pupils, and therefore there remains a barrier for some students to achieving better outcomes. Low achievement and the inability to comprehend subject material in class can impact on students’ self-esteem (Ofsted 2011).
Using technical terminology in subject areas that students cannot comprehend can bring behaviour issues, disengagement and a serious lack of motivation (Lavin et al 2009:168)
Ofsted (2011:18) reported that students felt:
- a fear of ‘losing face’ in class and ‘feeling thick’
- the stigma of attending a literacy class
- a fear of ‘finding the work too hard’ and ‘not passing any exams’
- a fear of bullying.
The National Literacy Trust (2016), through ongoing studies, has collected data and recognised the importance of secondary school reading and comprehension ability. It is part of the government’s new initiative ‘Read On Get On’ (2014), which aims to get all 11+ students reading well. Without this research and the implementation of interventions, students could easily fall further behind, and the situation is likely to get increasingly worse as students move through KS3 to KS4 (DFE 2015).
The latest research conducted by Paul and Clarke (2016:4) highlighted concerns that ‘22% of students do not have secure age appropriate reading skills on entering secondary school’.
Without testing all students, how many of the 22% would still be unable to access the curriculum in year 11, and how would we know? Richardson’s (2012:1) findings suggest ’15 and 16 year olds in England have an average reading age five years lower than their actual age.’ This would mean that they are unlikely to be able to access GCSE material, which has a reading age of 12 (Literacy Trust 2012). The Joint Council for Qualifications (JCQ) has stringent guidelines for exam considerations. It is imperative that students are identified early so that their ‘normal way of working’ (JCQ 2016/17:21) can be fully evidenced and crucial support can be given so that students are not put at a disadvantage.
The study by Paul and Clarke (2016) also showed that:
“The intervention offered did not bring significant gains in progress when only comprehension was tested and interventions which target student’s language comprehension as well is essential moving forward” (Paul and Clarke 2016:116)
With this additional information it will be interesting to see, over time, the impact on student attainment when linked to our guided reading interventions, which concentrate on language comprehension as well as text comprehension. Tarchi (2015:80) would agree in part; however, his study found the need for a ‘multidimensional understanding of reading’ which includes the skills of using prior knowledge and making inferences. Those participants in the research with these additional skills had better reading comprehension outcomes.
This wealth of research strengthens my hypothesis that there is an ongoing need for testing of reading, reading comprehension and targeted intervention to improve these levels for pupils below their chronological age.
In September 2015 there was no specific reading test, numeracy test or Cognitive Ability Tests (CATs) completed at my school due to prohibitive cost. Additionally, previous systems of assessment were not consistently used. Instead, there were assessments within departments, where heads of learning used their own methods to test the starting point of students in their classrooms. This meant that, as a whole school, we had no quantitative data whose measurability was consistent across teaching and learning. Furthermore, maintaining exam conditions within classrooms caused problems with some students in some classes, which included the following:
Students would behave inappropriately in class, often drawing others into their disruption to avoid doing tasks that involved heavy text work and the comprehension of it. Cowley (2001:221) describes disruption in the classroom as a ‘direct result of students not being able to access learning’.
Students would refuse to even try or would expect a large amount of support from teachers or TAs. The Literacy Trust (2016:4) highlights this in their State of the Nation report as a problem.
English teachers raised concerns of how accessible exam questions were for some students. The late requests for exam dispensation meant that students were missing out on possible additional support. This was because we were unable to show that this was the ‘normal way of working’ for the individual over the last 2 years, as stipulated by the Joint Council of Qualifications (JCQ) (2016/17:21) Guidelines.
Secondary text books are written for readers aged 12+ (Brooks 2013), and assumptions were made that students in year 7 or 8, or even higher up the school, could read the texts provided for the secondary curriculum. Rennie (2016) and Denti (2004) recognised that for many KS3 readers, teacher assumptions were detrimental to the actual ability and progress of students.
Students did not understand technical terminology/key words in subject areas.
A simple example of this is a year 9 student in Science being asked to give one drawback of hard water. The response was “ships may collide with it and sink.” They could read the word ‘hard’ but could not comprehend that it may have another scientific meaning. This may bring a smile to the reader; however, it shows the continuing need to improve comprehension skills throughout the secondary school phase. Without a baseline test it would be very hard to monitor this and intervene appropriately, as Tarchi (2016) suggests.
Students with significant delay in learning on transition to secondary school can be overwhelmed not just by the environment but also the demands of the curriculum. Additional subjects and complex vocabulary prevents them accessing the curriculum affecting their self-esteem and confidence. (DFE 2015)
Teachers and TAs were unable to explicitly identify students for targeted intervention.
Without these baseline assessments we were unable to critically evaluate what students could manage and where the gaps in learning were. This is reflected in the key findings of the study by the Literacy Trust (2016).
These problems affected the whole school. We required a measure of reading comprehension that gave results that teachers could use in their classrooms to help improve attainment and identify students who require further intervention to bridge the gaps in their learning.
Ethical considerations were paramount, and a question arose from the outset of this research: which approach would give the best outcomes, not only for the students, but for my team and the school as a whole? A key consideration was the impact on teaching and learning in KS3 (the sample for this research). Inductive research allows for data collection that can identify emerging patterns and relationships between variables that would help me draw conclusions. Alternatively, a deductive approach could be used to test the (still tentative) hypothesis that testing reading comprehension would reveal gaps in skills that needed targeted intervention; the deductive approach ‘moves towards hypothesis testing, after which the principle is confirmed, refuted or modified’ (Gray 2004:14). I decided that because this research would shape school policy, an inductive approach was required, implemented through evaluative methods that could provide answers to future questions from the SLT.
Evaluation research can be defined as a type of study that uses standard social research methods for evaluative purposes, as a specific research methodology….which should enhance knowledge and decision making and lead to practical applications (Powell 2006:102).
Evaluation entails ‘looking at a specific set of practices with regard to their functioning efficiency and quality’ (Hitchcock and Hughes 1995:31) which, for the purpose of this study, would be analysing how testing reading comprehension helps with the identification of students for targeted intervention.
Evaluation should be seen as being central to making schools more effective and setting an agenda for staff development. (Hitchcock and Hughes 1995:34)
Cohen et al (2013) and Norris (1990) suggest there are theoretical differences between evaluation research and other approaches. This study took a mixed method approach and consisted of both quantitative and qualitative elements. The quantitative element, which tends to focus on numbers and frequencies rather than on meaning and experience (Bryman 2004), was the New Group Reading Test (NGRT). Qualitative methods were employed in the form of a focus group gathering data through discussion at senior level, and questionnaires investigating the implementation, completion and effectiveness of the NGRT by TAs. These methods helped to gain more insight into the perceptions and effectiveness of the test from others’ points of view, providing a much more balanced and informed study (Bell 1987, Wilson 2009, O’Leary 2010). Used by practitioners and viewed as ‘anti-positivist’ in its approach, in contrast to a quantitative approach, this interpretative style of research is sometimes criticised over the extent of its validity because of the many influencing factors that statistical research eliminates (Hitchcock & Hughes 1995).
However, by conducting elements of both quantitative and qualitative research I aimed to gain maximum validity for my findings.
Because qualitative and quantitative methods involve different strengths and weaknesses, they constitute alternative but not mutually exclusive strategies for research. (Patton 2002:14)
A mixed method approach using the questionnaires, raw data and focus group was ultimately designed to address the need to test comprehension skills at this secondary school. However, I had to be cautious with the study and analysis so as not to influence my results; bias is described as ‘any influence which distorts the results of the study’ (Gray 2004:574). By conducting a mixed method approach, quantitative and qualitative data could be cross-checked, which gave the opportunity to triangulate my research and increase the validity of my findings.
Triangulation refers to the use of more than one method of data collection within a single study (Hitchcock & Hughes 1995:180).
This was achieved by comparing responses from the focus group and the questionnaire, and looking at the raw data. Cohen and Mannion (2007:143) suggest that methodological triangulation is the form most likely to be used in research and has the ‘most to offer’, and it therefore contributes to validity and reliability.
Data collection methods for this study
A focus group consisting of the SENCO, the Head of English and the Deputy Head was formed prior to any set-up, to discuss which test would be the most valuable, both in terms of cost effectiveness and the raw data it would provide.
Raw data from the NGRT was collated. In the first cycle, all 450 pupils from Key Stage 3 (KS3) (age 11 – 14 years) completed the test. The second cycle (6 months later) involved the pupils who were targeted to receive intervention from the first cycle to see if there was an impact on attainment.
Questionnaires were sent to the SLT, Heads of Learning (HOLs) and all Teaching Assistants (TAs) to ascertain whether the raw data reflected similar outcomes to individual teacher assessment methods. Additionally, the questionnaires gathered information on whether staff were using the results to inform their planning.
Focus groups are an important way of discovering what others think about a particular programme or study, including individuals’ feelings, attitudes and personal opinions (Mertens and McLaughlin 2004). Having an idea of how teachers perceive and value the process of testing reading comprehension in my school would assist this study in finding proper measures and the correct implementation of tests to secure the best outcomes.
The use of focus groups is based on the assumption that group members have information and can formulate and express their opinions, feelings, and behaviour in words, but that they need the researcher and the group context to extract this information. (Gilflores and Alonso 1995:2)
In addition, focus groups should be combined with qualitative investigation such as questionnaires to ‘obtain more meaningful conclusions’ (Cohen et al 2013:377).
Using a focus group at the beginning of this study allowed other stakeholders to be involved in implementing a process of testing that would be embedded in the school improvement plan. The areas of focus were:
- Which test provides the best data for the school?
- Which is the most cost effective test and why?
- How is the test going to be implemented?
- When should the test be implemented?
- Who is going to implement it?
- Who is responsible for collating and recording the data?
- How is the data going to be distributed?
- Who is responsible for ensuring results are shared and recorded?
Mertens and McLaughlin (2004) concur that having questions thought out in advance to discuss in the focus group deters stakeholders from straying onto other threads of thought during the meeting.
NGRT Raw Data
Sapsford and Jupp (2006:153) state that ‘numbers give the strongest impression of factual accuracy.’
The NGRT online test is one I had used for the previous four years in my former school, where I had used the data it provided to identify students for additional reading intervention with great success. The assessment adapts to the learner’s ability as they work their way through, which makes it versatile and appropriate for all abilities.
NGRT Digital is an ideal screening/monitoring test for groups of students. Its ability to show ongoing progression and achievement of students makes it an essential tool that satisfies the literacy assessment requirements of all four UK inspection frameworks. (GL Assessment 2015:2)
Additionally, the NGRT online package was identified as the most cost effective way of testing, as the analysis and marking of the test is done electronically and presented as a spreadsheet with areas of weakness and strength highlighted on the result sheets. This meant that I could print the results for each class and present the scores to the teachers directly, which promoted ownership of the data alongside the areas for development for each individual student. This accords with Hargreaves’s (2007:38) vision that ‘Teachers will need to be the drivers, not the driven’. Teachers could then identify areas for improvement and plan more effectively to bridge the gaps in learning in their classrooms.
The NGRT can be used every six months, so it was agreed that students receiving intervention would be tested twice a year; this progress data, combined with data collected from the TAs’ intervention programmes, would be used to check progress and identify further students for intervention. The rest of KS3 would be tested once, during term six of the school year. This would then be reviewed after the first year to see what impact the test had had and whether term six was the best time to run it. The Vernon Spelling Test, which assesses spelling ability, was also used; it was agreed that the two tests would give us the necessary data to identify students for literacy intervention, and therefore one spreadsheet was made with both sets of results side by side.
My final step to realising my objectives was to design a questionnaire (Appendix 1) that would determine all the respondents’ (37 colleagues) access to the NGRT, its results and its suggested use in school. The defining characteristic of a questionnaire is that it ‘is a written form of questioning’ (Thomas 2009:173), and it was chosen because I needed to gather data quickly from a number of respondents (Bell 1987). The design of the questionnaire had to be carefully considered because, as Newby (2010) informs us, keeping things simple encourages participation, and I wanted all stakeholders to be given the opportunity to have their say. Therefore, each question was constructed to be simple, requiring colleagues to tick their answers or give a brief comment. This was important since the results from this study would help shape the way forward for testing comprehension in future years. The questionnaire consisted of nine questions, both open and closed. Cohen and Mannion (2011:338) state that tailoring questions in this way not only addresses ‘the matter of appeal to respondents’ but, perhaps more significantly, ‘the matter of accessibility to the questionnaire for respondents’, enhancing validity and reliability.
The questionnaires were distributed to TAs, the SLT and Achievement Coordinators, who set aside time to complete them.
In addition to the ethical considerations already discussed in the methodology, my first consideration was consent. Informing a person in a research setting of what you are doing and getting their permission is seen as good practice (BERA 2011). For the purpose of this study I needed to gain consent from parents and staff. Gregory (2003:35) states that ‘research undertaken without the explicit consent of the researched lacks an adequate moral basis’. Fully informed consent must be sought from the relevant adult ‘gatekeeper’ as well as the child (Gray & Winter 2011), so a letter was sent to parents informing them of the NGRT, with a phone number should parents require further information or have additional questions. Further ethical considerations were made because I was conducting research with some children with SEN. For reasons of equity and social justice, consideration should be given to how pupils with SEN can ethically participate in educational research, ensuring full participation and understanding along the way, with barriers removed so that their voice is heard (Gray & Winter 2011). In the delivery of the NGRT, careful consideration was given to the children’s learning abilities, as highlighted in The Code of Practice (2015:71). Some examples of making the test fully accessible are:
- visual prompts for Global Delay students
- increased font size of the text for visually impaired students
- a quiet room for Autistic Spectrum Condition students
‘People with learning disabilities are not only valid sources of information about their lives but sometimes they are the best or only source’ (McCarthy 1998:144). Aubrey et al (2000) suggest it may be difficult to gain full and informed consent from children with SEN/D:
In the case of participants whose age, intellectual capability or other vulnerable circumstance may limit the extent to which they can be expected to understand or agree voluntarily to undertake their role, researchers must fully explore alternative ways in which they can be enabled to make authentic responses. In such circumstances, researchers must also seek the collaboration and approval of those who act in guardianship (e.g. parents) or as ‘responsible others’ (BERA 2011:6)
I was aware that gaining consent could also be flawed because of the general compliance of children in my school setting, particularly given my influential role as SENCO (McCarthy 1998). Thomas (2009) and Morrow (2011:10) state that one of the biggest ethical challenges for researchers is the ‘disparity in power and status between adults and children’. It could be suggested that most of the students took part because they felt they were not able to decline. Therefore, I wrote to or telephoned parents to ensure fully informed consent was gained. Denscombe (2002:186) suggests this is good practice:
the ethical principle to operate is to seek consent from someone who has direct, formal responsibility for the welfare of that child …
The approach to this data collection involved gathering and recording information in such a way that it could be preserved and analysed in accordance with guidelines and the Data Protection Act (1998).
Staff who participated in the study were given the opportunity not to participate if they did not wish to and this is also seen as good practice (British Educational Research Association (BERA) 2011).
In its infancy, this group encountered some significant professional differences in approach when deciding which test would be best, not only for our school but for the students taking it. By staying inside the parameters of the questions provided, everyone was able to voice their opinion frankly without fear of judgement, and succinct meetings helped stakeholders to reach the best outcome within the time limits. Coleman suggests that,
“effective collaboration creates added value and that working together improves and achieves more than can be achieved separately.” (Coleman 2008:4)
It was clear from the results and responses that my questions could have had more clarity so that participants understood exactly what was meant. Bell (1987), Mertens and McLaughlin (2004) and Newby (2010) argue that ‘considering what your respondents might reply’, and what it will mean for your research, is important, particularly when analysing the data.
The following responses are the results of the meetings which informed the planning of the NGRT to KS3:
Which test provides the best data for the school?
The two tests the group focused on were the NGRT and the CATs (Cognitive Abilities Tests). Both tests were digital; all stakeholders were competent in the delivery and analysis of the CATs, but only one member of the group had any experience of the NGRT. The discussion developed into what outcomes we needed and in what timescale. One group member wanted to use the CATs solely because they had no experience of the NGRT.
Which is the most cost effective test and why?
Both tests cost £5.45 per student; however, the CATs took three times as long to complete as the NGRT. On the other hand, the CATs gave a more detailed analysis of each student’s holistic ability rather than concentrating solely on reading comprehension age. Neither test fully satisfied all stakeholders, so it was decided that the NGRT would be trialled for the first year; if the interventions proved ineffective, we would then trial the CATs.
How is the test going to be implemented?
There was diverse thinking about this question. The Learning Hub team, which consists of the SENCO, HLTA and TAs, was prepared to organise groups of students to be tested over the space of a week in computer rooms. The Head of English thought it would be better coming from her team, as they could deliver the test during English lessons. It was agreed that the English faculty would deliver the test for the first round of testing for all KS3 students in their English lessons.
When should the test be implemented?
Having the Deputy Head as part of the focus group enabled us to look at the school calendar and discuss when the test would be best delivered. A further suggestion was using the NGRT B test to measure progress six months after the first test. It was agreed that Term 6 would be the best time for implementation, as Year 11s would have finished their GCSEs and year group exams would be over. The results of the test would then inform planning for further intervention ready for the start of the new academic year in September. This meant that at the end of January the following year, students who had taken part in literacy interventions in Terms 1 and 2 would be tested again without the concern of students remembering the answers from the initial test. This would also give the Learning Hub team additional data to support students who were still below their chronological age for reading comprehension.
Who is going to implement it?
The Deputy Head was very keen for the SENCO to be involved with the implementation of the test. Therefore the Head of English and the SENCO sat down and agreed a week when TAs could support in the English lessons to enable all students access to the test in their lessons. A visual cue sheet was made by the SENCO for teachers and students to help them log on.
Who is responsible for collating and recording the data?
During this discussion the SENCO expressed her interest in collating and recording the data. All stakeholders were more than willing to agree to this as they felt that the responsibilities they already had would hinder the analysis and delay the results being given out quickly.
How is the data going to be distributed?
In the first discussion, the SENCO explained how a spreadsheet would be made to show the outcomes of the test; this was approved and the document was sent to all teaching staff. This prompted a further group meeting, where it was suggested that teachers would not use the data if they had to search for the document within school systems. Someone then asked whether the results could be put on SIMS (the school’s data system). This ignited further discussion, and it was decided to put students’ reading comprehension ages on every register within SIMS so that teachers would know at a glance the reading ability of every student in their class.
Who is responsible for ensuring results are shared and recorded?
Following our discussions, the raw data was collated by the SENCO and passed to the data manager within a week of completion. The data manager then updated SIMS so that every KS3 student had a reading comprehension age recorded, not only on a spreadsheet but also on the school’s information system. This remains current practice today.
This is our second year of delivering the test, and now that the focus team are working together and robust systems of operation are in place, other teachers have begun to use the data provided. This has resulted in further opportunities for all students to read in their classrooms, either to the rest of the class out loud or individually reading about a particular topic at a level relevant to their ability. All teachers are reminded on a regular basis of the results and how they should utilise this in their intervention strategies in class to meet pupils’ needs. It has also been utilised by pastoral staff to target support for behaviour concerns.
As a school we now value early identification of reading and spelling ages and have rewritten policy to include NGRT and Vernon Spelling testing for all Year 6 students coming into secondary school from primary school. This is underpinned by the SEN Code of Practice (2015) which states:
All schools should have a clear approach to identifying and responding to SEN. The benefits of early identification are widely recognised – identifying need at the earliest point and then making effective provision improve long-term outcomes for the child or young person. (SEN Code of Practice 2015:94)
Identifying children and young people who are significantly behind their peers is vitally important – with early identification the school can ensure that students receive intervention promptly.
Raw Data Analysis
The following table shows the analysis of the first NGRT for KS3. Appendix 2 is the raw data from which this analysis has been drawn. Students’ results have been grouped by year group and then broken down into specific age ranges.

Results of the first NGRT July 2015

| Reading comprehension age below 7 years | Between 7 and 9 years | Between 9 and 10.06 years | Percentage of year group below chronological age |
| --- | --- | --- | --- |
| *(raw data by year group in Appendix 2)* | | | |
This highlights the students who would benefit from targeted intervention to improve reading comprehension skills. However, caution needs to be taken when analysing the raw data. There are many factors that could affect the accuracy of the test, for example students feeling unwell on the day, trauma in the family, and/or poor mood. We need to be aware of all of these possibilities and encourage discussions with the students to discover how they felt they had done, in order to get a broader picture of their learning skills. The American Educational Research Association (AERA) (1999) suggests that:
High-stakes decisions should not be made on the basis of a single test score, because a single test can only provide a “snapshot” of student achievement and may not accurately reflect the student’s progress and achievement. (AERA 1999:2)
We predict that this early identification at Year 7 with targeted intervention will result in a significant rise in students being able to access their GCSE texts in Year 11.
The current government has accelerated the academisation of all schools, abolished levels and introduced changes to National Curriculum assessment and new exam specifications. Under the new Key Stage 2 assessments, only ‘66% of 11-year-olds read at the expected level in 2016’ (National Literacy Trust 2016:4). Our data suggests that only 61% are at the expected levels. These results were, personally, not surprising. However, SLT were surprised by some of the very low scores some students had achieved. This underpinned my rationale for the NGRT to be part of our assessment programme not only in transition from Year 6 to Year 7 but for the rest of KS3. The Education Endowment Foundation (EEF) was set up because children are underachieving, especially those who are disadvantaged. They claim:
…. better use of evidence can make a real difference by helping schools spend money more effectively to improve the teaching and learning of children from low-income families. (EEF 2015:1)
GL Assessment concurs, maintaining that:
“We know that poor literacy skills can severely limit a child’s horizons. The New Group Reading Test (NGRT) allows teachers to assess reading and comprehension skills benchmarked against the national average and monitor progress.” (GL Assessment 2015:1)
From the raw data, it was identified that 3 specific interventions were needed for the different age groups shown in the table above. Research into which appropriate intervention would be best suited for each age group has been done separately from this study and is underpinned by Brooks (2013), ensuring that the training and delivery of all interventions have been carefully planned and thought through. This is good practice and one Ofsted (2009) would agree with. In a report of interventions in both primary and secondary schools the following observation was made:
“Intervention was more effective in the primary schools than in the secondary schools visited and stemmed from careful analysis of pupils’ weaknesses, flexible planning of programmes, thorough training of key staff and effective monitoring and evaluation.” (Ofsted 2009:7)
With this information, thorough training was given to the TAs delivering the interventions selected for this investigation; these are detailed in Appendix 3. Students with a reading age below their chronological age (from below 7 up to 10.06 years) received support in small groups for one hour once a week. The programmes ran for 6–8 weeks to avoid too much disruption to mainstream lessons. Students are taken from any subject area, but we take care not to impact the same subject twice in a year for any individual. The NGRT was given again to all students in KS3 in July 2016 and the following results were recorded.
Results of the second NGRT July 2016
| Reading comprehension age below 7 years | Between 7 and 9 years | Between 9 and 10.06 years | Percentage of year group below chronological age |
| --- | --- | --- | --- |
The table shows the significant increase in reading and reading comprehension skills after targeted intervention for all year groups from 2015 to 2016.
Linking identification to assessment and intervention is paramount; the National Literacy Trust (2016) concurs, stating:
Once you have identified pupils, it’s clearly vital that you make a plan to support their needs. For some pupils, this will include a referral to other agencies, such as speech and language therapy, for more detailed assessment of their needs. (State of the Nation 2016:13)
- 82 Year 7 students made accelerated progress of 12 months or more
- 49 Year 8 students made accelerated progress of 12 months or more
- 80 Year 9 students made accelerated progress of 12 months or more
Overall, 211 (46.1%) of KS3 students made accelerated progress in reading comprehension skills after targeted intervention was put in place. This is in line with Brooks (2013:19), who believes ‘Children’s comprehension skills can be improved if directly targeted.’ With this new information we can start the process again, this time moving students into the next bracket of intervention where appropriate. If progress has not been made, then further investigation is required to ascertain the reasons and whether outside agencies are needed to help support the needs of the individual (Lord et al 2017). This is also recommended by the Code of Practice (2015), where a graduated approach to meeting needs is expected.
With the data available to teachers at the time of taking registers, a significant impact on some teachers’ practice has been observed. They have acknowledged that this data provided them with information about pupils that they had not previously known and helped them provide a more individualised learning experience. However, understanding what a 7-year-old reader looks like can be challenging for secondary school teachers, who deal with students who are expected to read well by the age of 11 (Literacy Trust 2012). I have been asked to provide whole-school Continuing Professional Development (CPD) on this subject and to offer a termly ‘drop in’ session where teachers can bring questions or queries to me directly and we can work through and discuss strategies together. The importance of reading in every classroom is reinforced by yellow folders (a file in the classroom giving strategies and methods that work for identified students). This gives SLT the tools to help improve teachers’ practice and to add guidance to promote this, an expectation under the Code of Practice (2015). Quality First Teaching can have a significantly positive impact on outcomes. The Department for Children, Schools and Families (DCSF) (2008) personalised learning guidance suggests that schools need to consider whether their current practice is effective by:
- Designing highly focused teaching sequences/lesson plans, with high demands of pupil engagement
- Designing reasonable adjustments and special educational provision into lesson plans
- Focusing on questioning, modelling and explaining
- Promoting pupil talk, both individually and in groups
- Supporting pupil independence in their learning (DCSF 2008:13)
Schools enabling and coaching teachers to raise reading comprehension levels will only strengthen teaching and learning (Brooks 2013, National Literacy Trust 2012). In my setting, teachers are now beginning to see the reading comprehension capabilities of all their students and are identifying areas of need for themselves, trusting me to provide accurate data. This improves collaborative working across the school, and all stakeholders are involved in raising standards. Coleman (2008) recognises this when he says:
“Trust is often seen to develop through the process of collaborative working. In this respect, it is actually by co-operating on specific areas of work that familiarity, understanding and respect may emerge, thus providing the stimuli for trust to develop.” (Coleman 2008:18)
From a SENCO perspective, the data allowed me to identify students significantly behind their chronological age for reading and comprehension and to determine what training TAs would require to support students’ learning in this area. Having spent nearly a year training TAs in specific Literacy, Numeracy and Social interventions, they now feel confident in delivering quality intervention; this has been documented through team meetings and 1:1 line management meetings. By targeting the specific gaps in each individual and tailoring interventions to groups of readers with similar reading ages, the SEN/D team has developed its skills, taken on a whole new way of working and begun to see significant improvement in students’ reading ages (Appendix 4). TAs now understand the starting points of students with low literacy skills and are able to deliver the interventions that Brooks (2013) recommends for weak readers. Taking responsibility for interventions has raised the confidence of the TAs, and they implement them with diligence and professionalism. Bosanquet et al (2016) report that:
“Teaching assistants, who have a pedagogical role delivering specific curricular interventions, report a direct positive impact on pupil progress when prepared and trained and have appropriate support and guidance.” (Bosanquet et al 2016:116)
Students come willingly to interventions, which are well organised, well structured and have entry and exit criteria; students understand what they are learning and why. This has raised the skill profile of the SEN/D team, whose members are now seen as experts in the field and as a valuable resource by the SLT.
Thirty-seven colleagues participated in the questionnaire, from a variety of roles including support staff, teachers and leaders. The information that follows represents the views of those 37 participants and not necessarily the views of the school. A copy of the questionnaire and the responses can be found in Appendix 1.
Graph to show colleagues’ responses to the eight questions
During the questionnaire participants had the opportunity to comment on any of the questions asked and their responses are documented below.
Question 1 – Is it a good base line test?
“It provides a common baseline to assess pupils”
“Knowledge of reading age is essential in all subjects and allows staff to differentiate and challenge appropriately “
“It enables us to quickly assess what level a pupil is working at”
“It helps identify how a pupil is performing in relation to their peers”
“The test is too long and it is easy for students to get distracted”
Question 2 – Do students access the test well on line?
“The NGRT allows for efficient and timely completion”
“Their results can be lower than previous data and they will tell us they messed around or didn’t bother completing it properly.”
“The NGRT is all set up for students so access is easy.”
“Some of our high needs students feel rushed and can’t complete the test in the time allotted.”
Question 3 – Do you encourage colleagues to use the data?
“I have started to find out students’ scores and begun to have conversations with teachers.”
“Using this data helps me guide my team in Quality First Teaching. There are more low ability readers than I expected in KS3”
Question 4 – Do you think all staff use the data?
“The use of data is varied”
“Anecdotal evidence from learning walks does not always show knowledge of reading age being acted upon.”
Question 5 – Is there a better way to test Reading Comprehension?
“I am not familiar with any other alternatives.”
“Maybe, there are always new ideas and literature. We need to keep up with cutting edge testing.”
Question 6 – Is there anything you would change about the process of testing NGRT?
“No the earlier the assessment is undertaken the more opportunity there is to respond to the results.”
“I would prefer it if some targeted students could do the test on paper copies.”
“It seems to work well for our school.”
“Better and more relevant reading material to encourage reading and use of questions on what they are reading.”
“Yes- Find a more consistent testing approach as some students may not take it seriously.”
Question 7 – Do you think there has been an improvement of reading ages since introducing the test?
“Yes, evidence in data shared following intervention and re testing.”
“Yes, improved performance of students.”
“Yes, our Guided Reading Results have improved significantly.”
Question 8 – Do you think the students understand the importance of the test?
“I think the more we embed this into our practice the more the students will see the importance.”
“We need to have conversations with those coming for intervention to show them why the test is so important.”
“It’s too early to tell. This is a very new way of working, hopefully in a few more years we will have a better understanding.”
The final question on the questionnaire asked stakeholders to show how they used the NGRT data. The graph below shows the areas of use and the number of staff in each.
How do you use the data collected? (See graph below)
From these results it is clear that there is still a lot of uncertainty about the NGRT and the effectiveness of using the data throughout the school. This could be because:
- It is a new initiative.
- Data was first on a spreadsheet and then on SIMS.
- The SENCO had not shared it appropriately with teachers.
- Support staff are only just beginning to use data to identify needs.
As a result of these findings it is evident that further development of staff understanding is required. However, significant impact has already been made on learners’ progress, and with time a cohesive understanding of this process should continue to improve outcomes for learners.
We are now in the second cycle of providing this test, and there are still areas that need improving. The first is organising test dates so that stakeholders and the curriculum are not unduly disrupted. Setting dates in advance as part of the whole-school calendar has given a clear indication of when the test should be delivered; this gives the test credence and ensures it is taken seriously.
Furthermore, ensuring all students have taken the test is proving time consuming. Students who are sick long term, disengaged, or on another activity outside of school currently have no data entered, so further follow-up is essential. Without this testing, these students are at a significant disadvantage in gaining GCSEs and opportunities for employment (Rennie 2016). This group of students is likely to need the personalised, intensive support that will enable them to re-engage with learning (Literacy Trust 2016).
Introducing new ways of working within the SEN/D department has caused some teachers to question whether intervention is having a negative impact on progress in the classroom, and subsequent impact on examinations, due to time spent out of lessons.
Over time, balancing the need for intervention against the impact of removing a child from a subject area is going to need further monitoring and tracking. Even with research underpinning the need for intervention, some teachers are reluctant to give students any of their subject time or to allow them to attend; this in turn affected students’ outcomes from the interventions. With so little evidence on reading comprehension intervention at secondary level (Brooks 2013, Rennie 2016, Lafontaine et al 2015, Paul and Clarke 2016), it is argued that ‘more large scale methodologically rigorous studies are needed in this area’ (Paul and Clarke 2016:124). Additionally, they suggest that reading comprehension may require individualised support separate from basic reading skills. This is what the NGRT identifies in its content, so that targeted reading comprehension intervention can be put in place.
Gaps in data continue to be a significant problem, which has caused teachers and TAs frustration when trying to plan effectively. Introducing a TA with responsibility for checking, monitoring and following up on the NGRT would tighten the overall procedure.
Limitations and further research
With such a small-scale study there are obviously restrictions on what has been researched. The study did not include the impact on school micro-populations, or whether testing had a positive impact on a particular gender. The sample, evidence and analysis of this study were kept deliberately narrow, to focus on whether testing reading comprehension and implementing intervention from the outcomes had a significant impact on our students. The process is in its infancy; some students have already made significant progress, although regular reviews will need to be made over time.
Personally, I would like to research further the impact of targeted testing on a student’s self-esteem, motivation and well-being. Throughout my reading on testing and assessment, a common thread of enjoying reading for pleasure has woven itself into researchers’ conclusions, and it is an area that would have an impact no matter who was taking part or what their starting point was. Clark and De Zoysa (2011) agree, finding:
“The best way to promote mature readers is by instilling in children a passion for reading… All reading makes a difference, but evidence suggests that reading for pleasure makes the most.” (Clark and De Zoysa 2011:17)
Reading comprehension is pivotal to key skills in adult life (Clarke 2012). Testing and assessment of reading comprehension in secondary schools is fundamental to establishing gaps in learning and preparing students for their future (Ofsted 2009). Assessment in whatever form, as long as it addresses the needs of the child alongside appropriate, swift intervention, has a significant impact not only on attainment but also encourages students to want to own their personal learning (National Foundation for Educational Research 2016, Tarchi 2015, Brooks 2013, Cowley 2001). There is limited research on reading comprehension skills at this level, but those who have written journals and books agree that this is an area requiring more study, especially as current issues are almost identical to those in Smith’s (1976) research forty years ago. Between 2000 and 2010, government data showed that reading and reading comprehension skills stagnated compared with our international counterparts: 17 countries significantly outperformed England at secondary level in reading comprehension in 2012 (DFE 2015).
The NGRT assessment data underpinned the need for targeted intervention, not only in comprehension skills but in synthetic phonic skills and decoding skills as well. Without this data, students could struggle with the secondary curriculum within every subject area and teachers would be unaware of their gaps in learning until much later on in their secondary school journey, possibly leaving it too late to intervene and support them to reach their full potential. Personalising the support after testing shows our commitment to individuals’ progress not only in reading but also in Maths and Emotional Literacy.
Teenagers today need to feel as if they belong (Brandon 2001, Hansel and Pondiscio 2016), which is vital to self-esteem and coping in a secondary setting (Cowley 2001). Receiving targeted intervention gives a student every opportunity to bridge their gaps in learning and the tools to learn independently (Brooks 2013). After initial reluctance to attend intervention from some students (usually the hardest to engage), it was very surprising to observe that by week 4 of the intervention students’ attitudes were changing: the vast majority wanted to learn and were eager to take the next steps. Cowley (2001) believes that:
“In these situations it is very tempting to give up and to say ‘whatever’ and let the students behave as they wish. But maintaining high standards and refusing to give up on expectations is, in the long run, the key to success.” (Cowley 2001:8)
The rise in their self-esteem and motivation was tangible and, long term, could be an investment in the students’ attitude to themselves as learners, shaping their lives post-16 and into adulthood. As Brandon (1995:26) suggests, ‘To face life with low self-esteem is to be at a severe disadvantage’. This reaches further than a school’s predicted rise in attainment.
Collaborative working across the whole school with regards to testing reading comprehension raises the profile of literacy in its broadest sense (Clark 2012, IOE 2012). A whole school drive to improve reading comprehension skills maintains and sustains the rigorous implementation of interventions in KS3. By intervening early, every student will have the very best chance of accessing the GCSE curriculum and achieving good grades which will enable them to plan and map a positive future (Johns and Lenski 2001).
Most importantly, once testing was completed and gaps identified, we found that teachers needed to encourage reading for pleasure, involving parents and other family members as well as educational settings. The Save the Children Fund Read On Get On study (2014:4) agrees that the joy of reading ‘widens vocabulary, encourages imagination and develops knowledge’. It goes on to say that:
“Above all else we must together put reading and the joy of reading at the heart and the head of our culture.” (Save the Children Fund 2014:9)
Finally, a point to consider: in the country’s most deprived areas, more than a third of adults still lack the literacy expected of an 11-year-old (Clark 2012). With similar concerns about young people’s ability to comprehend what they read highlighted over 40 years ago, are we doing enough to improve the reading ability of all? If we do not intervene now and bridge the gaps in learning, we could be seeing the same statistics in another 40 years’ time.
Adelman, H. & Taylor, L. (2011) School Engagement, Disengagement, Learning Supports, & School Climate. Available at http://smhp.psych.ucla.edu/pdfdocs/schooleng.pdf [Last accessed 20/10/2016]
American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.
Aubrey, C., David, T., Godfrey, R. and Thompson, L. (2000) Early Childhood Education Research. London. Routledge Falmer Press.
BBC (2012) Viewpoints: Teaching Children to read. Available at: http://www.bbc.co.uk/news/education-19812961 [Last accessed 1/11/2016]
Beck, I L. McKeown, M G. & Kucan, L. (2002). Bringing words to life. New York. Guilford Press.
Bell, J. (1987) Doing Your Research Project. A Guide for First-Time Researchers in Education and Social Science. Buckingham. Open University Press.
Board of Education (1943) White Paper: Educational Reconstruction (1943) London.
Bosanquet, P., Radford, J. and Webster, R. (2016) The teaching assistant’s guide to effective interaction: How to maximise your practice. Oxon. Routledge.
Brandon, N. (2001) The Psychology of Self-Esteem 32nd edition: A Revolutionary Approach to Self-Understanding That Launched a New Era in Modern Psychology. New York. John Wiley and sons Inc.
British Educational Research Association (BERA) (2011) Ethical Guidelines for Educational Research [online] http://www.bera.ac.uk/files/2011/08/BERA-Ethical-Guidelines-2011.pdf [Accessed 2/11/11]
Brooks, G. (2013) Fourth edition. What works for children and young people with literacy difficulties? The effectiveness of intervention schemes. London. The Dyslexia-SpLD Trust.
Bryman, A. (2004) Social Research Methods. 2nd Ed. Oxford University Press
Castella, T. (2014) The anarchic experimental schools of the 1970s BBC News Magazine Available at: http://www.bbc.co.uk/news/magazine-29518319 [Last accessed 20/10/2016]
Clark, C. and De Zoysa, S. (2011) Mapping the interrelationships of reading enjoyment, attitudes, behaviour and attainment: An exploratory investigation. London. National Literacy Trust.
Clark, C. (2012) Children’s and Young People’s Reading Today. Findings from the 2011 National Literacy Trust’s annual survey. London. National Literacy Trust.
Cohen, L., Manion, L. and Morrison, K. (2011) Research Methods in Education. 7th Ed. Abingdon. Routledge.
Cohen, L. and Manion, L. (1980) Research Methods in Education. Beckenham. Croom Helm.
Coleman, A. (2008) Trust in collaborative working: The importance of trust for leaders of school based partnerships. Findings from original research undertaken into the importance of trust as a driver of school based collaborations. National College for School Leadership.
Cowley, C. (2001) Getting the buggers to behave. London. Bloomsbury.
Data Protection Act (1998) The Data Protection Act 1998. Available at: http://www.legislation.gov.uk/ukpga/1998/29/contents [last accessed 2/3/2017]
Denti, L. (2004) Introduction: Pointing the way: Teaching reading to struggling readers at the secondary level. Reading and Writing Quarterly, 20(2), 109-112.
Denscombe, M. (2002) Ground Rules for Good Research: A 10 Point Guide for Social Researchers. Buckingham. Open University Press.
Department for Children Schools and Families (2010) Working Together to Safeguard Children: A guide to inter-agency working to safeguard and promote the welfare of children. London. Crown.
Department for Children, Schools and Families (2008) Personalised Learning: A Practical Guide. Nottingham. Crown.
DES (1975) The Bullock Report: A language for life. London. Crown.
DFE (2011) Evaluation of Every Child A Reader. London. Crown.
DFE (2012) Research Evidence For Reading For Pleasure. London. Crown.
DFE (2015) Reading The Next Steps. Supporting higher standards in schools. London. Crown
DFE (2015) Special Educational Needs and Disability Code of Practice: 0 to 25 years statutory guidance for organisations which work with and support children and young people who have special educational needs or disabilities. London. Crown.
Education Endowment Foundation Reading: Comprehension Strategies. Available at: https://educationendowmentfoundation.org.uk/resources/teaching-learning-toolkit/reading-comprehension-strategies [last accessed 20/12/2017]
Dyslexia Action UK. Active Literacy Kit. Available at: http://www.dyslexiaaction.org.uk/page/active-literacy-kit [Last accessed 21/03/2017]
Gagen, M. R (2007) The importance of guided reading. Available at: http://righttrackreading.com/guidedreading.html [Last accessed 21/03/2017]
Gillard, D (2011) Education in England: a brief history. Available at: www.educationengland.org.uk/history (Last accessed 23/9/2016)
Galton, M. Simon, B. and Croll, P. (1980) Inside the primary classroom (The ORACLE Report) London: Routledge and Kegan Paul
Gil Flores, J. and Alonso, C. G. (1995) Using focus groups in educational research: Exploring teachers' perspectives on educational change. Evaluation Review, 19(1), 84-101. Sage Publications, Inc.
GL Assessment (2015) Case Study Making a difference with NGRT. Available at: http://www.gl-assessment.co.uk/sites/gl/files/GL839_case%20study_Torfaen%20LA_November%202015.pdf. [Last accessed 13/04/2017]
Gray, C. and Winter, E. (2011) The ethics of participatory research involving children with special needs. In Harcourt, D., Perry, B. and Waller, T. (Eds.) Researching Young Children's Perspectives. London. Routledge, pp. 26-37.
Gregory, I. (2003) Ethics in Research. London. Continuum.
Hansel, L. and Pondiscio, R. (May 2016) Job One: Build Knowledge. ESSA Creates an Opportunity, and an Obligation, to Help Every Child Become a Strong Reader. Knowledge Matters, Issue Brief #4.
Hargreaves, A. (2007). Five flaws of staff developments and the future beyond. Journal of Staff Development, 28(3), 37–38.
Hitchcock, G. and Hughes, D. (1995) Research and the Teacher: A Qualitative Introduction to School-based Research. 2nd Ed. London. Routledge.
HMI (1985) The Curriculum from 5 to 16 (Curriculum Matters 2) London: HMSO
Hopkins, D. (2002) A Teachers Guide to Classroom Research. Berkshire: Open University Press
IOE (2012) Every Child a Reader (ECaR) Annual Report Available at: http://www.ucl.ac.uk/international-literacy/pdfs/ECaR_annual_report_2011-12.pdf [ last accessed 17/10/2016]
Jackson, N. E. (1993) Moving into the mainstream? Reflections on the study of giftedness. Gifted Child Quarterly, 12(1), 46-50.
Johns, J. L., & Lenski, S. D. (2001). Improving reading: Strategies and resources. Dubuque. Kendall/Hunt.
Joint Council for Qualifications (2016/17) The adjustments for candidates with disabilities and learning difficulties. Available at: http://www.jcq.org.uk/exams-office/access-arrangements-and-special-consideration [Last accessed 1/11/2016]
Jones, K. (2003) Education in Britain: 1944 to the present Cambridge. Polity Press.
Lafontaine, D., Baye, A., Vieluf, S. and Monseur, C. (2009) Equity in opportunity-to-learn and achievement in reading: A secondary analysis of PISA 2009 data. Frankfurt am Main. German Institute for International Education Research.
Slavin, R. E., Lake, C., Davis, S. and Madden, N. (2009) Effective programs for struggling readers: A best evidence synthesis. Baltimore, MD. Johns Hopkins University, Center for Research and Reform in Education.
Leipzig, D. H. (2001) What is reading? Available at: http://www.readingrockets.org/article/what-reading [last accessed 2/11/2016]
Lord, P., Sims, D., White, R. and Palak, R. (2017) Communication Trust. Evidence for the Frontline: Evaluation report and executive summary. London. Pearson.
Mayer, R. E. (2007) Learning and Instruction. Cambridge. Pearson Education Inc.
McCarthy, M. (1998) Interviewing people with learning disabilities about sensitive topics: A discussion of ethical issues. British Journal of Learning Disabilities, 26(1), 140-145.
Mertens, D. M. (1998) Research Methods in Education and Psychology: Integrating Diversity with Quantitative and Qualitative Approaches. London. Sage.
Morrow, V. (2011) The Ethics of Social Research with Children and Young People – an overview [online] http://www.ciimu.org/webs/wellchi/reports/workshop_1/w1_morrow.pdf [last accessed 2/3/17]
National Literacy Trust (2012) Literacy Guide for Secondary Schools: 2012-2013. Available at: http://www.literacytrust.org.uk. [Last accessed 21/10/2016]
National Literacy Trust (2016) Literacy: state of the nation. Available at: http://www.literacytrust.org.uk/research/nlt_research/2364_literacy_state_of_the_nation [Last accessed 13/04/2017]
Newby, P. (2010) Research Methods for Education. London. Pearson Education Limited.
Norris, N. (1990) Understanding Educational Evaluation. London. Kogan Page Ltd.
Mullis, I.V.S., & Martin, M.O. (Eds.) (2015). PIRLS 2016 assessment framework (2nd ed.) Chestnut Hill, MA: Boston College.
Ofsted (2009) An evaluation of National Strategy intervention programmes. London. Crown.
Ofsted (2011) Removing barriers to literacy London. Crown.
O’Leary, Z. (2010) The Essential Guide to Doing Research. London. Sage.
One World Literacy Foundation (2013) Why reading is important. Available at: http://www.oneworldliteracyfoundation.org/index.php/why-reading-is-important.html [Last accessed 17/10/2016]
Patton, M. (1990). Qualitative Research and Evaluation Methods 2nd Ed. Newbury Park, CA. Sage.
Patton, M. (2002) Qualitative Research and Evaluation Methods. 3rd Ed. London. Sage.
Paul, S. A. S. and Clarke, P. (2016) A systematic review of reading interventions for secondary school students. International Journal of Educational Research 79 116 – 127.
Powell, R. (2006) Evaluation Research: An Overview Library Trends Vol 55, No. 1 Pp 237 – 246
Rennie, J. (2016) Rethinking reading instruction for adolescent readers: The 6R’s. Australian Journal of Language and Literacy, 39(1), 42-53.
Richardson, H. (2012) Many teenagers cannot read GCSE exam papers. Available at: http://www.bbc.co.uk/news/education-20346204 [Last accessed 1/11/2016]
Rising Stars (2016) Dockside reading intervention. Available at: http://www.risingstars-uk.com/Subjects/Reading-and-Ebooks/Dockside/About-this-series. [Last accessed 21/10/2016]
Sapsford, R. and Jupp, V. (2006) Data Collection and Analysis. 2nd edition. London. Sage.
SEN/D code of practice 0 – 25 years (2015) Available at: https://www.gov.uk/government/publications/send-code-of-practice-0-to-25. [Last accessed 12/3/2017]
Smith, C. B. (1976) Teaching Reading in the Secondary School. Association for Supervision and Curriculum Development.
Tarchi, C. (2015) Fostering reading comprehension of expository texts through the activation of readers’ prior knowledge and inference-making skills. International Journal of Educational Research, 72, 80-85.
The National Foundation for Educational Research (2016) Available at: https://www.nfer.ac.uk/schools/nfer-tests/ [Last accessed 13/04/2017]
The Save the Children Fund (2014) Read on Get on: how reading can help children escape poverty. London. Save the Children
Thomas, D. (2006) A general inductive approach for analysing qualitative evaluation data. American Journal of Evaluation, 27(2), 237-246.
Thomas, G. (2009) How to do your research project. London. Sage.
Warnock, M. (1978) Special Educational Needs: Report of the Committee of Enquiry into the Education of Handicapped Children and Young People. Cmnd. 7212. London. HMSO.
Wilson, E. (2009) School Based Research: A Guide for Education Students. London. Sage.
Appendices:
- Questionnaire
- Raw data from NGRT
- Interventions
- Literacy report
Appendix 1 Questionnaire
1. Do you think that the NGRT online test is a good baseline assessment?
2. Do you think the students access the test well in the computer rooms? Please give reasons for your answer in the comments box.
3. How do you use the data collected?
a) To identify students for intervention
b) To see which students need help with reading
c) In the classroom, to support teachers with differentiating for lower-ability readers
d) I don’t use the data
4. Do you encourage staff to use the data?
5. Do you think all staff use the data effectively?
c) I don’t know
d) Other (please specify)
6. Do you feel there is a better way to test the students’ reading comprehension?
d) I’m not sure
e) Other (please specify)
7. Is there anything you would change about the process of testing for reading comprehension?
a) If yes please specify what you would change.
b) If no please specify why
c) Other (please specify)
8. Do you think there has been an improvement in raising reading ages since the introduction of the test?
a) If yes please specify how you know
b) If no please specify why you think this is
c) Other (please specify)
9. Do you think that the students understand the importance of this test?
Appendix 2 – Raw data from NGRT
Appendix 3 – Interventions
Active Literacy Kit (ALK) for readers with a reading age below 7 years
With this intervention, TAs needed specific training: a two-day course to acquire the skills and knowledge required. An assessment linked to the programme then shows the gaps in each student’s learning and, once this has been completed, TAs have a starting point for each individual. This is a 1:1 intervention because of the personalised tailoring that is needed. Students received 1:1 support 3 – 5 times a week, as recommended by the intervention’s author, Dyslexia Action. Because I had completed the training at my previous school, no training costs were incurred; I delivered the training sessions myself. However, the resources needed came to just under £500, with ongoing photocopying costs averaging £150 per year. Dyslexia Action (2015) describes this intervention as:
… designed to support all children who experience literacy difficulties, whether dyslexic or not. It offers a series of short timed exercises which have been designed to build the skills needed for automatic, fluent and accurate reading and spelling. (Dyslexia Action 2015:1)
Dockside for readers with a reading age of 7 – 9 years
Decoding is ‘the process of translating a printed word into a sound’ (Mayer 2007). Dockside is the intervention we run for students with a reading age of between 7 and 9 years. By this stage, we know that students can build some words through phonic sounds and blending techniques, but they are not secure in this decoding skill, and the gaps are such that they lack reading confidence. Dockside provides ‘a systematic phonics-based reading intervention programme, specifically designed for older pupils’ (Rising Stars 2016:1). It aims to build confidence and to encourage engagement and motivation, and it includes a detailed guide for teaching assistants. Students with a reading age of 7 – 9 had small-group interventions twice a week for 8 weeks.
Badger Guided Reading for readers with a reading age of 9 – 11 years
This intervention is for those students who are close to being free readers but who require further comprehension skills to manage independently in the mainstream secondary classroom. The accompanying assessment is in line with the English National Curriculum, and the scheme is built on the belief that reading ‘real’ books across a range of genres encourages an appreciation of, and enthusiasm for, reading for pleasure, not just in school.
Guided reading has significant beneficial effects on helping students develop reading skills. It is one of the most effective tools not only to improve a student’s fundamental reading skills but also to help the student develop higher level comprehension skills. (Gagen 2007:1)
Appendix 4 – Literacy Report
Literacy report – Term 2
Literacy report – Term 3
Literacy report – Term 4
Featured image: Teddy Bear by MabelAmber on Pixabay, licensed under Creative Commons CC0