Taking words out of context?

This post began life in one form, and then further digging uncovered various things that meant I needed to rewrite it for clarity: there seem to be some misunderstandings circulating about the Phonics Screening Check, even among those (in fact especially among those) who are strong advocates for it.

Before I get into the details of the check, I should reiterate that I absolutely understand the value of an early and kindly-done assessment to make sure children are acquiring the skills they need for reading.

I also understand the value of placing restrictions on the scope of any test – but would note that thinking about which restrictions have been chosen can be instructive in relation to the assumptions of those creating the test and wishing to use its results.

In addition, I should say yet again that I see phonemic awareness, and an explicit focus on the building blocks of the sounds of our language, as a deeply valuable aspect of literacy teaching. Systematic Synthetic Phonics seems to me to have some very powerful characteristics which can scaffold the development of this awareness.


Ambiguity in the Check

1. Ambiguous graphemes

I was recently pointed to an article by Dick Schutz, “Are ‘Leading Educationalists’ Too Smart to be Dumb about Reading Instruction?”. It attempts to counter arguments made by David Reedy, Andrew Davis, and many others, together describing themselves as “a coalition of leading educationalists organized by the UK Literacy Association”, in an Open Letter.

The sections indented and in bold below are from the Open Letter, and the italics represent Schutz’s responses in each case.


With the pseudo-words, any plausible pronunciation is marked correct. So, for instance, children can decode “vead” in either of two ways: they can produce something rhyming with “bed” or with “seed”.

“veed” is not included in the Check, and none of the items actually in the Check have this ambiguity. “fot” is the first item in the 2013 Check, and children might possibly pronounce it as rhyming with “foot.” However, 97.1% of Yr 1 children pronounced the item consistent with the Alphabetic Code. 97.2% of Y2 children who re-took the Check pronounced the item consistent with the Code. The concern expressed by the coalition is nonsense, not the items in the Check.

However, with the so-called “real words”, the blend produced must match the sound of a real spoken word. Hence “blow” pronounced to rhyme with “cow” is unacceptable. Had “blow” not been classed as a real word, a response rhyming with “cow” would have earned a mark.

“blow” has not been included in the administration of the Check. The Specifications for the construction of the Check preclude the posited ambiguity.


Schutz links to the Framework for the check, which illustrates how constrained it is. As a test of sounding and blending, of course, some constraints are logical.

However, the Check’s constraints do not exclude the ambiguous ‘ea’ digraph from the check: on p.12, section 3.2.2, there is a table which specifically lists it, with the examples ‘head’ and ‘bead’. At the end of the table there are notes on the acceptable pronunciations of various words; the one for ‘head’ says:

In some regions the ‘ea’ in head is the same phoneme as in bead. The phoneme intended here is the same as the ‘e’ in bed.

There is the further note for ‘book’:

In some regions the ‘oo’ in book is the same phoneme as in room. The phoneme intended here is the same as the ‘u’ in put.

So although the Checks so far may not have included such ambiguities, the Framework assumes that at some point they will, and therefore the UKLA’s concerns in this regard would seem to be justified.


2. Ambiguous syllable stress

In a previous post, ‘Phonemic Capital’, I discussed the importance of syllable stress in creating a meaningful word out of a set of multisyllabic phonemes. Elizabeth Nonweiler commented to say that

There is no problem with which syllable to stress in nonsense words in the phonics check, as all the nonsense words have only one syllable.

I therefore removed the section I had written on this issue, because I thought it did not apply. However, I had not read her words carefully enough, nor had I yet seen the detail of the Framework (the perils of wading in…), and in fact it turns out that this problem could indeed arise, since the Framework itself expects that it will. It states (p.9):

The two-syllable words assessed will be real words because of the difficulty of inventing polysyllabic pseudo-words with limited alternative pronunciations that can be scored reliably. This is an issue for two-syllable words because of the effects of stress placement on vowel pronunciation.

This passage needs unpicking. It is clearly expected, or at least intended, that two-syllable words could be included in the Check. In fact, the Framework states that they should be (p.15):

Section 2 will contain … 4 x two-syllable real words with different orthographical representations, one with five letters, one with six letters, one with seven letters and one with eight letters (4 words).

There seems to me to be inconsistency between Nonweiler’s assumption that excluding two-syllable words from the pseudo-words list will solve the problem, and the actual aims and content of the Check. The Framework does not see a binary distinction between pseudo- and real words. In fact it seems to be constructed on the assumption that it would be beneficial if children did not know the real words that they are to decode: that is, these real words would be no different from pseudo-words from the child’s perspective. The Framework says (p.8):

The real words will include between 40 per cent and 60 per cent less common words, which children are less likely to have read previously. Less common words are included so that the majority of children will need to decode using phonics rather than rely on sight memory of words they have seen before.

Therefore, it is possible within the remit of the Check that the two-syllable words could be unknown to the children, and that they would in those cases possibly be faced with ambiguity about stress. Further, in ‘Phonemic Capital’ I discussed the issue of syllable stress in ordinary decoding (not just the Check), and said that in my experience with my own children, even with common, well-known words, syllable stress can be all-important to a child’s understanding of a word. The example I gave there was ‘began’, which is not a phonetically complex word, but which relies very much on correct stress for its pronunciation.

The notes on ‘head’ and ‘book’ show that it is expected that there is a correct pronunciation for these words: as Reedy et al. pointed out, this discriminates in favour of certain children for all sorts of complex reasons.


So what…?

To me, these inconsistencies also raise the question: why have any real words at all? If what is being tested is a narrow mechanical skill, what is the rationale for muddying the analysis of results with real words?

In English, sounding and blending are not always in themselves enough to decode a word fully (see my earlier post in which I talk about ‘tweaking’ after blending, a need to which Debbie Hepplewhite in particular has drawn attention). If we think of GPCs as like Lego bricks, in English we always need to be aware that the colour of some of the bricks may only be decided as a result of the other bricks with which they are matched, and often as a result of which other groups of bricks precede and follow them, too.

The various notes and caveats in the Framework for the Check make it clear that ambiguity can be a problem in decoding English. Yet there is no mention of the only means by which some graphemes can be disambiguated: context.


So various points arise, from the details of the Framework document:

  • The ambiguities of English orthography cause problems for creating a fair test of decoding words out of context.
  • The only way of testing decoding according to the SSP definition would be to use purely single-syllable pseudo-words out of context.
  • That is, the effectiveness of synthetic phonics as a method can only be properly tested by excluding a central feature of the English language: ambiguity of pronunciation (even to the exclusion of some very commonly-occurring words and GPCs).

If the check were a full test of children’s decoding of English, there would be no problem in including common ambiguous graphemes, such as those in bow/bow or read/read, in the test.

The phonetically restricted nature of the check, and the ambiguities it acknowledges but does not solve, clearly demonstrate, to me, that synthetic phonics is not, on its own, enough for children to be able to decode. If all ambiguous graphemes were to be excluded from the test this would not solve the problem of their relative frequency in the language, as a result of which young readers are likely to (a) know their meaning and (b) come across them relatively regularly.

Although the Framework document mentions ambiguities of pronunciation and stress, it does not suggest any means by which children are supposed to disambiguate. So children are left untested on one of the core skills required to decode English: they are being asked, if they do not know the word, to take a guess at its pronunciation. Since guessing is something that SSP is designed to obviate, this is an uncomfortable situation.

If a test were to be created purely in relation to a child’s ability to recognize unambiguous phonemes and create word-like sounds out of them, this would clearly not be a full test of their decoding ability, but only of part of it.

So, what is the rationale for not testing the other necessary English decoding skills as well, such as the ability to choose the correct pronunciation – and hence the correct word – when reading GPC-ambiguous text? Why leave a child to guess? It seems especially inappropriate to do this to children who have been taught that ‘guessing’ a word is something that they absolutely should not do, and to ask them to do it in a situation where they are being tested by a potentially unfamiliar adult.

The words in the check are presented without context. It is very interesting if the actual content so far has excluded two-syllable words and ambiguous graphemes, since this would suggest tacit recognition that there are other things going on when a child (or anyone else) decodes such words.

None of us could know which pronunciation of bow/bow, read/read or object/object to choose without the guidance of the surrounding context of vocabulary and syntax. So by excluding context, and also excluding words which require context for disambiguation, the check, if it has indeed excluded these issues, implicitly recognizes that context is necessary for the decoding of certain words, even very common ones.

In order to decode ‘bow’ fully, a reader needs context. Is the actor taking a bow or tying a bow? The exclusion of such ambiguous words from the check seems to show that the check is not a test of the whole landscape of decoding ability, but merely one aspect of it. A very important (actually fundamental) aspect, obviously, but not the only one.

Given that the test seems tacitly to demonstrate the importance of context in decoding many English words, even commonly-occurring and otherwise simple ones, why does SSP appear to ban the use of context in teaching children to read, and why does the check not include a contextual element?

If it does not in fact exclude context, why do many people seem to think that it does? Note that the Framework document says that a phonic approach should be the ‘prime’ one, which implies that it might not be the only one:

Since this phonics screening check is a decoding check, only words that are phonically decodable have been included. It is expected that teachers will ensure that elements of early reading not assessed in this phonics screening check are also taught, such as reading and discussing books. The following statements indicate additional skills that children should possess by the end of Year 1 but that will not be included in the phonics screening check.

By the end of Year 1 children should:

  • apply phonic knowledge and skill as the prime approach to reading unfamiliar words that are not completely decodable;
  • read many frequently-encountered words automatically;
  • read phonically decodable three-syllable words;
  • read a range of age-appropriate texts fluently;
  • demonstrate understanding of age-appropriate texts.

It is vital that children are given the opportunity to develop these skills throughout Year 1, in addition to developing the phonic decoding skills that are assessed in the phonics screening check.

The phrase ‘only words that are phonically decodable’ is interesting. How is ‘head’ phonically decodable to a child who does not know the word? Even if they do, how are they to know that there are not two possible pronunciations, as with ‘read’? And if a child ‘decodes’ the word with the correct pronunciation because they know it, they have done more than decode phonically: they have referred to their own store of vocabulary.

This seems to me to be a bit of a tangle, and there are some inconsistencies and ambiguities in the Framework, the Check, and the claims made about both (and about SSP too) which need a light to be shone onto them.

Answers, opinions and explanations gratefully received – I’m writing this blog in the spirit of enquiry and further understanding… If an SSP practitioner could explain (unambiguously!) what the SSP orthodoxy is in relation to deciding between bow/bow, that might stop me blathering on about this. Maybe.




54 thoughts on “Taking words out of context?”

  1. Thanks for having a look at the framework. The reference to pronouncing real words correctly still stands in the guidance materials for 2014, and the explanation of ‘blow’ still stands in the video, so it is clearly seen as possible that such a real word might be used. The words ‘ect’ etc., which Schutz dismisses because they are not in the check, are in fact in the practice materials, and are no more or less nonsense than the actual items.

    Your analysis summarises well the internal contradictions in the check which arise from the nature of the English language. It is difficult to see how the ‘black box’ of the check will give much useful information without further analysis which could have been avoided if nonwords had been used throughout.

    It remains, however, that whatever its form, the main purpose of the check is to ensure that SSP, and reading, are taught in a particular way, regarding the benefits of which the jury is still out.

  2. The Open Letter to Gove included this passage: “With the pseudo-words, any plausible pronunciation is marked correct. So, for instance, children can decode “vead” in either of two ways: they can produce something rhyming with “bed” or with “seed”.”

    Apparently, a response from someone named Schutz was the following:

    “veed” is not included in the Check, and none of the items actually in the Check have this ambiguity. “fot” is the first item in the 2013 Check, and children might possibly pronounce it as rhyming with “foot.” However, 97.1% of Yr 1 children pronounced the item consistent with the Alphabetic Code. 97.2% of Y2 children who re-took the Check pronounced the item consistent with the Code. The concern expressed by the coalition is nonsense, not the items in the Check.

    This response from Schutz misses the point entirely. When commenting in the Open Letter on the possibility that plausible alternative pronunciations would be allowed for pseudo-words such as ‘vead,’ we were simply rehearsing the official guidelines publicised for the check. Nothing at all hangs on whether ‘vead’ in particular would ever be used – it was just an example – no more.

    We were not complaining about the fact that plausible alternative pronunciations would be marked correct for the pseudo-words – it seemed entirely sensible, as a matter of fact. I have absolutely no idea why Schutz treated this passage in the Open Letter as a complaint of any kind. Nor do I really understand why those designing the check would, in the case of pseudo-words, seek to exclude examples amenable to alternative pronunciations in the first place. If a given grapheme has more than one sound association, what’s wrong with crediting more than one response with a mark?

    The situation with pseudo-words was only rehearsed by way of contrast with the so-called ‘real words’ situation – where correct pronunciation is essential. And, needless to say, we are well aware of the lengths to which test constructors must go in order to exclude a range of ‘difficult’ cases. I make this point in some detail in “To Read or Not to Read: Decoding Synthetic Phonics” http://onlinelibrary.wiley.com/doi/10.1111/2048-416X.2013.12000.x/abstract

    in order to bring out how distant dealing with the alleged real words in the check is from full reading of real words in context. The Open Letter contrasts the pseudo-words and ‘real words’ situation to make it clear that the validity of the test is dangerously confused. 5-6 year olds are tested on decoding with the pseudo-words. What on earth are they tested on with the ‘real words’? Both more and less than decoding per se.

    Again, we have to keep pointing up examples to demonstrate just how very very careful the test constructors will have to be with those ‘real words’. ‘bind’ must never appear as a real word. Why? Because a child might decode it to produce something that rhymes with ‘tinned’, and believe that what she has come up with is a real word, remembering that her mother said the previous week that she had binned some broken toys. ‘grind’ must not be used, because the ‘wrong’ pronunciation rhymes with the real word ‘grinned’. Similarly, ‘mild’ must not appear (because of ‘milled’)… And so on.

    There must be very many necessary exclusions, once you start thinking about this. Of course the check constructors will exclude such cases. For if they don’t, the check tests (not very well), among other things, confidence, extent of understood spoken vocabulary, and even whether the child already knows how certain words are spelled. Surely that was not supposed to be the idea at all!

    Any sane observer is left wondering what on earth is really going on here. Even SP enthusiasts (well some of them) are admitting that maybe the inclusion of so-called ‘real words’ was a terrible mistake.

    • Thank you. I do *try* to be a ‘sane observer’…

      Just for reference, info about Dick Schutz and his educational company 3RsPlus http://www.3rsplus.com/about_3rsplus.htm.

      Totally unambiguous regular words would be a logical test of SSP skill, and would presumably still function as a useful screening to pick up children who are struggling, but there would be an even more obvious disjuncture between such a test and real reading – so perhaps that’s what made people feel uncomfortable during the test’s design.

      • Actually, not really, as a child might know the real word from their own vocabulary, rather than from decoding it, and therefore pronounce it correctly on that count.

      • Sorry Nemocracy – I meant to say ‘totally unambiguous regular *pseudo-words*’ – tweeting at the same time is a dangerous thing!

  3. Yes, it makes sense for the purposes of the test, but those purposes need looking into. The fact that decoding skill can only be checked through using nonwords demonstrates the tenuousness of the relationship between decoding and reading English. Government policy seems to fly in the face of this simple observation by putting all the eggs in the decoding basket.

  4. “Totally unambiguous regular words would be a logical test of SSP skill, and would presumably still function as a useful screening to pick up children who are struggling”

    Responding as one sane observer (I hope!) to another:

    – but why do the above at all in a nationally imposed ‘check’, when teachers can do it for themselves in a ‘low stakes’ way in seconds when they need to, and also do it in a way that allows children to take account of context in the ways you so admirably illustrate in some of your comments here. Well – we know why there’s a check. Because fundamentalists are trying to make teachers teach in a certain way.
    Fortunately, most teachers are quietly subverting the official line with their customary good sense – viz of course they help children to ‘decode’ because this is often very helpful. But they do so in a way that allows them to deal with meaning, and with the fact that sometimes pronunciation can only be determined once meaning has been fixed.

    But I know from many personal messages that there are teachers and schools doing things more rigidly – perhaps they fear what Ofsted will say when they next visit, etc. This sometimes results in very unhappy children, and, in some cases, children becoming school refusers. The latter cases are normally children who can already read on arrival at school. (Have you seen some of the absolutely disgraceful comments of SP people who deny that such children can ‘really’ read – thus insulting the intelligence of many parents, schools, etc?)

    We already have Ofsted reports on primary schools that mark them down because, in Ofsted’s opinion, the pupils haven’t had enough phonics, or had phonics in the ‘right’ way. The situation may worsen from September when the Year 1 programmes of study in the revised National Curriculum come into force.

    • I can see the logic of having some kind of general check if there are clear problems with literacy in some schools, which become clear only at the stage of the KS1 tests. By that point, there might presumably be all sorts of embedded problems which would take big interventions to unpick. In schools where everything is basically fine in the KS1 tests, though, logically it might not matter particularly how they got there (within reason!). It seems a waste of everyone’s time and money to test children who are in a school which, from its KS1 results, is obviously strong on literacy. I was wondering about the feasibility of having a Y1 check only in schools which were not managing to get the children to the right sort of point at the end of KS1. Not a bald phonics check but something with small sections containing a bit of sounding & blending, a bit of disambiguation from context, and a tiny bit of comprehension – a means of seeing where any weaknesses may lie, and where support really needs to be given. Presumably weaknesses at KS1 could be just as much to do with comprehension as decoding?

      The issue of school inflexibility is one I’m slightly familiar with, and have mentioned in a comment elsewhere in this blog. A relative’s child was consistently held back in reading during Y1, with the teacher apparently refusing to move him up a book band more than once a term. He was making much faster progress in his reading than this, and was utterly bemused and miserable – and yes, it did contribute to school refusal and eventually to him moving schools completely. We have had the opposite experience, with the school being absolutely happy to accommodate the faster progress of our boys’ reading (and those of many others in the class, including children who could read when they started) while still making phonics sessions work for the whole class.

      I’m really interested in getting to the bottom of this, because I’ve seen similar anecdotes about inflexibility and unhappy children from other parents on Twitter, and I worry that the structured nature of SSP – which is presumably meant simply to be efficient and clear – ends up leading to an inflexible approach in some cases. Why, for instance, can it not be accepted that a child is reading to some degree when they start school? It doesn’t matter how they learned, or what level they have reached: is there any reason they couldn’t be included in class phonics sessions? Phonics gives a valuable insight into the structure of the language which they may well not have had the chance to learn about, and they will need to know the SSP analytical vocabulary as they progress through school. In guided and group reading, could sounding and blending be introduced to the already-reading child as a method of dealing with unfamiliar words, with teaching of phonics methods and skills happening ‘incidentally’, as Debbie Hepplewhite herself has recommended for able readers? The same approach could be taken for children like my two, who start school as non-readers but who take to reading easily and whose learning trajectory in this area is steeper than their classmates’. If that makes sense.

      I understand the concerns people have about the effects of introducing SSP across the board without looking at any unintended consequences, and I have huge sympathy for the people whose children have suffered as a result. But I also have sympathy with the SSP advocates I’ve been in touch with who seem perfectly happy to be flexible about how to bring different children along appropriately, and seem to spend half their time countering various awful accusations with ‘But this shouldn’t happen! SSP doesn’t say you should do that!’, etc.

      I think there are various different issues all tangled up here. One is high-stakes testing, as you’ve said, and it looks like there’s quite enough of that already without the PSC. Another is the issue of getting some schools to improve their literacy, and the wisdom (or not) of taking a stick rather than a carrot approach, and also of applying tests to everyone, when only a subgroup are in need of help (and those can be spotted via KS1 tests anyway).

      A further problem, from what you’ve said and I’ve also heard, is Ofsted pushing for SSP in a way which makes schools too afraid to use SSP in a creative and flexible way that’s appropriate for the children that they are actually teaching. Apart from the danger that this will produce a cookie-cutter approach, and potentially disengaged children, it seems to me that in the long run this also stifles the potential for people to make improvements to SSP itself.

      It also seems to me (having had a positive experience so far) that SSP has much to recommend it as a way into reading, as long as teachers are free to use their intelligence, flexibility and creativity in doing so. If people are not free to do this, and their professional life is micromanaged to a stifling degree, it is not surprising if things become rigid, dull, and blind to individual children’s actual progress and needs.

      In the little extra post I wrote last night, I outlined the results of a conversation I had with the poor SSP advocate teacher who I had been asking a series of repetitive questions relating to SSP and context/meaning. She was absolutely adamant that SSP should not exclude context as part of reading, and accepted quite happily that context is necessary for disambiguation of many homographs or words containing ambiguous graphemes. Given all the things that have been said about SSP I was surprised by this but she was completely definite.

      If she is correct, then ‘first, fast, and only’ as a mantra seems to have caused as many problems as it might have solved, if it has created the impression that being able to sound out words in order is enough. I haven’t come across a single SSP advocate, large or small, who has believed that that’s enough, so is this a case of miscommunication, exacerbated by the government’s unhelpfully aggressive and inflexible approach?

      Leaving aside the check, which has all sorts of problems associated with it, do you think teachers who dislike what they’ve seen of SSP would be happier with an approach to it which was sensitive to inclusion of context and meaning, as this particular SSP advocate was keen to stress it should be?

      Sorry that’s such a blurt. Not Mrs. Blurt, obviously – what a relief to leave all *that* nonsense behind!

  5. The SSP response to a situation in which a child is already reading, or picking it up very quickly, is that they still need to learn phonics because they will eventually come across words they cannot decode, or that their memory for words will let them down at a certain point and they will have to depend on decoding. This is quite surprising when you consider how many words a good reader will recognise immediately with no hint of decoding going on. It is also surprising in view of the research of David Share and others, which shows that readers teach themselves new words using their existing lexicon for support. Not only do they teach themselves the words but they remember how to spell them even if they are homophones for other words or possible words.

    Is there a need for pupils to know the vocabulary of phonics when they are reading? My daughter was able to say what a phoneme was when she was in reception class but has forgotten now. Does it matter? She is graduating in a couple of weeks. Yes, she was taught phonics 19 years ago. Phonics has never actually gone out of and come back into fashion.

    SSP does ‘allow’ disambiguation of homographs etc. by using context, once the reader has made a decision about the alternative possible sounds and pronunciations they are dealing with. It does not allow that a child might look at a picture and identify the word underneath by deducing that it had something to do with the picture – this is called ‘guessing’. It does not allow that a child might use their deductive skills to identify a word quickly because of its context, although how the teacher knows this is not being done is difficult to know – unless s/he insists that each and every word is sounded out.

    Perhaps we should be more concerned about the children who struggle with reading than those who are quick off the mark, when it comes to phonics. How realistic is a scenario in which such a child looks at an ambiguous word, decodes it to its possible versions, and then decides from the context which one to go with? In actual fact it is far more common for the child to have a quick phonic stab at a word and end up with something which does not fit the context at all, proceeding to carry on regardless having done the task expected. This child would do better to be listening to the meaning of the text as s/he reads, something which children who habitually sound out each word find difficult. These pupils need phonics for sure, but they are exactly the children who also need lots of comprehension support and experience of books. Sadly, the over-emphasis on phonics and the belief that it will eradicate illiteracy threatens this vital aspect of literacy teaching.

  6. This is all really interesting. As you know, I’m feeling my own way through the reading debate, and have very limited experience of different approaches in practice. Looking at it theoretically, though, I do understand your concerns about the potential for SSP to drive a wedge between the decoding aspect of reading and the comprehension/storymaking aspects.

    One thing that I have been wondering for a while is whether other methods, by their nature, push the teacher to focus more on comprehension and narrative structure during the process of decoding the words. If a teacher has been trained to guide a child in looking for contextual evidence as to which word they are looking at, then the separation between decoding and comprehension would probably not occur. Does that sound reasonable to you?

    This is why I put questions in my original post about how comprehension is supported during SSP reading and during other methods, and about how teachers’ practice changes when they change from one method to another. Elizabeth Nonweiler gave a very detailed and useful response in relation to her experience of switching to SSP, in which it was clear that in her view comprehension is very much part of the process; but so far she’s the only direct responder on that issue, and I would imagine that her experience would differ somewhat from that of a teacher switching to SSP in the current environment of high-stakes testing and pressure to conform.

    I’m still interested in this particular question, because it seems to me that the most urgent problem is to work out what is going so wrong that some schools are apparently letting comprehension work atrophy in reading work. Is it, perhaps, that in the methods the teachers were trained in, the comprehension work occurred naturally in the course of teaching reading, but that with SSP it needs explicit support that they have not had the chance to develop in their practice? Is it about pressure of time and numbers, or about training, or something else?

    I would imagine that any scheme (within reason) would work more successfully if the most effective, imaginative teachers and lots of resources are put at its disposal. It would not be surprising if SSP had a beneficial effect under those conditions. But since teachers are people, not robotic paragons of perfection, the big question is how well it works when teachers are under pressure, classes are big, resources are stretched, and CPD is inadequate.

    This applies to a multi-cueing method as much as to SSP, and perhaps it might be useful to unpick what the strengths and weaknesses are in each case. The biggest strength of SSP, it seems to me, is its focussed simplicity – but its biggest weaknesses are the danger that it is seen as a single pill to solve all problems, and its potential to be used very inflexibly (that is, being ill-matched to children’s actual learning pace or to their linguistic needs re accent or EAL). The biggest strength of a multi-cueing approach might be the fact that comprehension of and engagement with the text cannot be left by the wayside; its biggest weaknesses might be a dependence on (expensive) intangibles such as the quality of teachers’ interaction with the child, and also on the amount of time available for a 1-2-1 focus (including the quality of support at home).

    I can see why the government loves SSP: it’s spreadsheet-friendly, and has a simple, direct message. The government seems to view schools from an input-output perspective; they are looking at it, in a sense, as an abstract system. The focus on analysis by numbers is bound to exacerbate this, unless a specific and credible place is found for qualitative analysis as well.

    The reasons I feel positively towards a sounding-and-blending approach to working out words are nothing to do with spreadsheets, and arise from the fact that it fits well with our own circumstances. I’m very aware that our children have many advantages when it comes to learning to read: English is our first language, and we speak with RP, too. The boys had a vocabulary-rich baby- and toddlerhood, and they have a language-obsessed mother and a music-obsessed father. The extended family is very wordy, and talks *about* words and ideas and music a lot. There is a lot of listening to things and talking about responses to those things.

    We decided not to do any pre-school reading with the boys, but we did talk about words, and think about the sounds within them. When I started reading with them, sounding out came naturally, since words are sounds in the first instance, and I thought of it by analogy with reading music: interpret the symbols to find the sounds, and listen to the sounds to find the meaning. When I’ve written about Debbie Hepplewhite’s idea of ‘tweaking’ from sound-blend to real word, I’ve found myself thinking about it as ‘tuning in’ to the word. It’s an aural experience for me, and since I saw words as inextricably sounds *with meaning*, that was how I approached things when reading with the boys.

    I do think there can be something lovely about focussing on the sounds of the words that are being read – the music of them, as well as the sound as a route to meaning.

    BUT: If English had not been our first language; if our accents had been non-RP; if the boys had experienced a less wordy environment prior to school; if the teacher’s accent had differed substantially from the boys’ own, I imagine things would have been completely different even if the boys had been physiologically exactly the same. So although I’ve experienced a phonics-focussed approach as a very smooth one, I have found myself noticing all sorts of potential issues and barriers to progress along the way.

    How, for instance, is it best to deal with a situation where the teacher’s accent differs so much from the child’s that it is difficult or impossible for them to make the link between the sound being made and their memory of a particular word-sound with meaning? How can that gap be bridged? In such a situation, having a picture to point to, or a bit of chat about the story so far (ie context) might presumably make all the difference?

    It seems to me that there really is beauty in a sound-oriented system, but *only* if the child’s own sound-world is taken into account, and *only* if there are bridging strategies to help them ‘hear’ the language as their own, not just as a mechanical echo of their teacher.

    This leaves me with an imagined approach which is both SSP-plus-support and multi-cueing-with-new-emphasis: an SSP-type spine, with heavy primary focus on the sounds of language (its music – prosody – as well as its phonetics), plus support from other methods if needed, but always with the aim that these other methods are ways in to the sounds *and* meaning of the language.

  7. I think styles of practice are important when it comes to weighing up SSP as a method. Because the aim of teaching SSP is that the child should know every GPC and be able to blend GPCs into words, the teaching focuses on each sound/each word. If you want children to know each GPC, that is what you teach; if you want them to pass the check, you teach them to blend GPCs into nonwords. You give them decodable readers so that they are constantly practising this skill. This is how the simple and straightforward idea of teaching sounds balloons into a method, supported by all sorts of paraphernalia, spawning many rules and regulations and moving further and further away from the business of teaching reading. Although it’s about the sounds within language it really isn’t about hearing cadences and poetry but about dissecting written words down to the lowest common denominator. There isn’t any romance. It is a utilitarian method with easily measured outcomes (the phonics test score) which appeals to our desire to have easy solutions. And it has great value because of this mechanical aspect – it’s teaching the mechanics of translating written to spoken sounds (leave aside for now the fact that a bit of inspiration – a leap of faith – is needed in the special circumstances of English).

    Teachers who follow SSP programmes can and do, of course, support comprehension. However, comprehension is not part of the SSP approach. It is something which is added on. And if you have an important test of decoding skill looming, is it comprehension or decoding which will be compromised? I don’t think other approaches necessarily focus on comprehension either, and wouldn’t regard mixed methods as the only alternative to SSP. What I do think is that a different climate would allow teachers to judge what focus a child or class needs at a given time. Tests of decoding, deadlines for teaching GPCs, practice of tackling nonwords, and curricular targets geared to phonics all interfere with teachers’ professional judgements by introducing external pressure, and they change the climate of the classroom. You are right that the brightest and most imaginative teachers will come up with good ways of teaching the subject matter, but if the subject matter is phonics and their success criterion is linked to phonics, that is where they will concentrate their brightest and most imaginative efforts. And that may well be misguided.

    We are conducting a nationwide experiment in this country at the moment into the effectiveness of a SSP-led reading curriculum.

  8. I’ve not long since returned home after providing a training event in a primary school. I trained many of the staff two years ago in the Floppy’s Phonics Sounds and Letters programme – and from Year 2 onwards, staff use the Phonics International programme to continue as a spelling programme.

    Both these programmes are content-rich in terms of their bank of cumulative words and sentences/texts.

    The Year One teacher made a very interesting comment to me which I think is relevant to this discussion. She said that she has found the phonics teaching and children’s practice so effective that the children are reading better than ever; thus they are reading books far beyond those that Year One children would normally be reading – so much so that she increasingly has to spend extra time explaining more advanced vocabulary and supporting language comprehension to match the children’s decoding ability.

    In other words, they have reached the point where they are able to read more widely than in previous years – and both the programme and the level of literature introduce words far beyond the oral language of the children.

    I’m not here to describe or defend other systematic synthetic phonics programmes and practice, but I will suggest that those schools that are using a content-rich core programme will be achieving great things in both phonics and in vocabulary enrichment and comprehension. This may look very different, school to school, from those schools which tend to provide the more ‘fun games and activities’ approach in place of a purpose-designed core programme.

    I likened this only today to teaching football. Two teachers could teach football – but it could be that the teacher who has played football for many years, and has received formal coaching in clubs, and has a passion for football, may well teach the football much more effectively and with more fit-for-purpose activities than the other teacher who has less experience of football, never attended any football clubs, nor watched any football matches – nor had any passion for football in particular. They both teach football – but the experience for the children varies enormously.

    It is the same with systematic synthetic phonics teaching – the programme counts, the training counts, the CPD counts, the dedication to it counts, the interest in it counts – and professional curiosity to ‘want’ to know how well the pupils are doing compared to others – also counts.

    If teachers teach systematic synthetic phonics devoid of ‘meaning’ (which seems to be a commonly-occurring criticism), then it bears no resemblance to the SSP provision that I and my associate SSP proponents advocate.

    • Debbie, my apologies – I’ve only just got to this comment, having thought I’d read all of the ones you made! Your experience with the Y1 teacher sounds very familiar – the use of a content/context-rich approach around SSP causing the *right* kinds of problems at the next school stage 🙂

      My questions elsewhere about more able readers relate to this: when SSP *works* in the context of the school’s other practices and curriculum, and children end up out of sync with current expectations, they need, and the teacher needs, new kinds of support since there will be a different relationship between the level of vocabulary and the cognitive development of the children.

      I have asked about SSP and able readers, and been told that in relation to SSP the concept of ‘more able readers’ is ‘not unproblematical’. I can see that what constitutes an ‘able reader’ might shift, but there will always be some who are more able than others. The flip side is what to do when a large proportion of a class, having gone through the sort of programme you describe, could be classified as ‘able readers’ according to past measures.

      I’m interested in understanding both scenarios, but have yet to have an answer as to why my assumption that there are ‘more able readers’ is problematic, so at the moment I feel a bit stuck!

  9. There is nothing within the SSP principles that makes any reference to the teaching of comprehension. These are the principles as set out by the RRF:
    There is mention of a rich literacy background to the teaching. One would hope there would be a rich literacy background in every infant classroom, but the SP teaching principles do not deal with this provision. They deal with decoding pure and simple.

    If a teacher is covering comprehension and providing meaningful and purposeful reading experiences that is, of course, excellent practice, but it can’t be said to be systematic synthetic phonics. If Debbie’s programme supplies materials for teachers to use for comprehension and vocabulary support that’s great, too, but it cannot be said to be SSP. One would have to identify a lack of this provision in other teaching approaches to identify it with SSP.

    In fact the Rose Report, the core criteria for phonics programmes, and the RRF all refer to the principles of synthetic phonics without referring to comprehension, and the simple view of reading is regarded as showing that decoding and comprehension are separate. The phonics check has nothing to do with comprehension, except by mistake.

    So what are we to make of all this?

    To use the football analogy: preparing footballers to take penalties and nothing else would be unthinkable because they have to play matches and know all aspects of the game. Similarly with reading: insisting on phonics “first, fast and only” and having a phonics check over-emphasises a single aspect of the reading game.

  10. Did you miss the part about the cumulative words, sentences and texts? Did you miss the part about not applying the multi-cueing guessing strategies when reading texts?

    In other words, of course the SSP provision includes resources from word level to text level and reading books – and of course this involves comprehension and vocabulary enrichment.

    It’s as if you are clutching at anything to discredit SSP teaching and provision through a phonics programme – and I sometimes think such arguments regarding ‘comprehension’ (or the lack thereof) are bordering on nonsensical.

    • I’m editing this comment a little because I’m not sure who Debbie’s comment answers.

      Just to clarify: My question about ‘more able readers’ was directly triggered by a statement by John Walker of Sounds Write. I had assumed, from everything I have seen of Debbie’s work, that differentiation was not an issue – that it would happen as part of SSP – so I was surprised by what John Walker said. I asked him what he meant, but he didn’t want to explain on Twitter. So I wrote a little post to try and explain my question further. Nobody who teaches via SSP seemed to want to answer there, either.

      If anyone else can explain what he might have meant, I’d be glad to be able to understand.

      Debbie raised the issue of adapting to the results of *successful* SSP – children whose reading is at a level beyond what would previously have been expected. I specifically said in response that this was something I’m interested in, and I don’t understand how asking about this could possibly be interpreted as ‘clutching at straws’ to ‘discredit’ SSP – if that was directed at me; I’m not clear whether it was.

      The reason I originally wrote about comprehension was that a number of people had specifically complained about this as a problem with some children learning to read via SSP. I initially found it hard to understand how it could be possible that children could read without understanding. Andrew Old derailed the discussion somewhat with his insistence that comprehension and decoding are entirely separate things. That is part of the context of what I wrote.

      I *really* didn’t expect this blog to get the attention it has. When people started reading it I hoped that by articulating people’s fears, worries and complaints about SSP, it would provide the opportunity for clarification, and with any luck help to assuage fears and identify problems more specifically.

      Sometimes an outsider can clarify by asking stupid questions, if it forces people to clarify their position in their answers. But sometimes it just makes things worse, which is perhaps what I’ve done.

  11. Yes, I’m taking your word for it Debbie that your programme includes comprehension support alongside phonics, but my point is that comprehension is not a *necessary* aspect of synthetic phonics as a method. The method, as described by Rose and by the RRF, is a method of decoding written graphemes into spoken phonemes and blending the strings of phonemes which result. At a sentence or text level this could involve decoding material which is not understood by the decoder, even if at a word level it is. At a word level it could involve decoding words incorrectly (‘steak’ as ‘steek’) or decoding words correctly but not knowing their meaning (I have come across this with the word ‘rug’, for instance, which isn’t known by many children, and in a slightly different example ‘bud’, believed to refer to a bird in the accompanying illustration). The further we get into an SSP ‘first, fast and only’ straitjacket, the more likely these failures of comprehension will go unattended. The check is the ultimate demonstration of this, in that a child attempting to match the nonword items to their vocabulary is at a disadvantage.

    If SP progress can only be checked by a pupil reading nonwords, then clearly that is the aim of phonics teaching – not reading but decoding skill. Fine as a strategy to support reading; not so fine as an end in itself.

    Can you say a little more about what you think is nonsensical about arguments involving comprehension? For instance, do you believe that children will automatically understand text they have decoded? Or do you believe that teachers will always support comprehension despite being told to concentrate on decoding? Perhaps you can elaborate on the methods used in your programme to ensure that comprehension support is added to the SP element.

  12. Meraud, I think it was me who was being accused of wanting to discredit SP!

    As I’ve said before, I have no problem with SP being used appropriately, so I’m not aiming to discredit SP. But I think the government’s policy should be looked at with a critical eye, because it is based on an overemphasis of SP which may have some unintended consequences arising from the nature of SP, English orthography and school accountability.

  13. My sincere apologies – I was directing my last comment at nemocracy’s suggestion that SSP practice is devoid of attention to ‘comprehension’, endeavouring to point out that the references in the RRF ‘Synthetic Phonics Teaching Principles’ to cumulative decodable words, sentences and texts – and to avoiding multi-cueing reading strategies in the process of reading texts – surely indicate that, of course, SSP includes attention to comprehension.

    Regarding questions about precocious readers, I can only speak for my own programmes and practices whereby I promote ‘two-pronged systematic and incidental phonics teaching’ and where I provide substantial banks of cumulative words/sentences/texts as part of the programmes – with guidance for ways of providing differentiation – such that I have no concern about addressing an early or advanced reader.

    My use and heavy promotion of Alphabetic Code Charts from Reception onwards, in both the programmes I am associated with, should surely indicate my ambition and expectations and provision for all children including the quicker-to-learn early readers and writers.

    However, it is really very worrying that schools in England generally did not use the match-funded initiative for training purposes because many teachers felt they were already knowledgeable enough. The perspective on ‘enough’ depends on one’s understanding of rigorous practice – or on whether one values very specific guidance and training for very specific programmes.

    All of this depends on a view, or an understanding, or appreciation, of whether people like myself and John Walker, and others associated very closely with specific programmes, do have an expertise and insight over and above (for example) trainers and consultants still focused on promoting and delivering training based on ‘Letters and Sounds’ (DfES, 2007) which I, for one, don’t even consider to be a programme per se. This is why I was banging on about teachers of football and how their provision may come under the same title but is not necessarily identical or of equal content or quality!

    Whereas nemocracy raises concerns about government guidance and the Year One phonics screening check perhaps leading to undue practice of, for example, children reading lists of nonsense words – my point would be that if teachers were trained and knowledgeable enough, and were supported by really good-quality programmes, misuse or misunderstanding of nonsense words may not be such an issue.

    I am suggesting, then, that the government is not at fault for promoting systematic synthetic phonics practices and the Year One phonics screening check. Rather, we still have some way to go in understanding what SSP can ‘look like’, how it relates to ‘comprehension’, and how important the Year One phonics screening check is in a number of ways – not least that it informs teachers about their effectiveness, and generates conversations which clearly highlight that the teaching profession does not share a common understanding about SSP teaching or the value of national snapshots of children’s word-level reading skills. Meanwhile, the messages teachers receive about the body of research are not at all clear whilst so many remain wedded to multi-cueing reading strategies where these amount to word-guessing – and even wedded to practices such as mini whiteboard work and grouped tables for all subjects and activities.

    Perhaps it is time that the focus of detailed discussion moved away from the constant argy-bargy about what SSP is or isn’t – to a more in-depth look and understanding of what phonics practice – including ‘comprehension’ – and wider language and literacy provision – actually looks like in different schools.

    I don’t think this is a case of defending teachers’ right to choose their own practices so much as taking a professional look at how teachers’ practices affect children’s basic literacy knowledge and skills along with the quality of higher-order literacy provision.

    • WordPress threads do seem to have limited nesting, so I think these things are bound to happen sometimes – not a help when discussions are getting heated! I’ve put some thoughts below – not meant as a lecture, because I’m quite sure you know this stuff, but just to get it out there.

      I do agree with much of what you have said; change management is a very tricky thing to do well, especially when the people being asked to change are already very knowledgeable, experienced, and skilled – and are being asked to change practices in which many have invested important aspects of their professional and personal confidence.

      Having seen good and not-so-good change management in the past, I watched the Gove regime’s attempt at it with great frustration: the way they went about things was obviously going to end in tears, and seemed to me very likely to undermine the people who were actually trying to make any good changes happen while that regime was in place. You don’t change institutions for the better by telling people they’re rubbish – they just dig in. Far better to find out what’s good, talk about that, and then work *with* them to create change. All they needed was Change Management for Dummies 😉

      An institution which values change, which can take good decisions quickly, is a much rarer thing than it should be. And in the face of institutional rigidity, or decisions which are baffling to those in front-line work, one of the classic responses is to withdraw into what you *can* control – hence the tendency to personal mini-empire-building. These mini empires are huge barriers to positive change because they are usually at most semi-formal, and represent at least one person’s professional coping strategy.

      If people feel, as I’m sure many must, that SSP has been imposed on them, it will be very difficult for them to feel much more than resentment towards it, which is a huge shame for everyone – the staff, the children they teach, and SSP itself, which could be improved (and everything *can* be improved) with the benefit of their input.

      Whether people like it or not, and whatever the reasons some may dislike it, SSP is currently mandated in England, so I agree that now is absolutely the time to look at how it’s working, how it’s *not* working, and what adaptations need to be made: once anything is scaled up like this, it’s an ideal opportunity to see how it works in different institutions, and over time.

      Most of what teachers know is not suddenly invalidated by SSP; they should be able, explicitly, to bring their knowledge along with them and see that it is valued. But they themselves need to be open to using their knowledge differently, and also to be free to ask questions of SSP. In the end everyone ought to benefit.

      I’ve been working on the assumption that, for the moment, SSP is here to stay, and that therefore it’s very important to look in detail at how it pans out in practice: the horror stories seem to me to be something to be investigated, not denied, because if these things can be brought to light, it’s much more likely that these problems, which are endangering children’s schooling, will be got rid of. Circulating anecdotes only make people (including parents) nervous, and eventually they become urban myths.

      I know for certain that some at least of the stories are true; they suggest real problems with what some people think SSP is, and what they think they should be doing with it. Conversely, I’m sure there are also teachers who have taken SSP entirely on board as a method, yet will have found their own ways of making it adaptable to the needs of their classes.

      The distinction between ‘doing it wrong’ and ‘doing it better’ can be a small one – in fact sometimes a matter of perspective. If everybody is expecting the SSP Police(!), positive changes will get lost. From a personal perspective, I’ve been told off firmly on Twitter by more than one person for taking an approach to reading with my *own* children which seems (to me) to be pretty much normal within a programme such as your own. As long as people feel embattled, this will continue, I think. So, in spite of the danger of further tellings off, I’m going to keep asking my outsider questions for the moment, talking about SSP and generating debate which, with any luck in the long run, might lead to better understanding and improvements in relation to SSP itself.

      • I think generating debate is an excellent thing – which is why I keep on contributing to that debate as positively as I can.

        What is frustrating, however, is when one responds, for example, to the issues raised by those who are worried by SSP promotion, or their thoughts on its content, or its imposition, including the Year One phonics screening check – but any contributions by way of explanations seem to be somewhat disregarded.

        In other words, the to-and-fro is not really a to-and-fro. It resembles more closely one perspective repeated ad infinitum with no apparent change in understanding or stance – and the other side repeating the same things by way of explanation with no acknowledgement of those things.

        Take, for example, the issue around SSP and ‘comprehension’: Repeatedly I point out that the guidance for teachers, originating with the recommendation in Sir Jim Rose’s report back in 2006, included the promotion of the Simple View of Reading model/diagram, which is now invariably included in initial teacher-training and in phonics training and consultancy. The concept of ‘reading’ requiring both alphabetic code knowledge and decoding skill alongside language comprehension is now embedded firmly in detailed guidance in the new national curriculum for English. SSP programmes include content from word-level to text-level, plus the guidance to use cumulative, decodable reading books.

        There is not a disconnect between SSP provision and language comprehension – they are interconnected, both within the SSP lessons themselves and in terms of ‘reading’ being defined as requiring both the technical skills and the language comprehension. This ‘phonics versus comprehension’ opposition should not be an issue. If teachers do not understand this, then the issue becomes a matter of teacher-education – not that SSP is inappropriate or disconnected from comprehension.

        Take, for example, the issues around the Year One phonics screening check: If teachers’ provision for SSP becomes dominated by children reading copious lists of pseudo-words, then that is an issue for teacher-education – along with management within the school. A week or two of such activities prior to the check is not harmful, as they consist of applying alphabetic code knowledge and the blending skill – but, once again, over-use of such activities is an issue for teacher-education and management within the school.

        Further, where is the capacity for teachers and managers – and advisors/consultants/inspectors – to be able to evaluate and compare programmes and practices? A good SSP (or Linguistic Phonics) programme should provide more than enough word/sentence/text level practice to enable virtually every child to sail through the Year One phonics screening check. I hear from the occasional Reception teacher about the results of Reception children reading the words, with many of them reaching or exceeding the benchmark or falling only a mark or two below it (and always the errors are based on alphabetic code that has not yet been taught). Programme authors and publishers are finding out about various schools’ check results, and they are extremely high indeed – meaning that with good SSP provision children are very well served. Being able to lift the words off the page easily and comprehensively is not a small thing, it is a big thing. Without the check, none of us would get the slightest indication that teachers’ provision – or programmes and practices – really do make a difference in terms of teaching effectiveness.

        And yet, there are ‘leading educationalists’ who seem to see no value whatsoever in this snapshot national information – and I would suggest it is immensely important in terms of teachers’ professional development – and in terms of transparency about practices and effectiveness.

        If teachers have neglected providing a ‘language and literacy/literature-rich environment’ because of pressure to provide phonics teaching – then that is the issue – not that phonics promotion or provision is wrong.

        Ironically, I witness a myth around ’20 minutes of phonics per day’ and despair, as this is nowhere near adequate for providing a comprehensive phonics programme and approach. Twenty minutes is nothing in the scale of things when teachers have to teach such a complex alphabetic code plus skills, and allow sufficient pupil-practice (with its supervision and continued teaching as required) with classes of up to 30 infant children of every description.

        What I am suggesting is that there is too much preoccupation with adults’ professional sensitivities and the protests of individuals, at the expense of a much more professional analysis of phonics provision, its results, and continued professional development.

        Andrew Davis’s preoccupation (and others’), for example, with pronunciation alternatives (be it in ordinary phonics provision or in the Year One phonics screening check) indicates, to me, a lack of knowledge about how this is approached (or should be) within good programmes and practice. At no time, for example, has anyone such as Andrew Davis, Michael Rosen or the teaching union leaders ever approached me with their questions to learn more about my programmes, training provision, professional development or findings – to equip themselves with more specific information. There are not that many programmes which are well-known and well-promoted, so such people could have approached every single programme author/training provider if they were so very concerned. They could have put themselves out to attend training events for each of the main programmes.

        Then, they would be better equipped to continue with their challenges, if necessary. But are such people ‘really’ interested to find out – or do they just feel offended intellectually by government interest and promotion of SSP practices and decodable books? Are they just objecting on points of principle of guidance which differs from their own ‘understanding’ and lifelong ideas?

        I suggest that any government, in a country where such a large percentage of people have failed to become literate enough after years of schooling, would be entirely remiss if it did not investigate the research and findings in real schools. My thoughts are that people’s complaints are often based on objections to the idea of political interference rather than taking the perspective of political responsibility. After all, politicians, in theory, are supposed to represent our interests as a service to us – not neglect or turn a blind eye to the travesty of decades of failure to teach so many children as effectively as we need to.

        Having investigated the teaching of reading and promoted SSP as a consequence, then it is entirely responsible that such a government should then want to find out what is happening in schools by a national snapshot picture (that is NOT a big deal for children to undertake with their class teacher), a national review and so on – in fact I think the government should do so much more to unpick practices beyond the level of a survey of teachers’ views on phonics and the Year One check but by a specific phonics survey conducted alongside people such as me who have chosen to become a specialist in the field.

        For all these ‘leading educationalists’ not to value the bigger picture – or not to put themselves out to do more finding out in real terms – is mystifying.

  14. I find that aspect mystifying myself – hence my first ever post, which was an attempt to get straight answers from both sides.

    My own concern with any test is *how* it is implemented (it’s good to share good practice on this), and *what* can be done with the results (the garbage in, garbage out issue).

    Also, however irritating professional sensitivities may be, they can often be the most significant barrier to useful change, so it’s important to take account of them as part of the process, without allowing them to derail potential improvements.

    I’m aware that questions about SSP are much more valuable if coupled with discussion of good practice. The corollary of saying that things are *sometimes* going wrong is to say that *many* things are going right: and presumably, as with any change, the leadership within an institution has a huge part to play there.

    I’ve so far failed to generate discussion, among classroom teachers, of their experiences of good SSP practice. I hope someone will be able to in a neutral forum (as it were), because giving those experiences a platform might go a long way to dispelling people’s worries – and help generate improvements where they *are* needed.

    And yes, going into schools and looking at what’s happening ought to be the starting point of any discussion, otherwise everyone’s talking about what they think *might* be happening, rather than what actually is.

    • Thank you for appreciating the points I was endeavouring to get across – it means that you ‘get me’ – understand where I am coming from in the context of the national (or indeed ‘international’) debate.

      As for ordinary class teachers contributing to such discussions, I think you’ll find that they are far too busy just getting on with it.

      We can see from the NFER survey of May 2014 that there is a wide range of views on phonics and the Year One phonics screening check – but these views were presumably solicited in a form of isolation rather than in the form of discussion and open forums.

      Thus, that leaves people like yourself to glean what you can from those of us who are inclined to contribute through the online medium – sadly, rarely the teachers themselves.

    • meraudfh, I have made no factual claims about what is happening in schools, beyond a reference to the current situation, where, according to surveys, there’s a variety of approaches to reading in Year 1 classes, and those methods labelled ‘Systematic Phonics’, ‘Synthetic Phonics’, etc. in fact cover a range of practices at the present time. I’m sure these include examples of what many of us would think ‘good’ and ‘effective’ use of phonics in the context of reading for meaning. I have never said otherwise.

      I would respectfully dispute your contention that “going into schools and looking at what’s happening ought to be the starting point of any discussion.” It certainly ought to be the starting point of some of these discussions. But my concern, shared by vast numbers of teachers and reading specialists, has always been with current and shortly forthcoming government policies. The latter can be gleaned from the criteria for matched funding, Ofsted inspection frameworks and the Year 1 programmes of study from September 2014. The relevant wording is open to a variety of interpretations. The policies can be understood as detaching decoding from reading for meaning in an unhelpful way. Some schools are doing just this. The phonics check, with its bizarre and erroneous claim to assess the reading of ‘real words’ with no context, compounds the difficulties here. Given the intimate interconnections between decoding and reading for meaning that you have so effectively explored, policy insistence on avoiding strategies other than decoding at certain stages is very odd, to say the least. This is open to misinterpretation and, unsurprisingly, it is being misinterpreted by some. Part of the problem here is the ‘Simple View of Reading’, which is actually over-simplistic, since decoding and reading for meaning are not conceptually independent of each other in the manner that SVR implies.

      ‘Good’ SSP practitioners will say that their approaches ‘of course’ incorporate context and meaning, and that teachers should be better ‘trained’ so that the misinterpretations are discouraged. However, no research has justified, or could in principle justify, such exclusivist ‘training’, as we cannot research teaching strategies as though they were drug trials. In any case, no research could possibly justify a tightly prescribed approach to the teaching of reading, because it would prevent teachers actually teaching. Experienced teachers select constantly from a rich repertoire of approaches, their selections varying according to pupil attainment, motivation and many other factors. It does sometimes look as though those opposing this point seek teacher-proof and pupil-proof ‘methods’ – infallible scripts for teachers that would be conveniently insulated from the vagaries of real individual children.

      SP protagonists may object that no such rigid prescription is intended. They may claim that their decoding ‘always’ involves attention to meaning, context and individual children. If that is so, why don’t they join the rest of us in deploring the fact that current criteria and statutory guidelines, far from making this clear, give the opposite impression, at least to some? Why don’t they object, as the rest of us do, to the fact that the official guidance fails to include sentences such as ‘Teachers are expected to be properly informed about good approaches to the teaching of reading, including phonics, but in the final analysis are free to make professional decisions to suit their particular students and circumstances’? Why do they continue to support an incoherent ‘check’ when teachers can assess decoding as and when appropriate, and with proper attention to the role of context? Why do they try to redefine ‘reading’ so as to exclude a range of perfectly legitimate and sometimes very subtle strategies used by readers to identify words as carriers of meaning, and to outlaw non-decoding strategies as ‘guessing’?

      • I would suggest, Andrew, that what children receive at school for their basic literacy skills should not be left to ‘chance’ – or what you would refer to as teachers being ‘free to make professional decisions to suit their particular students and circumstances’.

        It is the same alphabetic code and phonics skills that all children need and teachers need to be knowledgeable and skilled for teaching all children as effectively as possible – that is, if they are to discharge their duty to the children themselves.

  15. I’m sorry, my comment to Debbie wasn’t meant as a specific dig at you. I do understand that a lot of very valuable information can be gleaned from data, but I’m surprised at the assumptions that often seem to be made in education research debates (again, not specifically by you, but in a variety of discussions) that there is a binary distinction between science and non-science – and also, in some cases, at some quite restricted ideas as to what ‘science’ is.

    Analogies with medicine can be useful sometimes, but education falls more comfortably within the remit of the social sciences, and therefore although data is fundamental, so is qualitative observation of behaviour. If researchers look only at data they are in danger of looking at input and output but leaving the processes by which one becomes the other as a black box.

    Teaching techniques cannot be tested like a drug; but I don’t see any reason why they can’t be observed qualitatively and systematically, from an anthropological, behavioural and organizational science perspective.

    I also think that seeing how the SSP principles actually pan out in practice is vital to interpreting the input/output data. To what extent, for instance, are teachers doing what they have so often had to do in the past, that is, take a government diktat and turn it into something that actually teaches children? In what ways does SSP differ in *practice* (not in theory) from previous methods? How much does practice vary between schools? Does variety of practice seem to affect results?

    I’m not asking you personally to answer those questions; I’m just suggesting that without addressing such issues it is difficult to have a really fruitful debate – and what teachers are actually doing tends to get ignored.

    (NB I know this only addresses some of your points, but I have to make the dinner – more soon, I hope!)

  16. Andrew, in your last comment you said

    ‘The policies can be understood as detaching decoding from reading for meaning in an unhelpful way. Some schools are doing just this.’

    Is it the following part of the NC which you’re referring to?

    ‘The programmes of study for reading at key stages 1 and 2 consist of two dimensions:
    word reading
    comprehension (both listening and reading).
    It is essential that teaching focuses on developing pupils’ competence in both dimensions; different kinds of teaching are needed for each’

    I can see that this is certainly open to misinterpretation, and since much of the detail hangs on this initial division, it could have an insidious effect on classroom practice.

    You also say that this distinction is currently being imposed in some schools, and I have heard similar stories. None of the SSP programmes seem to desire such a situation, and there’s nothing in SSP *itself* which prevents it from being part of a rich literacy curriculum. So (as ever) a few questions:

    Does the separation of ‘phonics’ from other literacy work show up in the data in any way?

    If it doesn’t (and I see insensitivity to this problem as one of a number of weaknesses in the PSC), what are the best ways of looking for it? Ofsted, or something less scary?!

    What form does other evidence take? Is it the accumulation of anecdotes? Is it from parents’ or teachers’ complaints? If so, by what routes are these worries coming to light?

    A separate but I think related question is about the data for the PSC and for KS1 results. If, as the recent study showed, SSP correlates positively with success in the PSC, but does not correlate with successful KS1 results, then what is happening in the schools which do well at KS1 without an emphasis on SSP?

    That is, what practices might these schools have in common with the successful PSC schools, which are lacking in the schools which don’t do so well at KS1?

    This is another reason why I think it’s worth knowing about specific classroom practice. What if, for the sake of argument, it’s the *systematic* aspect of phonics which is its particular strength? Non-SSP schools could be just very organized and systematic, too, with whatever approach they do use. Or what if the non-SSP schools actually put a lot of emphasis on the phonetic aspects of reading, but simply do not use a SSP approach?

    There are so many possibilities for what the actual distinctions are.

    I do think, too, that there is a contradiction in the view that some schools are driving a wedge between mechanical phonics work and other literacy work, and the view that all teachers are in a position to make the best pedagogical decisions at all times.

    There are two main explanations for the wedge approach: either the staff think it’s a good idea (and it would be worth asking why), or they think it’s a bad idea but believe it’s what they are supposed to be doing and that (possibly) they will be punished by Ofsted, or get worse results, if they do things differently.

    Again, the detail is important. Is there a pattern in choice of programme, in schools where phonics is treated as a standalone or inflexible activity? If there is, can that programme be altered? If the source of the problem isn’t so clear, are there any other factors involved? (Pressure of numbers, experience of staff, etc…)

    There is an obvious disjuncture between what the SSP programmes present as good practice, and the stories of miserable children and sounding without understanding.

    This seems to me an issue which everyone should be addressing together, because nobody wants it. I really think it’s worth setting SSP aside as an issue, and looking at what is happening in successful schools where the children are happy, and in unsuccessful ones where the children are miserable.

    We need a new 4-part diagram like the SVR, with ‘good literacy’ and ‘happy children’ on the axes…

  17. Meraud,

    Thanks for all these questions and comments- well worth pondering – and I will. I’ve a writing deadline so can’t get back to you straight away – I didn’t want to look as though I was ignoring your responses. Something by mid-week, I hope!

  18. I’m not really here (!), but just in case you hadn’t been sent the link, do read David Aldridge’s excellent piece exposing the nonsense behind claims that research ‘shows’ that reading should be taught via synthetic phonics/systematic phonics/linguistic phonics/wave your umbrella over the children and they will learn to read…

    Those familiar with http://onlinelibrary.wiley.com/doi/10.1111/2048-416X.2013.12000.x/abstract will recognise that we are both singing from the same hymn sheet.


  19. I was going to offer more comment, but my post of Aug 4 still seems to be held up so I’ll wait until that’s gone through first. I hope that’s OK with you.


  20. Andrew, in your last comment you said
    ‘The policies can be understood as detaching decoding from reading for meaning in an unhelpful way. Some schools are doing just this.’
    Is it the following part of the NC which you’re referring to?
    ‘The programmes of study for reading at key stages 1 and 2 consist of two dimensions:
    word reading
    comprehension (both listening and reading).
    It is essential that teaching focuses on developing pupils’ competence in both dimensions; different kinds of teaching are needed for each’

    AD Yes, that is relevant – but there’s more. I can only re-present what I quoted in “To Read or not to Read…” E.g.
    Criteria for matched funding:
    To be deemed ‘high quality’, programmes must:
    • be designed for the teaching of discrete, daily sessions progressing from simple to more complex phonic knowledge and skills and covering the major grapheme/phoneme correspondences;
    • demonstrate that phonemes should be blended, in order, from left to right, ‘all through the word’ for reading;
    • ensure that as pupils move through the early stages of acquiring phonics, they are invited to practise by reading texts which are entirely decodable for them, so that they experience success and learn to rely on phonemic strategies. (DfE, 2011)
    National Curriculum Programme of Study for Year 1 English from September 2014 which states that pupils should be taught to:
    • respond speedily with the correct sound to graphemes (letters or groups of letters) for all 40+ phonemes, including, where applicable, alternative sounds for graphemes
    • read accurately by blending sounds in unfamiliar words containing GPCs that have been taught (DfE, 2013)
    After some more detail on phonics learning, we are told that pupils should be taught to:
    read aloud accurately books which closely match their growing word-reading knowledge and that do not require them to use other strategies to work out words.

    While the wording here covers a multitude of sins, it is not unreasonable to interpret it as requiring some kind of exclusive intensive focus on SP.

    M Does the separation of ‘phonics’ from other literacy work show up in the data in any way?

    AD I don’t have access to data about this. The dangers of the separation of phonics from other literacy work seem self-evident given the wording of policy quoted above, and, to repeat, I have much anecdotal evidence of this. Even if nine out of ten schools are not separating phonics from literacy in this way, that still leaves a large number of children affected.

    M This is another reason why I think it’s worth knowing about specific classroom practice. What if, for the sake of argument, it’s the *systematic* aspect of phonics which is its particular strength? Non-SSP schools could be just very organized and systematic, too, with whatever approach they do use. Or what if the non-SSP schools actually put a lot of emphasis on the phonetic aspects of reading, but simply do not use a SSP approach?

    AD A nice point – I can’t offer you relevant research, however. We do know from surveys that practice is currently very variable. We know anecdotally that practice called ‘synthetic phonics’ is not always clearly distinguishable from ‘analytic phonics’ or other types of phonics. We know that there are all sorts of ways of interpreting synthetic phonics/linguistic phonics/systematic phonics guidelines – the trouble is that the wording of policy means that inappropriate interpretations are all too possible.

    M I do think, too, that there is a contradiction in the view that some schools are driving a wedge between mechanical phonics work and other literacy work, and the view that all teachers are in a position to make the best pedagogical decisions at all times.

    AD If there is a wedge, it results from the climate resulting from current policies. My view, shared by the vast majority of teachers, is that teachers ought to be put in a position to make the best pedagogical decisions, given their knowledge of their context and their particular pupils. They should be properly informed of a range of strategies to teach early readers. But teachers are not currently ‘free’ – so in a minority of cases they are driving the wedge to which you refer.

    M There are two main explanations for the wedge approach: either the staff think it’s a good idea (and it would be worth asking why), or they think it’s a bad idea but believe it’s what they are supposed to be doing and that (possibly) they will be punished by Ofsted, or get worse results, if they do things differently.

    AD The latter, I think.

    M Again, the detail is important. Is there a pattern in choice of programme, in schools where phonics is treated as a standalone or inflexible activity? If there is, can that programme be altered? If the source of the problem isn’t so clear, are there any other factors involved? (Pressure of numbers, experience of staff, etc…)

    AD It can only be altered if the screening check is abandoned and if the wording of policies is amended to make it very clear that teachers are empowered to make their own professional decisions in this area.

    M There is an obvious disjuncture between what the SSP programmes present as good practice, and the stories of miserable children and sounding without understanding.
    This seems to me an issue which everyone should be addressing together, because nobody wants it. I really think it’s worth setting SSP aside as an issue, and looking at what is happening in successful schools where the children are happy, and in unsuccessful ones where the children are miserable.

    AD Agreed – but it isn’t merely a matter of seeing how successful schools operate, since some of the problems of the ‘unsuccessful’ ones relate to current policy and how it can be interpreted by staff in vulnerable schools, maverick Ofsted teams, etc. If that changes, then the ‘unsuccessful’ schools will then have a chance to change too.

    This is as far as I’ve got. Thanks for all your thoughts and questions. Andrew

    • In the field of reading instruction, one serious issue is about teachers’ professional knowledge and understanding regarding ‘multi-cueing reading strategies’ and, I suggest, how these become confused with issues about ‘meaning-making’.


      Another issue in England is very much about some teachers’ indignation, and the indignation of people like yourself, Andrew, who argue that teachers should simply be left to use their ‘professional autonomy’ no matter what.

      Surely, not all teachers are as well-equipped to make their own judgements or provide content-rich teaching ‘no matter what’ the subject or scenario.

      Some teachers are set-in-their-ways regardless of whether this is good or not so good. Other teachers have a range of beliefs or preferences for how they teach regardless of the wisdom and effectiveness, or acceptability, of their provision. Some teachers are very talented when young in service, others may not be so naturally talented or confident when young in service.

      In other words, teachers are many and varied, but nowadays we have very high expectations regarding levels of professionalism in England – and this is where we get the notion of ‘continuing professional development’.

      If all teachers were the same in terms of their knowledge, talent, capability, experience, confidence and so on, then we would not need any concept of ‘continuing professional development’ and then one could argue that all teachers must be able to exercise their ‘professional autonomy’ at all times whatever the circumstances.

      When you bring the topic of ‘reading instruction’ into the discussion, clearly this has such a rocky history, and even a rocky and varied current scenario, that the matter is not at all about the right to ‘professional autonomy’, as there is so much more to this field than that.

      • Thanks Debbie; your response has informed my response to Andrew – so I won’t repeat any of it here, except to say I agree that in any profession, freedom of professional choice should always, perhaps can *only*, be founded on good training and CPD, and to add that I think really good CPD should never take a top-down approach but be part of a feedback loop involving teachers, school leadership, professional organizations, commercial interests and the relevant government departments.

    • Clog away 🙂 These issues need time and detailed discussion, not soundbites. I’ve responded to some of your points below, not with the intention of being argumentative, but in the spirit of disentangling the various issues to my own satisfaction… (With apologies for the many asterisks of emphasis – I hope they’ll make the text a little easier to navigate, at least.)

      AD: ‘While the wording here covers a multitude of sins, it is not unreasonable to interpret it as requiring some kind of exclusive intensive focus on SP.’

      M: It seems clear to me, looking at the Rose Report, the old NC and the 2014 NC, that there is increasing focus on SSP over time. The issue, I think, is what the effect of this has been; and *that* question should really be of interest to both SSP supporters and those who have concerns about it.

      One effect of a focus on SSP, of course, is that a school gets better results in the PSC, since it is a test of the skills which SSP emphasises. This is not surprising. If the DfE decided to teach reading by throwing darts at a series of words, and then tested children’s dart-throwing abilities, ‘improved results’ in dart throwing at words would be likely to follow. The success of SSP needs to be otherwise defined, and otherwise tested. SSP strikes me as pretty robust as a method, so it deserves a more sensitive evaluation (not via high-stakes testing, obviously, since that will skew the results too much). It is currently so entangled in the government’s clumsy methods of imposing it across the board that it has become very difficult to separate the effects of SSP itself from the effects of the government’s punish-and-reward approach.

      The potential difficulty (again frequently raised anecdotally) of including children in SSP-based class literacy work when they have started school able to read to some extent but have not learned via SSP, is a further effect which I have not yet got to the bottom of. Debbie’s incidental approach seems sensible – SSP as a good way of working out unfamiliar words even when most words can be read fluently – but there is more to the issue than this, of course.

      Another effect seems to be, in some schools, that teaching has become dominated by various kinds of oversimplistic, inflexible uses of SSP methods, apparently to the extent that basic aspects of SSP programmes, such as embeddedness in a rich literacy curriculum, and the importance of meaning, are lost. I believe the stories of these problems that I’ve heard myself, as I’ve said before, because their provenance strongly suggests credibility.

      The other most common effect which surfaces anecdotally is SSP’s efficiency in getting children reading early and quickly, and its value in avoiding a ‘reading wall’ at around Year 2. I can believe this, too, even from what little I’ve seen of it. BUT… What I find *most* interesting is the finding that focus on SSP does not have an effect on later literacy success – even in schools which have had improved results in the PSC through increased emphasis on SSP. I know I’ve said this before, but this suggests that something else is contributing to later literacy success, and it ought to be the focus of *someone’s* research to find out what that might be.

      Logically, however, this does *not* negate the possibility that SSP is the main factor in improved literacy at the earliest stages. SSP advocates have in all cases become so as a result of positive experiences in this regard, after all.

      It is not *necessarily* the case, therefore, that some NC focus on SSP is an entirely bad thing: in order to judge, it would be necessary to investigate the effects of this early focus, which as yet the DfE has not done. The PSC is a test of how much SSP children have internalized, not a test of how well it was taught (i.e. how well it was embedded in richer, deeper literacy work), or of anything else which really needs to be known in order to judge it properly as a way into good later literacy.

      AD: ‘[problematic methods in SSP programmes] ..can only be altered if the screening check is abandoned and if the wording of policies is amended to make it very clear that teachers are empowered to make their own professional decisions in this area.’

      M: I’m going to divide my response to this into two aspects: the point about the PSC itself, and the wider issue of professional freedom.
      Firstly, the PSC and the idea of its abandonment: I can see that removing the current strict parameters (including the focus on the PSC) would be bound to have an effect, and would free up both SSP programmes and teachers to improve methods and/or the application of them, *but* the existence of the PSC can’t be absolutely correlated with problematic teaching methods, at least in the short term, since all schools are taking part in the check and many schools have very good literacy and very good, happy, and varied school cultures.

      So I think it is worth separating *success* (or not) in the PSC from the *effects* of it within the school and on the children and their families. In an emotionally intelligent institution, with good leadership and supportive engagement from parents, I should imagine most negative effects of such a check would be substantially mitigated, or even avoided. Anecdotally, again, there are apparently many schools which do not find the PSC troubling, and which are able to ensure that it is not troubling for the children either.

      So the *effects* of the PSC may relate less to the children’s sounding-and-blending abilities than to the culture of the school – or, at least, the resources it has available to buffer the children (and possibly parents) from the high stakes aspect of the check. Should a school have to do this? Probably not – but it’s worth thinking about as yet another unintended consequence of the government’s approach to literacy teaching.

      The power of the school to ‘buffer’ relates to the second issue, that of professional freedom. It is credible that Ofsted pressure has created school cultures which cannot buffer children from the purposes of a test such as the PSC. But it is equally credible that there are other additional factors, especially since in many cases Ofsted pressure does not seem to have had this effect. Increased professional freedom in any sphere of activity cannot happen unless the professionals concerned have truly internalized the values, standards and knowledge that the profession demands. These values, standards and norms also need to be articulated clearly by the profession itself. Anecdotally, again, it is clear that this is not always the case in schools (for all sorts of understandable reasons). Ofsted sounds, from everything I’ve heard, like a very big problem, but putting the blame on it entirely runs the risk of missing other equally important issues which might not even be all that difficult to resolve once brought to light.

      It seems to me, though, that government prescription and professional freedom are not so much a binary opposition as different aspects of deciding on, and defining, the parameters of acceptable practice. Obviously, no professional can be entirely autonomous: it’s a matter of freedom of choice based on agreed norms plus professional knowledge and experience. Government over-prescription, by enforcing its own norms, tends to be restrictive and create unintended negative consequences. Total teacher autonomy would tend to require, as Debbie has said, that all teachers be absolutely the best at pretty much all times. I suspect it would also have the consequence that individual schools would end up reinventing the wheel in terms of good practice, when their energies could have been better spent elsewhere.

      Given that in the past all sorts of ‘teaching’ methods were used which would not be acceptable today (dunce caps, whipping boys, pederastic relationships, cold showers, the cane…) even the most professionally autonomous teacher (even in Finland…!) is operating within an agreed set of parameters. The question, to me, is not whether or not there should *be* parameters but rather *who* agrees which practices are acceptable, *how* they are defined, and also the *process* by which they are embedded within the profession – in ways which allow teachers to take moment-to-moment appropriate decisions as to which approach is useful at any one time (as in the scenarios you have described).

      So to me one of the most important concepts that you’ve mentioned is *empowerment*. This would take courage on all sides: from the DfE, school leadership, teachers, and other professional and educational organizations (including SSP programmes). Empowerment also means responsibility for those who are empowered, and trust from those who are letting go a little.

      Attempts at total control are always, *always* counterproductive. The DfE has about as much chance of *enforcing* good teaching as King Canute. But too little control is dangerous too – the problems relating to lack of oversight in some schools are distressing evidence of that. Recent policy seems to me to be very contradictory: an iron fist on the method used to teach 5-yr-olds reading (because, according to Rose, we can’t ‘wait’ for research into its value), but a laissez-faire (or maybe just underfunded) approach to really valuable CPD in the area of SSP, so that it’s maybe a bit of a lottery as to who gets training and who just has to sink or swim. Again, my evidence for that is purely anecdotal so perhaps I’m completely wrong….

      My point is that if teachers are to make a professional assessment of SSP they need to understand it in detail. If all could understand it in detail, with proper training, and have the opportunity to raise any concerns (or suggestions for adaptation) publicly without being labelled ‘deniers’, and if problems could be addressed in constructive ways with professional support, then it would soon become clear whether SSP really is as good as many believe it to be. In many schools presumably this training and support and opportunity for feedback are already in place. A follow-up question is: to what extent does this correlate with individual teachers’ happiness with SSP methods? Another would be, for teachers who are fully knowledgeable about SSP and have experienced it in practice, but still have worries about it, what opportunities do they have to register their specific concerns? What, specifically, do they see as being lost through SSP in circumstances where SSP is done *well* and in an *integrated* way? What are their specific suggestions for putting these things back in?

      I’ve always imagined that telling a teacher how to teach is like telling an actor how to act, a vicar how to preach, or a barrister how to present a case in court. There are methods, and tricks of the trade, and agreed norms, but the particular expression of those things is an indefinable aspect of the particular actor, or barrister, or vicar, or teacher. But as with acting, law, and the Church, professionalism in teaching must entail remaining within those norms. To take an example in teaching early literacy, the holding back of children who take easily to reading seems to me to step outside what should be the norms of professional teaching. By (possibly accidentally) implying that this should be done, the Rose Report and NCs seem to *undermine* good practice in this respect, so something equally high-profile is needed to balance out the likely misinterpretations of what these documents have said.

      AD: ‘..it isn’t merely a matter of seeing how successful schools operate, since some of the problems of the ‘unsuccessful’ ones relate to current policy and how it can be interpreted by staff in vulnerable schools, maverick Ofsted teams, etc. If that changes, then the ‘unsuccessful’ schools will then have a chance to change too.’

      M: I agree, which is why I’ve said I think people need to look *in detail* at the day-to-day practice in unsuccessful schools. Not from the point of view of Ofsted – judging and telling off – but as an observational exercise, in order to help the institution identify the ways it needs to change. Professional freedom for the majority depends on the provision of robust and humane intervention strategies where there are problems.

      There is of course the issue of how success is defined, and the danger of a self-reinforcing cycle based on erroneous assumptions leading to supposed ‘success’ which is in fact no good to anybody. This does seem to be what has happened with many targets in the public sector as a whole.

      So in order to try and avoid the mirage of ‘successes’ which are no such thing, it is worth looking at *all* activities without dismissing any as irrelevant. Anything could be behind differences in later literacy success, from totally immersive SSP to the provision of a good breakfast: we can make very good guesses as to likely factors, but when the system fails, all factors need to be considered (and they might even be different for every school).

      Without this level of observation it would be very difficult to decide what differences there really are between successful and unsuccessful schools where literacy is concerned, and what role SSP has to play.

      The Rose Report focuses very hard on the idea of good practice (which, of course, it treats as basically synonymous with a SSP approach), and reserves its criticisms for the Searchlights model, so it does not seem to discuss specifically bad or problematic practice in relation to any other approach (such as phonics itself). I think this is a great shame; with any luck future reviews will look closely at *all* the effects of using SSP, both positive and negative.

      I absolutely accept the strong possibility that the various personal anecdotes from teachers and parents reflect problems within the practical application of SSP in some schools. It seems to me urgent that the causes of these problems should be brought out into the light: maybe the unions could help by doing a confidential survey of EYFS/KS1 teachers to find out whether our guess is correct that teachers don’t really want to be doing these things but think (or have been told) that they have to.

      Another question for a survey would be the level of training these teachers have received before embarking on a SSP programme. Some very strong advocates of SSP as well as others have expressed concerns that training (ie CPD) in SSP is in some cases so minimal as to be non-existent. Without deep understanding of what SSP is supposed to be doing, and how, there seems to me to be little hope of teachers being able to incorporate it properly into their practice, or of making professionally informed decisions about its use (and the effects of its use) in their classrooms.

      I think that’s as much as I can manage to think at the moment!

  21. I’m risking clogging up this site altogether (!), but I’d also meant to respond to your comment about researching teaching interventions, and how this couldn’t be done as natural science would do it.

    The quickest thing I can do is to quote a passage from a book I’ve written with Christopher Winch and Gerard Lum called ‘Educational Assessment on Trial’ – to be published by Bloomsbury in the next few months. Sorry about the length! Andrew

    … it is worth reflecting on the problems inherent in any attempts by the state to intervene in teaching methods.
    Examples of methods that were prescribed included the following: divide children into sets by attainment; teach phonics in a specific way; bring children together at the end of a lesson for a plenary session in which themes arising from the body of the lesson are drawn together and consolidated…

    Further, when we look closer at least some of the attempts to prescribe teaching methods, additional problems emerge. Effective teaching, whether in the course of interaction with the whole class or smaller groups, seems bound to take account of the pupils’ responses. I mean both that it ought to take account of responses, and also that in practical terms it is impossible not to in any case. This means that any particular lesson is to a degree unpredictable, even if a detailed plan is being used, such as one informed by government stress on phonics.
    The teacher has to gauge minute by minute the level of her students’ interest and motivation and the extent to which they seem to be gaining understanding. She continually modifies her style of explanation, tone, timing and organisation. That is why one teacher’s lesson on a particular topic may differ significantly from another’s, even if the same content and similar age and attainment groupings of pupils are involved.

    Suppose, for instance, that teachers were required to explain a concept in mathematics or science according to a specific prescription. This might take the form of a text, devised by experts in the subjects, that the teacher had to use. They might read it from a copy, or feel that it would be better if delivered directly, so they could learn it off by heart before the lesson. It is very difficult to take such a suggestion seriously. Any teacher worth even a fraction of her salary would note the reactions of her students after the first sentence or so. She might take questions, amplify what she had just said, or even move on very quickly to a later section if she realizes that elements of the official explanation are excessively familiar and easy for her students. If we videoed ten teachers allegedly implementing the prescribed explanation, we would see ten different complex social events.

    If the teacher did not enter into dynamic interaction with her students, would she actually be teaching? If someone paid no attention to how their students responded to their actions, they could be replaced by pre-recorded teacher speeches or pre-recorded teacher demonstrations. Clearly this could be prescribed for schools by an authoritative agency. It might be characterized as a ‘teacher proof’ method. Some politicians place so little trust in teachers that they would be delighted to find a way of embedding such methods in schools.

    Prescriptions could take other guises. Perhaps certain questions could be laid down, or the use of specific items of equipment insisted upon. Again, we will wonder whether, once we take account of the myriads of micro-decisions made by teachers in the course of the lesson, we still have an identifiable approach whose success in promoting learning could possibly be supported by research evidence. There will often be nothing in these educational contexts to correspond with, say, the drugs that are legitimately and rigorously tested by means of randomised controlled trials.

    The supporter of prescriptions might try once more to make her case. The trouble with the argument so far, she might urge, is that the target approaches are absurdly narrow and specific. Instead, we should be talking about broader categories of teaching. What broader categories might our prescriber have in mind? They might include ‘chalk and talk’, ‘group work’, ‘interactive teaching’, ‘lecture’, ‘student-centred workshop’, and many other approaches. By its very nature each of these categories covers a huge range of possibilities. In order for research to support their effectiveness, there would have to be at least one feature common to all the examples subsumed in any one category, and it would have to be a feature on whose presence in a teaching episode a range of impartial observers could agree. Such a feature could not, for instance, be a complex social phenomenon open to a range of interpretations. It would have to be something specific and observable. So, for example, a ‘lecture’ might have to be identified by the fact that the tutor was standing at the front of the class, talking without interruption for, say, not less than 80% of the time, and so forth. We are being driven back to the specific, and, arguably, to aspects of lessons that no professional teacher could afford to follow in a rigid and literal-minded fashion.

    Moreover, even where it makes any kind of sense to think of ‘methods’, a teacher may need to switch from one method to another as a result of her appraisals of student reactions. Consider, for instance, a typical sequence of events in an English comprehensive school. Initially a teacher decides that an open-ended interactive discussion approach is appropriate for the treatment of a certain PSE issue. As the lesson develops, however, the students may become over-excited or ‘silly’, so she quickly modifies her style in the direction of a more structured lesson with an emphasis on her authority from the front. In extreme circumstances teachers may need to abandon a lesson completely and begin on something else. If someone was incapable of exercising this level of professional autonomy, many would be unable to regard them as worthy of the name ‘teacher’.

  22. Meraud – Thanks yet again for your comments. I’ll just respond to a few things:

    M “the existence of the PSC can’t be absolutely correlated with problematic teaching methods”

    AD – OK. But remember that the PSC is internally incoherent. It combines testing decoding of pseudo-words (preferably that subset with only one pronunciation, apparently, though the guidelines cover the possibility of more than one pronunciation) with an alleged testing of real words – where the latter are presented out of context, and must be very carefully selected to exclude heteronyms, possible results of decoding where the ‘wrong’ pronunciation is a homophone of a different real word, etc. No one has succeeded in making clear what the so-called ‘real words’ element actually tests, and why, if it was intended to test the reading of real words, it tried to do so while being marketed as a test of decoding only. The very fact that half the check apparently attempts to test something different from the pseudo-words is disastrous in itself, let alone the fact that there is an important sense in which the check fails to offer children any real words.

    Teaching to a test with problems at this level is likely to be educationally suspect in some schools, at least. I suggest that this crucial point about the check’s problems should make us worry about some of your other points about the check not being ‘troubling’ for some schools. Given the PSC’s extraordinary character, it is quite disturbing if schools don’t find it troubling!

  23. There are so many issues wrapped up in the PSC that it’s hard to disentangle all of them.

    The major point in its favour (and in the right circumstances it’s a very big one which might override many objections) is as a simple bucket-in-the-river check to look for ‘indicator’ issues (by analogy with ‘indicator’ species).

    The main negatives seem to me to be:

    *inconsistency (real and pseudo words)
    *insensitivity (pass/fail; also blind to pedagogical issues)
    *narrowness (sounding-and-blending only)
    *distortion (high-stakes; encourages teaching of nonsense-words)
    *unfairness (real words discriminate in favour of children with English-rich home lives)
    *unsupportiveness (no followup on how to deal with issues raised by the test).

    A test which is not meant to be exhaustive but merely an indicator can bear a certain amount of eccentricity and inconsistency. Yet even if the test were purely internal for diagnostic purposes, and even if ‘pass/fail’ were to be described instead as ‘fine for the moment/needs a bit of extra help’, important problems remain:

    *opinion is divided as to whether sounding-and-blending are the indicator issues to be looking for (or even, in some cases, whether they are indicator issues at all);

    *there appears to be a lack of appropriate followup, either for teachers (what other approaches might be useful to try?) or their pupils (what specific kinds of help might they need?).

    One problem which (it seems to me) a SSP-framed check really ought to pick up, but this one won’t, is the one raised by Nemocracy, who has had to deal with children who are sounding but not blending. If the PSC is intended as a diagnostic tool, this seems to me to be a weakness even within the specific parameters of SSP practice, and a lost opportunity (among many, possibly).

    In talking about the PSC’s value (or not) as a screen for indicator issues, it might clarify the discussion to have a clear list of what *all* the possible indicator literacy issues might be, as well as how problems related to these indicators show themselves in a classroom setting, and how they can best be picked up and dealt with.

    I still wonder, though, about the value for public money of using the PSC at all in schools where the KS1 and KS2 results are fine. Wouldn’t it be better to use a revised version of it only for schools with lower results and put the money saved into funding collaborations between those schools and schools whose literacy results are better?

    But schools are not my world, and I’m used to testing/assessing different kinds of things. So maybe that’s not practical, I don’t know…

  24. If you subscribe to the Tristram Shandy theory of names, I wonder what kind of life transformation you hope will result from your username revolution….

    Meanwhile, I hope you don’t mind me putting links here with some relevant and (IMHO) insightful stuff from Robert Price (Informutation on twitter) – where he challenges some of my views. If you haven’t seen it, you might be interested:

    I’m still wondering why you want a PSC for any schools, or why you’d still consider it for schools with poor National Curriculum test results. For such a policy to be defensible, we’d have to know:-
    (a) that synthetic phonics was essential for all in its first and fast intensive incarnation, and that poor NC results indicated that SP wasn’t being done properly and that we needed an accountability stick to beat the schools with.
    (b) that National Curriculum tests are a good indication of educational quality, even in our high stakes testing regime, where assessment is inappropriately used to hold schools to account. Donald Campbell said virtually all that needs to be said about that in 1976:

    “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor… when test scores become the goal of the teaching process, they both lose their value as indicators of educational status and distort the educational process in undesirable ways.”

    What to do instead? Ensure teachers have the knowledge (including knowledge of English orthography, the excellent phonics schemes out there, etc., plus other available approaches to early reading). Put in place just a few radical reforms of Ofsted (!) so that there’s no longer an unequal power relationship between Ofsted and schools and yet the rigour of external scrutiny is still maintained. Inspectors should become real critical friends of schools, just as the best of the LEA advisors often were in the past. But the inspectors should be as accountable to schools as schools are to their external auditors.

    Meanwhile, Early Years teachers can discover for themselves, if and when they need to, whether a child can literally ‘decode’ a piece of nonsense text. Why would anyone think an externally administered test could possibly be a better diagnostic tool than the teacher’s own interaction with an individual child with whom she has a real relationship? When you teach young children, continual dynamic assessment/assessment for learning is at the very heart of the processes concerned – if any teacher thought she had ‘learned’ something about a pupil’s knowledge or skills from the results of the PSC I would seriously wonder about that teacher’s professional competence.

  25. I’m just aware that I’m sort of swimming out of my depth….. There are other reasons, but that’s the one you’re getting! 🙂

    What I’m thinking about though is when and why things go wrong. If a teacher is competent then I would always assume that he or she will be perfectly able to assess what needs a particular child has – the system would have collapsed long ago otherwise.

    But what is going wrong in schools with low results for literacy? (If in fact anything *is* ‘going wrong’, rather than it being a case of intake dictating results whatever the school does, which is, in itself, worth discovering.)

    Although good teachers will know what to do, the question is, if they know it then why, in those schools, are teachers’ efforts not having the same results as elsewhere? Maybe they are good teachers doing the wrong thing for that school, or good teachers doing the right thing in difficult circumstances (and it’s not enough to get results to the expected level), or, possibly, they are not ‘good teachers’. The PSC as it currently stands is obviously not going to give an answer to this question, but perhaps another kind of early literacy assessment might begin to do so?

    Looking at schools with low results could turn up staff competence issues, or turnover issues, or Ofsted pressure issues, or resourcing issues, or all sorts of other issues. But there must be *some* kind of difference between schools which are doing OK in this area (KS1 literacy) and schools which are not. As I said before, this might be different for every school, and as you’ve said, this may well be to do with teachers being forced to do things they don’t like. But if no-one is willing to look in detail at what is going on, via detailed assessment, no one will ever know what those different problems might be, and those teachers will not get a chance to explain why they’re doing things in the way that they are. Since such great emphasis has been placed on SSP over the past few years, it makes sense to include the skills it prioritizes as part of the assessment.

    So, given the fact that there is already a test at KS1, I’ve said many times that I don’t really understand the point of a universal check at the end of Y1. Small children have better things to do. But a check *of some kind* would be valuable in situations where schools are not reaching levels of results at KS1 which are generally agreed to be acceptable, in that it might help to point up strengths and weaknesses in what’s going on in the school. [Edited to add: That’s really what I meant when I talked about keeping the PSC for some situations, but also about being clear about what other indicators there might be.]

    It is normal in most areas of activity to have specific policies and procedures for dealing with problems within an organization or other kind of system, and a *non-judgmental* diagnostic phase is an essential part of doing that well. Fixing problems is also about more than individual competence: it is about how that competence works within the wider system. The KS1 results might suggest a potential problem, but something else needs to be done to find out what the causes of it might be. As an outsider, it seems to me, too, that Ofsted ought to fulfill that role for education, but clearly it does not, at least at the moment.

    I doubt the DfE would ever want to look openly at these kinds of questions, though, because presumably a lot of the time the solutions would come down to money: better resources and lower ratios, and also, as Disappointed Idealist has pointed out, to questions of intake dictating results.

  26. I think I got a bit off the point in that one – sorry. Basically, I’d love to know the DfE’s thinking behind making all schools do the PSC regardless of KS1 results, rather than targeting resources at schools where KS1 results are low.

    • KS1 results are largely about levels of spoken language – that is, higher order language comprehension and expression.

      Children in Year 2 can do reasonably well with KS 1 results and yet still be weak with alphabetic code knowledge and blending skills. As texts get more challenging, children truly need phonics knowledge and skills – even the articulate children.

      • Thanks Debbie – so does that mean that there tends to be a drop in results at KS2 where phonics skills have not been embedded in children’s literacy skills? Or even at KS3…? I could imagine that as children progress through the system and the higher-order skills are increasingly emphasized, a lack of phonics skills, if it were having an effect, might be invisible in test results. It must be harder to track specifics like this beyond the end of KS2 because of the move to a new school – but do you know of research which has looked at this (ie tracking literacy over the longer term in relation to initial teaching method)…?

  27. “But a check *of some kind* would be valuable in situations where schools are not reaching levels of results at KS1 which are generally agreed to be acceptable, in that it might help to point up strengths and weaknesses in what’s going on in the school.”

    I agree with much of what you say – I’m now trying to develop a specific area of potential agreement between us, Onlyamanatee (if I may!)

    Many years ago, a government-funded agency called the Assessment of Performance Unit (APU) looked at the performance of large numbers of students in a number of curriculum areas. The results were often very useful to teachers, especially the practical assessments in maths.

    How can I say that the results were ‘useful’ to teachers, you may ask, given other things I’ve said on this site?

    The APU sampled performances. The results were not publicly linked to particular schools, and so the tests were not ‘high stakes’. The APU was not about assessment for accountability purposes, let alone assessment in order to make teachers teach in a particular way.

    Hence there was a much better chance that the assessments had validity worth having than is currently the case with National Curriculum tests, let alone the infamous PSC. Schools could reflect on the results, consider what might have skewed them in view of the inevitable fallibility of any kind of assessment task, and draw important lessons for their attempts to improve their own provision.

    So let’s bring back the APU! Results of APU-type sampling might well be very useful to schools which in some senses are not enabling their children to reach ‘acceptable’ standards of reading.

  28. This is exactly the sort of thing I was thinking about. Sampling as a means of quality control (as it were) seems to me to have huge benefits in that it separates the assessment of progress by the children from the assessment of the overall effectiveness of the system.

    This is extremely cheeky of me, but I’ve been thinking about this for a while in between/during household chores: if anyone asked me how to design a testing system I’d have:

    1. Mostly internal testing to provide feedback within schools: not created by individual schools in isolation but well-supported, shaped, and guided by collaboration between schools, subject associations, etc. (They could be as standard as you like, just not high-stakes.)

    2. Random or semi-random sampling of the whole system via standard tests agreed by representatives of interested parties.

    3. Specific diagnostic tests to go in deeper to particular schools or groups of schools (or groups across all schools, with specific needs such as EAL, FSM, or whatever) if problems seem to be arising.

    4. Minimal standardized external tests – maybe one at age 12 and then something at 16/18.

    All of this with a well-trained, well-paid teaching profession, with enough *support staff* to make sure that teachers can focus on teaching and on CPD (including sabbaticals). Plus funding for research into the effectiveness of the system over all and in parts.

    There you go – problem solved 😉
    (NB I know it isn’t.)

  29. Terrific – as they say, if you and I and our hairdressers and taxi drivers ruled the world, how wonderful everything would be…

  30. Pingback: Phonics and literacy – separate strands, or warp and weft? | Miscellaneous Witterings
