
SAT Inequalities in testing?

Discussion in 'Primary' started by Davebrigg, Apr 27, 2012.

  1. My son's school has already been told that it will not be moderated.
    I have had experience of marking the KS2 writing tests and KS3 classwork, and the two processes are quite different. For the KS2 tests we have to identify individual features, which have been given different numerical weightings, score each separately, and arrive at a number which then translates to a level. For teacher assessment it is a case of looking at the level criteria and giving a 'best fit' judgement, bearing in mind that some AFs are more important than others. It's not surprising that these different methods sometimes produce different outcomes.
    For the 2012 tests, far fewer markers have been recruited, because it seems most schools are marking their writing internally. The positions have been offered to those who scored the highest in the 2011 marker standardisation tests, which may produce a higher level of consistency this year.
  2. WolfPaul

    WolfPaul New commenter

    Well yes you could, but it's significantly harder to cheat in the context of taking a national externally marked test than it is to simply say that a child's work is independent when in fact it isn't.
    I'm sure it is. How can you be sure that the school down the road has similarly robust moderation in place too?
    I don't see the point of using supported writing for assessment purposes.
    ...but teachers are also expected to ensure that all children make the correct amount of progress, and the stakes are as high as ever when it comes to the Y6 statistics. Do you not perceive the temptations therein?
  3. markuss

    markuss Occasional commenter

    But tests are barely relevant, if at all, in assessing end of key stage levels in English.
    And it's even more of a myth that our children do SATs - that's utter nonsense.
    No idea why people think there are any as, bs, cs, ds, es, etc in NC levels. There aren't.
  4. WolfPaul

    WolfPaul New commenter

    Relevant to what?
    The DFE seem to disagree with you, Markuss:
  5. LOL.
    Isn't there already a grammar test planned?
  6. littlerussell

    littlerussell New commenter

    The teacher assessments for English last year were exactly the same as the test outcomes (81% for both), which does make you wonder why the cost and hassle of the tests are necessary at all. Unfortunately I can't find teacher assessment figures for Writing, only English.
    I think that this will actually work in KS3's favour.
    Suppose writing results go up by 5% (i.e. 5% more L4s who, according to KS3, aren't really L4).
    From experience, I would say that something in the region of 15-20% of our kids come back with the wrong level. The tests have weaknesses too: a strong speller can get 7 of the 25 marks needed for a L4 before they begin, and can therefore produce L3 writing and get a L4. If a child misinterprets the task (e.g. remember the Miptor task, where they wrote an explanation rather than instructions), they've blown it completely, by effectively making one error on one piece of work.
    Wouldn't you rather have an assessment based on what they normally write, rather than a one-off?