
Progress in English/Progress in Maths test reliability

Discussion in 'Assessment' started by GordonWright, Sep 3, 2013.

  1. GordonWright (New commenter)

    For the last six years, we've administered these two tests each November on the assumption that, since they are standardised, commercially produced tests, they would support 'tracking': monitoring standardised scores to see whose were rising or falling year on year, whose were steady, and then acting on the results. What we've found, however, is that the scores 'yo-yo' up and down. A good, steady classroom performer can show a 15-point SS drop one year, followed by a 10-point rise the next and a further 8-point rise the year after. There seems to be no consistency from year to year. Certain year groups always seem to experience a lot of drops (even when the staff teaching those year groups have changed!), only to be followed by an upward bounce in later years.

    As you can imagine, far from identifying pupils who are dropping off the pace or gradually improving through effort, as you would hope from year-on-year tracking, they are simply bamboozling us with their erratic results!

    Has anyone else found a similar problem with these tests? Since they are standardised with large samples, we are at a loss to understand why they can't give more 'reliable' results.
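    For what it's worth, swings of this size are roughly what measurement error alone would predict, even from a well-standardised test. Here is a minimal simulation sketch (the reliability of 0.89 and the resulting standard error of measurement of about 5 SS points are assumed illustrative figures, not taken from the tests' actual manuals) showing how a pupil whose underlying attainment never changes can still post double-digit rises and falls:

    ```python
    import random

    random.seed(0)

    # Standardised scores have mean 100, SD 15. For an assumed reliability
    # of 0.89, the standard error of measurement is SEM = SD * sqrt(1 - r)
    # ~= 5 SS points. (Illustrative figures, not the tests' published ones.)
    SD, RELIABILITY = 15.0, 0.89
    SEM = SD * (1 - RELIABILITY) ** 0.5

    # Simulate a pupil whose 'true' attainment is perfectly steady at SS 105,
    # tested once a year for six years: each observed score is the true score
    # plus random measurement noise.
    true_score = 105
    observed = [round(random.gauss(true_score, SEM)) for _ in range(6)]
    swings = [b - a for a, b in zip(observed, observed[1:])]

    print("observed scores:", observed)
    print("year-on-year changes:", swings)
    ```

    Because each year's error is independent, the year-on-year *difference* has a standard deviation of SEM x sqrt(2), about 7 points here, so apparent drops or rises of 10-15 points need no change in the pupil at all.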

    As a follow-on question, does anyone use similar 'standardised' commercially produced tests year on year for tracking (other than optional SATs, which we use as end-of-year summative tests) with good success?
