NAPLAN: A Testing Time For All

Next week my classroom colleagues return to school for the very busy and, many would argue, most critical second term. Second term is often when the “real” work takes place: classroom routines have been established, timetables are in place, and student groupings and other structures are generally humming along. With all these in place, more care and attention can be directed to instruction and learning. The end of second term also marks the critical mid-year report time, when teachers collate all the information they have gathered over the first and second terms in order to inform parents and set directions for the rest of the year.

Second term is also the term when the NAPLAN (National Assessment Program – Literacy and Numeracy) tests are administered. These Australia-wide tests are set down to be administered for the second time between the 12th and 14th of May, taking around three hours in total over the three days. Ordinarily, and without going into the efficacy of the tests, this is a relatively minor intrusion into the term. As with most things educational, however, there is a difference between expectation and reality.

Historically the tests derive from a desire to gain system-wide data in order to make informed decisions across schools. The original 1994 Victorian-based LAP tests were non-compulsory and parents could elect to withdraw their children from the program. Fairly soon this decision was reversed and the testing program was instituted for all grade 3 and 5 students statewide. Later the test was quaintly renamed the Achievement Improvement Monitor, or AIM, test. With the AIM test came a new reporting format designed to give parents an understanding of where each individual student fitted within the curriculum framework levels. This was an interesting development, as it sought to extrapolate and pinpoint each student’s level, even down to the grade level indicated by the test results. It also included an abbreviated description of what each student was typically capable of doing, along with some suggestions on how they could improve their performance. Not unlike the horoscopes found in daily newspapers, many of these statements were fairly bland and open to interpretation.

Speaking from personal experience, there were many times when aberrant results were thrown up in this process, with suggestions that students might be operating three to five year levels above their designated class when in fact their classroom and other results suggested otherwise. There were also times when the opposite occurred, with otherwise high-performing students scoring poorly on the tests.

Along with the provision of these student reports, the department began to provide schools with detailed box-and-whisker analyses of how the school fitted within the statewide results. Fairly soon schools, and more importantly principals, were also provided with sets of guidelines to assist in improving these results. Principal performance pay also began to be tied in part to the AIM results. As a result, many schools began to feel subtle pressure to undertake detailed analysis of the results in order to seek improvement. This often meant many meetings dedicated to forensic analysis of the AIM data. Slowly but surely the three-hour AIM process began to consume increasing hours across the school year.

It didn’t take very long for curriculum coordinators and teachers to work out that one of the prime causes of students’ problems with the test lay in its format. Nowhere else in the school year were students presented with multiple-choice questions. Rarely were they given work across a whole grade that included examples even the most capable student was expected to struggle with. Other items within the testing regime were similarly foreign. As a result, schools began to help students best demonstrate their capability by providing them with samples of previous tests so they could practise dealing with such tests. Fairly soon classes were devoting up to three or four hours per week, in the weeks leading up to the AIM/NAPLAN test, to these practice sessions. Soon the notional three hours of testing were consuming large slabs of other time, both within and outside the classroom. Additionally, as many classrooms across the state are multi-age, it is not just the grade 3 and 5 students who might experience disruption.

Of course, not many schools would admit to devoting so much time to this process. However, given a report in The Age newspaper, it would seem that such practices have been given the imprimatur of Education Department Secretary Peter Dawkins, who amongst other statements was quoted as saying

that students will need help to prepare so they can “understand the genre of testing and the cognitive demands they will be placed under to successfully complete the task”.

All of this in order to

achieve high numbers in the top performance categories.

Perhaps the Secretary is reacting to the Victorian Auditor-General’s 2009 report on schools’ literacy and numeracy, which says in part:

Victoria has invested heavily in this complex and challenging area, with more than $1.1 billion allocated over the past six years for improving literacy and numeracy in government schools. While there is evidence of real gains in some areas, the overall report card for the 10 years to 2007 is disappointing. Past efforts have not led to the sustained improvements that were expected. While the most recent evidence from the 2008 national indicators shows promise, it is too soon to make a call on future trends.

Perhaps my classroom colleagues should be devoting the whole of the term to NAPLAN preparation; at least then they can’t be accused of not giving their students the best chance at providing high numbers in top performance categories…

Oh, if you want to get in a bit of NAPLAN practice at home, you can always try the actual test samples, or try the entrepreneurial version at trialtests (gotta love those who have the vision to make money in these times of GFC :). Maybe I could arrange with Wes Fryer to take up the franchise for State Standards Deluxe?

State Standards Deluxe

Software for elementary school testing at the Apple Store by Wesley Fryer

This entry was posted in Testing. Bookmark the permalink.

6 Responses to NAPLAN: A Testing Time For All

  1. Brendano says:

    Spot on John!

  2. In the interest of brevity, this sums up my POV.

  3. johnp says:

    Hey Graham,

    Having applied complex distractor analysis to the chicken and notwithstanding the fact that said chicken is using a HB pencil as opposed to the mandated but otherwise totally useless in any other context 2B pencil supplied with the testing material, can I phone a friend???

  4. Bill Healy says:

    For busy teachers – Free downloads of detailed suggested answers to the 2008 official sample NAPLAN questions at http://naplan.blogspot.com

  5. garybau says:

    could save a lot of time by publishing the 2009 and 2010 results at the same time…

    similar to the assumption on 5e
    which was revised to 7e over ten years ago!

    multiple choice(guess) is good for quick marking, and plenty of graphs
    however, the actual value of the data is highly questionable, unreliable..and if it were accurate, applies only until the next learning event..then the results change

good for league tables though…and removing resources from high(ly) achieving schools

    seems there is more to be gained by having lower results as resources are tied to the school numbers

    with fixed resourcing to networks(districts!?) it appears those that have will donate to those that do not

a zero sum situation..when additional resources to maintain and extend the learning opportunities would reasonably be expected

    …flight to private/independent schools?
    ..only for those with an opinion about education..rather than schooling!

  6. There is a way to beat the negative impact of the NAPLAN and similar standardised tests: INFORM parents that they DO have the option of withdrawing their child from participation!

    (see my own blog entry on this on my site linked above to my name)

    …and thanks for yet another blog entry that is openly critical of this imposition that in effect replaces sound pedagogical practice by bureaucratic nonsense.
