AATE Submission to Senate Inquiry on NAPLAN, June 2013


The effectiveness of the National Assessment Program - Literacy and Numeracy

Information about the Inquiry

On 15 May 2013 the Senate referred the following matter to the Senate Education, Employment and Workplace Relations Committees for inquiry and report.

The effectiveness of the National Assessment Program - Literacy and Numeracy (NAPLAN),
With specific reference to:
a) whether the evidence suggests that NAPLAN is achieving its stated objectives;

b) unintended consequences of NAPLAN's introduction;

c) NAPLAN's impact on teaching and student learning practices;

d) the impact on teaching and student learning practices of publishing NAPLAN test results on the MySchool website;

e) potential improvements to the program, to improve student learning and assessment;

f) international best practice for standardised testing, and international case studies about the introduction of standardised testing; and

g) other relevant matters.

Submissions should be received by 07 June 2013. The reporting date is 27 June 2013.

The Committee is seeking written submissions from interested individuals and organisations, preferably in electronic form submitted online or sent by email as an attached Adobe PDF or MS Word document. The email must include a full postal address and contact details.

AATE draft submission

a) whether the evidence suggests that NAPLAN is achieving its stated objectives;

  1. Whatever claims politicians may make for NAPLAN, the first thing to be said about the objectives of the program is that they are not clearly stated on the National Assessment Program (NAP) website.
  2. The part of the NAP website specifically relevant to NAPLAN does not include a clear statement of objectives. It merely says that: “NAPLAN tests the sorts of skills that are essential for every child to progress through school and life, such as reading, writing, spelling and numeracy.” It does not adequately explain the purpose of such testing. Assessment data certainly has the potential to be useful, but it needs to be used in ways that improve learning. It does not, of itself, improve educational outcomes. The relevant aphorism is that no pig ever grew fatter just through being regularly weighed.
  3. The “Why NAP” page of the NAP website includes the statement: “The National Assessment Program (NAP) is the measure through which governments, education authorities, schools and the community can determine whether or not young Australians are meeting important educational outcomes.” More properly, this should read “a measure” rather than “the measure”, and it is important to note that only some educational outcomes are involved. A fundamental problem with NAPLAN has been that the data has been misused and inflated claims made for its meaning. AATE members find most disturbing the claim that such a limited testing regime is used to ‘enhance the capacity for evidence-based decision making about policy, resourcing and systemic practices’.
  4. The publication of school NAPLAN results on the My School website, and the way that some parts of the media report and comment on these results, have caused many people to quite inappropriately view them as a valid measure of whole-school performance. The tests probe only a very narrow slice of the whole school curriculum and are neither designed for nor adequate to support this sort of evaluation.
  5. Research indicates that NAPLAN is most commonly viewed by teachers as either “a school ranking tool or a policing tool” (Dulfer et al, 2012, p8). This is borne out by informal comments by our members.
  6. The “Why NAP” page of the NAP website goes on to say: “Two benefits of the NAP are to help drive improvements in student outcomes and provide increased accountability for the community.”
  7. If NAPLAN improved student outcomes overall it could probably be considered a good thing. However, because it focuses on only a very narrow slice of the whole school curriculum, it has tended to distort the efforts of schools. It needs to be understood that being able to complete multiple choice and short answer questions, and short, timed, on-demand writing tasks, does not, on its own, add up to a quality education.
  8. As to accountability, the nation would probably be better served if it exhibited more faith in, and respect for, its teaching profession. The NAPLAN and My School regime sends a message that teachers cannot be trusted to do a good job but have to be externally checked up on and shamed into doing it. The focus should not be on questioning teacher performance but on how the data can be used to inform teaching and learning.

b) unintended consequences of NAPLAN's introduction;

  1. In Paragraph 4 above we mention that many people quite inappropriately view NAPLAN results as a valid measure of whole-school performance. When this misunderstanding combines with the publication of results on the My School website and the focus on school choice, there is potential for long-term harm to the sort of access to educational opportunity that is essential for a healthy democracy. In a recent article in The Australian defending the publication of NAPLAN results on the My School website (“My School sheds a welcome light”, 17 May 2013), Jennifer Buckingham from the Centre for Independent Studies argued that this is not the only factor which influences parents’ school choice. She then made the telling point that “some parents have no choice”.
  2. This is at the heart of one of the unintended consequences of the NAPLAN regime. While there may not have been mass exoduses from schools with low NAPLAN results, the tendency over time is likely to be that higher performing students will be moved to other schools. When this happens to any significant degree, school cultures change. This risks the formation of residual ghetto schools which are unlikely to be able to deliver the educational fair go for all that, as a nation, we like to claim we believe in.
  3. Informal input from our members aligns with the research finding that teachers generally believe that “the publication of ‘weaker than expected’ results would negatively affect parental perception of the school” (Dulfer et al, 2012, p8). These researchers note that their findings “strongly suggest that NAPLAN is viewed by the teaching profession as ‘high stakes testing,’ confirming views already expressed by Lingard (2010) and Lobascher (2011)” (Dulfer et al, 2012, p9).
  4. Also of concern to AATE is the potential negative effect on the morale of teachers working in schools which tend to have low average NAPLAN results largely because of the nature of the student clientele and/or limited school resources.
  5. Improving student outcomes in education is always a laudable goal but it is important that improvements in some areas are not achieved at the expense of others. A sensible goal would be overall improvement. Since schools are being judged on NAPLAN scores it is only to be expected that schools will devote time and energy to trying to ensure that students perform well on the tests. Time available to schools is finite so if more time is devoted to NAPLAN preparation, then correspondingly less time will be devoted to other areas of the curriculum that are not measured and reported in this way.

c) NAPLAN's impact on teaching and student learning practices;

  1. NAPLAN’s impact on teaching and learning practices – whether good or bad – cannot, in our view, really be separated from the fact that the results are published on the My School website. This, along with associated media coverage, is what makes the activity high stakes for schools and causes a range of unwelcome distortions of educational effort.
  2. Consequently, our comments on this aspect are to be found in Section d) below.

d) the impact on teaching and student learning practices of publishing NAPLAN test results on the MySchool website;

  1. We note the comment in a literature review conducted by the Whitlam Institute at the University of Western Sydney that: “There is considerable evidence in the international literature of the impact that high stakes testing can have on the quality of the learning experience of children. Evidence has emerged that such testing can structure the educational experiences of students in ways that limit the development of the range of skills and literacies needed in the modern world, encouraging low-level thinking and promoting outcome measures rather than the intrinsic processes of learning and acquiring knowledge” (Polesel et al, 2012, p5).
  2. This literature review also comments that: “Research on high stakes testing has also found that these tests may be having a negative impact on teacher pedagogies with a resultant degradation of students’ experience of learning. The impact of this may be defined as a shift from a focus on the needs of the child to the needs of the evaluation and reporting process” (Polesel et al, 2012, p5).
  3. AATE has not had the capacity to carry out any formal research itself but informal feedback from our members is in keeping with the concerns expressed above.
  4. Teacher concern about preparation for NAPLAN adding to an already crowded curriculum is indicated by the Dulfer et al (2012) research, which again conforms with anecdotal comments from our members, who find they are required to sacrifice some of the more critical and creative aspects of English in favour of narrow preparation for the tests.
  5. Some of our English teacher members report that, in order to prepare students for the tests, they are required to include persuasive writing in units of work where that genre does not comfortably or naturally fit.
  6. While persuasive writing, or whatever the NAPLAN genre happens to be at any particular time, is important, other areas of writing will receive less emphasis than they deserve if there is an inappropriate emphasis on preparing students for NAPLAN. Some of our members report this sort of unproductive imbalance.
  7. Implementing the writing test for all students in Years 3, 5, 7 and 9 in the same way ignores the different developmental levels of the students and distorts the test’s effectiveness as a measure of writing skills. In high schools, English teachers teach the writing process, in which students are given time to brainstorm ideas, plan them, write and finally edit their work. This process is generally practised in class situations and, where tested under exam conditions, over an extended period – from 70 minutes to 2 hours. The 40 minute timeframe employed for the NAPLAN test makes a mockery of the writing process, producing instead a highly constrained and reduced version of student writing. Because schools then build this timeframe into a wide range of practice for the task, it undermines the teaching of the writing process within schools. The second problem with the current writing task design is the common topic for Years 3, 5, 7 and 9, which means the topic chosen needs to communicate equally well to students in Year 3 as to students in Year 9.

e) potential improvements to the program, to improve student learning and assessment;

  1. If the program is to be retained, it should be changed from a census to a sample test. Sample tests are deemed adequate to provide information about such important areas of the curriculum as science literacy, civics and citizenship, and ICT literacy. Sample, rather than census, testing could also deliver adequate information about the overall performance of the nation’s schools in literacy and numeracy.
  2. This change would prevent NAPLAN from having high stakes status and could be expected to remove the unintended negative consequences that have unfortunately developed.
  3. With only sample testing, individual schools would not be identified on a platform like My School and unhelpful pressure from certain sorts of media reporting and commentary would not impede educational efforts at school level.

f) international best practice for standardised testing, and international case studies about the introduction of standardised testing; and

  1. On international measures, Australia already outperforms countries that have high stakes standardised testing, e.g. the USA. By contrast, countries like Finland, with education systems acknowledged as high performing, do not have such testing regimes.
  2. Many of our members are at a complete loss to understand why our political leaders choose to imitate educational practices that have been shown to be ineffective overseas.
  3. It has long been known in educational circles that negative unintended consequences ensue when standardised testing regimes like NAPLAN become high stakes activities for schools. International best practice is not to make this mistake in the first place.

g) other relevant matters.

  1. Cost-benefit: No proper cost-benefit analysis has ever been done on NAPLAN. AATE believes that the reputed annual cost of around $100 million for the testing and the associated publication of results on the My School website could be put to better use in other areas of school education.
  2. Grammar, spelling and punctuation: AATE’s English teacher members certainly think that grammar, spelling and punctuation are important. However, given that the NAPLAN tests probe only a narrow slice of the whole school curriculum, we do wonder why these three elements are effectively measured twice. They are three of the criteria used to assess the on-demand writing task and are also measured separately in the language conventions test. We doubt that this double focus reflects their relative importance among all the things we wish young Australians to learn at school. Instead, we suspect that it is just that they are relatively easy to measure with multiple choice tests.
  3. Diagnostic information from NAPLAN: Some, like Federal Education Minister Peter Garrett, claim that NAPLAN delivers useful diagnostic information for schools. This, however, has to be doubted. Students sit the tests in mid-May but the results aren’t released for months. Genuinely useful diagnostic information would be on hand within a couple of weeks of the tests to guide lesson planning before the end of Term 2. Even when they are available, the results seldom tell competent teachers much that they don’t already know.

References

Dulfer, N, Polesel, J and Rice, S 2012, The Experience of Education: The impacts of high stakes testing on school students and their families – An Educator’s Perspective, Whitlam Institute, University of Western Sydney.

Lingard, B 2010, ‘Policy borrowing, policy learning: testing times in Australian schooling’, Critical Studies in Education, Vol. 51, No. 2, pp. 129-147.

Lobascher, S 2011, ‘What are the Potential Impacts of High-stakes Testing on Literacy Education in Australia?’, Australian Journal of Language & Literacy, Vol. 34, No. 2, pp. 9-19.

Polesel, J, Dulfer, N and Turnbull, J 2012, The Experience of Education: The impacts of high stakes testing on school students and their families, Literature Review, Whitlam Institute, University of Western Sydney.
