“Instead of teaching generic critical-thinking skills, we ought to focus on subject-specific critical-thinking skills that seek to broaden a student’s individual subject knowledge and unlock the unique, intricate mysteries of each subject.”
A film version is being made of the Shakespearean play you have studied. What would you include on a poster advertising the film, to represent what you think is important in the play and to create a sense of anticipation for its upcoming release? Explain your decisions with reference to the play.
Junior Cycle English Examination 2017 Higher Level, State Examinations Commission.
This week saw the publication of the first Chief Examiner’s Report on the Junior Cycle English exam. This two-hour exam replaces the five-hour Junior Certificate exam that was sat over two papers and aimed to assess the breadth of candidates’ achievements across the syllabus. (The vast majority of these candidates are third year pupils, roughly fifteen years of age.) The new exam is not based on a syllabus, but is “linked to” the Junior Cycle English Specification: a document which the Examiner’s report tells us “aims to put students at the centre of the educational experience”. The Specification, despite its name, specifies very little in the way of particular English subject-specific knowledge: it’s not what you learn that matters, it’s “the quality of learning.”
I am not here to weep over the corpse of the old Junior Cert exam, which certainly had its issues. What I will lament is that the old exam – flawed as it undoubtedly was – was broadly predictable in format and assessed all of the key areas of the syllabus. Candidates had to write an essay, do a bit of functional writing, and Paper II covered Drama, Poetry and Fiction, always in the same order and with equal marks awarded to each section. This predictability minimised the class time necessary for dealing with exam format and timing. Now, there were problems, as I’ve said: there were a lot of questions to be done and the most able candidates sometimes ran out of time. The ratio of unseen to studied material was 50:50, weighted too much towards the former, and another problem was that candidates could use any play in the drama section, so some students were deprived of the experience of studying a Shakespearean play. We will not speak about Media Studies.
All of these issues pale like distant stars against the glare from the new Junior Cycle exam. The Report mentions twice that this is a “no-choice examination”, despite a course so open-ended it can hardly be described as such. This makes the exam an effective lottery. Some candidates will be lucky, and more of the questions will match what they have learned and the texts they have studied. Others will be less lucky, but that isn’t supposed to matter. The format of the exam will change from year to year, so that candidates cannot be advised in advance when it comes to timing. As to what may come up, it could be Shakespeare and poetry (like this year), fiction and film, some kind of media studies, functional writing, or any combination of the above. Candidates in 2017 studied two novels from a list that includes “Jane Eyre”, “To Kill a Mockingbird” and “Animal Farm”, and there was no fiction question on the exam. Their learning in this area was not thought important enough to assess, although there were two questions about posters.
In his book “Measuring Up”, Daniel Koretz outlines two problems with assessment design and validity, both of which are thrown up by the new Junior Cycle exam. The first of these is “construct under-representation”. This is “a failure to measure what we want to measure… This harks back to the notion of a test as sample from a domain. To measure the intended construct well – vocabulary, proficiency in algebra, whatever – we have to sample adequately from the domain implied by that construct” [italics mine]. Koretz mentions extended writing as an example, and indeed there is no essay-style question on the Junior Cycle English paper. Even the longest answer demanded is no more than a standard A4 page in length. (At the in-service I attended, queries around this omission were answered by pointing out that the CBA2 (a type of portfolio) is now “the home of extended writing”. But this CBA2 is assessed only by the teacher and is not completed under exam conditions. It is still fair to say that extended writing is not meaningfully assessed before the Leaving Cert.) In terms of sampling from across the areas of English, the exam format deliberately leaves out large, discrete sections of the subject domain. It is quite possible that Junior Cycle candidates in 2018 will not be asked a question about a play, even though the Specification demands that Higher Level candidates study at least one Shakespearean play in its entirety. It is also possible that there will be no question related to poetry. But there might be an infographic to interpret.
The second problem that Koretz raises around design and validity is “construct-irrelevant variance”. This is “variation in the performance [of candidates] that is irrelevant to the construct intended”. The Report states that the intentionally unpredictable format means that “incorrect reading of the instructions and rubrics led to confusion for some candidates” and that “good examination technique and effective time management are critical”. This admits that the confusing nature of the exam impedes certain candidates, particularly at Ordinary Level, from demonstrating the knowledge and skills they have gained in their three years studying English at secondary school. Surely this is what we want to assess, not whether candidates can interpret arbitrary instructions for carrying out a task that most of them will never repeat. My own feeling is that the prioritising of Carrying Out Instructions over Demonstrating Knowledge is part of an overall drive to reduce our education system from one of opening minds to one of training workers. We are in danger of returning to Pearse’s murder machine, where the “sleek… obsequious… and dexterous” candidate writes inside the lines while watching the clock and is rewarded accordingly.
We will in time have to explain how copying numbers from infographics and subjecting posters advertising children’s films to critical analysis was the highest attainment we aimed at for young people in our subject. The Report remarks glowingly on the appearance of posters in two sections, writing of Q9, “which asked candidates to nominate material for a poster advertising a film version of the play they had studied”:
“Having completed question two earlier in the examination paper, some candidates modelled their responses on what they had learned from their critical analysis of the cinema poster for Fantastic Beasts and Where to Find Them, thus demonstrating effective examination technique and transferrable critical thinking skills.”
This is one of the most worrying parts of the report. It demonstrates how much the educational establishment of the DES, the NCCA and the SEC are in thrall to the progressive ideological narrative – as, in fairness, promoted by the OECD – that schools need to think a lot less about what knowledge children are acquiring and much more about nebulous “transferrable” skills such as critical thinking. The Junior Cycle framework is itself based around “key skills”, and we see here the damage this thinking does to assessment. Exams are no longer about assessing what candidates know; they aim to assess how well they think. Having the cop on to discuss material presented in one part of the paper in an answer on a later section somehow counts for more than being able to answer the question using a knowledge of Shakespeare’s themes and language. It’s not hard to see this leading to the worst excesses of “teaching to the test”, where teachers coach students in these kinds of tricks rather than ensuring they have a secure knowledge base to tackle rigorous and targeted questions of the kind we haven’t seen at this level since the Inter Cert.
Critical thinking is quite rightly a primary aim of education, but it is not a “transferrable skill”. Thinking about anything properly requires knowledge of that thing, as Daniel Willingham writes in this widely-accepted article, and elsewhere in books such as “Why Don’t Students Like School?”. It is knowledge we have to work on, because attempting to build children’s thinking capacity as a free-standing skill is counterproductive. Carl Hendrick writes here about how this leads to shallow teaching and even shallower assessment (the Junior Cycle English exam being an example). He calls instead for a subject-specific approach, which in English would include texts of established literary merit along with knowledge of their authors and, where appropriate, their historical context.
The Specification and its assessment – including the final exam – are currently under review by the NCCA. It is desirable but highly unlikely that English will be restored as a two-paper exam. Even if we had to live with a two-hour exam and the lottery of not knowing which elements will be examined, I think teachers could get over that if the questions asked were fair and honestly challenged candidates to write extended answers in which they have to use and manipulate their knowledge of the texts they have studied, as well as display the language skills so many of them work so hard to acquire.