
JC Science 2023 - an overview


Following is a personal response from Dr Richie Moynihan to the JC Science paper from this June, and many thanks to Richie for doing this work for us.....


This year, junior cycle students sat the third junior cycle science examination paper published by the State Examinations Commission for the science specification (published in 2015). While I see many people online and elsewhere looking at the Leaving Certificate exam papers and making comments, I think there is scope to comment on what we saw on the junior cycle paper and give it the attention it deserves.

For this analysis, I’ve decided not just to go question by question and give a quick comment, but instead to analyse the paper properly by looking at what it was asking of the students, how it was asking it, and what they were expected to do to answer the questions, without taking scoring or grades into account. To do this, I’m going to use Bloom’s taxonomy (see the image above) as a lens to interpret the questions and comment on what type of thinking is requested of the students. Loosely speaking, Bloom’s taxonomy divides learning actions into six levels, based on the cognitive processes required to complete each action.


  • The first two levels are Knowledge and Comprehension: tasks at these levels are complete when students have either learned off a body of knowledge and/or can demonstrate that they understand what the knowledge means. These are the lower order thinking skills on the taxonomy.

  • The next two levels are Application and Analysis, in which students demonstrate that they can apply their knowledge and understanding in contexts that are familiar, semi-familiar or unfamiliar relative to those in which the material was learned.

  • The final two levels of the taxonomy are usually given as Synthesis and Evaluation, in which students must show that they can use their knowledge to change and/or create something new to them in the context they are presented with.

One thing to note about the taxonomy is that higher level tasks are not necessarily more difficult for students to complete. You can require students to remember lots of individual units of knowledge, interwoven in complex ways, or you can ask students to create a tower that holds a marble using lollipop sticks. No doubt, many students would find the former task more difficult than the latter.


Instead of looking at every single question, I’ll give a few examples of how I analysed the paper and assigned the cognitive levels to the questions and present the overall assignments in a bar chart.


Q1 (a) asks students to label the cell nuclei with the letter N. Students may not necessarily have observed both cell types in their academic experience of JC Science, and may only have seen them in diagrams (while potentially observing just one of the types with a microscope), so I attributed this as an application question. Students have to interpret the pictures to identify what cell organelles are present and apply their knowledge to identify the nuclei.


Q1 (b) is a recognition question, which falls under the knowledge category. Students are given key terms for organelles and functions and asked to match them. They are simply recognising words and terms.


Overall, I don’t think any teacher nor student would complain about this question, and it’s a nice question to quell any nerves students may initially be feeling.



Q4 targets introductory science concepts related to conservation of mass and physical changes. I would assign it to the understanding level of Bloom’s taxonomy, as the student needs to be able to read the text and understand the processes described in order to pick the correct term from those given.


Questions like this can be useful for identifying whether students understand the concepts they are presented with, although I have seen variations in which students are given more key terms than spaces, or in which some key terms are used more than once in the given text. Assuming students score well on questions of this type nationally, these variations could be ways of giving such questions another layer of complexity while still probing for understanding.



Q7 involves students applying their concepts of speed, time and acceleration to a graphical representation of data. The whole question looks at different aspects of interpreting and analysing the graph, using various procedural and conceptual knowledge tasks.

I can imagine students making errors on this question without realising at the time that they had made them, particularly in (c) and (d). Being able to read, analyse and interpret graphs is a key skill not only for any career scientist but also for business, maths, and even reading the business section of a newspaper. This question nicely probes age-appropriate knowledge and understanding, and I’d like to see more questions like it.
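The graph-reading skills Q7 rewards ultimately come down to reading slopes: the slope of a distance-time graph is speed, and the slope of a speed-time graph is acceleration. A minimal sketch of that idea (the numbers here are invented for illustration, not taken from the paper):

```python
def average_slope(x1, y1, x2, y2):
    """Slope between two points on a graph: rise over run."""
    return (y2 - y1) / (x2 - x1)

# On a distance-time graph, the slope is speed:
# 0 m at t = 0 s, 20 m at t = 4 s  ->  5 m/s
speed = average_slope(0, 0, 4, 20)

# On a speed-time graph, the slope is acceleration:
# 5 m/s at t = 0 s, 15 m/s at t = 5 s  ->  2 m/s^2
accel = average_slope(0, 5, 5, 15)

print(speed, accel)  # 5.0 2.0
```

The same rise-over-run reading, applied by hand, is what students are being asked to do on the printed graph.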



Q9 (a) involved students using their procedural knowledge of how to determine the valency of atoms to predict an appropriate bonding ratio for the elements given. There are several individual steps involved, and students who have mastered them will have been able to complete the question. As the exam asks for three different ratios, students who guess answers will likely be evident, as will those who can correctly determine the ratios when given the atoms, as in the table.
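The multi-step procedure behind questions like Q9 (a) is essentially the cross-over method: each element contributes enough atoms that the total bonds on both sides match, with the ratio reduced to its simplest form. A minimal sketch (the function name and examples are my own illustration, not taken from the paper):

```python
from math import gcd

def bonding_ratio(valency_a, valency_b):
    """Cross-over method: the ratio of atoms a:b is
    valency_b : valency_a, reduced by their greatest common divisor."""
    g = gcd(valency_a, valency_b)
    return valency_b // g, valency_a // g

# magnesium (valency 2) with chlorine (valency 1) -> 1:2, i.e. MgCl2
print(bonding_ratio(2, 1))  # (1, 2)
# aluminium (valency 3) with oxygen (valency 2) -> 2:3, i.e. Al2O3
print(bonding_ratio(3, 2))  # (2, 3)
```

Laying the steps out like this (find each valency, cross over, simplify) mirrors the worked-example practice suggested below.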


Q9 (b) probes student understanding, asking them to recall and explain the properties of metals and non-metals, and to demonstrate differences between the two.


I suspect Q9 (a) may have been a difficult question for many students, as not only does it ask for higher order thinking skills, but there are numerous steps required to construct the correct ratios. Repeated practice using worked and stepped examples can be key to enabling students to master the procedures these questions require. Q9 (b) is a fair question that should be accessible to a large number of students, in my opinion.


Using the manner of analysis seen above, this chart represents the total assignment of cognitive domains to all the questions asked on the paper, presented as relative frequencies of the total.
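The chart itself is just a relative-frequency tally of per-question level assignments; building one takes only a few lines. A small sketch (the list of assignments below is illustrative only, not my actual tallies for the paper):

```python
from collections import Counter

# Hypothetical Bloom-level assignment for each question part
assignments = ["knowledge", "knowledge", "understanding", "application",
               "application", "analysis", "understanding", "application"]

counts = Counter(assignments)
total = len(assignments)
for level, n in counts.most_common():
    # relative frequency of each cognitive domain
    print(f"{level}: {n / total:.0%}")
```

Plotting those percentages as bars gives the kind of chart shown here.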

The paper delivers what I would consider a good spread of question types across its entirety. While good recall of the knowledge of junior cycle science is important for students taking the paper, it is not enough to simply know the scientific content; students must also be able to use it in the manners required by the higher order cognitive domains.


For instance, if a student just learned off the content but didn’t understand any of it or couldn’t use it, they would struggle to answer over 40% of the questions. A student who can recall the knowledge, use it, apply it, and analyse new information through it as a lens could answer roughly 80% of the questions.


In the absence of a marking scheme, this does not necessarily translate into grade outcomes, but it does indicate that the paper gives enough scope for students who struggle with science concepts to score an appropriate result, without attaining the higher grades that require a more in-depth understanding and ability to use the content. For this reason, I would suggest teachers ensure that they plan their teaching and learning to include not only tasks for knowledge transfer and understanding but also tasks that promote higher order thinking skills in the classroom over the three years. It is apparent that these skills are being asked for on the exam papers (a welcome aspect of them, in my opinion), and more opportunities to promote them will only enable students to maximise (a) their ability to access the questions on the paper and (b) their grade results.


Another point of note is that the Biology and Earth & Space questions tended to populate the lower cognitive domains, while the Physics and Chemistry questions tended to populate the higher ones. This would be in line with observations about the Leaving Certificate sciences, as seen in Burns et al. (2018), whose results showed that knowledge acquisition and recall are more fruitful for answering biology questions than for chemistry and physics questions. My own recommendation is to prepare to see that inverted somewhat as the years progress. Higher order questions in biology and earth and space may easily appear on future papers, as early as next year, to help redress this balance, and as teachers we should plan for our students to develop their higher order thinking skills in all four strands of the course, so that future higher order questions in any strand are unlikely to throw them off.


Over the years of the new JC rollout, I have heard people saying something along the lines of this quote: “It’s not about what you know, it’s about what you can do.” To be honest, I think it’s an absurd quote; I don’t know who said it originally, and it isn’t in any documents related to the JC Framework or the Science specification. The JC Framework refers to a “new balance between subject knowledge and skills” (pg 7). The JC Science specification refers to developing skills, knowledge and understanding of science concepts (pg 13). A more apt phrase would be: “It’s about what science you know, and what you can do with the science that you know.”


With these as the official documents from which we develop teaching, learning and assessment for our students (regardless of anyone’s opinions on the strengths and weaknesses of those documents), I think it is evident that the State Examinations Commission produced an exam that assessed not only what students know but what they can do with what they know.




