Browsing by Author "Jones, Ian"
Now showing 1 - 2 of 2
Item (Open Access): Measuring conceptual understanding in randomised controlled trials: Can comparative judgement help? (Wiley, 2019-06-01)
Authors: Jones, Ian; Bisson, M. J.; Gilmore, Camilla; Inglis, Matthew
Abstract: An impediment to conducting high-quality quantitative research studies in education is the paucity of valid measures of learning gains. Studies often seek to investigate students' deep, conceptual understanding, yet many measures assess only surface, procedural understanding. One reason is that the development of validated measures of conceptual understanding is resource intensive and time consuming, and success is not guaranteed. We evaluated a novel and efficient technique, based on comparative judgement, for assessing conceptual understanding. We applied the technique to a randomised controlled trial in which students were taught simple algebra based on either the Grid Algebra or the MiGen software package. The participants were Year 5 students (N = 188) drawn from four primary schools who had not encountered algebra previously. An instrument from the literature (Concepts in Secondary Mathematics and Science: Algebra Scale) and a novel comparative judgement assessment were administered following the intervention. Students in the Grid Algebra condition outperformed those in the MiGen condition on both post-test measures. The comparative judgement technique performed similarly to the standard instrument but was far more efficient to design and implement. The technique can, in principle, be quickly applied to any target concept of interest. We conclude that comparative judgement is a valid, reliable and practical tool that could help to increase both the quantity and quality of quantitative research in education.

Item (Open Access): Teaching using contextualised and decontextualised representations: examining the case of differential calculus through a comparative judgement technique (Taylor and Francis, 2019-12-18)
Authors: Gilmore, Camilla; Inglis, Matthew; Jones, Ian; Bisson, M. J.
Abstract: An ongoing debate concerns whether novel mathematical concepts are better learned using contextualised or decontextualised representations. A barrier to resolving this debate, and therefore to progress in the discipline, has been the paucity of validated methods of measuring students' understanding of mathematical concepts. We developed an innovative and efficient method for measuring, in experimental settings, students' understanding of any mathematical concept using comparative judgement. We demonstrate the method by applying it to the comparison of learning outcomes from two teaching conditions. Participants (260 15–16 year olds across six schools) were introduced to differential calculus using contextualised or decontextualised representations. We then assessed participants' comparative conceptual understanding of derivatives. We found evidence that contextualised and decontextualised representations were equally effective at promoting student learning in this context. The assessment method yielded valid and reliable results, suggesting that it offers a robust and efficient approach for the problem of assessing conceptual understanding in experimental or other comparative settings.