REAP Conference Fora (in programme order)
Subject: Student development of assessment items

Rebecca Sisk
Posts: 5

29/05/2007 14:21  
This is interesting to me because, as a nursing instructor, I face the need for students to drill on questions related to their licensing exam, both to cover the content and to familiarize them with the types of questions they will face. While this rote drilling may meet both goals, it seems too focused on the professional exam rather than on true learning and an attitude of lifelong learning. Medical schools must face something similar, and this suggestion is an interesting one. Intuitively, you would expect students to learn from developing the clinical vignette at the very least. Do your students work individually or in groups? Have you examined the reliability and point-biserials of the student-generated questions you have used?
David Nicol
Posts: 18

29/05/2007 17:38  
James, I know of two very good papers in this area, as it is an idea I have been trying to promote at my university. One is by Fellenz and the other is by Neal Arthur at the University of Sydney. I can send you a copy of the Fellenz paper if you can't find it. The Neal Arthur paper can be found by googling some combination of the words sydney, synergy, mcqs, neal arthur; the article is in a magazine on teaching and learning produced by the University of Sydney. Neal's discipline is accounting, while Fellenz's is organisations or organisational development. I would be interested to hear more about your own study.
James Oldham
Posts: 2

29/05/2007 17:55  
We encourage the students to write vignettes about their experiences in their clinical placements. They write them in groups or individually. Interestingly, students who use the formative test more do less well on the progress test: they use the formative test as a cramming tool rather than as a deep-learning tool (see the case study). Reliability statistics such as the point-biserial are difficult to apply, because the formative test is designed for learning rather than as a barrier to progression. As a result, a student may deliberately choose a wrong answer to get the feedback on that answer, or may take the test in a group, either of which makes a reliability analysis difficult.
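For readers unfamiliar with the statistic mentioned above, the point-biserial index correlates a dichotomous item score (right/wrong) with examinees' total test scores; well-discriminating items show high values. A minimal sketch in plain Python, assuming the standard formula r = ((M1 - M0)/s) * sqrt(p*q); the sample data is invented for illustration:

```python
from math import sqrt

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation of one item with the total score.

    item_scores: 0/1 per examinee (1 = answered the item correctly)
    total_scores: each examinee's total test score
    """
    n = len(item_scores)
    p = sum(item_scores) / n          # proportion answering correctly
    q = 1 - p
    # Mean total score of those who got the item right vs. wrong
    mean1 = sum(t for i, t in zip(item_scores, total_scores) if i == 1) / (n * p)
    mean0 = sum(t for i, t in zip(item_scores, total_scores) if i == 0) / (n * q)
    mean_all = sum(total_scores) / n
    sd = sqrt(sum((t - mean_all) ** 2 for t in total_scores) / n)  # population SD
    return (mean1 - mean0) / sd * sqrt(p * q)

items = [1, 1, 0, 1, 0, 0, 1, 1]          # invented item responses
totals = [42, 38, 25, 40, 30, 22, 35, 44]  # invented total scores
print(round(point_biserial(items, totals), 3))  # → 0.906
```

As the posts note, the statistic assumes examinees are trying to answer correctly and independently, so it is uninterpretable when students deliberately pick wrong options to see the feedback or sit the test in groups.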
Rebecca Sisk
Posts: 5

29/05/2007 20:53  
Very helpful answers. Now I understand a little better that you are using the test questions the students write as formative assessment only. It sounds like your challenge here is to find a way to make the process of writing vignettes and questions a "meatier" activity than taking the formative test. Maybe some type of reflective activity and/or group discussion, thinking through the vignette and why the various alternative answers are right or wrong, would add meaning to the activity, though there is only so much time in a course to help the students gain the most important learning.
Alison Muirhead
Posts: 14

30/05/2007 14:04  
Apologies, the following post was inadvertently deleted:

James Oldham; 29/05/07; 11:41
At Peninsula Medical School the students have developed a database of 400 assessment items in the format of the progress test (clinical vignette, question, 5 options). The students write the items with an emphasis on 'response contingent feedforward'. We deliver the tests online. Initial evaluations suggest that item writing is an effective learning environment both for specific content expertise and metacognition. We are now in the process of formally integrating item writing into the curriculum and will evaluate the outcomes more rigorously. Can anyone recommend any literature that explores the educational advantages and disadvantages of student generated assessment in the field of applied medical knowledge?
Andy Sharp
Posts: 11

30/05/2007 23:22  
Hi Alison

Here is a suggestion for your literature search. It may not meet your needs exactly but may provide some useful insights.

Marbach-Ad, Gili, & Sokolove, Phillip G. (2000). Can undergraduate biology students learn to ask higher level questions? Journal of Research in Science Teaching, 37(8), 854-870.

http://www3.interscience.wiley.com/cgi-bin/abstract/73502045/ABSTRACT?CRETRY=1&SRETRY=0

Abstract
Our goals in this study were to explore the type of written questions students ask after reading one or more chapters from their textbook, and to investigate the ability of students to improve their questions during the course of a single semester. In order to classify students' questions we used a taxonomy that we have developed specifically for this purpose. Two comparable populations were examined: Undergraduate students in a large, introductory biology class who were taught in traditional lecture format, and students in a similar class who were taught in cooperative/active learning style. After the taxonomy was presented to the active learning class, more students were able to pose better, written questions. Their questions became more insightful, thoughtful, and content-related, and were not easily answered by consulting the textbook or another readily available source. The best questions could be recast as scientific research questions (i.e., hypotheses). In contrast, when the taxonomy was presented to students in the traditionally taught class, the quality of student-posed questions was largely unchanged. Various explanations for the difference in outcomes are discussed, and methods are suggested about how generally to encourage students' questions and to improve their question-asking skills regardless of overall teaching style. © 2000 John Wiley & Sons, Inc. J Res Sci Teach 37: 854-870, 2000

Hope this is useful to your search
Forums > Keynotes > David Boud > Student development of assessment items