|About the Case Study Library|
In the case study library you will find all the resources you need to participate in the conference, including keynotes, case studies, panel position statements and other supporting documents. You can also access them through the corresponding session page.
The library includes the collection of case studies that are being featured at the REAP conference. These are organised by theme, with titles that correspond to the conference sessions.
|Case Study Library|
- T1 - Assessment and the First Year Experience
- T2 - Great Designs for Assessment
- T3 - Institutional Strategies for Assessment
- Panel Sessions
- Feast of Case Studies
|Great Designs presentation and audio||David Boud keynote presentation 'Great Designs: what should assessment do?'|
|chat great designs.pdf||Transcript of 'Great designs: what should assessment do?' live chat session held on 30th May 2007|
David Nicol with Steve Draper
|Principles of good assessment and feedback.pdf||David Nicol keynote paper 'Principles of good assessment and feedback: Theory and Practice'|
|Principles of good assessment and feedback presentation||David Nicol keynote presentation 'Principles of good assessment and feedback: Theory and Practice'|
|A momentary review of assessment principles.pdf||Steve Draper keynote paper 'A Momentary Review of Assessment Principles'|
|chat theory and practice.pdf||Transcript of 'Theory and practice of assessment' live chat session held on 29th May 2007|
|Assessment old principles new wrapping.pdf||Mantz Yorke keynote paper 'Assessment, especially in the first year of higher education: old principles in new wrapping?'|
|chat assessment in the first year.pdf||Transcript of 'Student input to assessment design & strategy' live chat session held on 31st May 2007|
|Culture of Evidence Color Version.pdf||Trudy Banta - Keynote handout 'Planning, Evaluation, and Improvement at IUPUI'|
|Development in Reflective Thinking.pdf||Trudy Banta - Keynote handout 'Development in Reflective Thinking'|
|Trudy Banta - Keynote.ppt||Trudy Banta - Keynote presentation 'Using Electronic Portfolios to Assess Learning at IUPUI' featuring Trudy Banta, Sharon Hamilton & Susan Kahn, Indiana University-Purdue University Indianapolis|
|chat using electronic portfolios.pdf||Transcript of 'Using electronic portfolios to assess learning at IUPUI' live chat session held on 30th May 2007|
^ T1 - Assessment and the First Year Experience
^ Effective feedback to 550 students
|Online collaborative work large first year psychology class.pdf||Jim Baxter, University of Strathclyde, UK.
Jim Baxter's case study focuses on Basic Psychology at Strathclyde University, currently the largest psychology class in the UK. The student roll has in some years totalled 610 students and has never fallen below 520 students in the last 12 years. Although successful in terms of student enthusiasm and engagement with the lecture course, one persistent problem with the class was that more material was presented in lectures than was referred to by the majority of students in their examination answers. This imbalance was, of course, the reverse of what should be the case when students take responsibility for their learning. Another difficulty was that no system of early assessment - summative or formative - was available to Basic Psychology students. Setting conventional essays was not practicable given the size of the class, with the result that students received no feedback on their performance during each semester apart from multiple-choice class tests held in December and April. This paper presents details of an initiative designed to provide early and regular formative assessment opportunities to this large class. An example of the scheme working is presented, and students' ratings of the scheme are reported and discussed.|
|Rowntree session topic review effective feedback to 550 students.pdf||Review for Session Topic: Effective Feedback to 550 Students
Commentary on: Baxter "A Case Study of Online Collaborative Work in a Large First Year Psychology Class"
By Prof Derek Rowntree, The Open University, UK.|
|chat effective feedback to 550 students.pdf||Transcript of 'Effective feedback to 550 students' live chat session held on 29th May 2007|
^ Interaction of peer & tutor feedback
|Collaborative problem solving in first year physics.pdf||Simon Bates, University of Edinburgh, UK.
In this case study Bates describes an assessment design for collaborative group working of first year students in Physics, implemented in teaching activities called 'workshops' that were originally introduced 5 years ago as a replacement for the standard tutorial-plus-laboratory format. These workshops comprise a variety of different group activities, but the focus here is on collaborative problem-solving and the way that the workshop activities feed into the assessments for the course. The case study addresses themes 1 and 2 of this conference (first year experience and assessment designs). Bates also describes the spread of this activity throughout first and second year teaching in Physics and how it has led to a reconsideration of the importance of variety in teaching accommodation at an institutional level.|
|Formative assessment in a professional doctorate context.pdf||Barbara Crossouard and John Pryor, University of Sussex, UK.
Crossouard and Pryor's case study describes a task designed to support formative assessment in doctoral contexts. It was conducted within the collaborative EU-funded project 'Internet-Based Assessment' (2002-2004). Formative assessment was conceptualised from the perspective of sociocultural learning theories. These emphasise learning as 'becoming': students' construction of new identities (here, as researchers) through engagement in authentic tasks. The task design involved a series of peer and tutor formative assessment activities in a blended learning setting. The study has added interest from having been implemented by a tutor whose own research area was formative assessment (see Pryor and Torrance, 2000; Torrance and Pryor, 1998; 2001) and for its contribution to formative assessment theory from a sociocultural perspective.|
|Elton session topic review interaction of peer and tutor feedback.pdf||Review for Session Topic: The interaction of peer & tutor feedback
Commentary on: Bates "Collaborative Problem-solving in First Year Physics" and Crossouard & Pryor "Formative assessment in a professional doctorate context: developing identities as researchers"
By Prof Lewis Elton, University of Manchester|
|chat interaction of peer and tutor feedback.pdf||Transcript of 'Interaction of peer & tutor feedback' live chat session held on 31st May 2007|
^ Writing for scientists
|Integrating feedforward on academic writing.pdf||Charlotte Taylor, University of Sydney, Australia.
In this case study Taylor describes a major initiative in providing feedback to students engaged in writing activities in a first year science unit. The Writing in Biology program has provided interesting challenges, mainly associated with the large number of students and staff involved (n=1000 and 50 respectively). We have however persisted in this endeavour since we consider writing an essential component of the undergraduate curriculum, which must be integral to learning from the beginning of the degree program.
Activities include a series of opportunities for practising writing, individual face-to-face feedback sessions for all students, and an online discussion forum during the writing process. Students and staff have actively engaged with the activities, and evaluations have consistently indicated enthusiasm for more practice and feedback opportunities. We have also incorporated a series of educational research projects into this program so that we can better understand how students learn while writing.|
|Laboratory reports reflective essays and contributing student approach.pdf||John Hamer, University of Auckland, New Zealand.
Requesting reflective essays instead of technical reports has led to students observing more, writing more, developing their own personal writing style, and becoming more self-aware and self-critical. The essays complement a course taught using Collis' Contributing Student Approach by providing a rich source of material for use in subsequent learning resources such as course notes and self-evaluation quizzes. They also provide timely feedback, both for the students and for the lecturer, and help foster a collaborative, supportive learning environment.|
|Milligan session topic review writing for scientists.pdf||Review for Session Topic: Writing for Scientists
Commentary on: Hamer "Laboratory Reports, Reflective Essays, and the Contributing Student Approach" and Taylor "Integrating Feedforward on academic writing into an undergraduate science course"
By Dr Colin Milligan, University of Strathclyde|
|chat writing for scientists.pdf||Transcript of 'Writing for scientists' live chat session held on 31st May 2007|
^ T2 - Great Designs for Assessment
^ Collaborative writing in divergent disciplines
|Essay writing with peer reviewing and marking.pdf||Quintin Cutts, Department of Computing Science, University of Glasgow, UK
This case study examines a coursework assessment design, largely formative, aimed at improving students' writing and reviewing skills. The assessment makes use of an online assignment system that supports essay submission, blind reviewing of those essays by students, and then blind summative assessment of the reviews, again by students. Students additionally write a formal response-to-reviewers, and then summatively assess these. The students are required repeatedly to evaluate the arguments of other students in essays, in reviews, and in responses-to-reviewers and compare them with their own viewpoints. The aim is to move them from a Perry-like stage where only their view counts and there is only one correct viewpoint, to a higher stage where multiple viewpoints are valid and they need to find their own position within them. Rather than relying on staff to make judgements, students must take a step forward in practising and employing their own ability to make judgements about work.
This case study fits into Theme 2, Great designs for assessment. It is ideally fitted to the overall conference title of Assessment Design for Learner Responsibility, since engendering a responsible attitude among the students is a key attribute of the design.
|Shakespeare page stage screen.pdf||Nandini Das and Stuart McGugan, University of Liverpool, UK
Das and McGugan's case study reports on an English assessment where editorial teams of students work with a passage from a Shakespearean play and prepare the text either for publication in a modernised, scholarly edition, or for a modern theatrical performance. The assessment demands that the results be presented in the form of the annotated 'edition' of the chosen passage and a properly referenced, 3000-word commentary on editorial and directorial decisions. Introduced in the academic year 2005-2006, the assessment represented significant innovation in a research-led School which places small group tutorial teaching at the heart of the educational experience. The case illustrates how imaginative assessment design can be used to develop the skills of critical reasoning and independent group decision making.|
|Owen session topic review collaborative writing in divergent disciplines.pdf||Review for Session Topic: Collaborative Writing in Divergent Disciplines
Commentary on: Cutts "Essay Writing with Peer Reviewing and Marking" and Das & McGugan "Shakespeare: Page Stage Screen"
By Catherine Owen, University of Strathclyde|
|chat collaborative writing in divergent disciplines.pdf||Transcript of 'Collaborative writing in divergent disciplines' live chat session held on 31st May 2007|
^ In-class vs out-of-class work
|Collaborative assessment using clickers.pdf||Maha Bali and Heather Keaney, American University in Cairo, Egypt.
Bali and Keaney's case study reports on the American University in Cairo (AUC) piloting the use of clickers (also known as personal response systems) from fall 2006. Most instructors piloting this new technology were from economics, science or engineering disciplines, as it was found easier to create multiple-choice questions for their classes. In spring 2007, however, clickers were piloted in a history class. The case presents a one-off use of technology in assessment: a graded activity in which students worked in teams to answer questions using clickers. Prior to this activity, instructors at AUC had succeeded when following the 'best practice' of using clickers for non-graded formative assessment only; however, this graded activity showed improvement in student learning, and 95% of students said they would like to use clickers in a similar activity again.|
|Learning gains my ARS.pdf||Andy Sharp and Angela Sutherland, Glasgow Caledonian University, UK.
Sharp and Sutherland's case study evaluates the experience of a group of Vision Science students exposed to Audience Response System (ARS) technology. Uniquely, students were given ownership of the ARS software. Students constructed knowledge on several pre-allocated themes with the aim of engaging peers in self-learning, peer-to-peer learning, discourse and assessment of peer responses using the ARS. Guidance was given on question design, engagement and formative assessment. Findings demonstrate that both immediate informal and delayed formal feedback are significant in helping students deepen their learning, while improving motivation, enjoyment and engagement. Evaluations indicate that having students use the ARS to generate questions can be effective in developing higher order learning.|
|Draper session topic review in class vs out class work by students.pdf||Review for Session Topic: In-class vs out-of-class work by students
Commentary on: Bali & Keaney "Collaborative Assessment Using Clickers" and Sharp & Sutherland "Learning Gains My (ARS) The impact of student empowerment using Audience Response Systems Technology on Knowledge Construction, Student Engagement and Assessment"
By Dr Steve Draper, University of Glasgow
|chat inclass vs out of class work.pdf||Transcript of 'In-class vs out-of-class work' live chat session held on 29th May 2007|
^ Raising students' meta-cognition
|Certainty based marking for reflective learning and knowledge assessment.pdf||Tony Gardner-Medwin, University College London, UK and Nancy Curtin, Imperial College London, UK.
Gardner-Medwin and Curtin's case study focuses on Certainty-Based Marking (CBM), which involves asking students not only for the answer to an objective question, but also how certain they are that their answer is correct. The mark scheme rewards accurate reporting of certainty and good discrimination between more and less reliable answers. This encourages reflection about the justification and soundness of relevant knowledge and skills, and probes weaknesses more deeply. It is easily implemented with existing test material, popular with students, grounded firmly in information theory and proven to enhance the quality of exam data. We report our experience with CBM and raise questions about constructive, fair and efficient assessment.|
|Developing clinical self assessment skills in 1st year dental.pdf||Tracey Winning, Dimitra Lekkas & Grant Townsend, University of Adelaide, Australia|
|Watson session topic review raising metacognition.pdf||Review for Session Topic: Raising students' meta-cognition (self-assessment) abilities
Commentary on: Gardner-Medwin & Curtin "Certainty-Based Marking (CBM) for reflective learning and proper knowledge assessment" and Winning et al. "Developing clinical self-assessment skills in first-year dental students"
By Dr Nigel Watson, University of Strathclyde, UK|
|chat raising students meta cognition.pdf||Transcript of 'Raising students' meta-cognition (self-assessment) abilities' live chat session held on 30th May 2007|
^ Students deciding assessment criteria
|Students engagement in development of assessment criteria.pdf||Rosario Hernandez, University College Dublin, Ireland
This case study describes the process of engaging students in the development of criteria to be adopted in teacher, peer and self-assessment practices. The study was undertaken with undergraduate students of Hispanic Studies participating in a semester-long module whose main aim is the development of students' written competence in Spanish. The involvement of students in the development of assessment criteria was an attempt to move towards a student-centred approach to teaching and learning, and to integrate alternative assessment practices with the teaching experience. This case study is interesting because the involvement of students in the development of criteria to be adopted for self-, peer, and teacher assessment of their learning is an influential exercise that empowers learners to take an active role in the assessment process. This exercise also contributes to the development of students' ownership of their learning through assessment.|
|Taras session topic review students deciding on assessment criteria.pdf||Review for Session Topic: Students deciding on assessment criteria
Commentary on: Hernández "Students engagement in the development of written criteria to assess written tasks"
By Maddalena Taras, University of Sunderland|
|chat students deciding assessment.pdf||Transcript of 'Students deciding assessment criteria' live chat session held on 31st May 2007|
^ Web 2.0 pedagogic design
|A wikied assessment strategy.pdf||Mark Atlay, Lesley Lawrence and Mark Gamble, University of Bedfordshire, UK.
Atlay et al's case study describes the assessment strategy used on a module as part of a postgraduate certificate in academic practice for tutors at HE level. The module has a focus on making the links between pedagogies and practice.
Two apparently opposing factors influenced the design of the assessment strategy. Firstly, since the participants were all academic staff (albeit with varying degrees of experience of teaching), an important aspect was to draw on participants' own experiences and to develop the notion of collaborative learning and a community of practice (Lave and Wenger). Secondly, the nature of the subject matter, which involves extensive reading, and the geographical spread of the twenty or so participants suggested more of a distance-learning emphasis. A third factor, encouraging participants to think creatively about the assessment strategies they use for their students and their relevance to students' needs, provided a sub-plot to the strategy implemented. The assessment strategy implemented combined the development of a collaborative wiki with an analysis of critical incidents drawn from participants' own practice.|
|Using wikis for summative and formative assessment.pdf||Marija Cubric, University of Hertfordshire, UK.
Cubric's case study describes a wiki-based assessment strategy, and the underlying "blended learning" process, that have been formulated and implemented in a series of trials at the University of Hertfordshire Business School (Cubric, 2006; 2007). The main motivation for the use of wikis was to gain regular insight into students' understanding, so as to enable more targeted and frequent feedback. The common characteristic of all four trials was that they were based on weekly wiki updates by students, triggered by tutor-set questions and assessed. The results of the trials have shown that students like the idea of using wikis for learning, particularly if supported by a well-defined learning and teaching process.|
|Kandlbinder session topic review web2 pedagogic design.pdf||Review for Session Topic: It’s Not Just Web 2.0, it’s All About Pedagogic Design
Commentary on: Atlay, Lawrence and Gamble “A Wikied Assessment Strategy” and Cubric “Using Wikis for Summative and Formative Assessment”
By Peter Kandlbinder, University of Technology Sydney, Australia
|chat web2 pedagogic design.pdf||Transcript of 'Web 2.0 pedagogic design' live chat session held on 29th May 2007|
^ T3 - Institutional Strategies for Assessment
^ Aligning assessment at institutional level
|Adoption of innovation literature to guide institutional strategies for assess.pdf||Peter Gray, United States Naval Academy, USA.
|Developing an assessment procedure to enhance student learning outcomes.pdf||Sean A. McKitrick, Binghamton University, USA.
This case study describes the initial organization of an assessment process that evaluates student performance with regard to critical thinking and information management, a general education student learning outcome at Binghamton University (State University of New York) which is assumed to be integrated throughout the university's general education curriculum. The case study adopts an organizational perspective toward assessment, positing that, without the inclusion of key faculty groups in initial attempts to assess learning outcomes, and without a central system for implementing assessment efforts, it is very difficult to impact curriculum, teaching, and learning in critical thinking/information management. The case study applies to universities, university divisions, departments, and programs that want to learn about organizational strategies of assessment that could, in the end, impact teaching, curriculum, and pedagogical initiatives that affect student learning. This case study fits under the themes "great designs for assessment" and "institutional strategies (designs) for assessment."|
|Ehrmann session topic review aligning assessment at institutional level.pdf||Review for Session Topic: Aligning assessment at the institutional level
Commentary on: McKitrick "Developing an assessment procedure to enhance student learning outcomes in critical thinking/information management" and Gray "United States Naval Academy Case Study Using the Adoption of Innovation Literature to Guide Institutional Strategies for Assessment"
By Dr Stephen C. Ehrmann, The Flashlight Program, The TLT Group|
|chat aligning assessment.pdf||Transcript of 'Aligning assessment at institutional level' live chat session held on 29th May 2007|
^ Panel Sessions
^ Student input to assessment design & strategy
|Creanor position statement for HEA panel.pdf||Position Statement for Higher Education Academy Panel: Student input to assessment design & strategy
"Should learners be involved in designing assessments and strategy?"
By Linda Creanor, Caledonian Academy, Glasgow Caledonian University
|Price position statement for HEA panel.pdf||Position Statement for Higher Education Academy Panel: Student input to assessment design & strategy
"Student involvement - cheap labour or learning partnerships?"
By Professor Margaret Price, Director of ASKe
|Smith position statement for HEA panel.pdf||Position Statement for Higher Education Academy Panel: Student input to assessment design & strategy
"Can we involve students as key change agents to enhance assessment?"
By Professor Brenda Smith, Assistant Director, The Higher Education Academy
|McCloskey position statement for HEA panel.pdf||Position Statement for Higher Education Academy Panel: Student input to assessment design & strategy
By Katy McCloskey, Past President, University of Strathclyde Students Association
|chat student input to assessment.pdf||Transcript of 'Student input to assessment design & strategy' live chat session held on 31st May 2007|
^ Sharing responsibility for assessment: reflections
|chat sharing responsibilty for assessment.pdf||Transcript of 'Sharing responsibility for assessment: reflections' live chat session held on 31st May 2007|
^ Feast of Case Studies
|Addressing examination issues in criterion reference assessment.pdf||Clair Hughes, University of Queensland|
|Assessing students learning process.pdf||Yandi Andri Yatmo & Paramita Atmodiwirjo, Department of Architecture, University of Indonesia|
|Assessment for learning current practice exemplars from CETL.pdf||Liz McDowell et al., CETL|
|Assessment in EAP courses a communicative approach.pdf||Dr. Francine Robinson|
|Assessment using mcqs on the first year.pdf||Aidan O'Dwyer, School of Control Systems and Electrical Engineering, Dublin Institute of Technology|
|A quiz a week keeps anxiety at bay.pdf||Laurine Hurley, Australian Catholic University|
|Competencies management tool for autonomous learning and formative assessment.pdf||F. Georges & C. Dupont, LabSET-IFRES, University of Liège, Belgium|
|Criteria for assessing pre service teacher performance.pdf||Natalie Brown, University of Tasmania|
|Development of a professional experience rubric.pdf||Natalie Brown, University of Tasmania|
|DIDET project cross continent assessment in design engineering.pdf||Caroline Breslin, Hilary Grierson & Andrew Wodehouse, University of Strathclyde|
|Examples of assessment design for learner responsibility.pdf||Richard Baker, College of Science, Australian National University|
|Extending the pedagogic role of online interactive assessment.pdf||Sally Jordan, Barbara Brockbank and Philip Butcher, The Centre for the Open Learning of Mathematics, Computing, Science and Technology (COLMSCT) The Open University|
|Formative assessment for progress tests of applied medical knowledge.pdf||James Oldham, Peninsula Medical School, University of Plymouth|
|Formative feedback on a teacher education degree using a PLE.pdf||Magnus M B Ross & Mary P Welsh, University of Strathclyde|
|Fostering participation in electronic forums.pdf||Wajeeh Daher, Kasemi College|
|General education evaluation in a baccalaureate nursing program.pdf||Rebecca Sisk & Kimberly Johnston, Methodist College of Nursing, Peoria, IL, U.S.A.|
|Generative learning and assessment strategies concept mapping.pdf||S.M. Brüssow & A.C. Wilkinson|
|Grand designs for assessment fieldtrip consultancy.pdf||James Derounian|
|Great designs for assessment planned or emergent strategy.pdf||Susan M Ogden & Alec Wersun, Glasgow Caledonian University|
|Institutional strategies for e assessment City University.pdf||Neal Sumner & Annemarie Cancienne, City University|
|Leadership and assessment strengthening the nexus.pdf||Marina Harvey & Sharon Fraser, Macquarie University|
|Phased on-line summative assessment in first year accounting.pdf||Pru Marriott, Alice Lau & David Lewis, University of Glamorgan|
|Principles of good online assessment design.pdf||David J. Walker, University of Dundee|
|Redesigning computer based assessment tests for learning.pdf||Carol Collins, Learning and Skills Network (LSN) & Professor Angus Duncan, University of Bedfordshire|
|Reflective journaling as assessment and teaching.pdf||Anna Fortson & Rebecca Sisk, Methodist College of Nursing, Peoria, IL, U.S.A|
|Relative requirements and responsibility in assessment design.pdf||Damian Ruth, Department of Management and Enterprise Development, Massey University, Wellington|
|Signposting learning defined learning outcomes to facilitate alignment.pdf||Maureen M. Morris & Dr Anne Porter, University of Wollongong, Australia|
|Simulation learning and professional legal practice||Paul Maharg, Glasgow Graduate School of Law, University of Strathclyde, Glasgow, UK|
|Teaching portfolio for professional development in higher education.pdf||M. Poumay & C. Dupont, LabSET-IFRES, University of Liège, Belgium|
|Transforming learning in life sciences.pdf||Katarzyna Hempel, Linda Morris & Marcus Cross, CeLLS Project, University of Dundee / College of Life Sciences|
|Using a faculty-created transformational assessment paradigm for general education.pdf||Catherine M. Wehlburg, Texas Christian University|
|Using technology in assessment.pdf||Tshepo Batane, University of Botswana|
|Using technology to encourage study and engagement with feedback.pdf||Richard C. Rayne & Glenn K. Baggott, School of Biological and Chemical Sciences, University of London|
|Using tutor and demonstrators to support assessment.pdf||Christos Petichakis & Stuart McGugan, Educational Development Division, Centre for Lifelong Learning, University of Liverpool|
|chat feast of case studies.pdf||Transcript of 'Feast of case studies' live chat session held on 30th May 2007|
|Some questions about assessment.pdf||Stylianos Hatzipanagos, Ana Lucena & Steven Warburton, King's College London|
|Three principles for usable feedback.pdf||David J. Walker, University of Dundee|