3. Constructing multiple choice questions as a method for learning.
Annals of the Academy of Medicine, Singapore 2006;35(9):604-608
INTRODUCTION: Many different strategies exist to encourage students to increase their knowledge and understanding of a subject. This study was undertaken to measure the effect of student-based construction of multiple choice questions (MCQs) as a stimulus for learning and understanding topics in clinical surgery.
MATERIALS AND METHODS: The study was carried out at the University of Adelaide, Australia and had 2 components. Fourth-year students were required to provide a case study during a surgical attachment, and half of the group was asked to supplement this with MCQs. These students were pre- and post-tested, and the effect of the additional intervention (MCQ construction) was measured. Fifth-year students were polled on their preferred methods of learning before and after a learning exercise in which they were asked to give a case presentation and create some MCQs.
RESULTS: The MCQs designed by the students were of a high standard and clearly displayed an understanding of the topics concerned. The 4th-year students in the MCQ-construction group showed outcomes equivalent to those of the case-study control group. Students initially ranked MCQ construction amongst the least stimulating methods of learning, but after the exercise their opinion was significantly more favourable, although still much less so than for traditional learning methods (tutorials, books).
CONCLUSIONS: Construction of MCQs as a learning tool is unfamiliar to most students and is an unpopular learning strategy. However, students are capable of producing high-quality questions, and the challenge for medical faculties is how best to use this initiative to the students' advantage.
Australia ; Educational Measurement ; methods ; Educational Status ; Humans ; Learning ; Students, Medical ; Surveys and Questionnaires ; standards
5. A simple instrument for the assessment of student performance in problem-based learning tutorials.
Si-Mui SIM ; Nor Mohd Adnan AZILA ; Lay-Hoong LIAN ; Christina P L TAN ; Nget-Hong TAN
Annals of the Academy of Medicine, Singapore 2006;35(9):634-641
INTRODUCTION: A process-oriented instrument was developed for the summative assessment of student performance during problem-based learning (PBL) tutorials. This study evaluated (1) the acceptability of the instrument by tutors and (2) the consistency of assessment scores by different raters.
MATERIALS AND METHODS: A survey of the tutors who had used the instrument was conducted to determine whether the assessment form was user-friendly. The 4 competencies assessed, using a 5-point rating scale, were (1) participation and communication skills, (2) cooperation or team-building skills, (3) comprehension or reasoning skills and (4) knowledge or information-gathering skills. Tutors were given a set of criteria guidelines for scoring the students' performance in these 4 competencies. Tutors were not attached to a particular PBL group, but took turns to facilitate different groups on different case or problem discussions. Assessment scores for one cohort of undergraduate medical students in their respective PBL groups in Year I (2003/2004) and Year II (2004/2005) were analysed. The consistency of scores was analysed using intraclass correlation.
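The entry above analyses inter-rater consistency with an intraclass correlation. As an illustrative sketch only (this is not the authors' code, and the score table is invented example data), a one-way random-effects ICC(1,1) of the kind commonly used for rater-consistency checks can be computed as:

```python
# Hedged sketch: one-way random-effects intraclass correlation, ICC(1,1).
# Rows = students (targets), columns = raters. The data below is made up.

def icc1(scores):
    n = len(scores)          # number of targets (students)
    k = len(scores[0])       # number of raters per target
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    # Between-target and within-target mean squares from a one-way ANOVA
    ms_between = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_within = sum((x - row_means[i]) ** 2
                    for i, row in enumerate(scores)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

scores = [
    [4, 4, 5],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]
print(round(icc1(scores), 3))  # → 0.819
```

Values near 1 indicate that raters order and score students consistently; "strict" or "indiscriminate" raters of the kind the abstract mentions pull the coefficient down.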
RESULTS: The majority of the tutors surveyed reported no difficulty in using the instrument and agreed that it helped them assess the students fairly. Analysis of the scores obtained for the above cohort indicated that the different raters were relatively consistent in their assessment of student performance, although a small number consistently showed either "strict" or "indiscriminate" rating practices.
CONCLUSION: The instrument designed for assessing student performance in the PBL tutorial setting is user-friendly and is reliable when used judiciously with the criteria guidelines provided.
Education, Medical ; methods ; standards ; Educational Measurement ; methods ; Humans ; Problem-Based Learning ; standards ; Retrospective Studies ; Students, Medical ; Surveys and Questionnaires ; standards
6. A comparison of learning strategies, orientations and conceptions of learning of first-year medical students in a traditional and an innovative curriculum.
Kosala N MARAMBE ; T Nimmi C ATHURALIYA ; Jan D VERMUNT ; Henny Pa BOSHUIZEN
Annals of the Academy of Medicine, Singapore 2007;36(9):751-755
INTRODUCTION: Students adapt their learning strategies, orientations and conceptions to differences in the learning environment. The new curriculum of the Faculty of Medicine, University of Peradeniya, Sri Lanka, which commenced in 2005, places greater emphasis on student-centred learning. The aim of this study was to compare the learning strategies, orientations and conceptions of a traditional-curriculum student group and a new-curriculum student group, measured at the end of the first academic year by means of a validated Sri Lankan version of the Inventory of Learning Styles (ILS).
MATERIALS AND METHODS: The Adyayana Rata Prakasha Malawa (ARPM), a 130-item Sinhala version of the ILS, was administered to students of the traditional curriculum and of the new curriculum at the end of their respective first academic years. Mean scale scores of the 2 groups were compared using independent-sample t-tests.
RESULTS: Students of the new curriculum reported significantly greater use of critical processing, concrete processing, and memorising and rehearsing strategies than those in the traditional-curriculum group. With respect to learning orientations, personal interest scores were significantly higher among the new-curriculum students, while ambiguity scores were significantly lower.
CONCLUSION: The results support the assumption that the changes made to the organisation of subject content and to the instructional and assessment methods have a positive impact on students' use of learning strategies and on their motivation.
Curriculum ; standards ; Educational Measurement ; Humans ; Learning ; Orientation ; Sri Lanka ; Students, Medical ; psychology
7. Student academic committees: an approach to obtain students' feedback.
Dujeepa D SAMARASEKERA ; Indika M KARUNATHILAKE ; Ranjan DIAS
Annals of the Academy of Medicine, Singapore 2006;35(9):662-663
In 1995, the Colombo Medical Faculty changed its curriculum from a traditional model to an integrated one. The major challenge for the Faculty was obtaining students' feedback on their learning activities. To overcome this, a new method was developed in which staff and student groups from different years of study engage in interactive discussions about their learning environment. This feedback was then processed and forwarded to the relevant authorities for necessary action.
Curriculum ; standards ; Education, Medical ; methods ; Educational Measurement ; methods ; Feedback ; Humans ; Students, Medical ; Surveys and Questionnaires
8. An online evaluation of problem-based learning (PBL) in Chung Shan Medical University, Taiwan - a pilot study.
Jia-Yuh CHEN ; Meng-Chih LEE ; Hong-Shan LEE ; Yeou-Chih WANG ; Long-Yau LIN ; Jen-Hung YANG
Annals of the Academy of Medicine, Singapore 2006;35(9):624-633
INTRODUCTION: Problem-based learning (PBL) embraces principles of good learning and teaching. It is student-directed, fosters intrinsic motivation, promotes active learning, encourages peer teaching, involves timely feedback, and can support student self- and peer assessment. The most important functions of the assessment process are to enhance student learning, improve the curriculum and improve teaching.
MATERIALS AND METHODS: To improve the PBL tutorials at Chung Shan Medical University, we developed an online evaluation system containing evaluation forms for students, tutors, and self- and peer assessment. The Cronbach alpha reliability coefficients were 0.9480, 0.9103 and 0.9198 for the Student, Tutor, and Self and Peer Evaluation Forms, respectively. The online evaluations were mandatory for both students and tutors, and the information was completely anonymous.
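The entry above reports Cronbach alpha reliability coefficients for each evaluation form. As an illustrative sketch only (not the authors' code; the ratings matrix is invented example data), Cronbach's alpha for a multi-item form can be computed as:

```python
# Hedged sketch: Cronbach's alpha for a k-item rating form.
# Each inner list holds one item's scores across all respondents (made-up data).

def cronbach_alpha(items):
    k = len(items)                       # number of items on the form
    n = len(items[0])                    # number of respondents

    def variance(xs):                    # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

ratings = [
    [4, 5, 3, 4, 5],  # item 1, scores from 5 respondents
    [4, 4, 3, 5, 5],  # item 2
    [3, 5, 2, 4, 4],  # item 3
]
print(round(cronbach_alpha(ratings), 3))  # → 0.877
```

Coefficients above roughly 0.9, as reported for all three forms in the study, are conventionally read as high internal consistency.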
RESULTS AND CONCLUSIONS: The response rates of the online evaluations ranged from 95.6% to 100%. The online evaluations provided documented feedback to the students on their knowledge, skills and attitudes. Tutors, in turn, received feedback from students on the appropriateness and effectiveness of their tutoring of the group. Although there was an initial lack of coordination among both students and the Faculty regarding responsibilities and how to use the online system, the system enabled us to examine how effective our PBL course had been, and it provided both process and outcome evaluations. Our strategy for evaluating the success of PBL is only at its initial stage; we are in an ongoing process of collecting outcome data for further analysis, which will hopefully provide more constructive information for the PBL curricula.
Education, Medical ; standards ; Educational Measurement ; Humans ; Online Systems ; Pilot Projects ; Problem-Based Learning ; methods ; Taiwan ; Universities
9. Computer-based versus pen-and-paper testing: students' perception.
Erle C H LIM ; Benjamin K C ONG ; Einar P V WILDER-SMITH ; Raymond C S SEET
Annals of the Academy of Medicine, Singapore 2006;35(9):599-603
BACKGROUND: Computer-based testing (CBT) has become increasingly popular as a testing modality in undergraduate and postgraduate medical education. Since 2004, our medical school has used CBT to conduct 2 papers for the third- and final-year assessments: Paper 3, with 30 multiple choice questions featuring clinical vignettes, and the modified essay question (MEQ) paper.
AIMS: To obtain feedback from final-year students on their preferred mode of testing for the Paper 3 and MEQ components of the Medicine track examination, and the reasons underlying their preferences.
METHODS: An online survey of 213 final-year undergraduates was carried out, in which they were asked to provide feedback on the Paper 3 and MEQ papers. Students were asked whether they thought the CBT format preferable to the pen-and-paper (PNP) format for Paper 3 and the MEQ, and why.
RESULTS: One hundred and fourteen of the 213 (53.5%) students completed the online survey. For Paper 3, 91 (79.8%) felt that CBT was preferable to PNP, 11 (9.6%) preferred the PNP format and 12 (10.5%) were unsure. For the MEQ, 62 (54.4%) preferred CBT over PNP, 30 (26.3%) preferred the PNP format and 22 (19.3%) were unsure. Reasons given for preferring CBT over PNP for Paper 3 included independence from seating position, better image quality (images were shown on personal computer screens rather than projected onto a common screen) and the fact that CBT allowed students to proceed at their own pace. For the MEQ, better image quality, neater answer scripts and better indication of expected answer length were cited as reasons for preferring CBT.
CONCLUSIONS: Our survey indicated that although a majority of students preferred CBT over PNP for both papers, the margin of preference was considerably smaller for the MEQ.
Clinical Competence ; Computers ; Education, Medical ; methods ; standards ; Educational Measurement ; methods ; Humans ; Students, Medical