BIMHSE Archive

From assessment to diagnosis: A two-part workshop on writing and interpreting multiple-choice assessments

  • Date / Time
27 Nov 2024
18 Dec 2024
  • Abstract
Navigating the complex world of academic assessment requires a deep understanding of multiple-choice test construction and evaluation. This two-part workshop series is designed to help educators improve multiple-choice assessments and analyze their effectiveness in measuring student learning.

PART I: Fundamentals of writing multiple choice tests

Date: 27 November, 2024 (Wednesday)
Time: 1:00-2:00pm
Venue: Seminar Room A6-09, 6/F, William MW Mong Block, 21 Sassoon Road

Designing effective multiple-choice tests is essential for assessment of/for/as learning. Clang associations, window dressing, ambiguity, and grammatical clues are only some of the flaws that can undermine a test. In this interactive workshop, participants will delve into the fundamentals of creating high-quality multiple-choice assessments that truly measure students' learning. The workshop will begin with an overview of the importance of well-designed tests in the learning process. We will discuss the anatomy of multiple-choice tests, focusing on how best to design each part: the stem and the choices (key and distractors). We will examine common pitfalls in question construction and share best practices for crafting clear, concise, and unbiased questions that challenge students to think critically. Participants will learn how to use actionable insights to identify problematic questions, refine distractors, and improve overall test quality.


PART II: Are your tests measuring up? Uncovering the fundamentals of item analysis for classroom assessments

Date: 18 December, 2024 (Wednesday)
Time: 12:30-2:00 pm
Venue: Yu Chun Keung Medical Library Knowledge Hub

How can we ensure that the tests we develop are effective?
Are the items too easy or too challenging for our students?
Are the test options attractive enough? If so, to whom are the correct and incorrect choices attractive – to low- or high-performing students?
Can our tests accurately discriminate between students of varying ability?

This workshop, designed specifically for new teachers, aims to enhance their skills in creating and evaluating classroom assessments. The session will delve into the crucial aspects of item analysis, including item difficulty, item discrimination, item distractors, and response frequency. Participants will learn to calculate and interpret item difficulty—the proportion of test-takers who answer an item correctly—and item discrimination, which reveals how well an item differentiates between high and low performers. The seminar will also highlight the importance of crafting attractive item distractors, the incorrect response options in multiple-choice items, and their impact on the overall effectiveness of an item. Additionally, attendees will investigate response frequency, analyzing how often each response option is chosen and what it implies about an item's quality. Through practical activities, participants will apply these concepts to real test data, enabling them to make informed decisions regarding test construction and revision, ultimately leading to more effective classroom assessments.
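As a rough illustration of the statistics covered in this session, the sketch below computes item difficulty (the p-value), an upper-lower discrimination index, and response frequencies for a single item. The function names, the 27% grouping convention, and the data are all invented for this example; they are not materials from the workshop itself.

```python
# Hedged sketch of basic item-analysis statistics: difficulty, discrimination,
# and response frequency. All names and data here are illustrative only.
from collections import Counter

def item_difficulty(responses, key):
    """Proportion of test-takers who answered the item correctly (p-value)."""
    return sum(r == key for r in responses) / len(responses)

def discrimination_index(responses, key, total_scores, group_frac=0.27):
    """Upper-lower D index: p(upper group) - p(lower group).

    Test-takers are ranked by total test score; the top and bottom
    group_frac (27% is a common convention) form the comparison groups.
    """
    n = max(1, int(len(responses) * group_frac))
    ranked = sorted(range(len(responses)), key=lambda i: total_scores[i])
    lower, upper = ranked[:n], ranked[-n:]
    p_upper = sum(responses[i] == key for i in upper) / n
    p_lower = sum(responses[i] == key for i in lower) / n
    return p_upper - p_lower

def response_frequency(responses):
    """How often each option (key and distractors) was chosen."""
    return Counter(responses)

# Invented data: 10 examinees' answers to one item, plus their total scores.
answers = ["B", "B", "C", "B", "A", "B", "D", "B", "C", "B"]
totals = [9, 8, 4, 7, 3, 8, 2, 9, 5, 6]
key = "B"

print(item_difficulty(answers, key))                    # 0.6
print(discrimination_index(answers, key, totals))       # 1.0
print(response_frequency(answers))
```

Here a difficulty of 0.6 means a moderately easy item, and a positive D index means high scorers chose the key more often than low scorers; a distractor that no one (or only high scorers) selects would be a candidate for revision.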

Participants are encouraged to attend both workshops. Due to the limited seating capacity in the computer laboratory, we can only accommodate the first 30 participants for the Part II Workshop.

All HKU Staff are welcome.

  • Speaker(s)
Professor Fred Ganotice, PhD in Educational Psychology

Mr. Wilzon Dizon, MS in Psychology
  • Booking
Bookings are closed for this event.