Workshop Title :
Clinically situated meaningful learning conversations
Workshop Facilitators :
Sandra Kemp

There are many challenges for health professions supervisors and educators when conducting meaningful learning conversations with trainees and students. It is a complex task to focus on feedback and engage in dialogue that helps trainees and students to develop professional, ethical, and appropriate ways of learning and working in clinical settings. This workshop will focus on understanding the challenges of engaging in meaningful learning conversations and will consider the current evidence-based literature. Participants will have the opportunity to develop practical strategies to use in their clinically situated meaningful learning conversations, across a range of issues from clinical contexts, relevant for both face-to-face and online learning conversations.

Workshop Title :
Assessing Professionalism
Workshop Facilitators :
John Norcini

There is a growing awareness of the importance of professionalism and great interest in methods for assessing it. The goal of this workshop is to familiarize participants with the range of methods currently available. In addition, it will focus on one of those methods, patient and peer questionnaires, and address the issues of deciding on content for the questionnaire, determining the scale and scoring procedures, specifying ways of developing reliable scores, and estimating the bias introduced by settings and patients. Active involvement will be encouraged throughout, and small group exercises will focus on defining behaviors associated with professionalism and developing items to capture those behaviors.

Workshop Title :
Reorienting conventional assessment tools to account for collective competence
Workshop Facilitators :
Lorelei Lingard

In small groups, participants will consider how, and whether, the concept of collective competence fits with the existing frameworks of objectives and competencies that inform their assessment practices. Participants will describe the key dimensions of their existing assessment frameworks and where these overlap or conflict with the concept of collective competence. Exploring the consonance and dissonance between current practices and the concept of collective competence, we will work together to develop strategies for integrating collective competence into existing assessment practices.
 
Objectives:
1. To consider how the concept of ‘collective competence’ may be consonant or dissonant with existing assessment approaches and instruments
2. To discuss strategies and challenges related to integrating ‘collective competence’ into existing assessment practices

Workshop Title :
Narrative comments in assessment through human and AI interpretation
Workshop Facilitators :
Lorelei Lingard

Our aims in this workshop will be to review how participants are using narrative comments in their assessment practices, to understand the strengths and limitations of human interpretation of those comments, and to explore the potential of using AI to assist in such interpretation. Participants will work in small groups to sketch the nature and role of narrative comments in their existing assessment practices and articulate key challenges they currently confront. Drawing on existing evidence, we will compare the theoretical strengths and weaknesses of human and AI-based interpretation of narrative assessment comments. Using AI systems to which participants have free access (e.g., ChatGPT, Gemini), we will carry out a hands-on exercise, inputting sample narrative comments into large language models and discussing the output patterns in terms of their usefulness for assessment purposes. The session will culminate in the development of a preliminary list of best practices for educators wishing to explore the use of AI to interpret narrative comments in their programs.
 
Objectives:
1. To review the evidence about narrative comments as assessment data in medical education
2. To recognize the potential and drawbacks of AI-based interpretation of narrative comments for assessment purposes
3. To articulate best practices for incorporating AI tools to support interpretation of narrative assessment comments
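
By way of illustration only (not part of the workshop materials), the sketch below shows one way sample narrative comments might be passed to a large language model through a chat-style API; the model name, prompt wording, summarisation task, and choice of client library are all assumptions made for the purpose of the example.

    # Illustrative sketch only: feeding sample narrative assessment comments to an LLM
    # and asking for a structured summary. The model name, prompt wording, and the
    # choice of the OpenAI Python client are assumptions; any chat-style LLM API would do.
    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

    comments = [
        "Communicates clearly with patients but tends to rush handover to nursing staff.",
        "Strong clinical reasoning; should be pushed to justify management plans more explicitly.",
    ]

    prompt = (
        "You are assisting a clinical competency committee. Summarise the following "
        "narrative assessment comments, noting recurring strengths, concerns, and any "
        "comments too vague to be useful:\n\n"
        + "\n".join(f"- {c}" for c in comments)
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; substitute whatever model is available
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)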

Workshop Title :
Self-assessment and reflection as a means towards improvement
Workshop Facilitators :
Kevin Eva

Self-directed learning and self-regulation are long-standing pillars of the medical profession. Being able to regulate one’s practice has been argued to depend upon accurate self-assessment, with the responsibility for ‘keeping up to date’ resting ultimately with the individual practitioner. Unfortunately, adherence to these axioms persists despite considerable evidence that self-assessment generally offers a poor proxy for actual ability. The discourse of self-assessment supports viewing it not solely as an individual cognitive activity but also as a social activity (i.e., a formative, facilitated, “directed” activity influenced by external resources). As such, self-assessment activities can be more or less valuable for particular purposes in particular contexts. These perspectives have led us to propose prioritization of “directed self-assessment”, a term used to describe self-assessment activities informed by external resources. Research suggests it is an externally supported reflective assessment process that is influenced by context and culture. There is, however, no clear understanding of the activities and influences involved in directed self-assessment, and understanding of best practices continues to evolve. The purpose of this workshop, therefore, will be to explore the use of “directed” self-assessment across the continuum of authentic training and practice settings. In this workshop, participants will:
1. Discuss the evidence from various literatures in an effort to come to a rich understanding of the concept of directed self-assessment, its flaws and strengths;
2. Explore the ways in which directed self-assessment has been used to good effect;
3. Consider how to further move the field towards a positive culture of directed self-assessment for professional practice, faculty development, and teaching/learning.

Workshop Title :
How to use judgment effectively in rater-based assessment
Workshop Facilitators :
Kevin Eva

Establishing highly functioning assessment practices is challenging because proficient performance by health practitioners involves the integration of many different skill sets, including the abilities to communicate effectively, operate within a healthcare team, and bring expert knowledge to bear on the situation encountered. Combine those needs with the fact that the behaviours constituting successful use of those skill sets depend on the specifics of the situation, and it becomes abundantly clear that determining the quality of performance is highly dependent on the judgment of observers. As in any walk of life, judgment regarding competence is fallible, with considerable idiosyncrasy of opinion and susceptibility to influence from many factors that extend well beyond the quality of the performance observed. This workshop will highlight ways in which research techniques and findings from cognitive science have led to the conclusion that assessment protocols in medicine can be improved to a greater extent by design that takes the limitations of human cognition into account than by efforts to fundamentally alter the way in which raters form impressions.

Workshop Title :
Programmatic assessment: what’s it all about?
Workshop Facilitators :
Sylvia Heeneman

Programmatic assessment optimizes assessment at the program level by providing continuous feedback and diverse assessment methods, culminating in high-stakes decisions based on aggregated performance data. This approach integrates learning and decision-making functions and is increasingly adopted in health professions education.
Programmatic assessment viewpoints have become rooted in health professions education: it is ‘talked about’, embraced, or debated. We have insights from the ‘innovators’ and the ‘early adopters’ (the pioneers). Is HPE in the stage of the ‘early majority’ or the ‘late majority’, or is it lagging behind (Rogers’ adoption model)?

Programmatic assessment has definitely ‘blurred’ the more traditional viewpoints and approaches to assessment.

This workshop will explore the principles, curriculum design impacts, and outcomes of programmatic assessment, drawing on literature and practical experiences.

Workshop Title :
Coaching of professional performance in systems of assessment - Managing tensions in coaching versus assessing
Workshop Facilitators :
Sylvia Heeneman

Competency-based medical education requires assessment and feedback systems for learner progress, supported by coaching relationships, described by Telio et al. as “educational alliances”. These alliances ideally offer continuity of coaching, involve a feedback dialogue, and support the learner in following up on feedback and self-regulating their learning. However, tensions can arise in coaching relationships: they require trust, credibility, and a certain continuity. In systems of assessment, such as programmatic assessment, merging feedback with decision-making or involving coaches in high-stakes decisions can create further tensions. This workshop will discuss the educational alliance as part of systems of assessment, drawing on literature and practical experiences.

Workshop Title :
How to use portfolios to support learning?
Workshop Facilitators :
Erik Driessen

Portfolios are widely used in medical education, not only as a source of information for authentic assessment, but also to support the learning of students or trainees. For this reason, portfolios are often used in combination with a form of mentoring or coaching. In this workshop participants will be introduced to the possibilities of portfolios for supporting learning. After a short introduction, we will practice using portfolios to support learning. The workshop will be highly interactive, requiring participants to use portfolio evidence to support learning. The workshop can accommodate participants with different levels of expertise in portfolios.

Objectives
1. To explore the possibilities and pitfalls of portfolios for supporting learning
2. To discuss different mentoring strategies that can be used in portfolio meetings  

Workshop Title :
Making EPAs work: opportunities and limitations
Workshop Facilitators :
Erik Driessen

Entrustable Professional Activities (EPAs) are expected to bridge education and clinical practice in our clinical workplaces, in a language that can be understood by every clinician. Research findings, however, indicate that there is a gap between EPAs’ theoretical potential and the practical realities of day-to-day practice. The aim of this workshop is to explore how we can make EPAs work in our practices. In small groups, we will review why, how, and in which contexts EPAs do (or do not) work. Examples from 17 years of working with EPAs in Dutch postgraduate education will be shared. Sharing their practical and scientific experience with this topic, the workshop moderator and participants will co-produce EPA principles that effectively support learning and assessment in clinical practice.

Objectives
- To understand under what circumstances EPAs can optimally support learning and assessment in practice;
- To know what role individual trainees and preceptors can play in making EPAs work for them.

Workshop Title :
Designing/re-designing a system of assessment
Workshop Facilitators :
Sandra Kemp and John Norcini

For leaders and directors of medical and health professions education programmes, it is important to understand how their systems of assessment for the programme - as a whole - are aligned with the purposes of various stakeholders. The framework outlined in the 2018 Consensus Framework for Good Assessment (Medical Teacher) provides guidance to educators, and the challenge is to apply this in practice in a particular local/national context. This workshop will connect the Consensus Framework guidance with practical approaches to designing or re-designing a system of assessment, and workshop activities will assist leaders of programmes to understand and apply criteria that are important for good systems of assessment.

International Advanced Assessment Masterclass: exploring the cutting edge in health professions assessment

Royal College of Pathologists, 6 Alie Street, London, E1 8QT

3 and 4 June 2025

The 2025 iteration of our International Advanced Assessment Masterclass will reflect current cutting-edge discourses in health professions assessment and teaching, as focus turns to understanding how expert judgement in clinical (workplace) settings impacts on learners and educators.

Our speakers and workshop facilitators are internationally renowned experts in the field and will explore these critical contemporary issues in health professions education.

The plenaries will provide advanced-level coverage of contemporary assessment discourses and best practice, drawing on evidence from the literature, while the workshops will offer opportunities to delve into more challenging areas of assessment.

Aim

This Masterclass is for people involved in health professions education assessment and teaching who wish to broaden and update their perspectives as well as deepen their understanding of specific areas of assessment.

Learning Outcomes

Participants will:

  • gain deeper understanding of critical contemporary discourses in assessment
  • learn to evaluate the latest evidence from the literature, especially relating to developments in best practice
  • identify issues and opportunities in the assessment of challenging areas in healthcare education

Suitability

We have welcomed educators, leaders, managers and regulators from medicine, dentistry, pharmacy, physiotherapy, veterinary medicine and nursing to past courses.

Participants are assumed to have prior knowledge of fundamental assessment principles and experience of assessment practice.

This Masterclass is suitable for those working at both undergraduate and postgraduate levels.

Format

This course will be delivered over two days: four plenary sessions will address key areas of current interest, while the workshops will provide hands-on experience of methods and approaches relating to some of the newer and more challenging assessment areas. Participants will have the opportunity to attend four workshops, during which they will be able to explore areas of particular interest.

Plenaries

Erik Driessen

Professor, Faculty of Health, Medicine and Life Sciences, SHE, Maastricht University, and Editor-in-Chief of the journal ‘Perspectives on Medical Education’

Portfolios: dream or disaster?

Kevin Eva

Associate Director and Scientist in the Centre for Health Education Scholarship, Professor and Director of Educational Research and Scholarship in the Department of Medicine, at the University of British Columbia, Canada and Editor-in-Chief for the journal ‘Medical Education’

The problem with feedback: why, after a century of research, we still can’t get it right

Lorelei Lingard

Professor, Department of Medicine, and Senior Scientist, Centre for Education Research & Innovation, Schulich School of Medicine & Dentistry, Western University, Canada

Interdependence and Collective Competence in clinical training environments

John Norcini

President Emeritus of the Foundation for Advancement of International Medical Education and Research (FAIMER®) and Research Professor in the Department of Psychiatry at Upstate Medical University

Assessment of Professionalism and the development of professional identity

Workshops

1. Designing/re-designing a system of assessment
2. Clinically situated meaningful learning conversations
3. Assessing professionalism
4. Reorienting conventional assessment tools to account for collective competence
5. Narrative comments in assessment through human and AI interpretation
6. Self-assessment and reflection as a means towards improvement
7. How to use judgment effectively in rater-based assessment
8. Programmatic assessment: what’s it all about?
9. Coaching of professional performance in systems of assessment: managing tensions in coaching versus assessing
10. How to use portfolios to support learning
11. Making EPAs work: opportunities and limitations

This course is supported by risr/


At risr/, we provide assessment and lifelong learning through one integrated platform, giving our customers the control and clarity they need to drive better learning. At the heart of the risr/ platform are our robust modules, including risr/assess, which provides everything you need to manage and deliver exams and assessments, from creation, preparation and delivery to analysis, reporting, and feedback.

Visit: https://risr.global
Contact: info@risr.global

Fees

£850 per participant (Super Early Bird)

Please note that payments via Telegraphic Transfer and Stripe will incur an administrative fee. HPAC does not make any income from these fees as they are levied by the respective service providers.


Meet the tutors

Erik Driessen

Professor in the School of Health Professions Education at Maastricht University, the Netherlands, and Editor-in-Chief of the journal Perspectives on Medical Education.

Kevin Eva

Associate Director and Scientist in the Centre for Health Education Scholarship, and Professor and Director of Educational Research and Scholarship in the Department of Medicine, at the University of British Columbia

Sylvia Heeneman

Department of Pathology / School of Health Professions Education, Maastricht University

Sandra Kemp

Deputy Dean, Innovation and Scholarship, Medical Education

Graduate School of Medicine, University of Wollongong, Australia

Lorelei Lingard

Professor in the Department of Medicine and Senior Scientist at the Centre for Education Research & Innovation, both at Western University

John Norcini

Research Professor in the Department of Psychiatry at Upstate Medical University (SUNY), a Fellow of Presence (a Center at Stanford Medical School) and President Emeritus of the Foundation for Advancement of International Medical Education and Research (FAIMER®)