Journal of Primary Health Care
Journal of The Royal New Zealand College of General Practitioners
RESEARCH ARTICLE (Open Access)

Perceptions of the effectiveness of using patient encounter data as an education and reflection tool in general practice training

Linda Klein https://orcid.org/0000-0002-2063-1518 1 2 * , Michael Bentley https://orcid.org/0000-0003-3016-6194 3 , Dominica Moad https://orcid.org/0000-0002-2593-6038 1 2 , Alison Fielding https://orcid.org/0000-0001-5884-3068 1 2 , Amanda Tapley https://orcid.org/0000-0002-1536-5518 1 2 , Mieke van Driel https://orcid.org/0000-0003-1711-9553 4 , Andrew Davey https://orcid.org/0000-0002-7547-779X 1 2 , Ben Mundy https://orcid.org/0000-0001-5574-9375 1 2 , Kristen FitzGerald https://orcid.org/0000-0002-7280-2278 3 , Jennifer Taylor https://orcid.org/0000-0002-5075-6629 2 , Racheal Norris https://orcid.org/0000-0003-2758-6323 1 2 , Elizabeth Holliday https://orcid.org/0000-0002-4066-6224 1 , Parker Magin https://orcid.org/0000-0001-8071-8749 1 2
Author Affiliations

1 School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, University Drive, Callaghan, NSW 2308, Australia.

2 GP Synergy, NSW and ACT Research and Evaluation Unit, Level 1, 20 McIntosh Drive, Mayfield West, NSW 2304, Australia.

3 General Practice Training Tasmania, Level 3, RACT House, 179 Murray Street, Hobart, Tas. 7000, Australia.

4 General Practice Clinical Unit, Faculty of Medicine, The University of Queensland, 288 Herston Road, Brisbane, Qld 4006, Australia.

* Correspondence to: Linda.Klein@racgp.org.au

Handling Editor: Felicity Goodyear-Smith

Journal of Primary Health Care 16(1) 12-20 https://doi.org/10.1071/HC22158
Submitted: 9 January 2023  Accepted: 18 May 2023  Published: 7 June 2023

© 2024 The Author(s) (or their employer(s)). Published by CSIRO Publishing on behalf of The Royal New Zealand College of General Practitioners. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND)

Abstract

Introduction

Patient encounter tools provide feedback and potentially reflection on general practitioner (GP) registrars’ in-practice learning and may contribute to the formative assessment of clinical competencies. However, little is known about the perceived utility of such tools.

Aim

To investigate the perceived utility of a patient encounter tool by GP registrars, their supervisors, and medical educators (MEs).

Methods

General practice registrars, supervisors and MEs from two Australian regional training organisations completed a cross-sectional questionnaire. Registrars rated how Registrar Clinical Encounters in Training (ReCEnT), a patient encounter tool, influenced their reflection on, and change in, clinical practice, learning and training. Supervisors’ and MEs’ perceptions provided contextual information about understanding their registrars’ clinical practice, learning and training needs.

Results

Questionnaires were completed by 48% of registrars (n = 90), 22% of supervisors (n = 182), and 61% of MEs (n = 62). Most registrars agreed that ReCEnT helped them reflect on their clinical practice (79%), learning needs (69%) and training needs (72%). Many registrars reported changing their clinical practice (54%) and learning approaches (51%). Fewer (37%) agreed that ReCEnT influenced them to change their training plans. Most supervisors (68%) and MEs (82%) agreed ReCEnT reports helped them better understand their registrars’ clinical practice. Similarly, most supervisors (63%) and MEs (68%) agreed ReCEnT reports helped them better understand their registrars’ learning and training needs.

Discussion

ReCEnT can prompt self-reflection among registrars, leading to changes in clinical practice, learning approaches and training plans. Reaching its potential as an assessment for learning (as opposed to an assessment of learning) requires effective engagement between registrars, their supervisors and MEs.

Keywords: clinical practice, general practice registrars, health care education, patient encounter data, performance and evaluation, primary health care, professional education, programmatic assessment, reflective practice.

WHAT GAP THIS FILLS
What is already known: In-practice learning is central to GP registrars’ development of competencies within the apprenticeship-style model of GP training. Feedback followed by reflection is important for GP registrars’ in-practice learning. Patient encounter tools can provide clear in-practice feedback to registrars, yet little published evidence is available to support the use of such tools within general practice training.
What this study adds: This study demonstrates that patient encounter tools, such as ReCEnT, can be useful for registrars’ self-reflection on their clinical practice and can lead to changes in practice and learning approaches. The relative lack of engagement between numerous registrars and their supervisors or MEs on ReCEnT feedback reports indicates a missed opportunity for supported reflection and suggests more work is needed on effective engagement for ReCEnT to be used as an assessment for learning in general practice training.

Introduction

In Australia’s apprenticeship-style model of general practitioner (GP) training, registrars (trainees) work under the guidance of an experienced GP supervisor within a broader education program delivered by regional training providers.1 There is a strong emphasis on development and assessment of clinical competencies to be a GP,2 and in-practice learning is central to developing these competencies.3 An important feature of in-practice learning is feedback and reflection on registrars’ clinical practice as well as follow-up of this feedback and reflection with a mentor (clinical supervisor and/or medical educator, ME).4,5

In 2019, as part of an Australian national framework for programmatic assessment within general practice training, one recommendation was that all registrars complete a patient encounter tracking and learning (PETAL) tool across their training terms.6 Such a tool could form a low-stakes assessment for learning in various proposed domains of competency (eg medical knowledge, patient care, communication skills, practice-based learning, etc).7 Acknowledging that there are tensions in ‘simultaneously stimulating the development of competencies and assessing its result,’8 educators argue that competency-based medical education requires assessment for learning as well as assessment of learning, with information and documentation to support this.9 PETALs encompass an audit of consecutive patient consultations, leading to feedback and, thence, prompting reflection on practice and learning. Such tools can be important in assessing registrars’ in-practice clinical exposure, which is accepted as an important determinant of learning.10,11 Yet, there is little published evidence underpinning the use of patient encounter tools within general practice training.

The audit literature provides some evidence of the benefits of feedback and reflection processes that are applicable to PETALs. Typically, audits focus on a particular issue (eg diabetes management) with comparisons made to existing guidelines. Feedback and reflection cycles within audit processes can improve quality and safety, clinical judgement and self-confidence among physicians, particularly when targeting specific practices (eg test ordering).12,13 A recent qualitative study of GPs highlighted the importance of audit and feedback for changing GPs’ practice behaviour and emphasised the added value of formally discussing feedback with other GPs.14 Discussion with peers motivated reflection on change in clinical practice by providing insights into the possible outcomes of change.14

Fundamental to moving from feedback to change in practice is reflection. Although many factors may influence whether feedback is used or not,15 self-reflection and facilitated reflection are viewed as a key component of decision-making toward change.16 Reflection is a ‘strategy for learning’,16 a key intellectual activity that leads to new understanding and appreciation of experience as shown through feedback.17

The Registrar Clinical Encounters in Training (ReCEnT) project is an ongoing (2010–present) educational and research project conducted in Australian GP training. Its principal function is educational as a PETAL. ReCEnT processes are summarised in Box 1 and described more fully elsewhere.18 Briefly, ReCEnT assists registrars, in conjunction with supervisors and MEs, to reflect on their practice and educational needs, and encourages quality improvement.18 Data reporting by registrars takes approximately 2 min per patient on average, cumulatively about 2 h for 60 patients. Currently, ReCEnT is completed by 44% of all Australian registrars in each of their three 6-month mandatory general practice training terms.19

Box 1. The Registrar Clinical Encounters in Training (ReCEnT) project
Data collection
Each registrar completes details of 60 consecutive consultations in each of their three 6-month mandatory general practice training terms, documenting information in categories about:
  • the registrar (eg age group, gender),

  • their patients (eg age group, gender, Indigenous status, primary language),

  • the encounter (eg consultation duration, presenting problems, investigations, management, procedures and learning goals).

Only office-based consultations are recorded; home visits and within-practice clinics (eg immunisation clinics) are excluded.
Report
In each term, registrars receive an individualised feedback report summarising the information, along with ‘prompts to reflection’ related to most feedback topics or areas. The report provides comparisons of a registrar’s results with:
  • their own results over time (ie term-to-term),

  • aggregate registrar data,

  • previously published national data for established GPs where available.

A de-identified copy of an individualised ReCEnT report is available online (see Supplementary File S1).

ReCEnT provides multiple opportunities for reflection: during practice when completing the patient encounter data collection for each patient; individually following receipt of the feedback report; and in discussion with supervisors and MEs.20 An example ReCEnT report is provided as supplementary material online (see Supplementary File S1). Registrars are encouraged to reflect on whether the findings are valid in the context of their usual practice and demographic profile of their patients. Repeating the process in three terms allows registrars to consolidate their clinical experience, providing a valuable ‘time-in-context’ development of competence.21 Educationally, the goal is for registrars to reflect on their clinical experience gaps, clinical behaviours and learning needs,9 which may in turn prompt action and change.20

As ReCEnT has the necessary educational features to enhance learning (ie clear cycles of data collection, feedback, reflection and discussion),20 a case can be made for inclusion of this PETAL within a programmatic assessment framework as an assessment for learning rather than an assessment of learning.4,22 However, there is a gap in knowledge regarding the extent to which ReCEnT is used by GP registrars or by their supervisors and MEs. Indeed, there is a lack of published evidence for the utility of any PETAL as a reflective tool to enhance educational outcomes and improve practice. We aim to address this gap by assessing perceptions of registrars, supervisors and MEs regarding the effectiveness of ReCEnT as an educational tool for registrars to enhance reflection and influence change in practice.

Methods

We invited registrars, supervisors and MEs from two Australian regional training organisations (RTOs), GP Synergy and General Practice Training Tasmania, to complete a cross-sectional questionnaire (including five-point Likert scales), designed to address the research aim and tailored to each respondent group. Inclusion criteria were: all 2020 registrars who had completed two or more rounds of ReCEnT, with their final round (in General Practice Term 3) completed prior to the onset of the COVID-19 pandemic; and all MEs and supervisors who had one or more registrars complete ReCEnT in 2019.

With consent, registrars’ questionnaire data were linked to their routinely collected RTO and ReCEnT demographic and practice data to reduce survey response fatigue. Non-consenting registrars were asked to complete additional demographic/practice questions to supply this information.

To maximise the response rate, registrars received invitations to complete the questionnaire online via email and by hard copy via mail. MEs and supervisors were invited via email only to complete the questionnaire online. Email reminders were sent 1.5 and 3 weeks post-invitation. A $25 gift card was offered to registrars and supervisors for questionnaire completion. MEs (RTO employees) could complete the questionnaire in work time.

Registrars’ perceptions of the effectiveness of ReCEnT were in six domains: extent to which involvement in ReCEnT influenced reflection on their (1) clinical practice, (2) learning needs, and (3) training needs, and influenced change in their (4) clinical practice, (5) approach to learning and/or exam preparation, and (6) training plans (Figs 1 and 2 contain full descriptors).

Fig. 1. Registrar participants’ ratings of how ReCEnT influenced their reflection, by percentage (n = 90).
Fig. 2. Registrar participants’ ratings of how ReCEnT influenced change, by percentage (n = 90).

Supervisor and ME perceptions were elicited to provide contextual information about the usefulness of ReCEnT for themselves in assisting their registrars, specifically the extent to which registrars’ involvement in ReCEnT influenced educators’ understanding of their registrars’ clinical practice and learning and training needs.

Data were analysed descriptively and separately for registrars, supervisors and MEs. Where summarised, Likert scales were dichotomised into agree (Strongly agree + Agree) and not agree (Strongly disagree + Disagree + Neither disagree nor agree). Categorical variables are presented as frequency with percentage and 95% confidence intervals for main findings, whereas continuous variables are presented as mean with standard deviation (s.d.). Analyses were programmed using Stata 15.1.
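The dichotomisation and proportion summaries described above can be sketched as follows. The paper reports that analyses were run in Stata; this Python sketch is purely illustrative (the function names are ours, and the Wald interval is one common choice of estimator; the paper does not state which interval procedure was used).

```python
import math

# Illustrative sketch (not the authors' Stata code): collapse a
# five-point Likert item into agree vs not agree, then summarise
# the agree proportion with a 95% Wald confidence interval.

AGREE = {"Strongly agree", "Agree"}  # dichotomisation used in the paper

def dichotomise(responses):
    """Map each Likert response to True (agree) or False (not agree)."""
    return [r in AGREE for r in responses]

def proportion_ci(k, n, z=1.96):
    """Point estimate and 95% Wald CI for k agreements out of n."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical example: 71 of 90 registrars agreeing with an item
p, lo, hi = proportion_ci(71, 90)
print(f"{p:.0%} (CI: {lo:.0%}-{hi:.0%})")
```

With n = 90, the half-width of such an interval is roughly 8–10 percentage points, which is consistent with the widths of the CIs reported in the Results.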

Ethics approval was gained from the University of Newcastle Human Research Ethics Committee (HREC), approval number H-2020-0103.

Results

Registrars

Questionnaires were sent to 187 registrars, achieving a response rate of 48% (n = 90). Registrar participant characteristics are presented in Table 1.

Table 1. Characteristics of participating registrars, supervisors and MEs.

| Characteristic | Class | Registrars n = 90, n (%) | Supervisors n = 182, n (%) | MEs n = 62, n (%) |
|---|---|---|---|---|
| Regional Training Organisation (RTO) | GP Synergy | 70 (78) | 165 (91) | 46 (74) |
| | GP Training Tasmania | 20 (22) | 17 (9) | 16 (26) |
| Gender (B) | Male | 34 (38) | 96 (58) | 13 (22) |
| | Female | 53 (60) | 68 (41) | 45 (75) |
| | Prefer not to say | 2 (2) | 2 (1) | 2 (3) |
| Age (C) | Mean ± s.d. | 38.4 ± 8.6 | 51.8 ± 10.2 | 42.6 ± 10.3 |
| | (Min, Max) | (20, 61) | (30, 72) | (29, 68) |
| Country of primary medical degree | Australia | 50 (56) | 115 (69) | 54 (90) |
| | Other country | 39 (44) | 51 (31) | 6 (10) |
| Fellowship | FRACGP | 86 (97) | | |
| | FACRRM | 3 (3) | | |
| Training pathway | General | 40 (45) | | |
| | Rural | 49 (55) | | |
| Work time | Full time throughout | 43 (48) | | |
| | Part time throughout | 11 (12) | | |
| | Mix of both | 35 (39) | | |
| Registrars supervised in 2019 (A) | Mean ± s.d. | | 1.8 ± 1.0 | |
| | (Min, Max) | | (1, 8) | |
| Registrars managed in 2019 | Mean ± s.d. | | | 19.7 ± 12.0 |
| | (Min, Max) | | | (1, 50) |
| Practice location | Metro/Inner regional (RA 1–2) | | 113 (68) | 46 (77) |
| | Outer regional/Remote/Very remote (RA 3–5) | | 53 (32) | 14 (23) |
| Experience in rural practice (RA 3–5) | No | 52 (58) | | |
| | Yes | 37 (42) | | |
| Years as supervisor/ME | Mean ± s.d. | | 9.3 ± 8.0 | 5.7 ± 5.3 |
| | (Min, Max) | | (1, 30) | (1, 30) |

A: n = 179 (three supervisors did not provide this information).

B: n = 166 (16 supervisors did not respond to the demographic and practice characteristics questions).

C: n = 164 (a further two supervisors did not supply their age).

Fig. 1 shows that registrars agreed that their participation in ReCEnT helped them reflect on their clinical practice (79%; CI: 71–87%), learning needs (69%; CI: 59–79%) and training needs (72%; CI: 63–81%).

Fig. 2 shows that 54% (CI: 44–64%) of registrars agreed that they had made changes to their clinical practice, and 51% (CI: 41–61%) agreed that they had made changes to their learning approach, including exam preparation. However, only 37% (CI: 27–47%) of registrars agreed that their participation had influenced them to make changes to their training plans.

Regarding engagement with ReCEnT, 89% of registrars recalled spending time reviewing their ReCEnT feedback report by themselves. About half (51%) of registrars reported discussing their report with their supervisor, whereas less than a quarter (22%) discussed their report with their ME. Overall, 44% did not discuss their report with either their supervisor or their ME.

Supervisors and medical educators

The response rate for supervisor questionnaires was 22% (n = 182 of 818 sent). The response rate for ME questionnaires was 61% (n = 62 of 101 sent). Supervisor and ME participant characteristics are presented in Table 1.

Fig. 3 shows that 68% (CI: 61–75%) of participating supervisors agreed that ReCEnT feedback reports helped them better understand their registrars’ clinical practice, and 63% (CI: 56–70%) agreed the reports helped them better understand their registrars’ learning and training needs.

Fig. 3. Supervisor participants’ ratings of how ReCEnT influenced their understanding of their registrar(s), by percentage (n = 182).

Fig. 4 shows that 82% (CI: 72–92%) of participating MEs agreed that ReCEnT feedback reports helped them better understand their registrars’ clinical practice, and 68% (CI: 56–80%) agreed that the reports helped them better understand their registrars’ learning and training needs.

Fig. 4. ME participants’ ratings of how ReCEnT influenced their understanding of their registrar(s), by percentage (n = 62).

Regarding engagement, 60% of supervisors reported reading all their registrars’ ReCEnT feedback reports, whereas 27% reported reading some and 13% did not read any. About 48% of supervisors reported communicating with all their registrars about ReCEnT reports, whereas 35% communicated with some. Communication was typically via face-to-face conversation, within a formal scheduled teaching session (76%) and/or an informal conversation (85%). Email communication was rare (8%).

Many participating MEs (71%) reported reading all their registrars’ ReCEnT reports, 27% reported reading some and 2% did not read any. Only 34% of MEs reported communicating with all their registrars about the ReCEnT feedback report, whereas 57% communicated with only some registrars and 8% did not communicate with any. MEs most commonly communicated via formal phone conversations (87%) or by email (53%). MEs were less likely to report speaking face-to-face with registrars regarding ReCEnT feedback reports, whether formally (41%) or informally (17%).

Discussion

Our study shows that a majority of registrars, supervisors and MEs agreed there are benefits from completing ReCEnT and/or receiving ReCEnT reports. Most registrars agreed ReCEnT helped them reflect on their clinical practice, learning and training needs, and many reported changing their clinical practice and learning approaches. Supervisors and MEs reported better understanding of their registrars’ clinical practice and learning and training needs.

Interpretation of findings and comparison with previous research

Reflection

The ability to critically reflect increases learning in graduate medical education.23 As a reflection tool, ReCEnT was highly rated by registrars in helping them to reflect on their clinical practice, learning and training needs. In addition, ReCEnT helped supervisors and MEs better understand their registrars’ clinical practice and learning and training needs. Registrars are encouraged to reflect on whether their ReCEnT reports represent their usual practice and whether particular findings were affected by factors such as the demographics of their personal patient population or that of their current teaching practice, their personal clinical approach, and their training practice procedures or culture.

The findings support the utility of ReCEnT as an iterative process for reflection.24 As there is a gap between submitting their information and receiving their report (ie 2–3 weeks), it is an example of delayed reflection-on-action,24 which is perceived in postgraduate training as aiding self-reflection.25 Examples of reflection-on-action include attempting to select patients with more varied demographic backgrounds, address time-efficiency, manage in-consultation help-seeking, and place more emphasis on rational test-ordering or prescribing.26 Some registrars have reported previously26 that in the process of completing a subsequent ReCEnT round, they used reflection-in-action, where they can make immediate changes.24

Engagement

Registrars were well acquainted with ReCEnT as they completed two or more rounds during their training terms. Although most registrars self-reflected on their reports, only half the registrars discussed their report with either their supervisor, ME or both – indicating the remaining registrars missed a key feature of enhancing reflection and change. This variability in seeking feedback from a supervisor or ME highlights a need to explore how engagement with a patient encounter tool can lead to more effective registrar reflection and feedback, rather than just being a perfunctory exercise.27,28

Most supervisors and MEs read all their registrars’ reports; however, less than half communicated with their registrars about the reports – again missing a key opportunity to enhance reflection and change in their registrars. Although registrars are encouraged to discuss the feedback with their educators, this can be difficult for some registrars to initiate. It has been suggested that putting formal processes in place at the outset of training could ensure effective use of feedback through discussion,29–31 ensuring that this key aspect of reflection is not missed.

Influencing change

In conjunction with reflection on and engagement with ReCEnT, more than half of registrars agreed they had made changes to their clinical practice and learning approach. This finding is supportive of ‘deeper’ learning during the use of a PETAL, resulting in change.32 Interestingly, far fewer registrars agreed that their participation had influenced them to change their training plans. Training plans are often made earlier in training33 and can be outside of registrars’ control.

Assessment for learning

Our results show that ReCEnT is perceived as an effective tool for registrars, MEs and supervisors in understanding registrars’ learning. This is consistent with other research showing the value of reflection for learning.34 For ReCEnT to be a component of programmatic assessment within GP training, it is best used as an assessment for learning rather than of learning.7 However, further research is needed to better understand how such a patient encounter tool fits into a programmatic assessment framework.

Implications for policy and practice

This research suggests that ReCEnT can prompt self-reflection, and that reflection may lead to changes in clinical practice, learning approaches and/or training plans. However, realising ReCEnT’s potential as an assessment for learning will require more effective engagement between registrars, supervisors and educators. This may require enhanced supervisor and ME training, as the responsibility for initiating engagement may need to come from educators, especially within a framework of programmatic assessment.

Implications for future research

Further research is required to understand in greater detail the experiences of registrars, supervisors and MEs in using PETAL tools, and the impact they have on building self-reflection skills in GP registrars.

Further research would also assist in developing frameworks for supervisors and MEs in supporting registrar participation in self-reflection, and how PETALs can be used within a programmatic assessment model.

Strengths and limitations

A strength of this study is that it was conducted across two of the three Australian RTOs involved with ReCEnT. These RTOs are responsible for training 36% of all Australian registrars in general practice19 and have a demographic and geographic presence across the range of Australian GP vocational training.

A significant limitation introduced by the COVID-19 pandemic was the restricted pool of registrars available for survey completion. The sampling frame was limited to registrars who had last completed a round of ReCEnT prior to the pandemic, to avoid confounding of survey responses by the impacts of COVID-19 on GPs in training,35 such as the introduction of telehealth, which would change the patient encounter data.36,37 Given this limitation, the response rate for registrars (48%) was positive for a GP questionnaire.38 By comparison, the National Registrar Survey achieved a 28% response rate during the COVID-19 pandemic.19 Nevertheless, the absolute numbers available for analysis were relatively small, which limits the precision of point estimates in our findings.

Although the response rate for MEs (61%) was positive, the response rate for supervisors (22%) was small, even though consistent with other surveys conducted with GPs.38 The relatively low supervisor response rate limits generalisation regarding supervisor perceptions.

Another limitation is that registrars’ recall of their experiences with ReCEnT from 2019 (6–12 months earlier) may have introduced recall bias. However, most registrars had completed three rounds of ReCEnT, which would have reinforced their familiarity with the process. Recall bias may have been less of an issue for supervisors and MEs, who continued to see registrars who had completed their previous round of ReCEnT in 2019.

Conclusions

The positive responses from registrars, supervisors and MEs regarding the utility of ReCEnT for reflection and learning support its use as an educational patient encounter tool for reflection and action (ie change in learning approach and clinical practice), and add to the sparse literature on this topic. To reach its potential as a tool for effective feedback on registrars’ clinical practice and learning and training needs, and thus as an assessment for learning, effective engagement between registrars, their supervisors and MEs is required. Further qualitative research would provide a deeper understanding of the potential for using patient encounter tools in programmatic assessment of general practice training.

Supplementary material

Example of a typical 14-page ReCEnT report for a registrar. Supplementary material is available online.

Data availability

The data that support this study cannot be publicly shared due to ethical or privacy reasons.

Conflicts of interest

Several authors on this paper are investigators on the ReCEnT project and, therefore, declare an interest in the project that gave rise to this study. Specifically, Parker Magin, Alison Fielding, Andrew Davey, Amanda Tapley, Ben Mundy and Dominica Moad are involved in the day-to-day administration of the ReCEnT project, including its promotion and distribution to general practice trainees. The other authors declare no conflicts of interest.

Declaration of funding

This research project was supported by the Royal Australian College of General Practitioners (as part of the 2020 Education Research Grant program) with funding from the Australian Government under the Australian General Practice Training Program.

References

Hays RB, Morgan S. Australian and overseas models of general practice training. Med J Aust 2011; 194(11): S63-4 PMID: 21644855.
| Crossref | Google Scholar |

Royal Australian College of General Practitioners. The Clinical Competencies for the CCE. East Melbourne, Vic.: Royal Australian College of General Practitioners; 2021. Available at https://www.racgp.org.au/education/registrars/fracgp-exams/clinical-competency-exam/the-clinical-competencies-for-the-cce/the-clinical-competencies-for-the-cce [accessed 21 December 2022].

Kolb DA. Experiential learning: experience as the source of learning and development, 2nd edn. NJ: Pearson Education; 2015.

van der Vleuten CPM, Schuwirth LWT, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach 2012; 34(3): 205-14.
| Crossref | Google Scholar |

van der Vleuten CPM, Schuwirth LWT, Driessen EW, et al. Twelve tips for programmatic assessment. Med Teach 2015; 37(7): 641-6.
| Crossref | Google Scholar |

GPEx. Workplace-Based Assessment Framework for General Practice Training and Education. Adelaide, SA: GPEx; 2019. Available at https://gpex.com.au/developing-an-evidence-based-practical-and-contextualised-workplace-based-assessment-framework/ [accessed 21 December 2022].

Schuwirth LWT, van der Vleuten CPM. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach 2011; 33(6): 478-85.
| Crossref | Google Scholar |

Schut S, Maggio LA, Heeneman S, et al. Where the rubber meets the road — An integrative review of programmatic assessment in health care professions education. Perspect Med Educ 2021; 10(1): 6-13.
| Crossref | Google Scholar |

Lockyer J, Carraccio C, Chan M-K, et al. Core principles of assessment in competency-based medical education. Med Teach 2017; 39(6): 609-16.
| Crossref | Google Scholar |

10  de Jong J, Visser M, Van Dijk N, et al. A systematic review of the relationship between patient mix and learning in work-based clinical settings. A BEME systematic review: BEME Guide No. 24. Med Teach 2013; 35(6): e1181-96.
| Crossref | Google Scholar |

11  de Jong J, Visser MR, Mohrs J, et al. Opening the black box: the patient mix of GP trainees. Br J Gen Pract 2011; 61(591): e650-7.
| Crossref | Google Scholar |

12  Flottorp SA, Jamtvedt G, Gibis B, et al. Using audit and feedback to health professionals to improve the quality and safety of health care. World Health Organization; 2010. Available at https://apps.who.int/iris/handle/10665/332014 [accessed 21 December 2022].

13  Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev 2012; 2012(6): CD000259.
| Crossref | Google Scholar |

14  van Braak M, Visser M, Holtrop M, et al. What motivates general practitioners to change practice behaviour? A qualitative study of audit and feedback group sessions in Dutch general practice. BMJ Open 2019; 9(5): e025286.
| Crossref | Google Scholar |

15  Brehaut JC, Colquhoun HL, Eva KW, et al. Practice feedback interventions: 15 suggestions for optimizing effectiveness. Ann Intern Med 2016; 164(6): 435-41.
| Crossref | Google Scholar |

16  Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback: developing an evidence-and theory-based model that builds relationship, explores reactions and content, and coaches for performance change (R2C2). Acad Med 2015; 90(12): 1698-706.
| Crossref | Google Scholar |

17  Boud D, Keogh R, Walker D. Promoting reflection in learning: a model. In: Boud D, Keogh R, Walker D, editors. Reflection: Turning experience into learning. London: Routledge; 2013. pp. 18–40.

18  Davey A, Tapley A, van Driel M, et al. The registrar clinical encounters in training (ReCEnT) cohort study: updated protocol. BMC Prim Care 2022; 23: 328.

19  Taylor R, Clarke L, Radloff A. Australian General Practice Training Program: National report on the 2021 National Registrar Survey. Australian Council for Educational Research; 2021. Available at https://research.acer.edu.au/cgi/viewcontent.cgi?article=1076&context=higher_education [accessed 21 December 2022].

20  Morgan S, Henderson K, et al. How we use patient encounter data for reflective learning in family medicine training. Med Teach 2015; 37(10): 897-900.

21  Teunissen PW, Kogan JR, Ten Cate O, et al. Learning in practice: a valuation of context in time-variable medical training. Acad Med 2018; 93(3): S22-6.

22  Bok HGJ, Teunissen PW, Favier RP, et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ 2013; 13: 123.

23  Winkel AF, Yingling S, Jones A-A, et al. Reflection as a learning tool in graduate medical education: a systematic review. J Grad Med Educ 2017; 9(4): 430-9.

24  Mann KV. Reflection’s role in learning: increasing engagement and deepening participation. Perspect Med Educ 2016; 5(5): 259-61.

25  Embo MPC, Driessen E, Valcke M, et al. Scaffolding reflective learning in clinical practice. Med Teach 2014; 36(7): 602-7.

26  Magin P, Morgan S, Henderson K, et al. The Registrars’ Clinical Encounters in Training (ReCEnT) project: educational and research aspects of documenting general practice trainees’ clinical experience. Aust Fam Physician 2015; 44(9): 681-4 https://search.informit.org/doi/10.3316/informit.513249323722850.

27  Garth B, Kirby C, Silberberg P, et al. Utility of learning plans in general practice vocational training: a mixed-methods national study of registrar, supervisor, and educator perspectives. BMC Med Educ 2016; 16: 211.

28  De la Croix A, Veen M. The reflective zombie: problematizing the conceptual framework of reflection in medical education. Perspect Med Educ 2018; 7(6): 394-400.

29  Pelgrim EAM, Kramer AWM, Mokkink HGA, et al. The process of feedback in workplace-based assessment: organisation, delivery, continuity. Med Educ 2012; 46(6): 604-12.

30  Sturman N, Fitzmaurice L, Ingham G, et al. Getting good help: a guide for reflection, debriefing and feedback conversations about in-consultation supervision. Educ Prim Care 2021; 32(2): 118-22.

31  Wearne S, Brown J. GP supervisors assessing GP registrars – theory and practice. Aust Fam Physician 2014; 43(12): 887-91. PMID: 25705742.

32  Webb ME, Fluck A, Magenheim J, et al. Machine learning for human learners: opportunities, issues, tensions and threats. Educ Technol Res Dev 2021; 69(4): 2109-30.

33  Tran M, Wearne S, Tapley A, et al. Transitions in general practice training: quantifying epidemiological variation in trainees’ experiences and clinical behaviours. BMC Med Educ 2022; 22: 124.

34  Mann K, Gordon J, MacLeod A. Reflection and reflective practice in health professions education: a systematic review. Adv Health Sci Educ 2009; 14(4): 595-621. PMID: 18034364.

35  White I, Benson J, Elliott T, et al. Australian general practice registrars’ experiences of training, well-being and support during the COVID-19 pandemic: a qualitative study. BMJ Open 2022; 12(6): e060307.

36  Snoswell CL, Caffery LJ, Haydon HM, et al. Telehealth uptake in general practice as a result of the coronavirus (COVID-19) pandemic. Aust Health Rev 2020; 44(5): 737-40.

37  Taylor A, Caffery LJ, Gesesew HA, et al. How Australian health care services adapted to telehealth during the COVID-19 pandemic: a survey of telehealth professionals. Front Public Health 2021; 9: 648009.

38  Bonevski B, Magin P, Horton G, et al. Response rates in GP surveys: trialling two recruitment strategies. Aust Fam Physician 2011; 40(6): 427-30. PMID: 21655493.