Applying an after-action review process to examine a complex public health response in New South Wales (NSW), Australia: lessons for reflective practice
Caroline H. Sharpe*, Alexander Willems, Amanda Robinson, Tove Lysa Fitzgerald, Julie Letts, Craig Dalton and Andrew J. Milat
Abstract
After-action reviews (AARs) are used to systematically examine the functions, capabilities and barriers impacting effective pandemic responses. This paper describes the methods used for and the lessons learnt from undertaking the first formal state-wide AAR of the public health response to COVID-19 in New South Wales (NSW), Australia.
A state-wide AAR was applied to examine the public health response to COVID-19 conducted by NSW Health from January 2020 until May 2022.
The AAR was conducted between March and November 2022. The World Health Organization AAR approach was used, comprising six stages: (1) AAR design, (2) AAR planning, (3) team debriefs, (4) workshop preparation, (5) consensus workshop and (6) AAR report review and finalisation.
The AAR process involved over 100 people across the NSW public health network through surveys, team debriefs and workshops. The stepped process used to complete the review, with standardised templates, was found to be acceptable and feasible. The preparatory stage elicited important insights, provided an opportunity for structured reflection and helped identify themes for discussion in the workshop. Feedback methods included two participant satisfaction surveys and one post-implementation review session, which identified strengths in the process and areas that could be modified for future public health reviews in NSW.
The AAR process successfully engaged multi-disciplinary pandemic response staff in a systematic reflection process. The process was perceived by most participants as a highly valuable opportunity to reflect and it led to important findings to improve public health emergency responses. It is important that the scope of the AAR is well understood by participants and that the psychological needs of the workforce are considered in the AAR process. There is merit in applying such reviews as standard practice in future public health emergencies.
Keywords: after-action review, COVID-19, emergency preparedness, operational review, public health debrief, quality improvement, reflective practice, resilience planning.
Introduction
The response to COVID-19 by the New South Wales (NSW) public health network
The NSW public health network (‘the network’) operates using a decentralised ‘hub and spoke’ model, with the NSW Ministry of Health (‘the Ministry’) functioning as the ‘hub’ and 17 public health units (PHUs) within 15 Local Health Districts (LHDs) functioning as the ‘spokes’. Coordination of the NSW public health COVID-19 pandemic response (‘the response’) was undertaken centrally by the Ministry. Local PHUs had primary responsibility for managing responses within their respective LHDs.
At the peak of the response in NSW (September 2021, Delta outbreak), staff numbers surged to over 1800 full-time equivalent staff.1 By June 2022, this workforce had contracted significantly as community vaccination rates exceeded 94%,2 COVID-19-related hospitalisations had decreased relative to the periodically high community disease prevalence, and many aspects of intense public health action, such as contact tracing and community lockdowns, had eased. More information about the NSW response is detailed in a comprehensive debrief report.1
The application of after-action reviews (AARs) in the NSW context
The World Health Organization (WHO) defines AARs as ‘a qualitative review of actions taken in response to an event of public health concern’.3 WHO’s Guidance for After Action Review,3 informed by a systematic review of AAR methods used across health and other disciplines, provides an outline with templates and resources to support organisations in conducting AARs. A variety of methodologies are available; however, the common goals of AARs are to ‘identify strengths, best practices, gaps and lessons’ to inform future outbreak responses.4 The key steps in discovering findings involve reviewing what happened, what went well, what did not go well and what can be done better to prepare for a future event. AARs are one of many quality improvement methods that can be used to review both large-scale and more contained incident management responses including disease outbreaks. Other formal methods include facilitated lookbacks, root-cause analyses, the peer assessment approach, post-event analyses and critical incident reviews.5,6
Following the start of the pandemic, WHO urged countries to undertake intra-action reviews (IARs) and AARs of their responses to systematically examine the functions, capabilities and barriers affecting an effective pandemic response.7 IARs/AARs have since been applied globally in the context of the COVID-19 pandemic, with over 150 IARs conducted or planned by member states.8 Recent large-scale AARs using similar methods to those presented here have been published,9 but few provide much practical detail on delivering an AAR. In early 2022, NSW Health began undertaking an AAR of the NSW public health response to COVID-19. This was the first formal AAR completed at a state-wide level. This AAR explicitly aimed to inform a broader debrief of the public health response in NSW, Australia.1
This paper describes the methods used for and the lessons learnt about the process of undertaking the AAR by NSW Health.
Methods
Preparatory stage
NSW Health’s approach for this AAR primarily followed WHO’s guidance3 but also drew on related guidance documents, including a guide to conducting AARs by author CD.10–12 Because of the scale of the response and the number of stakeholders involved, an AAR methodology most closely aligned to the ‘working group format’ was chosen.3 The aim was to deliver an efficient and focused face-to-face workshop for representatives from across the network. This format also enabled the sharing of participants’ experiences, which was considered important for staff who had had limited opportunity for group reflection and debriefing.
The scope of the AAR was limited to the core public health functions of the response. The functional areas selected were workforce, governance, and surveillance and reporting. The timeframe under consideration was from January 2020 until May 2022.
An Organising Committee (OC) was convened to oversee the planning and execution of the AAR. Membership was multidisciplinary and included public health nurses, public health physicians, a public health officer trainee, a contact tracing team manager and a director of public health research and evaluation. OC members were selected because of their involvement in the response, capacity to undertake the review and prior experience in conducting AARs, as well as by nomination from LHDs. The OC prepared supporting materials including local facilitator discussion guides, a staff pre-survey (Supplementary File S1), a response template (Supplementary File S2), instructional videos and a timeline of the response as an ‘aide memoire’ to assist teams. The OC also identified external consultants to support the AAR process and facilitate the workshop.
‘Teams’ were defined according to each response unit within the Ministry (e.g. operations) and each PHU workforce. Given the limited capacity among staff, the scope of the review and the timeframe for the AAR to be completed, a stepped approach was taken to gather perspectives (Fig. 1). The aim was to collect as much information as possible from all teams before the workshop. The workshop was then used to develop a consensus on what the priority issues were and what actions could be taken.
To achieve this, teams were directed to coordinate a pre-workshop debrief process and collate staff perspectives on each functional area for input into a standardised template. The template outlined the scope of each functional area and provided guiding questions to focus discussions. Teams were provided with the supporting materials and given 4 weeks to undertake their debriefs and submit the completed response templates. Responses were then analysed and thematically coded by external consultants, and the themes that emerged were used to formulate the specific questions for the workshop.
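As a purely illustrative sketch of how completed response templates could be collated before thematic coding, the structure below uses hypothetical field names and example values; it does not reproduce the actual template (Supplementary File S2) or the consultants’ coding method.

```python
# Hypothetical sketch: representing and collating completed team response templates
# ahead of thematic coding. Field names and examples are illustrative only and do not
# reproduce the actual template (Supplementary File S2).
from dataclasses import dataclass, field

@dataclass
class FunctionalAreaResponse:
    area: str  # e.g. "workforce", "governance", "surveillance and reporting"
    went_well: list[str] = field(default_factory=list)
    did_not_go_well: list[str] = field(default_factory=list)
    improvements: list[str] = field(default_factory=list)

@dataclass
class TeamResponse:
    team: str  # e.g. a PHU or a Ministry response unit
    areas: list[FunctionalAreaResponse] = field(default_factory=list)

def comments_for_area(responses: list[TeamResponse], area: str) -> list[str]:
    """Pool all free-text comments submitted for one functional area across teams."""
    pooled: list[str] = []
    for response in responses:
        for fa in response.areas:
            if fa.area == area:
                pooled.extend(fa.went_well + fa.did_not_go_well + fa.improvements)
    return pooled
```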
State-wide workshop
The workshop was conducted by external facilitators, and attendees included LHD- and Ministry-nominated network representatives (minimum of one representative per team). Attendance at the workshop was limited to 30 representatives given the complexity of the topic and the need to ensure contributions could be made by all participants. The workshop structure included alternating plenary and break-out sessions to enable in-depth discussions. Discussion points were documented on flip charts and supplemented by notetakers to ensure a thorough written record of key points was made. All notes captured on the day were collated and thematically analysed to identify recommendations. A final report documenting the AAR process and findings was provided to the network.
Feedback was collected to support future AAR processes. This involved two electronic surveys using Likert scales (Supplementary File S3): the ‘AAR process’ survey evaluated participants’ views on the helpfulness of the pre-workshop resource package and the overall AAR process, and the ‘workshop’ survey evaluated satisfaction with the AAR workshop itself. Each survey also had a free-text option for general feedback. Both surveys were circulated after completion of the workshop, and the mode of responses to each item was calculated. Following completion of the AAR process, a structured 60-min post-implementation review session was conducted with OC members to reflect on the strengths, weaknesses and key lessons learned from the AAR process.13 Key learnings were recorded and agreed upon by members.
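As a minimal illustration of this analysis, the modal response for each Likert item can be calculated as in the sketch below; the item wording and ratings shown are hypothetical and are not drawn from the actual surveys.

```python
# Minimal sketch: calculating the modal (most frequent) Likert rating per survey item.
# Item wording and ratings are hypothetical examples, not actual AAR survey data.
from statistics import mode

likert_responses = {
    # Ratings on a 1-5 scale (1 = strongly disagree, 5 = strongly agree)
    "The supporting materials helped me prepare for the team debrief": [5, 4, 5, 4, 3, 5, 4],
    "I was satisfied with the format of the AAR workshop": [4, 5, 4, 4, 3, 5],
}

for item, ratings in likert_responses.items():
    print(f"{item}: mode = {mode(ratings)} (n = {len(ratings)})")
```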
Results
Over 100 staff (~10% of the workforce) contributed to the team debriefs, and 27 staff participated in the AAR workshop, with another six OC representatives acting as observers and notetakers. Representation was broad and included public health physicians and nurses, epidemiologists and data analysts, policy officers, program managers and executive managers.
Preparatory stage
Before the workshop, teams submitted 24 response templates comprising over 200 pages of content (Supplementary File S2). Thematic content analysis identified five common themes (Fig. 2).
State-wide workshop
Workshop participants developed 21 areas for recommendations, many of which overlapped across the three functional areas and five themes (Fig. 2). However, insufficient time was available in the final session of the workshop to complete the prioritisation and consolidation of overlapping recommendations. Because this AAR was conducted alongside a broader debrief of the response, its findings and proposed areas for recommendations were cross-checked against the debrief findings.1 Encouragingly, the insights and recommendations elicited through this AAR were highly consistent with, and incorporated into, the broader debrief findings, providing confidence that both processes had identified appropriate areas for strengthening future public health responses.1
Feedback
Most feedback regarding the AAR process and workshop collected through the participant surveys was very positive. Twenty-nine (of >100) staff completed the AAR process survey, and 12 (of 27) staff completed the workshop survey. No OC members completed the surveys. Of respondents to the AAR process survey, 93% (27/29) felt the supporting materials helped them understand the AAR process and prepare for the team debriefs. Similarly, 92% (11/12) and 83% (10/12) of workshop survey respondents were satisfied with the workshop format and facilitation, respectively. Suggestions for improvement included giving participants more input into the AAR scope, allowing more time and resources for team debriefs, and including more staff in team sessions. Some participants advocated that AARs should be mandatory practice for all major state and local public health responses.
The key topics discussed in the post-implementation OC review session and the reflections that surfaced are outlined in Supplementary File S4. A summary is provided in Fig. 3.
Discussion
The NSW COVID-19 AAR was implemented with strong support from the network and provided useful insights that aligned very closely with the findings of a broader debrief of the response.1 The scale of the response presented challenges in relation to the scope and depth of the AAR. Our scope was shaped by the objectives of the concurrent broader debrief. Although other reviews had occurred in the previous 2 years, and processes had evolved to optimise response activities, a formal state-wide AAR of the response had not been conducted. The long timeframe (2020–2022), the breadth of the scope, competing priorities, limited team capacity and time pressures may have limited the depth with which issues were explored. However, the recommended areas for improvement we identified mirror those commonly found in other reviews.14
Our experience identified the challenges in determining the right number of participants for AAR processes, especially with a large response. There was a need to balance the number of participants with the likelihood of productive discussions, as well as to ensure all relevant functional areas were represented in workshops.
As there was limited practical experience in planning AARs to draw on, this AAR process was based on WHO guidance, which was supplemented by other documentation including a guide to conducting AARs by author CD.10–12 Adaptation of guidance to context has been used by other authors, though recent evaluations of AAR methods in the health system context are scant.9,15 Deciding on the best approach to delivering the AAR in the NSW context was challenging, and the working group format required considerable resources during an ongoing, though declining, response.
The OC provided the necessary oversight of the process and enabled different perspectives to inform the design of the AAR, including how workshop questions should be framed. In addition, seeking OC nominations from PHUs helped secure staff buy-in for the AAR process and revealed how team participation could be supported, such as what types of pre-workshop resources would be helpful. All OC members had experience working in the response, which greatly assisted their understanding of the operational environment and allowed thorough briefing of the external consultants.
The OC decided early in the planning process to use external consultants to facilitate the workshop to provide an appropriate level of neutrality and enable more open discussion among participants. The external consultants selected had substantial experience in facilitating complex workshops and included former senior NSW Health officials. The OC co-designed the workshop approach with the consultants; however, it was sometimes challenging for the consultants to fully understand public health functions and the pandemic response context, and additional input was required from the OC when interpreting workshop results. Finding the balance between facilitators having enough knowledge to support the process and avoiding bias arising from involvement in the incident is a clear challenge in conducting AARs, and depends on the context and resources under which the AAR is occurring.
Meeting our objective of gathering comprehensive information ahead of the workshop was resource intensive. Developing the pre-workshop materials reduced the time available for the team debriefs, which potentially constrained discussions.
Because many workshop participants were meeting for the first time, and because of the number of themes for discussion, much of the workshop time was spent reviewing the insights elicited through the team debriefs. This was necessary to create collegiality and an atmosphere of shared experience; however, more time could have been allocated to developing recommendations. The workshop could have been expanded to run over 2 days, or as several multi-day meetings, particularly given the size of the response. In our context, the AAR was not a standalone process, and other concurrent reviews captured very similar reflections and recommendations, so major gaps are unlikely. Nonetheless, allowing enough time for the full AAR process is imperative to achieving the objectives of an AAR.
Ensuring the intent of the AAR was clear to participants was a recurring issue. The AAR needed to provide a safe space for stressful experiences to be surfaced while also framing discussions around learnings and defining a set of recommendations to strengthen future responses. The intent of the AAR was therefore to learn rather than to directly address the psychological needs of staff (debriefing-to-learn rather than debriefing-to-manage or debriefing-to-treat),16 but both needed to be balanced in the process.
After the workshop, managers raised an ongoing need for staff to psychologically debrief on their professional experiences. As a result, a series of sessions facilitated by an experienced trauma-informed social worker was organised for staff, separate from the AAR process. Organisations conducting AARs need to acknowledge and plan for the emotional aspects of debriefings and ensure appropriate support and expertise are in place.
AARs play a critical part in learning from responses. WHO recommends that AARs be incorporated into standard procedures for public health authorities globally, in line with the International Health Regulations Monitoring and Evaluation Framework.17 Conducting AARs is important to public health practice, alongside conducting recurring simulation exercises ‘during peacetime’.4 However, it is important to recognise there are many alternative models to assess the quality of response capabilities, including assessment protocols, capacity assessments and critical incident registries.18–20
One of the reported challenges of conducting AARs is acting upon their findings.13 NSW Health has begun implementing some of the findings of the AAR and broader debrief, but ongoing and dedicated resourcing will be required to monitor and report on the implementation of the recommendations. The obvious limitations of this AAR are that it is a single review and the first to be conducted at a state-wide level in NSW. The methods and findings have not been formally evaluated, but they do align with other published reviews, acknowledging that the practice and reporting of AARs is inconsistent.14 Another limitation is that only a proportion of participants provided feedback on the AAR process and workshop (~30% and 44% of participants, respectively). These respondents may not be representative of all participants (introducing selection bias), and the AAR methodology chosen may not have been as acceptable to the network as the findings suggest.
Lessons learnt
Participation in AARs can offer an important opportunity for collective reflection and debriefing following intensive periods of emergency response.
It is important that the scope of the AAR is well understood by staff and further support is made available to address the psychological needs of the workforce, both during and after a response period.
Education and training of the public health workforce to implement AARs and other quality improvement processes is an essential component of emergency preparedness and will strengthen the effectiveness of future responses and support staff resilience.
Conclusion
The AAR process can be a feasible and appropriate method for systematic reflection on public health emergencies. The experience of the AAR and the resources that were developed will provide a foundation for the NSW public health network to conduct AARs more easily and efficiently in the future. It is hoped this article will provide practical steps and experiences to support other organisations to undertake their own similar reviews and critical reflections of public health events.
Data availability
All data generated/analysed during this study are available from the author on reasonable request.
Conflicts of interest
AM is an Editorial Board Member of Public Health Research & Practice, but was not involved in the peer review or decision-making process for this paper. There are no further conflicts to declare.
Declaration of funding
This work was funded by the NSW Ministry of Health, St Leonards, New South Wales, Australia.
Acknowledgements
The authors would like to acknowledge all New South Wales public health network staff who contributed to the after-action review. This work was completed while Alexander Willems was employed as a trainee in the NSW Public Health Training Program funded by the NSW Ministry of Health. He undertook this work while based at the Centre for Epidemiology and Evidence.
Author contributions
CS led the AAR process and delivery. CS, AW and AM drafted the manuscript. All authors were part of the AAR Organising Committee. JL led the post-implementation review with the Organising Committee and CD was a methodological adviser. All authors reviewed/edited the manuscript.
References
1 NSW Ministry of Health. Public Health – NSW COVID-19 Response. Sydney: NSW Ministry of Health, Population and Public Health Division; 2023. Available at www.health.nsw.gov.au/Infectious/covid-19/evidence-hub/Publications/phr-report.pdf [cited 18 May 2023].
2 NSW Health. COVID-19 Weekly Data Overview – Epidemiological week 25, ending 25 June 2022. Sydney: NSW Government; 2022. Available at www.health.nsw.gov.au/Infectious/covid-19/Documents/weekly-covid-overview-20220625.pdf [cited 2 November 2022].
3 World Health Organization. Guidance for after action review (AAR). Geneva: WHO; 2019. Available at www.who.int/publications/i/item/WHO-WHE-CPI-2019.4 [cited 30 January 2022].
4 World Health Organization. After Action Reviews and Simulation Exercises under the International Health Regulations 2005 M&E Framework (IHR MEF). Geneva: WHO; 2018. Available at https://extranet.who.int/sph/sites/default/files/document-library/document/WHO-WHE-CPI-2018.48-eng.pdf [cited 12 February 2022].
5 Aledort JE, Lurie N, Ricci KA, Dausey DJ, Howard S. Facilitated Look-Backs: A New Quality Improvement Tool for Management of Routine Annual and Pandemic Influenza. Santa Monica, CA: RAND Corporation; 2006. Available at www.rand.org/pubs/technical_reports/TR320.html [cited 9 November 2022].
6 Singleton CM, Debastiani S, Rose D, Kahn EB. An analysis of root cause identification and continuous quality improvement in public health H1N1 after-action reports. J Public Health Manag Pract 2014; 20(2): 197-204.
7 Mayigane LN, de Vázquez CC, Vente C, Charles D, Copper FA, Bell A, et al. The necessity for intra-action reviews during the COVID-19 pandemic. Lancet Glob Health 2020; 8: e1451-e1452.
8 World Health Organization. Intra-Action Review (IAR). Strategic Partnership for Health Security and Emergency Preparedness (SPH) Portal. Geneva: WHO; 2023. Available at https://extranet.who.int/sph/intra-action-review
9 Zhelyazkova A, Fischer PM, Thies N, Schrader-Reichling JS, Kohlmann T, Adorjan K, et al. COVID-19 management at one of the largest hospitals in Germany: Concept, evaluation and adaptation. Health Serv Manage Res 2022; 36(1): 63-74.
10 European Centre for Disease Prevention and Control. One-day in-action review (IAR) protocol in the context of COVID-19. Stockholm: ECDC; 2021. Available at www.ecdc.europa.eu/sites/default/files/documents/One-day-in-action-review-protocol.pdf [cited 28 May 2023].
11 World Health Organization. Guidance for conducting a country COVID-19 intra-action review (IAR). Geneva: WHO; 2020. Available at https://iris.who.int/bitstream/handle/10665/333419/WHO-2019-nCoV-Country_IAR-2020.1-eng.pdf?sequence=1 [cited 30 April 2022].
12 Dalton C. What just happened? A guide to conducting after-action reviews of significant public health events without stress or blame. Callaghan, NSW: University of Newcastle; 2020. Available at http://hdl.handle.net/1959.13/1420587 [cited 1 May 2022].
13 Federation University Australia. Post Implementation Review Guide. Sydney, NSW, Australia; 2010. Available at https://policy.federation.edu.au/forms/18.1%20Post%20Implementation%20Review%20Guide.pdf [cited 30 May 2023].
14 Dalton C, Kirk MD, Durrheim DN. Using after-action reviews of outbreaks to enhance public health responses: lessons for COVID-19. Med J Aust 2022; 216(1): 4-9.
15 Quach HL, Nguyen KC, Vogt F. After-action reviews for emergency preparedness and response to infectious disease outbreaks. Western Pac Surveill Response J 2023; 14(1): 1-8. PMCID: PMC10090030.
16 Kolbe M, Schmutz S, Seelandt JC, Eppich WJ, Schmutz JB. Team debriefs in healthcare: aligning intention and impact. BMJ 2021; 374: n2042.
17 World Health Organization. The global practice of After Action Review: A Systematic Review of Literature. Geneva: WHO; 2019. Available at https://iris.who.int/bitstream/handle/10665/331432/WHO-WHE-CPI-2019.9-eng.pdf?sequence=1 [cited 16 November 2022].
18 Centers for Disease Control and Prevention. Morbidity and Mortality Weekly Report. Assessment of Epidemiology Capacity in State Health Departments - United States, 2009. Atlanta: CDC; 2009. Available at www.cdc.gov/mmwr/preview/mmwrhtml/mm5849a1.htm [cited 24 April 2023].
19 Piltch-Loeb R, Kraemer JD, Nelson C, Stoto MA. A Public Health Emergency Preparedness Critical Incident Registry. Biosecur Bioterror 2014; 12(3): 132-143.
20 National Association of County and City Health Officials. The Public Health Emergency Preparedness Landscape: Findings from the 2018 Preparedness Profile Assessment. Washington, DC: NACCHO; 2018. Available at www.naccho.org/uploads/downloadable-resources/2018-Preparedness-Profile-Report_external_final.pdf [cited 25 May 2022].