Australian Health Review
Journal of the Australian Healthcare & Hospitals Association
RESEARCH ARTICLE (Open Access)

Measuring clinician experience in value-based healthcare initiatives: a 10-item core clinician experience measure

Reema Harrison https://orcid.org/0000-0002-8609-9827 A * , Louise A Ellis B , Maryam Sina A , Ramya Walsan https://orcid.org/0000-0002-4359-6794 A , Rebecca Mitchell B , Ramesh Walpola C , Glen Maberly D , Catherine Chan E and Liz Hay E

A Centre for Health Systems and Safety, Australian Institute of Health Innovation, Macquarie University, Sydney, NSW 2109, Australia.

B Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, NSW 2109, Australia.

C School of Health Sciences, University of New South Wales, NSW 2052, Australia.

D Western Sydney Diabetes, Blacktown and Western Sydney Local Health District, NSW 2151, Australia.

E Strategic Reform Branch, NSW Ministry of Health, Sydney, NSW 2065, Australia.

* Correspondence to: reema.harrison@mq.edu.au

Australian Health Review 48(2) 160-166 https://doi.org/10.1071/AH24003
Submitted: 8 January 2024  Accepted: 22 February 2024  Published: 12 March 2024

© 2024 The Author(s) (or their employer(s)). Published by CSIRO Publishing on behalf of AHHA. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND)

Abstract

Objective

Clinicians’ experiences of providing care are identified as a key outcome associated with value-based healthcare (VBHC). In contrast to patient-reported experience measures, tools to capture clinicians’ experiences in relation to VBHC initiatives have received limited attention to date. Progressing from an initial 18-item clinician experience measure (CEM), we sought to develop a 10-item core set of clinician experience items, the CEM-10, and evaluate its reliability.

Methods

A multi-method project was conducted. A consensus workshop with clinicians from a range of NSW Health local health districts was used to reduce the 18-item CEM to a short-form, 10-item core clinician experience measure (CEM-10). The CEM-10 was then deployed with clinicians providing diabetes care, care for older adults and virtual care across all districts and care settings of New South Wales, Australia. Psychometric analysis was used to determine the internal consistency of the tool and its suitability for diverse clinical contexts.

Results

Consensus-building sessions led to a rationalised 10-item tool retaining the four domains of psychological safety (two items), quality of care (three items), clinician engagement (three items) and interprofessional collaboration (two items). Data from four clinician cohorts (n = 1029) demonstrated that the CEM-10 four-factor model produced a good fit to the data, with factor loadings ranging from 0.77 to 0.92; Cronbach’s alpha (range: 0.79–0.90) and composite reliability (range: 0.80–0.92) indicated high levels of reliability.

Conclusions

The CEM-10 provides a core set of common clinician experience measurement items that can be used to compare clinicians’ experiences of providing care between and within cohorts. The CEM-10 may be supplemented with additional items relevant to particular initiatives when evaluating VBHC outcomes.

Keywords: clinician experience, measurement, survey, value‐based care, workforce.

Introduction

In taking a whole-of-health system focus, value-based healthcare (VBHC) considers health outcomes and experiences relative to the resources or costs of care provision over a full cycle of care.1 Operationalising VBHC has necessitated the development of additional outcome measures.2 Patient-reported experience measures (PREMs) and patient-reported outcome measures (PROMs) for a range of contexts and conditions have been drawn upon to complement existing clinical and cost outcome measures.2 Clinicians from all professions are key actors in health systems and services, and their experiences are identified as a key outcome for VBHC initiatives; yet measurement of clinicians’ experiences of providing care has received limited attention to date.3

In 2020, a rapid systematic literature review confirmed that the concept of ‘clinician experience of providing care’ relevant to VBHC was poorly defined, with an absence of dedicated measurement tools.3 The included articles demonstrated that clinicians’ experiences of providing care are dynamic, influenced by the work context, professional group and patient cohort. Yet the review concluded from 94 articles that there are key common indicators of whether clinicians have a positive or negative experience of providing care. These indicators include their ability to input into decision-making, provide safe and high-quality care, be respected and valued in interprofessional working, and have psychological safety in their workplace.3

A definition of ‘clinician experience’ in the context of VBHC was developed drawing on the rapid review. The role of clinician experience within VBHC is described variably across countries and contexts; internationally, it has been conceptualised as a key outcome indicator of the success of VBHC initiatives.4,5 Clinician experience was defined as ‘clinicians’ perceptions of the quality and safety of care provision, interprofessional collaboration, work environment, their engagement in decision-making, and psychological experiences in the workplace when providing care’.6 This definition was used as the basis for the co-design and construction of an 18-item clinician experience measure (CEM) comprising four domains. The 18-item CEM was applied to enable the New South Wales (NSW) health system to benchmark and assess clinicians’ experiences of providing care system-wide during the acute coronavirus disease 2019 (COVID-19) pandemic period.

Progressing the VBHC agenda in NSW, across the Australian health system and internationally requires the routine assessment of clinicians’ experiences of providing care. In NSW, VBHC measures of efficiency and of patient outcomes and experiences have been embedded in statewide programs including Leading Better Value Care, Integrated Care and Collaborative Commissioning, and clinician experience outcomes are being explored across these programs.4,7 Because clinicians’ experiences are dynamic, our initial work applying the 18-item CEM in the context of virtual care provision indicated that clinician experience measurement requires context- and cohort-specific items in addition to a core set of measures. The NSW Ministry of Health therefore determined that a brief set of no more than 10 common core items was required so that the measure could be used alongside other survey tools and items relevant to each VBHC initiative. To address this, the present project aimed to first gain consensus on a 10-item version of the CEM (CEM-10) that would be relevant to all care settings and clinicians in the NSW public health system, and then to validate the CEM-10 measure.

Method

Design

A sequential study was conducted to develop the CEM-10, comprising:

  1. Consensus workshop with NSW Health clinicians to reduce the items in the 18-item CEM.

  2. Online surveys using the CEM-10 with four clinician cohorts (diabetes care, care for older adults, virtual care, virtual rural generalist services) to determine face and construct validity, and internal consistency.

Phase 1: consensus building workshops

Setting

This project was undertaken in the NSW public health system, which is structured as 15 local health districts (LHDs) and specialty health networks (SHNs) under NSW Health. Each district is responsible for the delivery of care through public hospitals and other health facilities to people living in its geographic area, with specialty networks responsible for providing care for key population cohorts such as children and people in the justice system.8

Sampling

Clinician leaders who were nurses, doctors, pharmacists or allied health staff working in NSW Health were eligible to take part in a consensus-building workshop convened by the NSW Ministry of Health. Representation was sought from each of the NSW LHDs and specialty networks (e.g. paediatrics) across metropolitan, regional and rural areas, and from all service areas. Invitations to contribute to the workshop were distributed by the NSW Ministry of Health via email to clinical leads from each of the state’s 30 communities of practice, with a calendar invite that they could choose to accept.

Procedure

One 90-min workshop was held with clinician leaders (n = 20) representing 20 communities of practice, 10 LHDs and two specialty networks within the participating public health system. The workshop was conducted using online video-conferencing software and was facilitated by the lead author (RH). Prior to the workshop, members were provided with a copy of a literature review reporting current evidence about the measurement of clinician experience of providing care, the 18-item tool and key discussion items for the session. The group worked through each domain, ranking the relevance of each statement to the domain topic, followed by the clarity of each statement. Statements ranked least relevant were removed and, in some cases, statements were revised for clarity. The process continued until all statements had been ranked and the domains reduced as far as possible.

The resulting draft CEM-10 was then disseminated to the workshop members for review, and no further changes were made. This process also indicated that the tool took no longer than 5 min to complete unless extensive qualitative detail was added. The final CEM-10 (Table 1) comprised the same four domains as the 18-item CEM: quality of care (three items); interprofessional collaboration (two items); psychological safety (two items); and clinician engagement (three items). All items were rated on a 7-point Likert scale from 1 = strongly disagree to 7 = strongly agree. The CEM-10 was uploaded to Qualtrics for administration across the identified VBHC initiatives for the purposes of validation.

Table 1. 10-item clinician experience measure (CEM-10) domains and items.

Domain | Items
1. Quality of care | 1. I am confident that I am able to provide high quality patient care
 | 2. I am able to be responsive to the needs of individual patients to create a positive patient experience
 | 3. I am able to provide care aligned with the currently accepted best practice
2. Interprofessional collaboration | 4. My colleagues and I make changes to our working approaches based on each other’s feedback
 | 5. My colleagues and I share decision-making power with each other
3. Psychological safety | 6. Members of staff in my organisational unit are able to talk about problems and tough issues
 | 7. I feel safe to present new ideas and challenge current practice in my organisational unit
4. Clinician engagement | 8. My contributions are valued in decision-making in my service(s)
 | 9. I have the opportunity to participate in decision-making in my service(s)
 | 10. My voice is heard in the process of making changes in my service(s)
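
To make the response format concrete, the sketch below shows one way CEM-10 responses could be organised and summarised by domain. It is illustrative only: the column names cem_1 to cem_10 are hypothetical, the domain groupings follow Table 1, and the study itself reports item-level statistics rather than prescribing a domain scoring rule.

```python
# A minimal sketch of organising CEM-10 responses by domain, assuming item-level
# data in a pandas DataFrame with hypothetical columns cem_1 ... cem_10
# (1 = strongly disagree, 7 = strongly agree). Domain groupings follow Table 1.
import pandas as pd

CEM10_DOMAINS = {
    "quality_of_care": ["cem_1", "cem_2", "cem_3"],
    "interprofessional_collaboration": ["cem_4", "cem_5"],
    "psychological_safety": ["cem_6", "cem_7"],
    "clinician_engagement": ["cem_8", "cem_9", "cem_10"],
}

def score_domains(responses: pd.DataFrame) -> pd.DataFrame:
    """Return the mean item rating per domain for each respondent."""
    scores = pd.DataFrame(index=responses.index)
    for domain, items in CEM10_DOMAINS.items():
        scores[domain] = responses[items].mean(axis=1)
    return scores

# Example with three hypothetical respondents
example = pd.DataFrame(
    [[6, 5, 6, 5, 5, 6, 4, 5, 4, 5],
     [7, 7, 7, 6, 6, 7, 6, 6, 6, 7],
     [4, 3, 4, 4, 3, 3, 2, 3, 2, 3]],
    columns=[f"cem_{i}" for i in range(1, 11)],
)
print(score_domains(example).round(2))
```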

Phase 2: CEM-10 reliability analyses

Sampling and procedure

An embedded link to the CEM-10 was distributed to eligible clinicians (any clinician working under the identified VBHC initiatives) for their voluntary and anonymous completion. The link was sent to one cohort at a time and remained active for a 3-week period for each cohort between August and December 2023. Once the link was deactivated, the project team downloaded the data from the Qualtrics platform for analysis. The resulting data were used to assess the internal consistency of the CEM-10 and explore its structural validity in relation to clinician experience for the range of included clinician cohorts.

Participants with missing values for the CEM-10 were excluded from the analysis.9,10 Frequency distributions were calculated to test whether items violated the assumption of univariate normality (i.e. skewness index ≥3, kurtosis index ≥10).11 The 10 CEM items were evaluated psychometrically via confirmatory factor analysis (CFA), with each item loaded on the one factor it purported to represent. Goodness-of-fit was assessed using the Tucker–Lewis Index (TLI), Comparative Fit Index (CFI), Root Mean Square Error of Approximation (RMSEA) and the relative chi-squared (chi-squared/d.f.). The TLI and CFI yield values ranging from 0.00 to 1.00, with values greater than 0.90 and 0.95 indicative of acceptable and excellent fit to the data, respectively.12 For RMSEA, values less than 0.05 indicate good fit, and values up to 0.08 represent reasonable errors of approximation in the population.13 Chi-squared tests are sensitive to sample size; therefore, the relative chi-squared (chi-squared/d.f.) was used as an index of fit, with values less than 2 indicating a good model fit.14 Reliability of each of the subscales was assessed through split-half reliability, equivalent to Cronbach’s alpha (using SPSS v29), and through composite reliability (using AMOS v29).
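
The study conducted the CFA in AMOS. As an illustrative sketch only, the same four-factor measurement model could be specified in Python using the semopy package, which accepts lavaan-style model syntax; the package choice and the column names cem_1 to cem_10 are assumptions, not the authors' workflow.

```python
# Illustrative four-factor CFA for the CEM-10 using semopy (an assumption;
# the study itself used AMOS). Column names cem_1 ... cem_10 are hypothetical.
import pandas as pd
import semopy

MODEL_DESC = """
quality_of_care =~ cem_1 + cem_2 + cem_3
interprofessional_collaboration =~ cem_4 + cem_5
psychological_safety =~ cem_6 + cem_7
clinician_engagement =~ cem_8 + cem_9 + cem_10
"""

def fit_cem10_cfa(responses: pd.DataFrame):
    """Fit the four-factor model; return fit indices and parameter estimates."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)                    # maximum likelihood estimation by default
    fit_indices = semopy.calc_stats(model)  # includes chi2, DoF, CFI, TLI, RMSEA
    estimates = model.inspect()             # factor loadings and error variances
    return fit_indices[["DoF", "chi2", "CFI", "TLI", "RMSEA"]], estimates
```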

Ethics

Ethical approval for this research was granted by the Macquarie University Human Research Ethics Committee (12283). Completion and submission of the online survey provided implied consent to participate in the project.

Results

Data were received from 1387 clinician respondents across the four VBHC initiatives. After excluding participants who did not complete any of the CEM-10 items, the final sample included in the analysis was 1029, with 860 respondents completing all 10 CEM items. Clinician experience data were pooled from four VBHC initiatives in the NSW public health system to explore the use of the CEM-10 in a range of contexts: (1) virtual care (n = 412/1029; 40.0%), (2) virtual rural generalist service (VRGS) (n = 40/1029; 3.9%), (3) care for older adults (n = 282/1029; 27.4%), and (4) diabetes care (n = 295/1029; 28.7%).
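
As an illustration of this exclusion rule, the short sketch below drops respondents who answered none of the ten CEM items and then tabulates cohort proportions; the column names (cem_1 to cem_10, cohort) are hypothetical.

```python
# Minimal sketch of the missing-data exclusion described above (hypothetical
# column names): respondents with no CEM-10 responses at all are removed.
import pandas as pd

CEM_ITEMS = [f"cem_{i}" for i in range(1, 11)]

def exclude_empty_responses(raw: pd.DataFrame) -> pd.DataFrame:
    """Keep only respondents who answered at least one CEM-10 item."""
    answered_any = raw[CEM_ITEMS].notna().any(axis=1)
    return raw.loc[answered_any].copy()

# Example use:
# analytic_sample = exclude_empty_responses(raw_survey)
# print(analytic_sample["cohort"].value_counts(normalize=True).round(3))
```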

Demographic characteristics

Table 2 provides a summary of the demographic make-up of respondents included in the analysis. Of the total of 1029, 850 participants responded to at least one of the demographic questions, which were positioned at the end of the survey. A total of 180 participants responded to the CEM-10 but did not answer any of the demographic questions; of these, 132 were from the virtual care cohort, 30 from the diabetes care cohort, 11 from the older adults cohort and 7 from the VRGS. Respondents who were female (n = 659/847; 77.8%) and aged between 30 and 60 years (n = 677/839; 80.7%) were highly represented in the sample. More than two-thirds of respondents reported having been in their profession for more than 10 years (n = 598/849; 70.4%).

Table 2. Demographic data for survey respondents. Values are n (%) unless otherwise indicated.

Characteristic | Virtual care | Care for older adults | VRGS | Diabetes care | Total
Years in profession
 Less than 2 | 8 (2.9) | 23 (8.5) | 0 (0.0) | 13 (4.9) | 44 (5.2)
 2–5 | 21 (7.5) | 27 (10.0) | 3 (9.1) | 20 (7.6) | 71 (8.4)
 6–10 | 40 (14.2) | 43 (15.9) | 6 (18.2) | 47 (17.8) | 136 (16.0)
 11–20 | 84 (29.9) | 84 (31.0) | 10 (30.3) | 77 (29.2) | 255 (30.0)
 21–30 | 77 (27.4) | 49 (18.1) | 8 (24.2) | 60 (22.7) | 194 (22.9)
 More than 30 | 51 (18.2) | 45 (16.6) | 6 (18.2) | 47 (17.8) | 149 (17.6)
 Total | 281 | 271 | 33 | 264 | 849
Manage other staff
 Yes | 125 (44.3) | 82 (30.6) | 14 (42.4) | 87 (33.6) | 308 (36.6)
 No | 157 (55.7) | 186 (69.4) | 19 (57.6) | 172 (66.4) | 534 (63.4)
 Total | 282 | 268 | 33 | 259 | 842
Professional role
 Allied health professional | 100 (35.5) | 151 (55.9) | 1 (3.0) | 56 (21.1) | 308 (36.2)
 Doctor | 78 (27.7) | 20 (7.4) | 15 (45.5) | 52 (19.6) | 165 (19.4)
 Nurse/midwife | 79 (28.0) | 80 (29.6) | 17 (51.5) | 125 (47.2) | 301 (35.4)
 Aboriginal health worker | 0 (0) | 0 (0) | 0 (0) | 1 (0.4) | 1 (0)
 Pharmacist | 0 (0) | 2 (0.7) | 0 (0) | 12 (4.5) | 14 (1.7)
 Clinician manager | 12 (4.3) | 12 (4.4) | 0 (0) | 6 (2.3) | 30 (2.2)
 Care coordinator | 0 (0) | 0 (0) | 0 (0) | 3 (1.3) | 3 (0.0)
 Other | 13 (4.3) | 5 (1.9) | 0 (0) | 9 (3.4) | 28 (3.3)
 Total | 282 | 270 | 33 | 265 | 850
Indigenous status
 Aboriginal descent | 5 (1.8) | 8 (3.0) | 1 (3.0) | 4 (1.5) | 18 (2.1)
 Torres Strait Islander descent | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0)
 Aboriginal and Torres Strait Islander descent | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0)
 Neither Aboriginal nor Torres Strait Islander descent | 249 (89.2) | 242 (90.6) | 27 (81.8) | 248 (93.9) | 766 (90.9)
 Prefer not to respond | 25 (9.0) | 17 (6.4) | 5 (15.2) | 12 (4.6) | 59 (7.0)
 Total | 279 | 267 | 33 | 264 | 843
Gender
 Male | 62 (22.1) | 29 (10.7) | 13 (39.4) | 43 (16.3) | 147 (17.4)
 Female | 202 (72.1) | 228 (84.4) | 19 (57.6) | 210 (79.6) | 659 (77.8)
 Non-binary/third gender | 0 (0) | 0 (0) | 0 (0) | 1 (0.4) | 1 (0.1)
 Prefer not to say | 15 (5.4) | 13 (4.8) | 1 (3.0) | 10 (3.8) | 39 (4.6)
 Other | 1 (0.4) | 0 (0) | 0 (0) | 0 (0) | 1 (0.1)
 Total | 280 | 270 | 33 | 264 | 847
Age group
 <20 years | 0 (0) | 0 (0) | 0 (0) | 1 (0.4) | 1 (0.1)
 21–30 years | 16 (5.9) | 34 (12.6) | 2 (6.1) | 11 (4.2) | 63 (7.5)
 31–40 years | 54 (19.8) | 71 (26.4) | 8 (24.2) | 56 (21.2) | 189 (22.5)
 41–50 years | 97 (35.5) | 59 (21.9) | 11 (33.3) | 74 (28.0) | 241 (28.7)
 51–60 years | 74 (27.1) | 77 (28.6) | 5 (15.2) | 91 (34.5) | 247 (29.4)
 61–70 years | 28 (10.3) | 27 (10.0) | 6 (18.2) | 27 (10.2) | 88 (10.5)
 70+ years | 4 (1.5) | 1 (0.4) | 1 (3.0) | 4 (1.5) | 10 (1.2)
 Total | 273 | 269 | 33 | 264 | 839

Descriptive statistics for each of the CEM-10 items are shown in Table 3. Acceptable values of skewness fall between −3 and +3, and kurtosis values between −10 and +10 are appropriate when utilising CFA.11 While values outside these ranges are indicative of non-normality, CFA is a relatively robust analytical method, so small deviations are unlikely to represent violations of normality.15 None of the items within the CEM domains violated the established criteria of skewness and kurtosis. The 10-item four-factor model was then tested through CFA using the four clinician cohorts, with each item loaded on the one factor it purported to represent.

Table 3. Descriptive statistics for CEM-10 items.

Construct | Item wording | Mean | s.d. | Range | Skewness index | Kurtosis index
Quality of care | I am able to be responsive to the needs of individual patients to create a positive patient experience | 5.53 | 1.33 | 1–7 | −1.35 | 1.66
 | I am able to provide care aligned with the currently accepted best practice | 5.52 | 1.35 | 1–7 | −1.37 | 1.72
 | I am confident that I am able to provide high quality patient care | 5.82 | 1.19 | 1–7 | −1.71 | 3.65
Interprofessional collaboration | My colleagues and I share decision-making power with each other | 5.52 | 1.33 | 1–7 | −1.36 | 1.81
 | My colleagues and I make changes to our working approaches based on each other’s feedback | 5.49 | 1.27 | 1–7 | −1.38 | 2.13
Psychological safety | Members of staff in my organisational unit are able to talk about problems and tough issues | 5.54 | 1.40 | 1–7 | −1.55 | 2.44
 | I feel safe to present new ideas and challenge current practice in my organisational unit | 5.13 | 1.58 | 1–7 | −1.04 | 0.53
Clinician engagement | I have the opportunity to participate in decision-making in my service(s) | 5.17 | 1.54 | 1–7 | −0.99 | 0.32
 | My voice is heard in the process of making changes in my service(s) | 4.86 | 1.61 | 1–7 | −0.78 | −0.16
 | My contributions are valued in decision-making in my service(s) | 5.23 | 1.48 | 1–7 | −1.13 | 0.98
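
To illustrate the univariate normality screen described above, a minimal sketch using pandas' sample skewness and excess kurtosis estimators follows; the authors' exact estimator settings are not stated, so this is an approximation with hypothetical column names.

```python
# Minimal sketch of the univariate normality screen described above, using
# pandas' sample skewness and excess kurtosis (hypothetical item columns).
import pandas as pd

def screen_normality(items: pd.DataFrame,
                     skew_cut: float = 3.0,
                     kurt_cut: float = 10.0) -> pd.DataFrame:
    """Flag items whose skewness or kurtosis exceeds the chosen cut-offs."""
    summary = pd.DataFrame({
        "skewness": items.skew(),
        "kurtosis": items.kurt(),  # excess kurtosis (0 for a normal distribution)
    })
    summary["violates_normality"] = (
        summary["skewness"].abs() >= skew_cut
    ) | (summary["kurtosis"].abs() >= kurt_cut)
    return summary

# Example use:
# print(screen_normality(analytic_sample[[f"cem_{i}" for i in range(1, 11)]]))
```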

As shown in Table 4, the CEM-10 instrument constituted four domains, contained 10 items and demonstrated strong psychometric qualities. The 10-item four-factor model produced a good fit to the data, χ2 (29) = 112.38, TLI = 0.98, CFI = 0.98 and RMSEA = 0.05, with factor loadings ranging from 0.77 to 0.92. Correlations between the factors were significant but generally low to moderate (range = 0.28–0.78, median = 0.45), suggesting good discriminant validity between factors.11,16 Cronbach’s alpha (range: 0.79–0.90) and composite reliability (range: 0.80–0.92) demonstrated that all four factors had high levels of internal consistency.

Table 4. Confirmatory factor analysis and reliability results for the CEM-10.

Construct | Factor loadings | Coefficient alpha | Composite reliability
Quality of care | 0.77, 0.92, 0.78 | 0.85 | 0.89
Interprofessional collaboration | 0.86, 0.90 | 0.87 | 0.88
Psychological safety | 0.83, 0.80 | 0.79 | 0.80
Clinician engagement | 0.91, 0.92, 0.78 | 0.90 | 0.92
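
For readers wishing to reproduce reliability summaries of this kind, the sketch below implements the standard formulas: Cronbach's alpha from item-level responses and composite reliability from standardised factor loadings, treating 1 − loading² as each item's error variance. It is a generic illustration with hypothetical inputs, not the authors' SPSS/AMOS procedure, and its results may differ slightly from model-based estimates.

```python
# Minimal sketch of Cronbach's alpha and composite reliability; hypothetical
# inputs, not the authors' SPSS/AMOS procedures.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one subscale (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def composite_reliability(standardised_loadings) -> float:
    """Composite reliability, using 1 - loading**2 as each item's error variance."""
    lam = np.asarray(standardised_loadings, dtype=float)
    numerator = lam.sum() ** 2
    error_variance = (1.0 - lam ** 2).sum()
    return numerator / (numerator + error_variance)

# Example with illustrative loadings (not the model-based estimates in Table 4):
# composite_reliability([0.77, 0.92, 0.78])
```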

Discussion

By developing and applying a 10-item measure of clinicians’ experiences of providing care in VBHC initiatives across New South Wales, this study provides a novel tool for evaluating an outcome central to value-based care initiatives. The study demonstrated that the CEM-10 has strong internal consistency and can be reliably used with clinicians from a range of professions and health service settings, with more than 1000 clinicians from diverse professions readily completing the CEM-10 across metropolitan and rural contexts and a range of service types and initiatives. Incomplete responses were rare, indicating the instrument is quick, easy and feasible to complete in the context of clinicians’ busy work schedules.

Beyond being quick to complete, the 10 items provide an indication of clinicians’ experiences of providing care that can be used to benchmark and/or compare experiences between different time points, clinician cohorts, healthcare settings or geographic locations relevant to value-based care. Rich data from the accompanying qualitative responses to three free-text items provide depth of detail regarding the factors that clinicians perceive are contributing to their experiences. As a generic tool designed for relevance to a range of professions and work contexts, the qualitative items provide essential insight that enables the CEM-10 data to be understood in the context of specific professional or service factors and to direct improvements. Further work may use the CEM-10 to explore the relationship between patient and clinician experiences of care.

Capturing data on clinicians’ workplace and psycho-social experiences has been the subject of decades of management and psychological research, leading to a multitude of instruments.17–20 Existing instruments have largely explored individual psychological states and traits, well-being and/or workplace conditions.3 Similar constructs are regularly explored in workplace surveys among healthcare staff in Australia and internationally.21–23 While value-based care outcomes include clinicians’ experiences of providing care, the concept of clinician experience in this context has been ill-defined, contributing to challenges in measuring and reporting this outcome from VBHC initiatives. In taking a collaborative approach to the design and refinement of the CEM-10, we arrived at a definition of clinician experience and an associated novel tool. The CEM-10 captures the key indicators of clinicians’ experiences of providing care as determined by the international research literature3 and by clinicians in the NSW public health system.6 In doing so, the CEM-10 is a contextually relevant instrument for VBHC initiatives in NSW that requires examination of its relevance to other Australian and international health systems.

While the CEM-10 provides a valuable, novel and brief measure to assess clinician experiences in a value-based care framework, there are notable potential selection biases that may have influenced the design and content of the resulting instrument. By co-developing this measure with a small group of clinicians who were predominantly senior doctors and nurses from a single health system, the resulting measure may be influenced by their experiences and standpoint, and by meso- and macro-contextual factors. Data for psychometric testing were obtained from a large group who worked in a variety of healthcare services and geographic locations and with diverse patient groups, but still within a single public health system. Respondents were predominantly those working with virtual models of care, allied health professionals were over-represented, and most respondents had many years of experience; these factors may have shaped the resulting data. The demographic questions were positioned at the end of the survey and were not completed by 180 respondents, limiting our understanding of the make-up of this subset of the sample. This study reports only a first phase of validation; other psychometric qualities should be documented, in addition to external validation from the perspective of transcultural adaptation. Further validation work is also required in which the CEM is validated against other measures of clinician experience, for example through qualitative interviews with clinicians reporting a range of positive and less positive experiences. The resulting measure has attracted interest from health systems progressing value-based care, including those in Canada and the UK, and is being assessed for its relevance to such health systems. The instrument is also undergoing more expansive testing and validation in further Australian states and specialties.

Conclusion

The CEM-10 provides a novel, brief and widely relevant tool to capture, benchmark and compare clinicians’ experiences of providing care, and is valuable for the evaluation of value-based healthcare initiatives in New South Wales. The CEM-10 would benefit from further validation and analysis of its application in other Australian states and territories as well as internationally.

Data availability

The data that support the findings of this study are available from NSW Health, but restrictions apply to the availability of these data, which were used under licence for the current study, and so are not publicly available. Data are, however, available from the authors upon reasonable request and with the permission of NSW Health.

Conflicts of interest

LH and CC are employed by the NSW Ministry of Health. Other authors do not have any conflicts of interest to declare.

Declaration of funding

This project was funded by the NSW Ministry of Health. The funding body was responsible for the conceptualisation of the research but did not participate in the design of the survey instrument. LH and CC as authors of the manuscript contributed to interpretation of data relating to the manuscript content and to drafting the manuscript.

References

1  Porter ME. What is value in health care? N Engl J Med 2010; 363(26): 2477-81.

2  Koff E, Lyons N. Implementing value-based health care at scale: the NSW experience. Med J Aust 2020; 212(3): 104-6.e1.

3  Iqbal MP, Manias E, Mimmo L, Mears S, Jack B, Hay L, et al. Clinicians’ experience of providing care: a rapid review. BMC Health Serv Res 2020; 20(1): 1-10.

4  NSW Government. About value-based health care. 2023. Available at https://www.health.nsw.gov.au/Value/Pages/about.aspx

5  Sikka R, Morath JM, Leape L. The Quadruple Aim: care, health, cost and meaning in work. BMJ Qual Saf 2015; 24(10): 608-10.

6  Harrison R, Manias E, Ellis L, Mimmo L, Walpola R, Roxas-Harris B, et al. Evaluating clinician experience in value-based health care: the development and validation of the Clinician Experience Measure (CEM). BMC Health Serv Res 2022; 22(1): 1484.

7  Dawda P, True A, Dickinson H, Janamian T, Johnson T. Value-based primary care in Australia: how far have we travelled? Med J Aust 2022; 216(S10): S24-7.

9  IBM Corp. IBM SPSS Statistics for Windows, Version 25.0. Armonk, NY: IBM Corp; 2017.

10  Tabachnick BG, Fidell LS. Using multivariate statistics, International edn. Pearson; 2013.

11  Kline RB. Principles and practice of structural equation modeling. Guilford Publications; 2015.

12  McDonald RP, Marsh HW. Choosing a multivariate model: noncentrality and goodness of fit. Psychol Bull 1990; 107(2): 247.

13  Joreskog K, Sorbom D. Structural equation modelling: guidelines for determining model fit. New York: University Press of America; 1993.

14  Ullman J. Structural equation modeling. In: Tabachnick BG, Fidell LS, editors. Using multivariate statistics, 6th edn. Harlow: Pearson Education Limited; 2014. pp. 731-836.

15  Griffin MM, Steinbrecher TD. Large-scale datasets in special education research. Int Rev Res Develop Disabilities 2013; 45: 155-83.

16  Brown TA. Confirmatory factor analysis for applied research. Guilford Publications; 2015.

17  van Diepen C, Fors A, Ekman I, Hensing G. Association between person-centred care and healthcare providers’ job satisfaction and work-related health: a scoping review. BMJ Open 2020; 10(12): e042658.

18  Jarden RJ, Siegert RJ, Koziol-McLain J, Bujalka H, Sandham MH. Wellbeing measures for workers: a systematic review and methodological quality appraisal. Front Public Health 2023; 11: 1053179.

19  Edú-Valsania S, Laguía A, Moriano JA. Burnout: a review of theory and measurement. Int J Environ Res Public Health 2022; 19(3): 1780.

20  Karaferis D, Aletras V, Niakas D. Determining dimensions of job satisfaction in healthcare using factor analysis. BMC Psychol 2022; 10(1): 240.

21  NHS Staff Coordination Centre. NHS Staff Surveys. 2023. Available at https://www.nhsstaffsurveys.com/

22  NSW Government Public Service Commission. People Matter Employee Survey. 2023. Available at https://www.psc.nsw.gov.au/reports-and-data/people-matter-employee-survey

23  Victorian Public Sector Commission. About the People matter survey. 2023. Available at https://vpsc.vic.gov.au/data-and-research/about-the-people-matter-survey/