Using AI scribes in New Zealand primary care consultations: an exploratory survey
Angela Ballantyne
Abstract
AI scribes have had a rapid uptake in primary care across New Zealand (NZ). The benefits of this new technology must be weighed against the potential risks they may pose.
This study provides a snapshot of AI scribes use in primary care to generate clinical notes. We aimed to understand emerging provider experiences, identify perceived clinical benefits and concerns, and flag potential ethical and legal issues as a basis for future research and policy development.
GPs and health providers working in primary care across NZ were invited to participate in an anonymous survey about their experience with AI scribes (February–March 2024).
One hundred and ninety-seven respondents completed the survey, 88% (n = 164) of whom were GPs. Of all respondents, 40% (n = 70) had experience with AI scribes. Reported benefits included: reduced multitasking (n = 46), saved time (n = 43), and improved rapport with patients (n = 43). Key concerns included: compliance with NZ legal and ethical frameworks (n = 108), data security (n = 98), errors or omissions (n = 93), and data leaving New Zealand (n = 91). Only 66% (n = 41) had read the terms and conditions of the AI scribe tool, and 59% (n = 35) reported seeking patient consent. Most (80%, n = 50) found AI scribes helpful or very helpful, and 56% (n = 35) said the tool changed consultation dynamics.
While there is strong uptake and enthusiasm for AI scribes in primary care in NZ, critical issues remain around legal and ethical oversight, patient consent, data security, and the broader impact on clinician–patient interactions. Health providers need clearer guidance and regulatory support for safe, ethical, and legal use of AI tools.
Keywords: accountability, advantages, artificial intelligence, bioethics, clinical notes, concerns, consent, data ethics, health law, primary care, scribes, survey, transcribe.
WHAT GAP THIS FILLS
What is already known: The use of AI scribes in primary care consultations raises significant and complex clinical, ethical, medicolegal, and data governance issues. Perceived benefits (eg reducing administrative burden, enhancing efficiency, and patient care) have driven rapid front-line uptake of these tools in New Zealand (NZ) while national regulations and guidelines are still being developed.
What this study adds: The key advantages of AI scribes noted by NZ primary care practitioners surveyed in 2024 were: reducing multi-tasking, time savings, reduction in cognitive load, and improved rapport with patients. Key concerns were: compliance with NZ legal and ethical frameworks, security of patient data, errors or omissions in clinical notes, and the risk of patient data leaving NZ.
Introduction
Artificial intelligence (AI) scribes have seen rapid uptake in general practice and have had a significant impact on patient documentation processes. This has happened at a challenging time for primary care in Aotearoa/New Zealand (NZ), characterised by workforce shortages, stress and burnout, greater complexity of clinical presentations, and the ‘inbox tyranny’ of increased time required to review tests, referrals, and patient progress.1 In the UK, general practitioners (GPs) were reported to spend an estimated 44% of their working hours on administrative work and 14% of consultation time on recording and updating notes.2 In NZ, 20% of consultation time was spent interacting with a computer, with 12% completely excluding the patient.3
AI clinical scribes have been promoted as tools for mitigating burdens associated with clinical note documentation. Services such as Nabla, Heidi, and Chartnote are AI software tools that record a spoken conversation then transcribe and process the data into structured clinical notes. Bespoke NZ-based AI scribes are also in use (eg TEND). Surveys run in 2024 and 2025 by the AI in Primary Care Group found that AI use in primary care is growing: by early 2025, 68% of respondents reported using AI tools.4
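To illustrate the general architecture described above, the following minimal Python sketch shows a generic record–transcribe–summarise pipeline. The function names, note structure, and example strings are hypothetical and do not describe any specific commercial product; real tools call proprietary speech-to-text and large language model services at these steps.

# Illustrative sketch only: a generic ambient-scribe pipeline
# (record -> transcribe -> structured note). All names and outputs are
# hypothetical; commercial tools call proprietary speech-to-text and LLM services.
from dataclasses import dataclass


@dataclass
class ClinicalNote:
    subjective: str
    objective: str
    assessment: str
    plan: str


def transcribe(audio_path: str) -> str:
    """Placeholder for the speech-to-text step (a cloud ASR service in practice)."""
    return "Patient reports three days of right upper quadrant pain."


def summarise(transcript: str) -> ClinicalNote:
    """Placeholder for the LLM step that converts a transcript into a structured note."""
    return ClinicalNote(
        subjective=transcript,
        objective="Examination findings as verbalised by the clinician.",
        assessment="Working impression inferred from the transcript.",
        plan="Investigations, safety-netting, and follow-up.",
    )


if __name__ == "__main__":
    note = summarise(transcribe("consult.wav"))
    print(note.plan)  # the clinician remains responsible for reviewing every field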
There is ongoing debate, uncertainty, and variability in the uptake and use of AI scribes in primary care. Although some early technical concerns (eg accuracy) have received attention, many questions remain.5–7 Recent systematic reviews agree that although AI scribes hold substantial potential for reducing administrative burden and enhancing efficiency, their use raises significant issues about privacy, consent, data security, and the legal framework governing their deployment.8,9 To navigate these challenges, health providers will need to develop ‘fresh competencies in data comprehension and technological proficiency’ (p. 1767).10
Balancing the benefits of AI scribes with patient rights and data security is crucial.11 Health data breaches can have severe consequences, as illustrated by the Waikato District Health Board breach in 2021.12 Most currently available AI scribes rely on international cloud-based platforms (often privately owned and controlled) for processing and storing data, which raises questions about where data is stored, who has access to it, and how it can be protected from cyber threats.
At present in NZ, the decision to use AI scribes rests with the individual health provider/practice. There is no regulatory approval or external testing of the validity, safety, or accuracy of AI scribes.A The NZ government has promoted the Organisation for Economic Co-operation and Development (OECD) AI Principles as a key platform for NZ’s approach to responsible AI.13 But there remain NZ-specific data governance issues that need to be recognised and resolved, particularly Māori data sovereignty and the current (in)ability of many AI scribes to recognise te reo Māori.14 Health sector agencies such as the World Health Organization and the UK National Health Service have published general AI principles as well as specific guidance for large multi-modal models.15–17 Such guidance generally recognises the potential utility of AI tools but flags concerns about privacy, inaccurate output, bias, lack of transparency, and data sovereignty. There are some specific local resources regarding AI clinical notes tools (eg RACGP AI Scribes and Well South PHO’s Primary Care AI Resource Hub).18,19 Guidance from Health NZ emphasises a precautionary approach to the use of large language models (LLMs) and health providers’ responsibility for the content of clinical notes.20
This paper reports on a survey of GPs and other primary care providers in NZ conducted in the first quarter of 2024. Our aim was to provide a snapshot in time of how and why AI scribes were being used in clinical practice to generate clinical notes. Additional objectives were to better understand health providers’ emerging experience with AI scribes, identify perceived clinical advantages and concerns, and potential ethical or legal issues, as a basis for future research and policy development.
Methods
An anonymous non-probabilistic survey was conducted after piloting with a small group of local primary care practitioners (see question schedule in Supplementary material). The survey ran from 13 February to 24 March 2024. Participants were recruited via the Royal New Zealand College of General Practitioners (RNZCGP) newsletter e-Pulse, social media sites dedicated to primary healthcare, and by email. The survey was administered via Qualtrics and was open to all GPs and health providers working in primary care in NZ. Because this was an exploratory survey, respondents were expected to have varying degrees of experience with AI scribes, and because staff working in primary care are busy, respondents were able to skip questions they did not want to answer.
Quantitative results were analysed using the embedded tools in Qualtrics. Free text comments were read and iteratively discussed by all members of the research team to identify key themes. Feedback on presentations at various national hui to clinical, policy, and primary care audiences provided additional validation. This article presents themes of particular and immediate relevance to the primary care community, highlighting key ethical and legal aspects.
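For transparency about the kind of quantitative summary the embedded Qualtrics tools produce, the following minimal Python sketch (using only the standard library and invented example responses, not the study data) shows how multi-select answers can be tabulated into counts and percentages.

# Illustrative sketch only: tabulating multi-select survey responses.
# The responses below are invented examples, not the study data.
from collections import Counter

responses = [
    ["reduced multitasking", "saved time"],
    ["saved time", "improved rapport with the patient"],
    ["reduced multitasking"],
]

counts = Counter(option for answer in responses for option in answer)
n_answered = len(responses)

for option, count in counts.most_common():
    print(f"{option}: n = {count} ({count / n_answered:.0%} of {n_answered} respondents)")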
The study was approved by the University of Otago Human Research Ethics Committee (D23/357).
Results
One hundred and ninety-seven respondents completed the survey, although not all respondents answered every question. Of the respondents, 88% (n = 164) were GPs, and the others (n = 23) included nurses, nurse practitioners, rural emergency care, practice managers, and other non-clinical primary care roles. The majority were very experienced: over half (52%) had worked in their role for 11–30 years, and 24% for 30+ years. Over half (57%) currently worked 4–5 days per week in clinical practice.
A total of 40% (n = 70) of respondents reported experience with an AI scribe (of which 97% had experience with Nabla and 18% had experience with other tools, eg Chartnote, AI-Scribe, Lyrebird AI, Heidi, Freed). Of those who had tried an AI scribe, 84% (n = 53) had used it in a clinical consultation with patient(s), but the amount of reported use of the tool in this setting varied: 37% (n = 23) had used it 11–50 times and 32% (n = 20) more than this.
Key advantages noted via multi-choice questions (n = 58 answered this question) were: ‘reduced multitasking’ (n = 46), ‘saved time’ (n = 43), and ‘improved rapport with the patient’ (n = 43) (Fig. 1).
Key concerns were (n = 151 answered this question): ‘compliance with NZ legal and ethical frameworks’ (n = 108); ‘security of patient data’ (n = 98), ‘errors or omissions to the clinical notes’ (n = 93), and ‘risk of patient data leaving NZ’ (n = 91) (Fig. 2).
Medicolegal issues
Respondents expressed concerns in free-text comments about compliance with NZ legal and ethical frameworks:
Privacy is the main issue. (#17)
I am careful not to verbalise identifying data during consult – ie names, DOB. I pause the recording if I do need specifics. (#11)
Others expressed a desire to use the tools but wanted regulatory guidance and safety assurances:
I have not tried using this yet (although would very much like to) because of concerns about safety of patient information and patient consent. As well I could not find any specific guidance from medical council and college endorsing/approving use of such technologies. (#18)
There is a pressing need for appropriate guidance from professional bodies in NZ to ensure that doctors using such tools (or others) are protected from accusations of professional misconduct. (#44)
Concerns were also raised about observed shortcomings of the tool, including issues with accuracy, completeness, conciseness, and adequacy for recording key issues (‘red flags’) arising in complex consultations:
The … note missed some critical negative findings. This meant I didn’t trust it. (#122)
The hallucination rate is quite high and often quite subtle … so I’ve stopped using it. (#100)
The notes … often don’t capture the essence of the consult properly. (#53)
We risk ending up with long notes that are not necessarily very useful for colleagues nor patients. (#52)
Consent
Only 66% (n = 41) of respondents had read the terms and conditions before using the tool, and 59% (n = 35) reported getting consent from patients. Where consent was sought, the majority of health providers based this on a 1–2 sentence verbal description of the AI tool at the start of the consultation. Most said they briefly described to patients what the tool does (transcribes, prepares notes), data storage, and information privacy. There was significant variability in language used to describe data security; for example: ‘no personal details are stored’, ‘not identifiable’, ‘no information is stored’, (data is) ‘not saved’, ‘confidential’, ‘encrypted’, ‘non-identifying data only submitted’, ‘probably similar risk to using Gmail’. A minority described using a disclosure notice/sign on the wall, providing an information sheet to patients, getting written consent from patients, or documenting verbal consent in the notes. One respondent envisaged that AI scribes:
… will soon be so commonly used that expressed consent is not necessary. In the same way that we don’t ask for consent to use a PC during the consult, it’s just a given. (#48)
Implications for practice
A majority (71%, n = 50) of respondents found AI scribes helpful or very helpful. The AI scribe was perceived by many to be a significant time saver: 47% (n = 33) estimated that using an AI scribe for every consultation could save between 30 min and 2 h daily. Those who self-identified as struggling to type quickly or accurately were more likely to say that the AI scribe saved time.
Time savings were perceived as particularly valuable for long or complex consultations and for subsequent referrals.
Many patients now bringing multiple issues into each appt – using AI allows me to not worry on capturing notes along the way knowing I may miss something out. (#11)
It saves time for subsequent referrals as I just use these notes. (#23)
The AI scribe was also a useful backstop when under time pressure during a clinical session:
It’s most useful when I’m running really behind and don’t have time to write notes after the consult so I can go back later and still have all the info from the consult there. (#142)
On the other hand, a significant minority commented that the tool did not actually save time overall because it took so long to edit and correct the AI-generated notes.
I didn’t save time. It took as much time to review and edit the notes as it would have done to create them myself. (#168)
Some said that their initial assessment of the AI scribe was very positive, but over time they realised how much editing was required due to shortcomings in the AI templates and output.
I thought it was great at first, but I still find I spend as long modifying the note to be better laid out and more concise. (#142)
The notes … are way too long and wordy. (#53)
Over half of respondents (56%, n = 35) said that the AI scribe changed the dynamics of the consultation. Using an AI scribe encouraged practitioners to verbalise physical examination findings and their thought processes to allow the transcription tool to capture this information:
I talk out loud when I examine, try to word things to benefit notes. (#64)
Needed to verbalise a lot of things I would otherwise not. (#110)
This resulted in ‘offering better explanations to patients, more structured history taking’ (#97) and greater transparency: ‘the patient also hears everything that is happening’ (#98).
However, it was noted that this increased verbalisation could also lengthen consultations and encourage some inapposite responses to patients:
Lots more talking so the consult takes longer. (#100)
Today someone said, ‘I’ve got pain here’ and pointed to the area, and so I said out loud ‘oh pain in the right upper quadrant?’ (#53)
Respondents noted that overseas-designed AI scribes had limitations regarding transcription and summarising content in languages other than English and/or recognising the NZ accent or vocabulary.
Sometimes missed out on significant kiwi-isms. (#110)
It would be good if it was able to pick up other languages other than English. (#9)
There was a clear consensus that using an AI scribe allowed for greater focus on the patient, thus facilitating engagement and rapport building via more eye contact and active listening: ‘able to listen instead of typing and listening at the same time’ (#156). Others noted they felt more relaxed during the consultation and that their job satisfaction was enhanced: ‘So much more is achieved. More conversation and bonding with patients. It makes GP work FUN again by putting the patient back at the centre of everything’ (#12).
It was also noted that having the clinical notes available facilitated communication with patients about their treatment plan via sharing the notes at the end of the consultation or making it easier to provide follow-up.
Patient advice portion easily emails to patient and saves time for nurses if patients call back to query plan. (#65)
Several respondents were concerned that widespread use of AI scribes might pose a longer-term risk to the quality of clinical reasoning, because not having to type the notes meant a lost opportunity to ‘process’ information gleaned during the consultation:
I know of colleagues who have tried it but do not continue using it as they feel it negatively affects their thought processes during and after the consultation, when they do a lot of their ‘thinking’ about the patient. (#52)
Others saw implications for how clinicians might process complex information and develop their interpretive skills in the future.
Have to change reasoning approach as use typing to think through and avoid missing important questions. (#183)
… an inability to have that almost intuitive ability that a highly experienced GP will have to ‘think outside the square (box)’ particularly with very complex cases. (#40)
Overall, respondents acknowledged that AI scribes had room for improvement, ‘it’s not good enough yet’ (#36), while recognising the utility, and perhaps inevitability, of these tools. This prompted comments about practitioners needing further training and reflection on how to use AI scribes safely and effectively.
I’m still learning how best to use these tools, so expect to save more time as I get more familiar with the tools and how to get the type of notes I need. (#106)
I would need significant education around the role of AI. (#6)
There were specific suggestions about how developers could improve the usability of AI scribes for the primary care setting, including fuller integration with practice management systems.
The format is a bit funny for patients I know well/see often for chronic and many issues. They need different templates for consults and ideally it could integrate with the PMS to code things. (#53)
Discussion
This study provides a snapshot of AI scribe use in NZ in the first quarter of 2024, an area of clinical practice that continues to change rapidly. Our results demonstrate there was significant adoption of and enthusiasm for AI scribes soon after their release, but also highlight substantial issues that remain to be resolved and/or further researched. These matters include legal and regulatory oversight, ethical issues relating to patient consent and data security, and unknown effects of AI scribes on the nature of patient–provider communication and interactions.
Medicolegal issues
Respondents expressed concerns about compliance with NZ legal and ethical frameworks and called for regulatory guidance. The higher number of respondents indicating concerns about the tool (n = 151), compared with those noting advantages of AI scribes (n = 58), likely includes respondents who have not used AI scribes precisely because they have concerns.
The Health Information Privacy Code (HIPC) and the Code of Health and Disability Services Consumers’ Rights (the Code) do not refer explicitly to LLMs, generative AI (GAI), or AI scribe tools. However, they do give patients the right to consent before services are provided and require that authorisation be obtained for the use and disclosure of their health information (subject to certain exceptions).
Previous research identified the variability in health providers’ clinical notes and the harms that might arise from inaccurate or unclear notes.21 Health providers retain professional and legal responsibility for the accuracy of their clinical notes regardless of whether they have used AI scribes.22,23 This requires checking all outputs for accuracy to avoid confabulations and omissions. However, as many survey respondents noted, carefully checking each AI-generated clinical note eats into, and sometimes negates, the proffered time saving.
To capture the promised benefits of AI scribes while minimising the risks, several other medicolegal issues they raise should be addressed, including legal requirements to retain health data, the implications of pausing a transcription recording, patients’ rights to access and correct their health information, the fact that the software erases original audio before transcripts have been checked, and the use of recordings or transcripts in legal disputes.24
Consent in primary care: explicit consent, reasonable disclosure, coercion
Our study demonstrated significant variability in practice regarding disclosure and patient consent. Two-fifths (41%) of the providers in our survey did not seek patient consent to use AI scribes. Consent norms in primary care typically reflect an established therapeutic relationship between patients and GPs, often relying on inferred or implied consent rather than express or documented agreement.25 However, consent processes should reflect the needs of the patient and the nature of the treatment or investigation. The Health and Disability Commissioner (HDC) has ruled that the use of novel or innovative technology in the delivery of healthcare requires more explicit consent.26 AI scribes may be novel for many patients.
The Code gives patients the right to information that a reasonable person, in that person’s circumstances, needs to make an informed choice or give informed consent. Only 66% of respondents had read the terms and conditions before using the AI tool, and respondents reported significant variability of language used to describe data security and risk associated with the AI tool. Efforts to simplify data security issues may be intended to make this information more accessible for the patient but may introduce inaccuracies. The standard regarding what a ‘reasonable patient’ would expect to know about the use of AI scribes or other tools in their clinical care is not ethically settled and has not been legally tested in NZ. The Medical Council will release guidance about the use of AI in health in 2025 and it is expected to require patient consent.27 GPs should be cautious about relying on marketing claims provided by the company and should be aware that, unlike pharmaceuticals, which go through rigorous testing and regulatory approval processes, most AI scribes will have been through none.28,29
Finally, GPs should be aware that the power imbalance between patients and doctors may put undue pressure on patients to consent, and patients may feel uncomfortable expressing disagreement with the doctor who proposes use of an AI tool.30,31
Impact on patient–provider interactions
A total of 56% of respondents noted that the use of AI scribes changed the consultation dynamic, for example via more verbalising of their thought process and increased focus on the patient. Although most respondents viewed these as positive changes, their implications for consultation processes and outcomes warrant further study. For example, a health interaction study in the USA found that when doctors verbalised their observations during physical examination of children with viral illnesses, parents were more likely to demand unnecessary antibiotics,32 the implication being that parents were unable to interpret the doctor’s verbal commentary and thus overestimated the seriousness or misunderstood the nature of their child’s illness. As AI is integrated into health systems, it is important that future research studies the impact on patient–provider interactions, particularly in relation to communication, trust, and decision-making.
Strengths and limitations
As a non-probabilistic sampling strategy was used for this study, some bias in the responses received is likely, with participants self-selecting based on strong interest in the topic. However, a representative sample was not required to fulfil the study aims, which were exploratory in nature. A strength of the study was the inclusion of qualitative open-ended questions and ample space for comment alongside the quantitative measures. The free-text answers provided rich detail and insight into how clinicians were wrestling with decisions about the adoption and use of AI scribe tools at the beginning of 2024, and what they perceived to be the key risks and benefits. This snapshot in time will provide a useful basis for professional reflection and further policy work to ensure that the adoption and regulation of AI in health meets the needs of health providers and patients.
Conclusions
AI scribes are more than just transcription tools; they are changing the dynamic of clinical consultations in primary care. Safe use requires consideration of medicolegal issues; resources, guidance, and training to ensure the accuracy of notes; and consent mechanisms that allow patients to opt out and still access clinical care. With the rapid development of AI tools, their use, usefulness, and legal, ethical, and policy settings need ongoing evaluation.
Data availability
The data that support this study will be shared upon reasonable request to the corresponding author angela.ballantyne@otago.ac.nz.
Acknowledgements
We thank the Royal New Zealand College of General Practitioners for distributing the survey to their members via their newsletter ePulse.
References
1 RNZCGP. 2022 workforce survey; 2022. Available at https://www.rnzcgp.org.nz/resources/workforce-survey/2022-workforce-survey/ [accessed 25 March 2025].
2 Edwards PJ. GPs spend 14% of their session time documenting consultation notes and updating electronic health records. Br J Gen Pract 2024; 74(742): 202.
3 Dowell A, Stubbe M, Scott-Dowell K, et al. Talking with the alien: interaction with computers in the GP consultation. Aust J Prim Health 2013; 19(4): 275-82.
4 AI in Primary Care Group. Round two AI in primary care survey highlights growing adoption. GPNZ; 2025. Available at https://gpnz.org.nz/our-work/ai-in-primary-care-group/ and https://gpnz.org.nz/media-releases/joint-media-release-round-two-ai-in-primary-care-survey-highlights-growing-adoption/
5 Terry AL, Kueper JK, Beleno R, et al. Is primary health care ready for artificial intelligence? What do primary health care stakeholders say? BMC Med Inform Decis Mak 2022; 22(1): 237.
6 Falcetta FS, De Almeida FK, Lemos J, et al. Automatic documentation of professional health interactions: a systematic review. Artif Intell Med 2023; 137: 102487.
7 Tran BD, Mangu R, Tai-Seale M, et al. Automatic speech recognition performance for digital scribes: a performance comparison between general-purpose and specialized models tuned for patient-clinician conversations. AMIA Annu Symp Proc 2023; 2022: 1072-80.
8 Okwor IA, Hitch G, Hakkim S, et al. Digital technologies impact on healthcare delivery: a systematic review of artificial intelligence (AI) and machine-learning (ML) adoption, challenges, and opportunities. AI 2024; 5(4): 1918-41.
9 Tierney AA, Gayre G, Hoberman B, et al. Ambient artificial intelligence scribes to alleviate the burden of clinical documentation. NEJM Catal Innov Care Deliv 2024; 5(3): CAT-23.
10 Shuaib A. Transforming healthcare with AI: promises, pitfalls, and pathways forward. Int J Gen Med 2024; 17: 1765-71.
11 Nawab K. Artificial intelligence scribe: a new era in medical documentation. Artif Intell Health 2024; 1(4): 12-5.
12 InPhySec. Waikato District Health Board (WDHB) Incident Response Analysis. Final Report, 2 September 2022; 2022. Available at https://www.tewhatuora.govt.nz/assets/Publications/Proactive-releases/WDHB-Final-Report-2.0-redacted.pdf
13 MBIE. Cabinet paper: Approach to work on Artificial Intelligence. 25 July 2024. Available at https://www.mbie.govt.nz/dmsdocument/28913-approach-to-work-on-artificial-intelligence-proactiverelease-pdf
14 Whittaker R, Dobson R, Jin CK, et al. An example of governance for AI in health services from Aotearoa New Zealand. NPJ Digit Med 2023; 6: 164.
15 National Health Service. Artificial intelligence (AI) and machine learning guidelines NHS – England. 13 March 2024 (last updated 7 April 2025); 2025. Available at https://www.england.nhs.uk/long-read/artificial-intelligence-ai-and-machine-learning/
18 RACGP. Artificial Intelligence (AI) scribes. Available at https://www.racgp.org.au/running-a-practice/technology/business-technology/artificial-intelligence-ai-scribes [accessed 27 March 2025].
19 WellSouth Primary Health Network. Primary Care AI Resource Hub. Available at https://wellsouth.nz/provider-access/clinical-resources/ai-in-primary-care [accessed 27 March 2025].
21 Cohen GR, Friedman CP, Ryan AM, et al. Variation in physicians’ electronic health record documentation and potential patient harm from that variation. J Gen Intern Med 2019; 34: 2355-67.
22 Medical Council of NZ. Regulation in the Era of Artificial Intelligence (AI); 2024. Available at https://hail.to/the-medical-council-of-new-zealand/publication/yPK9Okn/article/XwbONx7 [accessed 26 March 2025].
23 Medical Council of NZ. Managing patient records; 2020. Available at www.mcnz.org.nz/assets/standards/0c24a75f7b/Maintenance-patient-records.pdf [accessed 26 March 2025].
25 Tunzi M, Satin DJ, Day PG. The consent continuum: a new model of consent, assent, and nondissent for primary care. Hastings Cent Rep 2021; 51(2): 33-40.
26 Health and Disability Commissioner - Opinion 08HDC20258. Informed consent to innovative surgery; 2009. Available at https://www.hdc.org.nz/decisions/search-decisions/2009/08hdc20258/
27 Forbes S. Medical Council working on new guidelines for use of AI. NZ Doctor; 2025. Available at https://www.nzdoctor.co.nz/article/news/medical-council-working-new-guidelines-use-ai
28 Palaniappan K, Lin EYT, Vogel S. Global regulatory frameworks for the use of artificial intelligence (AI) in the healthcare services sector. Healthcare 2024; 12(5): 562.
29 Ebad SA, Alhashmi A, Amara M, et al. Artificial intelligence-based software as a medical device (AI-SaMD): a systematic review. Healthcare 2025; 13(7): 817.
30 Klitzman R. Pleasing doctors: when it gets in the way. BMJ 2007; 335(7618): 514.
31 Dixon-Woods M, Williams SJ, Jackson CJ, et al. Why do women consent to surgery, even when they do not want to? An interactionist and Bourdieusian analysis. Soc Sci Med 2006; 62(11): 2742-53.
32 Heritage J, Elliott MN, Stivers T, et al. Reducing inappropriate antibiotics prescribing: the role of online commentary on physical examination findings. Patient Educ Couns 2010; 81(1): 119-25.
Footnotes
A Te Whatu Ora’s National Artificial Intelligence and Algorithm Expert Advisory Group has recently endorsed two Ambient AI Scribe tools: iMedX and Heidi (Health NZ has endorsed the Enterprise version of Heidi, not the free version of Heidi). See https://www.tewhatuora.govt.nz/health-services-and-programmes/digital-health/generative-ai-and-large-language-models (last updated: 18 July 2025).