Hong Kong Med J 2025;31:Epub 8 Oct 2025
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
EDITORIAL
“Inborn errors” of artificial intelligence tools and practical tips for expert witnesses
James SP Chiu, FHKAM (Surgery), LLB (Hons) Lond1; Gilberto KK Leung, FHKAM (Surgery), LLM2
1 Senior Research Fellow, Centre for Medical Ethics and Law, The University of Hong Kong, Hong Kong SAR, China
2 Co-Director, Centre for Medical Ethics and Law, The University of Hong Kong, Hong Kong SAR, China
 
Corresponding author: Dr James SP Chiu (drjameschiu@yahoo.com.hk)
 
 
 
Expert witnesses serve important functions in the administration of justice. Their opinions have a significant impact on the outcomes of proceedings and represent a respected source of information for parties seeking explanation, understanding, and closure in a dispute.1 However, there have been cases in which expert witnesses presented incorrect evidence to the courts. An oft-quoted example is the UK case of R v Sally Clark, in which Professor Sir Roy Meadow, a prosecution expert witness, gave evidence on the probabilities of sudden infant death syndrome.2 It later transpired that he had misunderstood and misinterpreted statistical data, leading to legal battles between him and the General Medical Council.3
 
Materials in support of the expert’s opinion
Expert witnesses are expected to conduct research on legislation, codes of conduct, or practice guidelines issued by professional bodies, and to cite authorities from textbooks and published articles. Materials used to support their opinions must be specified in their reports.4 This was emphasised by the Court of Appeal of New Zealand: “We have noted that Mr Keys [the witness] cited no professional literature or other material to verify his ‘elemental’ methodology… These methodological difficulties sufficiently justify the Judge’s conclusion that Mr Keys’ evidence was neither helpful nor reliable”.5 Furthermore, courts and tribunals must be satisfied that the opinions and conclusions in the expert reports they receive are reliable.
 
The use of artificial intelligence in court proceedings
The emergence of artificial intelligence (AI) is transforming not only the legal sector but every aspect of society around the world at remarkable speed, giving rise to many challenges as well as opportunities.6 In the UK, it was reported that, by the end of 2022, three-quarters of the largest solicitors’ firms were using AI; over 60% of large law firms and a third of small firms were at least exploring the potential of the new generative AI systems.7
 
In 2023, Lord Justice Birss, a Court of Appeal judge in the UK, used ChatGPT to provide a summary of an area of law. He is the first British judge known to have used an AI chatbot to write part of a judgement.8 This suggests that, when used with caution, AI can be a useful assistant to expert witnesses and is acceptable to the courts.
 
That same year, the Courts and Tribunals Judiciary in the UK published online guidance for Judicial Office Holders on the use of AI.9 It acknowledges that AI tools are capable of summarising large bodies of text, although, as with any summary, care must be taken to ensure accuracy. By contrast, AI tools are a poor way of conducting research to find new information that one cannot verify independently. While they may be useful in identifying materials that one would recognise as correct, the current public AI chatbots do not produce convincing analysis or reasoning.10
 
There is evidence that some expert witnesses in Hong Kong are already using, or are considering using, AI to write their reports. At a Workshop on Expert Witness Report Writing organised by the Hong Kong Academy of Medicine in August 2025, 32 participants from different specialties were given a fictitious case of a civil claim against a general practitioner and asked to write expert reports on behalf of the defendant doctor, commenting on his management. Some had written expert witness reports before. Of the 24 participants who replied to a survey on the use of AI tools in their preparations, nine indicated that they had used AI tools for various purposes: searching for references, summarising the literature, analysing the case against updated medical standards in the relevant specialty (family medicine), listing favourable and unfavourable evidence/findings, organising, editing, and formatting the report, improving grammar and sentence structure, rephrasing content in layman’s terms, and proofreading. Of the AI tools used, DeepSeek was the most popular (n=3), followed by ChatGPT (n=2) and Perplexity (n=2). Copilot, English Editor, Gemini, Poe, Grok, and AI Genesis by the Hospital Authority were each used once. Some participants used more than one AI tool for their reports.11
 
“Inborn errors” of artificial intelligence tools
The limitations of AI in the context of legal research are already well recognised. Since 2023, a number of non-existent judicial opinions with fake quotes and citations created by AI tools have been presented to courts in the United States and the UK.12 Users may not realise that AI tools have “inborn errors”. Artificial intelligence language models such as ChatGPT are prone to a mistake known as ‘hallucination’, whereby a system produces highly plausible but incorrect results. This is because they work by anticipating the text that should follow the input they are given; they have no concept of ‘reality’.7
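To make this mechanism concrete, the following minimal sketch in Python (the word-probability table and the generate function are invented for illustration and do not come from any actual AI system) shows how a language model composes text one word at a time, choosing whichever continuation is statistically likely, with no step that checks the result against reality. A fluent but entirely fictitious case citation can emerge in exactly this way.

import random

# Toy table of next-word probabilities (an illustrative assumption,
# not real model weights). A real large language model learns billions
# of such conditional probabilities from its training text.
NEXT_WORD_PROBS = {
    ("cited", "in"): [("R", 0.7), ("Smith", 0.3)],
    ("in", "R"): [("v", 1.0)],
    ("R", "v"): [("Jones", 0.5), ("Brown", 0.5)],  # fluent-sounding, never verified
}

def generate(prompt_words, steps=3):
    # Repeatedly append a statistically likely next word, given the
    # previous two words. Nothing below checks whether the resulting
    # citation actually exists; plausibility is the only criterion.
    words = list(prompt_words)
    for _ in range(steps):
        key = tuple(words[-2:])
        candidates = NEXT_WORD_PROBS.get(key)
        if not candidates:
            break
        choices, weights = zip(*candidates)
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate(["cited", "in"]))  # e.g. "cited in R v Jones": fluent yet possibly fictitious

Scaled up to billions of learned probabilities, the same prediction-only mechanism produces the persuasive but unverified prose and citations described above.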
 
The currently available large language models are trained on materials published on the internet, and the quality of answers generated depends on the quality of the underlying datasets, as well as how one engages with the relevant AI tool, including the nature of the prompts entered. Erroneous output from AI tools may arise from misinformation (whether deliberate or otherwise), data selection bias, and/or data which are not up to date. Even with the best prompts, the information provided may be inaccurate, incomplete, misleading, or biased. Artificial intelligence tools may make up fictitious legal cases, citations, or quotes, or refer to legislation, articles, or legal texts that do not exist, yielding incorrect or misleading information regarding the law or how it might apply. Therefore, one should always have regard to this possibility, and the accuracy of any information provided by AI tools must be checked before it is relied upon and used in an expert opinion report.10
 
Even experts on generative AI may make these mistakes. In a recent case in the United States, it was discovered that Professor Jeff Hancock had included citations to two non-existent academic articles and had incorrectly cited the authors of a third. He admitted that he had used GPT-4o to assist in drafting his declaration but, in reviewing it, had failed to discern that GPT-4o had generated fake citations to academic articles. The irony is that Professor Hancock, a credentialled expert on the dangers of AI and misinformation, had fallen victim to the siren call of relying too heavily on AI in a case that revolved around those very dangers.13
 
Duties to the courts and tribunals
Expert evidence is fundamental to the rule of law. The Code of Conduct for Expert Witnesses applies to an expert who has been instructed to give or prepare evidence for the purpose of proceedings in the Court. It specifies that an expert witness has an overriding duty to help the Court impartially and independently on matters relevant to the expert’s area of expertise.4 Flawed evidence can lead a court, acting in good faith, to reach an unsound decision, resulting in miscarriages of justice and, in turn, a loss of confidence in justice and a degradation of the rule of law.14
 
All legal representatives are responsible for the materials they put before the court/tribunal and have a professional obligation to ensure they are accurate and appropriate. They must confirm that they have independently verified the accuracy of any research or citations generated with the assistance of an AI chatbot.10 Because expert witnesses likewise owe a duty to the Court, they too must verify the accuracy of any research or case citations generated with the assistance of AI tools before submitting their reports to those who instruct them and/or to the courts/tribunals.
 
Privacy, personal data protection, and confidentiality
The current publicly available AI platforms remember every prompt and any other information entered into them, which may then be used to respond to queries from other users. As a result, anything entered into an AI platform could, in principle, become publicly known. One should therefore be mindful of the importance of protecting data privacy and avoid entering into a public AI chatbot any information that is private and confidential or not already in the public domain. To maintain data security, one should access AI tools from workplace computer devices and with one’s work email address (rather than personal ones).10 The Office of the Privacy Commissioner for Personal Data has also provided tips for users of AI chatbots such as ChatGPT on protecting personal data privacy.15
 
Liabilities of expert witnesses
Expert witnesses may be liable to professional disciplinary proceedings for professional misconduct3 and there may even be legal consequences.16 The Code of Conduct for Expert Witnesses makes it clear that, “Proceedings for contempt of court may be brought against a person if he makes, or causes to be made, a false declaration or a false statement in a document verified by a statement of truth without an honest belief in its truth”.4 Therefore, it is prudent for expert witnesses to bear the following rules in mind:
- learn the basics of what AI tools can and cannot do before using them;
- check and verify the accuracy and appropriateness of any information provided by an AI tool when it is used or relied on;
- be prepared to correct any errors and bias in the information generated by AI tools; and
- protect the data privacy of the parties.
 
Conclusions
As with any other information available on the internet, AI tools may be useful for finding material that one would recognise as correct but does not have to hand; they are, however, a poor way of conducting research to find new information that one cannot verify. Public AI chatbots do not necessarily provide accurate answers derived from authoritative databases. They generate new text using an algorithm based on the prompts they receive and the data on which they have been trained. Their output is what the model predicts to be the most likely combination of words (based on the documents and data it holds as source information), which is not necessarily the most accurate answer.10 Expert witnesses must be vigilant when they conduct research on legislation, codes of conduct, or practice guidelines issued by professional bodies and cite authorities from textbooks and published articles in their expert witness reports. Moreover, when they use AI tools as aids, they must check that the information provided is accurate and appropriate before it is used or relied upon. They must also ensure that confidentiality is maintained and that the personal data and privacy of the parties are protected.
 
Although this article is written with medical expert witnesses in mind, it applies equally to expert witnesses of other professions.
 
Author contributions
Both authors contributed to the editorial, approved the final version for publication, and take responsibility for its accuracy and integrity.
 
Conflicts of interest
Both authors have disclosed no conflicts of interest.
 
Funding/support
This editorial received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
 
References
1. Hong Kong Academy of Medicine Professionalism and Ethics Committee, Task Force on Laws for Healthcare Practitioners. Foreword. In: Best Practice Guidelines for Expert Witnesses. 2nd ed. Hong Kong: Hong Kong Academy of Medicine; 2025: 1.
2. R v Sally Clark [2003] EWCA Crim 1020.
3. The General Medical Council v Professor Sir Roy Meadow [2006] EWCA Civ 1390.
4. Cap 4A The Rules of the High Court, Appendix D: Code of conduct for expert witnesses. Available from: https://www.elegislation.gov.hk/hk/cap4A@2018-02-01T00:00:00. Accessed 4 Sep 2025.
5. Prattley Enterprises Limited v Vero Insurance New Zealand Limited [2016] NZCA 67 at paras 105 and 109.
6. The Law Society of Hong Kong. The impact of artificial intelligence on the legal profession. Position paper of the Law Society of Hong Kong. January 2024. Available from: https://www.hklawsoc.org.hk/-/media/HKLS/Home/News/2024/LSHK-Position-Paper_AI_EN.pdf?rev=77bf900208614367b9cbb15fd10aaa58. Accessed 4 Sep 2025.
7. Solicitors Regulation Authority, United Kingdom. Risk Outlook report: the use of artificial intelligence in the legal market. 20 November 2023. Available from: https://www.sra.org.uk/sra/research-publications/artificial-intelligence-legal-market/. Accessed 4 Sep 2025.
8. Farah H. Court of appeal judge praises ‘jolly useful’ ChatGPT after asking it for legal summary. The Guardian. 15 September 2023. Available from: https://www.theguardian.com/technology/2023/sep/15/court-of-appeal-judge-praises-jolly-useful-chatgpt-after-asking-it-for-legal-summary. Accessed 4 Sep 2025.
9. Courts and Tribunals Judiciary, United Kingdom. Artificial intelligence (AI): guidance for judicial office holders. 12 December 2023. Available from: https://www.judiciary.uk/wp-content/uploads/2023/12/AI-Judicial-Guidance.pdf. Accessed 4 Sep 2025.
10. Courts and Tribunals Judiciary, United Kingdom. Artificial intelligence (AI): guidance for judicial office holders. Updated 14 April 2025. Available from: https://www.judiciary.uk/wp-content/uploads/2025/04/Refreshed-AI-Guidance-published-version.pdf. Accessed 4 Sep 2025.
11. Hong Kong Academy of Medicine. Event report of HKAM Workshop on Expert Witness Report Writing [unpublished internal document]. Hong Kong: Hong Kong Academy of Medicine; 2025.
12. Roberto Mata v Avianca, Inc 22-cv-1461 (PKC) and Felicity Harber v The Commissioners for His Majesty’s Revenue and Customs [2023] UKFTT 1007 (TC).
13. Kohls v Ellison, Case No. 24-cv-3754 (LMP/DLM).
14. McMullin B. The expert. Law Society Ireland Gazette. 28 February 2024. Available from: https://www.lawsociety.ie/gazette/in-depth/2024/february/the-expert/. Accessed 4 Sep 2025.
15. Office of the Privacy Commissioner for Personal Data, Hong Kong. 10 Tips for users of AI chatbots. September 2023. Available from: https://www.pcpd.org.hk/english/resources_centre/publications/files/ai_chatbot_leaflet.pdf. Accessed 4 Sep 2025.
16. Jones v Kaney [2011] UKSC 13.