Daily Jus

Use of AI in Arbitration – Commentary on the CIArb Guidelines

22 July 2025
in Arbitration, Arbitration for In-House Counsel, Commercial Arbitration, Investor-State Arbitration, Legal Insights, Legal Tech & AI, London VYAP, World, Worldwide Perspectives

THE AUTHOR:
Verity Anne Thomson, Arbitration and Litigation Lawyer


In March 2025, the Chartered Institute of Arbitrators (“CIArb”) published the Guideline on the Use of AI in Arbitration (2025) (the “Guideline”). The Guideline aims to provide practical guidance for arbitrators, parties, and their representatives on the responsible, effective, and legally sound use of AI tools in arbitration proceedings. The Guideline seeks to maximize the benefits of AI—such as efficiency and quality—while mitigating risks to procedural fairness, confidentiality, enforceability, and the integrity of the arbitral process. The document is not a technical manual; rather, it is a practical framework to support informed decision-making and risk management in the evolving landscape of AI and its use in arbitration.

Key Takeaways for the Safe and Effective Use of AI

Two key messages for arbitration practitioners emerge from the Guideline: first, that practitioners must understand AI tools before using them; and second, that they remain accountable for their work when using AI (see paras. 2.4, 3.4, and 8.4 of the Guideline). Before using an AI tool, arbitration practitioners should ensure that they properly investigate and understand the tool, the model it uses, the potential biases that may be embedded in its outputs, and the confidentiality risks that may arise from the input of confidential data (see paras. 3.1-3.2 of the Guideline). They should then weigh efficiency and cost savings against potential threats to due process, confidentiality, and enforceability. Finally, the ultimate responsibility for any document or other work product created with AI remains with the user, not the AI tool; arbitration practitioners should therefore verify AI outputs before relying on them and never delegate decision-making to AI tools.

How Arbitration Practitioners May Maximize the Benefits of AI tools

CIArb identified several benefits to using AI in arbitration in the Guideline, which are summarised below.

  • Legal Research: Employ AI-powered search and predictive models to improve research accuracy and speed (Guideline, para 1.2).
  • Data Analysis: Utilize AI for pattern recognition, detecting inconsistencies, and analysing large datasets (Guideline, para 1.3).
  • Text Generation: Leverage AI for drafting, summarizing, and formatting documents (Guideline, para 1.4).
  • Collecting Evidence: Use AI to streamline the taking of evidence (Guideline, para 1.5).
  • Translation/Interpretation: Apply AI for document translation and real-time interpretation in multilingual proceedings (Guideline, para 1.6).
  • Transcription: Generate hearing transcripts using AI at reduced cost (Guideline, para 1.7).
  • AI Detection: Use AI tools to verify the authenticity of evidence and detect deep fakes (Guideline, para 1.8).
  • Case Analysis: Employ AI for outcome prediction and procedural strategy insights (Guideline, para 1.9).
  • Equal Treatment: AI tools can be used by under-resourced parties to help remedy “inequality of arms” (Guideline, para 1.10).

Some of the benefits of AI that CIArb identifies appear more realistic than others.

Using AI tools to streamline evidence review, data analysis, and legal research is already the norm for many legal practitioners; it has the benefit of increasing both the efficiency and quality of the process. Many leading legal platforms, such as Relativity, Westlaw and Jus Mundi (Jus AI), have already integrated AI into their offerings.

The use of AI for text generation, translation, and transcription is likely to become more widespread in the coming years. At present, however, these AI offerings are not accurate enough to be relied on without careful human review, particularly in the context of an arbitration, where accuracy is vital.

While AI-powered tools exist to verify the authenticity of documents and detect ‘deep fakes’, their technology is still evolving. Their effectiveness varies, and arbitration practitioners should use them as supplementary aids rather than definitive proof.

The hope that AI tools will level the playing field for under-resourced parties seems unlikely to be realized. As with any new technology, better-resourced parties are likely to find ways to maximize its benefits that under-resourced parties cannot match. Furthermore, AI tools that cater to legal professionals and provide the necessary privacy protections are often costly, putting them beyond the reach of under-resourced parties.

Risks Associated with the Use of AI in Arbitration

The risks associated with the use of AI in arbitration are numerous. Courts around the world have already issued rulings relating to the use of AI in litigation (e.g., Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank). Simply put, while AI tools can be used to increase the efficiency and quality of outputs, they cannot replace the need for human oversight.

The risks associated with AI use outlined in the Guideline are:

  • Confidentiality: Arbitration practitioners are encouraged to understand and carefully vet AI tools for how data is used and stored by the AI tool (Guideline, para 2.2).
  • Data Security: Be vigilant about cybersecurity risks, especially when using third-party AI platforms (Guideline, para 2.3).
  • Impartiality & Independence: Arbitrators must maintain full control and accountability for decision-making and avoid excessive dependence on AI tools (Guideline, para 2.4).
  • Due Process: Ensure AI use does not compromise parties’ ability to present their case or the arbitrators’ obligation to consider all arguments (Guideline, para 2.5).
  • “Black Box” Problem: Be cautious with AI tools whose decision-making processes are opaque; avoid using outputs that cannot be independently verified (Guideline, para 2.6).
  • Enforceability: Ensure AI use complies with all applicable laws, regulations, and institutional rules to avoid jeopardizing award enforceability (Guideline, para 2.7).
  • Environmental Impact: Consider the energy consumption and environmental footprint of AI tools (Guideline, para 2.8).

To mitigate the risks identified in the Guideline, arbitration practitioners are advised to educate themselves on how AI tools work and create appropriate vetting processes before using them.

Recommendations on the Use of AI in Arbitration

Underlying all CIArb’s recommendations is the principle that participants in arbitration remain responsible and accountable in the same way as if the AI tool were not used. While AI tools can be useful, they are in no way a replacement for human decision-making and judgment.

The Guideline encourages both parties and arbitrators to:

  • Make reasonable enquiries about any prospective AI Tool and any AI-related law or regulation in the relevant jurisdictions (Guideline, para 3.1); and
  • Seek to understand the potential risks associated with any prospective use of AI and weigh those risks against the benefits (Guideline, para 3.2).

With respect to the parties’ use of AI, the Guideline provides that:

  • Arbitrator’s Authority: Arbitrators can direct and regulate parties’ use of AI in proceedings, unless restricted by party agreement or mandatory rules. They may appoint AI experts for technical guidance (Guideline, para 4.1).
  • Disclosure: Arbitrators can require parties to disclose their use of AI tools, except for private uses that do not affect the process. Disclosure ensures transparency and protects the integrity and enforceability of the arbitration (Guideline, para 4.4).
  • Procedural Orders and Non-Compliance: Decisions regarding AI use should be documented in procedural orders and can be revisited as needed. If parties fail to comply or disclose as required, arbitrators may take remedial actions, draw adverse inferences, or adjust costs (Guideline, para. 4.6).
  • Party Autonomy: Parties may agree on how AI is used in the arbitration, but such agreements are subject to overriding laws and rules. Arbitrators should clarify or invite discussion if the arbitration agreement is silent on AI (Guideline, paras 5.1-5.4).
  • Rulings and Admissibility: Arbitrators may rule on disputed AI use, considering both benefits and risks (e.g., fairness, confidentiality, bias). They must ensure all rulings comply with applicable laws and ethical standards (Guideline, paras 6.1-6.8).

The Guideline rightly places significant responsibility on arbitrators to control the arbitration process, acknowledging that unregulated AI use could compromise procedural fairness and the credibility of outcomes. While party autonomy, a core pillar of arbitration, is respected, the emphasis on understanding the AI tool and disclosing its use signals a necessary caution against over-reliance on opaque or biased AI tools in arbitration proceedings.

CIArb also provided guidance aimed at arbitrators:

  • Decision-Making: Arbitrators may use AI tools to assist with efficiency and quality, but must not delegate decision-making to them (Guideline, paras 8.1-8.2).
  • Verification: Arbitrators should independently verify all AI-generated information before relying on it (Guideline, para 8.3).
  • Responsibility: Arbitrators remain fully responsible for all decisions and the content of awards (Guideline, para 8.4).
  • Transparency: Arbitrators should consult with parties before using AI tools and refrain from their use if parties object (Guideline, para 9.1).

Arbitrators should be cautious when using AI tools to carry out their duties. Unscrupulous use of AI by arbitrators could undermine the credibility of arbitration outcomes. It is also clear that arbitrators will need to become familiar with AI tools in order to make informed decisions about their use, both by the parties and by themselves.

Conclusion

The CIArb AI Guideline offers a balanced and practical framework for integrating AI tools into arbitration, promoting transparency, accountability, and adherence to legal and ethical standards. By outlining the risks for both parties and arbitrators in using AI tools and sharing best practices to mitigate those risks, the Guideline helps ensure that AI enhances rather than compromises the fairness and integrity of arbitral proceedings.


ABOUT THE AUTHOR

Verity Anne Thomson is a Canadian lawyer, specialising in arbitration and litigation. She completed a JD and BCL from McGill University’s Faculty of Law, where she was awarded the Dawson A. McDonald, Q.C. Memorial Prize and the Philip Meyerovitch, Q.C. Prize. She then clerked for now Chief Justice de Montigny of Canada’s Federal Court of Appeal before moving to London to focus on international dispute resolution. She practises both arbitration and litigation with a focus on maritime and trade disputes. She will shortly be joining Kennedys.


*The views and opinions expressed by authors are theirs and do not necessarily reflect those of their organizations, employers, or Daily Jus, Jus Mundi, or Jus Connect.
