Confidentiality by Design: Should Confidentiality Protections be embedded into AI Systems in International Arbitration?

7 July 2025
in Arbitration, Commercial Arbitration, Investor-State Arbitration, Legal Insights, Legal Tech & AI, World

THE AUTHORS:
Alexander Foerster, Independent Arbitrator & Sole Practitioner at AdvokatFoerster AB
Mark Malekela, Government Liaison at Alistair Group


As Artificial Intelligence (“AI”) continues its march into the fabric of international commercial arbitration, its promise is clear: greater efficiency, deeper insights, and faster outcomes. Yet a quieter concern underlies this technological revolution: how do we preserve confidentiality as one of arbitration’s most fundamental virtues?

To answer this, we must look not just at arbitration rules, regulations, guidelines, and contracts, but at technology itself. What if confidentiality were not merely something to enforce or hope for, but something built in?

This is the premise of “Confidentiality by Design” (“CbD”), a concept proposed in Malekela’s research article to ensure that AI systems used in arbitration embed confidentiality protections directly into their architecture, mirroring the principle of “Privacy by Design” (“PbD”) under the EU General Data Protection Regulation (EU) 2016/679 (“GDPR”).

Why Confidentiality Still Matters in a ‘Big Data’ Arbitration Era

Confidentiality remains one of the primary reasons parties choose arbitration over litigation. Arbitration allows disputes to be resolved with little or no public disclosure of sensitive financial, technical, or strategic information. However, the rise of AI-powered tools, from predictive analytics to document review and decision-support platforms, poses novel risks to this foundational principle of international arbitration.

AI thrives on data. But in an international arbitration context, every byte of input, whether submissions, written witness statements, other documentary evidence, internal memos, or correspondence, represents a potential point of exposure. International businesses already struggle to store, process, share, and analyze their data securely in this “big data” era. Parties, counsel, arbitrators, and arbitral institutions all run a higher risk of disclosing information in international arbitrations, either needlessly or in violation of confidentiality obligations derived from the parties’ agreements, applicable procedural rules, or applicable national laws.

Without proper confidentiality safeguards, data processed by AI tools may be subject to unauthorized access, inadvertent leaks, or even misuse by third-party providers.

From Privacy by Design to Confidentiality by Design: A Conceptual Framework

In addressing the confidentiality challenges posed by AI, this article supports the proposition that the concept of CbD should be explored by the arbitration community. The concept mirrors the PbD principle codified under Article 25 of the GDPR, which mandates “data protection by design and by default” and requires organizations to embed privacy features into the development and operation of data-processing systems from the outset.

CbD, analogously, suggests that AI systems developed for use in arbitration must come equipped with technical and procedural safeguards that prioritize confidentiality protection at every level, from codebase to user interface. Its key features, some of which are sketched in code after the list below, could include:

  • End-to-end encryption for data in transit and at rest;
  • Role-based access controls to limit exposure;
  • Secure, in-house storage over cloud reliance where feasible;
  • Audit trails and tamper-proof logs; and
  • Data anonymization and pseudonymization protocols.
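
To make these features more concrete, the following is a minimal sketch in Python of how role-based access controls, pseudonymization, and a tamper-evident audit trail might be combined in an arbitration document store. All class and function names here are hypothetical and purely illustrative; they do not describe any existing arbitration platform.

```python
import hashlib
import hmac
import secrets
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical illustration of CbD safeguards: role-based access,
# pseudonymization of party names, and a hash-chained audit trail.

ROLE_PERMISSIONS = {
    "arbitrator": {"read"},
    "counsel": {"read", "upload"},
    "institution_admin": {"read", "upload", "delete"},
}

@dataclass
class AuditTrail:
    """Append-only log in which each entry hashes the previous one,
    so later tampering with any record is detectable."""
    entries: list = field(default_factory=list)
    _last_hash: str = "genesis"

    def record(self, actor: str, action: str, document_id: str) -> None:
        timestamp = datetime.now(timezone.utc).isoformat()
        payload = f"{self._last_hash}|{actor}|{action}|{document_id}|{timestamp}"
        entry_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"payload": payload, "hash": entry_hash})
        self._last_hash = entry_hash

class ConfidentialCaseStore:
    """Toy document store that enforces role checks and pseudonymizes
    party names before anything is persisted or passed to an AI tool."""

    def __init__(self) -> None:
        self._docs: dict[str, str] = {}
        # Pseudonymization key stays in-house and is never shared with vendors.
        self._pseudonym_key = secrets.token_bytes(32)
        self.audit = AuditTrail()

    def pseudonymize(self, party_name: str) -> str:
        # Keyed hash: stable pseudonym per party, irreversible without the key.
        digest = hmac.new(self._pseudonym_key, party_name.encode(), hashlib.sha256).hexdigest()
        return f"Party-{digest[:8]}"

    def upload(self, actor: str, role: str, document_id: str,
               text: str, parties: list[str]) -> None:
        if "upload" not in ROLE_PERMISSIONS.get(role, set()):
            raise PermissionError(f"{role} may not upload documents")
        for name in parties:
            text = text.replace(name, self.pseudonymize(name))
        self._docs[document_id] = text
        self.audit.record(actor, "upload", document_id)

    def read(self, actor: str, role: str, document_id: str) -> str:
        if "read" not in ROLE_PERMISSIONS.get(role, set()):
            raise PermissionError(f"{role} may not read documents")
        self.audit.record(actor, "read", document_id)
        return self._docs[document_id]

# Example use: counsel uploads a submission; party names never reach storage in clear text.
store = ConfidentialCaseStore()
store.upload("counsel@firm", "counsel", "C-123/submission-1",
             "Claimant Acme Ltd alleges breach by Beta GmbH.", ["Acme Ltd", "Beta GmbH"])
print(store.read("arbitrator@tribunal", "arbitrator", "C-123/submission-1"))
```

The design point worth noting is that pseudonymization happens before any text is stored or handed to an AI tool, and the pseudonymization key never leaves the institution’s own infrastructure.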

From Theory to Practice: Jus AI as a Case Study

One might wonder whether CbD can be practically implemented. A promising example is Jus AI, the AI-powered tool recently launched by Jus Mundi, tailored to help practitioners navigate, interpret, and work with complex arbitral awards and legal texts. Far more than a document parsing or summarization tool, Jus AI offers a range of capabilities: translation, drafting, deep document analysis, multilingual search, and citation-backed natural language responses.

According to publicly available information, Jus AI employs robust encryption protocols and strict internal access controls to protect sensitive data. In December 2024, Jus Mundi was certified under ISO 27001. These measures reflect a mature, deeply embedded security architecture that demonstrates Jus Mundi’s commitment to confidentiality, a recognition of the risks AI poses, and a willingness to mitigate them proactively.
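
By way of generic illustration only, and not as a description of Jus AI’s actual implementation (which is not publicly documented at this level of detail), encryption of case data at rest can be as simple as the following sketch using the Python cryptography library’s Fernet recipe:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generic illustration only: symmetric encryption of an arbitration document
# at rest. In practice the key would live in a hardware security module or a
# dedicated key-management service, never alongside the ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

document = "Respondent's internal memo on settlement strategy".encode()
ciphertext = cipher.encrypt(document)   # what gets written to disk or the database
plaintext = cipher.decrypt(ciphertext)  # decrypted only inside the trusted boundary

assert plaintext == document
```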

Can Confidentiality by Design Be Realized?

Realizing CbD in arbitration may be achieved through direct negotiations between arbitral institutions and AI service providers over how the latter protect arbitration-related data. However, it also faces several significant challenges.

Technically, it is challenging to design AI systems that are both highly intelligent and secure, as this requires striking a balance between innovation and necessary constraints. Most current AI tools are not tailored for sector-specific compliance, such as the strict confidentiality requirements in arbitration. Additionally, there is a notable vendor asymmetry where arbitration institutions and law firms often depend on third-party technology providers whose business models prioritize scalability over bespoke confidentiality solutions. However, this imbalance could be mitigated if arbitral institutions leverage their collective bargaining power, either by forming consortia or acting through umbrella organizations such as the International Federation of Commercial Arbitration Institutions (IFCAI) to negotiate for technology solutions that better align with arbitration-specific confidentiality requirements. Compounding this issue is the absence of regulatory pressure. Unlike privacy law, where instruments like the GDPR exert binding force, arbitration lacks a global regulatory authority to enforce comparable confidentiality standards.

Finally, implementing confidentiality-focused technological features entails significant development costs and technical complexity. As individual arbitrators, often acting as sole practitioners, typically lack the financial and institutional capacity to support such investments, the responsibility must fall on arbitral institutions. Only these institutions possess the structural resources and long-term incentives to develop and fund robust confidentiality safeguards within the digital systems they administer.

Despite these hurdles, CbD can evolve from a novel theory into an industry norm, but only with coordinated action. Arbitral institutions must go beyond merely issuing soft-law guidelines or best practices. They should take an active role in driving the development of technological tools specifically designed to meet the unique confidentiality and procedural needs of arbitration. For example, institutions such as the ICC International Court of Arbitration and the SCC Arbitration Institute have already begun developing secure digital platforms, such as ICC Case Connect and the SCC Platform, which aim to streamline proceedings while safeguarding sensitive information. These efforts illustrate how institutions can leverage their position and resources to establish purpose-built, secure environments for arbitration. Broader institutional investment in such tailored AI tools is essential to ensure that confidentiality is preserved in the digital age. Stakeholder coalitions, comprising law firms and other entities, can also negotiate collective standards with tech providers. Regulators and policymakers, for their part, could explore extending existing data protection frameworks to the application of AI in arbitration.

As noted in the white paper “Checking the Boxes: Confidentiality and Data Protection in International Arbitration” (2024, Kluwer Arbitration), the fast-evolving legal-tech landscape requires proactive rather than reactive strategies.

Recent Developments: CIArb, SVAMC Guidelines, and the EU AI Act

Recent policy developments underscore the growing urgency to address the intersection of AI and confidentiality in arbitration. Notably, CIArb (Chartered Institute of Arbitrators) released its 2025 Guidelines on the Use of Artificial Intelligence in International Arbitration, providing a much-needed framework for the ethical and effective integration of AI tools. These guidelines emphasize not only the need for transparency and informed consent but also explicitly reference confidentiality. While they do not yet mandate CbD per se, they advocate for AI systems to be evaluated for their impact on the confidentiality of proceedings and documents, including when third-party platforms are used. This endorsement of proactive confidentiality measures marks a step toward operationalizing CbD.

Similarly, the SVAMC (Silicon Valley Arbitration & Mediation Center) Guidelines on the Use of AI in Arbitration (2024) address confidentiality risks, particularly in relation to data handling and the reliance on third-party AI vendors. The SVAMC Guidelines call for parties and institutions to ensure that AI tools used in arbitration comply with applicable confidentiality standards. In addition to the CIArb and SVAMC guidelines, the SCC Guide to the Use of Artificial Intelligence in Cases Administered Under the SCC Rules (2024) also emphasizes both transparency and security in the use of AI technologies, recognizing the growing complexity of managing sensitive data across borders and institutional settings. While not prescribing specific technological mandates, the SCC urges arbitral institutions and practitioners to adopt AI solutions that align with confidentiality obligations enshrined in institutional rules and party agreements. This aligns well with the concept of CbD, particularly where the guidelines recommend early risk assessments and ongoing monitoring of AI tools used during arbitration. These proactive measures reinforce the argument that confidentiality should be architecturally embedded, not merely procedurally enforced, throughout the digital arbitration lifecycle.

These guidelines, although not binding, can serve as benchmarks for integrating CbD principles into practice.

These professional frameworks are emerging alongside the broader regulatory environment taking shape in Europe, most notably the EU’s Artificial Intelligence Act (“EU AI Act”), which introduces tiered obligations based on the risk level of AI applications. Although the Act primarily governs the safety, transparency, and accountability of AI systems, it indirectly supports the CbD concept by requiring high-risk AI systems to incorporate robust data governance, record-keeping, and cybersecurity measures. Given that many AI tools used in arbitration may fall under high-risk categories (particularly in the legal domain), the EU AI Act provides an additional regulatory basis for embedding confidentiality safeguards at the design stage.

These recent developments confirm that the arbitration community recognizes confidentiality as a non-negotiable design feature of AI tools. The convergence of institutional guidelines and regulatory mandates presents a timely opportunity for the arbitration field to embrace CbD, not just as a theoretical construct, but as a concrete expectation in the deployment of AI technologies.

Conclusion

In arbitration, confidentiality is more than a preference; it is a pillar of the parties’ trust. As AI becomes a ubiquitous part of arbitral proceedings, that trust must be engineered into the technologies themselves.

CbD presents an opportunity for the arbitration community to take the lead, not just in adopting AI, but in shaping how that AI is used. Just as privacy engineers helped birth the GDPR’s PbD, arbitration professionals must now step into the arena.

If arbitral institutions and other powerful stakeholders fail to act, we risk losing not just data but the very integrity of arbitration.


ABOUT THE AUTHORS

Alexander Foerster is an independent arbitrator and sole practitioner at AdvokatFoerster AB. After over 25 years as a partner at Sweden’s leading law firm, Mannheimer Swartling Advokatbyrå, and a member of its prominent dispute resolution practice group, Alexander transitioned in 2022 to a career as an independent arbitrator and expert. He is qualified as both a Swedish Advokat and a German Rechtsanwalt. Over the past 20 years, Alexander Foerster’s focus has been exclusively on international arbitration and cross-border litigation. He has participated in numerous arbitrations under the SCC, DIS, ICC, VIAC, ICSID, and Swiss Rules, as well as ad hoc proceedings under Swedish and German law. Several of these cases involved complex issues of public international law. Alexander is also a teaching fellow at the Frankfurt University of Applied Sciences, where he teaches international arbitration.

Mark Malekela is a Government Liaison at Alistair Group and a Committee Member of TIArb’s Young Members Club (“TIArb-YMC”). He holds an LL.B. from Mzumbe University and an LL.M. in International Commercial Arbitration Law from Stockholm University, where his thesis explored AI’s integration in international commercial arbitration and its implications for confidentiality. Mark began his legal career at Breakthrough Attorneys in Dar es Salaam, where he developed a solid foundation in corporate commercial law. He then trained and worked in the international arbitration practices of De Brauw Blackstone Westbroek N.V. in Amsterdam and Clyde & Co, focusing on, among other areas, corporate-commercial law and regulatory compliance.


*The views and opinions expressed by authors are theirs and do not necessarily reflect those of their organizations, employers, or Daily Jus, Jus Mundi, or Jus Connect.
