Corporate Compliance Insights
AI Risk in 2026: 3 Critical Changes for the General Counsel

The discipline legal ops brings — technology evaluation, vendor management, compliance monitoring — maps to capabilities required for AI governance

by Jenny Hamilton
January 20, 2026
in Governance

When courts sanction lawyers for AI hallucinations, they hold counsel responsible regardless of which department selected the tool or how sophisticated the vendor's claims were. That is why the familiar approach of delegating technology decisions to IT or legal ops fails spectacularly with AI. Exterro's Jenny Hamilton explores how AI malpractice risk should give legal ops a renewed mandate, why the discipline it brings maps to the capabilities required for AI governance and how procurement conversations will shift from "Can this tool increase efficiency?" to "Can this tool withstand scrutiny if challenged?"

The legal profession faces a new category of risk that is accelerating faster than previous technology-mediated legal obligations: the use of AI for legal work. This shift places in-house counsel in unfamiliar territory, and it is starting to keep general counsel up at night. As a result, in 2026, general counsel (GCs) will begin to engage more deeply with their legal tech strategy, invigorate the legal ops function with greater authority and push their professionals to use purpose-built legal AI.

The traditional malpractice risk calculus for in-house counsel has been relatively benign compared with that for law firm attorneys, whose insurers have paid claims exceeding $50 million in the past two years. Likewise, court sanctions have been relatively rare for in-house counsel.

AI disrupts this dynamic because hallucinated legal advice heightens organizational liability, exposing companies to third-party claims, regulatory violations and transaction failures that cannot be resolved simply by terminating the in-house counsel who used AI without subject-matter expertise.

AI sanctions and malpractice risk arrive at an inopportune moment for in-house counsel. Legal departments face mounting pressure to reduce outside counsel spend while managing increasingly complex regulatory environments. Many in-house teams have turned to AI solutions to counteract law firm rate hikes, particularly for experienced regulatory counsel. However, using AI for legal work introduces risks of malpractice and sanctions that these tools were not designed to guard against. It also significantly increases reputational risk for attorneys who use AI tools without understanding their legal and technical capabilities.

In 2024, the American Bar Association (ABA) issued ethics guidance establishing that lawyers must have a reasonable understanding of AI's capabilities and limitations and must verify all AI-generated output. While the opinion stopped short of imposing strict liability, it reinforced the lawyer's duty to maintain technical competence, established by the ABA in 2012 in Rule 1.1, Comment 8.

Despite this, there are now more than 600 AI hallucination cases on record, implicating 128 lawyers and including attorneys from top-tier firms. In Johnson v. Dunn, a federal court in Alabama disqualified a Nashville law firm from the case, referred the attorneys to the state bar associations in all jurisdictions where they were licensed and required them to file a copy of the sanctions order in every pending case in which they were counsel of record.

While these cases focus on lawsuits managed by outside counsel, the judiciary is clear: The duty to use AI responsibly attaches to the attorney personally — not the tool, not the vendor. Failure to adhere to these ethical obligations will increase the risk of sanctions and has already done so. 


Prediction 1: GCs will get more engaged with technology

Driven by the risk of sanctions and malpractice, general counsel will engage at unprecedented levels in their legal technology sourcing strategies over the next year.

Historically, legal departments have delegated technology decisions to IT or legal ops, engaging more substantively when vendor contracts require review or compliance questions arise. This approach fails spectacularly with AI.

When courts sanction lawyers for AI hallucinations, they hold counsel responsible regardless of which department selected the tool or how sophisticated the vendor’s claims were. Consider the duty to preserve evidence, a substantive legal obligation that cannot be outsourced. Courts have consistently held that the obligation to preserve evidence runs first to counsel, who must then advise clients and monitor compliance. The duty to preserve has led to far more malpractice and sanctions cases than issues involving other legal technologies.

AI governance demands the same level of professional responsibility, and meeting it will require greater engagement from the top down.

Prediction 2: Legal operations will be reinvigorated

The second shift will be the reinvigoration of legal operations as a discipline. Legal ops emerged in the early 2000s when corporate law departments at companies like GE and Bank of America began applying business operations principles to legal service delivery. The 2008 financial crisis accelerated adoption as legal budgets tightened and CFOs demanded greater accountability for legal spend. By 2010, legal ops was firmly established at enough large enterprises that practitioners formed the Corporate Legal Operations Consortium (CLOC) to develop industry standards. Yet in many organizations, legal ops has struggled to secure consistent executive engagement, support and funding, often being relegated to tactical process improvement rather than strategic program management.

AI malpractice risk should provide legal ops with a renewed mandate to educate and protect in-house practitioners from misuse of AI technology. The discipline that legal ops brings, including technology evaluation, workflow standardization, vendor management, metrics and compliance monitoring, maps to the capabilities required for AI governance and will protect in-house teams from mistakes that cost their organizations.

GCs who recognize this will staff and fund legal ops teams to build more detailed guardrails in playbooks that support active AI governance programs, creating a natural growth path for a discipline that has long sought to demonstrate its value to the C-suite.

Prediction 3: Greater scrutiny of AI tools used for legal work

The third shift will be increased scrutiny over which AI tools are used for legal work and how they are used. Corporate legal departments have long relied on general-purpose technologies adopted from other functions, often prioritizing cost efficiency and ease of deployment. That approach is increasingly difficult to reconcile with the professional obligations imposed on lawyers using AI.

Courts and regulators have made clear that output verification, transparency and security controls matter more than a tool’s novelty or popularity. As a result, general counsel will begin to assess AI systems less on convenience and more on whether their design, training and governance mechanisms support defensible legal workflows.

This does not mean one category of technology will fit all organizations. It does mean procurement conversations will increasingly center on risk alignment: whether a given system enables lawyers to meet their ethical duties, maintain confidentiality, and explain how they reached their conclusions. The central question will shift from “Can this tool increase efficiency?” to “Can this tool withstand scrutiny if challenged?”

Applying lessons learned

These predictions are not speculative. They represent the natural evolution of law department behavior in response to significant shifts in technology and risk.

As AI for legal work gains popularity, 2026 will be an inflection point for GCs. Those who recognize the risk to their people and reputation will respond with greater engagement and investment in education, operations and technology to manage exposure before it impacts the organization at scale.


Tags: Artificial Intelligence (AI)

Jenny Hamilton

Jenny Hamilton is chief legal officer of Exterro, a data management provider. She formerly was deputy general counsel at HaystackID and held roles in commercial litigation at multiple firms.


© 2026 Corporate Compliance Insights
