Time saver: law firms experimenting with generative AI hope its ability to automate tasks will increase productivity © Getty Images

Most law firms are either experimenting with generative artificial intelligence or say they plan to use it. They hope the technology will cut costs and increase productivity by automating everyday tasks, such as reviewing and drafting contracts and providing initial legal opinions.

Yet, despite much excitement about generative AI — which can quickly create humanlike text, images and other content in response to simple prompts — firms are also mindful of its limitations.

Australian law firm Gilbert + Tobin has introduced a generative AI question and answer function, providing its lawyers with specialist knowledge across various practice areas, including guidance on contracts.

To encourage its lawyers to engage with generative AI, the firm also offered a “bounty” of A$20,000 last year for ideas on how to use the technology. The initiative attracted more than 100 suggested uses, says Caryn Sandler, partner and chief knowledge and innovation officer at the firm. Across all levels of the organisation, there is “genuine interest in generative AI . . . for help with drafting, with communication and synthesising data”, she reports.

Gilbert + Tobin, which has run generative AI “masterclasses” for clients, is developing prototype tools based on winning ideas from its brainstorming initiative. The firm plans to build some of the tools itself and buy the rest from external software suppliers, says Sandler.

The risks of generative AI have been well documented, including so-called hallucinations, where the technology fabricates information and presents it as if it were fact. There have even been cases where lawyers have relied on flawed AI-generated text for court submissions.

Sandler says the firm has built “guardrails”, which include keeping client data secure and giving the AI access only to information produced or vetted by the firm.

MinterEllison, another Australian law firm, has developed a generative AI tool that can produce, within a minute, a first draft of legal advice that is about 80 per cent accurate, drawing on advice the firm has given in past cases as well as other legal resources. The tool — based on OpenAI’s generative AI chatbots — was developed by MinterEllison and IT company Arinco, using software including a chatbot from Microsoft.

“We’re rolling [the tool] out to the whole firm and the goal is . . . within 12 months, for 80 per cent of the firm to be using it,” says Simon Ball, a partner in MinterEllison’s environment and planning team, who led the project. The system is “incredibly efficient”, he says, providing support on tasks that include summarising case law and draft advice.

Automating such tasks will free junior lawyers to “operate higher up the value chain” and spend more time solving clients’ problems, Ball adds.

To minimise the risk of errors, the tool shows the information sources its output is based on. A senior lawyer will also check its output before sending anything to a client, Ball says.

As with any IT system, generative AI will produce useful output only if the information it is fed is accurate and relevant. Members of Ball’s team are therefore responsible for keeping this information updated.

Like most law firms, MinterEllison has a traditional time-based pricing model, but Ball says that, even if the technology increases lawyers’ productivity, the firm will need more evidence of generative AI’s impact on client work before making any changes to pricing.

Some law firms are also looking to sell their AI technology to corporate clients’ own legal departments. A&O Shearman is among the first to try this. Its AI contract negotiation tool, ContractMatrix, is being rolled out to clients in an attempt to drive new revenues, attract more business, and save time for in-house lawyers. The firm has estimated that the tool could save seven hours per contract during negotiations.

The technology, developed in partnership with Microsoft and legal AI start-up Harvey, draws on existing templates for contracts, such as non-disclosure agreements and merger and acquisition terms, to draft new contracts that lawyers can then amend or accept.

“We try to remove all the glitz and glamour and focus on the fundamental things,” says David Wakeling, a partner and head of A&O Shearman’s markets innovation group, which developed ContractMatrix. This involves assessing “where is it not good and where is it good?”, he says. “It does sound a bit boring, but a 20 per cent productivity gain [an estimate based on internal and client feedback] is pretty serious for business.”

For others, however, finding high-quality generative AI can be difficult. Pete Zhang, a partner at JunHe, says the Chinese law firm has been testing various AI products.

China’s generative AI software is useful for some tasks but is not yet of sufficient quality to draft legal documents, he says. “Currently, we just use legal AI to do the research, the legal analysis, the [language] translation of emails . . . and also sometimes summarise legal documents.” 

Zhang is impressed by the quality of some foreign tools for the legal sector, but says China’s data privacy rules, including restrictions on data transfer across borders, can complicate their use at Chinese firms.

But, in spite of regulatory issues and concerns over generative AI’s reliability, some law firms say it will be integral to their future operations.

Sandler at Gilbert + Tobin sums it up for many when she says generative AI “presents an enormous opportunity for our practice and for all law firms . . . and enormous risks. It’s moving at such an incredible pace that if you don’t start you can get left behind.”

Copyright The Financial Times Limited 2024. All rights reserved.