Artificial Intelligence (AI) is disrupting almost every industry and profession, some faster and more profoundly than others. Unlike the industrial revolution, which automated physical labor and replaced muscles with hydraulic pistons and diesel engines, the AI-powered revolution is automating mental tasks.
In this way, AI is having a profound effect on the practice of law. Though AI is more likely to aid than replace attorneys in the near term, it is already being used to review contracts, find relevant documents in the discovery process, and conduct legal research. More recently, AI has also begun to be used to help draft contracts and to predict legal outcomes.
The potential benefits of AI in the law are real. It can increase a lawyer's productivity and avoid costly mistakes. In some cases, it can also grease the wheels of justice by speeding up research and decision-making.
Lawyers are already using AI, and especially Machine Learning (ML), to review contracts more quickly and consistently, spotting issues and errors that human reviewers might miss. Startups like Lawgeex provide a service that can review contracts faster, and in some cases more accurately, than humans.
For some time, algorithms have been used in discovery, the legal process for identifying relevant documents held by an opponent in a lawsuit. Now ML is being applied to this task as well. Companies like CS Disco provide AI-powered discovery services to law firms across the US, ranking and filtering documents by their likely relevance to a request, as sketched below.
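For readers curious what relevance ranking looks like under the hood, here is a minimal sketch in Python. The documents, the discovery request, and the scoring approach are all hypothetical illustrations; commercial e-discovery platforms such as CS Disco use far more sophisticated proprietary models, but the underlying idea of scoring documents against a request is similar.

```python
# Minimal sketch: ranking documents by relevance to a discovery request.
# The documents and request below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Email discussing the Q3 supply agreement and late deliveries.",
    "Lunch menu for the annual company picnic.",
    "Memo on indemnification terms in the supply agreement.",
]
request = "communications about the supply agreement"

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)
request_vector = vectorizer.transform([request])

# Score each document against the request and list them, most relevant first.
scores = cosine_similarity(request_vector, doc_vectors).ravel()
for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.2f}  {doc}")
```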
Another area where AI is already used extensively in the practice of law is legal research. Practicing lawyers may not even be aware they are using AI here, since it has been seamlessly woven into many research services. One such service is Westlaw Edge, launched by Thomson Reuters. Another example is LawPavilion's AI-powered Legal Analytics, which uses AI to surface the ratio decidendi of reported cases.
AI can generate content as well as analyze it. Unlike AI used to power self-driving cars, where mistakes can have fatal consequences, generative AI does not have to be perfect every time. In fact, the unexpected and unusual artifacts associated with AI-created works are part of what makes it interesting. AI approaches the creative process in a fundamentally different way than humans do, so the path taken or the end result can sometimes be surprising. This aspect of AI is called "emergent behavior." Emergent behavior may lead to new strategies for winning games, new drug candidates, or simply novel ways of expressing ideas. In the case of written content, human authors are still needed to manage the creative process, selecting which of the many AI-generated phrases or versions to use.
Much of this is possible due to new algorithms and enormous AI models. GPT-3, created by OpenAI, is one such model. GPT-3 is a generative model that predicts the next token in a sequence of text. It is a transformer, meaning it takes a sequence of data in context, like a sentence, and focuses attention on the most relevant portions to extend the work in a way that seems natural, expected and harmonious. What makes GPT-3 unusual is that it is a pre-trained model, and it is huge: roughly 175 billion parameters, trained on hundreds of billions of words of text.
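To make next-token generation concrete, here is a minimal sketch using GPT-2, a smaller, openly available predecessor of GPT-3 (GPT-3 itself is only reachable through OpenAI's API). The prompt is a hypothetical contract fragment chosen for illustration.

```python
# Minimal sketch of next-token text generation with a pre-trained
# transformer (GPT-2) via the Hugging Face transformers library.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "This Agreement shall be governed by the laws of"

# The model repeatedly predicts a plausible next token to extend the
# prompt; sampling makes the continuation vary from run to run.
result = generator(prompt, max_new_tokens=20, do_sample=True)
print(result[0]["generated_text"])
```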
This approach has already been used in creative writing and journalism, and there are now many generative text tools in that area, some built on GPT-3. With a short prompt, an AI writer can create a story, article or report, but don't expect perfection. Sometimes the tool wanders into random topics or ideas, and because AI lacks human experience, its output may contain factual inaccuracies or strange references.
For AI to draft legal contracts, for example, it needs to be trained to do the work of a competent lawyer. That requires the creator of the AI to collect data on how various versions of contract language have performed, a process called "labeling," illustrated in the sketch below. This labeled data is then used to train the AI to generate a good contract. However, the legal performance of a contract is often context-specific, not to mention that it varies by jurisdiction and with an ever-changing body of law. Moreover, most contracts are never tested in a courtroom, so their provisions remain untested and private to the parties. Generative AI systems trained on contracts risk amplifying bad legal work as much as good. For these reasons, it is unclear how AI contract writers can get much better any time soon. These tools simply lack the domain expertise and precision in language to be left to work independently. While they may be useful for drafting language, human professionals are still needed to review the output before it is used.
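The "labeling" step described above can be pictured as a small supervised-learning loop. The clauses, labels, and model below are invented for illustration; a real system would need thousands of professionally reviewed examples and careful legal validation.

```python
# Minimal sketch of training on labeled contract language.
# Clauses and labels are hypothetical examples only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

clauses = [
    "Either party may terminate this agreement with 30 days written notice.",
    "The supplier's liability is unlimited for any and all damages.",
    "Confidential information shall be protected for a period of five years.",
    "The customer waives all rights to any remedy under this agreement.",
]
labels = ["acceptable", "risky", "acceptable", "risky"]  # the "labeling" step

# Train a simple text classifier on the labeled clauses.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(clauses, labels)

# Flag a new, unseen clause for human review.
print(model.predict(["Liability is capped at the fees paid in the prior year."]))
```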
Another novel use of AI is predicting legal outcomes. Accurately assessing the likelihood of success in a lawsuit can be very valuable. It helps an attorney decide whether to take a case on contingency, how much to invest in expert witnesses, or whether to advise a client to settle. Companies such as Lex Machina use machine learning and predictive analytics to draw insights about individual judges and lawyers, as well as the case itself, in order to predict behaviors and outcomes.
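In outline, such predictions come from models fit to structured features of past cases. The features, figures, and model below are hypothetical; the actual models behind platforms like Lex Machina are proprietary and rest on far larger datasets.

```python
# Minimal sketch of outcome prediction from structured case features.
# All data here is invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [judge's historical plaintiff win rate, damages claimed ($M),
#            prior similar filings by plaintiff's counsel]
X = np.array([
    [0.62, 5.0, 12],
    [0.35, 0.8,  2],
    [0.71, 9.5, 20],
    [0.40, 1.2,  4],
    [0.58, 3.3,  9],
    [0.30, 0.5,  1],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = plaintiff prevailed

model = LogisticRegression().fit(X, y)

# Estimate the chance of success for a new, hypothetical case.
new_case = np.array([[0.55, 2.0, 7]])
print(f"Estimated probability of success: {model.predict_proba(new_case)[0, 1]:.0%}")
```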
First published on Mondaq