Using AI in group actions - the beginning of a boom?

The vast potential impact of artificial intelligence (AI) is a red-hot topic in the legal industry, from the shape of its potential regulation to the ways in which it may change how legal services are provided.

This article first appeared in the August 2023 issue of PLC Magazine.

As the Master of the Rolls, Sir Geoffrey Vos, recently said, AI “is likely to transform the work that lawyers need to do and possibly even, in the slightly longer term, the business of judging”.

Another area of significant attention is the rise in group actions in England and Wales. However, despite having multiple group action regimes, including a nascent “opt-out” regime before the Competition Appeal Tribunal, group actions are not as common in this jurisdiction as they are in others. The introduction of AI into the legal industry could particularly benefit group actions. Language models and other advanced machine learning tools have the potential to help claimants and funders, defendants and the courts to litigate group actions more efficiently and cost-effectively. However, there are also potential risks and key policy considerations associated with the use of AI in group actions.

Benefits for claimants

AI could generate significant time and cost efficiencies in three key facets of the group action process for claimants and funders: market intelligence, data analysis and administration.

Market intelligence

A significant initial hurdle for claimant firms and funders is identifying, contacting and obtaining instructions from a claimant pool of sufficient size to make the claim financially viable. This is compounded by the fact that the costs of advertising a claim to the public are, in general, irrecoverable. AI tools could transform the way in which group actions are marketed. Cognitive advertising algorithms and data analytics tools, among other AI methods, could help firms gather better market intelligence, identify their target audience more accurately and communicate with them in a more personalised manner, driving down costs and improving engagement with potential claimants.

Data analysis

Each group action regime in England and Wales involves an initial threshold for satisfying the court that it is appropriate for the action to proceed as a group action. Language models and automated analytical tools could drive efficiencies in collecting and analysing data on individual claims to identify common characteristics, trends and patterns. This would help legal advisers to formulate the most appropriate strategy for satisfying these initial thresholds.

Administration

At present, the day-to-day administration of group actions carries a significant time and cost burden, and is a particular problem for claims conducted under a group litigation order (GLO). Where a GLO has been made, the size of the claimant group may continue to increase until a specified deadline. The claimant law firm will be engaged in gathering evidence, as well as communicating with the claimant group members and generally progressing the claim. Language tools could help the legal team complete these tasks more quickly and efficiently, and likely at a cost, and a level of accuracy, that a large group of claims handlers may be unable to achieve.

Benefits for defendants

Group actions can be attractive to defendants if the alternative is having to deal with many separate claims. However, to take advantage of the potential efficiencies without prejudicing their ability to run a defence effectively, defendants need to ensure as early as possible that the issues in the case are specific and well-defined. Just as AI tools can help claimants to identify common characteristics, trends and patterns for the purpose of bringing a group action, defendants can use them to assist with identifying the key generic issues in the case. In the right circumstances, those issues may then be amenable to a preliminary determination by the court, which in turn could avoid a lengthy and expensive trial process.

Another key area in which defendants may make significant savings using AI is disclosure. Disclosure is often the most time- and cost-intensive procedural step for a defendant in a group action, and can be a significant burden on internal and external resources. The English courts have recognised the potential benefits of technology-assisted review of documents for some time, but more advanced AI tools could help to conduct and verify all stages of the disclosure process, from identifying potentially relevant document sources through to collecting potentially relevant data, formulating a reliable search methodology and review process, and filtering results for production.

In addition, AI may assist with settlement strategy. An AI-supported analysis of the claimant pool, combined with inputs on the defendant’s wider strategic aims and limitations, could help to determine potential settlement parameters and estimate potential liability more accurately.

Benefits for funders 

In broad terms, anything that makes claims cheaper to run is good news for litigation funders. At present, the outlay required to bring a group action is significant, so any savings are an attractive prospect. This is true not only at the level of individual group actions, but also in the sense that, if artificial intelligence makes group actions more cost-efficient to run, it may mean that more of them are brought overall. This would be good news for funders that rely on this type of action for their business model.

The other key potential benefit for funders is predicting case outcomes. As funders need to understand at the outset the possible return on their investment, they will usually only take on cases that meet a threshold probability of success. Currently, this probability is often determined by a legal opinion but, in the US legal market, statistical models that predict case outcomes with a reasonable degree of accuracy are starting to emerge. Once more UK group action judgments are available for this type of analysis, it may become possible to make case outcome predictions in the UK market that will assist funders’ assessment of risk.

Benefits for the courts

It is not only the parties to the dispute that could benefit from the assistance of AI during the group litigation process: there could also be benefits for the court that is determining the dispute. There are practical and ethical issues associated with the possibility of AI-generated judgments on individual claims, which are beyond the scope of this article. However, an AI judicial assistant could provide real value to a judge faced with determining factual and legal issues across a large class. For example, in the early stages of a case, AI tools could help the court to determine the most appropriate test cases in a GLO, or to assess in more detail the extent to which a representative claimant really has the “same interest” as the class they purport to represent. In the later stages, similar tools could help to tailor rulings on generic issues to the circumstances of individual cases.

Risks

For all of these potential benefits, there are risks associated with the adoption of AI tools in the context of group claims.

Parties and the general public are unlikely to accept the use of AI that has not been tried and tested to produce accurate and verifiable results. This is especially important in a group litigation context, where steps taken with the assistance of AI have the potential to affect the claims of many people. Any errors made by AI tools in a group action could be multiplied across the whole claimant class. A reliable means of reviewing and, where necessary, correcting errors will be essential for lawyers using AI tools.

Eliminating or correcting for data and societal biases will also be a significant challenge. At present, AI tools cannot be relied on to be completely unbiased, due to biases in the underlying data used to develop and train them and the societal biases and assumptions introduced by human developers, whether consciously or unconsciously. Unless these biases are properly accounted for, AI applied across a claimant group could ingrain them across the whole class.

Claimant and defendant lawyers will also need to ensure that the AI they adopt does not undermine their professional obligations to their clients, including confidentiality, and other regulatory responsibilities such as data privacy obligations. Data privacy issues will be of particular relevance in group claims because using AI tools to gather and assess information and evidence from a claimant class will almost inevitably involve processing significant quantities of personal data. On 15 June 2023, the Information Commissioner’s Office issued a warning to organisations using generative AI to ensure that they comply with relevant law and regulation; those involved in group actions will be no exception.

Policy considerations

Overall, AI is likely to radically alter not only the financial and cost considerations that drive group actions as between claimants, defendants and funders, but also the policy considerations applied by both the courts and the government that underpin those claims. The widespread use of AI to generate time and cost efficiencies in running and determining group claims could open the door to them becoming more prevalent in England and Wales.

In a recent judgment, Mr Justice Knowles commented that “in a complex world, the demand for legal systems to offer means of collective redress will increase not reduce” (Commission Recovery Limited v Marks & Clerk LLP and another [2023] EWHC 398 (Comm)). There is a public policy argument for encouraging the use of AI to meet that demand and improve access to justice by facilitating the bringing, defending and determination of group actions in a more cost-effective and proportionate manner. However, as in other contexts, the risks associated with the use of AI will need to be considered carefully.

This article was co-authored by trainee Milo Grounds.