
Machine Learning Disrupting the Law Industry

As companies continue their journey to digital transformation, they’re increasingly adding artificial intelligence and machine learning to their toolbox.

Today, the law industry is facing the same disruption, as AI and machine learning are transforming the legal profession in many ways.

In this conversation, Rob Wescott, Chief Strategy Officer (CSO) at Ayfie, explains how machine learning aids lawyers in their jobs, the limitations of AI in law, and the wider opportunities that exist.

Rob Wescott, Chief Strategy Officer (CSO) at Ayfie

What does your company do? Which industry do you serve?

We provide software solutions to answer difficult questions based on data hidden in large amounts of text.

Our primary industry is legal, where we solve problems in discovery and litigation as well as knowledge discovery based on historical documents.

What specific problem(s) are you trying to solve? Which part(s) of the business process are you addressing?

We add structure to unstructured text, which allows it to be leveraged for deeper analysis.

The structure is identified by leveraging a codified understanding of natural language.

With a structured representation of the data, businesses can make data-driven decisions more quickly. This lowers the cost of intervention and increases the opportunity to influence outcomes.
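
As a concrete illustration (this is not Ayfie's actual pipeline, and the patterns and fields below are invented for the example), here is a minimal Python sketch of turning a sentence of free text into a structured record that can be queried like data:

```python
import re

# Illustrative only: pull a few structured fields out of free text
# so they can be filtered and analyzed like data.
TEXT = "On 12 March 2021, Acme Corp agreed to pay $1,250,000 to Beta LLC."

PATTERNS = {
    "date": r"\b\d{1,2}\s+(?:January|February|March|April|May|June|July|"
            r"August|September|October|November|December)\s+\d{4}\b",
    "amount": r"\$\d[\d,]*(?:\.\d{2})?",
    "party": r"\b[A-Z][a-z]+\s+(?:Corp|LLC|Inc|Ltd)\b",
}

record = {field: re.findall(pattern, TEXT) for field, pattern in PATTERNS.items()}
print(record)
# {'date': ['12 March 2021'], 'amount': ['$1,250,000'], 'party': ['Acme Corp', 'Beta LLC']}
```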

For example:

Growth of data sets in eDiscovery continues to outpace budgets and human resources that can be applied to a matter.

We aid the eDiscovery process by identifying structure in large volumes of text. Law firms and in-house counsel automatically remove redundant information by identifying unique and inclusive content.
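
Here is a hypothetical sketch of that culling step, using word shingles and Jaccard similarity. Production eDiscovery platforms use more robust techniques (for example, MinHash), but the principle is the same:

```python
# Keep only documents that are not near-duplicates of something already kept.
def shingles(text, n=3):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def dedupe(docs, threshold=0.8):
    kept = []
    for doc in docs:
        sig = shingles(doc)
        if all(jaccard(sig, shingles(k)) < threshold for k in kept):
            kept.append(doc)
    return kept

emails = [
    "Please review the attached draft agreement before Friday.",
    "Please review the attached draft agreement before Friday. Thanks!",
    "The inspection report for the facility is now available.",
]
print(len(dedupe(emails)))  # 2 -- the near-identical copy is culled
```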


The structure allows for guided search, visual navigation and entity identification, which help counsel explore data and identify highly relevant documents.

Conceptual overlap allows similar and related documents to be identified.
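
One common way to surface such conceptual overlap is TF-IDF with cosine similarity. The sketch below (using scikit-learn, with invented documents) is a simplified stand-in for the richer linguistic features described above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Motion to dismiss the breach of contract claim.",
    "Defendant moves to dismiss the contract claim for breach.",
    "Quarterly inspection report for the manufacturing facility.",
]
matrix = TfidfVectorizer(stop_words="english").fit_transform(docs)
sims = cosine_similarity(matrix)
print(sims.round(2))  # docs 0 and 1 score high; doc 2 stands apart
```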

Elevated Machine Learning leverages input from review teams to continually learn and refine trained models, predicting relevancy, prioritizing the review set, and evaluating the consistency of human reviewers.
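
Below is an illustrative predictive-coding loop, not Ayfie's Elevated Machine Learning itself: reviewer tags train a simple model that scores the remaining documents, so the likeliest-relevant ones move to the front of the review queue. The documents and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Reviewer decisions so far: (document text, 1 = relevant / 0 = not relevant)
reviewed = [
    ("termination of the supply agreement with penalties", 1),
    ("lunch menu for the office holiday party", 0),
    ("breach of the supply agreement and damages owed", 1),
    ("parking garage closed for maintenance next week", 0),
]
unreviewed = [
    "dispute over damages under the supply agreement",
    "new coffee machine installed in the break room",
]

texts, labels = zip(*reviewed)
vectorizer = TfidfVectorizer().fit(list(texts) + unreviewed)
model = LogisticRegression().fit(vectorizer.transform(texts), list(labels))

# Score the unreviewed documents; retrain as new reviewer decisions come in.
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for score, doc in sorted(zip(scores, unreviewed), reverse=True):
    print(f"{score:.2f}  {doc}")
```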

Knowledge discovery in law: by indexing data throughout the firm and adding structure, our clients are turning internal data sources into a knowledge resource with a single point of access.

Documents are linked through shared concepts, references, customers and matters. These connections allow for previous work product to be leveraged and analyzed.

Questions like "who at our firm has worked on this type of matter in this jurisdiction before?" become views of the data connections.
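
A toy sketch of that idea: once authors, matter types and jurisdictions are extracted, the question above becomes a lookup over the connections. The records below are invented for the example.

```python
from collections import defaultdict

documents = [
    {"author": "A. Jensen", "matter_type": "M&A",        "jurisdiction": "Delaware"},
    {"author": "B. Olsen",  "matter_type": "IP dispute", "jurisdiction": "Norway"},
    {"author": "A. Jensen", "matter_type": "M&A",        "jurisdiction": "Norway"},
]

# Index documents by (matter type, jurisdiction) so "who has done this
# kind of work here before?" is a single dictionary lookup.
index = defaultdict(set)
for doc in documents:
    index[(doc["matter_type"], doc["jurisdiction"])].add(doc["author"])

print(index[("M&A", "Delaware")])  # {'A. Jensen'}
```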

If a company is on the lookout for AI products/services or embarking on its AI strategy, what specific advice/tips would you give them?

Understand what you are trying to accomplish. Are you solving a problem where a machine can do the job as well as, or better than, a human?

Identify where in the business process human reasoning is required, and where the technology can achieve similar or better results at higher speed and lower cost than humans.

Understanding the difference between dreaming of “Artificial” magic and effectively applying “Augmented Intelligence” to solve business problems will drive a successful strategy.

What are the key challenges facing your clients?

With data volumes growing, business processes that require humans to read documents are becoming costlier.

The biggest challenge is the high volume of textual data combined with no clear view into that content: where it lives, what is in it and how it can be used.

What are the limitations of AI?

The core limitations of AI, with regards to natural language, are its inability to leverage context, understand variation and resolve ambiguity.

The human brain incorporates an unimaginable number of nested and independent contexts into our conversations.

Cultural context, for example, can be noise in the form of prejudice, but at times it provides vital clues as to how to interpret a sentence.

The prior “knowledge” that women are worse at driving is not helpful, but the cultural clue that “How are you?” is a greeting rather than a question is essential for following a conversation.

Our brains, for the most part, have no trouble processing these concepts. AI, on the other hand, does not behave like our brains.

It is not able to decipher context, and it is unable to deal effectively with variances or ambiguity.

What are the most important initiatives you are addressing right now?

Identifying, with high levels of precision, the existence of PII and PHI and their connections in data stores is a critical business concern.

There is a significant focus for our customers on data breach investigations, privacy compliance, and monitoring systems for exposed and vulnerable data.
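
A deliberately simplified sketch of the pattern-matching layer of PII detection; real systems combine dictionaries, context and validation rather than relying on regular expressions alone, and the patterns below are illustrative only:

```python
import re

PII_PATTERNS = {
    "email": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "ssn":   r"\b\d{3}-\d{2}-\d{4}\b",
    "phone": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
}

def find_pii(text):
    # Return only the PII categories that actually occur in the text.
    return {kind: re.findall(pattern, text)
            for kind, pattern in PII_PATTERNS.items()
            if re.search(pattern, text)}

sample = "Contact jane.doe@example.com or 555-867-5309; SSN on file: 123-45-6789."
print(find_pii(sample))
```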

Many vendors provide niche products, which can leave companies with multiple platforms and tools. How will this affect the business and your clients?

Tools that solve very specific workflow issues for very specific problems can often benefit from a universal index of content and structured metadata.

Our clients can often leverage the text lake we create to power niche needs with other vendors or to build workflows on top of the data.

Machine learning and deep learning are fashionable now. A lot of vendors don’t have any ‘real’ AI components in their software but are just automation tools. How do you incorporate AI into your products and services?

We focus on the identification of concepts in the text through context, leveraging natural language insights codified into the system through dictionaries, local grammars and extraction algorithms.

This allows us to develop, weight and normalize the features that drive statistical analysis of content. This lowers the volume of training documents needed to achieve high levels of precision in machine learning on textual data.
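
A rough illustration of why that lowers training volume: if surface variation is normalized to shared concepts before any statistics are computed, a model sees the same feature whether a document says "terminated", "termination" or "terminating". The concept mapping below is invented for the example.

```python
# Hypothetical concept dictionary mapping surface forms to normalized features.
CONCEPTS = {
    "terminated": "TERMINATE", "termination": "TERMINATE", "terminating": "TERMINATE",
    "agreement": "CONTRACT", "contract": "CONTRACT",
}

def concept_features(text):
    tokens = [t.strip(".,").lower() for t in text.split()]
    return [CONCEPTS.get(t, t) for t in tokens]

print(concept_features("Termination of the agreement was disputed."))
# ['TERMINATE', 'of', 'the', 'CONTRACT', 'was', 'disputed']
```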

Can you share with us how the AI component of this works?

The key components of the AI are a series of structures around how language works and specific rules in the form of dictionaries, local grammars, and extraction algorithms.

Dictionaries: The system deploys manually constructed morpho-semantic dictionaries, which specify all the morphological forms of simple and complex words of a language.

In addition to general language, specialized dictionaries of entities (people, locations, organizations, etc.) as well as technical vocabulary are necessary.

For example, in English the basic dictionary system contains around 20 million entries. In the analysis of the English Wikipedia, about 60 million lexical terms are detected.
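
A toy version of such a dictionary (the entries are invented; the production dictionaries hold millions of forms): each surface form maps to its lemma plus grammatical and semantic features.

```python
DICTIONARY = {
    "sued":   {"lemma": "sue",   "pos": "verb", "tense": "past"},
    "sues":   {"lemma": "sue",   "pos": "verb", "tense": "present"},
    "courts": {"lemma": "court", "pos": "noun", "number": "plural",
               "semantic": "LEGAL_INSTITUTION"},
}

def lookup(token):
    # Return the dictionary entry for a token, or a bare fallback.
    return DICTIONARY.get(token.lower(), {"lemma": token.lower(), "pos": "unknown"})

print(lookup("Courts"))
# {'lemma': 'court', 'pos': 'noun', 'number': 'plural', 'semantic': 'LEGAL_INSTITUTION'}
```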

Local Grammars: Local Grammars are specifications of how billions of complex expressions (e.g. noun phrases, verb phrases, adverbial phrases, dates, events, technical expressions, etc.) are constructed out of simpler expressions.

Local grammars can be constructed in an incremental way and designed to “learn” new constructions. Contrary to other techniques (e.g. machine learning, statistical approaches) local grammars can easily be modified and extended.

The system uses hundreds of thousands of specific local grammars and can easily determine the new local grammars needed for new technical domains (e.g. medicine, finance, law) by analyzing large text corpora in these domains.
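
A miniature example of the idea: a "local grammar" for date expressions composed from simpler named pieces and compiled down to a finite-state pattern (here, a regular expression). Extending it for a new construction is a small, local edit.

```python
import re

# Complex date expressions built from simpler named pieces.
DAY   = r"\d{1,2}"
MONTH = r"(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*"
YEAR  = r"\d{4}"
DATE  = rf"{DAY}\s+{MONTH}\s+{YEAR}|{MONTH}\s+{DAY},\s+{YEAR}"

text = "The hearing on 3 March 2020 was moved to April 7, 2020."
print(re.findall(DATE, text))  # ['3 March 2020', 'April 7, 2020']
```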

Extraction Algorithms: The text analytics algorithms are based on finite-state technology, both for the manipulation of very large dictionaries (construction and deployment) and for the application of very large local grammars (together with dictionaries) to text.

The extraction algorithms can be used for semantic parsing both of entities and of propositions (i.e. sentences).

The extraction technology is the same for all languages (only the dictionaries and grammars need to be adapted).

The extraction algorithms are extremely fast: they can parse all of the English Wikipedia in a few minutes, compared to weeks with competing technologies.
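
A sketch of that engine/resources split: the extraction loop below is identical for every language, and only the dictionary and grammar resources change. All names and patterns are invented examples.

```python
import re

def extract(text, resources):
    # The engine is language-independent; resources carry the language knowledge.
    hits = []
    for label, pattern in resources.items():
        hits += [(label, match.group()) for match in re.finditer(pattern, text)]
    return hits

ENGLISH = {"ORG": r"\b[A-Z][a-z]+ (?:Corp|Inc|Ltd)\b", "MONEY": r"\$\d[\d,]*"}
GERMAN  = {"ORG": r"\b[A-Z][a-z]+ (?:GmbH|AG)\b",      "MONEY": r"\d[\d.]* Euro"}

print(extract("Acme Corp paid $2,000,000.", ENGLISH))
print(extract("Muster GmbH zahlte 2.000.000 Euro.", GERMAN))
```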

Can you share with us some case studies on how your offering is deployed? Be specific with clients and industry.

We don’t share specific client case studies, but I can share more generalized use cases.

We’ve had clients in the legal industry use our technology for all kinds of applications. Most commonly, lawyers and researchers link external legal data to references within internal documents.

One very successful client used the software to identify and extract over 200 data points from complex mortgage documents. This saved them countless hours of labor.

Another recent client used it to identify key data in the inspection reports for a manufacturing facility.

The uses and implications of AI-driven software for legal professionals are endless.


