It’s another milestone in the race to artificial superintelligence:
A study conducted by legal AI platform LawGeex, in consultation with law professors from Stanford University, Duke University School of Law and the University of Southern California, pitted twenty experienced lawyers against an AI trained to evaluate legal contracts.
Their 40-page report details how AI has overtaken top lawyers in accurately spotting risks in everyday business contracts.
The Participants:
The human participants comprised law firm associates, sole practitioners, in-house lawyers and general counsel.
The LawGeex AI was trained on NDAs using machine learning and deep learning technologies.
The Task:
To review five non-disclosure agreements (NDAs), a very common kind of commercial document.
The Results:
The NDAs contained 30 legal issues, and the human lawyers achieved an average accuracy of 85% in identifying them.
Only the top-performing lawyer matched the AI's accuracy of 94%.
The human lawyers took 92 minutes on average to review the documents, compared with 26 seconds for the AI.
The Takeaway: AI can replace expertise, but…
While these results are impressive, we should remember that humans are still required, though their role, and the path to that role, will change.
The study shows that AI can catch up with and overtake experts in some areas, including those previously thought to be a "safe" preserve of humans.
At the same time, the review of NDAs is a highly logical task, something computers are good at, especially when they are trained on a huge set of similar, highly logical data.
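LawGeex has not published the details of its models, so purely as an illustration of what "training on a large set of similar documents" can look like, here is a minimal sketch of supervised text classification: a toy classifier that learns to flag "risky" contract clauses from labelled examples. The clause texts, the labels and the choice of scikit-learn are all assumptions made for illustration, not the study's actual setup.

```python
# Minimal, illustrative sketch only: a toy clause-risk classifier trained on
# labelled examples. This is NOT LawGeex's system; the clauses, labels and
# library choice (scikit-learn) are assumptions made purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: contract clauses labelled 1 (risky) or 0 (standard).
clauses = [
    "The receiving party may disclose confidential information to any third party.",
    "Confidential information shall be kept strictly confidential for five years.",
    "This agreement may be terminated by either party without notice.",
    "Each party shall return or destroy confidential materials on request.",
]
labels = [1, 0, 1, 0]

# Convert clause text into TF-IDF features and fit a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(clauses, labels)

# Score a new, unseen clause: the output is the predicted label.
new_clause = ["The receiving party may share the information with its affiliates."]
print(model.predict(new_clause))  # e.g. [1] -> flagged for human review
```

The point of the sketch is simply that, given enough labelled examples of a narrow, repetitive document type, a statistical model can learn to spot recurring issue patterns very quickly; it says nothing about the broader reasoning discussed next.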
Marvelling at these results, we must not forget that, without such data sets, AIs cannot (yet) match human intellectual and creative abilities, because they lack the human ability to "infer" from one situation to another.
Ben Dickson, from TechTalks, explains this human ability by reference to video games. As a human, you know when you start playing a game that falling into a pit or running into a hedge is likely to result in points lost. You bring your real-world experience to bear.
Even smart AIs cannot do this (yet): they have to learn all the rules. Admittedly, they can learn them very fast, but a video game is a limited environment, and an AI still has to learn every new game from scratch. That means it needs examples and, in most cases, some human support to steer its learning in the right direction.
So this raises the challenge of expertise:
If AI still needs humans in order to learn, but AIs perform many of the basic tasks that were the traditional route to gaining expertise (e.g. document and case law review for lawyers, accounts preparation for accountants, surveys, models and simple design tasks for architects and engineers), how will humans gain the necessary expertise?
Especially if professionals still operate in traditional, hierarchical organisations, helping people gain that learning experience and grow their expertise will be the main challenge for professional services firms as well as creative organisations.