While some Virginia law firms have integrated a host of artificial intelligence tools into their work, others are still weighing the risks. Photo illustration by Adobe Stock
Summary
Despite dire warnings about how artificial intelligence could upend white collar work as we know it, attorneys at Virginia law firms who have incorporated AI into their daily work are optimistic the technology is already changing the legal industry for the better.
An AI-enabled future is no longer theoretical. In the span of a few years, law schools have adapted their curricula to prepare the next generation for AI proficiency, while the standard practice of billable hours is being reassessed. These changes give lawyers a rare opportunity to reimagine what it means to practice law.
But clients also stand to benefit. An end to the billable hour could create deflationary pressure that will reduce the cost of legal services, make costs more predictable and democratize access to legal services for clients once priced out, says Justin Ritter, founder of Ritter Law in Charlottesville. There are upsides for attorneys, as well: The efficiencies of using AI could free them up to spend more time thinking strategically and engaging in the very enjoyable intellectual aspects of their careers, potentially addressing burnout, he adds.
“I am so excited about what’s coming and where we are right now,” Ritter says.
That sense of optimism is relatively newfound. Until last year, Ritter didn’t encourage people asking for career advice to pursue law school. But after he and his colleague, Christopher Sullivan, incorporated AI into their practice and began co-teaching a course at the University of Richmond School of Law focusing on integrating AI into law, Ritter sees the career path in a new light, thanks to AI’s transformative potential.
“Now, you can spend more time being strategic,” he says. “Ultimately, I’m trying to create a high-quality work product faster and cheaper, and these tools allow me to do that and in ways I never really thought were possible.”
Other lawyers, however, haven’t fully embraced AI and are still weighing the risks. That’s created an uneven path of adoption among Virginia law firms.
And while many factors dictate a law firm’s willingness to implement an AI strategy, a lack of tools isn’t one of them. Rather, the sheer number of AI tools can be overwhelming to navigate, as no clear winner has emerged to claim market share dominance in the legal industry, Ritter notes.
Some AI providers court only the largest law firms, while others offer hyper-specific solutions that may be too specialized. Factor in the resource and bureaucratic constraints that a firm’s size can exacerbate, and lawyers face a Goldilocks dilemma of identifying which tools are “just right” for their needs.
Solo practitioners might greatly benefit from AI, but they may be hampered by a lack of time and money to experiment, notes Kellam T. Parks, managing member of Parks Zeigler in Virginia Beach. Meanwhile, he adds, AI implementation plans at larger firms could get waylaid in committees and bureaucracy for months, if not years.
A firm like Parks Zeigler, with 13 attorneys, is in a “sweet spot” for AI adoption — nimble enough to adapt quickly and large enough to have the resources for experimentation, Parks says. Being an early adopter, he says, can offer firms like his competitive advantages against larger, deeper-pocketed law firms — at least in the short term. “When they figure it out,” however, he adds, “they’re going to lap me.”
Parks still has a considerable head start: He began experimenting with AI following the public release of ChatGPT in late 2022 and hasn’t stopped. Now well-versed in the current array of providers, Parks rattles through a carefully vetted list of at least 15 tools his firm regularly uses for administrative and legal tasks.
That number may seem high, but each tool serves an extremely specific purpose. Parks uses Dialpad and Fireflies.ai to summarize and transcribe client conversations; speech coaching tool Yoodli to prepare for trials; and Lexis+ AI and Callidus for assistance with research and drafting legal documents. Finally, the firm uses Billables AI to automate billing by tracking the time spent working for each client.
That list doesn’t even include many other AI tools the firm uses for nonlegal tasks like preparing a slideshow or drafting marketing materials. Lest there be any doubt about Parks’ enthusiasm for AI, even the firm’s professional headshots are AI-generated.
The subscription costs of these various tools, while expensive, are easily justified. “AI costs money, but it also builds efficiencies,” Parks says. “We’re always exploring new tools.”
Some of the largest U.S. law firms — including Kirkland & Ellis and DLA Piper — have developed proprietary AI tools for their practice areas. That solution often isn’t practical for midsize firms, which also can’t easily take the ad hoc approach favored by small firms.
Midsize firms risk being left behind as both their larger and smaller competitors forge ahead. And this makes them the target of AI providers eager to pounce on potential new customers.
Even as providers tout their AI tools’ capabilities, law firms must be discerning about whether applications address their needs, sufficiently protect client data and adhere to professional ethical duties, notes Beth Burgin Waller, principal and chair of the cybersecurity and data privacy practice at Woods Rogers in Richmond and Roanoke.
Waller characterizes Woods Rogers as being “on the cusp of big changes,” and the firm is currently evaluating several generative AI tools that can be customized to its needs. The firm’s attorneys and staff are already using five to 10 commercially available and customized tools for tasks ranging from summarizing materials to drafting documents. And a year from now, she expects AI to play an even bigger role in daily business.
But handling sensitive client and firm information is another concern with artificial intelligence. “We’re trying to balance innovation with responsibility,” Waller says. “We’re trying to be thoughtful about how we deploy tools that protect our secrets.”
Gentry Locke Attorneys is similarly vetting options but has yet to integrate any AI tools into its legal practice. The Roanoke-based firm hasn’t shied away from AI altogether — attorneys and support staff use tools for a variety of nonbillable tasks — but it’s proceeding with caution with respect to legal work, says K. Brett Marston, the firm’s managing partner.
How ethical rules pertain to the use of AI has been a heavy focus for the Virginia State Bar, for which Marston serves as this year’s president.
As AI tools become more advanced, the human element cannot be ignored — particularly as young associates navigate an AI-enabled future while still learning the letter of the law. “The No. 1 job of attorneys will be to ask, ‘Did you use AI to help prepare this, and is it accurate?’” Marston says.
To assess both the opportunities and the risks, Jessiah Hulle, a Richmond-based associate with Gentry Locke, has been tracking instances of AI misuse in litigation, including more than 100 cases in which attorneys nationwide have filed briefs that contained AI-hallucinated information. Hulle expanded his monitoring to include the misuse of AI by other parties, such as expert witnesses, and the potential effect on evidence. “That kind of stuff is good to keep an eye on,” he says.
At Gentry Locke, the perceived risks of missing the AI bandwagon are more than outweighed by the potential reputational risks of misusing the technology. “We’re being a little more purposeful with our approach,” Hulle adds.
Just as law firms are undergoing a technological revolution, so are law schools. Prospective attorneys must prepare for AI-enabled jobs by becoming adept at using a broad array of tech tools and understanding the limitations and vulnerabilities of those systems, says Margaret Hu, a professor and director of the Digital Democracy Lab at William & Mary Law School.
“It’s absolutely imperative that law schools begin shifting the way they teach to accommodate the rapid transformation of the legal profession by AI and other emerging technology,” Hu says.
There will be a steep learning curve for junior law firm associates to become comfortable deconstructing whether AI outputs are accurate and contain any vulnerabilities or risks, Hu says. This will require critical thinking skills to determine when it’s necessary to override an AI system if the output is incorrect, she adds. “Will they have the skills to be able to do that responsibly and effectively? That’s the question.”
Ritter is optimistic about the opportunities awaiting law school students, despite some industry analysts’ doom-and-gloom predictions that AI will take away a significant number of jobs from law school grads.
For the final project in the course he’s been teaching with Sullivan for the past two years at the University of Richmond School of Law, students must develop an AI bot that addresses a very narrow legal topic, and he’s been impressed both by their innovations and by their AI proficiency. Such skills immediately make them “way more useful” to law firms, he adds. “It’s very inspiring to see.”