Artificial Intelligence: A Powerful Tool for Court Professionals

Over the past month, I have had the same reaction to several videos I have seen on the Internet: “Is this real or is this AI?” I know that I am not alone in this thought process based on the comment sections of these videos. This is a sharp contrast to my thoughts after viewing the AI-generated video of Will Smith eating spaghetti that was widely circulated across the Internet in 2023. Although the video was an impressive demonstration of the power of AI, it was obvious that the video was not real.

With the vast amount of information circulating about AI, the topic can be overwhelming. As a court professional who uses AI occasionally, I am writing this article for other court professionals who, like me, are not experts in the field. It also offers insights for those seeking to work effectively with AI tools while staying alert to their potential biases.

Some people may be wondering: what exactly is AI? Depending on whom you ask, the definition will change. AI is a technology that enables computers and machines to simulate human learning, perform complex tasks, solve problems, and make decisions.1 Technically, computer chess programs dating back to the 1950s use AI technology. However, the type of AI that everyone is excited about in 2025 is known as generative artificial intelligence (GenAI). Traditional AI programs analyze data or make predictions; GenAI can create new content and make certain decisions on its own based on prompts from the user.2 GenAI began to gain popularity with the public following the launch of ChatGPT on November 30, 2022.3

Pictured above4

I vividly remember the first time I was introduced to GenAI and its potential. I was at lunch with a judge who asked me, “Have you heard of ChatGPT? It’s fascinating.” I responded, “No, Your Honor,” and assumed the conversation about ChatGPT would stop there. The judge said, “Let me show you this on my phone; you have to see it!” I was intrigued. I asked myself, “Why is the judge so enthusiastic about ChatGPT?” The judge asked me to come up with a task for ChatGPT to complete, like writing a report, and explained that ChatGPT could provide results in seconds. In three sentences, I asked it to draft a basic legal document. I was shocked at the speed, length, and accuracy of the result. It wasn’t perfect, but it was impressive. The results appeared on the judge’s phone right before my eyes. It was, as the judge described, fascinating!

Following my initial introduction to ChatGPT, I spent the next few days using the application to see what else it could do. I asked it to create recipes, draft budgets, and help me brainstorm on several topics. Like most people, I was captivated by it, but I did not fully understand it. There were times when I was impressed with the results; other times, it would make simple mistakes. There was no denying that this was a powerful tool. When I think of a powerful tool, the first thing that comes to mind is something physical, like a circular saw or a drill. With those types of tools, the benefits, risks, and dangers of use are evident.

When thinking of powerful technologies such as GenAI, the risks can be overlooked, since misuse generally does not cause physical harm or injury. If court professionals choose to use GenAI in a work setting, they should be mindful and intentional about using the technology responsibly. In other words, users need to verify the accuracy of the results generated by any GenAI application.

AI assistants such as ChatGPT, Grok, Claude, or Gemini are some of the most popular tools available to users interested in leveraging this technology. These assistants can answer questions, draft correspondence, analyze data, expand ideas, and respond to a myriad of other prompts. In addition to AI assistants, other types of AI tools include, but are not limited to, video generators, meeting assistants, voice generators, music generators, and presentation builders. The sky’s the limit in terms of where this technology will take us in the future.

For court professionals who have never used AI before, I urge you to try using an AI assistant on a personal, non-confidential task. You can have an AI assistant do something like draft a letter, assist you with a travel itinerary, or help you write a positive review about a restaurant. However, before using one of these GenAI tools to do anything related to your courts, please first check with your leadership. In most cases, it is not going to be appropriate to use widely available AI tools for court business because those tools still come with certain risks, such as a lack of protection for confidential data.

This does not mean that readily accessible GenAI tools won’t have an impact on courts. Court leaders should be prepared if civil filings increase due to self-represented litigants having access to GenAI tools. The National Center for State Courts recently released a report hypothesizing that ready access to GenAI tools is likely to increase filings in contract cases.5

If you are a court leader with the authority to allow your staff to use GenAI for work purposes, download a copy of NACM’s AI guide; it is an excellent in-depth resource for court leaders who want to understand how to use AI. Then read the section titled “AI System Procurement and Vendor Management.” It implores court leaders to be intentional about AI governance and to use AI tools that meet certain criteria. It is best to choose a vendor whose AI systems can meet the needs of your judiciary.

A properly vetted and reputable AI tool can provide many benefits to court professionals. According to NACM’s AI guide, AI has the potential to improve both “customer and employee experiences within the court system, ranging from virtual assistants for citizens to automated support and document processing for internal operations.”6 In many court jobs, data has to be reviewed, tables need to be created, and reports need to be generated. AI can help speed up many of those tasks. I also wonder whether, in the future, AI will help courts meet language access needs.

Even if your court has contracted to use the best AI system available, court leaders still need to ensure that court staff are trained and have foundational knowledge of whatever task AI is assisting them with. The Society for Human Resource Management (SHRM) stresses the importance of AI + Human Intelligence (HI), which supports humans and AI systems working together to solve problems.7 As court professionals, we strive to produce work product that is free from errors and biases. If the input data contain biases, the AI system’s results will reflect those biases. Additionally, some AI systems were developed on data that may contain biased information. I asked a popular AI assistant, “Do you have biases?” The assistant responded:

“As a large language model, I don’t have personal experiences, beliefs, or emotions, so I don’t have biases in the way humans do. However, I am trained on a massive amount of text and code from the internet. This means that if the data I was trained on contains biases — which much of it inevitably does, as it reflects human language and societal perspectives — those biases could potentially be reflected in the information I generate. My developers are continuously working to identify and mitigate these potential biases to ensure I provide fair and objective information.”

If that answer doesn’t convince you that certain AI tools contain biases, ask an AI image generator to “create an image of a successful person.” Pay attention to the demographics that the AI assigns to the individual. Continue asking the AI image generator to create images of different types of people without mentioning any demographics, using prompts such as “create an image of a friendly person.” You’ll probably start to notice patterns in the demographics the AI image generator assigns to people. Now, imagine using a biased AI system that wasn’t properly trained or vetted to analyze court data. That system has the potential to produce biased recommendations or results that could negatively impact certain populations. This would be a disservice to the judiciary and the community it serves.

The good news is that NACM’s AI guide provides actionable steps for mitigating bias when using GenAI tools. Although some of these AI tools pose a risk of providing biased or inaccurate information, that doesn’t mean we should not use the technology. As with any powerful tool, precautions must be taken; used correctly, it can deliver tremendous value.


ABOUT THE AUTHOR

Creadell Webb is the chief diversity, equity, and inclusion (DEI) officer for the First Judicial District of Pennsylvania (FJD). Beyond the FJD, his leadership extends to the national stage. He sits on the Board of Directors for the National Association for Court Management (NACM) and serves as the vice chair of its DEI Committee, actively shaping the conversation on inclusion within court administration. He can be contacted at creadell@nacmnet.org.


  1. Chris Stryker and Erdem Kavlakoglu, “What is AI?,” IBM, August 9, 2024, accessed July 2, 2025, https://www.ibm.com/topics/artificial-intelligence.
  2. Bernard Marr, “The Difference Between Generative AI And Traditional AI: An Easy Explanation For Anyone,” Forbes, July 24, 2023, accessed July 2, 2025, https://www.forbes.com/sites/bernardmarr/2023/07/24/the-difference-between-generative-ai-and-traditional-ai-an-easy-explanation-for-anyone.
  3. Amber Jackson, “ChatGPT turns one: How AI chatbot has changed the tech world,” Technology Magazine, November 30, 2023, accessed July 2, 2025, https://technologymagazine.com/articles/chatgpt-turns-one-how-ai-chatbot-has-changed-the-tech-world.
  4. I asked an AI image creator to recreate the scene of the judge talking to me about AI. In all fairness, I did not provide the AI image creator with a picture of myself or of the judge, but it’s awesome that AI tools can produce customized graphics based on word prompts. I can imagine a future in which court professionals use AI to create meaningful graphics for training materials.
  5. Court Statistics Project and TRI/NCSC AI Policy Consortium, National Center for State Courts, “Is GenAI revolutionizing court filings?,” June 24, 2025, accessed July 2, 2025, https://www.ncsc.org/resources-courts/genai-revolutionizing-court-filings.
  6. Ian Ashton, et al., “Courting AI: Understanding Artificial Intelligence in Courts,” (National Association for Court Management, 2024), https://nacmnet.org/resources/store/.
  7. Society for Human Resources Management (SHRM), The AI + HI Project, accessed July 2, 2025, https://www.shrm.org/topics-tools/flagships/ai-hi.