Ethics Exam Showdown: ChatGPT vs. Law Students – A New Era in Legal Education

Last Thursday, LegalOn Technologies published a study claiming that AI chatbots outperform most aspiring lawyers on the Multistate Professional Responsibility Exam (MPRE). According to the report, OpenAI’s GPT-4 performed best, answering 74% of the exam questions correctly and beating the nationwide average of 68% among human test-takers. Earlier this year, separate research concluded that GPT-4 could also pass the Uniform Bar Exam, outperforming many law students. What does this mean for the future of legal education? Let’s get into the details.

Standardized Tests in Legal Education

Law students in every state except Wisconsin must pass two exams to become a lawyer: the MPRE and the bar exam. The National Conference of Bar Examiners (NCBE) develops both exams. The MPRE is typically taken during a law student’s second year of study and tests legal ethics and professional conduct. It is a 60-question, multiple-choice exam administered over two hours.

Law students take the bar exam after they graduate from law school; it is the final hurdle before becoming a licensed attorney in the United States. Every jurisdiction administers a bar exam “to test a candidate’s ability to think like a lawyer and prove they have the ‘minimum competency’ to practice law in that state.” The Uniform Bar Exam (UBE) is a two-day exam promulgated by the NCBE but administered and scored by individual states. The majority of states have adopted the UBE, with exceptions including California, Florida, Virginia, Delaware, and Hawaii.

Bar requirement changes are taking hold in some states. Oregon announced earlier this month that, starting next year, the state will no longer require law students to take the bar exam in order to become licensed attorneys. Instead, the state is launching the Portfolio Bar Exam, an “alternative pathway to licensure that would allow aspiring lawyers to spend four to six months working under the supervision of an experienced attorney and to gain admission to the bar after submitting an acceptable portfolio of legal work.” California is also considering a Portfolio Bar Exam: last Thursday, the State Bar of California’s board of trustees voted to test-run the program, which now awaits sign-off from the California Supreme Court.

What Does the MPRE Study Conclude?

The study tested four leading generative AI models: OpenAI’s GPT-4 and GPT-3.5, Anthropic’s Claude 2, and Google’s PaLM 2 Bison. According to the report, GPT-4 and Claude 2 achieved scores exceeding the approximate passing threshold for the MPRE in every state (roughly 56-64%, depending on the jurisdiction). The study used MPRE-style exam questions developed by an ethics and economics professor from the University of Houston Law Center.

The study queried each model through its standard application programming interface (API) with a basic prompt: “Answer the following multiple-choice question.” To simulate the MPRE, the researchers randomly selected 60 questions from the 500 available across the exam’s subject areas.

“Based on the sampling and testing methodology described above, the overall mean accuracy for each of the models was as follows: GPT-4 answered 74% correct, Claude 2 answered 67% correct, GPT-3.5 answered 49% correct, and PaLM 2 answered 42% correct.”
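For readers curious what that setup looks like in practice, below is a minimal sketch of the sampling-and-prompting approach described above. It is an illustration under stated assumptions, not the authors’ published code: the questions.json file, its fields, the answer-letter parsing, and the closing instruction in the prompt are all hypothetical, and only the base prompt wording and the 60-question sample size come from the study.

```python
# Minimal sketch (not the study's actual harness), assuming the OpenAI Python SDK
# and a hypothetical questions.json bank of MPRE-style items with an answer key.
import json
import random

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("questions.json") as f:
    # e.g. [{"question": "...", "choices": {"A": "...", "B": "..."}, "answer": "B"}, ...]
    bank = json.load(f)

sample = random.sample(bank, 60)  # 60 questions, matching the length of the real MPRE

correct = 0
for item in sample:
    choices = "\n".join(f"{letter}. {text}" for letter, text in item["choices"].items())
    prompt = (
        "Answer the following multiple-choice question.\n\n"
        f"{item['question']}\n{choices}\n\n"
        "Respond with the letter of the best answer."
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    reply = response.choices[0].message.content.strip().upper()
    if reply.startswith(item["answer"]):
        correct += 1

print(f"Accuracy: {correct / len(sample):.0%}")  # the report puts GPT-4 at 74%
```

Running the same loop against the other providers’ APIs, and averaging over repeated samples, would mirror the head-to-head comparison the report describes.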

The study concludes that generative AI models can now apply black-letter ethics rules and help lawyers with legal ethics questions. It emphasizes that, even though these models perform well, legal professionals who use AI must understand the technology’s capabilities and limitations and remain the final decision-makers. It also urges legal technology providers to test their models extensively with “lawyer-led validations, coach LLMs to consistently produce profession-grade results, and augment generative AI with domain-specific content and training.”

How Does Generative AI Impact Law Students’ Test Scores?

Earlier this year, two University of Minnesota law professors conducted a study on using generative AI (GAI) for legal writing assignments and exams. The study found that students could complete their assignments faster with AI, but the work product was no better than that of students who completed the assignments without the technology.

Regarding law school exams, the study found “low-performing students scored higher on final exams when given access to GPT-4, while their high-performing classmates performed worse when using the technology.”

The results indicate that generative AI is becoming a vital tool for law students. The study did, however, urge law schools to ban the use of GAI in core first-year courses and on exams because the technology “disproportionately helps lower-performing students.”

What Does AI Mean for the Future of Legal Education?

GAI can help students get their work done faster, but according to the research, the quality of that work is not any better with AI. GAI is also incredibly helpful for legal research. Overall, GAI saves time, and time is something both law students and lawyers need more of. As this technology continues to develop rapidly, law schools and educators must adopt AI policies so that students understand when they are allowed to use AI and when they are not. Law schools must also implement AI training so that students become familiar with AI software, including its potential and its limits.

There is a growing debate in the legal community surrounding the ethics of using AI in legal practice, and that debate encompasses legal education. Some argue that AI threatens knowledge-based learning: if students rely on AI to do the work, are they truly “learning” or developing new skills? Others argue that AI is rapidly integrating into society, so teaching students to work with this technology is necessary, as it will most likely affect their future job prospects.

Regardless of where the debate lands, AI integration is at the forefront of technology development. Law students and faculty can benefit from AI to save time researching and writing, while legal institutions can develop AI policies and guidelines so that AI does not benefit some students to the detriment of others.

Ready to Integrate AI in Legal Education?

Are you a law professor or law librarian interested in teaching your students about the benefits of AI? Are you a student who wants to get a jump start on learning the ins and outs of legal research at the state court level? Check out Trellis! Trellis is an AI-driven data analytics platform for lawyers, law students, and legal professionals. Access state trial court data using our API and simplify your legal research workflow. Find us at trellis.law or contact us directly for a demo.

Sources:
https://www.legalontech.com/generative-ai-passes-the-legal-ethics-exam

https://www.reuters.com/legal/transactional/ai-chatbot-can-pass-national-lawyer-ethics-exam-study-finds-2023-11-16/

https://www.barbri.com/about-the-bar-exam/

https://www.reuters.com/legal/government/bar-exam-alternative-proposed-california-passes-key-hurdle-2023-11-17/

https://www.foxbusiness.com/politics/ai-chatbot-beats-most-aspiring-lawyers-national-legal-ethics-exam-study-finds

https://nysba.org/navigating-the-ethical-and-technical-challenges-of-chatgpt/