Six reasons why you shouldn’t use ChatGPT as a legal tool

Who doesn’t love to save time? I’ll be the first to admit that artificial intelligence (AI) has massive potential to make life easier. It should be leveraged as much as possible for mundane tasks. However, when it comes to anything related to the law, care must be taken.

Here’s why:

“Generative” AI

The integration of AI into legal work has already produced some disastrous consequences, raising major concerns about the reliability and accuracy of AI-generated information. The key word here is "generated." What does it mean for AI to be "generative"? It means the tool generates plausible-sounding text, images, and other content based on patterns in the data it was trained on.

That is also why AI is particularly good at producing uncannily convincing fabrications known as "hallucinations." In other words, it has a propensity to pull unsubstantiated "facts" out of thin air.

No fact-checking

ChatGPT makes its mistakes with complete confidence. Written right on the screen is the following warning: "ChatGPT can make mistakes. Consider checking important information." The problem is that very few people do any form of fact-checking at all.

When it comes to legal terms, concepts, advice, or regulations, most non-lawyers would not know where to even go to start the fact-checking process. Should they read the statute or a case? Which statute applies? Which level of court decides the matter? Has the law been overruled? In cases of conflict of laws, what information is correct? These are the kinds of things that lawyers are rigorously trained on in law school.

Therefore, while ChatGPT may be a good starting point for ideas, it should not be relied upon for substantive, factually accurate, or situation-specific legal information. You can ask ChatGPT a question and it will generate a response, but the response itself gives you no way to verify its accuracy.

No precision

ChatGPT seems smart because its responses draw on patterns learned from massive datasets. However, this is also the source of its shortcomings when it comes to the law. The law is extremely specific to individual situations, contexts, existing contracts, and jurisdictions, and it is changing all the time.

ChatGPT itself says the following: “It’s important to note that while I can provide information and assistance, my responses may not always reflect the most current legal standards or interpretations, and they should not be considered a substitute for professional legal advice. When dealing with legal matters, it’s crucial to consult with a qualified attorney or legal professional who can provide advice tailored to your specific situation and jurisdiction. Legal information is subject to change, and the application of legal principles can vary based on the details of a particular case.”

No confidentiality

Your conversations with ChatGPT are not protected by attorney-client privilege. Only lawyers can give you that privilege. Always keep in mind that anything you type into ChatGPT may be saved and possibly used by the technology when generating responses for other users.

For that reason, you should never enter confidential information into ChatGPT.

No duties towards you

Lawyers owe fiduciary duties to their clients. Specifically, we are bound by a duty of loyalty to our clients and are prohibited from acting where there is a conflict of interest. These are just some of the duties we are bound by, and we take them very seriously, because we have a lot to lose: our licenses and our reputations.

Guess what? ChatGPT has none of those things to lose. You can't take ChatGPT to court for negligent representation, and there is no regulatory body with which you can lodge a complaint if it fails to act in your best interests. If ChatGPT fails you, there is no recourse.

No insurance

Speaking of recourse, lawyers are required to carry professional liability insurance in case things go sideways. In BC, for example, lawyers pay a mandatory fee that funds insurance covering up to $1 million per error. In other words, if an error is made, there is some recourse available. However, if something goes wrong after you used ChatGPT instead of hiring a lawyer, there is no insurance to fall back on.

Here’s a short case study (true story):

A lawyer named Mr. Schwartz, along with his firm, filed a lawsuit against an airline in Manhattan federal court. He was later fined thousands of dollars for filing a brief filled with citations to fake case law generated by ChatGPT. Judge P. Kevin Castel stated that the submission was "replete with citations to non-existent cases" and that "six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations."

There are paid AI tools out there that are designed specifically for lawyers. These are powerful and highly useful. However, when it comes to non-lawyers using basic AI software such as ChatGPT for legal questions or advice – we don’t recommend it.


© Cunning Law. 2024. All Rights Reserved.

This information is intended for informational purposes only and does not constitute legal advice or an opinion on any issue. We would be pleased to provide additional details or advice about specific situations if desired.