BE YOUR OWN LAWYER

Empowering You to Represent Yourself

AI - A Blessing or a Curse?

Think about it – It’s like magic!!

You’re in court.

You don’t have a lawyer.

You have to prepare a legal document – be it a pleading, a motion, or a legal brief. And you have no clue how to do it.

What goes into a complaint? What does an answer have to include?

How do you respond to the motion you just received from the other side?

Just when you’re ready to give up it dawns on you….. All you have to do is try artificial intelligence.

This stuff is really awesome!

If it can write a story, it should be able to generate an answer to a complaint, right? Surely Claude or ChatGPT can conjure up a response to the motion you’re looking at.

Give it a shot!

And so you do.

The result?

Wow…. Just Wow.

It’s all there. The paragraphs are numbered. It’s formatted to look “official”.

It even provides a place for you to sign and tells you how to serve the other side in your lawsuit.

EXCEPT…..

Except there’s a better than fifty-fifty chance it’s all garbage!

There’s a better than fifty-fifty chance it will get thrown out.

There’s a good chance it could cause you to lose your case or, at the very least, land you in trouble with the judge.

Here’s why:

AI systems don’t understand legal issues.

There is no doubt that one day there will be AI platforms that can generate any document in a lawsuit.

But that day is not today!

Today, AI systems have a serious flaw called “hallucination”—they confidently generate plausible-sounding information that’s entirely fictitious. When asked for legal help, AI will often cite cases that don’t exist, create fake court decisions, and invent judicial opinions that sound legitimate but are completely manufactured.

This isn’t just an occasional glitch. AI systems routinely fabricate case names, docket numbers, and legal holdings. They’ll tell you that “Smith v. Jones (2019)” supports your position when no such case exists. They’ll quote from judicial opinions that were never written. And they’ll do it all with complete confidence, making their fabrications seem authoritative and real.

Sometimes AI references actual cases but completely misrepresents what they say. It might cite a criminal case to support a landlord-tenant argument, or claim a case supports your position when it actually contradicts it. AI doesn’t understand law—it predicts word patterns. It can’t distinguish binding precedent from irrelevant dicta, recognize overturned decisions, or apply proper legal reasoning.

And these results can have devastating consequences.

Judges have little tolerance for nonsensical filings and will not hesitate to impose fines or other sanctions. If a pleading gets thrown out, it can mean you lose your case. If, for example, you have filed an answer that is tossed and the deadline has passed, you can be in default and automatically lose the entire case.

In our next posts we’ll show you some real-life examples of how self-represented litigants found that AI could not be trusted.