EverythingThreads

712 lawyers have been caught with AI-hallucinated citations. Don't be next.

AI-generated legal text looks authoritative. But fabricated case law, invented statutes, and hallucinated citations are ending careers.

The scale of the problem

Our database tracks 712+ documented cases of lawyers submitting court filings that contained AI-hallucinated citations. These are not edge cases: they span multiple jurisdictions, practice areas, and experience levels. Stanford research found that even premium legal AI tools hallucinate between 17% and 33% of the time.

The consequences are real. In the MyPillow Inc. case, a lawyer was fined $3,000 for submitting AI-fabricated citations. Other cases have resulted in sanctions, withdrawn filings, and disciplinary proceedings. Courts increasingly require disclosure of AI use, and treat hallucinated citations as a matter of professional competence.

How Legal AI Scope catches it

Legal AI Scope analyses AI-generated legal text against seven sector-specific risk flags: unauthorised legal advice, hallucinated citations, jurisdiction mismatches, outdated precedent, misapplied legal standards, omitted qualifications, and confidence-accuracy gaps. Each flag is grounded in verifiable primary sources.
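To make one of these flags concrete, here is a minimal sketch of what a hallucinated-citation check could look like in principle. This is an illustrative assumption, not Legal AI Scope's actual implementation: it extracts US reporter-style citations (e.g. "123 F.3d 456") with a simplified pattern and flags any that are absent from a caller-supplied set of verified citations.

```python
import re

# Illustrative sketch only -- an assumed approach, NOT Legal AI Scope's
# actual method. Matches simplified US reporter citations such as
# "123 F.3d 456", "410 U.S. 113", or "999 F. Supp. 2d 1".
CITATION_RE = re.compile(
    r"\b\d{1,3}\s+"
    r"(?:U\.S\.|S\.\s?Ct\.|F\.(?:2d|3d|4th)|F\.\s?Supp\.(?:\s2d|\s3d)?)"
    r"\s+\d{1,4}\b"
)

def flag_unverified_citations(text: str, verified: set[str]) -> list[str]:
    """Return citations found in text that are not in the verified set."""
    return [c for c in CITATION_RE.findall(text) if c not in verified]
```

For example, checking a passage against a verified set of one citation:

```python
text = "See Smith v. Jones, 123 F.3d 456; see also Doe v. Roe, 999 F.4th 1."
flag_unverified_citations(text, {"123 F.3d 456"})  # → ["999 F.4th 1"]
```

A production check would of course verify against an authoritative citation database rather than a hand-built set, and real citation grammars are far more varied than this pattern covers.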

Paste any AI-generated legal text and get an immediate risk breakdown before it reaches a client, a court, or a regulator. Free, independent, and not affiliated with any AI provider or legal technology vendor.