Law Firm Warns Attorneys: AI-Generated Case Law Can Get You Fired

A prominent law firm, Morgan & Morgan, is warning its attorneys about the risks of relying on AI-generated case law after one of its lawyers cited fake cases in a lawsuit against Walmart. The lawyer, Rudwin Ayala, used ChatGPT to supplement his research, but the AI tool returned fictitious case citations that did not correspond to any real decisions.

This incident highlights the potential consequences of relying on AI-generated information without proper verification. Ayala's mistake led to his removal from the case, and his supervisor, T. Michael Morgan, had to take over. Morgan & Morgan's chief transformation officer, Yath Ithayakumar, warned the firm's attorneys that citing fake AI-generated cases could lead to disciplinary actions, including termination.

This is not an isolated incident. Several other lawyers have faced sanctions and reputational damage after improperly citing AI-fabricated cases in court filings. AI can be a useful aid in legal research, but lawyers must verify the accuracy of any information an AI tool provides before relying on it to avoid these kinds of mistakes.

About the author

TOOLHUNT

Effortlessly find the right tools for the job.
