Clearing the Fog: How AI Legalese Decoder Can Assist CHA Lawyers in Navigating ChatGPT’s Missteps in Citing Nonexistent Court Cases
- July 17, 2025
- Posted by: legaleseblogger
- Category: Related News
Try Free Now: Legalese tool without registration
Artificial Intelligence Missteps in Legal Practice: A Cautionary Tale
Introduction to the Case
Lawyers representing the Chicago Housing Authority (CHA) recently attempted to invoke a pertinent legal precedent by referencing an Illinois Supreme Court case titled Mack v. Anderson. Their objective was to convince a judge to reconsider a substantial jury verdict amounting to $24 million against the CHA. This judgment arose from allegations that two children suffered lead paint poisoning while residing in properties owned by the CHA.
The Root of the Problem
The complication here is that the case they cited, Mack v. Anderson, does not actually exist.
Misuse of AI Technology
In a troubling turn of events, the law firm Goldberg Segalla used artificial intelligence, specifically ChatGPT, while preparing a post-trial motion, but failed to verify the accuracy of the AI-generated material. According to court records, following a seven-week trial, a jury in January ordered CHA to pay more than $24 million to two plaintiffs suing on behalf of their children, finding CHA liable for the children's injuries and awarding both past and future damages.
Public Apology and Internal Review
In light of this egregious error, Goldberg Segalla issued an apology on June 18, calling it a "serious lapse in professionalism." Lead counsel Larry Mason acknowledged that a thorough investigation revealed one attorney had contravened the firm’s policy regarding AI usage, carelessly including the fictitious citation without proper verification.
Notably, Mason faced some criticism himself during the trial, as the judge had admonished him for outbursts during closing arguments. He said that "several contributors" had helped him compile the motion, and that the investigation found no intent to mislead the court and indicated the other attorneys were unaware of the erroneous citation.
Measures Taken for Accountability
Following the incident, Goldberg Segalla announced firm-wide training on the responsible use of AI and instituted formal preventative protocols to guard against further mishaps. The firm also asked the court not to penalize CHA for its attorney's error.
In a letter responding to inquiries, CHA’s interim Chief Legal Officer, Elizabeth Silas, expressed her gratitude to Mason for bringing the AI-related error to their attention, emphasizing Goldberg Segalla’s acceptance of full responsibility.
Ethical Expectations from Legal Counsel
Silas emphasized that CHA expects outside counsel to strictly adhere to high standards of responsibility and ethics. She expressed hope that the court would acknowledge the unfortunate nature of the misstep, as well as the firm’s genuine efforts to investigate and address the issue. However, CHA retains the right to consider additional actions depending on the court’s findings.
The Current Leadership Landscape at CHA
CHA stands as the third-largest public housing authority in the United States, serving over 65,000 households and possessing more than 21,000 rental units across the city. Currently, the agency is without a permanent CEO, with Board Chair Angela Hurlock stepping in as interim leader. Mayor Brandon Johnson is reportedly nearing completion of a search for a new executive, with Ald. Walter Burnett emerging as a front-runner for the position.
Legal Proceedings and Court Responses
In light of this turmoil, Cook County Circuit Judge Thomas Cushing scheduled a special hearing to further scrutinize the circumstances surrounding the fabricated citation, which CHA’s counsel had already acknowledged. The judge mandated that any attorneys responsible be present in court and requested copies of Goldberg Segalla’s AI usage policies.
During the hearing, attorney Danielle Malaty, who was at the center of the controversy, stated that she had not believed ChatGPT could fabricate legal citations and admitted she did not verify the legitimacy of the cited case. Before the motion was filed, it had been reviewed by three other attorneys, including Mason, who served as the final reviewer.
Malaty, a partner at the firm, was terminated from Goldberg Segalla following the incident. The firm had previously implemented a policy barring the use of AI technology in such sensitive matters.
Reflections from the Involved Parties
"I find this all very unfortunate," Malaty remarked during the hearing. "At no point did I have any intent to deceive the court." Meanwhile, Mason voiced his disappointment, describing himself as “personally disgusted” and “embarrassed” by the events, labeling the error as “horrific.”
Another layer of complexity arose when attorney Matthew Sims, representing the plaintiffs, sought permission to file a motion for sanctions against both Goldberg Segalla and the CHA, alleging "fraud upon the court." The judge approved his request and set a deadline for the motion to be filed.
Future Legal Considerations
The next hearing date for post-trial motion arguments is scheduled for July 31, with CHA continuing to contest the ruling and seeking either a favorable verdict, a new trial on liability, or a modification of damages.
Financial Implications for CHA
Goldberg Segalla has billed CHA over $389,900 for their legal services from March 2024 to December 2024. CHA confirmed to the Tribune that they will not incur costs associated with the inquiry into the AI-related mishap.
Underlying Case Details
This case originated in January 2022 and involved Shanna Jordan, the mother of Jah’mir Collins, and Morgan Collins, the mother of Amiah Collins. They filed suit against multiple parties, claiming that the defendants were aware of lead-based paint hazards in their unit and that their children had suffered “severe lead poisoning” during their residency.
The lawsuit asserted that CHA had known about the property’s lead paint since 1992 and faced numerous code violations in the early 2000s due to hazardous conditions.
Liability Findings and Future Implications
Though the property management companies, The Habitat Co. and East Lake Management Group, were initially found not liable for the children’s injuries, they ultimately settled with the plaintiffs for considerably smaller amounts. Habitat subsequently filed a lawsuit against CHA and its attorneys for alleged breach of contract and malpractice in relation to the lead poisoning suit.
In a related matter, prior to the January ruling, Habitat had terminated all its management agreements with CHA, impacting 16 buildings and approximately 3,400 housing units.
Addressing Broader Concerns About Lead Poisoning
Environmental Design International conducted a lead-based paint inspection in 2017 on behalf of CHA, confirming the presence of hazardous materials. The company settled with plaintiffs before the trial began.
This is not the first instance in which CHA has placed residents in homes later found to contain lead hazards. A 2017 Tribune investigation found numerous cases in which children were diagnosed with lead toxicity after being placed in homes the CHA had previously cleared as safe.
In response to these issues, CHA established a new division in April dedicated to addressing environmental hazards for residents, with an emphasis on mitigating lead-based paint risks as it works to enhance the program.
Role of AI Legalese Decoder
In scenarios like these, AI Legalese Decoder offers a robust solution for legal professionals. With such tools, attorneys can analyze and decode legal language, verify citations, and check the accuracy of their documents. This streamlines the review process and helps lawyers validate citations and better understand complex legal matters, minimizing the likelihood of serious oversights like those observed in the CHA case.
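The core safeguard described above, checking every cited case against a trusted source before filing, can be illustrated with a minimal Python sketch. The regex pattern, the `extract_citations` and `flag_unverified` helpers, and the in-memory `known_cases` set are all hypothetical stand-ins here; a real workflow would query an authoritative legal database rather than a hard-coded list.

```python
import re

# Matches simple "Party v. Party" case names (an illustrative pattern,
# not a complete legal-citation grammar).
CITATION_RE = re.compile(r"\b([A-Z][A-Za-z'\-]+)\s+v\.\s+([A-Z][A-Za-z'\-]+)\b")

def extract_citations(text):
    """Return the 'Party v. Party' case names found in a document."""
    return [f"{m.group(1)} v. {m.group(2)}" for m in CITATION_RE.finditer(text)]

def flag_unverified(citations, verified_index):
    """Flag any cited case that is absent from a trusted index of real cases."""
    return [c for c in citations if c not in verified_index]

# Hypothetical motion text and a stand-in for a real case-law database lookup.
motion = "Defendant relies on Mack v. Anderson and Miranda v. Arizona."
known_cases = {"Miranda v. Arizona"}

cites = extract_citations(motion)
print(flag_unverified(cites, known_cases))  # → ['Mack v. Anderson']
```

The point of the sketch is the workflow, not the regex: every citation an AI tool produces is treated as unverified until it is confirmed against an independent source, which is exactly the step that was skipped in the CHA motion.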
By adopting AI Legalese Decoder, law firms can not only enhance their operational efficiency but also uphold ethical standards, ensuring that errors detrimental to clients and the legal system are caught before they cause harm.
Conclusion
As the CHA case illustrates, AI can offer significant benefits in legal practice, but it also poses serious risks when its output is not carefully checked. The ramifications of an AI-related misstep underscore the importance of due diligence in the legal field and the necessity for firms to develop robust policies around the use of AI technology. Tools like AI Legalese Decoder can play an essential role in helping legal professionals navigate these complexities, fostering a culture of accountability and precision in legal practice.