We have been witnessing the surprising journey of ChatGPT since its launch in November last year. A souped-up chatbot, the AI tool can respond to queries on almost anything, from the trivial to the sprawling. It has even drawn the attention of students by helping them write essays and crack maths problems. But the news of a Colombian judge using it has caused a stir in the AI world.

According to a report by The Guardian, Juan Manuel Padilla, a judge in the Caribbean city of Cartagena, has admitted that he used OpenAI’s chatbot ChatGPT to write a legal ruling in a recent case concerning the medical care of a child.

Judge Padilla was presiding over a case in which he had to decide whether an autistic child from a low-income family should have to pay for medical treatment and transportation, and he turned to ChatGPT for help. After asking the chatbot about the case, he ruled in favour of the child, stating that the child’s medical plan should cover all the expenses because his parents could not afford them.

The Padilla-ChatGPT conversation that sparked the dispute

Though the judgement itself caused little fuss, the inclusion of the Padilla-ChatGPT conversation in the ruling made the decision controversial. The legal documents show that Padilla asked ChatGPT precisely, “Is an autistic minor exonerated from paying fees for their therapies?”, to which ChatGPT responded, “Yes, this is correct. According to the regulations in Colombia, minors diagnosed with autism are exempt from paying fees for their therapies.” That answer aligned with the judge’s final decision in favour of the child. The judgement prompted a discussion on the reliability and use of AI in law and drew considerable criticism, even from Padilla’s colleagues. But the judge defended the technology, saying it could help make Colombia’s legal system more efficient. He added that he also cited previous rulings as precedents to support his decision.

Judge Padilla on ChatGPT

Padilla said that “ChatGPT and other such programs could be useful to facilitate the drafting of texts but not with the aim of replacing judges”. He added, “by asking questions to the application, we do not stop being judges, thinking beings”. He described ChatGPT’s answers as being delivered in an “organised, simple and structured manner” that could help “improve response times” in the judicial system.

Criticism of the judgement

Many legal professionals voiced their disagreement with the decision. Prof Juan David Gutierrez of Rosario University said there is “a need for urgent digital literacy training for judges”. Octavio Tejeiro, a judge in Colombia’s Supreme Court, said AI had instigated “a moral panic” in law, with people fearing that robots would replace judges, though he also shared his view that such tools will eventually be accepted by the public. He nevertheless cautioned that relying on ChatGPT to decide judgements is unethical and misleading, as the tool can be imperfect and propose wrong answers. “It must be seen as an instrument that serves the judge to improve his judgment. We cannot allow the tool to become more important than the person,” Tejeiro added.

The legal implications

ChatGPT generates responses based on the information it was trained on, which can produce biased output, and it often fabricates information. Though AI tools help people reduce their workload, the legal and ethical side remains vulnerable with regard to surveillance, privacy, discrimination or bias, and broader philosophical challenges. These concerns create a dilemma for the public and foster distrust of the technology. Moreover, people are increasingly wary of the latest digital technologies as they have become a new source of inaccuracy and data breaches.

Though Colombia approved a law in 2022 that encourages public lawyers to use technology where possible to make their work more efficient, debate continues over whether AI fits within existing legal categories or whether a new category with special implications should be created.
