AI use in Arizona brings up ethical questions in Indiana


Stacey Wales, sister of the late Christopher Pelkey, displays her brother’s image. She also helped create an artificial intelligence version of him that was allowed to speak to a judge via video. (Matt York/AP photo)

The family of a man killed in a road rage incident in Arizona let their loved one speak for himself during his killer’s sentencing hearing last month.

More specifically, they let an AI-generated version of him speak before the judge, marking a likely first for the nation, according to Fox 10 in Phoenix.

The video shows a digitally altered version of Christopher Pelkey, who was shot to death in 2021, delivering what he might have said in the courtroom: an offer of forgiveness to the man who took his life.

And while the judge presiding over the case seemed to appreciate the sentiment, some judicial officials in Indiana are on the fence about it.

“It’s a computer model created with information about the victim, but it’s not the victim, and that’s my concern,” said Andrew Bloch, a judge in Hamilton Circuit Court. “Whether manipulation is intentional or not, you’re ultimately telling the computer what to do, including how to act, and to me, that’s not the victim.”

The technological experiment reflects how rapidly AI is spreading through the legal system nationwide, even among laypeople, and suggests where courts could be headed with the technology.

The video was created by Pelkey’s brother-in-law and his business partner using clips of Pelkey’s voice and a photo of him, according to NPR. Pelkey’s sister, Stacey Wales, couldn’t find a long, clear audio clip of his voice, or audio of his laugh without background noise, so AI technology was used to make adjustments.

The image of Pelkey had to be altered as well: his sunglasses were removed from the top of his hat, and his beard was trimmed to resolve technological issues.

The words Pelkey spoke, thanking the judge for working on the case and calling growing old a gift, were written by Wales, based on what she believed her brother would’ve said in front of the judge.

Ultimately, Wales wanted to paint a complete picture of Pelkey and humanize him, NPR reported.

“I knew what he stood for and it was just very clear to me what he would say,” she told NPR.

Courtroom dilemma


David Dreyer, Marion Superior Court senior judge, said a video like this is more likely to be accepted by a judge at a sentencing hearing than at a trial because of how it could affect a jury.

“Judges on their own, at a sentencing hearing or something, they will tend to allow parties to present whatever they want, and they can sort it out…juries wouldn’t have the same experience to sort it out. They could be very affected [by] this,” he said.

Both Dreyer and Bloch said that while a judge may approve of a video that evokes the same reaction as a traditional victim impact statement, how that video affects a defendant’s case is a separate matter to consider.

In Indiana, Dreyer pointed back to the state’s rules of evidence, which weigh the probative value of evidence against its prejudicial impact.


“Even though something is admissible or relevant, if its prejudicial impact outweighs the weight or how much it’s going to show, a judge could keep it out,” Dreyer said.

In Arizona, the defendant was sentenced to 10 1/2 years for manslaughter, one year more than what the state asked for, according to ABC 15 Arizona. Attorneys for the defendant are now pushing back against his conviction, filing an appeal shortly after his sentencing.

An attorney for the defendant stated the following, according to Fox 10: “It’s just simply inauthentic to put the words in the mouth of the likeness. It’s much like Geppetto putting words in [Pinocchio’s] mouth. Those words were a stark contrast from the reality that numerous witnesses testified to, those being Chris Pelkey’s last words of challenging my client to a fight, violently getting out of his car in a crowded intersection, waving his arms in the air.”

Bloch, who has a background in IT, is a major proponent of technology and its benefits for attorneys. He said he uses AI for administrative tasks. But he said he probably wouldn’t allow a video like Pelkey’s in his courtroom, especially before a sentencing decision is made.

He said that ultimately the video is not Pelkey, and the words coming from his mouth are not his, regardless of who wrote them or what their intent was.

He’s also interested in what prompts were fed to the AI to build the script.

“You ultimately have a human author inputting information about this victim into the computer and telling AI…we don’t know what they said. Did they tell the AI to come across as sympathetic?” he said.

This artificial intelligence version of road rage victim Christopher Pelkey was allowed to speak at the sentencing of the man convicted in his death. (Screenshot from YouTube)

The future of AI in courts

As in other professional fields, the use of artificial intelligence is being explored in different capacities throughout Indiana’s legal system.

Many professionals, like Bloch, are using AI to lighten their workload, implementing the technology to summarize documents and identify relevant case law.

Its use, of course, requires caution. Despite its benefits, experts urge AI users to always fact-check the technology’s work.


Another supporter of AI and its benefits, Frank Emmert, executive director of the Center for International and Comparative Law at the Indiana University McKinney School of Law, teaches AI regulation at McKinney.

While he encourages the technology’s use, he also stresses the importance of responsible use to students.

“We used to have footnotes with references to law review articles, books, and so on. Now, they will have to disclose that certain things are from AI and that’s okay, if they disclose it honestly, because that’s the modern world,” Emmert said.

Emmert believes that the use of AI will only increase, especially as a tech-savvy generation of law students graduates into the legal profession.

“We have to learn how to use this responsibly to make our work more efficient, to make our lives better, and to prevent and avoid abuse and misuse in any kind of way,” he said.

According to the latest Legal Trends Report by legal technology company Clio, 79% of legal professionals surveyed are currently using AI in their practice.

In addition, 81% of Gen-Z-aged clients are open to law firms using AI.

In March, Mark Dinsmore, a magistrate judge in the U.S. District Court for the Southern District of Indiana, called for disciplinary action against a Texas-based attorney practicing in Indiana after the attorney allegedly submitted briefs containing AI-generated citations to nonexistent cases.

“It is one thing to use AI to assist with initial research, and even non-legal AI programs may provide a helpful 30,000-foot view. It is an entirely different thing, however, to rely on the output of a generative AI program without verifying the current treatment or validity—or, indeed, the very existence—of the case presented. Confirming a case is good law is a basic, routine matter and something to be expected from a practicing attorney,” Dinsmore wrote in his order.

Bloch acknowledges the benefits of AI use but still weighs it against the ethical standards of the profession. When it comes down to it, he wants clients to trust judicial officers.

“I want people to have confidence in the decisions our judiciary puts out, because we all work very hard to make the right choices, and when we start putting in things that are intentional or unintentional manipulation, that calls into question everything we’re doing,” he said.•
