- cross-posted to:
- [email protected]
- [email protected]
An AI avatar made to look and sound like the likeness of a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time the AI avatar of a victim—in this case, a dead man—has ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”
Your emotions don’t always line up with “what you know” — that’s why evidence rules exist in court. Humans don’t work that way. It’s also why mistrials can be declared when certain kinds of evidence are revealed to the jury that shouldn’t have been shown.
Digital reenactments shouldn’t be allowed, even with disclaimers to the court. It is fiction and has no place here.
Sure, but those rules don’t apply to victim impact statements. Hearsay, speculation, etc. have always been fair game in victim impact statements, and victim statements aren’t even given under oath. Plus, the other side isn’t allowed to cross-examine them. It’s not evidence, and it’s not “testimony” in a formal sense, because it’s not under oath or under penalty of perjury.