Last year, Kaylin Heyman appeared in a Pittsburgh courtroom to testify against a man she had never met who had used artificial intelligence technology to create pornographic images using her face.
Kaylin, 16, is a child actor who appeared on the Disney show “Just Roll With It” from 2019 to 2021. The perpetrator, a 57-year-old man named James Smelko, targeted her because of her public profile. She was one of about 40 of his victims, all of them child actors. In one of the images of Kaylin submitted as evidence at the trial, Smelko had taken her face from a photo she posted on Instagram when she was 12 years old and working on set, and superimposed it onto another person’s nude body.
“There have been times when I have cried uncontrollably because I don’t understand why some people are so evil,” she said in an interview with the Guardian. “I just can’t wrap my head around that.”
Kaylin lived in Ventura, California, while Smelko was based in Pennsylvania when he committed these crimes against her. She was shocked to learn that her case could only go to trial because it was an interstate crime: possession of depictions of child sexual abuse is a crime under US federal law, but under California law at the time it was not considered illegal.
Kaylin put her fear into action. This year, she became a staunch public advocate for California’s new bill, AB 1831, which expands the scope of existing law on child sexual abuse material (CSAM) to include images and videos that are digitally altered or generated by AI. In June, she testified in support of the bill at the state Capitol in Sacramento.
“I told them how I felt violated and that I was really appalled that this wasn’t already a crime in California,” Kaylin says. “California is a huge part of the acting industry, and there are a lot of kids who weren’t protected from this crime.”
California Governor Gavin Newsom signed the bill at the end of September. Child predators who create such materials can face prison time or fines of up to $100,000 in the state.
Although the new law focuses on AI in the hands of child predators, Kaylin’s parents, Mark and Charlene, say other factors in her life put her at risk of encountering predators like Smelko.
Kaylin got her first Instagram account when she was 10 years old. The social network requires users to be at least 13 to sign up, except for accounts managed by parents. Smelko downloaded photos from her profile and created sexualized images that combined her face with the nude bodies of other girls and women.
“Disney set up an Instagram account specifically to promote the show and themselves,” Mark says. “But the bigger problem is when these companies hire these kids, put them out there, and don’t provide support.”
That support should include training and counseling on how to deal with harassment and how to block accounts, he said. Kaylin similarly lays the blame at Disney’s feet.
“Disney’s public relations team signed me and all of the Disney kids up on the app. They sent us clips to post on Instagram every week as the episodes came out,” Kaylin says. “It started with my work and them planting a seed. I want them to take some responsibility, but that hasn’t happened yet.”
In recent years, men have harassed Kaylin by sending her nude photos through her Instagram and TikTok accounts. She said she reported the messages to both social media companies, but no action was taken.
“She’s definitely had her fair share of creepy stalkers who kept pestering her,” says Charlene.
Mark also believes the Hollywood actors’ union, SAG-AFTRA, needs to be more proactive in educating its members about the dangers of predators using AI and social media to victimize public figures. Kaylin still uses her Instagram account, and her parents regularly check it.
“We read a lot of the comments and think, ‘What is wrong with people?’ But we don’t know that we can escape it. It’s hard to be in this industry without social media,” says Charlene. “We want social media companies to censor responsibly and protect users.”
Instagram has announced several initiatives in recent years to strengthen protections for users under 16, including parental controls and restrictions on who can send them messages. In September, the company announced that all accounts for users under 18 would be private by default, a move praised by child safety advocates. Under Meta’s guidelines, the same restrictions apply to minors’ verified accounts.
“There are so many inappropriate images on Instagram and I don’t understand why they are being sent to children,” said Kaylin, who turns 17 this month. “Instagram should say, ‘No, that’s not allowed’ and delete it. But that’s not going to happen and I don’t understand it.”
Meta said in a statement that it has detailed and robust policies against child nudity and exploitation, covering both real imagery and imagery created using GenAI.
“SAG-AFTRA has been educating, negotiating, and legislating about the dangers of deepfake technology since at least 2018,” said SAG-AFTRA general counsel Jeffrey Bennett. Bennett noted that the guild covers deepfakes in its magazine, participates in panels and publishes articles on the subject.
Disney has not commented.
The online circulation of CSAM is increasing. Predators have traditionally used photo-editing software, but recent advances in AI models provide an easily accessible way to mass-produce more realistic images of child abuse. The National Center for Missing and Exploited Children (NCMEC), the US-based global clearinghouse for CSAM reports, received 36.2 million reports of online child abuse in 2023, a 12% increase from the previous year. Most of them came from Meta.
While most of the reports involved actual photos and videos of children being sexually abused, NCMEC also received 4,700 reports of images and videos of child sexual exploitation created by generative AI. The organization has criticized AI companies for not proactively trying to prevent or detect the creation of CSAM.
Kaylin said discovering that her face had been used to create CSAM marked the end of her childhood innocence. She is now more concerned about her safety and the safety of other children and teens she knows.
“I always get nervous when I see a man or someone looking at me a little strangely,” she says. “I’m always thinking about the worst that could happen in a particular situation. I think it’s something that young women have to get used to. It’s a shame that I had to get that wake-up call at such a young age. I think it’s part of life,” she added.
She says testifying at Smelko’s trial a year ago meant regaining some control over the situation. In the courtroom, she focused on answering the prosecutor’s questions and facing the jury, and only glanced at the stranger on trial for sexually exploiting her.
“When I caught a glimpse of him, he looked like he had lived a really sad life. He wasn’t a first-time offender, so he had probably spent much of his life cooped up indoors,” she says. After she testified, Smelko was found guilty of two counts of possession of child pornography.
Kaylin is determined to continue acting and hopes to someday appear in a movie. But right now, she’s focused on finishing her senior year of high school and advocating against online child exploitation. The ordeal has also given her a new ambition: she hopes to attend law school and one day become a children’s rights attorney.
Source: www.theguardian.com