The federal police union is calling for the establishment of a dedicated portal where victims of AI deepfakes can report incidents to police. They have also expressed concern about the pressure on police to quickly prosecute the first person charged last year with distributing deepfake images of women.
Attorney General Mark Dreyfus introduced legislation in June to criminalize the non-consensual sharing of sexually explicit images created using artificial intelligence. The Australian Federal Police Association (Afpa) supports the bill, citing the challenges of enforcing current laws.
Afpa highlighted a case in which a man was arrested for distributing deepfake images to schools and sports associations in Brisbane. They emphasized the complexity of investigating deepfakes, as identifying both perpetrators and victims can be challenging.
Afpa raised concerns about the limitations of pursuing civil action against deepfake creators, citing the high costs involved and the difficulty of identifying the individuals responsible for distributing the images.
They also noted the difficulty of determining where deepfake images originate.
The union urged an overhaul of reporting mechanisms and an educational campaign to raise public awareness, alongside better resources and legislation to help law enforcement address the challenges posed by deepfake technology.
A parliamentary committee is set to convene its first hearing on the proposed legislation in the coming week.
Source: www.theguardian.com