Recent crackdowns in the European nation have resulted in several arrests amid nationwide investigations. This comes on the heels of reports from mid-2020 indicating authorities were “overwhelmed” by the number of reported incidents.

As part of the country’s efforts to fight child sexual abuse, investigators and child advocacy groups have requested the use of deepfake generators to produce artificial “Kinderpornografie.” This, reportedly, would include imagery created using a database containing actual images of child sexual abuse.

The reasoning behind the investigators’ request, according to a report from local news outlet Süddeutsche Zeitung, is that it would allow undercover agents to infiltrate child sexual abuse rings. These so-called dark web groups often solicit images from prospective new members as a form of initiation and vetting. It’s typically illegal for law enforcement officers to provide investigative targets with actual depictions of child sexual abuse.

Despite the ethical concerns surrounding the generation of novel, artificial depictions of child sexual abuse, experts in Germany believe the use of such materials will make it easier to identify and arrest predators who operate online. The new legislation also allows for the arrest and charging of adults who unwittingly attempt to groom a parent or undercover officer whom they believe to be a minor.

Quick take: There was mild opposition to the law at its inception from political leaders who feared the use of criminal content to capture criminals was unwarranted. Even some lawmakers who supported the effort saw the potential for negative impact. In 2019, Stephan Thomae, the deputy leader of the opposition Free Democrats’ parliamentary group, said “the goal should actually be to eliminate child pornography material from the internet, and not to enrich it with computer-generated material,” before ultimately supporting the initiative.

Despite the potential for good, it will take a while to sort out the ramifications of this decision.
We’re not sure whether the German investigators have developed a method to tag and track the images as they’re passed around, or whether they’ve tested them for resilience against detection. H/t: Jack Clark, Import AI