Under Polish criminal law, the distribution and possession of pornographic content involving a minor is punishable by imprisonment from 3 months to 5 years. The problem is that these rules apply only to materials depicting real, living children. Materials produced by artificial intelligence that show a realistically depicted naked child without sexual activity are not punishable in Poland. Experts warn that the law needs to be changed urgently.
Pursuant to Article 202 of the Criminal Code, the dissemination and possession of pornographic content involving a minor is punishable by imprisonment from 3 months to 5 years. This applies to materials depicting real, living children, whether or not they show a sexual act.
By contrast, the possession and dissemination of materials produced or processed by artificial intelligence that show a realistically depicted naked child, or a child presented in a fairy-tale style, but without sexual activity, is not punishable in Poland, said Amanda Krać-Batira, who studied forensic anthropology, in an interview with PAP.
Gaps in Polish law
The same is true of materials featuring children dressed in a sexualised way that do not show actual sexual activity or real, living children. In Poland, stories that verbally describe sexual acts against a child, as well as audio recordings of this kind, are also legal, added Tomasz Sidor in an interview with PAP.
The problem is the definition contained in Article 202 of the Criminal Code, which penalises the production, possession and dissemination of materials "presenting a produced or processed image of a minor participating in sexual activity". We must urgently change this law, the experts warn.
Work on amending legislation
Work on the new wording of Article 202 of the Criminal Code is already under way, advocated mainly by non-governmental organisations and by some public institutions involved in child protection. The main postulate is to replace the concept of "pornographic content depicting a created or processed image of a minor participating in sexual activity" with the term adopted in EU legislation, i.e. "materials showing the sexual exploitation of a child", the expert pointed out.
The word "pornography" should not be utilized in regulations relating to the child, due to the fact that pornography is simply a legitimate amusement business aimed at adults - noted Krać-Batira. any EU countries, including Slovenia, Belgium and Germany, are more restrictive of CSAM legislation.
Problems with international prosecution
Such crimes cross national borders. Meanwhile, certain behaviours are penalised in one country but not in another, said the expert Tomasz Sidor. Consequently, there is a serious problem with the international prosecution of crimes committed by persons with sexual preference disorders.
CSAM can be generated with AI models available on the Internet, which is a major challenge. There are also models that can be run on one's own device and adapted to bypass the filters imposed by manufacturers. No special equipment is needed; often a laptop with a dedicated graphics card is enough.
Access to illegal tools
There are also AI models specially trained to "undress" people depicted in photographs of children or to create CSAM materials. Cybercriminals sell access to such tools on a subscription basis on the darknet.
Working with CSAM materials, of which a single suspect may have thousands on a device, places a huge psychological burden on experts, who are often overloaded or professionally burned out, emphasised Amanda Krać-Batira.
Problems with convictions
I always flag materials that I consider suspicious, even if their possession is not prohibited under Polish law. However, in such cases convictions are rare, says the expert. She once reported that a suspect had on his device hundreds of commercial photos of babies in sleepsuits, downloaded from websites selling clothes. The prosecutor called her and said there was nothing illegal about it.
Some experts use artificial intelligence to help with the preliminary selection of CSAM, but problems arise here too. There are cases of "false positives", i.e. situations where the AI flags as suspicious materials that have nothing to do with children at all, e.g. photos of dumplings with cream, pork cracklings or badgers, she said.
Technological challenges
There are also so-called "false negatives", i.e. situations where the AI fails to detect CSAM. If badly classified material goes to court, it will be very easy to challenge. And an expert bears criminal liability for giving a false opinion, even if he or she does so unintentionally, noted Krać-Batira.
For AI tools to be able to effectively assist in detecting CSAM, they should be specially trained for this purpose and explainable, meaning that information is available on why the algorithm made one decision rather than another. However, AI developers guard the secrets of their algorithms. They are also unwilling to provide information or share tools for detecting AI-generated materials, added Tomasz Sidor.
(PAP) Note: This article was edited with Artificial Intelligence.













