'The left is protecting predators': Police reveal new trends in 'child sexual abuse materials'

Image: crying child, licensed under Canva
Submitted by Randy Snyder

--

Over the past few years, there have been several disturbing trends in the realm of child sexual abuse. Beyond the significant increase in reported cases, which have grown from approximately 4 million annual reports in 2014 to more than 32 million in 2022, there has been a push, largely from the political left, to encourage the very behaviors that normal-thinking people find reprehensible.

While portions of the country are fighting against behaviors that are harmful to children, some, like Kentucky state Senator Karen Berg (D), are proposing that child sex offenders be provided lifelike, tax-funded child sex dolls.

Berg believes that these dolls, which are designed to simulate children, even infants, and which are entirely lifelike, including multiple realistic orifices, childlike features, and in some cases audio recordings to simulate pleasure and/or pain, can provide an outlet for the sexual desires of pedophiles, thereby deterring them from abusing real children.

However, many who actually work in the fields of psychology and child sexual abuse know that this is entirely false. Dr. Michael Bourke indicates that while the dolls are being used as a substitute for hands-on offending, they would not prevent abuse.

Dr. Bourke is quoted as saying, "These dolls make the sexual fantasies of pedophiles more real. And making their fantasies more real is precisely what we want to avoid. We do not want them to reach the point where they become habituated to their dolls.

"The child sex dolls reinforce and increase the longing and desire to abuse children, and because the dolls are only a substitute for what the offender truly wants, they are likely to 'push' offenders down the road toward acting out with a real child."

Additionally, Melanie Blow, COO of The Stop Abuse Campaign, said, "[Pedophilia] is not exactly a normal sexual attraction. Not only is it unhealthy, but [pedophiles] tend to have trouble controlling it. It is really bad to reinforce that attraction. Sex offenders are not treated clinically by giving them a handful of child [sexual abuse material]; that's not how it's done because that's not the right way to do it."

In layman’s terms, providing child sex dolls to those who desire sex with children is like giving an alcoholic a beer, because “at least it’s not whiskey.” The reality of these dolls is that they are not therapeutic, serve no legitimate medical or educational purpose, and are merely a tool for sexual offenders to practice their deviant and harmful sexual behaviors until, or in lieu of, abusing actual children.

These dolls can also be customized to the specific perversion of the offender. Dolls can be tailored to the preferred age, gender, ethnicity, or other physical characteristics, even incorporating the features of an actual child. Cases have been documented of offenders stealing images from Facebook to model their doll after an actual child they desire, and other cases have documented offenders putting the pictures of their actual, or intended, victims on the dolls.

One would think that headlines like “Silicone ‘baby’ sold as sex toy online horrifies Twitter: ‘So sick and demented’”, describing a sex toy in the shape of an infant, formerly referred to as a “butt baby,” that is currently sold online by an adult website, should deter people from even considering these dolls as anything but a perversion, yet some do not.

And contrary to the senator’s argument that these dolls will curb the desires of offenders by providing them an outlet for their sexual perversion, studies find that, much like possession of CSAM (child sexual abuse material, commonly, but inaccurately, referred to as child pornography), there is a high prevalence of co-occurrence between possession of these materials and actual offending.

One article detailed a study that contacted 85 persons who owned dolls and 120 "minor attracted persons" (I hate that term; let's just call them pedophiles!) and found that of the 120 pedophiles who did not own the dolls, 79% expressed interest in owning one. Additionally, "doll ownership was associated with lower levels of sexual preoccupation and self-reported arousal to hypothetical abuse scenarios, but HIGHER (emphasis added) levels of sexually objectifying behaviors and anticipated enjoyment of sexual encounters with children.”

This is analogous to other studies, such as the “Butner” Study, which said, “Our findings show that the Internet offenders in our sample (nearly two thirds of the sample) were significantly more likely than not to have sexually abused a child via a hands-on act. They also indicate that the offenders who abused children were likely to have offended against multiple victims, and that the incidence of 'crossover' by gender and age is high.”

In fact, depending on the study, co-occurrence rates between possession of CSAM and hands-on sexual abuse of children were found to range between 35% and 85%.


Another disturbing trend is the use of Artificial Intelligence (AI) software to generate fake pornographic images of real people. While Taylor Swift is one of the most well-known victims of this new trend, ordinary children are being victimized by it as well.

Computer-generated images and videos of child sexual abuse material, including depictions of infants and toddlers being raped, have been found in investigations across the globe. In Canada, a Quebec man was sentenced to more than three years in prison for AI-generated CSAM.

The Internet Watch Foundation confirmed that at least seven web addresses have displayed or distributed confirmed AI-generated CSAM, with some of the children depicted appearing as young as three years of age and engaged in acts that would otherwise classify the images as Category A or B, the most severe classifications.

A recent poll showed that 80% of the respondents on a pedophilic forum said they had either already used or intended to use AI tools to generate CSAM. The technology is rapidly catching on within the pedophilic community, as indicated by the chief legal officer at NCMEC, who noted that her team has been observing a sharp uptick in AI-generated images, as well as reports of people uploading images of child sexual abuse into AI tools in hopes of generating more.

While AI-generated content does not always involve harm to a real child, it is well documented that this material normalizes child abuse and may encourage and embolden those who want to abuse children.

NCA Director General Graeme Biggar is quoted as saying, “We assess that the viewing of these images – whether real or AI-generated – materially increases the risk of offenders moving on to sexually abusing children themselves."

This co-occurrence between online and hands-on offending has been known for some time. In 2008, Bourke and Hernandez conducted what is known as the Butner study, investigating the co-occurrence of these offense types. They discovered that 85% of the online offenders had also committed hands-on offenses, with a reported total of 1,777 victims, an average of 13.56 victims per offender.

Investigators also know that real CSAM and CGI- or AI-generated CSAM do not occur independently. It is common when conducting forensic analysis of offenders’ devices to find both real CSAM and AI- or CGI-generated material. This co-occurrence means that an offender could have both types, without any clear indication of causation.

Information obtained from fellow investigators about their experiences includes comments such as: “I encountered a phone recently that contained a lot of rudimentary altered/AI CP which seems to have been created using free apps. These applications allow you to upload a clothed image of a subject and the software uses AI to simulate what that image would look like nude. Some of the applications that work this way are: …”
 
And, “A cybertip a few weeks ago from 4chan was AI CSAM; the image was of such realism that I couldn’t tell the difference. I found through the filename that it was generated with the 'epiCRealism PureEvolutionV3' AI.”

Regardless of whether a person is looking at images of actual minors being abused or at computer-generated depictions of minors being abused, the viewing shows a predisposition toward child sexual abuse and the probable dangerousness of the individual.

Because of the relative newness of this phenomenon, exact case numbers, nationally or in Arizona, are virtually impossible to determine, especially since some of these images are indistinguishable from actual CSAM. However, a single case in Canada located a suspect with “hundreds of thousands” of CSAM images and videos, who had produced at least seven videos of “deep-fake” abuse material.

The Washington Post reported, “Thousands of AI-generated child-sex images have been found on forums across the dark web, a layer of the internet visible only with special browsers, with some participants sharing detailed guides for how other pedophiles can make their own creations.”

The European Union is also struggling with this phenomenon, with children bearing the brunt of the consequences.

The threat of increased offending is not the only fear surrounding AI-generated content. The impact of exploitation on victims has been well documented. The Canadian Centre for Child Protection conducted a study in 2017 to examine the impact on victims of exploitation.

This Survivors’ Survey, based on 150 returned surveys, documents the impacts of online exploitation on the victims. Long after the abuse itself ends, the impacts of the online availability of images and videos documenting the abuse persist. Thirty percent of the respondents report having been identified by a person who had viewed their CSAM online, some even reporting having been contacted by “fans” or stalked by viewers.

“Moreover,” the study said, “twenty-three respondents said they had been specifically targeted by persons who had recognized them from the child sexual abuse imagery. Most of those who had been targeted provided additional information, reporting having been re-victimized (e.g., assaulted, stalked or propositioned) (71%) or blackmailed (43%) by the persons who had identified them,” and “87% of the respondents who shared information about how being identified from their imagery had impacted them said that they experienced further trauma.”
           
This is especially relevant in the discussion of AI-generated material, because the use of a real child’s image to fabricate sexual images has been increasing. The same situation has been documented in the use of real children’s faces in the creation of child sex dolls, and these doctored AI images have already been found in various investigations.

This new means of creating CSAM can emotionally injure victims, induce stalking and other crimes, and even be used to sextort children to the point of suicide. Reports of "deep-fake" or "synthetic sexually explicit material" (SSEM) have been in the news, as Pornhub has been found to host significant amounts of this type of material.

Additionally, the FBI issued an alert in June 2023 about the growing threat of explicit content (SSEM) being created for sextortion schemes and harassment, and on September 5, 2023, 54 state and territorial Attorneys General submitted a letter to Congress calling for a Congressional study into the impact of AI on CSAM and for solutions to the problem.

Another concern about AI-generated CSAM is the difficulty of investigating this type of material. Current CSAM investigations include attempting to identify, locate, and rescue the minors depicted. Computer-generated content makes identifying real children more difficult and will lead to additional manpower being expended trying to determine which children need to be rescued and which are computer-generated, especially as it relates to the ‘deep fake’ images.

As more effort is needed to distinguish whether a child is real or fake, fewer resources are available for actual rescues of live children being subjected to abuse.

Finally, a recent trend identified in CSAM investigations is the modification of real, identified CSAM images, through AI and other computer applications, into simulated or cartoonish-looking images.

This technology takes records of a child being abused and further dehumanizes the abuse by making it appear animated or otherwise computer-generated. By this method, a person could be in possession (or distribution) of child abuse materials but could not be charged, because he only possesses the “animated” version, which would not be covered under current Arizona legislative language.

Any image depicting what is, or is intended to be, a child holds no legitimate artistic or educational value and would be obscene, not only under the Miller standard (Miller v. California, 413 U.S. 15), but also unprotected by the First Amendment, as ruled in New York v. Ferber (458 U.S. 747).

While these trends are disturbing, they shouldn’t be surprising when legislatures are being populated with people who support sex offenders, pass legislation to reduce penalties for offenses, and in some cases are even offenders themselves.

The state of Colorado no longer calls those who sexually offend against children "sex offenders," since it is a “negative label,” instead using the term “adults who commit sexual offenses.” Likewise, the state of Washington is replacing the term "sex offender" to be less “offensive” to offenders, completely ignoring the feelings of the victims, and even went so far as to try to add a registered offender to the Sex Offender Policy Board, an advisory group, also made up of victims, that recommends appropriate legislation and policy.

But are we really surprised by some of these attempts to normalize perverted and abusive behavior, when the legislatures are infected with the very offenders police are fighting to arrest?

The Virginia state legislature had Joe Morrissey seated until just a few weeks ago. Morrissey married his victim, with whom he fathered three children. She was a 16-year-old intern at his law firm when their relationship began, and he was charged with, and convicted of, contributing to the delinquency of a minor, narrowly avoiding more serious felony charges of indecent liberties with a minor, possession and distribution of child pornography, and electronic solicitation of a minor.

He successfully ran for his seat while in jail for this offense, despite being disbarred (after having his law license suspended twice). His victim has since filed for divorce, citing infidelity as a cause for leaving her husband, who is 40 years her senior.

An AP article in 2019 documented more than 90 state lawmakers accused of sexual misconduct, and another Virginia candidate, Nathan Larson, who fortunately was unsuccessful in his bid for Congress, actually ran on a pro-incest, pedophilia, and rape platform. Larson was later arrested for traveling across the country to abduct a 12-year-old he had been grooming, exploiting, and abusing online, and he died in jail.

All of these problems are being further compounded by the invasion of millions of unknown and unvetted illegal aliens across U.S. borders, some of whom are previously convicted sex offenders, and others of whom are committing violent and sexual offenses against women and children on a regular basis.

We have enough of these offenders in our own population, without importing more from other countries, and allowing them unfettered access to our citizens.

So, what is the solution? How can we better protect children, prevent legislators from enacting harmful policy and hold accountable those who would go after the most vulnerable members of society?

Advocating for legislation that makes it harder for children to be exploited online, like the REPORT Act, the Kids Online Safety Act, the EARN IT Act, the STOP CSAM Act, and many others, is a start. Additionally, introducing legislation in all 50 states AND at the federal level to outlaw AI CSAM, deep fakes, child sex dolls, and other harmful items, and to hold accountable the people behind them, is critical to ensuring the safety of children.

 
The opinions reflected in this article are not necessarily the opinions of LET.