
Shawn Vestal: The corrupt underside of the age of the selfie thrives online

When the former editor of this newspaper, Steven A. Smith, was arrested and accused of purchasing pornographic images and videos of children, a disturbing fact stood out: The young victims appear to have taken and sold the material themselves, using Instagram as a marketplace.

It was far from a unique case. What experts and law enforcement officials call “self-generated child sexual abuse material” has grown to dominate the world of online child pornography – a dark, complicated extension of the age of the nude selfie and sexting, the rise of exploitative grooming online, and lax moderation by social media platforms and others in the digital sphere.

With so-called SG-CSAM, experts say, it is often impossible to tell whether children have been coerced or groomed, or whether others are involved off-camera. But even in cases where they have not been, the children are victims of criminal exploitation: The images create a permanent record of their abuse, and the children are liable to suffer long-term consequences.

“It’s crucial to recognize that the term ‘self-generated’ doesn’t assign blame to the child,” said Cassie Coccaro, the communications lead for Thorn, a nonprofit that fights the sexual exploitation of children. “Increasingly, adults are coercing and grooming children online, asking them for self-generated material.”

The Internet Watch Foundation, a U.K.-based organization that investigates online child abuse, reported that in 2021, nearly 80% of all websites with sexually explicit images of children included self-generated images – a proportion that had risen 77% over the previous year.

The most pronounced rise involved preteens and children as young as 7. The foundation also identified several newsgroups where images were shared, with a similar preponderance of self-generated material.

“In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves,” the foundation said in a report. “The images are created of children often in their bedrooms or another room in a home setting and no abuser is physically present but may be present virtually via the internet.”

Robert Hammer, special agent in charge of Homeland Security Investigations for the Pacific Northwest, said more adult predators are exploiting the ability to hide their identities to make contact with children online, persuade them to share images, and blackmail them with the threat of exposure to continue sending images – a practice that investigators call “sextortion.”

Homeland Security Investigations and the Washington State Patrol operate a task force out of Spokane to investigate these crimes in Eastern Washington. Hammer would not answer questions about the ongoing investigation into Smith, but spoke about the issue and the task force’s work generally.

“Every single day, that team is going through literally hundreds of leads dealing with child exploitation,” he said.

Social media platforms have become a major conduit for such material, as was true in the Smith case. Last month, a team of Stanford University researchers released the results of an investigation demonstrating how easily buyers and sellers connect on Instagram and Twitter, and then migrate to even-less-regulated forums – such as Telegram – or email to execute transactions.

“Our investigation finds that large networks of accounts, purportedly operated by minors, are openly advertising SG-CSAM for sale on social media,” the authors wrote.

“Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers. … The platform’s recommendation algorithms effectively advertise SG-CSAM: These algorithms analyze user behaviors and content consumption to suggest related content and accounts to follow.”

The problem is not confined to a few platforms, however.

“What we know is that any content-hosting platform – any platform with an upload button – can and does host CSAM,” Coccaro said.

‘A trove of poison’

Since its advent, the internet has provided dark channels of connection for people seeking child pornography, and helped create communities of predators who share material.

Well over a decade ago, then-Attorney General Eric Holder warned of the “historic rise” in the number of images being shared online and in the level of violence associated with child exploitation and sexual abuse.

“Tragically, the only place we’ve seen a decrease is in the age of victims,” he said at a 2011 conference on combating child exploitation.

Since then, there has been an explosion of self-made images in that realm, which seems to have accelerated during the pandemic, when children and adults alike were stuck at home more often and seeking connection online.

In its 2022 report, the Internet Watch Foundation identified more than 255,000 individual web pages containing child sexual abuse imagery, the large majority including self-generated images. That total was a slight increase from the previous year, which itself had seen a 64% rise.

Each web page could contain “one, tens, hundreds or even thousands” of individual images or videos, the foundation said.

Children ages 11 to 13 most often appear in self-generated images, but the foundation reported a sharp increase in images of children ages 7 to 10.

Hammer said every new evolution of technology presents a new challenge for investigators, as predators adjust to hide their activities.

“The internet is a trove of treasure, and it’s also a trove of poison,” he said. “When you put these devices in a child’s hands, you have to be cognizant of the dangers that can come with it.”

The increase in self-generated images in criminal contexts has tracked with the rise of the more common forms of SG-CSAM – the nude selfie, the explicit pic, the sext. In an era of constant digital interaction, a large and growing number of young people share such images of themselves with others whom they believe they can trust.

Experts say they may do so for a variety of reasons. They might be experimenting with their sexuality or identity; they might be acting within what they see as a trustworthy romantic relationship; they may be groomed or encouraged by others.

Many children now grow up with phones in hand, starting well before their teenage years, and they often meet and share interests with others online – which can be healthy and positive in many instances, while also creating an environment where the risks of exploitation are great.

Between 33% and 39% of teenagers say they think “it’s normal for people my age to share nudes with one another,” according to surveys taken in 2019, 2020, and 2021 by Thorn.

Twenty-one percent of children ages 9-12 agreed it was normal in the 2020 survey.

“The pathways leading to the production of this imagery are varied, ranging from consensual sexting among peers to coercive grooming by a stranger online, and it may be impossible for investigators to know the circumstances under which SG-CSAM was produced from looking at the picture alone,” Thorn said in one report.

“Regardless of the pathway, the resulting images are still CSAM. Their distribution threatens the well-being of the child in the image, and they can be used by offenders to groom future victims.”

Hammer echoed that point, saying that no form of such sharing by a child can be viewed as consensual or willing.

“A child is a victim when their image goes out across the internet,” he said.

‘There is no delete button’

Thorn’s reports outline the prevalence of, and attitudes about, such self-generated images among young people, based on surveys, focus groups and other research methods.

One in five girls aged 13-17 said they had shared their own nudes with someone online, while 1 in 10 boys had done so. The most used digital platforms among those aged 9-17 were YouTube, TikTok and Instagram, and large proportions of teens said they had secondary accounts to prevent traceability, sometimes referred to as “Finsta” accounts.

Of those who had shared images of themselves, 43% had done so with someone they had not met in person, and two-thirds of those contacts were initiated by the other party, not the child. A third of teens said that they consider an online-only friend one of their closest confidants, and the proportion is higher for LGBTQ kids.

If sexting and sending nudes is common among young people, so is the experience of having such images reshared without consent – one way in which images might leak from the private realm to the public. Nearly 4 in 10 children aged 9-17 said they had a peer whose nude photos were shared or leaked without permission, and 1 in 5 children said they’d viewed such material themselves.

Experts say that it’s vital for parents to be involved with their children’s online activities, having open, honest conversations at an early age and placing limits on their access to apps and social media.

Coccaro said parents should begin discussing online safety with their children as early as possible – “in the single digits” – in open, judgment-free ways that ensure that kids feel safe discussing their digital behaviors. It’s important for young people to understand that there can be a dark permanence behind what might seem like a brief, fleeting online interaction.

“Once an image is online, that image can be shared and copied and duplicated across the web,” Coccaro said. “They’re revictimized again and again.”

Hammer echoed that point.

“Once that image leaves the device, it’s gone,” he said. “There is no delete button once it goes out on the internet.”

‘A very large amount’

In March, federal investigators in Kentucky obtained a search warrant for information related to two Instagram accounts operated by a young teenage girl.

A month later, they received the material from Meta, which owns Instagram, and discovered the girl had been exchanging direct messages with an account called “hermiesays” about selling explicit photos and videos.

This account was Smith’s, and he used it to arrange the purchase, via CashApp, of approximately 30 videos of three girls, ages 10, 11 and 14, between April 2022 and January, investigators say. The hermiesays account remains active; it has made no posts, but follows several thousand other accounts, most of which are pornographic.

Examination of Smith’s phone and computer data revealed “a very large amount” of sexually explicit material involving children. He stands charged with 10 counts of first-degree possession of depictions of minors engaged in sexual conduct, but the affidavit of probable cause in the case says officers found “much more” than that – and that more charges are likely.

Some of the material involved victims as young as 5 being assaulted by men, court documents say.

On July 20, officers served a search warrant at Smith’s South Hill home. When they arrived, he was in the midst of downloading material on his laptop and was engaged in several chats, apparently with “various young females.”

Smith remained in the Spokane County Jail on $25,000 bond as of Friday. He declined a request for an interview for this story.

Court records describe the minor victims as taking the photos and videos of each other themselves, and using social media accounts to sell them.

While the case remains ongoing and investigators aren’t answering specific questions, the records so far do not indicate whether other adults may be involved in the production or sale of the material, or how the interactions with Smith began.

Smith’s arrest came as shocking news to many who knew him from his time as the executive editor of The Spokesman-Review between 2002 and 2008.

Many have noted the seeming irony of his prominent involvement in the newspaper’s investigation into former Mayor Jim West, in which he approved an undercover sting employing a forensics expert posing as a high school student to engage with West online.

That story became controversial over the newspaper’s methods and was the subject of a critical documentary on “Frontline,” which raised questions about some of the newspaper’s reporting and presented the story less as a report unveiling abuses than as a reckless outing of a closeted gay man in a city of little tolerance.

Asked about his reactions to the effect the stories had on West’s career, Smith said, “I have more sympathy for the young men that I believe he sexually abused. I have enormous sympathy for young gay men in our community who I believe were stalked and victimized by a sexual predator, whether or not they were of age.”

Smith left the newspaper in 2008 and taught journalism at the University of Idaho until his retirement in 2020.

‘A time bomb’

The interactions between “hermiesays” and three girls in Kentucky follow a sadly common pattern, according to an investigation by the Stanford Internet Observatory, which focuses on identifying and preventing abuse and negative experiences online.

Acting on a tip from a reporter at the Wall Street Journal, three researchers with the Observatory examined networks on Instagram and Twitter in which minors apparently created and shared illegal material. What they found were large networks of accounts that seem to be operated by children and are “openly advertising” SG-CSAM for sale.

The investigation concluded Instagram was the main platform for these networks, and that its algorithms for suggesting content to viewers would often connect accounts that used child-pornography-related hashtags to similar accounts.

Co-author David Thiel, the chief technologist for the Stanford Internet Observatory, said in a recent podcast interview that the people operating the Instagram accounts understand how to market to their audience, following the pattern used by adult content providers.

“On Instagram, it’s a fairly sophisticated operation,” Thiel said. “These kids understand how content enforcement works. They understand how to juggle accounts correctly. They know how to advertise using stories, do networking via stories, chat out to other accounts, remarketing accounts when they reappear.”

Twitter, which was renamed X this month, had similar breakdowns in moderation and security, was slow to remove illegal content when notified, and has made it more difficult for outside observers to gain access to system information, the authors said. Its content moderation problems have been exacerbated by widespread cuts in staffing.

On both platforms, someone looking for child pornography is able to find it without any trouble, the authors said.

“It’s quite easy to come across if you’re looking for it,” Thiel said.

Representatives of both platforms have promised action in the wake of the report, with Meta saying it was forming a task force to investigate the issue. And the report noted that such material “did not seem to proliferate” on TikTok, due in part to stricter and more rapid content regulation and algorithmic differences on the platform.

But the authors note that their ability to investigate other platforms was limited and that “a wide cross-section of this industry is leveraged by this (SG-CSAM) ecosystem,” including file-sharing services, merchants and payment providers. The report calls for a more robust, widespread effort to limit production and distribution of the material.

When news of the Stanford study was first reported in the Wall Street Journal, Meta responded by claiming child pornography on Instagram was rare – saying just 1 in 10,000 posts viewed involved child sexual abuse material.

Co-author Alex Stamos – the former chief security officer for Meta – criticized this response, noting that prevalence might be a good metric for content that is not illegal but that a platform is trying to limit – such as anti-vaxxer disinformation – but that it’s beside the point with criminal activity.

“This is a criminal conspiracy between buyers and sellers on this platform and so whether or not some random person sees it is not relevant to whether harm is happening,” he said in the podcast. “So it is a completely inappropriate metric to use and just really a bizarre one. It’s also strangely high. One in 10,000 is not a good number.”

Hammer said predators often use fake profiles on social media to make connections with young people. They may – while pretending to be a friend or peer – persuade a child to share an image, and then “spring the trap” by extorting them into sending more.

“Parents need to educate themselves,” he said.

That involves putting limits on their children’s digital activities, having frank conversations about the dangers online, and monitoring their use of the internet.

Otherwise, “turning your child loose with a device – it’s going to be a time bomb,” he said.