Does your kid play games on Roblox? If he has internet access (teen boys are the main targets), he's vulnerable to sextortion: financially motivated sexual extortion, a cybercrime in which predators use impersonation and smooth trickery to obtain CSAM (child sexual abuse material) of your child, then threaten to circulate it.
Since late 2021, sextortion has been linked to dozens of suicides among US teen boys.
Recruiters prowl chatrooms on Roblox, Discord, and social media platforms, posing as flirty peers or promising gifts, such as gaming skins or Robux, to lure kids in. According to the FBI, any kid, even one with a strong emotional safety net, can become prey. Most victims are between 14 and 17, but younger boys are tricked, too.
“They lurk in the chat functions of Roblox (Minecraft, Grand Theft Auto). They’re everywhere,” says Cindy Malott, director of US Safe Places at Crisis Aid International. “People think, ‘Oh, I just got to keep my kids away from those porn sites, those horrible places.’ Well, no, predators are gonna go where the kids are. And once there, they’re going to zero in on the kids who are most vulnerable.”
Vulnerable has taken on a different meaning online: young people are impulsive and don’t weigh risks and consequences as adults do.
Sextortion Isn’t Just a Fad.
Let’s say your 5th-grade son plays Adopt Me or Flee the Facility. A cute girl is playing, too. She’s in junior high, super bright, and has the same dog (IKR!). Soon, she sends your son a sexy photo (a fake). She wants a photo in return.
Once the poseur tricks a couple of sexually compromising photos out of a child, the FBI says, recruiters reveal their true identity and demand money to keep the images private, reminding the child that the images will be a permanent record of their naivete.
But what if the child doesn’t have that kind of money? (What kid does?) Then they’ll be blackmailed into sending more explicit selfies or sexual videos on command.
Sextortion recruiters bluff the target, claiming to know “everything” about them, including personal passwords (easy to obtain from data breaches). This convinces the child they’re “had,” unable to reverse course, so they have no choice but to cooperate.
In the 18 months between October 2021 and March 2023, 20 boys died by suicide. In 2023 alone, 12 teen boys took their own lives to escape this snare. Meta (Instagram) has issued a warning (easy to miss in the fine print) that there are adults in these chatrooms, but people didn’t get the memo. We’re sending you the memo now, so you know.
Malott says 75% of initial recruiter contact and grooming happens online today. The internet gives traffickers and recruiters up-close-and-personal access they never dreamed possible a few years ago.
“Recruiters used to have to work really, really hard to get access to kids, but now they’re practically sitting in a child’s bedroom. And kids put everything out there — what’s going on in their life, who they’re angry about, parents are going through a divorce, their insecurities about their body, about themselves, what they do, how they spend their time — it’s like a gift to these predators.” – Cindy Malott (Rolling Stone, January 4, 2024).
Steps Parents Can Take to Protect Their Kids Online:
- Teach kids how to recognize and avoid phishing links or malware-laden downloads
- Turn off Direct Messages and encourage your children to speak up if they receive DMs from strangers
- Use safe-gaming monitoring tools like Bark
- Set up a parent PIN and parental controls
- Teach your kids not to give out personal information
- Monitor conversations in chat threads and voice chats
- Talk to your kids about what to do if they encounter adult content
- Play with them: some kids don’t realize they are viewing adult content
- If you’re not sure how a game or an app works, consider not letting your child use it
- Pay attention to behavior changes in your children. They can’t always verbalize what is wrong, but their behavior will show it.
Roblox:
- Roblox labels games with recommended age groups, but it doesn’t enforce age verification to restrict children from the content, so there’s no guarantee that everything your child sees during gameplay will be age-appropriate. Pay attention to which games they play.
- Teach kids not to chat with strangers; the same rules they follow for in-person communication apply online.
Discord:
- Most schools block apps like Snapchat and Instagram, but Discord flies under the radar. Although there are student hubs, most kids use Discord with their friends outside the hub, exposing them to the dangers described above.
- Parents are largely unfamiliar with Discord and how it works, so they may not pay close attention to their child’s activity on the platform. Familiarize yourself.
Snapchat:
- Talk to your kids about the false sense of security that their pictures will be deleted after they send them; photos can be screenshotted or saved before they disappear.
- Show them the signs: offenders meet teens on Instagram and move them to Snapchat
- Encourage your teens to speak up; they may be worried about getting in trouble for sending these photos, not realizing they have been targeted
- Show them real-life examples and news stories so the facts can be more concrete for your teen
- Try to keep an open dialogue; teens will hide things due to shame and fear of punishment
- Consider cutting off Wi-Fi at night when teens are more vulnerable
What To Do If Your Child Is a Victim:
- Keep all original documentation, emails, text messages, and logs of communication with the subject. Do not delete anything before law enforcement can review it. Contact your local law enforcement agency.
- Contact your local FBI field office.
- File a report with the National Center for Missing and Exploited Children (1-800-843-5678) or online.
- Tell law enforcement everything about the online encounters – we understand it may be embarrassing for the parent or child, but providing all relevant information is necessary to find the offender. When reporting online, if you can, be as descriptive as possible:
- Name and/or Username of the subject.
- Email addresses and phone numbers used by the subject.
- Websites used by the subject.
- Description of all interactions with the subject.
Capturing “Virgin” CSAM Is a Booming Trend.
Sextortion cases reported on online platforms grew from 32 million in October 2022 to an all-time high of more than 36 million in March 2023, a jump of more than 12% in just six months, says the FBI. And those are only the incidents we know about. According to experts, around 75% of trafficking and sextortion crimes begin online, where kids who have no idea they’re talking to a predator are coerced into both physical and non-physical demands.
If you or your child has been coerced to surrender explicit photographs instead of money to avoid exposure, you may have a case against the companies that harbor recruiters. Think Instagram, Roblox, Discord, TikTok, Omegle, and Snapchat.
Please get in touch with us if you or your child have been affected. Sextortion is serious business, and kids are not prepared to counter the tricks of the trade. We want to help you take legal action and hold these online platforms accountable for failing to protect children.