Discord has updated its policy meant to protect children and teens on its platform after reports emerged that predators have been using the app to create and spread child sexual abuse material (CSAM), as well as to groom young teens. The platform now explicitly prohibits AI-generated photorealistic CSAM. As The Washington Post recently reported, the rise of generative AI has also led to an explosion of lifelike images depicting children sexually. The publication had seen conversations about using Midjourney, a text-to-image generative AI available through Discord, to create inappropriate images of children.
In addition to banning AI-generated CSAM, Discord now also explicitly prohibits any other kind of text or media content that sexualizes children. The platform has banned teen dating servers as well, and has vowed to take action against users engaging in this behavior. A previous NBC News investigation found Discord servers advertised as teen dating servers whose members solicited nude images from minors.
Adult users have previously been prosecuted for grooming children on Discord, and there are even crime rings extorting underage users into sending sexual images of themselves. Banning teen dating servers entirely could help mitigate the problem. Discord has also added a line to its policy stating that older teens found to be grooming younger teens will be “reviewed and actioned under [its] Inappropriate Sexual Conduct with Children and Grooming Policy.”
Aside from updating its rules, Discord recently launched a Family Center tool that parents can use to keep an eye on their children’s activity on the chat service. While parents won’t be able to see the actual contents of their kids’ messages, the opt-in tool lets them see who their children are friends with and who they talk to on the platform. Discord hopes these new measures and tools can help keep its underage users safe, alongside its existing safeguards, which include proactively scanning images uploaded to the platform using PhotoDNA.
Discord’s Family Center is a new opt-in tool that makes it easy for teens to keep their parents and guardians informed about their Discord activity while respecting their own autonomy. pic.twitter.com/UFY0ybo0LR
— Discord (@discord) July 11, 2023