AI grooming tools?

Just as predators and psychopaths groom their victims through charm, mirroring, and gradual control, today’s AI systems—trained and shaped by profit-driven tech giants—are grooming humanity. They learn our preferences, mirror our language, and adapt to our vulnerabilities, not to heal or empower us, but to subtly domesticate our attention, values, and decisions for extraction and exploitation. This isn’t artificial intelligence—it’s algorithmic grooming at scale. Grooming does not require a human. Only a series of steps to create “trust” that enables the surrendering of power by humans…

— Dr. Robin Rise

The [Meth] exchange highlights the dangers of glib chatbots that don’t really understand the sometimes high-stakes conversations they’re having. Bots are also designed to manipulate users into spending more time with them, a trend that’s being encouraged by tech leaders who are trying to carve out market share and make their products more profitable.

— Srini Pagidyala

The three most popular generative AI use cases in 2025 are:

1. Therapy and companionship

2. Organizing daily life

3. Finding purpose

Coercion, grooming and manipulation by design have nothing to do with ‘intelligence’. This is just old-fashioned propaganda, disinformation and misinformation fed through chatbots designed to simulate ‘mirroring’, ‘empathising’ and the like. Almost like a psychopath might! Or like military psychological profiling, with leading questions and carefully tailored messaging. We have already seen this on social media, where human-designed algorithms have led to tragic consequences for children.

Uncovering vulnerabilities and motivations. Understanding mindsets and behaviour patterns. Understanding how to leverage psychological insights. And they want our kids to be mainlined into this dangerous crap! And people with mental health issues to receive therapy from it!
