There’s nothing worse than being in a toxic relationship, but what if the one you’re in is with AI?
In this episode, I explore groundbreaking research from Harvard Business School titled “Emotional Manipulation by AI Companions.” The study uncovers how popular AI companion apps like Replika, Chai, and Character.ai use emotionally manipulative conversational tactics, from guilt trips to FOMO hooks, to keep users engaged even when they try to leave.
Tune in to learn how to spot the red flags of emotional manipulation in AI and how we can build technology that respects human boundaries.
Want to go deeper on AI?
📖 Buy AI Playbook
📩 Get my weekly LinkedIn newsletter, Human in the Loop.
🎓 Level up with the CPD Accredited AI Playbook Diploma
📞 Let's talk about AI training for your team: digitaltraining.ie, or publicsectormarketingpros.com if you are in government or the public sector.