The Dark Side of Short Video Algorithms: Are We Being Manipulated Into Emotional Distress?
A few days ago, I had an emotional conflict with my loved one. Like many of us do in moments of distress, I instinctively opened my phone and started scrolling through reels to distract myself. But what happened next was unsettling.
As I kept scrolling, I noticed a pattern—every reel touched on love, relationships, and heartbreak. Then, out of 20 reels, more than three dealt with suicide. I scrolled through over 100 reels, and every single one was centered on intense emotions—loneliness, sadness, and suffering. It felt as if the algorithm had sensed my emotional vulnerability and was feeding it back to me, amplifying my distress instead of relieving it.
This experience left me with a disturbing realization: Are short video algorithms designed to push us deeper into our emotions, not for our well-being, but to increase our screen time and engagement?
The Algorithm’s Role in Emotional Manipulation
Social media algorithms are built on a simple principle—maximize user engagement. Every like, comment, and second spent on a video trains the algorithm about our emotional state. These AI-driven models don’t understand ethics, mental health, or human well-being; they only understand patterns.
If you show even a slight interest in a particular type of content—whether it’s relationship advice, heartbreak, or depression—the algorithm assumes that’s what keeps you engaged and floods your feed with similar content. This creates an emotional echo chamber, where you get trapped in a loop of similar emotions, making it harder to move on or find balance.
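The feedback loop described above can be made concrete with a toy simulation. This is a minimal sketch, not any real platform's recommender: the topic labels, watch-time values, and the simple "exploit the best average watch time" rule are all assumptions chosen for illustration. The point is only to show how a system whose sole reward signal is watch time locks onto whatever a vulnerable user lingers on.

```python
from collections import defaultdict

# Hypothetical topic labels, purely for illustration.
TOPICS = ["comedy", "travel", "heartbreak", "sports", "cooking"]

def simulate(user_bias="heartbreak", rounds=100):
    """Toy engagement loop: the only reward signal is watch time, so after
    a brief cold start the feed converges on whatever the user lingers on."""
    watch = defaultdict(float)   # total watch time per topic
    shown = defaultdict(int)     # impressions per topic

    def avg_watch(topic):
        return watch[topic] / shown[topic] if shown[topic] else 0.0

    for step in range(rounds):
        if step < len(TOPICS):
            topic = TOPICS[step]                 # cold start: try each topic once
        else:
            topic = max(TOPICS, key=avg_watch)   # then exploit the "best" one
        shown[topic] += 1
        # A distressed user lingers on emotionally charged reels (1.0 units)
        # and skips everything else quickly (0.2 units).
        watch[topic] += 1.0 if topic == user_bias else 0.2
    return dict(shown)

feed = simulate()
print(feed)  # heartbreak dominates: 96 of 100 impressions
```

After the five-reel cold start, every remaining impression goes to "heartbreak", because nothing in the loop measures well-being; it only measures how long you stayed. That is the emotional echo chamber in miniature.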
The Consequences?
Why Are These Platforms Doing This?
The answer is simple—profit.
Short-form video platforms are driven by capitalism. The more time users spend on the app, the more ads they see, and the more revenue these platforms generate. And what better way to keep users engaged than by trapping them in a loop of highly emotional content that exploits their vulnerabilities?
It's not about helping you heal; it's about keeping you hooked.
The algorithm isn’t designed to care for your mental well-being. It’s designed to extract maximum engagement at any cost—even if that cost is your mental health.
The Bigger Picture: How Short Videos Are Rewiring Our Minds
The impact of short videos isn’t just emotional; it’s cognitive as well.
This is not just a tech issue; it is a psychological and societal one.
How Can We Break Free?
We can’t expect these platforms to change their behavior voluntarily, but we can take back control of our own attention and habits.
Final Thoughts
Short video platforms are not inherently bad, but their algorithms prioritize profit over well-being. If we are not careful, they can push us down a dark rabbit hole, reinforcing anxiety, depression, and emotional instability—all in the name of engagement.
Next time you catch yourself endlessly scrolling, ask yourself: Am I in control, or is the algorithm controlling me?
It’s time we start using technology consciously—before it starts using us.