
Complexity Simplified #12: AI's Emotional Capabilities.

  • Writer: Amir Bder
  • 13 minutes ago
  • 2 min read

You can ask an AI for advice, whether it's about your relationship, your problems, or how to get through a hard time, but the AI doesn't really sympathize with you. It's just pulling data.


We’ve all been there. You're going through a breakup, stressed about a test, or in a fight with a friend, and you type it out to a chatbot. When it replies with, "I’m so sorry you’re going through that, I’m here to listen," it actually feels kind of good. For a split second, you forget you’re talking to a bunch of code running on a server in a warehouse somewhere.


But here’s the reality: The AI doesn't have a soul.


The Math of a Digital Hug


When you tell an AI you’re sad, it’s not feeling a heavy heart. It’s doing a lightning-fast scan of your words. It sees keywords like "hard time" or "struggling" and realizes, "Okay, the human is in 'Distress Mode.' I should respond with words from the 'Validation' bucket."

It’s called Sentiment Analysis. It’s basically a giant calculator that has learned that when people are sad, they want to hear specific phrases. It isn't sympathizing; it’s predicting the most polite thing to say next. It’s a game of "choose your own adventure" where the goal is to make you feel heard.
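To make that concrete, here's a toy sketch of the keyword-and-bucket idea in Python. Real sentiment analysis uses trained models, not hand-written keyword lists, and every name here (the keyword set, the response "bucket") is made up for illustration:

```python
# Toy sentiment analysis: scan for distress keywords, then pull a reply
# from a pre-written "validation bucket". Real systems use trained models,
# but the scan-and-match logic is the core idea.

DISTRESS_KEYWORDS = {"hard time", "struggling", "sad", "stressed", "breakup"}

VALIDATION_BUCKET = [
    "I'm so sorry you're going through that.",
    "That sounds really tough. I'm here to listen.",
]

def detect_distress(message: str) -> bool:
    """Lightning-fast scan: does the message contain any distress keyword?"""
    text = message.lower()
    return any(keyword in text for keyword in DISTRESS_KEYWORDS)

def reply(message: str) -> str:
    """No feelings involved: if 'Distress Mode' is detected, respond
    with a phrase from the validation bucket."""
    if detect_distress(message):
        return VALIDATION_BUCKET[0]
    return "Got it. Anything else?"
```

Notice there's no understanding anywhere in there, just string matching and a lookup.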


It’s Watching You (Literally)


The tech is getting even weirder. There’s a field called Affective Computing where AI can actually "read" your body language through a webcam or listen to the tone of your voice.


It can spot a tiny twitch in your eyebrow or a slight tremor in your voice that even your best friend might miss. To the AI, these aren't feelings, they’re data points. If your voice pitch drops, it logs that as "Probability of Sadness: 85%." It then adjusts its "voice" to be softer and slower to match you. It’s simulating empathy to make the interaction feel more "human," but it’s still just math.
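A rough sketch of that "voice pitch drops → Probability of Sadness" step, assuming a made-up baseline pitch and an illustrative logistic curve (the threshold and numbers are invented for the example, not real affective-computing parameters):

```python
# Toy affective computing: map a drop in voice pitch (relative to the
# speaker's baseline) to a "sadness probability", then adjust the bot's
# synthetic voice. All thresholds here are illustrative assumptions.
import math

def sadness_probability(baseline_pitch_hz: float, current_pitch_hz: float) -> float:
    """Bigger pitch drop -> higher probability, via a logistic curve."""
    drop = (baseline_pitch_hz - current_pitch_hz) / baseline_pitch_hz
    return 1.0 / (1.0 + math.exp(-10.0 * (drop - 0.1)))

def adjust_voice(probability: float) -> dict:
    """Simulated empathy: soften and slow the voice when sadness seems likely."""
    if probability > 0.5:
        return {"tone": "soft", "speed": "slow"}
    return {"tone": "neutral", "speed": "normal"}
```

To the AI, your wavering voice is just the `drop` variable. The "empathy" is an `if` statement.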


The Creepy Part


There’s this famous idea called the Uncanny Valley. It’s that weird, skin-crawling feeling you get when something looks almost human but is just a little bit off, like the one girl's head on a chicken's body (don't search it up).


As AI gets better at faking emotions, we’re entering an "Emotional Uncanny Valley." If an AI sounds too perfect at comforting you, our brains naturally start to trust it like a real person. But the danger is that an AI has zero accountability. It can give you advice all day, but it doesn't have to live with the consequences. It doesn't know what it’s like to lose a friend or fail a class. It’s just reflecting what it thinks you want to hear.


The Takeaway


So, is it bad to talk to AI when you’re down? Not necessarily. Sometimes we just need a sounding board to organize our thoughts.


But you have to remember: AI is a mirror, not a friend. It can show you what you need to hear, but it can’t feel what you’re feeling. Use it to clear your head, but save the real heart-to-heart interactions for the people who actually have a heart to give.


