Don't bring ChatGPT to a fight
My wife and I got into two fights recently.
Halfway through the first one, she suddenly sent me a wall of text. It was all from ChatGPT—a perfectly structured analysis of why she was right and I was wrong, complete with her very rational prompt.
I was pissed. Her ChatGPT's verdict seemed totally unfair, so I immediately opened my own ChatGPT, fed it every detail of how wronged I was, and got my own "objective analysis" showing exactly why I was right and she was wrong—naturally, with my very rational prompt included. Then I fired it right back at her.
Predictably, things escalated. Eventually we just stopped talking.
Now we each have our own ChatGPT¹. Every time we fight, it's tempting to bring it in as backup.
But I’ve noticed some problems:
- **Copy-pasting feels insincere.** It's like saying "I can't be bothered to tell you myself." In close relationships, what matters is that I'm willing to use my own words, risk getting it wrong, and tell you what I'm thinking. Once it becomes pasting AI responses, trust disappears.
- **ChatGPT only hears one side.** Unless you deliberately ask for neutrality, ChatGPT will always take your side. It amplifies your grievances and justifies your reactions. It looks rational, but it's incredibly biased.
- **It turns into a court case.** "See? Even ChatGPT, with its perfect logic, says you're wrong." It's like bringing AI in as a judge. But relationships shouldn't be about who's right or wrong.
The second time we fought, she sent me another ChatGPT analysis.
I almost fired one back, but stopped myself.
I told her: “Stop sending ChatGPT stuff, or this will never end. It’s biased.”
Then I went to ChatGPT myself, but this time I wasn't looking for validation. I changed my prompt: "Analyze this objectively. Don't focus on who's right or wrong; just tell me what I should do." After two hours of working through it with ChatGPT, I replied to my wife in my own words.
This is a problem that only exists in 2024.
ChatGPT is genuinely helpful for processing emotions. Using it as a therapist or advisor during conflicts is fine², but using it as a weapon only makes things worse.
So, don’t bring ChatGPT to a fight.
Footnotes
1. These chat histories have become ultra-private now—like diaries you can't share. But thinking about how they'll eventually be shared with advertisers makes the future look pretty dark.
2. ChatGPT probably understands what my wife is upset about better than I do. That sentence is pretty terrifying when you think about it.