
I want to be clear: I love AI. I write about it, I use it daily, and I have genuinely thanked a chatbot out loud while sitting alone in my kitchen. I am not here to cancel the robots.
But I am here to have a little talk. Because the more I use these tools, the more I notice something a little off. Like when you’re having a great conversation with someone and then they say something that makes you go — wait, have you ever actually met a woman?
AI has that problem. And it’s not random.
—
The Data Problem: AI Learned From the Internet, and the Internet Has a Type

Here’s what your favorite AI tool actually is: a very confident student who read everything on the internet and is now ready to give you advice. Sounds great, right?
Except the internet — and specifically the professional, technical, and financial corners of it that AI trained on — skews heavily male. The business advice, the productivity frameworks, the negotiation tactics, the health information. Written by men, optimized for men, validated in comment sections full of men telling each other they’re visionaries while the women in the next Slack channel are actually doing the work.
Your AI isn’t sexist. It’s just a product of its education. And its education had some serious gaps — namely, anything that happens after 5pm, anything involving a body that isn’t male, and anything that doesn’t fit on a tidy upward career trajectory drawn by someone who has never once been the only woman in a room and had to decide how to handle it.
—
It’s Not an Accident — It’s a Pipeline Problem

Let’s go one level deeper, because this is where it gets interesting.
The people who built these tools — who wrote the code, designed the algorithms, decided what a “good” answer even looks like — are overwhelmingly men. We’re not talking about mustache-twirling villains here. We’re talking about very smart people, in very homogeneous rooms, nodding along at each other’s ideas, never once stopping to ask: but does this work for someone who gets talked over in every meeting, still has to figure out dinner, and is expected to smile about both?
Unconscious bias doesn’t need bad intentions. It just needs a room where everyone has the same blind spots and a ship date.
The result is AI that was optimized for a version of professional life that looks suspiciously like Chad’s. Linear career. Confident ask. No gaps. No complications. No 3pm school pickup threatening to detonate his entire afternoon.
Understanding that isn’t just about being annoyed — though, to be clear, some annoyance is completely appropriate and frankly overdue. It’s about knowing what you’re actually working with so you can work it better.
—
Where You Actually Feel It

So what does Chad’s AI look like in practice? Let me count the ways.
Salary negotiation advice that cheerfully tells you to “just ask for what you’re worth!” and “anchor high!” Fantastic. Love the energy. Completely ignores the mountains of research showing that women who negotiate assertively are perceived as aggressive, difficult, or — my personal favorite — not a culture fit. Did your AI mention that particular fun fact? Mine did not. Mine acted like the only variable standing between me and my market rate was my willingness to believe in myself. Cute.
Health information that defaults, constantly and confidently, to male symptom presentations. Heart attacks. Burnout. ADHD. Autoimmune conditions. The research AI learned from dramatically underrepresents women in clinical studies — a problem so well-documented that Congress passed legislation about it (the NIH Revitalization Act of 1993, which finally required women to be included in NIH-funded clinical trials). Your AI absorbed all of that incomplete data and is now ready to tell you what’s probably going on with your body. Based on studies that didn’t include it.
Productivity advice engineered for someone whose only real obstacle is focus. Not the invisible mental load of tracking approximately nine thousand details for other humans while also trying to have a career. Not the logistical Tetris of a day that can be completely detonated by a single school nurse phone call. “Have you tried time blocking?” Yes. I’ve tried time blocking. I’ve also tried living in the real world.
Career advice that assumes a clean, upward, uninterrupted arc — which quietly excludes everyone who took time off, pivoted industries, survived a layoff, or made choices that prioritized something other than LinkedIn optics. So, most of us.
—
What To Do About It

Here’s the thing: knowing the limitation is most of the fix. And honestly? We’ve been doing this forever.
For decades, women have been walking into rooms that weren’t built for them and figuring out how to work them anyway. AI is just the newest room.
These tools are not oracles. They’re extremely well-read assistants who desperately need your context to be actually useful — so stop being polite about it and hand it over. Don’t just ask for negotiation advice. Spell out your industry, your level, the specific dynamics you’re navigating, and yes, if it’s relevant, that you’re a woman in a field where that still comes with a side of complicated. Don’t just ask for productivity help. Describe your actual day, including the parts that would make a standard productivity bro’s head explode.
Push back when the advice lands wrong. Ask for different angles. Tell it to try again. These tools can do a lot — they just need you to be the editor they forgot to hire.
Your AI didn’t live your experience. Which means it needs you to supply it.
And honestly? That’s not so different from most of the rooms we’ve been walking into and quietly taking over for years.
—
Has your AI ever given you advice so disconnected from your actual life that you laughed out loud — or considered throwing something expensive? Tell me everything. I have time.
