
Let me ask you something. Have you ever opened Instagram or Facebook, scrolled for twenty minutes, and walked away feeling vaguely terrible about yourself without being able to explain why? Or watched your teenage daughter disappear into her phone for hours, emerging quieter and more anxious than when she went in?
That feeling isn’t a coincidence. And it isn’t your fault.
What’s happening to you — and to your kids — was designed. Deliberately. By people who knew exactly what they were building and built it anyway.
Last week, two separate juries confirmed it in courts of law.
First, Let’s Talk About What an Algorithm Actually Is

Most people interact with social media the way they interact with weather — it just happens, and you adjust. Your feed shows you things. Some of it feels good. Some of it makes you feel bad. You’re not entirely sure why you keep scrolling, but you do.
That experience is not random. It is the result of an algorithm — a set of rules built by engineers that decides, with extraordinary precision, what you see, in what order, and for how long. It decides which posts surface and which disappear. It decides what makes you feel connected and what makes you feel left out. It decides, in thousands of small invisible ways, how you feel about yourself by the time you put your phone down.
And here’s the part nobody puts in the terms and conditions: that algorithm was not designed to make you feel good. It was designed to keep you on the platform as long as possible, because your attention is what gets sold to advertisers. Your engagement — your likes, your scrolling, your emotional reactions — is the product. You are not the customer. You are what’s being sold.
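If you've never seen what that looks like from the inside, here is a deliberately tiny sketch in Python. Every name, signal, and weight in it is invented for illustration; real ranking systems weigh thousands of signals with machine-learned models. But the core move is exactly this: score every candidate post by predicted engagement, then show the highest scorers first.
```python
# A toy engagement ranker. All names, weights, and signals are invented
# for illustration; no platform's actual code looks this simple.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_like_rate: float    # model's guess that you'll react
    predicted_dwell_secs: float   # model's guess of how long you'll linger

def engagement_score(post: Post) -> float:
    # The objective is attention, not wellbeing: a post that holds you
    # on screen longer outranks a post that merely pleases you.
    return 0.4 * post.predicted_like_rate + 0.6 * (post.predicted_dwell_secs / 60)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first; everything else sinks.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a friend", "family photo", predicted_like_rate=0.30, predicted_dwell_secs=8),
    Post("a stranger", "outrage bait", predicted_like_rate=0.10, predicted_dwell_secs=45),
])
print([p.text for p in feed])   # ['outrage bait', 'family photo']
```
Notice what the objective function rewards. Nothing in it ever asks how a post will make you feel after you put the phone down.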
The Internet Has a Type, and You’re Not It

Now layer this on top: the algorithms running your feeds didn’t appear out of thin air. They were trained on data — specifically, on the vast ocean of content, images, and interactions that make up the internet. And the internet has a very specific idea of what a woman looks like and how old she should be.
Researchers at UC Berkeley and Stanford analyzed 1.4 million images across major platforms and found that women are consistently depicted as younger than men across virtually every occupation and social role online. The most common age for women shown? Their 20s. For men? Their 40s and 50s.
The same internet that calls a man in his fifties “distinguished” and “experienced” has essentially decided that women over 40 are past their expiration date. And every algorithm trained on that data absorbed that assumption and encoded it into the system.
This isn’t just insulting. It has real consequences for what you see, how you’re seen, and what gets quietly filtered out of your world without you ever knowing it happened.
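To make the mechanism concrete, here is a toy sketch in Python. The numbers are made up and the "model" is nothing more than an average, but that is the point: fit any model to skewed data and it hands the skew back to you as if it were truth.
```python
# A toy illustration of how a skewed dataset becomes a skewed model.
# The numbers are invented; the mechanism is what matters.
from statistics import mean

# Pretend training data: (gender, age) labels scraped from the web,
# mirroring the skew the Berkeley/Stanford study describes.
training_data = [
    ("woman", 24), ("woman", 27), ("woman", 22), ("woman", 29),
    ("man", 48), ("man", 52), ("man", 45), ("man", 39),
]

# "Training": learn the typical age per gender by simple averaging.
learned_prior = {
    g: mean(age for gender, age in training_data if gender == g)
    for g in ("woman", "man")
}

def typicality(gender: str, age: int) -> float:
    # Higher = closer to what the data said this gender "looks like".
    return 1.0 / (1.0 + abs(age - learned_prior[gender]))

print(learned_prior)            # women skew mid-20s, men mid-40s
print(typicality("woman", 50))  # a 50-year-old woman scores as an outlier
print(typicality("man", 50))    # a 50-year-old man scores as normal
```
Swap the averaging for a billion-parameter neural network and the mechanism is identical; the bias just gets much harder to see.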
It’s Not Random. It’s Unconscious Bias.

Before we go platform by platform, let’s talk about who built these systems.
The engineers who designed these algorithms, the executives who approved them, the product teams who decided what “good” looks like — they are overwhelmingly young, overwhelmingly male, and overwhelmingly focused on one metric: engagement. Not your wellbeing. Not your daughter’s mental health. Engagement. Time on platform. Return visits. The numbers that translate directly into advertising revenue.
Nobody in those rooms was asking whether the machine they were building would serve a 50-year-old woman with decades of real experience and something genuinely valuable to say. That question never made it onto the whiteboard. Because that woman was never really the point.
Unconscious bias doesn’t need villains. It just needs a room full of people with the same blind spots, the same incentives, and a ship date.
Platform by Platform: Here’s What They’re Actually Doing

Facebook:
Facebook knows its user base has aged — and rather than serve that audience better, it has spent years quietly suffocating organic reach while pushing users toward paid promotion. What this means in practice: the connections you’ve built, the community you’ve gathered, the trust you’ve earned — Facebook is now standing between you and those people, charging a toll to reach them.
Meanwhile, its algorithm is simultaneously serving you a feed engineered to provoke emotional reactions, because outrage and anxiety keep you scrolling longer than contentment does. You came to connect. The platform found that keeping you angry was more profitable.
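The actual throttle rates are not public, and the numbers below are invented purely to show the shape of the deal. But mechanically, "pay to reach your own followers" is about this simple:
```python
# A toy "reach toll". The multipliers are invented; the mechanic is the
# point: organic reach is throttled, and paying lifts the throttle.
def estimated_reach(followers: int, boosted: bool) -> int:
    ORGANIC_RATE = 0.05   # only a sliver of your followers see an unpaid post
    BOOSTED_RATE = 0.60   # paying buys most of your own audience back
    return int(followers * (BOOSTED_RATE if boosted else ORGANIC_RATE))

print(estimated_reach(10_000, boosted=False))   # 500 people
print(estimated_reach(10_000, boosted=True))    # 6,000 people, for a fee
```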
Instagram:
Instagram has one goal right now: compete with TikTok. That means Reels — fast, loud, trend-driven short video — dominate everything. The algorithm buries almost everything else. But the deeper problem isn’t the format. It’s what Instagram does to how women see themselves. The platform is saturated with beauty filters, body-altering tools, and algorithmically amplified images of physical perfection. Research cited in the Meta trial showed that young women who used Instagram developed body dysmorphic disorder — a clinical diagnosis — from the relentless comparison the platform’s design encourages. And Instagram knew. Internal documents presented at trial showed the company was aware its platform was damaging girls’ body image and mental health. They kept the filters anyway.
TikTok:
TikTok is the newest of these platforms and, in some ways, the most honest about what it’s doing — it simply optimizes for watch time, full stop. The more of your attention it captures, the more it’s winning. What makes TikTok particularly powerful — and particularly worth understanding — is that its algorithm is extraordinarily good at personalization. It learns your pressure points, your insecurities, your interests with uncanny speed and feeds you a stream of content precisely calibrated to keep you watching. For younger users especially, that can mean being pulled into rabbit holes of content around body image, anxiety, or worse — before anyone realizes what’s happened. TikTok’s algorithm doesn’t care what’s good for you. It cares what keeps you there.
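Here is a toy sketch of that feedback loop in Python. It tracks exactly one signal, watch time, and that alone is enough to produce a rabbit hole. Everything in it is invented for illustration; real recommenders are far richer, which makes the loop faster, not gentler.
```python
# A toy watch-time feedback loop: watch longer, see more of the same.
topics = ["cooking", "body image", "travel", "news"]

# The user's inferred interest in each topic, updated after every video.
interest = {t: 0.25 for t in topics}

def recommend() -> str:
    # Always serve the topic with the highest inferred interest.
    return max(interest, key=interest.get)

def observe_watch(topic: str, seconds_watched: float) -> None:
    # The only signal used is "did it hold attention?",
    # never "was it good for the viewer?"
    interest[topic] += 0.1 * (seconds_watched / 60)

# A teenager lingers a little longer on body-image content. Once.
observe_watch("body image", 40)

for _ in range(5):
    topic = recommend()
    observe_watch(topic, 40)   # each recommendation reinforces itself
    print(topic)               # prints "body image" five times
```
One slightly longer watch, and the loop locks on. That is the whole trick.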
LinkedIn:
LinkedIn presents itself as the responsible adult of social media — the professional network, the serious one. So it’s particularly worth noting that women have been quietly reporting for months that their reach on the platform has dropped dramatically, while their male counterparts’ content continues to perform. A group of women ran an experiment, switching their listed gender to male, and reported significant jumps in reach almost immediately. LinkedIn says its algorithm doesn’t use gender as a signal. What it appears to use instead is writing style — and the platform’s AI seems to reward styles stereotypically associated with men: terse, direct, declarative. The warm, nuanced, conversational voice that many women naturally use gets treated as a proxy for lower-value content. So it’s not explicitly sexist. It just learned that “professional” means “sounds like a man” — and acts accordingly.
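Nobody outside LinkedIn can see its ranker, so treat this Python sketch as a hypothesis about the mechanism, not a description of it. It shows how a single “neutral” feature, sentence length, can quietly become a gendered filter:
```python
# A hypothetical style scorer. No platform publishes its ranker; this
# invented example only shows how a "neutral" feature like sentence
# length can smuggle a gendered norm into a quality metric.
def style_score(text: str) -> float:
    sentences = [s.strip() for s in text.replace("!", ".").split(".") if s.strip()]
    avg_words = sum(len(s.split()) for s in sentences) / len(sentences)
    # Shorter, more declarative sentences score higher, a style
    # preference dressed up as a measure of quality.
    return 1.0 / avg_words

terse = "Ship it. Metrics up. Team delivered."
warm = ("I've been thinking about what our team pulled off this quarter, "
        "and honestly, I'm proud of how we supported each other through it.")

print(style_score(terse) > style_score(warm))  # True: terse wins, content unread
```
The scorer never looks at who wrote the post. It doesn’t have to; the style proxy does the sorting for it.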
Last Week, Two Juries Said: This Was Never an Accident

Here’s where we stop talking about bias and start talking about what the courts just decided.
Last week, in the span of two days, Meta faced two landmark verdicts.
In New Mexico, a jury ordered Meta to pay $375 million for failing to protect children from sexual predators on Instagram and Facebook — and for misleading the public about the safety of its platforms.
In Los Angeles, a California jury found Meta and Google negligent in the case of a young woman named Kaley, who began using Instagram at nine years old and YouTube at six. By the time she was a teenager, she had developed severe depression, anxiety, and body dysmorphia. The jury found that the platforms’ design — the infinite scroll, the algorithmic recommendations, the autoplay, the notification systems — was a substantial factor in her harm. They awarded $6 million in damages. They found the companies had acted with malice, oppression, and fraud.
Let that word sit there for a moment. Malice.
This case didn’t hinge on what content Kaley saw. It hinged on how the platforms were built — the specific features engineered to make it impossible to stop, to keep a nine-year-old girl scrolling until she couldn’t remember who she was before she started. Internal documents introduced at trial showed that executives at Meta and Google knew their platforms were causing harm to young users. They knew.
And they made a calculated decision that the revenue was worth it.
This is already being compared to the tobacco litigation of the 1990s — the moment a massively profitable industry was forced to answer for what it had always known. Social media companies knew. They built it anyway. And now the courts are starting to keep score.
Why This Matters for You

You don’t have to be trying to build an audience to be affected by this. You just have to be a woman who uses these platforms — which most of us do, every day, often without thinking much about it.
Every time you scroll, something is being decided about you without your knowledge or consent. What you see is being curated to provoke a reaction.
What you don’t see has been filtered out by a system designed by people who were not thinking about you. Your daughter’s feed is being shaped by an algorithm that has already decided what she should look like and feel bad about.
None of this happens in the background. It happens in the foreground of your life, disguised as a way to stay connected.
Knowing it doesn’t immediately fix it. But it changes the relationship. You stop asking what you did wrong when your content disappears. You stop blaming yourself when scrolling leaves you feeling hollow. You stop taking it personally when the platform makes you feel invisible.
It Was Designed to Work This Way. The Juries Said So.

Two juries just confirmed what a lot of us have felt for years: these platforms were never built for us. They were built to extract from us — our time, our attention, our daughters’ self-worth — while giving as little back as possible.
Have you or someone you love felt the effects of this? I want to hear your story. Drop it in the comments — because the more we talk about it, the harder it gets for them to pretend it isn’t happening. 👇
