Why the algorithm serves you wedding content when you just got divorced
Long before social media feeds or targeted ads, my mother used to say that life tends to show you the thing you're looking for. Or the thing you're afraid of. Or the thing you keep insisting you don't want.
If you were trying to get pregnant, suddenly everyone around you was pregnant. If you wanted out of your relationship, magazines on the grocery store rack were filled with tips on "spicing up your marriage." If you were single, you noticed couples everywhere.
At the time, it felt like a kind of folk psychology, an observation about attention, projection, and the stories we tell ourselves during moments of transition. Nothing mystical. Just the mind's tendency to organize the world around its current preoccupations.
But today, that feeling is no longer just in our heads — it's computational, built into the systems we use every day. Platforms like TikTok, Instagram, and Google don't just reflect what we notice; they actively infer who we are and what comes next, based on demographics such as age and gender, as well as behavioral patterns. And once they decide what life stage you're in, they keep showing it to you, whether it fits or not.
Across social platforms, users describe being quietly ushered through a narrow, linear life script, one that often resembles something like dating → engagement → wedding → pregnancy → parenting. These systems assume users are progressing along an expected trajectory. When a life diverges from that path, whether after a breakup, during infertility, following divorce, or by choice, the algorithm often fails to recalibrate.
What looks like a coincidence or annoyance is something more structural: platforms building a version of identity that won't update, even as a person's life changes.
Trapped in a life phase you never chose
On social platforms, users may still encounter content they don't want, despite repeatedly muting keywords or clicking "not interested." Research shows that recommendation algorithms often rely far more on implicit engagement, such as watch time and clicks, than on explicit feedback signals, like hiding or muting, meaning the system can continue serving the life someone once searched for — or was assumed to want.
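To see how that imbalance plays out, consider a simplified scoring sketch in Python. The signal names and weights below are invented for illustration; no platform publishes its actual values, but the shape of the problem is the same: a steady stream of implicit signals can swamp a one-time "not interested" click.

```python
# Hypothetical illustration: implicit engagement can outweigh explicit feedback.
# Signal names and weights are invented for this sketch, not taken from any platform.

SIGNAL_WEIGHTS = {
    "watch_time_seconds": 0.5,   # implicit: accrues with every video served
    "clicks": 2.0,               # implicit
    "not_interested": -3.0,      # explicit: a one-time penalty
    "muted_keyword_match": -1.0, # explicit
}

def topic_score(events: dict) -> float:
    """Combine a user's logged events for one topic into a single score."""
    return sum(SIGNAL_WEIGHTS[name] * count for name, count in events.items())

# A user who clicks "not interested" once but keeps pausing on wedding videos:
wedding_events = {
    "watch_time_seconds": 40,  # lingering, even out of discomfort, is logged
    "clicks": 1,
    "not_interested": 1,
    "muted_keyword_match": 2,
}

print(topic_score(wedding_events))  # 17.0 -> still reads as strong interest
```

In a model like this, the explicit signals are one-off penalties while the implicit ones compound with every impression, so the score recovers almost immediately.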
Elizabeth Losh, a media theorist, digital rhetoric scholar, and professor of English and American Studies at the College of William and Mary, said this persistence is rooted in how recommendation systems are built.
"Sites like TikTok and Instagram depend on targeted advertising and data harvesting models that emphasize demographic segmentation," said Losh, who is also the author of Selfie Democracy: The New Digital Politics of Disruption and Insurrection. "[They] slice and dice audiences by gender, age, political loyalties, and other categories, manufacturing needs and desires for each stage of life."
Those stages often reflect cultural expectations rather than real user diversity. Advertisers treat transitions like marriage, fertility, and parenting as high-value consumption moments, incentivizing platforms to sort users into life-phase categories that are difficult to exit once assigned.
"You can see how those assumptions get locked in," Losh said. "The persuasive power of the recommendation algorithms themselves continues to reinforce standardized life trajectories."
How the algorithm decides who you are
Platforms rarely explain how they infer a user's "life phase," but scrolling itself is data.
TikTok has acknowledged that time spent watching a video is weighted more heavily than most other signals in its recommendation system. Even a pause due to curiosity, confusion, or discomfort can be interpreted as interest. Once a system associates a user with a category, similar content can quickly snowball.
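The snowball dynamic is easy to simulate. In the toy feedback loop below, the topics, weights, and slot counts are all hypothetical, but the pattern they produce is the one users describe: a topic that earns slightly more watch time gets served more often, which earns it more watch time still.

```python
# Hypothetical feedback loop: serving a category more often generates more
# watch time, which raises its score, which gets it served even more.
# All numbers are illustrative only.

scores = {"wedding": 1.0, "travel": 1.0, "cooking": 1.0}

def serve_feed(scores: dict, slots: int = 10) -> dict:
    """Allocate feed slots proportionally to current topic scores."""
    total = sum(scores.values())
    return {topic: round(slots * s / total) for topic, s in scores.items()}

for step in range(3):
    feed = serve_feed(scores)
    for topic, count in feed.items():
        # Each impression yields some watch time; a pause on a wedding video
        # (curiosity, confusion, discomfort) reads as slightly more interest.
        watch = count * (1.5 if topic == "wedding" else 1.0)
        scores[topic] += 0.1 * watch
    print(step, {t: round(s, 2) for t, s in scores.items()})
# The "wedding" score pulls ahead each round and earns more slots next time.
```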
Lauren Klein, professor of Data & Decision Sciences and English at Emory University and co-author of Data Feminism, said these inferences reflect historical gender norms far more than neutral "user data."
"In many cases, age and gender are the only data points companies know about their users," Klein said. "In the absence of [a] meaningful signal, designers default to what they assume someone of a particular age and gender would want to see."
Those assumptions are shaped by long-standing cultural expectations about users' lives, including those around beauty, partnership, reproduction, and caregiving.
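Klein's point about defaults can be sketched as a cold-start rule. The categories in this example are invented to show the pattern, not drawn from any real platform: when there is no behavioral history, a demographic lookup quietly stands in for the person.

```python
# Hypothetical cold-start defaulting: with no behavioral history, the system
# falls back to a demographic lookup table. These categories are invented
# to illustrate the pattern, not taken from any real platform.

DEMOGRAPHIC_DEFAULTS = {
    ("f", "25-34"): ["wedding", "fertility", "beauty"],
    ("m", "25-34"): ["fitness", "finance", "grilling"],
}

def initial_topics(gender: str, age_band: str, history: list) -> list:
    """Seed a new user's feed: behavior if we have it, a stereotype if not."""
    if history:
        return history  # real signal available
    return DEMOGRAPHIC_DEFAULTS.get((gender, age_band), ["trending"])

print(initial_topics("f", "25-34", history=[]))
# ['wedding', 'fertility', 'beauty'] -> life-script assumptions as defaults
```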
Because recommendation and ad systems are profit-driven, Klein added, there is little incentive to challenge defaults that appear to perform well.
"These companies are motivated by their own bottom line," she said. "If default life-phase content seems to generate engagement or purchases, there's no obligation to consider other desires or preferences."
When the TikTok feed contradicts reality
Emerging research suggests that algorithmic systems do more than match users with content; they also shape people's identities.
Researchers describe this phenomenon as "algorithmic persistence," in which systems continue to serve content tied to a presumed identity long after it is no longer applicable. Klein notes that because recommender systems are optimized for engagement rather than accuracy, they have little incentive to recalibrate unless user behavior changes significantly, something many people don't know how to do, or even realize is necessary.
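One way to picture algorithmic persistence is as a slow-moving average. In the sketch below, the decay rate and signal values are made up, but they show how a profile built over months of old behavior can still outrank three months of new behavior.

```python
# Hypothetical "algorithmic persistence": a profile built as a slow-moving
# average of past behavior keeps reflecting an old life phase long after
# the user's actual behavior changes. The decay rate here is invented.

DECAY = 0.98  # close to 1.0 -> the past dominates; old interests fade slowly

profile = {"wedding": 10.0}  # months of engagement-heavy wedding content

def update(profile: dict, todays_signals: dict) -> dict:
    """Blend today's signals into the long-term profile."""
    topics = sorted(set(profile) | set(todays_signals))
    return {
        t: DECAY * profile.get(t, 0.0) + (1 - DECAY) * todays_signals.get(t, 0.0)
        for t in topics
    }

# After the divorce, the user engages only with other content for 90 days:
for day in range(90):
    profile = update(profile, {"hiking": 1.0})

print({t: round(v, 2) for t, v in profile.items()})
# {'hiking': 0.84, 'wedding': 1.62} -> the old phase still outranks the new one
```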
"There's an added social reinforcement mechanism," Klein said, adding that users already receive constant messages about what they should care about. "The algorithm amplifies that pressure."
Over time, this creates a kind of ambient discipline: technology nudging users toward a version of adulthood they might not want, can't access, or have already outgrown.
"The isolation of personal scrolling is a kind of 'technology of the self,'" Losh said. "It subtly encourages people to regulate themselves according to dominant social scripts."
Performance, play, and structural limits
If algorithmic persistence explains why users get "stuck" seeing irrelevant content, performance helps explain why pushing back doesn't necessarily free them.
Short-form video platforms are built around visibility and play. Users duet, stitch, parody, and perform alternative selves. Queer creators experiment with gender; others engage in what Losh calls "generation-swapping," performing exaggerated versions of parents or elders. Comedy and remix culture offer highly legible ways to critique dominant life scripts.
That visibility is not meaningless. Losh notes that these platforms have created space for experiences once considered rare or invisible: intersex parents documenting their lives, people speaking openly about ectopic pregnancy or asexuality, sex workers sharing the unglamorous realities of their labor. Other forms of relational storytelling — like content about lavender marriages or the rise of "guncles" — quietly challenge heteronormative family scripts through humor and affection rather than argument.
But visibility, Losh cautions, is not the same as structural change. Even as platforms become increasingly adept at identifying and amplifying counter-narratives, they continue to circulate within algorithms optimized to sort users into marketable categories. That means wedding content becomes queer wedding content, or family content becomes nontraditional family content. The identity may shift, but the life-phase logic remains intact. In that sense, personalization doesn't eliminate the script so much as adapt it.
Within recommendation systems, critique does not reliably trigger corrections. Because engagement itself is the primary signal — watch time, interaction, repetition — even content meant to challenge a life-cycle narrative can be absorbed as evidence of interest in it. A parody of wedding culture may still be logged as engagement with wedding content; a rebuttal to parenting norms may circulate alongside the very material it critiques.
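A simple illustration: once an engagement pipeline reduces a viewing session to a topic and a watch time, the user's stance disappears. The log entries below are hypothetical, and the "is_critique" flag exists only in this sketch, to show exactly what a typical pipeline never records.

```python
# Hypothetical illustration: an engagement log that records *what* was
# watched, not *why*. A parody of wedding content and earnest wedding
# content are indistinguishable once reduced to (topic, watch_time) pairs.

from collections import Counter

engagement_log = [
    # (video_topic, is_critique, watch_time_seconds)
    ("wedding", False, 12),
    ("wedding", True, 45),    # a parody the user watched to the end
    ("parenting", True, 30),  # a rebuttal of parenting norms
]

topic_interest = Counter()
for topic, is_critique, watch_time in engagement_log:
    # The stance flag exists in this sketch for clarity, but only the topic
    # and the watch time survive into the interest tally.
    topic_interest[topic] += watch_time

print(topic_interest)
# Counter({'wedding': 57, 'parenting': 30}) -> critique counted as interest
```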
Why is it so hard to reset your algorithm?
Algorithmic identity is not something users can update with a single click. Training data reflects the past. Profit incentives favor broad categories. And recommender systems are built to optimize engagement loops, not to reflect the complex, nonlinear lives of their users.
Designing for people who don't want children, who co-parent, who are queer or polyamorous, or who move in and out of relationships requires time, care, and a willingness to challenge default assumptions.
"It takes more work to design for users at the margins," Klein said. "But those users often reveal where systems break down."
When asked what a more feminist or equitable recommender system might look like, Klein was skeptical.
"I'm not sure there's such a thing as a feminist advertising mechanism," she said. "But one feminist principle we can take seriously is refusal."
For platforms, that would mean letting users opt out of targeted ads, allowing them to withhold their age or gender without penalty, avoiding punitive privacy defaults, and giving users ways to signal life changes without automatically triggering new assumptions.
For now, most platforms offer limited transparency and little meaningful control.
Living with the algorithmic lag
The algorithm lags behind real life. It clings to who someone was — or who it decided they were — because updating that identity is less profitable than nudging it forward.
For users, that lag often mirrors the same narrow life script society has long imposed. What's new isn't the pressure; it's the infrastructure delivering it.
The feed doesn't reflect reality. It reinforces a familiar script — whether it fits or not.