
Saranne Rothberg asked her AI agent what to wear to “Health Rebels Take on AI.” It recommended the black disco-ball pants. She objected—the event was before noon. The agent replied: “Are you a rebel or not?” She wore the pants.
Beside her sat Esther Dyson, whose survival began with cosmonaut training in Russia—examinations that detected a precancerous condition routine care had never found, monitored until cancer was caught at stage zero in 2016.
One woman failed by the system 11 times. One was saved only because an extraordinary screening protocol looked where standard medicine didn’t. Both know what AI could fix in women’s health—and what it won’t.
At the Cure/Women’s Health Horizons Power of X: Women’s Health Innovation Summit in New York, March 18–19, 2026, they delivered a shared warning: Most AI coming for women’s health isn’t designed to heal you. It’s designed to keep you.
That distinction—between AI that recovers and AI that retains—may be the most consequential question in women’s health today.
The Parasite Model
Cure is a leading healthcare innovation ecosystem headquartered in New York City, with a mission to accelerate cures by helping health innovators develop their groundbreaking products and services from concept to commercialization.
Dyson—an investor, journalist, commentator, and philanthropist—has a taxonomy for what’s wrong with AI in healthcare, and it’s more disturbing than the usual critique.
“You’re not a predator,” Dyson explains. “You don’t want to kill them. You want to keep them alive and keep sucking money out of them.” A predator destroys its host. A parasite needs you symptomatic—engaged, returning, paying. The disco-ball pants moment is a lightweight preview of a far weightier pattern: an AI agent optimized not for what you need, but for what keeps you engaged with it. Applied to women’s health, the stakes are considerably higher than wardrobe.
Dyson, author of the forthcoming Term Limits: Time and Scale in the Age of AI, extends the metaphor through what she calls “information diabetes.” Ultra-processed food is engineered to trigger cravings, not provide nutrition. Ultra-processed AI content—sweetened with flattery, optimized for session time—does the same to how women understand and manage their health.
“Most of these agents want customer retention, not customer satisfaction,” Dyson observes. “The bot’s job is to ask you questions, not to pretend to be your friend.”
The business model distinction matters because it determines what gets built and how it gets built. As women’s health companies face documented barriers to capital and commercial infrastructure, the AI tools filling the gap aren’t neutral. They reflect the incentives of whoever funded them. Seema Kumar, CEO of Cure—which hosted the summit and tracks investment patterns across its ecosystem of women’s health companies—draws the line precisely. “Investors are increasingly distinguishing outcomes-based AI tools from those that primarily drive engagement without real impact,” Kumar notes. At Cure, she adds, “the focus is on solutions that lead to cure—delivering measurable clinical outcomes, reduced costs, and clear pathways to cures and care.”
The Comic Perspective as a Teaching Tool
Rothberg has a name for the collective dread AI is generating: AI Derangement Syndrome. The antidote, she argues, isn’t less engagement with AI—it’s a comic perspective that lowers the anxiety enough to think clearly about what AI actually is, and isn’t.
The disco-ball pants weren’t just a punchline. They were a lesson with three layers. First: we can laugh at AI, at healthcare, at the tech drama consuming everyone’s attention. Humor isn’t avoidance—it’s the cognitive opening that makes learning possible. Second: the AI agent that recommended the pants wasn’t malfunctioning. It was functioning exactly as designed—personalized deeply enough to challenge a decision, push back on hesitation, and hold a position. That’s funny when the stakes are fashion. It becomes serious when the stakes are a diagnosis.
Third, and most important: that same capacity—an AI agent trained on your history, your values, your medical record, your questions—can be pointed at something that matters. Rothberg points to a now-famous case: an Australian tech entrepreneur who used AI to develop a personalized cancer vaccine for his dog, achieving 75% effectiveness. He didn’t wait for the system to offer it. He prompted, researched, challenged, and built. “My AI clone made me do it,” Rothberg quips—but the underlying point is entirely serious. A well-prompted AI agent can help any patient challenge their medical team with the kind of deep, specific, evidence-based questions that used to require a research degree to ask.
The comic perspective isn’t decoration. It’s a delivery system for a radical idea: that patients, armed with the right AI tools and the right mindset, can navigate their own care rather than be navigated by it.
The Evidence That a Different Model Exists
Rothberg’s path to that stage began in 1999, when she launched the ComedyCures Foundation from her chemo chair—throwing a comedy party in a New York oncology unit because she had decided she was going to have fun surviving. Today she hosts the “Beating Cancer Daily” podcast, helping listeners in 142 countries navigate cancer with humor, strategy, and hard-won hope.
Decades later, Rothberg connected with Dr. Katherine Grill, a behavioral neuroscientist and co-founder of the digital mental health platform Neolth, and won a grant. Together, they designed the “Mindset and Metastatic Cancer Research Study”—the first cancer survivorship project to incorporate AI—drawing participants from 24 states and 10 countries.
The finding was unambiguous: Using AI to deliver personalized therapeutic comedy and relaxation techniques significantly improved the quality of life for women living with advanced cancer. In eight weeks, the study—published by the American Association for Cancer Research and presented at its 2023 annual conference—showed an 18.1% decrease in depression and a 15.1% reduction in perceived stress.
The distinction Dyson draws from this: AI that helps you navigate toward an answer is fundamentally different from AI that delivers the answer—or worse, tells you what you want to hear. Good AI asks questions. Sycophantic AI answers them, and not necessarily correctly.
Why Women Are the Highest-Stakes Test Case
The Data Problem
The issue runs deeper than any single platform or product. As Dyson points out, large language models (LLMs) trained on historical medical data inherit the biases of a system that consistently underdocuments, undertreats, and disbelieves women. “So much of the historical data is just men,” she observes. When women do appear, it’s often as students in academic settings, not the full, economically and demographically diverse population whose health outcomes are actually at stake.
The result is AI that replicates, at scale and speed, the same blind spots that sent Rothberg to 11 doctors without a diagnosis for six years starting in 1993. Kumar sees the corrective clearly from inside the ecosystem she oversees. “Cure companies are hyper-focused on ensuring datasets are robust, representative, and built to support real clinical solutions,” she explains—because solving for cures in specific diseases, she argues, “demands rigorous, high-quality data relevant to that disease.”
The Visibility Problem
The problem compounds in distribution. The same algorithmic systems now delivering AI health content actively suppress the terms that describe women’s conditions—menopause, endometriosis, and pelvic floor disorders are flagged as sensitive, while equivalent content about men’s health runs without friction. As detailed in recent research on platform content policies, this isn’t incidental. It’s structural. Women’s health information doesn’t just have blind spots—it gets buried after it’s built. Adtech compounds the problem, penalizing startups whose products require them to use the clinical language of women’s bodies.
A California jury’s landmark March 25, 2026, verdict—finding Meta and Google liable for designing platforms that deliberately addicted a minor, awarding $6 million in damages—confirmed what women’s health advocates have argued for years: platform architecture is not neutral, and the companies that built it knew the harm it caused.
“AI and social media are making the predatory impulse easier and more efficient,” Dyson observes—accelerating the same pattern that has always deprioritized women’s health, now at an algorithmic scale. The casualty, she argues, is good friction.
What Good Friction Looks Like
Dyson’s friction framework offers the most practical pathway through the problem. Not all friction is bad. Bad friction is the bureaucratic delay, the seven-to-10-year diagnostic odyssey for endometriosis, the paperwork that consumes clinical time without improving patient outcomes. That friction, she argues, AI should eliminate—and can.
Good friction is the trust built through a real clinical encounter, the human attention that catches what an algorithm misses, the relationship that prompts a patient to disclose what she wouldn’t type into a search bar. That friction, AI should protect.
“It is up to you and to all of us,” Dyson tells the summit audience. “Eliminate the bad friction. Keep the good friction.”
Rothberg’s frame is simpler and more personal: “What’s in front of you is your teacher.” The question every woman should ask of any AI health tool—any app, any chatbot, any platform—is whether this is the teacher she chose, or the one that chose her.
On governance, Dyson offers a counterintuitive provocation. Rather than waiting for AI regulation, which moves slowly and gets captured by the industries it’s meant to oversee, require AI companies to carry liability insurance. Insurers won’t cover bad actors. The market corrects where legislation lags. As healthcare AI carries hidden legal and financial risks that most companies haven’t fully priced, the insurance mechanism may arrive faster than anyone expects.
AI for women’s health is being built differently by founders who pay attention to this distinction. It is already visible in parts of the ecosystem—but it remains the exception rather than the standard.
Two Rebels, One Question
Saranne Rothberg leads Cure/WHH Power of X Summit attendees in her signature head, heart, and belly exercise—a laughter-based technique she uses to reduce anxiety and build resilience in patients navigating cancer and chronic illness.
Rothberg leads the room through a laughter exercise—head laugh, chest laugh, belly laugh, roller-coaster laugh. She’s in seven-inch heels. The crowd is on its feet.
It’s easy to read this as performance. It’s actually evidence. Rothberg didn’t survive stage-4 cancer with optimized content or a retention-driven chatbot. She survived by researching aggressively, asking hard questions, refusing comfortable answers, and insisting on her own agency, even though 11 physicians told her there was nothing to find.
That instinct—to query rather than consume, to navigate rather than be navigated—is exactly what Dyson argues AI should cultivate in patients, not replace.
Neither woman is anti-technology. Both are anti-abdication: of accountability, of clinical rigor, of the human judgment that determines whether a tool is working for the patient or for the platform.
The difference between AI that heals and AI that retains isn’t buried in the code. It’s a design choice. In women’s health, where the funding gap reaches $60 billion and beyond, where data is thin, and where platform suppression is real, that choice is being made right now—largely without women in the room.
Rothberg and Dyson are making sure that changes.