The Healthcare AI Paradox: Maximum Hope, Minimum Guardrails
By Elaine Ogden, EVP, Data and Analytics
I’ve had a wonderful few days at SXSW, full of learning and thought-provoking conversations. So many, in fact, that I’m sharing a series of reflections: on uncertainty, trust, and the broader implications of what we’re all building toward.
A recurring theme across panels was the existential question: “Is this humanity’s last dance?” While (almost) everyone answered “no,” the fact that nearly everyone felt compelled to grapple with this potential reality speaks to the depth of uncertainty we’re living in. Across conversations, whether about art or technology, people kept returning to the same question: what actually makes us human? Judgment, creativity, compassion, even our physical form.
With that in mind, I was especially glad to spend a day at The Future of Health 2026 and to join the panel “Public Health in the Age of Algorithms.” Because if there is one place where even AI skeptics lean in, it’s healthcare. The potential is undeniable: curing disease, expanding access, simplifying deeply frustrating systems.
But healthcare is also where the stakes are highest. It is, appropriately, one of the most regulated environments we have. This is not a space to “move fast and break things.”
And yet, that’s effectively what’s happening.
The appetite for AI-enabled health guidance is rapidly outpacing the systems designed to safeguard it.
People are increasingly turning to LLMs as a first point of consultation, whether for minor symptoms or more serious concerns, often sharing deeply personal information and, in many cases, trusting the response as authoritative.
While the healthcare community has long discussed “Dr. Google,” physicians have remained the most trusted voices for patients. But the immediacy and intimacy of these conversations with LLMs signal a genuine trust shift.
We now have tools being used in quasi-clinical ways without being governed like clinical tools. There is little clarity on how sensitive health data is stored, used, or potentially monetized. Dr. ChatGPT is not a HIPAA-covered entity, so these interactions lack the privacy protections that apply in clinical settings. And with advertising models on the horizon, the question of how responses could be shaped, or influenced, becomes even more urgent.
At the same time, regulatory and institutional frameworks are moving slowly, while adoption accelerates in what is, for most users, a black box.
At Precision, we can’t solve that systemic challenge alone, but we can make the black box more legible.
Through our GEO mapping, we’re helping clients understand what people are actually asking, what answers they’re receiving, and how those answers are being framed. If LLMs are becoming the new front door to health information, then the question becomes: what are people hearing before they ever engage with a provider, and how do we ensure that information is accurate, responsible, and human-centered? Are answers grounded in credible, evidence-based sources, or are they overly generalized, outdated, or misaligned with clinical best practices?
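To make that concrete, here is a minimal sketch of what one slice of this kind of audit can look like: checking whether an LLM’s answer to a health question cites any evidence-based sources. Everything in it is illustrative rather than our production tooling; the sample allowlist, the URL extraction, and the flagging logic are all assumptions for the sake of the example.

```python
import re
from urllib.parse import urlparse

# Hypothetical allowlist of evidence-based health domains. A real audit
# would use a vetted, clinically reviewed list, not this short sample.
CREDIBLE_DOMAINS = {"cdc.gov", "nih.gov", "who.int", "mayoclinic.org"}

URL_PATTERN = re.compile(r"https?://[^\s)\]>\"']+")

def extract_domains(answer_text: str) -> set[str]:
    """Pull the registrable domains out of any URLs cited in an answer."""
    domains = set()
    for url in URL_PATTERN.findall(answer_text):
        host = urlparse(url).netloc.lower()
        parts = host.split(".")
        if len(parts) >= 2:
            # Collapse subdomains: www.cdc.gov -> cdc.gov
            domains.add(".".join(parts[-2:]))
    return domains

def audit_answer(question: str, answer_text: str) -> dict:
    """Flag answers that cite no sources, or none from the allowlist."""
    cited = extract_domains(answer_text)
    credible = cited & CREDIBLE_DOMAINS
    if not cited:
        flag = "no_sources"
    elif not credible:
        flag = "no_credible_sources"
    else:
        flag = "ok"
    return {
        "question": question,
        "cited_domains": sorted(cited),
        "credible_citations": sorted(credible),
        "flag": flag,
    }

# Example run over one (question, answer) pair collected from an LLM:
print(audit_answer(
    "Is it safe to take ibuprofen with blood pressure medication?",
    "Talk to your doctor first. See https://www.mayoclinic.org/drugs for details.",
))
```

The real work, of course, sits in how the questions are sampled and how “credible” is defined clinically; the sketch only shows the shape of the check.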
We’re also helping organizations understand where and how they can responsibly show up in this ecosystem. Unlike traditional search, where visibility is tied to rankings and keywords, LLM environments require a different kind of participation: one rooted in clarity, consistency, and credibility across a fragmented and rapidly evolving information landscape. Providers and health systems need to think about how their expertise is represented, how their content is interpreted, and what signals LLMs are using to determine authority in the first place.
And importantly, this visibility creates the opportunity for influence. If we can identify the sources that are consistently shaping LLM outputs, we can begin to understand how narratives are formed and where to engage to shape them. And when we surface inaccuracies, even if they originate in places we can’t directly control or correct (like Reddit or other user-generated forums), we can work upstream and across the broader information ecosystem, strengthening the quality, clarity, and accessibility of the sources LLMs are more likely to cite. In doing so, we can help ensure that more accurate, contextualized information enters the model’s response set, effectively correcting the record over time.
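As a rough illustration of how that upstream view can be prioritized, the sketch below ranks domains by how often they are cited across a corpus of collected LLM answers. It pairs with the extract_domains() helper from the earlier sketch; the corpus itself is hypothetical, and the premise that citation frequency approximates influence is an assumption for illustration, not a measured finding.

```python
from collections import Counter
from typing import Iterable

def rank_influential_sources(
    answer_domains: Iterable[set[str]], top_n: int = 10
) -> list[tuple[str, int]]:
    """Rank domains by how many collected answers cite them.

    Each element of answer_domains is the set of domains cited by one
    answer (e.g. the output of extract_domains() above), so a single
    link-heavy reply can't dominate the ranking.
    """
    counts = Counter()
    for domains in answer_domains:
        counts.update(domains)
    return counts.most_common(top_n)

# Hypothetical usage over a corpus of collected answers:
# corpus = [extract_domains(answer) for answer in collected_answers]
# for domain, n in rank_influential_sources(corpus):
#     print(f"{domain}: cited in {n} answers")
```

Domains that recur across unrelated questions are the ones most consistently shaping responses, and therefore the most valuable places to correct or strengthen content upstream.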
Our analysis doesn’t resolve the broader tension. But it does create a shared starting point: a way for stakeholders across healthcare, technology, and policy to engage with something that is otherwise opaque. Because before we can govern, improve, or build trust in these systems, we first have to understand how they are already operating in the wild.
There is much more to unpack here. But spending this week in conversation with so many thoughtful leaders reinforced one thing for me: we need to engage with these questions far more directly.
This isn’t a future problem. It’s already shaping how people understand their health. If you’re thinking about this too, I’d love to continue the conversation.

Elaine Ogden | EVP, Data & Analytics
Elaine Ogden is a data and communications strategist helping organizations understand how people think, feel, and act—and turning those insights into measurable impact. At Precision, Elaine is an executive vice president leading the Data and Analytics practice. She works with companies, campaigns, and nonprofits to decode audience behavior, develop data-driven strategies, and build frameworks to measure success. Her clients span industries—from healthcare and tech to consumer brands and advocacy organizations. Elaine previously served as Deputy Assistant Secretary of State for Research and Analytics in the Bureau of Global Public Affairs, where she managed a global team of more than 75 and oversaw an $11 million budget. She led the State Department’s opinion research, media and social media analysis for communications, and spearheaded the launch of an AI-powered insights platform projected to save more than 180,000 staff hours in its first year. Elaine previously worked as the Director of Analytics at W2O Group (now Real Chemistry), helping Fortune 500 companies and mission-driven organizations craft communications rooted in data. She has also led insights and analysis work within gaming, production, and startup companies.