The Strategy of Health

How AI and Data Equity Are Reshaping Patient Access: A Candid Conversation with Meghan Gaffney, CEO of Veda Data (now H1)

By: The American Journal of Healthcare Strategy Team | May 05, 2025

Introduction: The New Frontier of Patient Access—Why Data and AI Matter Now

In an era where healthcare transformation is measured by more than just cost and compliance, the real battleground is access—who gets it, how fast, and how well. At the intersection of artificial intelligence, automation, and equity, new solutions are rewriting the rules for healthcare organizations, providers, and, ultimately, patients. As health plans race to personalize care and cut operational waste, the role of AI in navigating the labyrinth of provider data has become mission-critical. But with buzzwords everywhere, few leaders are bridging the gap between technical promise and real-world results.

Enter Meghan Gaffney, CEO of Veda Data (now H1), who brings a rare perspective: a policy insider turned tech entrepreneur, with a pragmatic, story-driven approach to solving one of the industry’s knottiest problems. On a recent episode of the American Journal of Healthcare Strategy podcast, Gaffney shared her journey, hard-won lessons, and what it really takes to make data work for everyone. This post unpacks the conversation—so whether you’re a health system executive, payer leader, or policy-minded innovator, you’ll leave with concrete insights (and more than a few quotable moments).

From Washington Policy to Tech Startup: Why Meghan Gaffney Jumped Into AI

Question: What inspired a seasoned policy leader to pivot into healthcare AI—and what does it reveal about today’s access crisis?

In Meghan Gaffney’s own words: “Like a lot of entrepreneurs, it was a personal experience.” After spending nearly 15 years in Washington, D.C.—“I was in the policy space in Washington DC for just under 15 years and was there during the time when the ACA was crafted”—Gaffney knew the system’s blind spots from the inside out. But it wasn’t until she tried, and failed, to navigate her own daughter’s care through well-insured, well-connected channels that the gap became real. “I could not figure out how to get an appointment with the right specialist for her and get her treated quickly for anybody that was covered in my insurance plan…If I couldn’t do it literally sitting in the halls of Congress…then nobody could.”

That frustration became the genesis of Veda Data. The goal: build scalable AI tools to help people find the right provider, equitably, and fix the “invisible” infrastructure problems that policy debates often miss. Her journey is a case study in how industry outsiders, with grit and perspective, are now shaping the future of digital health.

Key Takeaways:

  • Personal pain points—especially for “insiders”—are powerful innovation triggers.

  • Fixing access means going beyond buzzwords to solve real workflow and data infrastructure challenges.

Building AI Solutions Before “AI in Healthcare” Was a Trend

Question: How do you create cutting-edge AI for healthcare when you’re not in Silicon Valley—and years ahead of the hype cycle?

Gaffney’s path wasn’t typical. Her technical co-founder was “an astronomer…building AI tools in radio astronomy”—not a classic health IT hire. In fact, the team’s initial foray was a 2016 hackathon, where they worked with real provider data and had to innovate out of necessity: “We couldn’t make 10,000 phone calls to try to validate the data…so we had to build something scalable and data-driven.”

What set Veda Data apart was its commitment to:

  • Adapt academic machine learning to healthcare data: Borrowing methods from radio astronomy, where “there’s so much data…they couldn’t process it manually…so they had to start writing machine learning scripts to clean up all the errors.”

  • Focus on human impact, not just technical novelty: Gaffney notes, “We build a system that can find the errors automatically, clean them up, and then take disparate data sets and link them together so we can get the right answer.” (A rough sketch of this clean-and-link idea follows this list.)
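
The interview doesn’t spell out how Veda’s (H1’s) pipeline actually works, so what follows is only a minimal sketch of the general idea Gaffney describes: link provider records arriving from different sources and flag fields that disagree so a human can review them. The field names, sample records, and matching threshold are illustrative assumptions, not details from the conversation.

```python
# Illustrative sketch only; NOT Veda's (H1's) actual pipeline.
# Idea: link provider records from two sources, then flag conflicting fields.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough string similarity in [0, 1], ignoring case and extra whitespace."""
    norm = lambda s: " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# Hypothetical records from two different feeds (e.g., a roster file and a claims file).
roster = [{"name": "Dr. Jane Q. Smith", "address": "123 Main St, Madison, WI", "phone": "608-555-0101"}]
claims = [{"name": "Jane Smith MD", "address": "123 Main Street, Madison WI", "phone": "608-555-0199"}]

MATCH_THRESHOLD = 0.75  # assumption: in practice this would be tuned against labeled pairs

for r in roster:
    for c in claims:
        score = 0.5 * similarity(r["name"], c["name"]) + 0.5 * similarity(r["address"], c["address"])
        if score >= MATCH_THRESHOLD:
            # The records likely describe the same provider; surface disagreements for review.
            conflicts = [field for field in ("phone",) if r[field] != c[field]]
            print(f"Linked (score {score:.2f}): {r['name']} <-> {c['name']}; conflicting fields: {conflicts}")
```

A production system would rely on trained matching models and far richer features; the point here is simply that linking and error-flagging can be automated rather than resolved by “10,000 phone calls.”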

Their Midwestern roots and scrappy, science-driven culture became an advantage, not a liability: “Our company was founded in Madison, Wisconsin—not in Silicon Valley…What it made us do was build things in a smart way that were scalable…it made the technology more affordable for our customers.”

Key Takeaways:

  • Breakthroughs happen when you cross-pollinate disciplines (astronomy + healthcare).

  • Starting “outside the bubble” can force more practical, cost-effective innovation.

Why Accurate Provider Data Changes Everything—And How AI Makes It Possible

Question: What’s the real-world impact when health plans get provider data right—and what role does AI play?

For payers like Humana, the answer is simple: getting provider directories right translates directly into better access, higher member satisfaction, and real cost savings. As Gaffney shared, “For Medicare Advantage, one of the things that matters most to seniors is when they need an appointment they have to find care, and so the accuracy of their directories…is a huge benefit for their seniors getting access to care and access to the right care.”

How AI-powered data accuracy drives value:

  1. Enriched specialty and sub-specialty data:
    “If you need a neurologist, you really need to know does this person treat Parkinson’s or are they treating nerve pain—two very different outcomes.”

  2. Operational cost savings:
    “They don’t have to use their team to do things like make outbound phone calls to figure out the provider’s fax number—they can use that team to be responsive to what their members are looking for.”

  3. Enhanced member experience:
    Seamless, accurate directories mean fewer dead ends for patients—especially vulnerable populations—reducing frustration and accelerating care.

Key Takeaways:

  • Data quality isn’t a technical detail; it’s a frontline determinant of patient access and satisfaction.

  • AI can “scale” what would otherwise require armies of manual phone calls and data cleanup.

The Data Mess in Healthcare: Turning Chaos Into Reliable Insights

Question: With provider and facility data coming from everywhere, how do you actually clean it up—and why does measurement culture matter?

Gaffney doesn’t sugarcoat the challenge: “It’s 100% a mess.” The team’s approach is deeply scientific, drawing from the hard sciences’ emphasis on measurement and iteration. “We now have over 10 PhDs from the hard sciences and they’re used to measuring things over and over again…It’s okay for them to get it wrong, culturally.”

What sets Veda (H1) apart:

  • Relentless internal measurement:
    “We try to get the data point right, we measure it internally, and if it doesn’t work that’s okay—we just keep going back to the drawing board.”

  • Specialist expertise for niche problems:
    Gaffney’s example: “We have one former astrophysicist…his entire job is to make sure that we can get the latitude and longitude on hospital buildings and outpatient surgical centers correct…it matters when a patient’s plugging in the data to Google Maps.” (A simple distance-based sanity check along these lines is sketched after this list.)

  • Focus on details others ignore:
    Investing in “minor” issues (like mapping complex hospital addresses) pays dividends in access, navigation, and ultimately patient trust.

Key Takeaways:

  • Measurement-focused, scientific cultures adapt faster and achieve higher data quality.

  • Getting the details right can unlock “last mile” improvements in patient experience.

Equity, Hiring, and Building Ethical AI: Practical Lessons for Healthcare Leaders

Question: How do you design AI, teams, and business processes that actually advance health equity—instead of reinforcing old biases?

Gaffney is clear-eyed about the risks: “AI is a tool like any other tool…the machines are incredibly powerful but they do what they’re told to do.” If your goal is only profit, your algorithms will reflect that. But if you build with equity and bias mitigation in mind, you can change outcomes.

Strategies for ethical, equitable AI:

  • Reflect the communities you serve:
    “We actually built our data science team to reflect the communities of people that we serve…having all parts of our community reflected in the teams can be a really powerful way to prevent unintended or harmful outcomes.”

  • Recognize application and hiring bias:
    “If you look at a job description…women will only apply…if they meet 100% of the qualifications. Men will apply if they meet 50%…So in order to get more diverse teams you really need to do some outbound recruitment.”

  • Create structural incentives:
    “We give a referral bonus…if someone refers in someone and they get hired and they stay for a period of time they get a bonus for doing that.”

Key Takeaways:

  • AI is not neutral—diverse teams and intentional objectives produce better, fairer tools.

  • Outreach and thoughtful recruitment are essential for real diversity; passive pipelines won’t cut it.

Why Retention and Fair Compensation Are Strategic Imperatives

Question: How can startups justify the cost of hiring top talent—and what’s the ROI for customers?

Gaffney doesn’t flinch from the numbers: “Paying people fairly is expensive. Paying to constantly recruit and hire is also expensive.” Veda Data invests heavily in people, covering “100% of our employees’ healthcare premiums for them and their family for a no deductible plan.” The payoff is employee loyalty, deeper expertise, and better results for customers—especially as healthcare problems require deep domain knowledge.

“If you had somebody that’s been thinking just about hospital addresses for two years, they can do so much more…Our customers appreciate the continuity frankly…they value that as well and are willing to invest a little bit in our team in addition to the technology.”

Key Takeaways:

  • Investing in people drives lower turnover, deeper knowledge, and better outcomes for both the company and its clients.

  • High retention and fair compensation are especially valuable in data-driven, relationship-dependent industries like healthcare.

Responsible AI: Balancing Privacy, Bias, and the Business Case

Question: How can health tech companies prevent algorithmic bias and privacy risks as they scale up AI?

Gaffney is direct: “If you have an engineering and innovation team that is building with profit in mind and they’re not thinking about bias and they don’t have objectives set out for things like making sure the outcomes are equitable, you will get a profit-only answer.” The antidote is proactive governance:

  • Intentional design and safety procedures:
    “If you ask these tools to build things with equity and bias in mind and you put the right safety procedures and checks in place, you will get an outcome that reflects that intention.” (One minimal example of such a check appears after this list.)

  • Team diversity as a safety check:
    “They understand the impacts that those biases can have because they’ve seen them in their own lives.”

  • Continual measurement and openness to correction:
    “It’s okay for them to get it wrong, culturally…in science, you can publish a paper that says ‘Hey, we thought this was a cool signal but it was actually an airplane’ and you don’t get fired for that.”
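
Gaffney keeps the discussion at the level of principles, but “checks in place” often comes down to measuring outcomes by group and alerting when they diverge. The sketch below runs a simple favorable-rate comparison across groups on made-up data; the group labels, records, and the four-fifths-style threshold are illustrative assumptions, not anything prescribed in the interview.

```python
# Illustrative sketch only: one simple equity check on a model's outputs.
# Compares the rate of favorable outcomes across groups and flags large gaps.
from collections import defaultdict

# Hypothetical records: (group label, did the model recommend the favorable outcome?)
results = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [favorable count, total count]
for group, favorable in results:
    counts[group][0] += int(favorable)
    counts[group][1] += 1

rates = {group: fav / total for group, (fav, total) in counts.items()}
best_rate = max(rates.values())

for group, rate in sorted(rates.items()):
    # Four-fifths-style rule of thumb (an assumption here, not an interview detail):
    # flag any group whose favorable rate falls well below the best-performing group's.
    flag = "REVIEW" if rate < 0.8 * best_rate else "ok"
    print(f"{group}: favorable rate {rate:.2f} ({flag})")
```

A real deployment would track more than one metric and do so continuously, but even a check this small turns “build with equity in mind” into something a team can run, log, and act on.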

Key Takeaways:

  • Build diverse, intentional teams and set explicit, equity-focused objectives for every AI deployment.

  • Accept that iteration and correction are essential for responsible, high-quality AI.

The Pragmatic, Data-Driven Culture That Drives Results

Question: What cultural values underpin Veda Data’s (now H1’s) success—and how can executive teams apply them?

Gaffney credits both Midwestern pragmatism and her time in bipartisan policy work for her results-driven, scientific mindset: “There is some Midwestern pragmatism of just trying to get things done and looking and seeing what works and being committed to doing the thing that works, not necessarily the thing that you wanted to do.”

Just as importantly, she notes the role of data in uniting teams across divides: “If they could both look at information and agree like okay this study makes sense…now we can talk about we might have different approaches to solve the problem but we can all agree on what the problem is and the data is.”

A defining hiring principle at Veda Data: “The one characteristic when we hire people that we ask our people and culture team to look for is folks that are willing to change their mind with new information.” This trait supports agility and innovation—both vital in a fast-moving, data-rich landscape.

Key Takeaways:

  • Reward open-mindedness and adaptability, not just experience.

  • Make data the neutral ground for debate and decision-making.

Where Is Healthcare Data and AI Headed Next?

Question: What does the future hold for AI in patient access and healthcare personalization?

Gaffney is optimistic: “I think what we’re going to see is the dream of personalization of your healthcare working uniquely for you…more and more a reality every day.” For this to happen, patients must see tangible, positive impacts—making their lives easier, safer, and healthier.

Her advice to other founders: “When we do right by patients at the end of the day, we’re building the industry acceptance for all of us.” The technology must work for people, not just as an abstract tool, but as an enabler of better outcomes and equity.

Key Takeaways:

  • Personalized care is no longer a pipe dream; it’s becoming an operational reality, driven by robust, responsible AI.

  • The key to future adoption is delivering real, visible value to patients, not just to payers or providers.

Conclusion: Actionable Insight for Leaders—Data Quality, Equity, and Pragmatism Win

Healthcare’s next leap forward will be won not by the flashiest algorithm or the deepest pockets, but by leaders who make data work for people—relentlessly, transparently, and equitably. Meghan Gaffney and the Veda Data (now H1) team offer a model: combine policy savvy, scientific rigor, and real-world empathy. Whether you’re building AI, managing teams, or trying to fix your own provider directory, take a page from their book: hire open-minded people, measure everything, invest in equity, and never lose sight of the patient’s experience.

For your team:

  • Audit your provider data.

  • Diversify your hiring.

  • Make open-mindedness a core value.

  • Tie every tech investment back to patient impact.

It’s not just the future of AI in healthcare—it’s the future of access itself.