Longitudinal: Deven McGraw

Date
June 20, 2024

Meet the speakers

Datavant
Shannon West
Chief Product Officer
Ciitizen
Deven McGraw
Chief Privacy and Regulatory Officer

About the podcast

Shannon West talks with Deven McGraw, Chief Privacy and Regulatory Officer at Ciitizen, about:

  • The Ciitizen platform
  • Data sharing post-Dobbs
  • The future of connected health data
  • AI and health data privacy

Full Transcript

<Theme music> 

I'm Shannon West. I believe we all want a better future for healthcare. I've spent much of my career on core issues, serving as the Executive Director for Healthcare at the United States Digital Service, CTO at the Innovation Center at the Centers for Medicare & Medicaid Services, and today, I'm the Chief Product Officer at Datavant. I want to support patients in accessing their medical data any way they can, whether that’s through apps, fax, email, or people walking in the door. On this show we invite guests to talk about exactly that. 

(01:56) Part 1: Meet Deven McGraw

Shannon: Here we are. Thank you, Deven, for being willing to jump on today and chat more about you, about your history in helping patients access their medical records, and about what's happening in the industry as a whole.

So maybe to just jump in, would you share a little bit about who you are, what you're doing today, what you've done in the past at a high level? And then I'll jump in with some specific questions as we go.

Deven: Okay, great. Thanks, Shannon. I'm excited to be here. I love to talk about all this stuff, as you know.

So, I am the Chief Regulatory and Privacy Officer at Ciitizen, which is a company that helps patients gather all their medical records from all the places where they've been seen so that they then have that data and can use it for self-care, to coordinate care, to find second and third, fourth, however many opinions, or to contribute to research, or to do none of that and just stare at their health data all day long. Whatever is their choice.

That's what we do.

And prior to joining Ciitizen, I was at the Department of Health and Human Services where I was the Deputy Director for Health Information Privacy, in the Office for Civil Rights [at HHS], so all things HIPAA, and the acting Chief Privacy Officer at the Office of the National Coordinator for Health IT.

Shannon: Awesome. And I think my favorite fact about when we met is, I was just starting at the Centers for Medicare & Medicaid Services, you had just left OCR, and everyone kept saying, Deven McGraw is the only one who can help us figure out how to answer questions on patient privacy. And so even though you were no longer at HHS, I was still emailing you on your personal Gmail, asking for advice on how to go to OCR to ask the questions that I needed answers to. So, I think you might be the OCR-Navigator-Privacy-Guru for many people.

Deven: I try. They're very busy and very short staffed at OCR. So even though I don't officially speak for them anymore at all…you're not anywhere for 2 to 3 years without taking some knowledge with you.

So I try to be helpful.

Shannon: It is fascinating how under-resourced I think OCR is, and has been, as we continue to pile on mandates for patient privacy, enforcement of the various regulations, things like that.

Okay, so the first time I met you, you were at OCR. What did you do before the Office for Civil Rights?

I actually don't know this, so I'm super curious.

Deven: I had been at the Center for Democracy and Technology, directing its Health Privacy Project. So a nonprofit, nonpartisan, civil liberties organization that promotes digital rights on the internet. And they had some funding to focus specifically on protections for health data as we increasingly digitized it, whether that was digitizing information in the traditional healthcare system or the growing amount of digital data being collected from consumers' and patients' online activities, both knowingly and unknowingly.

So it was a great job. I was there for six years, I think. It's nice to have funding to think about these issues full time, and to think about them through this neutral lens, right? I wasn't working for a company. I wasn't working for the government. I got to be really objective.

And that was super fun. And I did a stint in a law firm in between that and HHS. I was a partner at Manatt, Phelps & Phillips, helping companies comply with HIPAA, which is a job I really liked. I like telling people what they need to do, to do the right thing with patient data. But the job I took at the Office for Civil Rights opened up after I had just been at Manatt for a year.

And it really was the top career job in that office. So all things enforcement, all things policy. I had this laundry list of things I wanted to do to make HIPAA work better, particularly for patients in terms of their rights to get data. And it was a really amazing opportunity at just the right time to go in and try to make a difference.

Shannon: I love that. And you are so right. There are so few moments in our careers where we get to be really unbiased in the way that we approach a problem, because we are always acting on behalf of an organization. And it is always interesting in this space to think about things through that lens.

Deven:  It's really nice.

Shannon: So when you started at Ciitizen, one of the things that I remember early on - and actually I have this memory of us sitting in a cafe in San Francisco, I think, talking about this - were the scorecards that you guys put out as a report. Were they scorecards or report cards?

Deven: We called it the Patient Records Scorecard. But it was really kind of more maybe a cross between a scorecard and a report card, because we gave grades. We gave numerical grades, but we also provided feedback to any organization that was on the scorecard that wanted it, about why they didn't get a good score, or if and why they did, and created this entire framework for evaluating what happened when we put in a request for our users.

So I should probably back up a minute.

One of the things I got to do when I was at OCR was to write pretty extensive sub-regulatory guidance: FAQs on the right of patients to get their health data, because it's a right that has existed for more than 20 years, but patients really struggled to exercise it.

They were charged too much money. They were told they had to come in person. They were told they didn't have the right to get their data. And so there was a need to sort of elevate the profile of that issue and to make sure that entities covered by the rule, who would be required to make data available to patients upon request, understood what their obligations were.

So we had just gone through this period of writing this guidance, and I get to Ciitizen, I'm one of the first employees, we're sending out our first requests for some of our beta users of the platform. And I just figure, oh, I put this guidance out, they know what the requirements are. Sent out requests. Waited 30 days, which is the time frame under HIPAA - an entity can take up to 30 days to send your records - and no records came.

What the heck just happened?

So then we spent a bunch of time making phone calls, trying to figure out what went wrong. And it became pretty clear to us that notwithstanding all the work we did to issue this guidance, people either weren't aware that it existed, or they were ignoring it, or enough attention hadn't been paid to making sure that these entities were in compliance.

And so we decided that if we wanted to improve it - you know, there's this whole quality measurement enterprise that exists in healthcare, where if you measure something and publicly report on it, it improves. It's like, we need to rip a page out of the quality measurement playbook and start publicly reporting, by name, on how well institutions are doing in getting patients their data.

So we decided that we would publish, without patient names on it, just our experience. And we did a survey, actually, because we only had record requests in to like 75 organizations. It's not a whole lot of data, but we did a survey of 3,000 hospitals and asked them, how do patients get their data? And then we published it on a preprint server.

We never took the step to actually get it submitted to a peer-reviewed journal. But it was published on a preprint server with the intent of peer-reviewed publication. This was our experience. I don't know whether it made a difference or not, but people became aware that this effort was underway.

I got, of course, several phone calls from some health information management departments. What is this? What are you doing?

We continue to regularly publish the scores of organizations that we requested data from. It is tied to our database that we use to record whether data requests have been responded to. So it automatically populates. It has more than 5000 organizations…

Shannon: Wow.

Deven: Offices and hospitals on it now. And it's a score of 1 to 5.

You know, you get a “5” by doing extraordinarily well - like above and beyond what HIPAA would require. And then a “4” is you're compliant. And then a “3” is, well, you know, you came close. I have to refresh my memory on the scores. But it's all very transparent. We remain open to entities calling us and saying, why did we get this score?

There are still a lot of ones. We even created a “0” score, which means we didn't get any records. A “1” score is, we got records, but you were out of compliance with one or two provisions, usually around fees or timing.

Shannon: It's super interesting, and it's interesting to hear you react like you don't know if it made a difference, because I think it did.

It was really the first time, I think, that a health information management department was being measured in a way that was public, that people could see - I don't think your intention was for it to be like a wall of shame, but it could be that. And I do believe sunlight is the best remedy for us to look at any of these problems and think about how we should solve them as an industry, because before that moment in time, all evidence for what was happening was really anecdotal, right?

It was very much the Tweets that you see from people complaining about not getting their medical records. It was, you know, the comments on LinkedIn. And it was typically people in the industry complaining. It wasn't even in many cases the average patient. It was people who knew how to navigate the system, who knew how to advocate for themselves.

Deven: Yeah. 

Shannon: So maybe to flip to this: that was actually the first time that I really heard of or knew what was happening at Ciitizen.

(12:55) Part 2: Ciitizen

Can you give me the overview: what is Ciitizen, and what do you guys do? How do you support patients? How do you support the future of care? We'd love to just be able to share that with folks.

Deven: Yeah. So again, Ciitizen is a platform - it's an online platform. It's free to patients that we serve and we don't serve every patient. And I'll explain why in a minute. We use your right as a patient to get all your medical records. And that right is kind of anchored in two federal laws. One is HIPAA, which we've just talked about.

And then the other one is the Information Blocking rules, which frankly, did not exist when Ciitizen first started. It was HIPAA, HIPAA, HIPAA, we'll just rely on the HIPAA rules. We now talk about whether an entity is potentially not in compliance with the Information Blocking rules, but they're not really being enforced against medical providers yet.

So that doesn't have quite as much pull, I think, as pointing out to someone, you're out of compliance with HIPAA. But we use HIPAA Right of Access. We ask patients where they've been seen, and then we go get their records. We ID proof them, and at least in a sort of version 1.0 of Ciitizen, we actually would send a copy of that driver's license and a signature (people would sign that they wanted their medical records requested).

We serve patients who are very sick with particular diseases, for lack of a better way to frame it, because it's not just a shoebox of your documents. What happens when you get medical records and you're sick and it's multiple providers is it's thousands and thousands of pages of records and some of it is duplicated data and some of it is conflicting data.

So we will also prepare summaries for patients of, here you are in your care journey. And it takes a bit of time to abstract that data. And it does mean it has to be sort of disease-specific. So we can't serve all patients with that kind of a model because we don't just give you, oh, here's all your documents. Good luck. We try to give our patients, our users, something that works for them and helps them.

And then one of the things that we also allow them to do is to share their data with researchers, that de-identified abstract of their medical records. You can say no and still have a Ciitizen profile. We'll go get your data. It's yours for free. If you share it for research purposes and it's licensed to a commercial researcher, like a pharmaceutical company, then you get a portion of that licensing fee, and that is also the business model of the company because again, it's free for patients, but it is your choice. You don't have to participate in research. Otherwise it's not really a choice if I tell you you can't have a Ciitizen profile for free unless you participate in research.

So we have been increasingly focusing on patients with rare disorders. And within the broad universe of rare disorders, rare neurological disorders for now. But obviously we're going to expand that. Patients who are very sick have a lot more motivation to collect and use their data than those of us who are healthy.

I don't know about the rest of you who are fortunate enough not to be sick - I don't look at my health data. I collect it because I've had enough experience with changing providers to know that if you don't have copies of that data, it becomes very difficult to trace your medical history. Ideally, when we have complete interoperability in the healthcare system, maybe it won't be so incumbent on patients to sort of bring their data to every encounter, but that still is an issue.

I actually just, had to forward my report from my prior colonoscopy. This is my third. Guess what age I am? But it's like, oh you've had prior ones, do you have the reports? I'm like, yes, I do. And it still mystifies me why I'm the one who is supposed to do all of this.

But you can see for people who are very sick, it really makes a difference for them. And parents of sick kids in particular. And a lot of our rare neurological disease patients are kids.

Shannon: What are the dynamics like when it's actually a parent requesting a medical record for a kid?

It's harder, right? There's some additional nuance to it. I would love to maybe just hear a little bit about that piece because I've only heard rumors of it. I have experienced this a little bit with my child. But, you know, I'm extremely lucky that she's very healthy. But would just be curious, what are the nuances there?

What do you have to change about your process? Or how do you also think health information management teams should be changing in order to support it? I would be curious about all of that.

Deven: It's really tough. It's not tough for a child 12 years of age or under for the most part, because there are few instances where a child that young is able to consent to get medical care in his or her own right.

As a child, your parent, or parents, or legal guardians are the ones who direct your medical care, who consent to you getting any kind of treatment at all. And the privacy laws kind of mirror the consent to treatment laws.

As a minor, once you start to age in to needing care where you don't necessarily have to tell your parents – sexual health, reproductive health, mental health, substance abuse treatment, HIV treatment  – it really varies by state as to which types of treatment you can consent to on your own, as a minor, without a parent. And then when you are able to consent to that treatment, it's really up to you whether that information is shared with a parent.

That creates a lot of friction when you're talking about trying to build a platform that is online only, where people don't present in person. How are you going to demonstrate to a health information management department that you've got - it's really hard to identity proof a minor, even a minor who's aging, a teenager leaning towards adulthood.

Even if you could do that, how do you get consent from the minor? You can't actually even set up their own account for them until they're at least 13, because of federal laws protecting children online. All of which makes sense. But it just creates some friction once kids start to age into adulthood.

And then for the populations of kids that we serve, those kids are of varying levels of whether they are competent to consent to their own care.

Some of these kids with rare neurological diseases have had, from birth, developmental and intellectual disabilities that mean even once they hit the age of 18, they are not necessarily making their own medical decisions and are still very reliant on their parents. But kids - and we're learning this as a company - even kids who have the same disorder have varying levels of functionality. Some kids will be able to make some decisions by themselves. Some kids will not.

So it gets a lot trickier once you get to age 12 and beyond. When they're really young, it's not hard at all. We also collect a birth certificate if the health information management department wants proof of some parental or guardianship relationship. That doesn't always happen, but sometimes it is requested. So you have to have that documentation.

And then again, as you get into the type of care where kids can consent on their own, those records are mixed, right? There's care that the parent consented to, but there might be care that the minor wanted, consented to, and didn't allow the parent to hear about.

And so it gets super tricky. And it's a challenge to navigate. It's an important issue to resolve. Even in portals, if you think about connecting apps into FHIR APIs, a lot of times a parent has access to the minor's portal until the minor reaches a certain age, and then I've talked to parents who are like, I can't access any of my kid's data anymore. They just cut me off.

Shannon: I've heard that from people as well. Maybe to pull on - I've got two different directions I want to take this - and I’ll pull one thread first.

(22:40) Part 3: Data Sharing post-Dobbs

But relative to how you talked about sensitive data for minors, I think we're also seeing in a post-Dobbs world really different applications of how we should think about reproductive data in state health laws. How are you thinking and navigating these issues of privacy and Right of Access in the face of those changes? Is Ciitizen feeling an impact? Are you thinking about it generally?

Deven: We're not feeling an impact as of yet, in part because we source data only with the patient's consent. And once the data is in a Ciitizen account, it doesn't get shared unless that patient has consented.

So we're not subject to the HIPAA rules around, you're allowed to share this with law enforcement under certain circumstances. And the Office for Civil Rights has proposed - and I understand it was cleared from OMB, I saw some chatter about this on social media - so we could see this week the finalization of some rules that are intended to create stronger protections around sharing for purposes where the care that was delivered was lawful, but the information is being sought in order to enforce a civil or criminal set of penalties, or for an investigation.

And, for us, we would not release data to anyone seeking it without the consent of the patient. Obviously, if there's a subpoena from a court, we need to be prepared to figure out how we're going to defend against that. We haven't seen it yet. It doesn't mean we won't. I worry about it, but haven't seen it as of yet.

Shannon: In the privacy circles that you're in, what are some of the hot takes as people are thinking about this? I know on our side in health information management, one thing we're really thinking about is the variations across states can make compliance really hard.

And I also think about the implication of that compliance down to the provider level - providers are trying to figure out how they comply with it. And I think it was New Jersey where there was some kind of general language around reproductive health, which could mean anything. It could mean, if in your appointment with a physician they asked about the date of your last menstrual cycle, does that count as reproductive health and does it need to be withheld? There are questions about men and how they're implicated in it as well.

I would just be curious, what more broadly are you hearing, thinking, seeing? What's your take on that?

Deven: It's incredibly complex. I think the state laws definitely create an additional layer even on top of what OCR proposed.

So let's just assume that what OCR proposed goes into effect. I frankly like the way they handled this issue, which was not to say, oh, we're just going to require patient consent to share any of this type of data. Because that creates obstacles to care delivery, as opposed to, let's hit the problem straight on.

The problem isn't with legitimate care delivery. The problem is when people start to use this data in ways that could be harmful to a person or their medical provider when they received a completely lawful service.

And yet the information about the delivery of those services is then weaponized, essentially, against them. Let's get at that. Let's prevent that use, as opposed to, we're going to gum up the machinery around any flows of data.

The states have done things differently, saying that information can't leave the state without the patient's consent. I don't know how those laws are going to get implemented.

I think we are going to see that care is going to be impacted. On the one hand, we don't have great interoperability to begin with, so information is not flowing as well across state lines as we wanted it to. But we do have some interoperability, and information about reproductive health care, or sexual health care for that matter, is relevant to all sorts of other types of care.

You can't isolate it and say, as to the lower half of my body, nobody can find out what's going on there, but you can treat the rest of me effectively. Sometimes it's relevant, sometimes it's not, but then you don't have a whole history of a patient. That's obviously not ideal. And I think we're going to see, in the same way that we did with substance abuse treatment, siloed health care.

But it'll be siloed health care for women of a broad age range, like let's say, 12 to 50, 55.

Shannon: Yeah. I mean, even talking about post-menopausal care could be considered reproductive, right? I mean, yes, this is a super interesting part of both protecting individual patients and their privacy, and also thinking about it from a system-level perspective: how do we do that in a way that doesn't gum up the areas where I think HIPAA has allowed for data sharing that's really important for the health system overall? And then, two, from our lens, thinking about it through the perspective of implementation: how do you implement this, and how do you do it in a way that makes sense, where individual providers can internalize the various policies? We're definitely keeping an eye on it. I think it will be critical for us to just see how things unfold.

Maybe pulling on my second thread here, I want to talk a little bit about the national networks, and we'll leave kind of current issues to the side. I'm really curious how you think about patient involvement in the national networks, and maybe we will expand that to health information exchanges in general.

There's so much exchange of information that happens where patients don't know that it's happening behind the scenes. I think a big portion of that is administrative. When we sign the HIPAA waiver in every single care setting that we go into, we are agreeing to that data being exchanged. We're usually agreeing at the same time to participate in the health information exchange.

(30:01) Part 4: Imagining the Future of Data Sharing

As you think about the future of more data sharing happening through these mechanisms, what are the issues that you're thinking about, or the opportunities that you're thinking about, for pulling patients into the middle - to have patients really understand what's happening with their data, or be engaged in the process of helping to share, or whatever that model might look like?

Deven: I think the networks present unprecedented opportunities for patient engagement. If you think about today, we've kind of orchestrated patient engagement through this: knock on the health information management department door, and knock on each door separately, during business hours, to get the information. That obviously is not terribly efficient. Then you have the open standard APIs and certified electronic medical records: it works if your providers have those records with those APIs, works if those APIs are fully turned on, works if the institution has made those endpoints public, publicized them, sandbox-tested them, and they're ready to go. And even then, you still have to connect at each and every place - it helps if you have an app. Otherwise, you're stuck downloading from each of your portals, which is not…

Shannon: It’s tedious.

Deven: It's tedious.

So, you know, what do they call that? Hyper-portal-itis? Hyper-portal-itis. And then, of course, the tokens are supposed to persist, but they often don't. So let's say you do all the work to get yourself connected. Then you get kicked off. You have to re-engage, and it's just the data that's available through the electronic medical record, which is more data than it used to be, but there's still data that's subject to the right of access that's left out. Images are a big category that I think about. DNA and genomic testing is another, pathology reports are another.

So what the networks give you the opportunity to do is, I'm going to send one query into this information superhighway, and from all the endpoints that are connected, it's all going to come in through one query when I need it, at that time. I don't have to maintain any persistent connections. Ideally, I ID proof myself one time and I get it all. And because I have that connectivity into the national network, that also presents opportunities for, oh, we need to get this patient's consent for something, we can shoot a request to them at that point.

That's not necessarily being done now. We're still trying to work out the kinks around the record access. So I think the opportunities are huge.

There's a large pocket of resistance that exists out there, some of it because of the opportunities for patients to be abused by companies claiming to represent them or who are representing them. But the patient may not necessarily be fully aware of what they've signed up for. Is this really the patient or is it a bot?

And then matching issues as well. Our breach notification rules make it a reportable breach if you send information to the wrong patient's records. And so we're seeing the criteria for when a match occurs being set very, very high. It has to be a unique match, only on data that's been verified by an objective source.

And that creates a high bar. People have to have a government-issued photo ID, which is most people in this country, but you can imagine that there are pockets of people getting health care out there who don't necessarily have a reliable government-issued photo ID, or do have some concerns about putting their identity into a system when they don't have guarantees that it won't be used to harm them.

So lots of kinks to work out. But the potential is clearly there. I query for my records. I get them from everywhere I've been seen, and I don't have to maintain system connections. And ideally it's all the information that I have a right to and not just what's resident in some of these electronic medical records.

Shannon: How do you think as an industry we should think about mitigating the risks there? Some of them that you highlighted are really real. We launched a tool a number of years ago to help patients request their medical records from our individual health systems, and it really is about patient access. It's a simple form that gets filled out. We're working on the ability to turn that on so it's digitally turned around really quickly for them.

We also find that non-patients are using it to try to request records for patients. And it's not like it's happening 90% of the time, but it's not not happening. And so, you know, I think about that often in this context, especially as we think about the blurred lines of who really gets to make the call on when a record gets released, or who approves it.

Do you have an ideal solve here? What do we do as an industry to make it better so patients could potentially use these other mechanisms?

Deven: I think we don't do enough vetting of who's allowed to do these queries and for what purposes they're allowed to query. We've sort of become dependent on the connectors, who provide a valuable service of making sure that you can get connected into one of these networks, so you don't have to establish your own connections. But they make money if they have a number of customers that they're connecting in. And so we're asking somebody who gets paid only if they facilitate your connection to be vetting whether you deserve to make a connection and be able to query for this type of data or not. I'm not sure that that's a viable solution, because of the inherent conflict of interest.

But historically, even before we had national networks, we had regional health information exchanges. And even though they were also paid only if they had a number of members paying fees, most of them were nonprofits. They risked breaching the trust of the other participants if they started letting in folks who didn't really meet the criteria for querying for a certain use case - like they weren't really treating providers, or they weren't really health plans, or they actually weren't really representing patients, but instead representing a business interest that might or might not have gotten authorization from the patient to query those records.

So there was at least a vetting mechanism in place.

Today, we're kind of dependent on whether you've got the proper tokens. So we're trying to fix this with tech. And we're trying to suggest that, well, if everybody signs the same flow-down provisions, that's going to work, but we don't necessarily have the right mechanisms of accountability for when people abuse the system. And the health data economy is so big and has so much money attached to it that people are looking for pathways to get data - pathways that have the least amount of resistance, that have the smoothest possible onramps. And it shouldn't surprise us that people are trying to squeeze through whatever doorways we've created. And yet there are rules around who can go through those pathways, and we don't necessarily have the right mechanisms to enforce them.

And I don't know who does it. I don't know if it's the federal government. I don't know if that's a role for a nonprofit like the Sequoia Project or the CARIN Alliance - one that can't be subject to industry capture, that does things fairly, that has some public accountability, and that people trust.

We at Ciitizen would be more than happy to open up our books, open up our doors, here's who we are, here's what we do. We believe we represent the interests of our patients. And if any patient wants to close their account, doesn't like what we're doing, they are free to do so. We don’t continue to query under their names for their data. We give them back their data.

You know, there are just certain sets of requirements and standards that we can create - a set of expectations. But the enforceability of that, and remedies, and consequences for bad behavior…

Shannon: And to the point we were talking about earlier with OCR - not that it's unrelated to this - I do think in general we are suffering from an inability to enforce some of the new regulation that we have now.

How do we think about this in a broader context? And I think there's a lot of room for, either a continuation of this administration or a new administration to also be thinking about that in terms of health data, in terms of certified health IT, in terms of how we think about provider participation, etc.

(39:49) Part 5: AI and Data Privacy

Maybe to switch gears to another hot topic right now: AI and privacy.

And speaking of the data economy, I am consistently floored at how much money AI companies are raising, especially in the healthcare space. It is both exciting as a patient to think about a future of precision medicine that is really driven by all of the exciting advancements in technology. It also is a little terrifying to think about how much potential infringement on privacy could be happening.

I would love to hear your overall take on AI and privacy and health care and what we should as an industry be doing or looking out for here.

Deven: I'm pretty excited about the potential for AI, and similarly, of course, have concerns that if we don't put some guardrails around this, we will increase the disparities in health care rather than reducing them.

We will abuse people's trust because we've collected a whole bunch of data about them, and we're not transparent about it. And, you know, we have sectors of our economy making gobs of money and people are still struggling to get the care that they should be getting. So, you know, AI doesn't solve all the problems of health care.

But again, I came out of the quality measurement movement when I worked at the National Partnership for Women and Families. And you see just how often people don't get the care that they're supposed to. So the idea that a machine is going to make a decision versus a human does not scare me.

In fact, I already know that humans make lousy decisions, so being assisted by a machine…But only if it's done right, if it's tested, if we know the data that's gone into it. Do we know that it's been tested on populations of people that it's going to be used on? I worry a lot less about that as a white woman, but maybe I should, because I'm a woman and it's probably all male data. There are many ways that this can go wrong.

From a privacy standpoint, we've had a de-identified data economy that has existed in this country for years. AI just basically turns the heat up on all of that because it's de-identified data that's largely going to be used to feed these algorithms, and we don't regulate it.

Once the data falls out of protection from HIPAA, once it's been de-identified under HIPAA standards - assuming that HIPAA applied to it in the first place - it doesn't get regulated at all. And so now I think, because of the ways that AI can be used to harm people, and the fact that data is going to be collected from all sorts of sources that maybe weren't mined before for just regular old medical care…

HIPAA doesn't have very many collection limitations. So, you know, you have people getting clinical data, claims data, and data from all sorts of other sources, merging it together and trying to figure out how to create an AI algorithm that is more predictive, that makes care more efficient, more effective - and we don't really have exactly the right infrastructure for that.

On the other hand, we're not operating on a blank slate. HIPAA already has guardrails for when you can use data for certain purposes. But again, de-identified data falls completely out and we already have a health data economy that is largely unregulated, that people are getting increasingly concerned about, how did my data get into this data mining soup? How did that happen?

It's something that we should pay attention to. On the other hand, far better for it to be de-identified data, because privacy is enhanced when it's data about me but it's not tied to me. Like, okay, it's a newly 60-year-old woman with, you know, a history of X, Y, and Z, but they don't know it's Deven McGraw.

Shannon: Yeah. Like the algorithm isn't going to accidentally spit out your name. Yeah, it is such an interesting juxtaposition to just think about the future of healthcare that we want that is really assisted by computers, is really assisted by technology, solving problems from, you know, disparate issues in health care delivery.

Today, when we think about rural health, I think about every single time a critical access hospital might be closing, for a variety of reasons. But just thinking about people who are 3 to 10 hours away from the closest large hospital, who might need, or could be assisted by, care that's technology-driven.

And there's just so much there for us to really think about. But I do continue to believe patient privacy has to be at the center of it, because it can also go very wrong. I don't live in a world where I totally believe in a dystopian, wild future, but sometimes I get a little paranoid.

Deven: As we all should. If we didn't get paranoid, we would just run off the rails really quickly, because the money's going to drive where this goes, as always. Right? If we don't think about what consequences we have to control for, we won't get any of the benefits, or we'll get fewer of the benefits, or we'll have a bunch of systems that nobody trusts.

Shannon: Which is also terrifying. Like, you can think about marginalized populations not utilizing health care in that moment? 

(45:43) Part 6: Shannon’s Lightning Round

All right. This has been wonderful. I'm gonna close this out with a quick lightning round of three questions.

Deven: Okay. I'll try to give you quick answers! I'm sorry I’m so long-winded.

Shannon: No, I love it.

I do. You have a wealth of information that I wish we could share with everyone far and wide. 

So first question: How would you fix American health care in two sentences or less?

Deven: More transparency about health insurance operations.

Shannon: Ooh, love that one. Yes, absolutely. Speaking of algorithms.

Deven: More transparency and a bit more regulation of health insurance because there's a lot of money being made there and a lot of people suffering.

Shannon: What are you most optimistic about in the health care or health tech space?

Deven: I'm pretty optimistic about patient-directed research and development. I just love the patient registry movement. And I love that we're a part of powering that. It's exciting to watch communities of patients take things into their own hands and now they have the ability to get the data to get that done.

Shannon: Yeah, that is really exciting.

Three books you think everyone in health care should read?

Deven: Oh, Lord, I'm terrible about reading books. I really am. I have to read too much for work.

Shannon: That's so fair. Is there one healthcare book that you feel like really changed your opinion? I feel like I reference An American Sickness all the time.

Although everyone at the US Digital Service at one point in time read that. But I'd be curious if you've got like 1 or 2…

Deven: Definitely the Institute of Medicine / National Academy of Medicine publications - To Err Is Human and the work on the learning health care system. I do wish people were a lot more educated on the fact that our system doesn't work nearly as well as it should, and it costs too much. Because I think, in general, the voters in our country seem to think we need to preserve what we have, when the reality is that what we have doesn't work well for very many people.

Shannon: Yeah, yeah, that's super fair. Super fair. Awesome. Thank you so much, Deven. This was so much fun.

Deven: Thanks for asking.

Shannon: I know we get to chat moderately often, so I'm excited to get to share some of your brilliance with folks who will tune in and listen to this conversation as well. Thank you.

Deven: Appreciate it. I'll talk with you at any time. It's really fun. Thank you. 

Shannon: Of course.
