What Critical Data Could Your EHR Be Hiding?

Privacy advocates are pushing electronic record vendors to create a new class of protected health information. This sequestration of data could hide critical information from EPs and even erode the practice of emergency medicine.

 

Your longtime colleague just got a needlestick on a high-risk case. She’s going to want post-exposure prophylaxis. She gets registered, and you see her right away. After a minute or two commiserating, you log onto your ED information system, which is part of your hospital’s enterprise electronic health record system.
Just as you’re about to put in orders, you see a table of past visits, including . . . a hospitalization on the psychiatric floor, two years ago.

You wonder: why didn’t she mention this when I just asked about other medical problems? Are there other meds or conditions I should know about before moving forward? Then: Did I really need to know about her psychiatric history? And: Will I be able to look at her the same way again?

********

The doctor-patient relationship is one of the most special – and scrutinized – forms of human communication. It is regularly subject to the interest of administrators, insurers, vendors, and lawyers. In recent years, the growing adoption of electronic health records has raised the stakes. On the one hand, EHRs let providers access and share more patient data, faster. On the other hand, unlike with paper, every electronic access to a patient’s record can be easily tracked and reviewed.

With news of data breaches and HIPAA violations, and of hospital providers being fired for accessing the records of celebrities not under their care, you might think the sole focus of health IT and compliance officers would be on improving security: better and more frequent audits of physicians’ activity in EHR systems, anomaly detection that flags users who start accessing data outside their regular practice, greater awareness of risks, and harsher penalties for unauthorized access.

Instead, some EHR vendors are offering hospitals the option to hide – or sequester – certain types of data within the electronic medical record. Some administrators and privacy advocates promote the practice, saying that some patient information should be held to a higher standard of confidentiality. Don’t some patients, like celebrities and hospital employees, deserve additional protection?

Not anymore. The sequestering of data was understandable in an era when stigmas were strong and anti-discrimination laws weak. But in that era – an era of paper charts – physicians were used to acting on fragmented, inconsistently available patient records. No ED doctor expected to be making decisions on all the available facts. In the electronic era, that expectation is changing.

********

A 60-year-old man is brought in at midnight by EMS, from a group home. Their report is that he’s got a history of psychosis, became agitated tonight, lashed out at a caregiver, and got in a scuffle with another resident. Now he’s not speaking, his eyes are closed, and his muscles are rigid.  

Is this catatonia? NMS? Head trauma? EMS doesn’t have much else to go on, and your hospital’s EHR has a med list but no recent entries. You spend 20 minutes trying to get in touch with someone who can fax records from the group home. When that fails, you order blood work and a head CT, and wonder just how well he’s protecting his airway. The list of patients waiting to be seen is growing, so you reluctantly consult psychiatry, even though you know they won’t touch the patient until the “medical workup” is complete and you clear the patient.  

To your surprise, psychiatry comes over right away and asks if they can transfer the patient to their care. When you ask why they’re so eager, they explain that the patient was just discharged from the inpatient psychiatric floor a few days ago. It turns out that, because they’re psychiatrists, they have access to the part of the EHR that details this patient’s inpatient hospitalization – details that are completely hidden from you in the emergency department record.

********

It makes sense that accessing the data of patients not under your care is a punishable offense. This was true in the era of paper records, and since electronic records expand the reach of doctors’ access, it makes sense that the penalties should be more severe. In the paper chart era, however, once you had your hands on the patient’s clipboard or folder, you had access to every page of that chart. It was easy to consider all health information as created equal, with each bit of data subject to the same protection. With EHRs, though, additional granularity is possible, and different data need not be afforded equal protection.

In addition to rendering some visits and data “invisible” to certain providers, another common option for sequestering data is a “Break the Glass” window that the EHR pops up for special patients, or for certain parts of the chart, to make sure you really need to view the sensitive information (and to flag your access, so compliance officers can review its appropriateness).
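A minimal sketch of how such a check might work appears below, assuming a simple role-based model. The record types, role names, and audit fields are invented for illustration and don’t correspond to any particular vendor’s EHR or API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative only: categories, roles, and fields are assumptions, not any vendor's schema.
SENSITIVE_CATEGORIES = {"psychiatry", "substance_use", "hiv", "reproductive_health"}

@dataclass
class ChartSection:
    category: str                # e.g. "psychiatry"
    data: str                    # the note, lab, or visit summary
    authorized_roles: frozenset  # roles that may view without an override

@dataclass
class User:
    user_id: str
    role: str                    # e.g. "emergency_physician", "psychiatrist"

def view_section(user, patient_id, section, override_reason=None, audit_log=None):
    """Return a chart section, enforcing a 'break the glass' step for sensitive data."""
    if section.category not in SENSITIVE_CATEGORIES or user.role in section.authorized_roles:
        return section.data                      # ordinary access, no prompt
    if override_reason is None:
        # A real EHR would pop up a warning dialog here rather than raise an error.
        raise PermissionError("Sensitive data: a break-the-glass reason is required.")
    # Override granted: show the data, but flag the access for compliance review.
    if audit_log is not None:
        audit_log.append({
            "user": user.user_id,
            "patient": patient_id,
            "category": section.category,
            "reason": override_reason,
            "time": datetime.now(timezone.utc).isoformat(),
            "flagged_for_review": True,
        })
    return section.data

# Example: an ED physician opening a psychiatric note
note = ChartSection("psychiatry", "Inpatient discharge summary...", frozenset({"psychiatrist"}))
ed_doc = User("u123", "emergency_physician")
audit = []
# view_section(ed_doc, "p456", note)   # raises PermissionError (the pop-up)
print(view_section(ed_doc, "p456", note, override_reason="post-exposure prophylaxis", audit_log=audit))
```

In this sketch, the emergency physician opening a sensitive section hits the exception (the pop-up), and any override lands in the compliance queue for later review.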

Those in favor of sequestering health data would argue: we in the ED are specialists, with defined roles, and we don’t really need to see a list of psychiatric hospitalizations to manage a needlestick. Beyond the ED, they could ask: Why should the nurse practitioner, tasked with logging a hospital custodian’s vital signs at a blood pressure drive, need to see his HIV status? Why does the orthopedist, treating the hospital security officer who broke her ankle, need to know about the abortion she had last year?

Indeed, when I’ve argued, in these pages and elsewhere, that electronic health records and health information exchange can fundamentally change the practice of emergency medicine by making a patient’s history immediately accessible, I’ve been met with the reply that a lot of what we do in the ED doesn’t really require a complete knowledge of the patient’s history.

So, even if we did have access, why not just make some of that sensitive information invisible?

Sequestration of certain elements of EHR data has been framed as a reasonable accommodation in the modern doctor-patient encounter, especially given findings that patients with concerns about privacy are less likely to seek care.

The problem is that it’s only in retrospect that we can see which bits of information weren’t helpful. We know better: sometimes a bit of history proves invaluable, and some information can literally be lifesaving. At the very least, having access to the EHR’s complete history makes us more efficient, preventing unnecessary phone calls and reducing duplicate testing.

So who is to say when patient information is unnecessary, and when it’s crucial? Really, it can only be the emergency physician. We’re the ones who are ultimately responsible for our patients, and for running our ED efficiently.

Just as we’ve seen righteous indignation from emergency physicians over imposed “quality” measures whose unintended consequences harm patients, so too should we oppose sequestration of patient data in the ED. The unintended consequences of sequestration range from complicating the reporting of EHR meaningful use metrics to perpetuating the very stigmas that patients are trying to protect themselves from. When we enter into a doctor-patient relationship, whether it’s a life-or-death situation or a minor injury, we enter into a contract – sometimes called the Hippocratic bargain. The patient offers intimate details and examination; the doctor offers confidentiality and a promise to use that information to give the best possible care.

When administrators and vendors start blocking access to parts of the chart, or letting patients pick and choose which tests or visits can be hidden, the doctor-patient relationship erodes. We retreat further into a box, as specialized providers, and patients come to expect less.

Emergency medicine was built on the notion, unique among specialties, that circumstances will force doctors to make important decisions about patients with limited information. This notion underlies all our training and patient interactions. Almost every patient is someone with whom we’ve had no prior therapeutic relationship, and we see a steady stream of hospital employees with injuries or acute complaints.
Electronic data could be a godsend for ED providers and our patients, bringing to light information that can improve our diagnoses and therapies. Electronic records also give us the potential to save time, energy, and costs, letting us work more efficiently. But we’re watching the promise of electronic records be whittled away before it’s fully realized.

Privacy advocates mean well, but they will never have the responsibilities we do. Penalties are already in place for unauthorized access or disclosure of a patient’s health information. Adding another layer of security – or invisibility – around certain conditions may give comfort to some, but it undermines the doctor-patient relationship and ultimately jeopardizes quality care.

 

Models for displaying/concealing sensitive data

1. When viewing a chart with sensitive health information, an alert pop-up appears to inform the doctor that some information will not be displayed, because of patient confidentiality selections. If circumstances warrant and the patient cannot consent, clinicians may have the ability to “break the glass” and override the block to access this data. The case would then be flagged for review by a privacy officer.

2. Some EHRs let patient data be restricted at the chart, visit, or even narrower levels. A “confidential tab” can be created in some EHRs, so while any provider can access the patient chart, and see that confidential information exists, only those granted explicit access can view the labs or notes in the confidential section.

3. Other EHRs simply hide sensitive information altogether from unauthorized users – the sensitive visits or test results are simply not listed, with no indication that a visit or test even occurred. There’s no emergency override option, no glass to break.

4. At least one EHR offers administrators the ability to restrict patient data by location or patient type. For instance, in the ED, all patient information could be made available, but in the Orthopedics clinic, when seeing patients who are hospital employees, the psychiatric history is restricted (again, a “break-the-glass” feature can be superimposed on this restricted data; a rough code sketch of models 2 through 4 follows this list).

5. Maybe someday, EHRs will sport advanced algorithms that can determine what data is relevant to the current visit and what can be safely buried. One could imagine that, based on the complaint, the vitals, and even the words in the notes you’re writing, different elements of a patient’s history could grow or shrink in prominence in real time. In this way, the EHR could decide that the ED physician should remain unaware of a patient’s prior psychiatric hospitalization when treating an ankle injury.
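To make the differences among models 2 through 4 concrete, here is a rough sketch of how a patient’s visit list might be filtered for display based on the viewer’s role and care setting. The policy table, role names, and fields are assumptions made up for illustration, not drawn from any actual EHR.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    date: str
    department: str    # e.g. "inpatient_psychiatry"
    restricted: bool   # this visit has been marked confidential

# Hypothetical policy: which (viewer role, care setting) pairs may see restricted visits.
CAN_SEE_RESTRICTED = {
    ("psychiatrist", "inpatient_psychiatry"),
    ("psychiatrist", "emergency"),
}

def visible_visits(visits, viewer_role, care_setting, show_placeholder=True):
    """Filter a patient's visit history for display.

    Models 2 and 4: restricted visits appear only as a 'confidential' placeholder
    that the viewer can request (or break the glass) to open.
    Model 3: pass show_placeholder=False and restricted visits are silently omitted.
    """
    rows = []
    for v in visits:
        if not v.restricted or (viewer_role, care_setting) in CAN_SEE_RESTRICTED:
            rows.append(f"{v.date}  {v.department}")
        elif show_placeholder:
            rows.append(f"{v.date}  [confidential entry - request access]")
        # else: no trace that the visit ever occurred
    return rows

history = [Visit("2013-11-02", "inpatient_psychiatry", True),
           Visit("2014-01-15", "orthopedics", False)]
# An ED physician under model 3 sees only the orthopedics visit:
print(visible_visits(history, "emergency_physician", "emergency", show_placeholder=False))
```

The catatonic patient in the second vignette, whose recent psychiatric admission was visible to the consulting psychiatrist but not to the ED, corresponds to the silent-omission branch of model 3.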

Terms

The HIPAA privacy and security rule requirements, as well as accrediting standards, complement each other. Individuals have the right to expect their health information will be kept confidential.

Confidentiality, for the purpose of this article, is the practice of permitting only certain authorized individuals to access information, with the understanding that they will disclose it only to other authorized individuals as permitted by law. For example, substance abuse information may not be released without specific consent.

Privacy, for the purpose of this article, is an individual’s right to control his or her protected health information.

Security is the protections or safeguards (administrative, technical, or physical) put in place to secure protected health information.

Nicholas Genes is assistant professor at Mount Sinai’s Department of Emergency Medicine, where his focus is on ED information system optimization, and using EDIS data to aid research efforts. Kevin Baumlin is Vice Chair and ED director at Mount Sinai, as well as chair of SAEM’s academic informatics committee.
