Earlier in this series, we discussed How to Prepare for a HIPAA Audit, offered tips for Updating Your Plan & Training Your Staff, reviewed the pros and cons of using Electronic Devices in Your Practice, and talked about How to Prevent a Security Breach.
Q: What happens if you don't self-report breaches that affect fewer than 500 patients?
A: If you don't report breaches, that's essentially willful neglect. I think you guys have seen the penalty slides we've been putting up; willful neglect starts at $10,000 per violation. Even if you remedy the breach and report it after the fact, once you've been investigated, those fines still start at $10,000 per violation.
If you lost a spreadsheet of 100 people, those would start at $10,000 per violation. If you are found to be in breach because you willfully neglected your obligation to report, and you do nothing to mitigate or fix that breach, those fines start at $50,000 per violation, with a cap of $1.5 million per violation category in a calendar year. Keep in mind, too, that the breach itself may not be the only issue. If investigators come in and you do not have Policies and Procedures, Business Associate Agreements, or any of the other things required under the law, those are considered separate violations under the law, and that's why we see fines over $1.5 million.
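The penalty math described above can be sketched in a few lines. This is a rough illustration only: it assumes the per-violation amounts and the $1.5 million annual cap quoted in this webinar (actual HHS penalty tiers are set by regulation and adjust over time), and it treats each lost record as one violation.

```python
# Rough sketch of the penalty exposure described in the webinar.
# Figures are the ones quoted above: $10,000 per violation for willful
# neglect that is corrected, $50,000 per violation when it is not, with
# a $1.5 million cap per violation category per calendar year.

ANNUAL_CAP = 1_500_000  # cap per violation category, per calendar year

def penalty_exposure(records: int, corrected: bool) -> int:
    """Estimate exposure when each breached record counts as one violation."""
    per_violation = 10_000 if corrected else 50_000
    return min(records * per_violation, ANNUAL_CAP)

# A lost spreadsheet of 100 people, willful neglect, later corrected:
print(penalty_exposure(100, corrected=True))   # prints 1000000
# The same breach, never mitigated: 100 * $50,000 exceeds the cap.
print(penalty_exposure(100, corrected=False))  # prints 1500000
```

Note how quickly even a modest breach hits the annual cap once the uncorrected-willful-neglect rate applies, which is why separate violations (missing plans, missing agreements) push totals past $1.5 million.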
Q: What are the trends for HHS enforcement?
A: We've been seeing an uptick in enforcement in general. Since January 2016, we've seen a bunch of reports come down. A week or two ago, we were talking about Raleigh Orthopedic, a company that just had a major fine levied against it. We're seeing some pretty stiff fines for not having Business Associate Agreements, not doing what's needed to make sure compliance plans are in place, breaches, those sorts of things.
What we're getting ready for, and what you're going to need to start doing, is checking your e-mail, and that includes checking your junk e-mail, because HHS expects that everybody is checking their e-mail, junk folder included. HHS is going to start doing what we call desk audits. It's the same as when the IRS sends you a form and says, "Send us your tax returns from this year and any receipts, and justify your expenses." HHS is going to do the same thing. They're starting to ask, "Send us your plan. Send us information on the business associates you have, your training records, those sorts of things." You're going to start seeing those questionnaires going out and those assessments happening.
You usually have 10 days to respond to those requests for plans. If you do not have a plan at that point, it's probably too late to start, but you never know. They are also looking for a history of compliance; that's been released, too. They're looking to say, "Okay, this general practitioner didn't just put a plan together in January 2015. They've been compliant for the last 3 years. They've been working on a compliance plan and updating it as needed as they've seen things change."
Q: What should I do if an employee is aware of a breach but fails to report it?
A: This should be included in your Policies and Procedures, and we've actually seen some issues with this, so it's very important that you have a sanction policy in place and that you actively enforce it. The policy should be clearly laid out in your Policies and Procedures. For example, your sanction policy should state that if there is an accidental breach, say somebody faxes something to the wrong person, that's considered a minor violation. That employee should be required to complete training again, or maybe there's some sort of penalty that goes with it. Usually it's just re-training at that point.
If there is a history of noncompliance, like the same person has faxed things five times to the wrong people and is consistently not paying attention, step two may be that you suspend that employee without pay for a week or two. If this person just continues to be careless and is not heeding your warnings, then I think the third step is to terminate their employment. That could happen at any point along the way, but you really want to have a clear sanction policy and understand how to enforce it fairly throughout your practice.
Q: What are the most frequent types of breaches?
A: The most frequent breaches are, unfortunately, loss and theft. Last year we published an infographic with the Top 10, but to sum it up, breaches mainly occur due to employee carelessness. This means not having or enforcing proper passwords on devices, sending information to the wrong person, or speaking too loudly in the hallways so that other people overhear what's happening. We're seeing loss and theft of devices left and right. If you don't have password protection or encryption on your devices, I strongly recommend that as soon as you get off this webinar, that's the first thing you go and do. Go talk to your IT person. Make sure that every device you have, whether it stores PHI or not, is encrypted and password protected. This includes mobile devices, laptops, and tablets. Anything that you can protect, you want to make sure is password protected and encrypted.
We're also starting to see a large increase in malware, and I think that's getting a lot of press right now because of the hospital in LA that got hit. There have been a lot of other instances that we've seen of people being hit with this. Those are instances where you need to make sure you're notifying the FBI. You don't want to keep this information to yourself. These are people that can help you. They'll help investigate what's going on and can try to help you get your information back. Again, your best course of action in those situations is make sure you have good backups, that you're constantly backing up information, and that you have weeks, maybe a month of backups that you can then turn back to for critical information.
Q: What is considered a HIPAA breach when it comes to email? Does having the patient sign a waiver make it okay to email them? And is it okay to use just a last name in an email without violating HIPAA?
A: First off, any time you send information unencrypted, there's a potential for a breach. You have to think of all the different places that email goes and where it could be intercepted at any point along the way. That's why we strongly recommend that you use an encryption program that requires authentication, which means the patient has to log in.
Is a waiver okay? Yes, it's okay to email waivers that don't have any PHI or any sort of identifiers on them, just a blank form, but I would strongly recommend that you tell patients not to email it back to you. Again, you need to give them a way to send it back to you in an encrypted format. It's also important to know that you cannot ask a client or patient to waive their rights under HIPAA when it comes to releasing that information. You can't say, "The only way I'll email you is if you sign off and say it's okay for me to send you things unencrypted." They have to willingly ask for that. You can't force it upon your patients.
If a patient does ask you to send things to them unencrypted, you can have them sign a waiver on that. You are able to do that. I recommend that you don't go that route because it becomes really hard to keep track of who you're supposed to send encrypted information to versus unencrypted information. In those cases, I would just say to the person, "Our policy is to send things as encrypted emails and I'm sorry, but you're going to have to log in in order to read this information. This is for your safety and for our safety as a provider."
Can you use just the last name in an email without violating HIPAA? No, you cannot. Any time you send an email that contains any sort of PHI, the email address itself becomes an identifier. Email addresses, IP addresses, and last names are three identifiers you would have in that email, right in its headers. If anybody intercepted that email, it would be considered a breach of that information, so that is not a good way to send information.
Again, use an encrypted portal. We've got a couple on our website at TotalHIPAA.com that you can look at. If you use our name, you'll receive a discount.
Q: Is sharing PHI with an employer regarding a reportable Workers Compensation claim considered a breach?
A: No, as long as you're careful. This is an interesting question because we run into it with our employer groups and with our insurance agents a lot. When it comes to a Workers Comp claim, you want to be cognizant of the minimum necessary provision, which is outlined in the Omnibus ruling. The ruling states that it's very important that you only disclose the information that's necessary for the employer. Does the employer really need to know every bit of what's happening with that employee, or do they just need to know whether this person is fit to return to work? If that's all you need to release, then releasing any more PHI than that would be considered a breach. You mainly just need to say, "Yes, Bob is able to return to work," and that's the end of it.
Olive Lynch: Welcome again. Thanks for joining us today. My name is Olive Lynch, and I'm the creative marketing manager here at NueMD. We're really happy to have Jason Karn and Dan Brown back with us for today's presentation.
Jason Karn: Thank you very much for the introduction, Olive. My name is Jason Karn. I'm the chief compliance officer over at Total HIPAA Compliance, and I'm going to pass off to Dan here who's going to introduce himself. Dan?
Dan Brown: Good afternoon everyone, or good morning depending on your location. My name is Dan Brown. I'm an attorney with the law firm of Taylor English Duma in Atlanta, Georgia. You'll note that I'm an attorney, but I'm here to talk to you today not as your lawyer. The materials we're going to go through today are presented as educational only. We may give a few examples, but we're not in a position to actually advise you on any particular issue that you might have, so just keep that in mind. You should also note that these laws change frequently, and you may want to check back with your counsel or with the rules to make sure that you're still in compliance.
Defining a Breach
Dan Brown: Let's talk a bit about what today is all about. Today, we're going to talk about the responsibilities you have to the public and to your patients regarding a breach of secured protected health information. Remember, if you're a covered entity, which is a health plan, a health care provider, or a health care clearinghouse, or if you're a Business Associate providing services for a covered entity that uses PHI, you have certain obligations under HIPAA. The obligation we're talking about today is what happens when you find that, all of a sudden, there's been a breach of the security of this data. As we'll see, it's a breach not only of electronic data, but also of the paper data you might have lying around. Let's go ahead and look at the definition of what a breach is. If you take a look at this slide, I hope it gives you a headache, because it certainly gives me a headache.
Some lawyer somewhere got paid a lot of money to put all these words on this piece of paper. We're going to have to cut through them and see exactly what they're talking about. If you're covered by HIPAA, whether you're a Business Associate or a doctor's office, and you find that there's been an unauthorized acquisition, access, use, or disclosure of PHI, what does that mean? It means that there are certain people who are authorized to look at protected health information. It may be your nurses, or whoever picks up the phone and schedules an x-ray; all of that is authorized. What happens if someone unauthorized acquires, accesses, or uses it? What does access mean? Access means, basically, you've been hacked. If you're using electronic medical records and someone unauthorized has access to the data, that's a breach, and when you stop to think about it, that's weird.
How do I even know if someone's accessed it? Not only that, how do I know whether they've accessed it, whether they've actually taken it, and whether they've used it, unless it shows up on some form that says, "Here are some Social Security numbers I got from the doctor's office"? We have to stop and look at these words and say, "What's going on here?" We know that there's been some unauthorized hack into your electronic medical records system. Does that compromise the security or privacy of the information? Absolutely, because under HIPAA you have an obligation to make sure you have secured the privacy of that data, so that is a breach. There are some exceptions.
There are three exceptions in the statute and the regulations. Here's one that sits right out in the statute itself: "Except where an unauthorized person to whom such information is disclosed would not reasonably have been able to retain such information." What does that mean? If I accessed the information, and I'm unauthorized, and I got my hands on it, but it turns out it's encrypted, or it's gibberish, does that mean there's been a breach I have to act on? Technically, the answer is no. If there was some kind of access of protected health information, but in a manner such that the person would not reasonably have been able to retain it, then there's been no breach, which is really, really nice.
It's interesting to think about that in this context. The regulation gives us three exceptions to the definition of breach, and typically they go along these lines: there's been some minor discrepancy; I have a good faith belief that the unauthorized person who got the disclosure would not know what to do with the information; or a Business Associate let some information out, and we're going to say, "Well, you know, it's not a breach if the data got out of our control, if we can prove that nobody was hurt by it." The rules give us four factors to check to determine whether or not somebody is going to be hurt. Every time we think there's been a breach, we have to make an assessment of the extent and type of the information involved, how the information was disclosed, and whether it was actually viewed. Let's say it got out of the box, but nobody actually saw it.
A laptop was stolen; can we prove that the laptop was never, ever opened? If we can prove that, then maybe we can go back and show we don't have a breach. What these rules that came out in 2013 do for us is this: if we have notice that someone's hacked us, or that a laptop's been stolen or is missing, then we have to presume that there was a breach. We have to presume that someone opened it up, looked at it, and used it. The only way we're going to escape our obligations to report the breach, and take other action, is if we can positively, objectively prove that the laptop blew up before anybody saw it. If we can't do that, then we're trapped into these rules of going through the breach notification.
Let's take a look at the next slide. Now we have an understanding of what a breach is: we had an obligation to keep the data secure, and somehow the genie got out of the box. We can't prove that the genie blew up before he or she said anything, so we've got a breach on our hands, and we have to act. Typically, only the patient can decide when to release their protected health information.
Exceptions for Releasing PHI
Dan Brown: The patient can authorize a release, of course, but there are also exceptions. It's not a breach, or even an unauthorized use or disclosure, for us as doctors or providers to disclose information for treatment, payment, or health care operations. It's also not a breach, and we don't have any breach obligations, if we disclose to law enforcement. HIPAA says we, the covered entity, can make these types of disclosures when law enforcement says, "We have a criminal situation. Tell us where this guy is." If there's been child or spousal abuse, we may have an obligation to report. Same with diseases: we may have an obligation to report certain types of communicable diseases to public health officials, AIDS, maybe Zika nowadays. Then there are required FDA disclosures. Those are not breaches. Those are normal health care operations. We are giving this data to authorized persons, so we're not going to have a breach situation like the one we discussed in that first slide, in all those words.
Examples of Breaches
Dan Brown: So what are some examples of breaches? Believe it or not, these are all breaches, and the regulation goes into detail. Let's assume that you faxed Mrs. Jones' records to the wrong person because you got the wrong fax number. Going back to the definition, that's an unauthorized disclosure of PHI. That's a breach, and once we find out it went to the wrong number, we're going to have certain obligations regarding this breach.
Remember, if we can prove that the fax was never opened, that nobody ever looked at it, then we don't have any obligations to make a breach notification, but the examples are pretty clear in the regs that, yep, you mail or fax an appointment notice to the wrong person. Unless we can prove that they never saw it or the mail was never opened, then we have a breach.
What about an unauthorized conversation about patients? This is the kind of talk that you have in the elevator, saying, "Mrs. Smith was in," or maybe a celebrity that everybody knows who Mrs. Smith might be, had some certain condition. That is a breach. You notice that's not electronic activity. That's an actual discussion. That could be a breach.
What if we release protected health information without a Business Associate Agreement? Say we're a physician practice using Joe's Billing Company, and for whatever reason we've decided not to use a Business Associate Agreement when one was obviously required. That's a breach. As a consequence, we would have certain breach obligations: obligations to the patient, and obligations to the public, that we'll talk about.
What about if we use more data than is necessary? This is a situation where we are absolutely clear to call up the x-ray department and say, "Mr. Jones needs an x-ray," but then add, "Oh, by the way, Mr. Jones has some type of communicable disease," something that has nothing to do with the radiology issue at hand. That's a breach. We have an obligation under HIPAA to disclose no more than the minimum necessary for the treatment or activity involved.
Malware attacks, including ransomware, are another example. All of a sudden we have ransomware; the computer is locked down. What's happened? We've had an unauthorized access: an unauthorized person has gained access to our data. Maybe we have a breach, and we have to think about what our obligations are now that we know we have one.
The clearly easy one is the lost devices. A laptop is stolen. You lose your cell phone, and we don't have the ability to wipe the phone immediately or the data is not encrypted. Remember, there's an exception. It's not a breach if it's impossible for the person who got the lost laptop to use the data, to see the data, to manipulate the data. If it's encrypted on the device, then that's not a breach. It kind of gets you a "get out of jail free" card if you make sure that all your data is encrypted and it has all the proper passwords.
Discovering a Breach
Dan Brown: Discovering a breach. What happens when you have one? Now the clock has started. You've found out that a conversation took place, somebody accessed your electronic medical record system, or your laptop was stolen and it's not encrypted. You've got obligations under HIPAA. The first thing you need is a policy and procedure already in place. In other words, you should expect that somewhere along the line you're going to have some unauthorized access or lose a laptop. As part of your HIPAA policies and procedures, you should have one specifically for what to do in case of a breach. Jason's going to tell us a little bit about what your obligations are once you have found a breach and you're putting the policies and procedures you've already lined up into action. Jason, go ahead.
Obligations and Tools for Breaches
Jason Karn: Thank you so much, Dan, I appreciate that. When you have a breach, or a suspected breach, as the slide says, HHS expects you to already have a procedure in place to evaluate suspected breaches. This means you have your policy, you have the way you're going to investigate these, and you have the right tools. You need a breach analysis log or checklist, and this is different from the risk assessment you do when you're creating your entire plan. This analysis is what you would do to evaluate what happened in that breach: where the information went, who accessed it, and what was done wrong.
In that incident breach log, if a breach involves fewer than 500 records, you're going to need to keep a log of those, and you'll be submitting them to HHS at the end of the year; we'll talk more about that. If there are more than 500 records, then you need to have the link to report to HHS. This is very important: from the time you discover a breach, the clock is ticking. You have 60 days from the date of discovery to report that breach to HHS, and "discovery" includes when you should have discovered what was happening, not just when you actually did, so this can get really tricky. It's important that you report those breaches as soon as possible.
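The 60-day clock described above is simple date arithmetic. Here is a minimal sketch; the discovery date used is hypothetical, and the clock runs from discovery (or from when you reasonably should have discovered the breach), as described above.

```python
# Minimal sketch of the 60-day reporting clock described in the webinar:
# the deadline is 60 calendar days from the date of discovery.
from datetime import date, timedelta

def reporting_deadline(discovered: date) -> date:
    """Last day to report, 60 days from the date of discovery."""
    return discovered + timedelta(days=60)

# Hypothetical example: a breach discovered May 1, 2016.
print(reporting_deadline(date(2016, 5, 1)))  # prints 2016-06-30
```

The tricky part, as noted above, is not the arithmetic but fixing the discovery date itself, since it can be the date you should have known, not the date you actually found out.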
You need a letter outlining what you're going to do, so make sure you have an action plan for how you're going to approach your patients and how you're going to notify them of what's happened and what's going forward. You also need a strategy to repair your reputation. I know this feels like planning for the worst, but when you talk to most white-hat hackers, what they say is that it's not if you're going to get breached; it's when. It's best to have these plans in place, know what you're going to do, and know with your compliance team what actions you're going to take.
Steps for Responding to a Breach
Jason Karn: So, what happens when you have a breach? Step number one is you need to contact the authorities. We need to figure out: was this a malware attack? If it was, then we probably need to contact the FBI; they have a special task force that works on electronic breaches like this. If it was, say, a client stealing information, then we need to contact the authorities about the possible theft. Loss of a device would also warrant that contact. It's really important that we make sure we've notified the proper people.
Step two is reviewing your risk analysis. We want to look and say, "Okay, what really happened here? How did it happen? How are we going to try to mitigate this? How do we get that information back possibly? How do we destroy that information so that it's not usable? Was it encrypted?" These are all things that you would go through in that risk analysis. You want to analyze what actually happened there. That's really important to go through that whole process.
What Type of PHI Was Breached?
Jason Karn: What types of PHI were breached? You really want to identify what you're looking at. Was it physical? Was it electronic? Was it oral? You want to be able to say, "Okay, how did that get out? What process failed us?" This is really important because once you've submitted this information to HHS, they're going to look through it themselves and say, "Okay, what did you actually do? How did you mitigate this breach? What steps did you take to either get that information back or stop it from disseminating further?" That is very important.
Who Accessed or Received the PHI?
Jason Karn: Who accessed or received the PHI? Who did you send it to, or who got into your system? This might be a reason to get a third party involved. If you've had a malware attack, you may need to bring in an IT expert who can backtrack, look through your logs, figure out what happened, where this information came from, and who was at fault for maybe downloading the wrong attachment, those sorts of things.
If it comes to paper, did you fax the wrong number? One thing we tell people all the time is if you have a lot of fax numbers that you use, or even a few fax numbers that you use regularly, make sure those are programmed. Either have them programmed in your actual fax machine or use an electronic fax, where you can scan it in and then send that information, so you make sure that you're sending information to the right phone numbers. It's very easy to hit the wrong number, and next thing you know you're faxing information to the wrong person. It's really important that you understand what kind of PHI has been accessed and who's received it.
Was the PHI Actually Acquired or Viewed?
Jason Karn: Was the PHI actually acquired or viewed? This gets a little tricky, because with a fax, once it goes through and it's transmitted, you have no way of getting it back. That's one of the reasons that, though faxes are allowed, we try to steer people away from them unless a carrier or a provider insists that you fax them.
What we recommend for most people is using an encrypted email program. The reason we like those is that many of these programs will sign a BAA with you, and they have the ability to either expire or delete the email, so if that information is going out, you can basically retract it.
This is why authentication matters for emails and for any electronic information you're sending. If you have to send files through Box, or if you're using a portal, which is a really nice way to communicate with your patients, recipients have to authenticate themselves as who they are. That means they set up a password and possibly a username, so that you know they're the only ones with access to that information, unless they've given that information to somebody in their family or to another user.
It's really important, because once that information is out of your hands it's really hard to determine was it actually acquired.
Breach Notification Requirements
Dan Brown: This is really where the rubber meets the road. You've got a breach, you've got to notify somebody. Who do you notify? When do you have to notify them? What's involved? Who has to pay for the notification?
You have to stop and think about: are you a covered entity? Are you a Business Associate? If you're a covered entity, you have certain notification requirements that are a little bit different from if you're a Business Associate.
Let's talk about our obligations to make notification. There's a big bright line drawn in the rule, and that is 500 records. If fewer than 500 records have been breached through unauthorized access, we have one level of notification. If, on the other hand, more than 500 records were breached in one event, then we have a separate set of obligations.
Let's talk first about what happens if you come across a breach and fewer than 500 records are wrapped up in this one particular breach event. What do you have to do? If fewer than 500 individuals are affected and you're a covered entity, you have an obligation to notify them by US mail that their data was breached. The notification goes directly to the individuals themselves. You've got to notify the individual or, if the individual is deceased, the individual's next of kin.
How do you make notification? You make notification by First Class mail, or, if the patient gives you permission, by email, and that's at the last known address. If you don't have sufficient information, or if you have out of date contact information, then there is a conspicuous posting permitted on the homepage of the website of the covered entity, or you can do notice elsewhere in the media if you want.
You've got an obligation to actually notify the individuals. Stop and think about that for a moment. You need to notify them by their most recent address and you have to do it by First Class mail. Think about the cost of doing it by First Class mail. You've got a lot of people involved.
By the way, when you have fewer than 500 people affected, you do not have an obligation to notify the media, and you don't have an obligation to notify the Secretary of Health and Human Services immediately, but you do have to keep a log of all the breach events you've had: who was involved, what you did to fix it, and so on. You need to get that information to the Secretary at least once a year for events involving fewer than 500 individuals.
If more than 500 records were compromised, then you have a different set of obligations. Again, you need to make notification to the patients, the individuals that were affected, by mail or email if they permit, but you must also make notification to the local media if more than 500 records or persons were involved in a single jurisdiction.
That's kind of a strange idea. What does it mean to be in a single jurisdiction? What happens if you're in northern Virginia and you've got some folks in Maryland and DC? The rule breaks it out and says if you don't have more than 500 individuals in a single jurisdiction, you don't have an obligation to go to the local media.
If you have 502 in one jurisdiction and 480 in another jurisdiction, it's OK to go just to the media in the first jurisdiction. The rule says to use prominent media in that jurisdiction; whether that's newspaper, television, or radio is unclear, but you use one of those prominent media outlets. You can use the prominent media outlet for the location where there were 502 folks, but not necessarily the one where there were 480.
Regardless of where these people live, you have an immediate obligation to notify the Secretary of Health and Human Services of the breach: how it happened, what happened, what steps you took to find out, and what steps you're going to take to mitigate. So you have this threshold: if you've got fewer than 500, you have one set of obligations regarding the Secretary and the media; if you have more than 500 records, you have a different set.
Let's think for a minute about timing. What's your timing here? The obligation to notify the patient has to be prompt, and not later than 60 days from the discovery of the breach. When does the clock start? What counts as discovery of the breach? The discovery of the breach is the date you actually knew about it, or the date you should have known about it, depending on all the facts and circumstances coming to your attention.
What about if you're a Business Associate? If you're a Business Associate you have an obligation to go ahead and tell the covered entity. Whether or not you then have to bear the cost of notifying patients or others basically is a contractual obligation that you as a Business Associate have negotiated with your covered entity.
Remember, I said that you have an obligation to report to the Secretary of Health and Human Services. If it's more than 500 records, you should do that immediately, which means at the same time you tell the media and others. If it's fewer than 500, then you must notify the Secretary of Health and Human Services within 60 days of the end of the calendar year for all the under-500 events that you've had.
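As a rough summary of the notification logic Dan just walked through, here is a hedged sketch. The function name is made up for illustration, the threshold comparisons (more than 500 in one jurisdiction for media, roughly 500 in total for immediate HHS notice) follow the webinar's framing, and this is not legal advice; consult the regulations and your counsel for your actual obligations.

```python
# Illustrative sketch of the notification obligations described in the
# webinar: individuals are always notified; prominent media are notified
# only in jurisdictions with more than 500 affected individuals; HHS is
# notified immediately for large breaches, otherwise logged and reported
# annually. Thresholds here follow the webinar's framing.

def notification_obligations(records_by_jurisdiction: dict) -> dict:
    total = sum(records_by_jurisdiction.values())
    return {
        # Always notify individuals, by First Class mail or permitted email.
        "notify_individuals": True,
        # Media notice only where a single jurisdiction exceeds 500 people.
        "notify_media_in": [j for j, n in records_by_jurisdiction.items() if n > 500],
        # HHS: immediate notice for large breaches, annual log otherwise.
        "notify_hhs": "immediately" if total >= 500 else "annual log",
    }

# The northern Virginia example: 502 people in Virginia, 480 in Maryland.
obligations = notification_obligations({"VA": 502, "MD": 480})
print(obligations["notify_media_in"])  # prints ['VA']
print(obligations["notify_hhs"])       # prints immediately
```

Run against the 502/480 example above, only the Virginia media outlet gets notice, but the Secretary still gets immediate notice because the breach as a whole is large.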
You can go to the HIPAA website, the Secretary's website, and actually see all the reports of breaches that have been made. It's fascinating. Since these reports began in 2009, there have been 1,547 different breach notification reports made to the Secretary. They make for fun reading. Here's one example:
In January of this year, a company called Elite Imaging in Florida, a healthcare provider, reported a theft of protected health information. It was not the theft of a laptop; rather, it was the log-in book at the front desk. Somebody stole the log-in book, and it was later returned anonymously with a letter. The log-in book contained the patients' full names and the name of each patient's procedure.
Imagine: my name is Mr. Smith and I'm here for an appendectomy. That's unauthorized access and disclosure of someone's protected health information. The breach affected about 1,500 people, and the covered entity, the healthcare provider, provided notice to the Department of Health and Human Services. They reported what happened, who was affected, and what information was involved, and they also contacted the local press, both the newspaper and the radio stations.
They didn't get beat up too badly, because they retrained their personnel and started using shredding services for their log-in books. But I thought it was fascinating when you stop and think about it. A breach typically involves an electronic thumb drive, a hacker getting into your database, or a lost laptop. No, in this case somebody came in and stole the log-in book, and that was a breach they had to report.
That's what your obligations are. Principally, they are to report. I'm going to let Jason continue on and tell you a little bit about what's inside these notices and what's going to be involved.
Patient Notification Criteria
Jason Karn: What do you have to include in a patient notification? The first thing is, what happened? You want a brief description of what happened, the date of the breach, and the date the breach was discovered. This will help the patient. We always say, release the information you would want to know if your own information were compromised, so patients can mitigate the damage as much as they can. You want to keep the description fairly brief, especially if you've got law enforcement involved. They may not want you to release too much information because an investigation may be going on at that time. So really, give a brief description: what information went out and when it happened.
Then you want to describe what PHI was released. Was it X-rays? Was it information like what Dan was talking about, the log-in book with procedures in it? You also need to tell patients what steps they should take to protect themselves. This might be something you take on yourself. A lot of companies have started doing this; we've seen with some of the larger breaches that they started offering credit monitoring. That's a decision you can make. It's a nice way to show your patients that you care and that you are really looking out for their information, and it might be something you want to incorporate into that notification plan.
You also want to explain what you're actually doing to mitigate the breach and how you're investigating it. Again, be careful: don't release too much information if law enforcement has asked you not to. You definitely have to include contact information for yourself as the covered entity. You may decide to hire an outside service to field phone calls rather than having patients call your office and tie up your lines.
You should have a way for them to contact you. That could be via e-mail. It could be through a form for them to submit to you. Again, if you do it that way, make sure that that information is encrypted at all times because you never know what kind of information people put in that e-mail, so you want to give them as strong a portal as possible to protect that information.
Notification exceptions. Since we talked about law enforcement officials: they may ask you not to release a lot of information, especially with some of these electronic hacks we've been seeing. The FBI has been playing its cards pretty close to the vest with what's going on with ransomware right now. The most we've seen is that they strongly advise practices and companies not to pay the ransoms, but they haven't really released who they're investigating or what's going on. I think they're trying to keep that under wraps. You need to be careful and make sure that you are honoring what law enforcement asks, so that you don't impede an investigation.
What if Your Business Associate Has a Breach?
Jason Karn: What do you do if your Business Associate has the breach? It's not you; it's your Business Associate. This could mean your file sharing system was hacked, or something happened with your EHR, if you have a cloud-based EHR. What happens at that point?
Well, you of course need to have a Business Associate Agreement outlining the time frame. In our Business Associate Agreements, we recommend fifteen days from notice, or discovery, of a breach. This gives you as a covered entity time to marshal your forces, figure out what's going on, and make sure everybody is on the same page. You don't want to rush out and start reporting breaches right away. It's important that you contact your legal counsel, that you have a set plan, that you investigate what's happening, and that you contact the legal authorities if need be. Take that time. You want to take the opportunity to understand what's going on so you're not giving away too much information, or giving out false information. You may discover the breach wasn't as bad as you originally thought, and you want to make sure you release accurate information: what people can expect, and what to look for.
This is another important step we talk about a lot: the covered entity should really take the lead in notifying patients. You are the front. Even though you've given your information to a Business Associate, and they may have had the breach (we've seen a lot of those recently), it is still your responsibility to notify your patients. They are your patients, after all. It's important that you know what the plan is and that your Business Associate doesn't go behind your back to report or start talking to your clients. You are the face, and you need to be the one having that interaction. This helps you control the process, gives you a better idea of what's going on, and lets you interact with your patients directly.
At that point, if a Business Associate has had a breach, part of your risk analysis might be deciding to terminate the relationship and find somebody different to work with. Or it may turn out that a procedure was simply wrong, or that something went awry in that one case. You really want to look closely, make sure you understand what happened, and not go off reporting things without doing a proper internal investigation first.
We appreciate your interest and know that maintaining compliance with HIPAA can be a big task. If you're still a bit behind schedule, our partners at Total HIPAA Compliance and Taylor English are available to provide expert HIPAA compliance training and consultation.