Published by Al Saikali

If you have noticed an increasing number of high-profile privacy and security problems for healthcare organizations these last few weeks, you’re not alone.  The issues have included employee misuse of protected health information, web-based breaches, photocopier breaches, and the theft of computers that compromised millions of records containing unsecured protected health information (PHI).  These issues remind us that healthcare companies face significant risks in collecting, using, storing, and disposing of protected health information.

Pharmacy Hit With $1.4 Million Jury Verdict For Unlawful Disclosure of PHI

An Indiana jury recently awarded more than $1.4 million to an individual whose protected health information was allegedly disclosed unlawfully by a pharmacy.  The pharmacist allegedly looked up the plaintiff’s prescription history and shared it with her husband, the plaintiff’s ex-boyfriend.  The lawsuit alleged theories of negligent training and negligent supervision.  The pharmacy intends to appeal the judgment.

Health Insurer Fined $1.7 Million For Web-Based Database Breach

Meanwhile, the Department of Health and Human Services (HHS) recently fined a health insurer $1.7 million for engaging in conduct inconsistent with HIPAA’s privacy and security rules following a breach of protected health information belonging to more than 612,000 of its customers. The breach arose from an unsecured web-based database that allowed improper access to protected health information of its customers.

HHS’s investigation determined that the insurer:

(1) did not implement policies and procedures for authorizing access to electronic protected health information (ePHI) maintained in its web-based application database;

(2) did not perform an adequate technical evaluation in response to a software upgrade, an operational change affecting the security of ePHI maintained in its web-based application database that would establish the extent to which the configuration of the software providing authentication safeguards for its web-based application met the requirements of the Security Rule;

(3) did not adequately implement technology to verify that a person or entity seeking access to ePHI maintained in its web-based application database is the one claimed; and,

(4) impermissibly disclosed the ePHI, including the names, dates of birth, addresses, Social Security Numbers, telephone numbers and health information, of approximately 612,000 individuals whose ePHI was maintained in the web-based application database.

Health Plan Fined $1.2 Million For Photocopier Breach

In another example of privacy and security issues causing legal problems for a healthcare organization, HHS settled with a health plan for $1.2 million in a photocopier breach case.  The health plan was informed by CBS Evening News that CBS had purchased a photocopier previously leased by the health plan.  (Of all the companies to get the photocopier after the health plan, it had to be CBS News).  The copier’s hard drive contained protected health information belonging to approximately 345,000 individuals.  HHS fined the health plan for impermissibly disclosing the PHI of those individuals when it returned the photocopiers to the leasing agents without erasing the data contained on the copier hard drives.  HHS was also concerned that the health plan failed to include the existence of PHI on the photocopier hard drives as part of its analysis of risks and vulnerabilities required by HIPAA’s Security Rule, and it failed to implement policies and procedures when returning the photocopiers to its leasing agents.

I blogged about photocopier data security issues last year, after the Federal Trade Commission issued a guide for businesses on the topic of photocopier data security.  Another resource I recommend to my clients on the topic of media sanitization is a document prepared by the National Institute of Standards and Technology, issued last fall.
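For electronic media, the most basic level of sanitization is simply overwriting the data before disposal.  The sketch below is a minimal illustration in Python of overwriting a single file; it is an assumption-laden teaching example, not a certified sanitization tool.  In particular, on SSDs, copier firmware, or journaling filesystems, a file-level overwrite may not reach every physical copy of the data, which is why the NIST guidance also covers stronger options up to physical destruction.

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Illustration only: a file-level overwrite may leave physical
    copies of the data behind on SSDs or journaling filesystems.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents in place
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)
```

The practical point is simply that “delete” alone does not remove the underlying data; dedicated sanitization tools, or a vendor’s own overwrite feature, are the right answer in production.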

Medical Group Breach May Affect Up To Four Million Patients

Lastly, a medical group recently suffered what is believed to be the second-largest loss of unsecured protected health information reported to HHS since mandatory reporting began in September 2009.  The cause?  Four unencrypted desktop computers were stolen from the company’s administrative office.  The computers contained protected health information of more than 4 million patients.  As a result, the medical group is mapping all of its computer and software systems to identify where patient information is stored and ensuring it is secured.  The call center set up to handle inquiries following the notification of the patients is receiving approximately 2,000 calls each day.

The Takeaways 

So what are five lessons companies should take away from these developments?

  • Having policies that govern the proper use and disclosure of PHI is a first step, but it is important that companies audit whether their employees are complying with those policies and discipline employees who don’t, so that everyone in the company understands that non-compliance will not be tolerated.
  • As technology is upgraded or changed, it is important that companies continue to evaluate any potential new security risks associated with these changes.  An assumption should not be made that simply because the software is an “upgrade” the security risks remain the same.
  • There are hidden risks, such as photocopier hard drives.  Stay apprised of these potential risks, identify and assess them in your risk assessment (required by HIPAA), then implement administrative and technical safeguards to minimize these risks.  With respect to photocopiers, maybe this means ensuring that the hard drives are wiped clean or written over before they are returned to the leasing agent.
  • Encrypt sensitive information at rest and in motion where feasible, and to the extent it isn’t feasible, build in other technical safeguards to protect the information.
  • Train, train, train – having a fully informed legal department and management doesn’t do much good if employees don’t understand these risks and aren’t trained to avoid them. Do your employees know how seemingly simple and uneventful conduct, like photocopying a medical record, leaving a laptop unattended, clicking on a link in an email, or doing a favor for a friend who needs PHI about a loved one, can lead to very significant unintended consequences for your company (and, as a result, for them)?  Train them in a way that brings these risks to life, update the training and require it annually, and verify that your employees actually complete it.

 

DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone, and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site is for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

 

How does your company dispose of personally identifiable information (medical records, financial information, applications containing sensitive information, etc.) and other sensitive information when the information is no longer needed?  Do you throw it in the trash can next to your desk?  Where does it go after that? Is it securely shredded, or thrown into an unsecured dumpster with the trash of other offices and companies?  What about sensitive electronic information?

These questions might not seem important, but the way your company disposes of sensitive information can have significant consequences for your business, as two companies recently learned when they discarded personally identifiable information in unsecured dumpsters and were fined over $100,000 by the Federal Trade Commission (FTC).

What Happened?

The FTC filed charges against three companies that own, manage, and operate payday loan and check cashing stores, alleging that they failed to safeguard personally identifiable information by discarding “documents containing sensitive personal identifying information – including Social Security numbers, employment information, loan applications, bank account information, and credit reports – in unsecured dumpsters near [the defendants’] locations.”

What Were The Causes Of Action?

The FTC’s complaint claims that the defendants violated:

(1) the FTC’s Disposal Rule, which requires companies that maintain or possess certain consumer information for a business purpose to “properly dispose of such information by taking reasonable measures to protect against unauthorized access to or use of the information”;

(2) the Gramm-Leach-Bliley Safeguards Rule and Privacy Rule, which require that financial institutions (companies significantly engaged in providing financial products or services) develop and use safeguards to protect consumer information, and deliver privacy notices to consumers explaining their policies and practices; and,

(3) the FTC Act, which prohibits misrepresentations about the reasonable measures companies implement to protect sensitive consumer information.

What Was The Result?

Two of the three defendants settled with the FTC, agreeing to pay a $101,500 fine, establish what will likely be an expensive and comprehensive information security program, obtain regular independent third-party audits every other year for 20 years, and adopt a number of recordkeeping and compliance monitoring requirements.

What Are The Takeaways?  

First, you need to assess how your company disposes of sensitive information.  Next, you must identify the policies and procedures your company has adopted to ensure that sensitive information is disposed of securely. Can those policies and procedures be improved?  Do your employees comply with existing policies and what “checks” are in place to maximize compliance and minimize risk?  When was the last time you trained and reminded employees about the proper way to securely dispose of sensitive information?  Do you know how your vendors and business associates, with whom you share sensitive information, are disposing of that information?  If you are not sure whether the safeguards your company has adopted meet the legal requirements for secure disposal, it might be wise to retain counsel.

 


Regulators increasingly want to know what companies are telling consumers about how they use consumer information.  Companies that do not properly explain how they collect, store, and use their customers’ information are facing increased scrutiny.  Nowhere is this scrutiny more evident than in the $22.5 million civil penalty that the FTC levied against Google, or in the FTC’s complaint and decision against Facebook.

Now, the Office of the Attorney General for the State of California has weighed in by cracking down on companies that do not include privacy policies in their mobile apps.  In a recent press release, California Attorney General Kamala Harris announced that her office has begun formally notifying up to 100 mobile application developers and companies that they are not in compliance with California privacy law.  According to Bloomberg, some of these companies receiving letters include United-Continental, Delta Air Lines, and Open Table.

The law that the Attorney General is referring to is the California Online Privacy Protection Act, which requires commercial operators of online services who collect personally identifiable information from California residents to conspicuously post a privacy policy.  Companies that violate this law face fines of up to $2,500 each time the non-compliant app is downloaded.

Amazon, Apple, Facebook, Google, Hewlett-Packard, Microsoft, and Research in Motion, as platforms for mobile applications, all agreed to privacy principles earlier this year that allow consumers to review an app’s privacy policy before they download the app rather than after.  The companies also agreed to offer consumers a consistent location for an app’s privacy policy on the application-download screen in the platform store.

So what is the takeaway?  If you collect information about individuals, make sure you have a clear privacy policy.  Make sure the policy is placed in a location that makes it easy to find.  If you offer a mobile app, try to work with your mobile app platform to provide the privacy policy to consumers before they download the app.  It’s also a good idea to update your privacy policy periodically to ensure it remains current with your company’s information collection practices.

When was the last time your company took a fresh look at its privacy policy?

 


Today, the Federal Trade Commission levied a $22.5 million penalty against Google, the largest civil penalty the FTC has imposed on a single defendant.  Here is a copy of the Stipulated Order entered into between the FTC and Google.  The penalty stems from an FTC Complaint alleging that Google violated “privacy promises” it made as part of a 2011 consent order it entered into with the FTC.

In 2011, the FTC sued Google after Google initially assured Gmail users it would not use their information for any purpose other than to provide email service.  The FTC claimed that Google did not honor that promise, so an order was entered requiring Google to adopt comprehensive privacy protections for consumers and civil penalties if Google did not abide by the agreement.

Today’s settlement stems from an FTC allegation that Google subsequently misled consumers about the use of tracking cookies in Apple’s Safari Internet browser.  “Cookies” are small files stored on a computer that hold data specific to a particular user and website, so that when the user visits a certain website, that site delivers a page tailored to the user.  By placing a cookie on a person’s computer, an ad network can collect information about the person’s browsing habits and then use that information to display advertisements targeted to the person’s interests.  In this case, Google used the “DoubleClick Advertising Cookie” to collect information about users’ browsing activity.

Some people prefer to block cookies from monitoring the websites they visit, and companies are increasingly giving consumers ways to control such monitoring.  Apple’s Safari browser blocks such cookies by default in almost all situations.  One exception is when the user submits information through an online form on a website (for example, when a Safari user submits a mailing address while buying something online).  In that situation, Safari accepts the cookie and allows additional cookies from the same site.
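Mechanically, the tracking described above rides on ordinary HTTP headers: the ad server sets a cookie once, and the browser replays it on every later request to that domain.  The snippet below sketches that round trip using Python’s standard-library `http.cookies` module; the cookie name, value, and domain are invented for illustration and are not Google’s actual cookie format.

```python
from http.cookies import SimpleCookie

# Header an ad server might send on a first visit (values illustrative):
set_cookie_header = "uid=abc123; Domain=ads.example.com; Path=/"

jar = SimpleCookie()
jar.load(set_cookie_header)

morsel = jar["uid"]
print(morsel.value)       # abc123 -- the identifier tying requests together
print(morsel["domain"])   # ads.example.com -- later requests here carry it

# What the browser sends back on the next request to that domain,
# letting the ad network link the user's visits across sites:
cookie_request_header = jar.output(attrs=[], header="Cookie:")
print(cookie_request_header)  # Cookie: uid=abc123
```

The same identifier arriving from many different pages is what turns a convenience mechanism into a cross-site tracking mechanism.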

What Happened Here?

In this case, the FTC alleged that Google violated the 2011 consent order by representing to consumers that it would not place tracking cookies or serve targeted ads based on those cookies, but then it delivered tracking cookies and targeted ads to some users.  Specifically, users would allow one cookie from Google’s advertising cookie service, which opened the door for all cookies from that advertising cookie service to be accepted.

Google informed users that, because of Safari’s default cookie-blocking settings, they did not need to take any action to opt out of Google’s automatic acceptance of advertising cookies.  According to the FTC, however, Google sidestepped those default settings by taking advantage of Safari’s narrow exception for forms.  Google “tricked” the user’s browser into believing that the user was submitting information through a form, allowing Google to place a temporary cookie on the user’s computer.  Once the temporary cookie was installed, the user’s computer would accept all of the cookies that Google had originally said would be blocked, which the FTC alleged violated the consumer privacy protections imposed by the 2011 consent order.

What Are The Takeaways For The Business Community?

There are a few takeaways from today’s settlement announcement.  First, if your company enters into an agreement with the FTC regarding future conduct, you should be careful to ensure you remain in compliance.  The FTC takes the violation of a consent order very seriously.  Second, be up front, open, and honest with consumers who use your product about the measures you are taking to protect their privacy and the procedures they should follow to change their privacy settings.  Finally, if you make promises to consumers about how their information will be accessed, maintained, or used, be sure to keep those promises.

As FTC Chairman Jon Leibowitz stated today, “No matter how big or small, all companies must abide by FTC orders against them and keep their privacy promises to consumers, or they will end up paying many times what it would have cost to comply in the first place.”  I would add that the negative publicity that could follow from FTC action such as this can be as harmful to a company as the monetary penalty itself.

 


Does your organization use a photocopier?  If so, what types of documents do you copy, fax, and email with it?  Do those documents contain proprietary information or personal information of your consumers/employees?  If so, then you should review a guide issued by the Federal Trade Commission, called “Copier Data Security: A Guide for Businesses.”  The guide reminds us all that hard drives in modern digital copiers store data about the documents the copiers copy, print, scan, fax, or email, and steps must be taken to secure that data.

In a typical large organization, copy machines are often leased, returned, and then leased again or sold.  As a result, there is a good chance that information stored on those copy machines’ hard drives could be accessed by an unauthorized third party.  The FTC’s guide recommends that organizations build data security into each stage of the copier’s life cycle:  planning the acquisition of a device, buying or leasing the device, using the device, and returning or disposing of the device.

Before acquiring a copier, organizations should ensure that their information security policies govern copiers.  Employees who have responsibility for securing computers/servers for your organization should be responsible for securing data stored on the copiers.

When buying or leasing a copier, an organization should consider options to secure data on the device.  For example, some copiers can encrypt (scramble) the data stored on the copier’s hard drive so it cannot be retrieved even if the hard drive is removed.  Other copiers can overwrite existing data on the hard drive with random characters.  Also, check that your lease or purchase contract states that your organization will retain ownership of all hard drives at end of life, or that the company providing the copier will overwrite the hard drive.

When using the copier, the FTC guidance recommends overwriting the entire hard drive at least once per month.  Place a sticker on the machine that reminds the organization that at the time of disposal the hard drive must be physically destroyed.  Additionally, make sure that if the copier must be connected to a network, it is integrated securely.

When your organization has finished using the copier, the FTC recommends checking with the manufacturer, dealer, or servicing company for options on securing the hard drive.  Some companies will remove the hard drive and return it to you or will overwrite it for you.

What’s the takeaway?  The hard drives in many modern copy machines can store personal and proprietary information contained in the documents they copy, fax, and email.  Organizations should take steps when purchasing, maintaining, and disposing of their copiers to ensure that the data stored on the copiers is secure.

 


Flying “under the radar” this week, as a result of the high-profile LinkedIn data breach, was news that the Federal Trade Commission charged two businesses with illegally exposing the sensitive personal information of consumers by allowing peer-to-peer (P2P) file-sharing software to be installed on their corporate computer systems.  P2P software is commonly used to play games, make online telephone calls, and share software, music, videos, and documents.  If not configured correctly, however, files not intended for sharing may be accessible to anyone on the P2P network.  Once shared, a file usually cannot be permanently removed from the P2P network.

In February 2010, the FTC issued a warning about the improper release of sensitive consumer data on P2P file-sharing networks.  FTC Chairman Jon Leibowitz recommended that “[c]ompanies should take a hard look at their systems to ensure that there are no unauthorized P2P file-sharing programs and that authorized programs are properly configured and secure.  Just as important, companies that distribute P2P programs . . . should ensure that their software design does not contribute to inadvertent file sharing.”

Fast-forward two years:  the FTC recently filed two complaints against businesses that allowed P2P technology to be used on their networks, exposing their consumers’ personally identifiable information.  The FTC has entered into settlements in both cases, but the lessons about a company’s obligation to monitor the software installed by its employees are invaluable.

In the first complaint, the FTC alleged that an auto dealer compromised its consumers’ personal information by allowing P2P software to be installed on its network, which led to sensitive financial information being uploaded to the P2P network.  The FTC claims that the auto dealer failed to implement “reasonable security measures” such as:

  • assessing risks to the consumer information it collected and stored
  • adopting policies to prevent or limit unauthorized disclosure of information
  • preventing, detecting, and investigating unauthorized access to personal information on its networks
  • adequately training employees
  • responding to unauthorized access to personal information
Because the dealer is also a financial institution, it was subject to the Gramm-Leach-Bliley Safeguards Rule; the FTC alleged that the dealer violated that rule as well by failing to provide annual privacy notices and a mechanism by which consumers could opt out of information sharing with third parties.

In the second complaint, the FTC charged a debt collection company with failure to implement reasonable security measures after the company’s COO installed a P2P application on her desktop computer that allowed private information of 3,800 of the company’s clients’ customers to leak into the P2P network.  The FTC’s complaint details some of the debt collection company’s alleged failures:

  • failure to adopt an information security plan appropriate for its network and the personal information processed and stored on it
  • failure to implement an incident response plan
  • failure to assess risks to the consumer information collected and stored online
  • failure to adequately train employees about security to prevent unauthorized disclosure of personal information
  • failure to assess and enforce compliance with its existing security policies and procedures, such as scanning networks to identify unauthorized P2P file sharing applications and other unauthorized applications operating on the networks or blocking installation of such programs
  • failure to prevent, detect, and investigate unauthorized access to personal information on its networks, such as by logging network activity and inspecting outgoing transmissions to the Internet to identify unauthorized disclosures of personal information.
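One of the controls the FTC faulted the company for skipping, scanning for unauthorized P2P applications, can start as something as simple as checking running process names against a blocklist.  The sketch below is a deliberately minimal illustration; the client names in the blocklist are my assumptions rather than names drawn from the FTC complaints, and a real deployment would rely on endpoint-management or network-monitoring tooling rather than a hand-rolled script.

```python
# Illustrative blocklist of P2P client names (assumed, not from the FTC cases).
P2P_BLOCKLIST = frozenset({"limewire", "bearshare", "frostwire", "utorrent", "kazaa"})

def find_unauthorized(process_names, blocklist=P2P_BLOCKLIST):
    """Return, sorted, the process names that match the P2P blocklist."""
    return sorted(name for name in process_names if name.lower() in blocklist)

# In practice the process list would come from an endpoint agent or an
# inventory scan; it is hard-coded here for illustration.
running = ["explorer", "LimeWire", "outlook", "uTorrent"]
print(find_unauthorized(running))  # ['LimeWire', 'uTorrent']
```

Even a crude check like this, run regularly and paired with a policy blocking installation of such programs, addresses the "assess and enforce compliance" failure the FTC described.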

The settlement agreements entered into in both cases require the companies to, among other things, establish and maintain a comprehensive information security program and undergo data security audits by independent auditors every other year for 20 years.

So what are the takeaways?  First, this is a major warning that the FTC may hold a company responsible for software an employee installs on her desktop.  Second, P2P software can pose a significant threat to private information, and companies should take steps to monitor, and perhaps entirely prevent, its use.  Third, the terms of settlement show why it is important for companies to be proactive before a data breach occurs:  audit their information security networks, adopt information security policies and procedures, train their employees, and stay vigilant against threats to the private information stored on their networks.

 
