Published by Al Saikali

A client recently asked me to identify the next wave of data privacy litigation.  I said that with so much attention on lawsuits arising from data breaches, particularly in light of some recent successes for the plaintiffs in those lawsuits, the way in which companies collect information and disclose what they are collecting is flying under the radar.  This “failure to match” what is actually being collected with what companies are saying they’re collecting and doing with that information could lead to the next wave of data privacy class action litigation.

Here’s an example.  A privacy policy in a mobile app might state that the app collects the user’s name, mailing address, and purchasing behavior.  In fact, and often unbeknownst to the person who drafted the privacy policy, the app is also collecting information like the user’s geolocation and mobile device identification number, but that collection is not disclosed to the user in the privacy policy.  The collection of the additional information isn’t what gets the company into trouble.  It’s the failure to fully and accurately disclose the collection practice and how that information is used and disclosed to others that creates the legal risk.

What is the source of this problem?  In an effort to minimize costs, small companies often slap together a privacy policy by cutting and pasting from a form provided by a website designer or found on the Internet.  Little care is given to the accuracy and depth of the policy because there is little awareness of the potential risk.  Larger companies face a different problem: the left hand sometimes doesn’t know what the right hand is doing.  Legal, privacy, and compliance departments often do not ask the right questions of IT, web/app developers, and marketing, and the latter may not do a sufficiently good job of volunteering more than what is asked of them.  This problem can be further exacerbated where app/website development and maintenance is outsourced.  This failure to communicate can, unintentionally, result in a “failure to match” a company’s words with its actions when it comes to information collection.

We have already seen state and federal regulators become active in this area.  The Federal Trade Commission has brought a significant number of enforcement actions seeking to make sure that companies live up to the promises they make to consumers about how they collect and use their information.  Similarly, the Office of the California Attorney General recently brought a lawsuit against Delta Air Lines alleging a violation of California’s Online Privacy Protection Act for failure to provide a reasonably accessible privacy policy in its mobile app.  Additionally, the California Attorney General’s Office has issued guidance on how mobile apps can better protect consumer privacy, which includes the conspicuous placement and full disclosure of information collection, sharing, and disclosure practices.  As the use of mobile apps and the collection of electronic information about consumers increase, we can expect to see these enforcement actions ramp up.

What sort of civil class action liability could companies face for “failure to match”?  Based on what we’ve seen in privacy and security litigation thus far, if the failure to match a policy with a practice is intentional or reckless, companies could face exposure under theories of fraud or deceptive trade practice statutes that provide a private right of action (e.g., state “Little FTC Acts”).  Even if the failure to disclose is unintentional, the company could still face a lawsuit alleging negligent misrepresentation, breach of contract, and statutory violations, including violations of Gramm-Leach-Bliley, HIPAA’s Privacy Rule, or California’s Online Privacy Protection Act.  Without weighing in on the merits of these lawsuits, I would venture to guess that the class actions with the greatest chances of success will be those where the plaintiffs can show some financial harm (e.g., they paid for the apps in which the deficient privacy policy was contained) or where a statute provides set monetary relief as damages (e.g., $1,000 per violation/download).

What can companies do to minimize this risk?  Companies should begin by evaluating whether their privacy policies match their collection, use, and sharing practices.  This process starts with the formation of a task force, under the direction of counsel, composed of representatives from legal, compliance, IT, and marketing and dedicated to identifying: (1) all company statements about what information is collected (on company websites, in mobile apps, in written documents, etc.); (2) what information is actually being collected by the company’s website, mobile app, and other information collection processes; and (3) how the information is being used and shared.  The second part requires a deep dive, perhaps even an independent forensic analysis, to ensure that the company’s statements about what information is being collected are correct.  It’s important that the “tech guys” (the individuals responsible for developing the app/website) understand the significance of full disclosure.  Companies should also ask, “do we really need everything we’re collecting?”  If not, why take on the additional risk?  Also remember that this is not a static process.  Companies should regularly evaluate their privacy policies and monitor the information they collect.  A system must be in place to quickly identify when collection, use, and sharing practices change, so the policies can be updated promptly where necessary.

 

DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone, and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site is for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

If you have noticed an increasing number of high-profile problems for healthcare organizations with respect to privacy and security issues these last few weeks, you’re not alone.  The issues have ranged from employee misuse of protected health information to web-based breaches, photocopier breaches, and theft of computers that compromised millions of records containing unsecured protected health information (PHI).  These issues remind us that healthcare companies face significant risks in collecting, using, storing, and disposing of protected health information.

Pharmacy Hit With $1.4 Million Jury Verdict For Unlawful Disclosure of PHI

An Indiana jury recently awarded more than $1.4 million to an individual whose protected health information was allegedly disclosed unlawfully by a pharmacy.  The pharmacist, who was married to the plaintiff’s ex-boyfriend, allegedly looked up the plaintiff’s prescription history and shared it with the pharmacist’s husband and plaintiff’s ex-boyfriend.  The lawsuit alleged theories of negligent training and negligent supervision.  The pharmacy intends to appeal the judgment.

Health Insurer Fined $1.7 Million For Web-Based Database Breach

Meanwhile, the Department of Health and Human Services (HHS) recently fined a health insurer $1.7 million for engaging in conduct inconsistent with HIPAA’s privacy and security rules following a breach of protected health information belonging to more than 612,000 of its customers. The breach arose from an unsecured web-based database that allowed improper access to protected health information of its customers.

HHS’s investigation determined that the insurer:

(1) did not implement policies and procedures for authorizing access to electronic protected health information (ePHI) maintained in its web-based application database;

(2) did not perform an adequate technical evaluation in response to a software upgrade, an operational change affecting the security of ePHI maintained in its web-based application database that would establish the extent to which the configuration of the software providing authentication safeguards for its web-based application met the requirements of the Security Rule;

(3) did not adequately implement technology to verify that a person or entity seeking access to ePHI maintained in its web-based application database is the one claimed; and,

(4) impermissibly disclosed the ePHI, including the names, dates of birth, addresses, Social Security Numbers, telephone numbers and health information, of approximately 612,000 individuals whose ePHI was maintained in the web-based application database.

Health Plan Fined $1.2 Million For Photocopier Breach

In another example of privacy and security issues causing legal problems for a healthcare organization, HHS settled with a health plan for $1.2 million in a photocopier breach case.  The health plan was informed by CBS Evening News that CBS had purchased a photocopier previously leased by the health plan.  (Of all the companies to get the photocopier after the health plan, it had to be CBS News).  The copier’s hard drive contained protected health information belonging to approximately 345,000 individuals.  HHS fined the health plan for impermissibly disclosing the PHI of those individuals when it returned the photocopiers to the leasing agents without erasing the data contained on the copier hard drives.  HHS was also concerned that the health plan failed to include the existence of PHI on the photocopier hard drives as part of its analysis of risks and vulnerabilities required by HIPAA’s Security Rule, and it failed to implement policies and procedures when returning the photocopiers to its leasing agents.

I blogged about photocopier data security issues last year, after the Federal Trade Commission issued a guide for businesses on the topic of photocopier data security.  Another resource I recommend to my clients on the topic of media sanitization is a document prepared by the National Institute of Standards and Technology, issued last fall.

Medical Group Breach May Affect Up To Four Million Patients

Lastly, a medical group recently suffered what is believed to be the second-largest loss of unsecured protected health information reported to HHS since mandatory reporting began in September 2009.  The cause?  Four unencrypted desktop computers were stolen from the company’s administrative office.  The computers contained protected health information of more than 4 million patients.  As a result, the medical group is mapping all of its computer and software systems to identify where patient information is stored and ensuring it is secured.  The call center set up to handle inquiries following the notification of the patients is receiving approximately 2,000 calls each day.

The Takeaways 

So what are five lessons companies should take away from these developments?

  • Having policies that govern the proper use and disclosure of PHI is a first step, but it is important that companies audit whether their employees are complying with these policies and discipline employees who don’t comply, sending a message to everyone in the company that non-compliance will not be tolerated.
  • As technology is upgraded or changed, it is important that companies continue to evaluate any potential new security risks associated with these changes.  An assumption should not be made that simply because the software is an “upgrade” the security risks remain the same.
  • There are hidden risks, such as photocopier hard drives.  Stay apprised of these potential risks, identify and assess them in your risk assessment (required by HIPAA), then implement administrative and technical safeguards to minimize these risks.  With respect to photocopiers, maybe this means ensuring that the hard drives are wiped clean or written over before they are returned to the leasing agent.
  • Encrypt sensitive information at rest and in motion where feasible, and to the extent it isn’t feasible, build in other technical safeguards to protect the information.
  • Train, train, train – having a fully informed legal department and management doesn’t do much good if employees don’t understand these risks and aren’t trained to avoid them.  Do your employees know how seemingly simple and uneventful conduct like photocopying a medical record, leaving a laptop unattended, clicking on a link in an email, or doing a favor for a friend who needs PHI about a loved one, can lead to very significant unintended consequences for your company (and, as a result, them)?  Train them in a way that brings these risks to life, update the training and require it annually, and audit that your employees are completing the training.

 


 

What are law firms doing to protect their clients’ sensitive information?  What are clients doing to determine whether their outside counsel are using reasonable security measures to protect their sensitive information (confidential communication, customer data, financial information, protected health information, intellectual property, etc.)?

According to the data forensic firm Mandiant, at least 80 major law firms were hacked in 2011 by attackers who were seeking secret deal information.  The threats to law firms are real and publicly documented.  In 2011, during the conflict in Libya, law firms that represented oil and gas companies received PDF files purporting to provide information about the effect of the war on the price of oil.  These documents contained malware that infected the networks of the firms that received them.  Similarly, law firms can be a target of political “hacktivism,” as was the case with a law firm that was attacked by Anonymous after representing a soldier in a controversial case, resulting in the public release of 2.6 gigabytes of the firm’s email.  And, of course, law firms are just as susceptible as other companies to risks like employee negligence (e.g., lost mobile devices containing sensitive information), inside jobs (misusing access to sensitive information for personal gain), and theft of data.

With these threats in mind, it is useful for lawyers to remember that they have a number of ethical responsibilities to secure their clients’ information, in addition to important business interests.

The Ethical Obligations

Duty to be competent – lawyers cannot stick their heads in the sand when it comes to technology.  They have an ethical obligation to understand the technology they use to secure client information, or they must retain/consult with someone who can make them competent.  As the Arizona Bar stated in Opinion 09-04 (Dec. 2009), “[i]t is important that lawyers recognize their own competence limitations regarding computer security measures and take the necessary time and energy to become competent or alternatively consult available experts in the field.”

Duty to secure – lawyers have an obligation under Model Rule of Professional Conduct 1.6(c) to “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.”  Because the model rule was only recently adopted by the ABA, there is no easy definition of “reasonable efforts”, but Comment 18 to Rule 1.6(c) requires consideration of several factors:  (1) the sensitivity of the information; (2) the likelihood of disclosure if additional safeguards are not employed; (3) the cost of employing additional safeguards; (4) the difficulty of implementing the safeguards; and (5) the extent to which the safeguards adversely affect the lawyer’s ability to represent clients.  The Arizona Bar’s 09-04 opinion again provides some helpful details:  “In satisfying the duty to take reasonable security precautions, lawyers should consider firewalls, password protection schemes, encryption, anti-virus measures, etc.”  The Arizona Bar rightfully recognized, however, that the duty “does not require a guarantee that the system will be invulnerable to unauthorized access.”  Also, what are considered “reasonable efforts today” may change, as an opinion of the New Jersey Advisory Committee on Professional Ethics pointed out when it expressed reluctance “to render a specific interpretation of RPC 1.6 or impose a requirement that is tied to a specific understanding of technology that may very well be obsolete tomorrow.”

Duty to update – the duty to secure client information is not static; it evolves and changes as technology changes. Arizona Bar Opinion 09-04 is again helpful:  “technology advances may make certain protective measures obsolete over time . . . [Therefore,] [a]s technology advances occur, lawyers should periodically review security measures to ensure that they still reasonably protect the security and confidentiality of the clients’ documents and information.”

Duty to transmit securely – lawyers have an obligation to securely transmit information.  For example, the ABA requires that “[a] lawyer sending or receiving substantive communications with a client via e-mail or other electronic means ordinarily must warn the client about the risk of sending or receiving electronic communications using a computer or other device, or e-mail account, where there is a significant risk that a third party may gain access.”  One example is where a lawyer represents the employee of a company and the employee uses her employer’s email account to communicate with her attorney – in that instance, the attorney should advise his client that there is a risk the employer could access the employee’s email communications.

Duty to outsource securely – Model Rule of Professional Conduct 5.3 provides that “a lawyer retaining an outside service provider is required to make reasonable efforts to ensure that the service provider will not make unauthorized disclosure of client information.”  ABA Formal Opinion 95-398 interprets this rule as requiring that a lawyer ensure that the service provider has in place reasonable procedures to protect the confidentiality of information to which it gains access.  The ABA recommends that lawyers obtain from the service provider a written statement of the service provider’s assurance of confidentiality.  In an upcoming blog post I will write about a Florida Bar Proposed Advisory Opinion that provides guidance on how lawyers should engage cloud computing service providers, an emerging trend in the practice of law.

Duty to dispose securely – lawyers also have an obligation to dispose of client information securely.  This is not so much an ethical duty as a legal obligation.  Many states have data disposal laws that govern how companies (law firms are no exception) must dispose of sensitive information like financial information, medical information, or other personally identifiable information.  Examples of secure disposal include shredding sensitive documents and ensuring that leased electronic equipment containing sensitive information on its hard drives is disposed of securely.  In one instance, the Federal Trade Commission fined three financial services companies accused of discarding sensitive financial information of their customers in dumpsters near their facilities without first shredding that information.  An often-overlooked machine that stores sensitive information is the copy machine, many of which have hard drives that retain electronic copies of the documents they copy.  Fortunately, the FTC has provided a useful guide to minimize some of these risks.

The Legal Obligations

The ethical obligations discussed above are separate from any legal obligations that govern certain types of information under HIPAA/HITECH, Gramm-Leach-Bliley, the Payment Card Industry’s Data Security Standards, state document disposal laws, state data breach notification laws, and international data protection laws.  Depending on the type of information law firms collect, those laws may impose additional proactive requirements to secure data, train employees, and prepare written policies.

The Business Interests

Finally, even if the ethical and legal obligations to secure sensitive information do not provide sufficient incentives for law firms to evaluate their security measures with respect to client information, there are business interests that should compel law firms to do so.  Companies are recognizing the risks presented by sharing sensitive information with service providers like law firms and are, at a minimum, inquiring about the security safeguards the providers have adopted and, in some cases, are requiring a certain level of security and auditing that level of security.  One such example is Bank of America.  According to a recent report, following pressure from regulators, Bank of America now requires its outside counsel to adopt certain security requirements and it is auditing the firms’ compliance with those requirements.

Specifically, Bank of America requires its outside counsel to have a written information security plan, and to follow that plan.  Firms must also encrypt sensitive information that Bank of America shares with them.  Bank of America also wants its law firms to safeguard information on their employees’ mobile devices.  Most importantly, law firms must train their employees on their security policies and procedures.  Finally, Bank of America is auditing its law firms to ensure they are complying with these requirements.

So with these threats, ethical responsibilities, and business interests in mind, it is important that law firms, like all other companies that handle sensitive information, evaluate their administrative, technical, and physical safeguards to minimize the risks associated with the storage, use, and disposal of their clients’ sensitive information.

 


In August of last year, I wrote about HB 300, a Texas law that, beginning September 1, 2012, created employee training and other requirements for any company doing business in Texas that collects, uses, stores, transmits, or comes into possession of protected health information (PHI).  The law’s training provisions required covered entities to train their employees every two years regarding federal and state law related to the protection of PHI, and obtain written acknowledgement of the training.  (The training was required for new employees within 60 days of their hiring).  Companies were required to train their employees in a manner specific to the way in which the individual employee(s) handle PHI.

Recently, however, the Texas legislature passed two bills that amend the requirements of HB 300 in a few significant ways.  Under SB 1609, the role-specific training requirement has changed.  Now, companies may simply train employees about PHI “as necessary and appropriate for the employees to carry out the employees’ duties for the covered entity.”

SB 1609 also changed the frequency of the training.  Instead of once every two years, training is now triggered when the company is “affected by a material change in state or federal law concerning protected health information,” in which case the training must take place “within a reasonable period, but not later than the first anniversary of the date the material change in law takes effect.”  This change could mean more or fewer training sessions of employees depending on the nature of the covered entity’s business, the size of the covered entity, and the location of the covered entity.

SB 1610, which relates to breach notification requirements, is more puzzling.  Until now, Texas law required companies doing business in Texas that suffered data breaches affecting information of individuals residing in states without data breach notification laws of their own (e.g., Alabama and Kentucky) to notify the individuals in those states of the breach.  SB 1610 removes that requirement and now provides that:  “If the individual whose sensitive personal information was or is reasonably believed to have been acquired by an unauthorized person is a resident of a state that requires a [breached entity] to provide notice of a breach of system security, the notice of the breach of system security required under Subsection (b) [which sets forth Texas’s data breach notification requirements] may be provided under that state’s law or under Subsection (b).”

The natural interpretation of this provision is that a Texas company that suffers a breach of customer information where, for example, some of the customers reside in California, Massachusetts, or Connecticut, is not required to comply with those states’ data breach notification laws if the company complies with the standards set forth in Texas’s data breach notification law.  It will be interesting to see whether Texas receives any push back from other states’ Attorneys General, who enforce their own data breach notification laws and may not be pleased with a Texas law instructing companies doing business in Texas that other states’ breach notification requirements can be ignored if the company meets Texas’s.  Nevertheless, the practical effect of this law is not clear because most companies will want to avoid the risk associated with ignoring another state’s data breach notification law.

In short, the legislative changes are a good reminder that companies doing business in Texas that collect, use, store, transmit, or otherwise handle PHI must determine whether they are complying with HB 300 and the more recent legislative acts that were signed into law June 14, 2013 and became effective immediately.

 


Legislation was introduced in the U.S. Senate late last week that, if passed, would create proactive and reactive requirements for companies that maintain personal information about U.S. citizens and residents.  The legislation, titled the “Data Security and Breach Notification Act of 2013” (S. 1193), creates two overarching obligations:  to secure personal information and to notify affected individuals if the information is breached.  The bill requires companies to take reasonable measures to protect and secure data in electronic form containing personal information.  If that information is breached, companies are required to notify affected individuals “as expeditiously as practicable and without unreasonable delay” if the company reasonably believes the breach caused or will cause identity theft or other actual financial harm.

Violations of the obligations to secure or notify are considered unfair or deceptive trade practices that may be investigated and pursued by the FTC.  Companies that violate the law could be fined up to $1,000,000 for violations arising out of the same related act or omission ($500,000 maximum for failing to secure the personal information and $500,000 maximum for failing to notify about the breach of the personal information).

The legislation defines personal information as social security numbers, driver’s license numbers, passport numbers, government identification, and financial account numbers or credit/debit card numbers combined with their required PIN.  The bill includes a safe harbor for personal information that is encrypted, redacted, or otherwise secured in a way that renders it unusable.

Here are some other important provisions of the legislation:

  • There is no guidance as to what “reasonable measures” means under the obligation to secure personal information, which is problematic (although not very different from state data breach notification laws) because it provides no certainty as to when a company may face liability for failing to adopt certain security safeguards.
  • With respect to the duty to notify, the bill explicitly allows for a reasonable period of time after a breach for the breached entity to determine the scope of the breach and to identify individuals affected by the breach.
  • The legislation would preempt state data breach notification laws, but compliance with other federal laws that require breach notification (e.g., HIPAA/HITECH) is deemed to be compliance with this law.
  • The bill requires that breached entities notify the Secret Service or the FBI if a breach affects more than 10,000 individuals.
  • The bill also allows for a delay of notification if such notification would threaten national or homeland security, or if law enforcement determines that notification would interfere with a civil or criminal investigation.
  • There is no private cause of action for violating the legislation.  The bill is silent as to whether private causes of action based on common law or other statutory claims (e.g., negligence, state unfair trade practices claims, etc.) may be pursued, to the extent such causes of action are recognized.

There remains, however, a big question as to whether this legislation will ultimately become law.  Given the political climate in D.C. and the lack of success of similar federal legislation in the past, the outlook is bleak.  The ambiguity of the required proactive security measures and the lack of clarity as to whether private causes of action may be pursued for non-statutory violations also raise political problems for the legislation on both sides of the aisle.  Nevertheless, there is a growing climate of concern regarding privacy and security issues that may result in this legislation being included within a larger package of legislation on cybersecurity and data privacy.  It will be important to keep an eye on the status of this bill moving forward.

 


Until recently, individuals whose information was compromised as a result of a company suffering a data breach faced an uphill battle when suing the company in a class action lawsuit.  Far more often than not, courts dismissed the lawsuits or entered summary judgment in favor of defendants on grounds that the plaintiffs could not establish a cognizable injury, that the claims were preempted by breach notification statutes, or that there was no evidence that the data breach (as opposed to some other act of identity theft) caused the plaintiffs’ damages.  I’m still convinced that the pro-defendant environment remains the norm.  Nevertheless, four recent cases are being used to support the argument that the tide may be turning in favor of plaintiffs.

Burrows v. Purchasing Power, 12-cv-22800-UU (S.D. Fla.)

The most recent example is a proposed settlement in a class action lawsuit against Winn-Dixie and one of its service providers arising from a breach of personally identifiable information of Winn-Dixie grocery store employees.  The employees’ personally identifiable information was allegedly compromised when an employee of a company that provided an employee benefit program to Winn-Dixie employees misused his access to the PII and filed fraudulent tax returns with it.

A class of approximately 43,500 employees filed a class action lawsuit in the Southern District of Florida against Winn-Dixie and its employee benefits service provider.  The lawsuit includes counts of negligence, violation of the Florida Deceptive and Unfair Trade Practices Act, and invasion of privacy.  Plaintiffs alleged that the defendants failed to adequately protect and secure the plaintiffs’ personally identifiable information, and that the defendants failed to provide the plaintiffs with prompt and sufficient notice of the breach.

The defendants’ attempts to defeat the plaintiffs’ lawsuit on the pleadings failed.  Winn-Dixie was subsequently voluntarily dismissed from the lawsuit, and the case proceeded against the service provider, which ultimately entered into a proposed settlement with the plaintiffs, agreeing to pay approximately $430,000 ($225,000 towards a settlement fund, $200,000 in attorney’s fees and costs, and a $3,500 incentive award to the named plaintiff).  The settlement states that it was entered into “for the purpose of avoiding the burden, expense, risk, and uncertainty of continuing to litigate the Action, . . . and without any admission of any liability or wrongdoing whatsoever.”

The settlement requires the service provider to maintain rigorous security safeguards to minimize the risk of a similar incident in the future.  The settlement fund will be divided among four groups:  (1) a tax refund fraud fund (class members who show they were victims of tax refund fraud can be compensated for a portion of lost interest); (2) a tax preparer loss fund (class members can be compensated for fees paid to tax preparers for notifying the IRS of a tax fraud claim or assisting in resolving issues arising from the tax refund fraud, not to exceed $100); (3) a credit card fraud fund (class members who show they were victims of identity theft other than tax refund fraud that resulted in fraudulent credit card charges that the credit card company did not waive, up to $500); and (4) a credit monitoring fund (class members who receive compensation in any of the previous three groups may receive credit monitoring services for one year).  To “prove” they were victims of fraud, plaintiffs must prepare a statement under penalty of perjury regarding the facts and circumstances of their stolen identity.

The settlement was preliminarily approved by the court on April 12, 2013, and a fairness hearing is scheduled for October 4, 2013.  The amount of money being paid to the plaintiffs and their lawyers in this case should give corporate counsel monitoring these lawsuits pause.  The District Court’s order allowing the case to proceed beyond the pleadings phase will likely be used as an instruction manual by plaintiffs in future data breach cases.

Resnick v. AvMed, Inc., 1:10-cv-24513-JLK (S.D. Fla.)

I previously blogged about the opinion of the U.S. Court of Appeals for the Eleventh Circuit that allowed a data breach class action to proceed where the plaintiffs claimed they were victims of identity theft arising from the theft of a laptop computer containing their personal information.  I encourage corporate counsel to read that post to learn more about the factors the Eleventh Circuit looked to in allowing that case to proceed beyond the pleadings phase.  That lawsuit remains pending in the U.S. District Court for the Southern District of Florida.

Harris v. comScore, Inc., No. 11-C-5807 (N.D. Ill. Apr. 2, 2013)

Another recent legal development considered by many to be favorable to plaintiffs was a decision by the U.S. District Court for the Northern District of Illinois certifying a class of possibly more than one million people who claim that the online data research company comScore, Inc. collected personal information from the individuals’ computers and sold it to media outlets without consent.  Although the lawsuit did not arise from a data breach, some of the arguments regarding lack of injury and whether class certification is appropriate are the same.  The plaintiffs allege violations of several federal statutes, including the Electronic Communications Privacy Act and the Stored Communications Act.  The court rejected comScore’s arguments challenging class certification, including its argument that the issue of whether each plaintiff suffered damages from comScore’s actions precludes certification.  The lawsuit remains pending.

Tyler v. Michaels Stores Inc., SJC-11145, 2013 WL 854097 (Mass. Mar. 11, 2013)

The Massachusetts Supreme Judicial Court broadened the definition of the term “personal information” to include ZIP codes.  The court held that because retailers can use ZIP codes to find other personal information, retailers were prohibited by Massachusetts law from collecting ZIP codes in credit card transactions.  The court also ruled that the plaintiffs did not have to prove identity theft to recover under the statute.  They could instead rely on the fact that they received unwanted marketing materials and that their data was sold to a third party.  The fact that plaintiffs can proceed with their lawsuit without having to show that their information was actually compromised will undoubtedly be used by plaintiffs in data breach litigation to argue that the threshold for injury in such cases is lower than in other cases.

What’s the Takeaway?

What should corporate counsel take from these cases?  It is still too early to tell whether these cases are outliers or whether they mark a new trend in favor of plaintiffs in privacy and data breach cases that will embolden the plaintiffs’ bar.  The most important takeaway for corporate counsel at this stage is that they must, at a minimum, monitor the litigation risks associated with data breaches and other privacy violations so they can advise their companies, which can in turn account for those risks when building security and privacy into their products and services.

 


How does your company dispose of personally identifiable information (medical records, financial information, applications containing sensitive information, etc.) and other sensitive information when the information is no longer needed?  Do you throw it in the trash can next to your desk?  Where does it go after that? Is it securely shredded, or thrown into an unsecured dumpster with the trash of other offices and companies?  What about sensitive electronic information?

These questions might not seem important, but the way in which your company disposes of sensitive information can have significant consequences for your business, as two companies learned recently when they discarded personally identifiable information in unsecured dumpsters and were fined over $100,000 by the Federal Trade Commission (FTC).

What Happened?

The FTC filed charges against three companies that own, manage, and operate payday loan and check cashing stores, alleging that they failed to safeguard personally identifiable information by discarding “documents containing sensitive personal identifying information – including Social Security numbers, employment information, loan applications, bank account information, and credit reports – in unsecured dumpsters near [the defendants’] locations.”

What Were The Causes Of Action?

The FTC’s complaint claims that the defendants violated:

(1) the FTC’s Disposal Rule, which requires companies that maintain or possess certain consumer information for a business purpose to “properly dispose of such information by taking reasonable measures to protect against unauthorized access to or use of the information”;

(2) the Gramm-Leach-Bliley Safeguards Rule and Privacy Rule, which require that financial institutions (companies significantly engaged in providing financial products or services) develop and use safeguards to protect consumer information, and deliver privacy notices to consumers explaining their policies and practices; and,

(3) the FTC Act, which prohibits misrepresentations about the reasonable measures companies implement to protect sensitive consumer information.

What Was The Result?

Two of the three defendants settled with the FTC, agreeing to pay a $101,500 fine, establish what will likely be an expensive and comprehensive information security program, obtain regular independent third-party audits every other year for 20 years, and adopt a number of recordkeeping and compliance monitoring requirements.

What Are The Takeaways?  

First, you need to assess how your company disposes of sensitive information.  Next, you must identify the policies and procedures your company has adopted to ensure that sensitive information is disposed of securely.  Can those policies and procedures be improved?  Do your employees comply with existing policies, and what “checks” are in place to maximize compliance and minimize risk?  When was the last time you trained and reminded employees about the proper way to securely dispose of sensitive information?  Do you know how your vendors and business associates, with whom you share sensitive information, are disposing of that information?  If you are not sure whether the safeguards your company has adopted meet the legal requirements for secure disposal, it might be wise to retain counsel.

 


It can be easy in the data privacy and security sphere to focus significantly on best practices, changing statutes, new administrative investigations, and evolving industry standards.  It is important, however, not to lose sight of the forest for the trees by ignoring larger issues like “what criteria should we use to determine whether information is in fact ‘private’ information?”  The issue was recently addressed by Brad Smith, General Counsel of Microsoft, in an InsideCounsel article.

When many of us think of what it means for information to be “private”, we assume the information must be kept secret.  Instinctively, it would seem to make sense that publicly known information cannot also be “private” information.  But can information be private if the owner of the information purposefully provides it to certain individuals and not others?  That issue was recently addressed by the U.S. Supreme Court and discussed in Smith’s article.

Smith’s article argues that legal change may be coming to the definition of privacy, and he cites by way of example Justice Sotomayor’s concurring opinion in the recent U.S. Supreme Court decision in U.S. v. Jones.  In Jones, the court held that the government was required to obtain a warrant where it installed a tracking device on a suspect’s vehicle, as this conduct was a search under the Fourth Amendment.

In her concurring opinion, Justice Sotomayor began with the general principle that “a Fourth Amendment search occurs when the government violates a subjective expectation of privacy that society recognizes as reasonable.”  Does this expectation of privacy extend to information shared with some individuals and not others?  Justice Sotomayor posited that:

it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties.  This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.  People disclose the phone numbers that they dial or text to their cellular providers; the URLs that they visit and the e-mail addresses with which they correspond to their Internet service providers; and the books, groceries, and medication they purchase to online retailers. . . . I for one doubt that people would accept without complaint the warrantless disclosure to the Government of a list of every Web site they had visited in the last week, or month, or year.  But whatever the societal expectations, they can attain constitutionally protected status only if our Fourth Amendment jurisprudence ceases to treat secrecy as a prerequisite for privacy.  I would not assume that all information voluntarily disclosed to some member of the public for a limited purpose is, for that reason alone, disentitled to Fourth Amendment protection.

Justice Sotomayor also quoted Justice Marshall’s dissent in the 1979 case of Smith v. Maryland – “Privacy is not a discrete commodity, possessed absolutely or not at all.  Those who disclose certain facts to a bank or phone company for a limited business purpose need not assume that this information will be released to other persons for other purposes.”

Ultimately, the Jones Court did not decide whether a reasonable expectation of privacy exists in information voluntarily disclosed to third parties, but as Mr. Smith observes, “the Fourth Amendment will likely evolve and influence the future of privacy rules and practices with implications for inside counsel across the country.”


Regulators increasingly want to know what companies are telling consumers about how the companies are using information about their consumers.  Companies that do not properly explain how they collect, store, and use their customers’ information are facing increased scrutiny.  Nowhere is this increased scrutiny more evident than in the $22.5 million civil penalty that the FTC levied against Google, or the FTC’s complaint and decision against Facebook.

Now, the Office of the Attorney General for the State of California has weighed in by cracking down on companies that do not include privacy policies in their mobile apps.  In a recent press release, California Attorney General Kamala Harris announced that her office has begun formally notifying up to 100 mobile application developers and companies that they are not in compliance with California privacy law.  According to Bloomberg, the companies receiving letters include United-Continental, Delta Air Lines, and Open Table.

The law that the Attorney General is referring to is the California Online Privacy Protection Act, which requires commercial operators of online services who collect personally identifiable information from California residents to conspicuously post a privacy policy.  Companies that violate this law face fines of up to $2,500 each time the non-compliant app is downloaded.

Amazon, Apple, Facebook, Google, Hewlett-Packard, Microsoft, and Research in Motion, as platforms for mobile applications, all agreed to privacy principles earlier this year that allow consumers to review an app’s privacy policy before they download the app rather than after.  The companies also agreed to offer consumers a consistent location for an app’s privacy policy on the application-download screen in the platform store.

So what is the takeaway?  If you collect information about individuals, make sure you have a clear privacy policy.  Make sure the policy is placed in a location that makes it easy to find.  If you offer a mobile app, try to work with your mobile app platform to provide the privacy policy to consumers before they download the app.  It’s also a good idea to update your privacy policy periodically to ensure it remains current with your company’s information collection practices.

When was the last time your company took a fresh look at its privacy policy?

 


On September 1, 2012, a new law will go into effect in Texas that imposes new requirements on organizations that maintain protected health information (PHI).  The new legislation, HB 300, imposes even tighter standards than those required by the federal Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health Act (HITECH).

Who Does HB 300 Apply To?

Like HIPAA and HITECH, HB 300 applies to “covered entities.”  But the definition of a covered entity under HB 300 is broader than the definition of a covered entity under HIPAA (expanded by HITECH).  A “covered entity” under HB 300 is any individual, business or organization that:

  • Engages in the practice of assembling, collecting, analyzing, using, evaluating, storing, or transmitting PHI;
  • Comes into possession of PHI;
  • Obtains or stores PHI; or
  • Is an employee, agent, or contractor of a person described in the above three categories, if they create, receive, obtain, maintain, use, or transmit PHI.

In short, HB 300 would theoretically apply to entities such as law firms that maintain medical records in prosecuting/defending lawsuits, schools that maintain or use PHI, and information management entities that transfer and sell PHI.

What Does HB 300 Require?

HB 300 imposes a number of requirements on covered entities, including but not limited to:

Employee Training – Covered entities must train their employees regarding federal and state law related to the protection of PHI.  The training must be specifically tailored for the employee’s responsibilities and the ways in which the covered entity uses PHI.  New employees must be trained within 60 days of their hire dates, training should take place at least once every two years, and upon the completion of a training program, the employee must sign a statement verifying the employee’s attendance at the training program.  The covered entities must maintain these signed employee statements.  In contrast, HIPAA requires training only within a reasonable period of time after an employee is hired or whenever there are material changes to privacy policies.

Patient Record Requests – HB 300 requires covered entities to provide patients with electronic copies of their electronic health records within 15 business days of the patient’s written request.  This requirement differs from HIPAA, which allows covered entities 30 days to respond to such requests.

Disclosure of PHI – HB 300 prohibits the sale of PHI.  Additionally, a covered entity may only disclose PHI to another covered entity for the purpose of treatment, payment, health care operations, performing an insurance or health maintenance organization function, or as otherwise authorized or required by state or federal law.  If a disclosure is made, the covered entity must give notice to patients about the disclosure.

Consumer Information Website – The Texas Attorney General must maintain a website that explains consumer privacy rights regarding PHI under Texas and federal law, lists the state agencies that regulate covered entities, provides detailed information about each agency’s complaint enforcement process, and includes contact information for reporting a violation of HB 300 to each agency.

Audits of Covered Entities – Texas’s Health and Human Services Commission may request that the U.S. Secretary of Health and Human Services conduct an audit of a covered entity to determine compliance with HIPAA and the commission must periodically monitor and review the results of those audits.

What Are The Consequences For Violating The Law?

HB 300 imposes significant civil penalties, ranging from $5,000 to $1.5 million, on covered entities that fail to comply with its requirements.  The Texas Attorney General is responsible for pursuing these penalties.  In determining the amount of a penalty imposed, the court will consider the seriousness of the violation, the entity’s compliance history, the risk of harm to the patient, the amount necessary to deter future violations, and efforts made to correct the violation.

To the extent the violation arises from a failure to comply with the disclosure requirements of HB 300, factors that may limit a covered entity’s liability include whether the disclosed information was encrypted, whether the recipient did not use or release the PHI, and whether the covered entity had developed, implemented, and maintained security policies, including training of its employees responsible for the security of PHI.

What’s The Point?

The point is that your business needs to evaluate whether HB 300 applies to you.  Are you a covered entity under this new, broader definition?  Do you have training policies and procedures in place that meet the requirements of HB 300?  Are you ready to respond quickly to requests for PHI?  Even if the law doesn’t apply to you, best practices in your industry might make it wise to become compliant, as concerns about the privacy and security of PHI continue to grow.

 
