Published by Al Saikali

The following post was prepared by guest contributor, my friend, my brother-in-arms, and newly-minted Partner in Shook’s Privacy and Data Security Practice, Colman McCarthy.

For what seemed like an eternity (okay, just a couple of years), the California Consumer Privacy Act was the only game in town when it came to state-level, comprehensive privacy legislation. Sure, many other states introduced similar bills, and Washington came close a couple of times to passing the Washington Privacy Act. All of those died on the vine, however. In fact, the only state to follow California’s lead was California itself, with the California Privacy Rights Act (which amends the CCPA and is not a separate, new law) passing in the November 2020 election.

To all of which Virginia has recently stepped forward and said, “Hold my…authenticated consumer request.” The Virginia Consumer Data Protection Act (or VACDPA, as I prefer to call it) is a comprehensive privacy law that was signed by Governor Northam on March 2, and shows influences from both the CCPA and Europe’s General Data Protection Regulation.

What do you need to know about VACDPA, beyond the fact that it’s fun to say out loud? Probably the most important fact is that it won’t go into effect until January 1, 2023. That gives entities a long runway to understand their obligations under the law and get into compliance.

So, what about all the other stuff? Well…

The Basics

To Whom Does the VACDPA Apply?

For-profit entities doing business in Virginia are covered by the new law if they control or process the personal data of at least 100,000 Virginia residents in a calendar year. That threshold drops to 25,000 if the business derives over 50% of its gross revenue from the sale of personal data. Unlike the CCPA, there is no revenue trigger. So the law can capture small companies that process a lot of data, while passing over medium or large companies without substantial business in Virginia.

Who is left outside the scope of the VACDPA? Public agencies, non-profits, and institutions of higher education, for starters. But also—and this is potentially quite impactful—any “financial institution or data subject to” the Gramm-Leach-Bliley Act, and any “covered entity or business associate governed by” HIPAA. Those appear to be entity-wide exemptions, rather than exemptions limited to the data actually covered by those laws.

The Controller/Processor Divide

After the CCPA’s exciting creation of the defined terms “business” and “service provider” to distinguish between entities (and to confuse everyone who doesn’t have the time to memorize those definitions), the VACDPA has returned to the GDPR’s use of the terms “controller” and “processor,” and basically adopts the same definitions.

Unsurprisingly, the VACDPA’s obligations (disclosures, complying with requests, data minimization, etc., etc.) largely fall on the controller. The processor’s obligations generally are to adhere to its contract with the controller and assist the controller with its obligations. But, as with the GDPR, an entity may occupy both roles, which is a fact-based determination based on the context surrounding a particular instance of processing.

What Information Is Covered?

Like the GDPR, Virginia’s new law uses the term “personal data,” rather than the CCPA’s “personal information.” Why point this out? Perhaps out of annoyed frustration at having to use separate terms when trying to keep a written analysis of the law under the length of a novella. For expediency’s sake, I’ll use the composite term “personal data/information”—or PD/I, for those who share a love for acronyms. (Mostly because the portmanteau “personal datmation” would cause more confusion than most legal terms, and sounds like a breed of dog in any event.)

Notwithstanding the above (probably pointless) digression into legal-term utilization, “personal data” under the VACDPA is defined to include “any information that is linked or reasonably linkable to an identified or identifiable natural person.” The exact language of that definition differs from what’s used in the GDPR and CCPA, and one could write an entire article analyzing the potential implications of those differences. To ward off any sudden attacks of somnolence, however, I’ll simply note for now that—like the GDPR and CCPA—this is an extremely expansive concept of what information can qualify as PD/I.

PD/I does not include de-identified or publicly available information. And the VACDPA includes a host of other exemptions for PD/I covered by other laws, such as HIPAA, the Fair Credit Reporting Act, Driver’s Privacy Protection Act, Family Educational Rights and Privacy Act, and others.

Also excluded under Virginia’s law is PD/I of employees, job applicants, agents, or independent contractors in the context of their roles as such. That exclusion is accomplished in two ways, both as a specific exclusion to the law’s scope, and in the definition of “consumer,” which explicitly carves out individuals “acting in a commercial or employment context.”

What Other Information Is Covered?

But wait, you say. Isn’t there also a definition for “sensitive data?” Yes, there is. PD/I revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status is all considered sensitive data. Joining that information is precise geolocation data (which has its own definition), PD/I collected from a known child (i.e., under 13 years old), and the processing of genetic or biometric data (also with its own definition) for the purpose of identifying a “natural” person (sorry, “legal” persons). This distinction between regular PD/I and sensitive data pops up in connection with one of the rights granted by the VACDPA and with the requirement to perform data-protection assessments.

What Are the Basic Consumer Rights?

The law specifically enumerates five (six? eight? nine?) particular consumer rights:

  1. The right to confirm whether a controller is processing PD/I and to access it. (Does that count as two rights?)
  2. The right to correct inaccuracies.
  3. The right to delete.
  4. The right to obtain a copy of the PD/I in portable format.
  5. The right to opt out of processing for purposes of targeted advertising, sale of PD/I, or profiling that produces legal or other significant effects concerning the consumer. (That’s technically three rights, right?)

Exercise of those rights follows the CCPA’s structure. Controllers must provide one or more methods for submitting requests, and must respond within 45 days (extended by 45 days, if necessary) to authenticated requests. Denial of a request requires an explanation, consumers can request information for free twice annually, and controllers can request additional information if needed to authenticate the request.

Any Other Consumer Rights?

Tucked away in the VACDPA’s provisions, apart from the specifically enumerated rights, are two other important rights. First, the law gives consumers a right to consent to the processing of sensitive data (though it’s framed as a proscription on the controller’s ability to process that data without consent). The VACDPA defines “consent” much like the GDPR does, requiring a clear affirmative act showing freely given, specific, informed, and unambiguous agreement to the processing of the PD/I.

The law also provides a right for consumers to appeal a controller’s refusal to take action on a request, and the controller must respond with a written explanation within 60 days. If the appeal is denied, the controller must provide the individual with an online mechanism that allows the consumer to submit a complaint to the Attorney General.

Other Notable Stuff

Narrow Definition of “Sale of Personal Data”

One quite noteworthy difference between the VACDPA and its CCPA cousin is how the laws approach the concept of selling PD/I. The CCPA, as has been repeated ad nauseam, expands the colloquial sense of “sale” to include any transfer of PD/I “for monetary or other valuable consideration.” The VACDPA narrows that scope substantially, to the exchange of PD/I “for monetary consideration” by the controller to a third party. And as with the CCPA, there are various exceptions to the definition, such as disclosing PD/I to a processor. Perhaps the most impactful of those exceptions is for transfers to affiliates of the controller.

Data Minimization and Reasonable Security

Data minimization is not a new concept in the world of privacy and data security. And the VACDPA continues a trend we are seeing of laws explicitly enshrining data-minimization principles. Under the law, controllers must limit collection of PD/I to what is relevant, adequate, and reasonably necessary for the purposes for which it’s collected, and must not process PD/I beyond those purposes (or purposes compatible with them) without consumer consent.

Reasonable security is also not a new concept in privacy law. Many states have laws that require reasonable security for entities that maintain personally identifiable information (a term that is narrower than “personal data” or “personal information” under comprehensive privacy laws). And some laws, such as New York’s SHIELD Act and Massachusetts’ 201 CMR 17.00, try to provide more detailed guidance. Generally, however, what counts as reasonable security under these laws is frustratingly unclear, and the VACDPA is no exception. It simply requires “reasonable administrative, technical, and physical data security” practices “appropriate to the volume and nature of the personal data at issue.” How does one apply that in practice? Well, I happen to know that The Sedona Conference (via Working Group 11) last month published its Commentary on a Reasonable Security Test, a project headed up by Shook’s own Bill Sampson. It’s a great resource for anyone trying to figure out that question.

Data-Protection Assessments

Much like under the GDPR, certain types of processing activities and certain types of PD/I require entities to perform impact assessments before processing. Under the VACDPA, that includes processing for targeted advertising, sale of PD/I, or certain instances of profiling. Processing sensitive data also requires an assessment, as does the catchall category of processing activities that “present a heightened risk of harm to consumers.” The law requires a cost-benefit analysis as part of the assessment, along with consideration of certain factors, such as the use of de-identified data and the reasonable expectations of consumers.

There are two important things to remember about these data-protection assessments. First, the obligation is not retroactive, so assessments are only required for processing activities that occur on or after January 1, 2023. Second, the Virginia Attorney General is allowed to request a controller’s data-protection assessments, which seems like a pathway to public disclosure of potentially privileged information. But the VACDPA anticipates that issue by shielding assessments requested by the AG from Virginia’s FOIA law, and by providing that disclosure to the AG does not waive the attorney-client privilege or work-product doctrine.

What About Enforcement?

Good question. The Virginia Attorney General has exclusive authority to enforce the VACDPA. Penalties can reach as high as $7,500 per violation, but entities have a 30-day period to cure a violation after receiving written notice from the AG.

There is no private right of action (not even for data breaches, as under the CCPA), and the law specifically states that it does not provide the basis for one under the VACDPA “or under any other law.” That likely prevents actions under Virginia’s unfair trade practices law seeking to leverage violations of the VACDPA.

Anything Else?

Yes, of course. Even after 1,800+ words, this lengthy piece doesn’t cover every nook and cranny of the VACDPA. There are:

  • Other exceptions to VACDPA, such as complying with other laws.
  • Particular requirements if you want to use de-identified data.
  • Particular requirements for processor contracts.
  • Lots of detailed definitions that will likely lead to many scholarly musings on their scope.
  • And other things besides.

Hopefully this (interminable) snapshot will help you start digesting the many aspects of Virginia’s new, comprehensive privacy law. But, just like any large meal, there’s only so much that can be comfortably consumed at one sitting.

And, of course, digestion is aided by a long nap. Sweet dreams. You’ve earned it.


DISCLAIMER:  The opinions expressed here represent those of Colman McCarthy (and Al, too) and not those of Shook, Hardy & Bacon, LLP, or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone and do not reflect the opinions of Colman, Al, Shook, Hardy & Bacon, or Shook’s clients.  All of the data and information provided on this site are for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

The Florida Legislature is considering a comprehensive privacy law (HB 969) that would fundamentally change the landscape of how/whether companies do business in Florida.  The bill is largely a “cut-and-paste” of the California Consumer Privacy Act (CCPA), but in some ways, it goes further than the CCPA and would make Florida’s law the most aggressive privacy law in the United States.  As I have previously described, the bill would create significant privacy rights for Florida residents, including the right to know what personal information companies are collecting about them, the source of that information, how the information is being shared, a right to request a copy of that information, and a right to delete/correct that information.  But the law goes too far – placing a crushing financial burden on most small and medium-sized businesses and creating a private right of action that dwarfs California’s version. This post analyzes the five most significant problems with HB 969 and proposes solutions.

HB 969 Would Crush Small and Medium-Sized Businesses

What’s the Problem?

The Florida Governor has promoted HB 969 as a law that will hold “big tech” companies accountable for their collection and use of Florida residents’ personal information.  While the bill would apply to big tech companies, for the most part, the law will affect small and medium-sized businesses that are not in the tech industry and may collect a small amount of personal information about Florida residents.

Section 501.173(1)(c)1. of the bill, which establishes the law’s scope, says that the law would apply to most for-profit companies that meet one of the following three requirements:

  • Has global annual gross revenue in excess of $25 million;
  • Annually buys, receives, sells, or shares for commercial purposes the personal information of 50,000 or more consumers, households, or devices; or,
  • Derives 50 percent or more of its global annual revenues from selling or sharing personal information about consumers.

As you can see, the law would apply to many companies beyond big tech. If a company collects the personal information of just one Florida resident but generates annual revenue of more than $25 million, the law applies to that business. It isn’t difficult for an organization to hit the $25 million threshold. According to the U.S. Small Business Administration, many small businesses meet that threshold, depending on their industry.

HB 969 would impose significant financial burdens on companies. To comply, a company would likely need to hire:

  • A lawyer to help understand the plethora of requirements in the 37-page bill;
  • A vendor to perform a data inventory that allows the business to understand what personal information they collect, where they get that information, how they use it, and with whom they share it;
  • A vendor to develop a process for responding to Florida residents’ requests to access, delete, or change their personal information;
  • A service/subscription that will track changes in how personal information is being collected and shared so that responses to data requests are accurate and provided in a timely manner;
  • A company to build the required “Do Not Sell” button on the homepage and all of the back-end support triggered by clicking on that button;
  • A company to train employees on how to comply with the law; and,
  • A cybersecurity firm to perform a threat assessment and to build the reasonable security procedures and processes required by the law.

The cost of the above services can range from $50,000 to $500,000, depending on the business and the number of vendors needed.

If those costs aren’t enough, businesses will also face a significant risk of class-action lawsuits if they suffer a data breach.  These lawsuits typically seek millions of dollars in statutory damages and attorney’s fees.

In short, the price tag for Florida businesses to comply with HB 969 is staggering and could result in bankrupting smaller enterprises.

What’s the Solution?

How do we balance the need to provide consumer privacy rights while protecting a business-friendly environment in Florida?  The best way is fairly simple:  follow the model created by the Virginia Consumer Data Protection Act (which is about to become law), which eliminates the gross revenue trigger.  The VCDPA will apply to “persons that conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth and that (i) during a calendar year, control or process personal data of at least 100,000 consumers or (ii) control or process personal data of at least 25,000 consumers and derive over 50 percent of gross revenue from the sale of personal data.”

Such an approach makes more sense when it comes to privacy legislation because it uses criteria based on the amount of personal information a company collects, rather than the amount of revenue a company generates.

Eliminate the Bonanza for Lawyers

What’s the Problem?

Do you remember watching the opening credits of DuckTales as a kid and marveling at Scrooge McDuck diving into his pool of gold coins? (Physicists have actually weighed in on whether this is possible – seriously). Well, if section 12 of HB 969 becomes law, just replace Scrooge McDuck with every plaintiffs’ lawyer in Florida.

Section 12 allows any Florida resident whose personal information is impacted by a data breach to sue for $100 to $750 per consumer per incident. So, for example, a company that suffers a data breach impacting the personal information of 5,000 individuals could face a lawsuit seeking up to $3.75 million in statutory damages, plus the plaintiff’s attorney’s fees and class-action administration costs (not to mention the company’s own legal fees).

Until now, the biggest obstacle plaintiffs have faced in data breach litigation has been proving actual harm (e.g., monetary losses). If HB 969 becomes law, plaintiffs’ lawyers will argue they no longer need to demonstrate harm because HB 969 creates it for them through the $100 to $750 in statutory damages. We can therefore expect to see a significant increase in these lawsuits, just as California has seen since its version went into effect.

HB 969’s private right of action is actually worse for businesses than California’s version because it fails to limit the definition of personal information to the more traditional definition of sensitive information (e.g., Social Security Numbers, Driver’s License Numbers, credit card numbers, medical information). The CCPA uses this more limited definition of personal information for the purpose of establishing a private right of action. That’s right – Florida’s proposed law is less business-friendly than California’s.  HB 969 defines personal information as “information that identifies, relates to, or describes a particular consumer or household, or is reasonably capable of being directly or indirectly associated or linked with, a particular consumer or household.”  So, for example, inadvertent disclosure of the fact that “John Smith’s favorite color is blue” or “the Smith household likes to watch old episodes of Breaking Bad” would allow the Smiths to sue the company that suffered the breach.

This leads to the next problem with the private right of action – it is based on an overly broad and inconsistent definition of a data breach. The definition of a data breach for the purpose of triggering the private right of action is broader than the definition of a data breach under Florida’s data breach notification law. Under HB 969, a breach means any “unauthorized access and exfiltration, theft, or disclosure” of personal information. In contrast, Florida’s breach notification law limits the definition of a breach to “unauthorized access of data in electronic form containing personal information” (it also limits personal information to more sensitive information as referenced above). Florida’s data breach notification law does not create a private right of action. In other words, a company could be sued based on a data breach for which they never would have been required to give notice, which is absurd.

Perhaps the biggest problem with the proposed private right of action is the disincentive it would create to disclose “gray area” data breaches. Currently, when a company suffers a breach, it performs a forensic investigation to determine whether personal information was impacted. The investigation is often inconclusive on this question where log files are lacking or the attacker deleted the forensic artifacts that would reveal the attacker’s tracks. In that instance, most companies notify individuals of the incident and the potential risk because they believe it’s the right thing to do, even if there is no clear forensic conclusion that would have required the notice. Now take that same scenario, but the company faces the risk of a multimillion-dollar class-action lawsuit under the proposed private right of action by disclosing the attack. It doesn’t take a rocket scientist to conclude that companies in that situation will likely not disclose the breach. “Doing the right thing” would be rewarded with a demand from a plaintiffs’ lawyer seeking damages and fees.

Privacy advocates have argued that the private right of action is limited because it will only succeed where the plaintiff can also show that the breach is “a result of a business’ violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information.”  This argument is misleading because it ignores how lawsuits unfold in our judicial system. Plaintiffs’ lawyers will argue that whether security procedures and practices were reasonable, and whether the breach resulted from a business’s violation of the duty to implement them, are factual questions. Why does that matter? Because if the plaintiffs’ lawyers are correct, it is highly unlikely the case will be resolved early through a motion to dismiss or summary judgment (at least not on the ground that the “reasonable security practices” limitation applies). In short, the additional requirement of showing that the breach resulted from a violation of the business’s duty to implement reasonable security procedures and practices will have little practical impact on the lawsuit: the issue may not be resolved until trial (and trials rarely happen in class action lawsuits), and the company will need to spend hundreds of thousands of dollars defending the lawsuit in the meantime.

What’s The Solution?

The best solution is to remove the private right of action in its entirety. Doing so would eliminate all the above concerns, disincentives, and contradictory outcomes.  To the extent we are concerned that companies will not be held accountable for data breaches, the Florida Attorney General is permitted under Florida’s data breach notification law to seek up to $500,000 in civil penalties and additional monetary/injunctive relief. There is no need for a private right of action that will incentivize companies to hide, rather than disclose, data breaches.

If the private right of action cannot be removed, there are some ways to limit the harm it causes, but none of these ways adequately remediate the problem.

For example, “personal information” and “data breach” should mean the same thing in HB 969 as they do in Florida’s data breach notification law, at least for the purpose of the private right of action. This will help avoid some of the inconsistencies described above.  If the Legislature keeps HB 969’s current definition of a data breach then, at minimum, it should be clarified so that it does not apply to a good faith disclosure of information by the business.

Another improvement would be to give companies an opportunity to cure the underlying vulnerability that led to the breach before allowing any private lawsuit to proceed. Curing the vulnerability may mean, for example, broadening implementation of multifactor authentication, patching specific applications or systems, addressing open ports, removing the malware that gave rise to the incident, or deleting/minimizing the collection of certain data. The limitation here is that plaintiffs’ lawyers will argue that whether the company adequately cured the underlying vulnerability should be a fact question for the jury, and it will turn into a battle of the experts.

A third change if the private right of action remains is requiring the plaintiff to demonstrate an intentional or willful violation of the duty to implement and maintain reasonable security procedures and practices.  Also, the question of whether the duty to implement and maintain reasonable security procedures and practices is met should be a question of law for the court (allowing early consideration of the issue), rather than a factual question that requires costly litigation.

Another option would be, as California considered at one point, requiring the plaintiff to first provide notice to the Florida Attorney General of the individual’s intent to file the lawsuit and obtain approval from the Florida Attorney General’s office before the private right of action is permitted to proceed.

Lastly, the private right of action could be amended to limit the damages provision. An individual could be limited to recovery of injunctive/declaratory relief; the statutory damages could be limited to $100 per consumer per incident; or, the individual could be required to show actual harm as a condition for obtaining statutory damages.

Explain When Regulatory Enforcement Can Occur

What’s the Problem?

HB 969 would be enforced by the Florida Attorney General. She can seek a civil penalty of “up to $2,500 for each unintentional violation or $7,500 for each intentional violation.” But the bill does not define how to quantify a “violation.”

What’s the Solution?

A violation could be defined in different ways.  Some options include:

  • Option 1 – there can be only one violation no matter how many provisions of HB 969 are violated or how many individuals are impacted. Penalties would therefore be capped at $2,500 or $7,500, depending on whether the violation is unintentional or intentional.
  • Option 2 – violations are measured by the number of requirements in the law that are not met. So, for example, if a company did not comply with the privacy notice requirement and did not comply with the requirement to provide a response to a verified consumer request, there would be two violations of the law for calculating civil penalties.
  • Option 3 – violations are calculated by counting the number of consumers impacted by the violation. This would create certain challenges. For example, how would you calculate violation of the privacy policy requirement when you won’t know how many people read it (or would have read it)? And for data subject requests, do you count the number of individuals whose requests were not responded to (which may be as few as one) or the number of people who could have made the request (which may be millions)?
  • Option 4 – a combination of 2 and 3 – the violation count would be calculated by multiplying the number of impacted individuals by the number of provisions violated. This approach could result in enormous penalties against companies for violating the law.

The bigger issue is not which of the four options is best (personally, I’d go with options 2 or 3), but the lack of clarity around the definition.

Fix the “Service Provider” v. “Third Party” Drafting Errors

What is the Problem?

HB 969 is based on the CCPA. That law imposes different obligations depending on whether the business is sharing personal information with a “service provider” or a “third party.” A service provider is a company with which the business shares personal information for a purpose that is compatible with the context in which the personal information was initially collected.  For example, sharing information with a company for auditing, detecting security incidents, performing services on behalf of the business, research, or quality control would all meet the business purpose requirement.  A third party is a company with whom the business shares personal information for a reason that is not a business purpose. These companies are not necessarily looking to help you; they’re looking to benefit from the data you’re sharing with them. An example might be a marketing or “big data” firm that purchases data from you primarily for their own independent benefit.

For a company to be considered a service provider, the contract between the company and the business must contain provisions described in lines 707 to 725 of HB 969, which prohibit further types of selling, using, and sharing of personal information. That makes sense because the whole idea of the service provider is that the business that hired the service provider must control/limit how any personal information is used/shared.  What does not make sense is the language in lines 726 to 744, which imposes the same share/use/sale limitations on third parties. Indeed, the reason why entities are considered third parties is because they will not agree to these limitations. Likely, this language was included due to a drafting error.

Additionally, there appears to be a typographical/drafting error in which the provision set forth in lines 730 to 735 repeats itself in lines 736 to 741.

What is the Solution?

There is an easy fix to both problems: delete lines 726 to 744.

Not Enough Time to Comply

What is the Problem?

HB 969 would go into effect on January 1st of next year.  By comparison, the CCPA took approximately two years to become effective. A long ramp-up time is necessary because companies need to understand how the numerous requirements will apply to their organizations. Companies will need to conduct risk assessments and perform a data inventory before preparing their privacy notices. The companies will need to build processes and policies to govern how they will respond to data requests from consumers. The companies will need to assess and improve their security practices and procedures, and they will need to train employees on the law’s new and often complicated requirements. With each of these examples, the company may need to engage a professional, experienced third party. The work will take most organizations much longer than six to nine months to complete.

What is the Solution?

The best solution would be to make the law effective on January 1, 2023. In addition to giving companies more time to prepare to comply with the law, this longer ramp-up time will allow Florida to evaluate and better address the weaknesses that are becoming increasingly apparent from the CCPA and other similar comprehensive privacy laws.


DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP, or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site are for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

Yesterday, the Governor of Florida threw his support behind a newly introduced consumer data privacy bill (HB 969) which is very similar to the California Consumer Privacy Act of 2018. The Governor’s support is a significant development given that he and both chambers of the Florida Legislature are Republican and, to date, there has not been any aligned support for a privacy law since the Florida Information Protection Act (FIPA), Florida’s data breach notification law.  Nevertheless, as with the CCPA, the bill proposes a boondoggle for the plaintiffs’ bar in the form of a private right of action for data breaches and statutory damages, which could present a significant obstacle to passage in the bill’s current form, particularly for a fairly business-friendly Florida Legislature.

To Whom Would HB 969 Apply? 

HB 969’s definition of a business to which the proposed law would apply is similar to the CCPA’s definition.  It would apply to any for-profit business that has a global annual gross revenue in excess of $25 million, does business in Florida, collects personal information about consumers, and determines how the personal information will be processed.  (The drafters cleared up the confusion created by the CCPA as to whether the revenue had to be generated from within the state – it does not.)  Companies earning less than $25 million in global annual gross revenue would also be required to comply with the law if they buy/receive the personal information of 50,000 or more consumers, households, or devices, or if they derive 50% or more of their revenue from selling/sharing personal information. The law would also apply to any entity that controls or is controlled by a covered business and shares common branding with that business.

In short, contrary to the Florida Governor’s implications that the law would apply only to big tech companies, it would in fact apply to many small and medium-sized businesses that collect personal information about Florida residents.

What Rights and Obligations Would HB 969 Create?

HB 969’s obligations are very similar to the CCPA’s.  First, the bill would require that businesses create an online privacy policy that would need to be updated annually. The policy would need to include:

  • any Florida-specific consumer privacy rights;
  • a list of the categories of personal information the business collects or has collected about consumers;
  • a list that identifies which categories of personal information the business sells/shares or has sold/shared;
  • a list that identifies which categories of personal information the business discloses/shares for a business purpose; and,
  • the right to opt out of the sale/sharing to third parties and the ability to request deletion/correction of certain personal information.

The bill would also require businesses to provide a “just-in-time” notice at or before the point of collection that would inform consumers of the categories of personal information to be collected and the purposes for which the categories of personal information will be used. A business would not be allowed to collect additional categories of personal information or use collected personal information for any additional purpose other than those provided in the online privacy notice or the just-in-time notice without first providing the consumer with notice.

Next, HB 969 would require businesses to create and follow a retention schedule that prohibits the use and retention of personal information after the initial purpose for collecting the information has been satisfied, after the expiration of a contract with that consumer, or one year after the consumer’s last interaction with the business, whichever occurs first. (This retention limitation is not the same for biometric information collected for ticketing purposes.)

HB 969 would also create a right for consumers to request from a business, up to twice per year, a copy of the personal data the business has collected about them. The business would have to provide the information free of charge and in a readily usable format that could be easily transferred.  Upon receiving a verified request, the business would need to disclose the specific pieces of personal information the business collected about the consumer, the categories and sources from which it collected the consumer’s personal information, the business or commercial purpose for collecting or selling the consumer’s personal information, and the categories of third parties with which the business shares the consumer’s personal information.

The proposed law would also provide consumers with a right to have their personal information deleted.  The deletion would be required not just by the business, but also by any service provider with whom the business shared the personal information.  There are several exceptions to this right that almost swallow the rule. For example, a business can reject a request to delete if the personal information is needed to complete a transaction, provide a good or service that the business reasonably anticipates the consumer may want, comply with a legal obligation, or use the information internally in a way that is “compatible with the context in which the consumer provided the information.”

Similarly, the bill would provide consumers with a right to request that a business correct inaccurate personal information.  Again, the exceptions limit the rule significantly. Also, while this right is fairly easy to administer in the context of a straightforward request like changing information that is objectively verifiable, what happens in an instance where the personal information is more subjective, or the business believes the consumer is using this right to create an unfair advantage? Who makes the determination of which “version” of personal information is accurate?  Would “both sides of the story” need to be maintained, as is the case with HIPAA’s right to amend?

HB 969 would also create a right to request what personal data has been sold or shared. Specifically, a consumer would have the right to know the categories of personal information that have been sold/shared, the categories of third parties to which the personal information was sold/shared, and the categories of personal information about the consumer that the business disclosed for a business purpose.

The bill would create a right to opt out of the sale/sharing of personal information to third parties. Any business that sells/shares personal information to third parties must provide notice to consumers that the information may be sold/shared and that the consumer has a right to opt out of the sale/sharing of their personal information. Relatedly, third parties are not allowed to sell/share personal information sold/shared with them by a business unless the consumer is provided with explicit notice of the intent to sell/share and has been given an opportunity to opt out.

Additionally, the law would require a business to provide a conspicuous link on its homepage entitled “Do Not Sell or Share My Personal Information” that enables a consumer to opt-out of the sale or sharing of the consumer’s personal information. (A business does not need to put this link on its homepage if it has/creates a separate page dedicated to Florida consumers.) A business cannot require a consumer to create an account as a condition for directing the business not to sell the consumer’s personal information, and a business must wait at least 12 months before asking the consumer to authorize the sale of their information.

Additionally, the bill would create a right to opt in to the sale/sharing of personal information of children.  Specifically, where a business intends to sell/share personal information and it knows that a consumer is under 16 years of age, the business must obtain the child’s consent (if the child is between 13 and 15) or the parent’s/guardian’s consent (if the child is 12 or younger).

Regarding the two rights described above, and as with the CCPA, there are some contours and carve-outs to the definition of a “sale” of personal information. For example, a business does not “sell” personal information if a consumer directs the business to intentionally disclose the personal information to a third party, assuming the third party does not then sell the personal information.  Additionally, a business can share personal information with a service provider for a business purpose if the business provides notice of this activity in its terms and conditions, and the service provider does not further collect, sell, share, or use the personal information except for the business purpose.

Another exception to the definition of “sale/sharing” is the transfer of personal information to a third party as part of a merger, acquisition, bankruptcy, or other transaction. Under this exception, if the new business wants to use the personal information in a way that is materially inconsistent with the previous business’s privacy practices, the new business must provide notice in a “prominent and robust” way.

As with the CCPA, HB 969 would prohibit discrimination against consumers who exercise their rights under the law.  Examples of such discrimination include denying goods or services to the consumer, charging different prices or rates for goods or services (including discounts or other benefits), or providing a different level/quality of goods or services.  The law does, however, allow for financial incentives, like rewards programs and payments for the collection, sale, or deletion of personal information. In that case, the business must first obtain consent that describes the material terms of the financial incentive program, and the consent may be revoked at any time.

Finally, but perhaps most significantly, HB 969 would create a private right of action for data breaches. More information about that is provided below.

Beginning January 1, 2022, a business cannot use an agreement with a consumer to waive or limit any of the above-described rights.

How Must a Business Respond to a Verified Consumer Request?

To meet the bill’s requirements for responding to a consumer’s verified request, the business must make two or more methods available for submitting requests, including a toll-free number and, if the business maintains a website, a link on the website’s homepage.

The business cannot require the consumer to create an account to make a request. Additionally, the business must deliver the responsive information in a readily usable format, free of charge, and within 45 days after receiving the request (this period can be extended under certain circumstances). 

Businesses must ensure that employees responsible for handling consumer inquiries are trained to handle inquiries about the business’s privacy practices and that they know how to direct consumers to exercise their rights under this proposed law.

How Does HB 969 Impact a Business’s Relationship with Third Parties and Service Providers?

HB 969 would impose restrictions on how service providers and other third parties with whom a business may share the consumer’s personal information can use/share that information.  For example, any contract between a business and a service provider or third party must prohibit the service provider or third party from (a) selling/sharing the personal information; (b) using the personal information in a way that is outside the business purposes specified in the contract; (c) disclosing the information to any third party outside the relationship between the business and the service provider or third party; and, (d) combining the personal information it receives from the business with other information it receives about the consumer.

The contract must certify that the entity receiving the personal information will comply with these restrictions.  These same restrictions apply between the service provider or third party and any subcontractor. If a third party, service provider, or subcontractor violates any of these restrictions, they may be held liable for those violations.  In contrast, the business that discloses personal information to a third party or service provider is not liable if, at the time of disclosing the personal information, the business did not know or have reason to believe that the service provider or third party intended to commit such a violation.

What Are The Exceptions To HB 969?

As with the CCPA, there are significant exceptions to the bill’s requirements.  So, for example, a business is not required to comply with HB 969 if doing so would restrict the business’s ability to comply with a different U.S. law, comply with a regulatory inquiry or subpoena, cooperate with law enforcement, or exercise legal rights.

Additionally, HB 969 does not apply to the collection of deidentified or aggregate consumer information. This assumes the business implements safeguards and processes that prohibit reidentification or prevent the inadvertent release of deidentified information, and the business does not attempt to reidentify the information.

HB 969 also carves out significant categories of information and businesses to which the bill would not apply, including employee personal information (where collected for employment purposes); health information, covered entities, and business associates under HIPAA (assuming they’re actually in compliance with HIPAA); information collected as part of a clinical trial; information collected as part of research in the public interest; and, information collected/used pursuant to GLBA, the FCRA, the DPPA, or FERPA.

HB 969’s Private Right of Action

Getting back to the private right of action. Like the CCPA, HB 969 would create a private right of action for data breaches and allow for statutory damages. First, it would broaden the definition of a data breach from “unauthorized access” of personal information (which is how Florida’s breach notification law currently defines a breach) to “unauthorized access and exfiltration, theft, or disclosure” of personal information. In other words, the definition of a data breach would remain the same for the purpose of determining whether notice to affected individuals is required, yet people could sue for data breaches about which companies would not have been required to provide notice. Pretty bizarre.

Consumer privacy organizations will argue that the private right of action would be limited to data breaches that are a result of a business’s violation of the duty to implement and maintain reasonable security procedures, so not all breaches would give rise to a class-action lawsuit. But that limitation is meaningless. Plaintiffs’ lawyers will argue, after filing the lawsuit, that this limitation is a question of fact for a juror at trial, rather than for a judge to decide via a motion to dismiss or motion for summary judgment. So a company will need to incur substantially the same expense fighting the lawsuit as if the limitation never existed.

Individuals would be entitled to seek statutory damages of between $100 and $750 per consumer per incident, or actual damages, whichever is greater. The law also allows plaintiffs to seek injunctive or declaratory relief, which will undoubtedly be an avenue through which plaintiffs with weak claims will try to justify filing their lawsuits, obtaining class representative payouts, and recovering attorney’s fees.
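To put those statutory damages in perspective, here is a back-of-the-envelope sketch of the exposure range. The class size and incident count are hypothetical illustrations, not figures from the bill:

```python
# Illustrative calculation of statutory damages exposure under HB 969's
# private right of action: $100 to $750 per consumer per incident.
def statutory_exposure(num_consumers: int, incidents: int = 1) -> tuple[int, int]:
    """Return the (minimum, maximum) statutory damages in dollars."""
    low_rate, high_rate = 100, 750
    return (low_rate * num_consumers * incidents,
            high_rate * num_consumers * incidents)

# A single breach affecting a hypothetical class of 50,000 consumers:
low, high = statutory_exposure(50_000)
print(f"${low:,} to ${high:,}")  # $5,000,000 to $37,500,000
```

Even a modestly sized class produces eight-figure maximum exposure before actual damages, injunctive relief, or attorney's fees are considered.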

We’ve seen other privacy laws that create private rights of action and the results are never good. The Illinois Biometric Information Privacy Act (BIPA), for example, has resulted in hundreds of class-action lawsuits and millions of dollars in settlements against companies doing business in Illinois. Similarly, there have been hundreds of lawsuits filed under the Telephone Consumer Protection Act (TCPA). These are results that a business-friendly legislature will not want in Florida.

Another Change to The Definition of “Personal Information” Under FIPA

While on the topic of Florida’s breach notification law, HB 969 would broaden the definition of personal information under FIPA by adding biometric information to the list of identifiers that, if accessed without authorization, may be considered a data breach. FIPA would use the broader, CCPA-like definition of biometric information that would include not just physiological characteristics (like retinal scans, fingerprints, and voice prints) but also behavioral patterns like gait or keystroke patterns, and information like sleep, health, and exercise data.

It is likely that such a proposed change to FIPA will have little impact on a company’s data breach notification obligations, as breaches of biometric information are incredibly rare given the low value of what a threat actor actually obtains if he acquires “biometric information.”

The more meaningful impact of adding this amendment to FIPA would be opening the door to lawsuits by plaintiffs’ lawyers who would combine the broader definition of a breach (for the purpose of the private right of action) with Florida’s new definition of personal information (that adds biometric information) to try to create a Florida version of BIPA. For example, companies that “disclose” biometric information to vendors without first obtaining the data subject’s consent might inadvertently subject themselves to the new private right of action. To be clear, this private right of action is not intended by the law and should lose on the ground that such sharing is permitted under the good-faith exception to FIPA.  Nevertheless, that won’t stop plaintiffs’ lawyers from trying.

How will HB 969 be enforced?

If HB 969 were to become law, the new rights/obligations would be enforced by the Florida Office of the Attorney General, which can bring an action against any business, service provider, or other person and seek a penalty of up to $2,500 for each unintentional violation, or $7,500 for each intentional violation. The bill does not define how a violation is calculated. So, for example, it is not clear whether a violation means one-violation-per-act, one-violation-per-consumer, or one-violation-per-provision-of-the-law-alleged-to-be-violated.  In any event, fines may be tripled if the violation involves a consumer under 17 years of age. Additionally, the bill would allow the Florida Attorney General to adopt rules to implement the law.
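The ambiguity in how a "violation" is counted matters enormously to the bottom line. A hypothetical sketch of the three possible counting methods (the consumer count and the three-provision scenario are illustrative assumptions, not from the bill):

```python
# Illustrative comparison of the three possible ways to count "violations"
# under HB 969's enforcement provision.
UNINTENTIONAL, INTENTIONAL = 2_500, 7_500  # per-violation penalties in dollars

def penalty(violations: int, intentional: bool = False,
            minor_involved: bool = False) -> int:
    """Penalty = per-violation fine x violation count; tripled if the
    violation involves a consumer under 17 years of age."""
    fine = INTENTIONAL if intentional else UNINTENTIONAL
    total = fine * violations
    return total * 3 if minor_involved else total

# Hypothetical: one noncompliant privacy notice seen by 10,000 consumers,
# alleged to breach 3 separate provisions of the law.
per_act = penalty(1)                 # one violation per act:      $2,500
per_consumer = penalty(10_000)       # one per consumer:           $25,000,000
per_provision = penalty(10_000 * 3)  # one per consumer/provision: $75,000,000
```

Under these assumptions, the three readings differ by more than four orders of magnitude, which is why clarifying the counting rule would matter to any business assessing its risk.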

If there is any “good” news with respect to enforcement, it is that before being deemed in violation of the law a business would first be provided 30 days to cure any alleged violation. Additionally, as with the CCPA, HB 969’s private right of action is limited to data breaches and does not apply to violations of the other privacy rights created by the proposed law.

When Would The Law Go Into Effect?

January 1, 2022.

What Is The Likelihood HB 969 Will Become Law?

HB 969 faces significant hurdles: the proposed private right of action, the low applicability threshold that captures many small businesses in the bill’s net, and the potentially steep enforcement fines a company may face. Additionally, the high risk of class-action lawsuits may be enough to doom the bill, especially given Florida’s overt attempts to attract new business to the state.  Nevertheless, the Governor appears to be on board with the law, and his party controls both chambers of the Florida Legislature (the other party will certainly have few objections to the law).

I remain optimistic that Florida can be a state that creates privacy consumer rights without providing a boondoggle for the plaintiffs’ bar. I believe that can be achieved here with some changes to HB 969, but only time will tell whether that will actually be the case.



Yesterday, in a 26-page opinion, the 11th U.S. Circuit Court of Appeals weighed in on two important questions in the world of privacy and data breach litigation.  First, does a plaintiff have standing where he was exposed to a substantial risk of future identity theft, even though there was no misuse of his information?  The court’s answer is no. Second, what efforts to mitigate this risk must a plaintiff undertake to meet the standing requirement?  Here, the court held that the plaintiff essentially manufactured his own injuries (wasted time, lost use of his preferred card, and lost credit card benefits) by voluntarily canceling his credit card, which is not enough to confer standing.


The case, Tsao v. Captiva MVP Restaurant Partners, arose from a data breach involving a restaurant chain’s point-of-sale system, which allowed access to the plaintiff’s credit card information. Upon receiving notice of the breach, the plaintiff immediately canceled both credit cards he had used at the restaurant chain, though neither card had experienced fraudulent charges. Next, he filed a class-action lawsuit in the Middle District of Florida claiming that he and other class members suffered theft of their personal information, unauthorized charges on their payment cards, a loss of credit card reward points or cash back, and a temporary inability to accrue points/benefits on their preferred credit cards. He also alleged that the time he had to take to mitigate the impact of the breach was an injury.  His legal theories included breach of implied contract, negligence and negligence per se based on an alleged violation of the “unfair” prong of Section 5 of the FTC Act, unjust enrichment, and a violation of Florida’s unfair and deceptive trade practices law.  The complaint also sought declaratory relief in the form of implementation of a variety of security measures.  The District Court dismissed the complaint for lack of standing.

The Appeal

On appeal, plaintiff argued that: (1) he could suffer future injury from misuse of the credit card information; and (2) the lost time, lost rewards points, and loss of access to his preferred credit cards should be sufficient to confer standing. The 11th Circuit disagreed with both arguments.

The court began its analysis by observing that lost time and a lost “fraction of a vote” can be considered concrete injuries, but such injuries must also be “certainly impending” to confer Article III standing, which was not the case here. The court cited to the principle often relied upon by defendants in privacy litigation – you cannot manufacture standing by inflicting harm on yourself.  For an injury to be “certainly impending” there must be a “substantial risk” that it will occur.

This is the 11th Circuit’s second significant post-Clapper decision on the standing issue. In an earlier decision, Muransky v. Godiva Chocolatier, the court held that merely printing too many digits on credit card receipts (creating an elevated risk of identity theft) did not confer standing on the plaintiffs, even if plaintiffs had spent time destroying or safeguarding receipts to mitigate the elevated risk.

Diving Into The Circuit Split

This opinion observed that circuit courts around the country are divided on the issue of whether a substantial risk of identity theft, fraud, or other harm in the future because of a breach is sufficient to confer standing. The court cited Sixth, Seventh, Ninth, and DC circuit court opinions holding that it does, but the court also cited Second, Third, Fourth, and Eighth circuit court opinions holding that it doesn’t. The court cited First Circuit opinions demonstrating that it had gone both ways on the issue. The court observed, however, that almost all of the cases that conferred standing included some allegation of actual misuse of, or actual access to, personal data, and the case law generally has treated unauthorized access to credit card information as less likely to confer standing than access to other types of information.

Readers interested in learning more about the nature of this split should read this opinion, as the court dives deeply into it. The court does an effective job of shining a light on the split and effectively making the case for the U.S. Supreme Court’s review of the issue.  Readers will also appreciate the court’s analysis of the GAO Report on page 22 that has become commonplace in plaintiffs’ lawyers’ privacy and data security class action complaints these days.

The Court’s Holding

Ultimately, the court held that an increased risk of identity theft, at least as described by the plaintiff in this case, is not enough to confer standing.

The court also held that the conclusory allegation of “unauthorized charges” experienced by the class is not sufficient to confer standing. The plaintiff needed to show “specific evidence of some misuse of class members’ data.”

Finally, the court held that the plaintiff’s immediate cancellation of his credit cards effectively eliminated the risk of credit card fraud in the future. While the court conceded that there was still some risk of identity theft where an unauthorized actor could use the plaintiff’s name, that risk was speculative, not substantial. The court relied on another often-used line by defendants in data breach litigation — “evidence of a mere data breach does not, standing alone, satisfy the requirements of Article III standing.”

Regarding the plaintiff’s actual/present injuries (lost rewards, identity theft protection costs, and restricted access to his cards), the court held that they did not confer standing because they “are inextricably tied to [the plaintiff’s] perception of the actual risk of identity theft” and the injuries were a result of plaintiff’s own voluntary decision to cancel his cards.

The Concurring Opinion

Judge Jordan (one of Florida’s most respected jurists) wrote a concurring opinion given the court’s reliance on Muransky, a case in which Judge Jordan had dissented. He expressed concern that the analysis of whether a substantial risk occurred should not take place at the motion-to-dismiss stage, though he conceded that Muransky sanctioned such an analytical approach.

This procedural question, too, could be an issue upon which the U.S. Supreme Court weighs in if it were to address the broader divide between circuit courts on the standing issue. Indeed, the last line of Judge Jordan’s opinion states, “[h]opefully the Supreme Court will soon grant certiorari in a case presenting the question of Article III standing in a data breach case.”




The Florida Senate and House of Representatives are considering two bills (SB 1670 and HB 963) that, if adopted, will amend Florida law to create the state’s first comprehensive privacy law (though they do not go nearly as far as the CCPA). The proposed amendments would: (1) prohibit the use of personal data in public records maintained by state agencies for unsolicited marketing purposes, and (2) require companies doing business online to provide notice of their personal data collection/use activities and allow consumers to opt out of the sale of that data to third parties.  This article takes a deeper look at the proposed amendments, provides some context for them, and discusses the likelihood that they will become law. (Spoiler alert: the proposed amendments are significant and well-intended, but they currently contain some flaws that, if addressed, would give the amendments a good chance of becoming law.)


Proposed Amendment #1 – Florida’s Public Records Request Laws (the “Marketing Amendment”)

For better or worse, companies are increasingly taking advantage of public records request laws to engage in marketing activities and unsolicited sales requests.  Under Florida’s (and most states’) public records request laws, for example, a company can request public records (e.g., mailing lists) from state and local agencies to obtain telephone numbers, email addresses, and physical addresses that the company can then use for marketing purposes.  I was interviewed about this issue several months ago and the story garnered the interest of state lawmakers.  In theory, the current law can be misused by malicious actors who request this information to take advantage of Florida’s elderly population and engage in fraudulent activity.

The proposed legislation would amend current law by adding the following language to section 119.01 of the Florida Statutes:  “(4) Any public records requested from state agencies that include the personal data, including the name, address, and birthdate, or any portion thereof, of a resident of this state may not be used to market or solicit the sale of products or services to the person or to contact the person for the purpose of marketing or soliciting sales without the consent of the person. Such marketing, soliciting, and contact is prohibited unless the person has affirmatively consented by electronic or paper notification to share the data with a third party before the data is used for such purpose.”

While undoubtedly well-intended, the amendment suffers from a few flaws.  The first is a lack of clarity as to what is considered “personal data.”  The proposed amendment doesn’t define the term.  It merely provides a few examples.  This lack of guidance will make it difficult for a company that acquires data in public records to know whether it can use the data for marketing purposes or not.  Technically, under the proposed language, the personal data doesn’t even have to be identifiable (i.e., allow you to know the person to whom it relates), so a phone number or email address alone, without knowing who it belongs to, may be enough to be considered personal data governed by the law.  This ambiguity can be addressed by including a more specific definition or, at minimum, making clear that personal data means information that is identifiable to a specific individual.

A second potential problem is the timing of the consent requirement.  The proposed amendment requires consent only before marketing/soliciting begins; it doesn’t require consent before a release of the public records.  As a result, there’s no way, in the bill as written, for the state agency to act as a “check” to help enforce the goals this law seeks to create.  This may be on purpose — there are limitations on how a state agency can respond to a public records request and “what do you intend to do with this information?” isn’t a permissible response to a request.

One way to address this issue is for state agencies to condition the release of public records on the requesting parties’ agreement not to use the information for marketing purposes without first obtaining the individual’s consent to do so, such as by requiring the requester to check a box so stating.  (Florida public-records law, Fla. Stat. 119.07, allows “reasonable conditions” on the disclosure of public records – though the allowable scope of conditions is unclear).   This approach could have real teeth because if the company later violates that representation, it could give rise to a misrepresentation or breach of contract claim that a consumer might be able to bring as a third-party beneficiary of the agreement.  In other words, it could create a private right of action that the law as proposed does not currently have.

The lack of a private right of action leads to a third concern:  enforcement challenges.  Currently, the requirement would be enforced entirely by the Florida Attorney General, but it’s unclear how the Florida AG will learn that personal data obtained from public records was actually used for marketing/soliciting purposes.  It seems like a very difficult violation to uncover, short of a whistleblower bringing it to light.


Proposed Amendment #2 – Online Consumer Privacy Rights (the “Notice and Opt-Out” Amendment)

The second, more substantial, change that SB 1670 and HB 963 would make is amending section 501.062 of the Florida Statutes to provide Florida residents with more notice and control over how companies doing business online use their personal data.  In short, “operators” who collect or maintain “covered information” about Florida residents must provide notice about their data collection/use practices and give the consumers an ability to opt out of the current and future sale of that information.  Let’s unpack this proposed amendment:

Who Does The Amendment Apply To? 

The change applies to any “operator”, which is defined as a person (or entity?) that:

(1) owns or operates a website or online service for commercial purposes;

(2) collects and maintains covered information from consumers who reside in this state and use or visit the website or online service;

(3) purposefully directs activities toward this state or purposefully executes a transaction or engages in any activity with this state or a resident thereof.

A couple of problems immediately jump out.  There is no “and” or “or” between #2 and #3, so it is unclear whether an “operator” is a person/entity that meets just one of these three requirements, or if it instead has to meet all three requirements.  Let’s assume it’s the latter, because the definition doesn’t make much sense if one or two of the three elements are missing.

Additionally, the law explicitly states that it does not apply to operators located in Florida.  If the law only applies to companies having no physical presence in Florida, but who collect information from Florida residents online, it could implicate personal jurisdiction issues – particularly given Florida’s strict long-arm personal jurisdiction requirements.  Such issues are not uncommon in the privacy-law context, but they have yet to be litigated.  Again, the lack of an appropriate conjunctive or disjunctive operator makes it unclear whether all “operators” in Florida are exempted or just those that meet other criteria such as less than 20,000 unique website visitors per year (a basically meaningless threshold).

There are some exceptions to the definition of an operator.  For example, a company that operates, hosts, or manages the online service on behalf of an operator or processes information on behalf of the operator, is not governed by this law.  There are also carve-outs for HIPAA- and GLBA-governed entities, as well as for some motor-vehicle manufacturers that retrieve covered information from a technology or service related to the vehicle.

What Type of “Personal Data” Does This Change Apply To?

The proposed amendment applies to “covered information,” which is defined as a first and last name, an address that includes a street and name of city, an email address, a phone number, a social security number, an identifier that allows a consumer to be contacted either physically or online (e.g., a username or screen name), and “any other information concerning a consumer that is collected from the consumer through the website or online service of the operator and maintained by the operator in combination with an identifier in a form that makes the information personally identifiable.”

An initial concern with this definition is potential overbreadth.  Unlike the Florida Information Protection Act (Florida’s data-breach notification law) which requires a name and another element of information, this law does not require both for the definition of covered information to be triggered.  An argument could be made, therefore, that collecting a physical or email address, a phone number, a social security number, or a username, WITHOUT the consumer’s name would still be considered covered information under this definition, which is highly unusual for a United States privacy law.

On the other hand, there are certain elements of what is traditionally covered information that might not be covered information under this definition.  For example, financial information, driver’s license numbers, passport information, or other elements of “personal data” under the Florida Information Protection Act are not considered covered information under this proposed law.  It’s possible that the last category of “covered information” (information concerning a consumer collected through the online service in combination with an identifier that makes the consumer identifiable) would cover those data elements, though such data elements alone, without a link to a specific individual, would not be considered covered information.

What Are An “Operator’s” Obligations?

The proposed amendment would impose opt-out and notice obligations on operators.  First, a consumer can request to opt out of the operator’s current or future sale of their covered information to a third party.  (NOTE:  unlike the overly-broad definition of a “consumer” under the CCPA that includes any resident of California, the proposed amendment applies a more conventional meaning of a consumer as an individual who seeks or acquires goods or services).

The consumer’s opt-out request must be verified, meaning that the operator can reasonably confirm the authenticity of the request, which makes sense for security purposes.  To that end, the operator has to establish a designated request address through which a consumer can submit a verified request.  The operator has to respond to the request within 60 days (with a 30-day extension available).

In addition to the right to opt out, the operator must provide notice (the method is not prescribed) that:

  • Identifies the categories of covered information the operator collects about consumers;
  • Identifies the categories of third parties with whom the operator may share such covered information;
  • Provides a description of the process for a consumer to review and request changes to his or her covered information;
  • Describes the process by which the operator notifies consumers of material changes to the notice;
  • Discloses whether a third party may collect covered information about a consumer’s online activities over time and across different websites or online services when the consumer uses the operator’s website or online service; and,
  • States the effective date of the notice.

These notice requirements are nothing new to state privacy laws, as they closely mirror those that CalOPPA imposed a number of years ago, but they’re new under Florida law.

How Will The Proposed Notice and Opt-Out Amendment Be Enforced?

The amendment states that it does not create a private right of action against an operator. Instead, it will be enforced by the Florida Attorney General, who must adopt rules to do so.

An operator must first be given 30 days to try to cure the alleged violation, though no right to cure will apply where a notice makes a knowing and material misrepresentation or omission to the detriment of a consumer.

The proposed legislation would allow for injunctive relief or civil penalties not to exceed $5,000 for “each violation.”  It’s not clear how the term violation will be applied: is that per incident, per consumer, per day, or per transaction?  Notably, privacy laws in California, Illinois, and other jurisdictions suffer from this same lack of clarity.  The Sedona Conference’s Working Group on Privacy and Data Security Liability is working on a commentary that will hopefully provide guidance on this issue.


What Is The Likelihood The Proposed Amendments Will Become Law?

The fact that SB 1670 and HB 963 are decidedly less comprehensive than the CCPA was likely a strategic decision: a CCPA-like law would have little chance of becoming law in Florida, given the current composition of the Legislature and Governor’s seat.  Nevertheless, the idea of giving Florida residents more control and notice over their online personal data, and limiting the barrage of unsolicited calls, texts, emails, and mail, will appeal to most Floridians.

Some businesses may push back against the legislation, and the degree of pushback will likely be the controlling factor in the amendments’ fate.  But the fact that these bills were introduced by Republican legislators and lack a private right of action means they have a much better shot of passing than the proposed biometric privacy law.

In short, there’s a good chance we could see some or all of the proposed legislation become law.  At the very least, it’s a development that should stay on your radar, particularly to the extent your business (i) uses public records as a source of information for marketing purposes and/or (ii) could be considered an “operator.”


DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone, and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site is for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

An identical version of the Illinois Biometric Information Privacy Act (BIPA) has been introduced in the Florida Senate.  The bill includes the same private right of action.  The Illinois BIPA has become an enormous revenue earner for the plaintiffs’ bar, which has filed gotcha lawsuits against companies seeking millions of dollars on the ground that the companies did not comply with all of the technical requirements of the law.  I suspect a similar driving force is behind the Florida version.

Like the Illinois version, the Florida version does not take into consideration how biometric technology actually works.  It is based on a misunderstanding that devices using biometrics for authentication are storing libraries of fingerprints, iris scans, etc., that can be hacked, stolen, and misused.  Hopefully, if the legislation proceeds to receive serious consideration, the Florida Legislature will hold hearings to learn more about how biometric technology works, the massive liability a bill like this could create for companies doing business in Florida, and the possibility that companies will stop using what is the most secure form of authentication out of a fear of liability.

Given the politically conservative makeup of the Florida state legislature and Governor’s office, I do not anticipate that this bill will become law.  Nevertheless, it is under-the-radar legislation that has not received media attention, and we should monitor it moving forward.

The Illinois Supreme Court’s decision last week in Rosenbach v. Six Flags may have closed the first of what will be several chapters in class action litigation arising from the Illinois Biometric Information Privacy Act (BIPA).  The court addressed the very narrow issue of what it means for a person to be “aggrieved” under BIPA.  Ultimately, the court held that a violation of the notice, consent, disclosure, or other requirements of BIPA alone, without proof of actual harm, is sufficient for a person to be considered “aggrieved” by a violation of the law.

There are several important issues, however, that were not before the court and remain to be litigated.  One of those issues is implied notice and consent. Defendants will argue that plaintiffs who checked in and out at work using fingerscan timekeeping systems (the fact pattern in nearly all of the almost 200 class action lawsuits filed in state court) knew that the fingerscans were being collected and used by their employers for timekeeping purposes, and that they voluntarily provided that information.

Federal courts have dismissed such lawsuits, reasoning that plaintiffs effectively received notice and gave consent.  In Howe v. Speedway LLC, for example, the court in a fingerscan timekeeping case held that the plaintiff’s “fingerprints were collected in circumstances under which any reasonable person should have known that his biometric data was being collected.”  Similarly, in Santana v. Take-Two Interactive Software, Inc., the U.S. Court of Appeals for the Second Circuit held that plaintiffs essentially received the notice and consent contemplated by BIPA because “the plaintiffs, at the very least, understood that Take-Two had to collect data based upon their faces in order to create the personalized basketball avatars, and that a derivative of the data would be stored in the resulting digital faces of those avatars so long as those avatars existed.”  And in dismissing for lack of standing, the McGinnis court reasoned that the plaintiff “knew his fingerprints were being collected because he scanned them in every time he clocked in or out of work.”

Another significant defense is constitutional standing.  Federal courts have recently dismissed BIPA lawsuits on the ground that they do not meet Article III standing requirements.  Defendants in state court will argue that Illinois constitutional standing (which Illinois state courts have held should be similar to federal law) requires a level of harm that, at a minimum, should be what Article III of the U.S. Constitution requires. To hold otherwise would lead to a different result for a party based entirely on whether the lawsuit is filed in federal or state court.

Defendants will argue that most of the claims are barred by the one-year statute of limitations that applies to claims involving the right of privacy.  Assuming that the one-year statute of limitations is applied, the classes of affected individuals will shrink considerably.

Defendants will also contend that the information collected/stored by the timekeeping devices is not considered biometric information under BIPA.  There is no library of fingerprints stored by these timekeeping devices.  Instead, the devices measure minutiae points and convert those measurements into mathematical representations using a proprietary formula that cannot be used to create a fingerprint.  More security is layered on top of that — the mathematical representation is encrypted.  For these reasons, no plaintiff in any of these biometric cases has been able to point to a single data breach involving biometric information.  The technology is essentially tokenization (similar to Apple Pay), where if a hacker were to access the actual device, he’d find nothing there to steal because the valuable thing (the credit card number or, in this case, fingerprint) is not stored on the device but is instead replaced by a numerical representation.
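To make the minutiae-to-template argument concrete, here is a deliberately simplified sketch of the general idea. This is purely illustrative: real devices use proprietary, vendor-specific algorithms and fuzzy matching rather than exact hashes, and the function names and use of SHA-256 here are my own assumptions, not any vendor’s actual implementation.

```python
import hashlib

def extract_minutiae(fingerprint_scan: bytes) -> list[tuple[int, int]]:
    """Stand-in for a scanner's minutiae extraction. A real device measures
    ridge endings and bifurcations as coordinate points; here we just derive
    fake "coordinates" deterministically from the scan bytes for illustration."""
    digest = hashlib.sha256(fingerprint_scan).digest()
    return [(digest[i], digest[i + 1]) for i in range(0, 8, 2)]

def to_template(minutiae: list[tuple[int, int]]) -> str:
    """One-way mathematical representation: the hash cannot be reversed
    back into a fingerprint image."""
    encoded = ",".join(f"{x}:{y}" for x, y in minutiae)
    return hashlib.sha256(encoded.encode()).hexdigest()

# Enrollment: the device stores only the irreversible template,
# never an image of the fingerprint itself.
enrolled = to_template(extract_minutiae(b"alice-scan"))

# Clock-in: a fresh scan is reduced the same way and compared.
# (Real systems tolerate small variations between scans; this toy
# version requires identical input.)
attempt = to_template(extract_minutiae(b"alice-scan"))
print(attempt == enrolled)  # True: templates match, yet no fingerprint is stored
```

The point of the sketch is the asymmetry the defense relies on: the stored template is useful for matching but useless for reconstructing the underlying biometric.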

Plaintiffs will also have to prove that the defendants didn’t just violate BIPA, but did so negligently or intentionally.  This is not an easy standard to meet, especially if the trier of fact determines that these are “gotcha” lawsuits, meant to catch companies off-guard about a little-known and rarely used state law.

Assuming the plaintiffs jump all these hurdles, they must still demonstrate that these cases are appropriate for class certification. The cases involve different facts regarding whether individual plaintiffs received notice, whether they gave consent, whether they used the fingerscan method of authentication or another method like a PIN or RFID card, whether they enrolled in Illinois, and whether their claim involves a violation of BIPA beyond collection or storage. Given these differences between plaintiffs, it will be difficult for them to meet the commonality and fairness requirements for class certification.

To be sure, some Defendants will face their own challenges.  A line of cases has held that where companies used their time-clock provider’s cloud service to store or back up timekeeping information from the clock, they may be in violation of BIPA’s prohibition against disclosure of biometric identifiers to a third party.  But at least one court has disagreed with that logic, stating that not all disclosures to a third party automatically present a concrete injury, and whether the third party has strong protocols and practices in place to protect data is relevant to the inquiry.

Defendants need only win one of these (or several other) defenses.  Plaintiffs must win them all.  In the meantime, plaintiffs must hope that the Illinois legislature does not notice the hundreds of BIPA lawsuits flooding the Illinois state court system and creating potentially crippling liability for companies that tried to adopt more secure methods of authentication; if it did, the legislature might amend the law to make it more consistent with its original intent.


On Friday afternoon an Illinois intermediate appellate court decided that the bar for a plaintiff bringing a class action lawsuit under the Illinois Biometric Information Privacy Act (BIPA) is low, creating a conflict with its sister intermediate appellate court. The Illinois Supreme Court is expected to resolve the conflict early next year. How the court resolves the conflict will significantly impact companies doing business in Illinois.


BIPA requires companies to provide notice and obtain consent from Illinois residents before collecting their biometric information. It also limits what companies can do with biometric information and requires the adoption of certain security safeguards. Any person “aggrieved by a violation” of the law may sue for actual damages or statutory damages ranging from $1,000 to $5,000 per violation. You can learn more about BIPA from my earlier blog post.

Beginning in the fall of 2017, Illinois businesses of all sizes were hit with “gotcha” class action lawsuits brought by former employees looking for reasons to sue their former employers. Those companies used timekeeping systems that required employees to scan their fingers to punch in and out of work. Ironically, the timekeeping systems improved security by reducing fraud and strengthening authentication. Nevertheless, many companies were not aware of BIPA or the possibility that it might apply to their timekeeping systems. The plaintiffs’ bar was quick to pounce. Over 150 class actions were filed by former employees claiming that they did not receive BIPA’s requisite notice and consent (despite the fact the employees voluntarily placed their fingers on these devices every day). The lawsuits in aggregate seek tens of millions of dollars from companies doing business in Illinois.

Requisite Harm for a Private Cause of Action

A key question in the BIPA litigation is what it means to be “aggrieved by a violation.” Is it enough that an employee doesn’t receive notice and consent, or must they show that they suffered some actual harm (e.g., financial loss or identity theft) as a result of the violation, as would be necessary in a typical data breach lawsuit?

In December of 2017, the Illinois Appellate Court (Second District) in Rosenbach v. Six Flags Entertainment Corp. held that a person aggrieved must allege some actual injury, adverse effect, or harm. The outcome makes sense because BIPA does not say that the data subject can sue “for a violation.” It requires two things: a violation of BIPA and that someone be aggrieved.

Nevertheless, last week the Illinois Appellate Court (First District) weighed in on the issue and reached an opposite conclusion, holding that a mere violation of BIPA, without additional harm, is all that is necessary to meet the “aggrieved by” standard for a private cause of action. The case, Sekura v. Krishna Schaumburg Tan, Inc., was brought against a tanning salon that used finger scans to admit members into its salons. The court rejected its sister court’s ruling in Rosenbach and held that aggrieved means only the deprivation of a legal right. The court further held that disclosure of biometric information to a third party (e.g., storing the information in the cloud) was sufficient to meet the “aggrieved by” standard, as was an allegation of mental injury. In short, the bar for meeting the “aggrieved by” standard, according to the First District’s conclusion, should be incredibly low.

What’s Next and When?

Presumably, the Sekura decision will be appealed quickly and joined with the Rosenbach case already pending at the Illinois Supreme Court. It is unclear what impact Sekura will have on the timing of a ruling from the Supreme Court on the issue, as briefing in the Rosenbach case was finished in September and the parties were simply awaiting the scheduling of an oral argument. It’s possible the court will wait for briefing to be perfected in the Sekura case before scheduling oral argument, or an expedited briefing process may take place because the issues in the two cases are so similar.

Substantively, one of the most significant consequences of the Sekura decision is that it could give the Illinois Supreme Court something to cite if it were inclined to reverse Rosenbach. I would argue that the reasoning in Rosenbach actually appears stronger in contrast to the Sekura decision. For example, the Sekura analogy of disclosing encrypted biometric information to a third party as equivalent to a disclosure of whether someone has AIDS under the AIDS Confidentiality Act is misplaced. Similarly, the Sekura reasoning renders the words “aggrieved by” meaningless, because under the decision a mere violation of the statute is all that is necessary to bring a private cause of action.

A Final Observation

Most concerning to me about the BIPA litigation generally is that it appears to be based on an unfounded fear and misunderstanding of the underlying technology companies use to collect, store, and share the subject information. Businesses are not collecting, storing, or sharing images of fingerprints, which might be accessed without permission and/or potentially misused. The finger scanning machines in question measure minutiae points and turn them into mathematical representations, which cannot be reverse engineered into a fingerprint. As a belt on these suspenders, the information is encrypted.

Two facts in the biometric privacy context are particularly telling and dispositive. First, no plaintiff or amici in any briefing in the more than 150 BIPA class actions has identified an example where biometric information was compromised. Why? Because the manner in which the finger scan information is collected is much like tokenization (a technology companies use to replace credit card numbers with valueless characters) – if a bad guy breaks in, all he can steal is a random set of characters that have no value.
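Tokenization, as referenced above, can be illustrated with a short, simplified sketch. The vault structure and function names below are hypothetical, not any payment provider’s or device maker’s actual design; the point is only to show why a stolen token is worthless on its own.

```python
import secrets

# Hypothetical token vault: only the secure system holds the mapping
# from random tokens back to real values.
_vault: dict[str, str] = {}

def tokenize(card_number: str) -> str:
    """Replace a sensitive value with a random, valueless token."""
    token = secrets.token_hex(8)      # 16 random hex characters
    _vault[token] = card_number       # the real value lives only in the vault
    return token

def detokenize(token: str) -> str:
    """Only the vault holder can recover the original value."""
    return _vault[token]

token = tokenize("4111 1111 1111 1111")

# A merchant, or a hacker who breaches the merchant, sees only the token,
# which carries no information about the underlying card number.
print(token)
print(detokenize(token))  # only the vault can map the token back
```

The same logic underlies the argument in the post: if the stored artifact is a substitute with no intrinsic value, breaching the device yields “a random set of characters” rather than the sensitive original.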

Another important fact: all state data breach notification laws exempt encrypted information from the definition of personal information and the obligation to notify if it is the subject of a data breach. Why? Because there is no risk that a hacker can access the information and misuse it. Here, the subject information is encrypted so there is no risk of harm to the individuals bringing these lawsuits. The lawsuits are instead based on an unfounded fear of what could happen.

I wonder what impact a fuller explanation of the technology would have on the outcome of these cases. In the meantime, companies continue to spend significant sums of money defending these lawsuits and face the risk of millions of dollars in potential liability.

In three months, the EU’s General Data Protection Regulation (GDPR), one of the strictest privacy laws in the world, will go into effect.  It will apply to companies that collect or process personal data of EU residents, regardless of whether the company is physically located in the EU.  Companies that violate the law face fines of up to 4% of their annual worldwide revenue for the preceding financial year or €20,000,000, whichever is greater.  Is your organization ready?

Shook’s Privacy and Data Security Team regularly counsels multinational companies to comply with international privacy laws like the GDPR.  In an effort to help in-house lawyers understand whether the GDPR applies to their organizations and how to minimize its risks, we have prepared a webinar that provides tips on developing a GDPR compliance program.  The webinar is on-demand and complimentary.  Check it out here, and feel free to leave comments.


Does your company collect biometric information?  Are you not entirely sure what “biometric information” means?  Would you like to understand the differences between the different state biometric privacy laws?  Do you want to know why more than 50 companies were hit with class action lawsuits within a period of three months as a result of their biometric privacy practices?

If the answer to any of these questions is “yes” then check out this complimentary, on-demand webinar on Biometric Privacy prepared by Shook’s Privacy and Data Security team.  Then feel free to get in touch with any of the members of our Biometric Privacy Task Force (contact information at the end of the webinar).  Feel free to leave comments below.