Published by Al Saikali

HB 969, a comprehensive privacy bill that would immediately become the most onerous privacy law in the United States, sailed through the Florida House of Representatives' Regulatory Reform Subcommittee yesterday. This post provides a recap of that hearing and my takeaway on the bill's momentum.

The following post was prepared by a guest contributor: my friend, my brother-in-arms, and newly minted Partner in Shook's Privacy and Data Security Practice, Colman McCarthy.

For what seemed like an eternity (okay, just a couple of years), the California Consumer Privacy Act was the only game in town when it came to state-level, comprehensive privacy legislation. Sure, we saw many other states introduce similar bills, and Washington got close a couple of times to passing the Washington Privacy Act. Those efforts all died on the vine, however. In fact, the only state to pass anything really big after California was California itself, with the California Privacy Rights Act (which amends the CCPA and is not a separate, new law) gaining passage in the November 2020 election.

To all of which Virginia has recently stepped forward and said, "Hold my…authenticated consumer request." The Virginia Consumer Data Protection Act (or VACDPA, as I prefer to call it) is a comprehensive privacy bill that was just signed into law by Governor Northam on March 2, and it shows influences from both the CCPA and Europe's General Data Protection Regulation.

What do you need to know about VACDPA, beyond the fact that it’s fun to say out loud? Probably the most important fact is that it won’t go into effect until January 1, 2023. That gives entities a long runway to understand their obligations under the law and get into compliance.

So, what about all the other stuff? Well…

The Florida Legislature is considering a comprehensive privacy law (HB 969) that would fundamentally change the landscape of how/whether companies do business in Florida. The bill is largely a "cut-and-paste" of the California Consumer Privacy Act (CCPA), but in some ways it goes further than the CCPA and would make Florida's law the most aggressive privacy law in the United States. As I have previously described, the bill would create significant privacy rights for Florida residents, including the right to know what personal information companies are collecting about them, the source of that information, and how the information is being shared, as well as a right to request a copy of that information and a right to delete/correct that information. But the bill goes too far, placing a crushing financial burden on most small and medium-sized businesses and creating a private right of action that dwarfs California's version. This post analyzes the five most significant problems with HB 969 and proposes solutions.

Yesterday, the Governor of Florida threw his support behind a newly introduced consumer data privacy bill (HB 969) which is very similar to the California Consumer Privacy Act of 2018. The Governor's support is a significant development given that he and both chambers of the Florida Legislature are Republican and, to date, there has not been any aligned support for a privacy law since the Florida Information Protection Act (FIPA), Florida's data breach notification law. Nevertheless, as with the CCPA, the bill proposes a boondoggle for the plaintiffs' bar in the form of a private right of action for data breaches and statutory damages, which could present a significant obstacle to passage in the bill's current form, particularly for a fairly business-friendly Florida Legislature.

Yesterday, in a 26-page opinion, the 11th U.S. Circuit Court of Appeals weighed in on two important questions in the world of privacy and data breach litigation. First, does a plaintiff have standing where he was exposed to a substantial risk of future identity theft, even though there was no misuse of his information? The court's answer is no. Second, what efforts to mitigate this risk does a plaintiff need to undertake to meet the standing requirement? Here, the court held that the plaintiff essentially manufactured his own injuries (wasted time, lost use of his preferred card, and lost credit card benefits) by voluntarily canceling his credit card, which is not enough to confer standing.

The Florida Senate and House of Representatives are considering two bills (SB 1670 and HB 963) that, if adopted, would amend Florida law to create the state's first comprehensive privacy law (though they do not go nearly as far as the CCPA). The proposed amendments would: (1) prohibit the use of personal data in public records maintained by state agencies for unsolicited marketing purposes, and (2) require companies doing business online to provide notice of their personal data collection/use activities and allow consumers to opt out of the sale of that data to third parties. This article takes a deeper look at the proposed amendments, provides some context for them, and discusses the likelihood that they will become law. (Spoiler alert: the proposed amendments are significant and well-intended, but they currently contain some flaws that, if addressed, would give the amendments a good chance of becoming law.)

 

Proposed Amendment #1 – Florida’s Public Records Request Laws (the “Marketing Amendment”)

For better or worse, companies are increasingly taking advantage of public records request laws to engage in marketing activities and unsolicited sales requests.  Under Florida’s (and most states’) public records request laws, for example, a company can request public records (e.g., mailing lists) from state and local agencies to obtain telephone numbers, email addresses, and physical addresses that the company can then use for marketing purposes.  I was interviewed about this issue several months ago and the story garnered the interest of state lawmakers.  In theory, the current law can be misused by malicious actors who request this information to take advantage of Florida’s elderly population and engage in fraudulent activity.

The proposed legislation would amend current law by adding the following language to section 119.01 of the Florida Statutes:  “(4) Any public records requested from state agencies that include the personal data, including the name, address, and birthdate, or any portion thereof, of a resident of this state may not be used to market or solicit the sale of products or services to the person or to contact the person for the purpose of marketing or soliciting sales without the consent of the person. Such marketing, soliciting, and contact is prohibited unless the person has affirmatively consented by electronic or paper notification to share the data with a third party before the data is used for such purpose.”

While undoubtedly well-intended, the amendment suffers from a few flaws. The first is a lack of clarity as to what is considered "personal data." The proposed amendment doesn't define the term; it merely provides a few examples. This lack of guidance will make it difficult for a company that acquires data in public records to know whether it can use the data for marketing purposes. Technically, under the proposed language, the personal data doesn't even have to be identifiable (i.e., it need not allow you to know the person to whom it relates), so a phone number or email address alone, without knowing who it belongs to, may be enough to be considered personal data governed by the law. This ambiguity could be addressed by including a more specific definition or, at minimum, making clear that personal data means information that is identifiable to a specific individual.

A second potential problem is the timing of the consent requirement. The proposed amendment requires consent only before marketing/soliciting begins; it doesn't require consent before the release of the public records. As a result, there's no way, in the bill as written, for the state agency to act as a "check" to help enforce the goals this law seeks to achieve. This may be on purpose: there are limitations on how a state agency can respond to a public records request, and "what do you intend to do with this information?" isn't a permissible response to a request.

One way to address this issue is for state agencies to condition the release of public records on the requesting parties’ agreement not to use the information for marketing purposes without first obtaining the individual’s consent to do so, such as by requiring the requester to check a box so stating.  (Florida public-records law, Fla. Stat. 119.07, allows “reasonable conditions” on the disclosure of public records – though the allowable scope of conditions is unclear).   This approach could have real teeth because if the company later violates that representation, it could give rise to a misrepresentation or breach of contract claim that a consumer might be able to bring as a third-party beneficiary of the agreement.  In other words, it could create a private right of action that the law as proposed does not currently have.

The lack of a private right of action leads to a third concern:  enforcement challenges.  Currently, the requirement would be enforced entirely by the Florida Attorney General, but it’s unclear how the Florida AG will learn that personal data obtained from public records was actually used for marketing/soliciting purposes.  It seems like a very difficult violation to uncover, short of a whistleblower bringing it to light.

 

Proposed Amendment #2 – Online Consumer Privacy Rights (the “Notice and Opt-Out” Amendment)

The second, more substantial change that SB 1670 and HB 963 would make is amending section 501.062 of the Florida Statutes to provide Florida residents with more notice of and control over how companies doing business online use their personal data. In short, "operators" who collect or maintain "covered information" about Florida residents must provide notice about their data collection/use practices and give consumers the ability to opt out of the current and future sale of that information. Let's unpack this proposed amendment:

Who Does The Amendment Apply To? 

The change applies to any “operator”, which is defined as a person (or entity?) that:

(1) owns or operates a website or online service for commercial purposes;

(2) collects and maintains covered information from consumers who reside in this state and use or visit the website or online service;

(3) purposefully directs activities toward this state or purposefully executes a transaction or engages in any activity with this state or a resident thereof.

A couple of problems immediately jump out.  There is no “and” or “or” between #2 and #3, so it is unclear whether an “operator” is a person/entity that meets just one of these three requirements, or if it instead has to meet all three requirements.  Let’s assume it’s the latter, because the definition doesn’t make much sense if one or two of the three elements are missing.

Additionally, the law explicitly states that it does not apply to operators located in Florida. If the law only applies to companies that have no physical presence in Florida but collect information from Florida residents online, it could implicate personal jurisdiction issues, particularly given Florida's strict long-arm personal jurisdiction requirements. Such issues are not uncommon in the privacy-law context, but they have yet to be litigated. Again, the lack of an "and" or "or" between the elements makes it unclear whether all operators located in Florida are exempted or only those that also meet other criteria, such as fewer than 20,000 unique website visitors per year (a basically meaningless threshold).

There are some exceptions to the definition of an operator.  For example, a company that operates, hosts, or manages the online service on behalf of an operator or processes information on behalf of the operator, is not governed by this law.  There are also carve-outs for HIPAA- and GLBA-governed entities, as well as for some motor-vehicle manufacturers that retrieve covered information from a technology or service related to the vehicle.

What Type of “Personal Data” Does This Change Apply To?

The proposed amendment applies to “covered information,” which is defined as a first and last name, an address that includes a street and name of city, an email address, a phone number, a social security number, an identifier that allows a consumer to be contacted either physically or online (e.g., a username or screen name), and “any other information concerning a consumer that is collected from the consumer through the website or online service of the operator and maintained by the operator in combination with an identifier in a form that makes the information personally identifiable.”

An initial concern with this definition is potential overbreadth. Unlike the Florida Information Protection Act (Florida's data-breach notification law), which requires a name plus another data element, this law does not require both for the definition of covered information to be triggered. An argument could be made, therefore, that collecting a physical or email address, a phone number, a social security number, or a username, WITHOUT the consumer's name, would still be considered covered information under this definition, which is highly unusual for a United States privacy law.

On the other hand, certain elements of what is traditionally considered personal information might not be covered information under this definition. For example, financial information, driver's license numbers, passport information, and other elements of "personal data" under the Florida Information Protection Act are not considered covered information under this proposed law. It's possible that the last category of "covered information" (information concerning a consumer collected through the online service in combination with an identifier that makes the consumer identifiable) would capture those data elements, though those elements alone, without a link to a specific individual, would not be considered covered information.

What Are An “Operator’s” Obligations?

The proposed amendment would impose opt-out and notice obligations on operators.  First, a consumer can request to opt out of the operator’s current or future sale of their covered information to a third party.  (NOTE:  unlike the overly-broad definition of a “consumer” under the CCPA that includes any resident of California, the proposed amendment applies a more conventional meaning of a consumer as an individual who seeks or acquires goods or services).

The consumer’s opt-out request must be verified, meaning that the operator can reasonably confirm the authenticity of the request, which makes sense for security purposes.  To that end, the operator has to establish a designated request address through which a consumer can submit a verified request.  The operator has to respond to the request within 60 days (with a 30-day extension available).

In addition to the right to opt out, the operator must provide notice (the method is not prescribed) that:

  • Identifies the categories of covered information the operator collects about consumers;
  • Identifies the categories of third parties with whom the operator may share such covered information;
  • Provides a description of the process for a consumer to review and request changes to his or her covered information;
  • Describes the process by which the operator notifies consumers of material changes to the notice;
  • Discloses whether a third party may collect covered information about a consumer’s online activities over time and across different websites or online services when the consumer uses the operator’s website or online service; and,
  • States the effective date of the notice.

These notice requirements are nothing new to state privacy laws, as they closely mirror those that CalOPPA imposed a number of years ago, but they’re new under Florida law.

How Will The Proposed Notice and Opt-Out Amendment Be Enforced?

The amendment states that it does not create a private right of action against an operator. Instead, it will be enforced by the Florida Attorney General, who must adopt rules to do so.

An operator must first be given 30 days to try to cure the alleged violation, though no right to cure will apply where a notice makes a knowing and material misrepresentation or omission to the detriment of a consumer.

The proposed legislation would allow for injunctive relief or civil penalties not to exceed $5,000 for “each violation.”  It’s not clear how the term violation will be applied: is that per incident, per consumer, per day, or per transaction?  Notably, privacy laws in California, Illinois, and other jurisdictions suffer from this same lack of clarity.  The Sedona Conference’s Working Group on Privacy and Data Security Liability is working on a commentary that will hopefully provide guidance on this issue.

 

What Is The Likelihood The Proposed Amendments Will Become Law?

The fact that SB 1670 and HB 963 are decidedly less comprehensive than the CCPA was likely a strategic decision: a CCPA-like law would have little chance of becoming law in Florida, given the current composition of the Legislature and Governor’s seat.  Nevertheless, the idea of giving Florida residents more control and notice over their online personal data, and limiting the barrage of unsolicited calls, texts, emails, and mail, will appeal to most Floridians.

Some businesses may push back against the legislation, and the degree of that pushback will likely be the controlling factor in the amendments' fate. But the fact that these bills were introduced by Republican legislators and lack a private right of action means they have a much better shot at passing than the proposed biometric privacy law.

In short, there’s a good chance we could see some or all of the proposed legislation become law.  At the very least, it’s a development that should stay on your radar, particularly to the extent your business (i) uses public records as a source of information for marketing purposes and/or (ii) could be considered an “operator.”

 

DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone, and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site is for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

An identical version of the Illinois Biometric Information Privacy Act (BIPA) has been introduced in the Florida Senate. The bill includes the same private right of action. The Illinois BIPA has become an enormous revenue earner for the plaintiffs' bar, which has filed "gotcha" lawsuits against companies seeking millions of dollars on the ground that the companies did not comply with all of the technical requirements of the law. I suspect a similar motive is driving the Florida version.

Like the Illinois version, the Florida version does not take into consideration how biometric technology actually works. It is based on a misunderstanding that devices using biometrics for authentication are storing libraries of fingerprints, iris scans, etc., that can be hacked, stolen, and misused. Hopefully, if the legislation receives serious consideration, the Florida Legislature will hold hearings to learn more about how biometric technology works, the massive liability a bill like this could create for companies doing business in Florida, and the possibility that companies will stop using what is the most secure form of authentication out of a fear of liability.

Given the politically conservative makeup of the Florida state legislature and Governor's office, I do not anticipate that this bill will become law. Nevertheless, it is definitely under-the-radar legislation that has not received attention in the media, and we should monitor it moving forward.

The Illinois Supreme Court’s decision last week in Rosenbach v. Six Flags may have closed the first of what will be several chapters in class action litigation arising from the Illinois Biometric Information Privacy Act (BIPA).  The court addressed the very narrow issue of what it means for a person to be “aggrieved” under BIPA.  Ultimately, the court held that a violation of the notice, consent, disclosure, or other requirements of BIPA alone, without proof of actual harm, is sufficient for a person to be considered “aggrieved” by a violation of the law.

There are several important issues, however, that were not before the court and remain to be litigated. One of those issues is implied notice and consent. Defendants will argue that the plaintiffs who checked in and out at work using fingerscan timekeeping systems (the fact pattern in nearly all of the almost 200 class action lawsuits filed in state court) knew that the fingerscans were being collected and used by their employers for timekeeping purposes, and that they voluntarily provided that information.

Federal courts have dismissed such lawsuits, reasoning that plaintiffs effectively received notice and gave consent. In Howe v. Speedway LLC, for example, the court in a fingerscan timekeeping case held that the plaintiff's "fingerprints were collected in circumstances under which any reasonable person should have known that his biometric data was being collected." Similarly, in Santana v. Take-Two Interactive Software, Inc., the U.S. Court of Appeals for the Second Circuit held that plaintiffs essentially received the notice and consent contemplated by BIPA because "the plaintiffs, at the very least, understood that Take-Two had to collect data based upon their faces in order to create the personalized basketball avatars, and that a derivative of the data would be stored in the resulting digital faces of those avatars so long as those avatars existed." In dismissing for lack of standing, the McGinnis court reasoned that the plaintiff "knew his fingerprints were being collected because he scanned them in every time he clocked in or out of work."

Another significant defense is constitutional standing.  Federal courts have recently dismissed BIPA lawsuits on the ground that they do not meet Article III standing requirements.  Defendants in state court will argue that Illinois constitutional standing (which Illinois state courts have held should be similar to federal law) requires a level of harm that, at a minimum, should be what Article III of the U.S. Constitution requires. To hold otherwise would lead to a different result for a party based entirely on whether the lawsuit is filed in federal or state court.

Defendants will argue that most of the claims are barred by the one-year statute of limitations that applies to claims involving the right of privacy.  Assuming that the one-year statute of limitations is applied, the classes of affected individuals will shrink considerably.

Defendants will also contend that the information collected and stored by the timekeeping devices is not biometric information under BIPA. There is no library of fingerprints stored by these timekeeping devices. Instead, the devices measure minutiae points and convert those measurements into mathematical representations using a proprietary formula that cannot be used to recreate a fingerprint. More security is layered on top of that: the mathematical representation is encrypted. For these reasons, no plaintiff in any of these biometric cases has been able to point to a single data breach involving biometric information. The technology is essentially tokenization (similar to Apple Pay): if a hacker were to access the actual device, he'd find nothing there to steal, because the valuable thing (the credit card number or, in this case, the fingerprint) is not stored on the device but is instead replaced by a numerical representation.
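To make that point a little more concrete, here is a deliberately simplified sketch, in Python, of the kind of pipeline described above. It is not any vendor's actual algorithm: real systems use proprietary template formats and fuzzy matching rather than exact hashes, and the function names, minutiae format, and use of the third-party "cryptography" package here are purely illustrative. What it shows is the concept at issue in these cases: the device stores only a one-way mathematical representation, wrapped in encryption, from which no fingerprint image can be reconstructed.

```python
# Illustrative sketch only; not any vendor's actual algorithm.
# Real systems use proprietary templates and fuzzy matching, not exact hashes.
import hashlib
import secrets
from cryptography.fernet import Fernet  # assumes the third-party 'cryptography' package is installed

def to_template(minutiae_points, salt: bytes) -> bytes:
    """One-way mathematical representation: cannot be reversed into a fingerprint image."""
    serialized = ",".join(f"{x}:{y}:{angle:.2f}" for x, y, angle in minutiae_points).encode()
    return hashlib.sha256(salt + serialized).digest()

def enroll(minutiae_points, salt: bytes, key: bytes) -> bytes:
    """Store only an encrypted copy of the one-way template, never the fingerprint itself."""
    return Fernet(key).encrypt(to_template(minutiae_points, salt))

def matches(minutiae_points, salt: bytes, key: bytes, stored: bytes) -> bool:
    """Compare a fresh scan's template against the decrypted stored template."""
    return to_template(minutiae_points, salt) == Fernet(key).decrypt(stored)

if __name__ == "__main__":
    salt, key = secrets.token_bytes(16), Fernet.generate_key()
    scan = [(12, 40, 1.57), (33, 18, 0.79)]            # hypothetical minutiae measurements
    stored = enroll(scan, salt, key)
    print(matches(scan, salt, key, stored))            # True
    print(matches([(1, 2, 0.10)], salt, key, stored))  # False
```

Even in this toy version, the stored value is a salted digest that cannot be turned back into the original measurements, which is the factual premise of the defense described above.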

Plaintiffs will also have to prove that the defendants didn't just violate BIPA, but did so negligently or intentionally. This is not an easy standard to meet, especially if the trier of fact determines that these are "gotcha" lawsuits meant to catch companies off guard under a little-known and rarely used state law.

Assuming the plaintiffs jump all these hurdles, they must still demonstrate that these cases are appropriate for class certification. The cases involve different facts regarding whether individual plaintiffs received notice, whether they gave consent, whether they used the fingerscan method of authentication or another method like a PIN or an RFID card, whether they enrolled in Illinois, and whether their claim involves a violation of BIPA beyond collection or storage. Given these differences between plaintiffs, it will be difficult for them to meet the commonality and fairness requirements for class certification.

To be sure, some Defendants will face their own challenges.  A line of cases has held that where companies used their time-clock provider’s cloud service to store or back up timekeeping information from the clock, they may be in violation of BIPA’s prohibition against disclosure of biometric identifiers to a third party.  But at least one court has disagreed with that logic, stating that not all disclosures to a third party automatically present a concrete injury, and whether the third party has strong protocols and practices in place to protect data is relevant to the inquiry.

Defendants need only win one of these (or several other) defenses. Plaintiffs must win them all. In the meantime, plaintiffs must hope that the Illinois legislature does not notice that hundreds of BIPA lawsuits are flooding the Illinois state court system, creating potentially crippling liability for companies that tried to adopt more secure methods of authentication; that attention could lead to an amendment making the law more consistent with its original intent.


On Friday afternoon an Illinois intermediate appellate court decided that the bar for a plaintiff bringing a class action lawsuit under the Illinois Biometric Information Privacy Act (BIPA) is low, creating a conflict with its sister intermediate appellate court. The Illinois Supreme Court is expected to resolve the conflict early next year. How the court resolves the conflict will significantly impact companies doing business in Illinois.

Background

BIPA requires companies to provide notice and obtain consent from Illinois residents before collecting their biometric information. It also limits what companies can do with biometric information and requires the adoption of certain security safeguards. Any person “aggrieved by a violation” of the law may sue for actual damages or statutory damages ranging from $1,000 to $5,000 per violation. You can learn more about BIPA from my earlier blog post.

Beginning in the fall of 2017, Illinois businesses of all sizes were hit with "gotcha" class action lawsuits brought by former employees looking for reasons to sue their former employers. Those companies used timekeeping systems that required employees to scan their fingers to punch in and out of work. Ironically, the timekeeping systems improved security by reducing fraud and strengthening authentication. Nevertheless, many companies were not aware of BIPA or the possibility that it might apply to their timekeeping systems. The plaintiffs' bar was quick to pounce. Over 150 class actions were filed by former employees claiming that they did not receive BIPA's requisite notice and consent (despite the fact that the employees voluntarily placed their fingers on these devices every day). The lawsuits in aggregate seek tens of millions of dollars from companies doing business in Illinois.

Requisite Harm for a Private Cause of Action

A key question in the BIPA litigation is what it means to be “aggrieved by a violation.” Is it enough that an employee doesn’t receive notice and consent, or must they show that they suffered some actual harm (e.g., financial loss or identity theft) as a result of the violation, as would be necessary in a typical data breach lawsuit?

In December of 2017, the Illinois Appellate Court (Second District) in Rosenbach v. Six Flags Entertainment Corp. held that a person aggrieved must allege some actual injury, adverse effect, or harm. The outcome makes sense because BIPA does not say that the data subject can sue “for a violation.” It requires two things: a violation of BIPA and that someone be aggrieved.

Nevertheless, last week the Illinois Appellate Court (First District) weighed in on the issue and reached an opposite conclusion, holding that a mere violation of BIPA, without additional harm, is all that is necessary to meet the “aggrieved by” standard for a private cause of action. The case, Sekura v. Krishna Schaumburg Tan, Inc., was brought against a tanning salon that used finger scans to admit members into its salons. The court rejected its sister court’s ruling in Rosenbach and held that aggrieved means only the deprivation of a legal right. The court further held that disclosure of biometric information to a third party (e.g., storing the information in the cloud) was sufficient to meet the “aggrieved by” standard, as was an allegation of mental injury. In short, the bar for meeting the “aggrieved by” standard, according to the First District’s conclusion, should be incredibly low.

What’s Next and When?

Presumably, the Sekura decision will be appealed quickly and joined with the Rosenbach case already pending at the Illinois Supreme Court. It is unclear what impact Sekura will have on the timing of a ruling from the Supreme Court on the issue, as briefing in the Rosenbach case was finished in September and the parties were simply awaiting the scheduling of an oral argument. It’s possible the court will wait for briefing to be perfected in the Sekura case before scheduling oral argument, or an expedited briefing process may take place because the issues in the two cases are so similar.

Substantively, one of the most significant consequences of the Sekura decision is that it could give the Illinois Supreme Court something to cite if it were inclined to reverse Rosenbach. I would argue that the reasoning in Rosenbach actually appears stronger when contrasted with the Sekura decision. For example, the Sekura analogy treating the disclosure of encrypted biometric information to a third party as equivalent to a disclosure of whether someone has AIDS under the AIDS Confidentiality Act is misplaced. Similarly, the Sekura reasoning renders the words "aggrieved by" meaningless, because under the decision a mere violation of the statute is all that is necessary to bring a private cause of action.

A Final Observation

Most concerning to me about the BIPA litigation generally is that it appears to be based on an unfounded fear and misunderstanding of the underlying technology companies use to collect, store, and share the subject information. Businesses are not collecting, storing, or sharing images of fingerprints, which might be accessed without permission and/or potentially misused. The finger scanning machines in question measure minutiae points and turn them into mathematical representations, which cannot be reverse engineered into a fingerprint. As a belt on these suspenders, the information is encrypted.

Two facts in the biometric privacy context are particularly telling and dispositive. First, no plaintiff or amicus in any briefing in the more than 150 BIPA class actions has identified an example where biometric information was compromised. Why? Because the manner in which the finger scan information is collected is much like tokenization (a technology companies use to replace credit card numbers with valueless characters): if a bad guy breaks in, all he can steal is a random set of characters that have no value.
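For readers unfamiliar with tokenization, here is a minimal conceptual sketch, again purely illustrative rather than any payment processor's actual implementation (the class and method names are hypothetical). The idea is that the value stored where a breach might occur is a random token with no mathematical relationship to the underlying card number, and the mapping back to the real value lives in a separately secured vault.

```python
# Conceptual sketch of tokenization; names are illustrative, not a real payment API.
import secrets

class TokenVault:
    """Maps random tokens back to real values; in practice this lives in a separately secured system."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it reveals nothing about the underlying value.
        token = secrets.token_urlsafe(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # hypothetical card number
print(token)                    # a random string, worthless to a thief on its own
print(vault.detokenize(token))  # only the vault can map the token back
```

A thief who steals only the token database gets nothing usable, which is the same intuition the post applies to the mathematical representations stored by finger-scan timekeeping devices.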

Another important fact: all state data breach notification laws exempt encrypted information from the definition of personal information and the obligation to notify if it is the subject of a data breach. Why? Because there is no risk that a hacker can access the information and misuse it. Here, the subject information is encrypted so there is no risk of harm to the individuals bringing these lawsuits. The lawsuits are instead based on an unfounded fear of what could happen.

I wonder what impact a fuller explanation of the technology would have on the outcome of these cases. In the meantime, companies continue to spend significant sums of money defending these lawsuits while facing the risk of millions of dollars in potential liability.