Published by Al Saikali

The SEC recently agreed to a $1,000,000 settlement of an enforcement action against Morgan Stanley for its failure to maintain data security policies and procedures sufficient to protect customer data. The settlement was significant for its amount. The true noteworthiness, however, lies not in the end result but in the implications of how it was reached: (1) the “reasonableness” of a company’s data security safeguards will be judged in hindsight, and (2) almost any data breach could give rise to liability. The SEC has left no room for error in making sure that your cybersecurity procedures and controls actually and always work.

What Happened?

Morgan Stanley maintained personally identifiable information collected from its brokerage and investment advisory services customers on two internal company portals. Between 2011 and 2014, an employee unlawfully downloaded and transferred confidential data for approximately 730,000 accounts from the portals to his own personal data storage device/website. It is unclear whether the transfer of information was for the employee’s personal convenience or a more nefarious purpose. Soon thereafter, however, the employee suffered a cyberattack on his personal storage device, leading to portions of the data being posted to at least three publicly available Internet sites. Morgan Stanley discovered the information leak through a routine Internet sweep, immediately confronted the employee, and voluntarily brought the matter to law enforcement’s attention.

The employee who transferred the information to his personal device was criminally convicted of violating the Computer Fraud and Abuse Act by exceeding his authorized access to a computer; he was sentenced to 36 months’ probation and ordered to pay $600,000 in restitution. He also entered into a consent order with the SEC barring him from association with any broker, dealer, or investment adviser for five years.

Morgan Stanley entered into a consent order with the SEC pursuant to which Morgan Stanley agreed to pay a $1,000,000 fine, but did not admit or deny the findings in the order.

SIDE NOTE TO COMPLIANCE OFFICERS READING THIS BLOG POST – if ever you need a way to deter your employees from sending corporate information to their personal devices or email accounts, tell them about this case.

What Does The Law Require?

Federal securities laws (Rule 30(a) of Regulation S-P – the “Safeguards Rule”) require registered broker-dealers and investment advisers to adopt written policies and procedures reasonably designed to:

  1. Insure the security and confidentiality of customer records and information;
  2. Protect against any anticipated threats or hazards to the security or integrity of customer records and information; and
  3. Protect against unauthorized access to or use of customer records or information that could result in substantial harm or inconvenience to any customer.

Here, the SEC based Morgan Stanley’s liability on the fact that Morgan Stanley:

failed to ensure the reasonable design and proper operation of its policies and procedures in safeguarding confidential customer data. In particular, the authorization modules were ineffective in limiting access with respect to one report available through one of the portals and absent with respect to a couple of the reports available through the portals. Moreover, Morgan Stanley failed to conduct any auditing or testing of the authorization modules for the portals at any point since their creation, and that testing would likely have revealed the deficiencies in the modules. Finally, Morgan Stanley did not monitor user activity in the portals to identify any unusual or suspicious patterns.

In other words, the authorization modules did not work in this instance, nor was there auditing to test and possibly identify the problem, nor had Morgan Stanley invested in sophisticated monitoring applications that would have identified that the employee was engaging in suspicious activity.

Why Should Companies Worry?

The most concerning part of the Morgan Stanley consent order is this paragraph, which describes some robust safeguards Morgan Stanley had implemented before the incident occurred:

MSSB [Morgan Stanley] adopted certain policies and restrictions with respect to employees’ access to and handling of confidential customer data available through the Portals. MSSB had written policies, including its Code of Conduct, that prohibited employees from accessing confidential information other than what employees had been authorized to access in order to perform their responsibilities. In addition, MSSB designed and installed authorization modules that, if properly implemented, should have permitted each employee to run reports via the Portals only with respect to the data for customers whom that employee supported. These modules required FAs [Financial Advisors] and CSAs [Client Service Associates] to input numbers associated with the user’s branch and FA or FA group number. MSSB’s systems then should have permitted the user to access data only with respect to those customers whose data the user was properly entitled to view. Finally, MSSB installed and maintained technology controls that, among other things, restricted employees from copying data onto removable storage devices and from accessing certain categories of websites.

Lesson learned: no matter how robustly designed your policies and procedures may be, if they don’t actually work as designed, you could be liable under the Safeguards Rule.

Commentary

The standard applied by the SEC in this enforcement action is higher than a “reasonableness” standard. It is easy, after the fact, to find a weakness that could have been exploited. Indeed, it is unusual if you cannot identify such a vulnerability after the fact. If a criminal or departing employee is set on unlawfully accessing sensitive information, he can likely do so no matter what hurdles you place in his way. A company should not be held liable for failing to stop every data incident. Some may argue that a company like Morgan Stanley must be held to a higher standard because of the known threats to the financial services industry and the potentially significant consequences to consumers when a financial services company suffers a data breach. Nevertheless, the law as written requires policies and procedures that are only “reasonably designed” to protect sensitive information; the law does not require that these policies and procedures be perfectly designed nor that they be effective 100% of the time, nor could it.

Hindsight is 20/20, and regulators would be hard-pressed to find any organization that could show its policies and procedures are always followed. Could audits and testing have detected that Morgan Stanley’s authorization module was not preventing the type of unauthorized access and transfer of sensitive information at issue in this case? Possibly, depending on the depth of the audit and the foresight of the auditors. But Morgan Stanley appears to have received little credit for the fact that it actually had an authorization module, data security training for its employees, policies and restrictions regarding employee access to information, and controls that prevented the copying of data onto removable storage devices, or for the fact that it voluntarily brought this matter to law enforcement’s attention.

Is there a risk now that the SEC’s interpretation of “reasonableness” will be applied similarly by state Attorneys General, the Department of Health and Human Services’ Office for Civil Rights, the Federal Trade Commission, or other regulators? All of this reminds us that the definition of reasonableness in the context of data security is subjective, and that subjectivity is a risk that companies must address.

Takeaways

There are some important practical takeaways for companies from this settlement: (1) perform a risk assessment to determine how your organization could suffer from a similar risk (an employee transferring corporate information to a personal device); (2) implement an authorization module and other policies and procedures to limit access (and identify unauthorized access) to sensitive information to those who have a legitimate business need; and (3) audit and test these controls to ensure that they actually work. Additionally, CISOs, compliance officers, and in-house counsel would be well served to ensure that the story of this enforcement action becomes part of their organization’s onboarding and annual data security training.

 

DISCLAIMER:  The opinions expressed here represent those of Al Saikali and not those of Shook, Hardy & Bacon, LLP or its clients.  Similarly, the opinions expressed by those providing comments are theirs alone, and do not reflect the opinions of Al Saikali, Shook, Hardy & Bacon, or its clients.  All of the data and information provided on this site is for informational purposes only.  It is not legal advice nor should it be relied on as legal advice.

 

In October 2011, the U.S. Securities and Exchange Commission’s Division of Corporation Finance issued “CF Disclosure Guidance: Topic No. 2,” guidance intended to provide some clarity as to the material cyber risks that a publicly traded company should disclose.  I previously wrote about the guidance.  This blog post is the first of a three-part series taking a deeper look at the guidance:  what does the guidance mean and require (Part I), how is the SEC using/enforcing the guidance (Part II), and how are companies complying with the guidance (Part III)?

What is a disclosure guidance?

A disclosure guidance provides the views of a specific division of the SEC (in this case, the Division of Corporation Finance) regarding disclosure obligations (in this case, disclosure obligations relating to cybersecurity risks and cyber incidents).  It is not a rule, regulation, or statement of the Securities and Exchange Commission.  The SEC has neither approved nor disapproved its content.  In fact, the guidance did very little to change the legal landscape because companies are already required to disclose material risks and incidents, so to the extent a cyber risk/incident is material, it must be disclosed regardless of the subject disclosure guidance.  Nevertheless, at a minimum, the guidance has brought attention to the need for a company to disclose risks/incidents related to cybersecurity, and it attempts to clarify the types of cyber risks/incidents that should be disclosed.

What is the likelihood that the SEC will more clearly mandate disclosure of cyber incidents and risks?

Based on some recent events, there is a reasonable likelihood that we will see a Commission-level statement relatively soon, clearly and explicitly requiring publicly traded companies to disclose material cyber incidents and risks in their public filings.

On April 9, 2013, Senator Jay Rockefeller sent a letter to the recently confirmed SEC Chairwoman, Mary Jo White, in which he strongly urged the SEC to issue the guidance at the Commission level.  Senator Rockefeller cited investors’ need to know whether companies are effectively addressing their cybersecurity risks, and a need for the private sector to make significant investments in cybersecurity.

Chairwoman White responded positively to Senator Rockefeller’s letter.  She reiterated the existing requirements to disclose risks and events that a reasonable investor would consider material.  She also informed Senator Rockefeller that she had asked the SEC staff to provide her with a briefing on current disclosure practices relating to cyber incidents/risks and overall compliance with the guidance, as well as recommendations for further action in this area.  In short, I would not be surprised to see further instruction from the SEC on the cyber incident/risk disclosure issue this year.

What is a cybersecurity risk or cyber incident under the guidance?

According to the guidance, a cyber incident can result from a deliberate attack or an unintentional event and may include gaining unauthorized access to digital systems for purposes of misappropriating assets or sensitive information, corrupting data, or causing operational disruption.  Not all cyber incidents require gaining unauthorized access; a denial-of-service attack is one example.  These incidents can be carried out by third parties or insiders and can involve sophisticated electronic circumvention of network security or social engineering to obtain the information necessary to gain access.  The purpose may be to steal financial assets, intellectual property, or sensitive information belonging to companies, their customers, or their business partners.

Which cyber risks and incidents should be disclosed?

Publicly traded companies must disclose timely, comprehensive, and accurate information about risks and events that a reasonable investor would consider important to an investment decision. According to the guidance, material information about cybersecurity risks and cyber incidents must be disclosed when necessary to make other required disclosures not misleading.

What factors should a company consider in determining whether a risk or incident should be disclosed?

According to the guidance, companies should consider a number of factors in determining whether to disclose a cybersecurity risk, including:  (1) prior cyber incidents and the severity and frequency of those incidents; (2) the probability of cyber incidents occurring and the quantitative and qualitative magnitude of those risks (including the potential costs and other consequences resulting from misappropriation of assets or sensitive information, corruption of data or operational disruption); and (3) the adequacy of preventative actions taken to reduce cybersecurity risks in the context of the industry in which they operate and risks to that security, including threatened attacks of which they were aware.

What should a company disclose about a cyber risk or incident after it has determined that it wishes to make a disclosure?

Once a company has determined that it will disclose a risk or incident, it must adequately describe the nature of the material risks and specify how each risk affects the company.  Generic risks need not be disclosed.  Examples of appropriate disclosures include:  (1) discussion of aspects of the business or operations that give rise to material cybersecurity risks and the potential costs and consequences; (2) descriptions of outsourced functions that have material cybersecurity risks and how the company addresses those risks; (3) descriptions of cyber incidents experienced by the company that are individually, or in the aggregate, material, including a description of the costs and other consequences; (4) risks related to cyber incidents that remain undetected for an extended period; and (5) description of relevant insurance coverage.  The disclosure should be tailored to the company’s particular circumstances and avoid generic “boilerplate” disclosure.  That said, companies are not required to disclose information that would compromise the company’s cybersecurity.  Instead, companies should provide sufficient disclosure to allow an investor to appreciate the nature of the risks faced by the company in a manner that would not compromise the company’s cybersecurity.

Where in the public filing should the disclosure(s) be made?

There are a number of places in a company’s public filing where a disclosure of a cyber incident or risk may be made:

(1) Management’s Discussion and Analysis of Financial Condition – if the costs or other consequences associated with one or more known incidents, or the risk of potential incidents, represent a material event, trend, or uncertainty that is reasonably likely to affect the company’s results of operations, liquidity, or financial condition, or would cause reported financial information not to be necessarily indicative of future operating results or financial condition.  An example provided in the guidance is a cyber attack that results in the theft of material intellectual property; there, the company should describe the property that was stolen and the effect of the attack on its results of operations, liquidity, and financial condition, and whether the attack would cause reported financial information not to be indicative of future operating results or financial condition.  If it is “reasonably likely” that the attack will lead to reduced revenues, increased cybersecurity protection costs, or litigation costs, then those outcomes, their amount, and their duration should be discussed.

(2) Description of Business – if a cyber incident affects a company’s products, services, relationships with customers/suppliers, or competitive conditions, then the company should disclose these effects in the “Description of Business” section of the public filing.  An example provided in the Guidance is where a cyber incident materially impairs the future viability of a new product in development; such an incident and the potential impact should be discussed.

(3) Legal Proceedings – if a legal proceeding to which a company “or any of its subsidiaries” is a party involves a cyber incident, information may need to be disclosed in the “Legal Proceedings” section of the public filing.  The example provided in the Guidance is where customer information is stolen, resulting in material litigation; there, the name of the court, the date the lawsuit was filed, the parties, a description of the factual basis, and the relief sought should be disclosed.

(4) Financial Statement Disclosures – companies should consider whether cyber risks and incidents have an impact on a company’s financial statements, and, if so, include them.

 


As I wrote in a previous post, the Securities and Exchange Commission’s (SEC) Division of Corporation Finance issued a Disclosure Guidance on October 13, 2011, that states publicly traded companies may be obligated to disclose cyber incidents and the risk of cyber incidents, depending on the application of various factors.

Now, according to a recent Bloomberg article, the SEC is cracking down on publicly traded companies’ failure to comply with the Guidance.  The SEC apparently sent “dozens” of letters to companies asking about their cybersecurity disclosures and pushing them to disclose.  Six of the companies that the SEC instructed to disclose were AIG, Amazon.com, Eastman Chemical Co., Google, Hartford Financial Services Group, and Quest Diagnostics, Inc.

With respect to Amazon.com, its Zappos.com unit was the victim of a cyber attack that resulted in the theft of addresses and credit card numbers belonging to 24 million of its customers.  In April, the SEC asked Amazon to disclose the attack, which, according to Bloomberg, Amazon has now done, though not without objection.  Amazon initially resisted disclosing the cyber attack because, according to Amazon, Zappos did not contribute material revenue to Amazon.

Google, too, has now agreed to disclose in its SEC filings a cyber attack that it had previously disclosed publicly in January 2010.  The SEC believed that disclosure in a formal SEC filing was necessary to “provide the proper context for your risk factor disclosures.”  Accordingly, Google agreed to repeat the information in its earnings report.

Hartford told the SEC that it hadn’t suffered a “material” cyber attack, but the SEC instructed it to disclose “any” attack.

AIG agreed to state in a future quarterly report that it had “from time to time, experienced threats to our data and systems, including malware and computer virus attacks, unauthorized access, systems failures and disruptions.”

The SEC’s action is significant because the Guidance is not technically a rule, yet the SEC is effectively creating one by taking the position that these companies should have disclosed their breaches.  Failure to comply with an SEC letter can lead to fines amounting to hundreds of thousands of dollars; fighting the SEC in litigation could cost millions.  It will be interesting to see whether and to what extent the SEC will continue to crack down on companies that do not disclose cyber attacks and risks of cyber incidents.

 
