What's So Great About an Information Security Policy?

Lawyers and compliance professionals constantly tout the importance of internal information security policies, particularly in light of data privacy problems that are reported almost daily in the media.  Admittedly, drafting such policies as a proactive measure can be a pain because there is always a tendency to worry that, unless you’ve suffered a data breach, you are the proverbial “solution in search of a problem.”

But it’s not.  In fact, in some cases, it’s actually required.  HIPAA (for protected health information), Gramm-Leach-Bliley (for financial information), and Regulation S-P (for broker-dealers and investment advisers) all impose specific federal rules related to information security practices and policies.  However, it would be unwise for a company that isn’t covered by these industry-specific rules to think a security policy isn’t needed.

It’s nearly inevitable that your company will suffer some sort of data incident in its corporate life cycle.  Even if not required, an information security policy can provide companies with important real-world benefits when responding to a data security incident. 

First, the information security policy can be used as a benchmark for training employees on data security and, just as importantly, data breach response.  A good security policy will include, at a minimum, information about (i) how personally identifiable information will be collected, stored and shared, (ii) how to report a possible data security incident, (iii) how users/customers will be notified of an incident, and (iv) the appropriate “point” persons in the company responsible for handling data privacy incidents.

Second, your company’s insurers, auditors, clients or customers may well require that you have an information security policy in order to demonstrate competence in data privacy.  Being able to report affirmatively that you have a policy – as opposed to saying you “will prepare one” – showcases that your company is privacy-forward.  In the event of a breach, regulators or litigants could potentially raise questions about your data security practices.  The fact that you have an information security policy that you implemented before, during, and after the breach helps show that your company took commercially reasonable steps to secure the information breached.

Speaking of data breaches, an information security policy also has a back-end benefit both to the company and its users.  For companies that provide products or services online, a user’s e-mail address is often the one and only way to communicate with the user in the event of a data breach. Yet many state laws do not allow for e-mail notification unless the company has obtained the prior consents required under the federal E-Sign Act (which few online companies can achieve as a practical matter).  But many states have provisions that allow companies with existing information security policies to follow the data breach notification procedures contained within their IT security policy.  If applicable, these provisions can allow companies to notify their users electronically.  This provides a more cost-effective approach to notice in large breaches and often reflects the most practical method of communicating with the company’s users or customers.

Of course, an effective IT security policy must be put into practice – a “living” document that is part of your company’s privacy culture.  The mere exercise of preparing an information security policy will bring your key employees together to thoughtfully consider your current practices and revise them to become best practices on both the front end and the back end of a data breach.

SEC Says No More Mr. Nice Guy on Investment Adviser Cybersecurity

Over the last couple of years, the SEC’s cybersecurity bark has been worse than its bite.  Its Office of Compliance Inspections and Examinations issued examination priorities in 2014.  Commissioner Aguilar warned public company boards that they had better get smart about the topic a few months later.  The results of OCIE’s cybersecurity exam sweep were released in March of this year.  And the Investment Management Division said words, not many words, about investment advisers’ responsibilities in this area in July.

Alleged Facts

What it hasn’t done recently is sue somebody for violating Reg. S-P.  But yesterday it did.  According to the SEC’s settled administrative order:

  • St. Louis-based R.T. Jones Capital Equities Management stored sensitive personally identifiable information (PII) of clients and others on its third-party-hosted web server from September 2009 to July 2013.
  • Throughout this period, R.T. Jones failed to conduct periodic risk assessments, implement a firewall, encrypt PII stored on its server, or maintain a response plan for cybersecurity incidents.
  • An unknown hacker gained access to the firm’s web server in July 2013, rendering the PII of more than 100,000 individuals, including thousands of R.T. Jones’s clients, vulnerable to theft.

The Safeguards Rule

Whoops.  But while all of that sounds bad, it’s not actually what the firm is being sued over.  At issue is Reg. S-P’s Rule 30(a), the Safeguards Rule, which says, “Every broker, dealer, and investment company, and every investment adviser registered with the Commission must adopt written policies and procedures that address administrative, technical, and physical safeguards for the protection of customer records and information.”  And unfortunately, R.T. Jones allegedly failed entirely to adopt written policies and procedures reasonably designed to safeguard customer information.  Put another way, if R.T. Jones had had written policies and procedures designed to avoid the failures bulleted above, the cyber attack might have been avoided and we wouldn’t be here.  It’s paying a $75,000 civil penalty to put this matter behind it.

Fortunately, to date, R.T. Jones has not received any indications of a client suffering financial harm as a result of the attack.  And the firm appears to have acted quickly and responsibly once it did discover the breach.

Three Thoughts

I have three quick thoughts.  First, this is a relatively easy case for the SEC to bring.  R.T. Jones didn’t just have inadequate policies and procedures.  According to the SEC’s order, it didn’t have any written policies and procedures reasonably designed to safeguard its clients’ PII.  Second, over 90% of the individuals whose information was compromised were not even R.T. Jones clients, but participants in an investment plan that R.T. Jones had joined.  The information appears to have been useful to R.T. Jones in the aggregate, but perhaps not at the individual level.  If not, the firm might have purged that information from its systems and avoided the liability that came with losing it.  Finally, periodic risk assessments, firewalls, encryption, and a cybersecurity response plan seem like good ideas right now.  But you knew that already.

The SEC's Investment Management Division Has Some Things to Tell You about Cybersecurity

Ed. Note: This entry is cross posted from Cady Bar the Door, David Smyth's blog offering Insight & Commentary on SEC Enforcement Actions and White Collar Crime.

Lots of agencies and organizations want to boss you around about cybersecurity.  In April, the SEC and the Justice Department published more directions on the issue.  We’ll cover the very brief guidance issued by the SEC’s Division of Investment Management first, and then turn to DOJ in a later post.

First, as with everyone else, the IM Division thinks cybersecurity is very, very important for investment companies and investment advisers.

Second, the staff recommended that advisers and funds consider a number of measures to strengthen cybersecurity:

·       Conduct a periodic risk assessment.

·       Create a strategy designed to prevent, detect and respond to cybersecurity threats. Specific pieces of the strategy could include: tiered access to sensitive information and network resources; data encryption; restricted use of removable storage media; and development of an incident response plan. (A brief illustrative sketch of the encryption piece appears below.)

·       Implement the strategy through written policies and procedures and training that provide guidance to officers and employees.  Then monitor compliance.

·       Assess whether protective cybersecurity measures are in place at relevant service providers.

This is a truncated list, and it isn’t magical.  The suggestions could apply to literally any business. You can read the full version here, but FINRA is way ahead of the Investment Management Division in providing usable guidance on how to bolster cybersecurity.
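For the data-encryption piece mentioned in the list above, here is a minimal sketch of what encrypting PII at rest can look like in Python, using the widely used cryptography package. The key handling and the sample value are hypothetical stand-ins for illustration; nothing in this sketch comes from the staff’s guidance.

```python
# A minimal sketch of encrypting PII at rest with the "cryptography" package.
# Key handling here is a stand-in; real keys belong in a key-management system.
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # in practice, load from a KMS, not code
cipher = Fernet(key)

token = cipher.encrypt(b"123-45-6789")   # hypothetical sensitive value
assert cipher.decrypt(token) == b"123-45-6789"
```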

Third, and more interestingly, the guidance suggests that funds and advisers should take their compliance obligations under the federal securities laws into account in assessing their ability to prevent, detect and respond to cyber attacks.  So, maintaining a compliance program that is reasonably designed to prevent violations of the securities laws could also mitigate exposure to cyber threats, the guidance says.  “For example, the compliance program of a fund or an adviser could address cybersecurity risk as it relates to identity theft and data protection, fraud, and business continuity, as well as other disruptions in service that could affect, for instance, a fund’s ability to process shareholder transactions.”  In other words, if a cyber attack prevents you from, say, being able to process shareholder transactions, the staff is going to look back and see how well prepared you were before the assault.  If you weren't prepared at all, the end result probably won't be pretty, for the shareholders or you.

The guidance recognizes that it’s impossible to anticipate and prevent every cyber attack.  But it wants you to try.  And appropriate planning could mitigate the impacts of those attacks, as well as help “compl[iance] with the federal securities laws.”  Consider yourself warned.

FIN4 May Have Embarked on a Risky Hacking/Insider Trading Strategy

I haven’t yet turned to a life of crime, so far be it from me to criticize actual criminals’ profit-maximizing strategies.  It’s easy for me to nitpick, but I’m not the one strapping on my mask and trying to earn a (dis)honest dollar every day.  But have a look at this Reuters story from Tuesday. 

In it, we learn that the SEC and the Secret Service are investigating a sophisticated computer hacking group known as “FIN4” that allegedly “has tried to hack into email accounts at more than 100 companies, looking for confidential information on mergers and other market-moving events. The targets include more than 60 listed companies in biotechnology and other healthcare-related fields, such as medical instruments, hospital equipment and drugs.”  Apparently their plan is to harvest this information and then trade on it.  Nobody knows where FIN4 is from.  They could be overseas, but supposedly their English is flawless and they have a deep knowledge of how financial markets work, so maybe they’re in the United States.  At one level, a little terrifying!

But this group hasn’t devised a complex, superpowered algorithm to steal information. Instead, it’s allegedly stealing information the (sort of) old fashioned way: through social engineering. The Reuters story explains that FIN4 “used fake Microsoft Outlook login pages to trick attorneys, executives and consultants into surrendering their user names and passwords.” In at least one case, “the hackers used a confidential document, containing significant information that they had already procured, to entice people discussing that matter into giving their email credentials.”

I have two main thoughts.  First, sound information handling practices, and appropriate wariness among professionals using email, still go a long way toward securing confidential data within organizations.  It’s often not the most technologically advanced tactics that yield the worst data breaches.  Second, FIN4 has embarked on a complex money-making plan.  There may be many uses of this information, but one of them seems to be trading securities in the public markets.  That’s not as simple as it seems.  If you’re doing that, you’re on the grid and can’t really hide.  FINRA sees all of those trades and it isn’t that hard for regulators to find out who is making them.  When the Consolidated Audit Trail comes online,*  it will be substantially easier and faster. In the meantime, broker-dealers are obligated to identify who their customers are.  If those people have electronic connections to the ones involved in the hacking, those links could be enough for the SEC to get an asset freeze before profits are siphoned overseas. 

What FIN4 is allegedly doing is scary, but they haven’t yet built a criminal ATM.



* Speaking of the Consolidated Audit Trail, when is that thing coming online? 


FCC Stakes Out Privacy Territory in Broadband Privacy Workshop

If you thought all the action in privacy regulation centered around the Federal Trade Commission, the Federal Communications Commission would like you to think again. Yesterday, April 28, the FCC held a 3-plus hour workshop that started the regulatory “conversation” on the manner in which the FCC can or should regulate consumer broadband privacy.

Chairman Wheeler kicked off the event with opening remarks that included this unequivocal statement: “Privacy is unassailable.” He also said that “changes in technology do not affect our values.” From these words and the text of the FCC’s “Open Internet” order released earlier this year, not to mention the FCC’s recent $25 million data breach consent decree with AT&T, it is clear the FCC intends to be involved in regulating consumer privacy.

Yesterday’s workshop follows the recent Open Internet order in which the agency determined it would apply certain aspects of its Title II authority to the Internet (namely, certain “common carrier” provisions of the Communications Act). The order has broad impact on issues like consumer access to broadband content that have been widely written about. But what the order means for privacy is that the FCC’s rules on “customer proprietary network information” or CPNI, which have historically applied to traditional telephone companies and interconnected VoIP providers, may apply more broadly in some form or fashion to others in the broadband ecosystem—particularly, broadband Internet access providers.

By way of background, CPNI (defined in Section 222(h)(1) of the Communications Act) is information collected by telecommunications carriers about their customers.  CPNI includes things like the “quantity, technical configuration, type, destination, location, and amount of use of a telecommunications service” that a customer subscribes to, as well as billing information.  It is a fairly specific definition, and it doesn’t include personal information like name, phone number, address, etc.

Section 222(a) of the Communications Act requires telecommunications carriers to protect the confidentiality of customer information, and Section 222(c) restricts the ability of telecommunication carriers to use, disclose, or permit access to individually identifiable CPNI without the customer’s approval or as required by law. 

The Open Internet order makes plain that the FCC intends to apply these CPNI confidentiality provisions (or some form of them) to broadband Internet access providers. But the FCC will need to adopt new rules to apply the CPNI provisions of the statute in this way.

Yesterday’s workshop started that process with a discussion among stakeholders, though no formal rulemaking has been launched. Panelists discussed the privacy implications of broadband Internet access (for example, the kind of data broadband providers have access to) as well as specific concerns with applying Section 222 to broadband Internet access services. One common theme was the potentially overlapping jurisdiction of the FTC and the FCC in the area of privacy. But, significantly, one thing the FCC brings to the table in this area is general rulemaking authority, which the FTC lacks. 

We’ll have to watch and wait to see if a notice of inquiry, notice of proposed rulemaking, or other agency guidance will come.

The FCC typically uploads events like this to its archive, so check here or here in a few days if you would like to view the full event.


Physician Practices Be Alert: You Might Be Violating HIPAA If You Produce Medical Records In Response To State Court Subpoenas

Over the past months, my experiences with physician practices have made me realize that many practices do not understand how HIPAA applies to subpoenas for medical records.  More worrisome, I suspect that many practices nationwide routinely violate HIPAA when they receive a subpoena.

Here’s what I’ve observed:  Practices receive state court subpoenas that are signed by lawyers and that demand the production of medical records, and the practices automatically assume they must produce the records.  This is a dangerous assumption—the production of the records may very well violate HIPAA.

Here’s what HIPAA requires in general terms:  If a practice receives a state court subpoena for medical records that is signed by a lawyer, the practice should not produce the records unless (1) the practice also receives (a) a court order requiring production, (b) a HIPAA “qualified protective order” that’s been entered in the lawsuit, (c) a HIPAA compliant authorization from the patient that authorizes the disclosure demanded by the subpoena, or (d) certain other matters designated by HIPAA’s rule concerning subpoenas, or (2) the practice takes certain additional actions required by HIPAA’s rule for subpoenas. 

If a practice receives such a subpoena without receiving any of these “additional” items or taking these “additional” actions, the practice will likely violate HIPAA if the records are produced.

Here’s what practices should do.  Because this area of HIPAA is somewhat complex and difficult for practices to navigate on their own, practices should consult with legal counsel when they receive such a subpoena.  Legal counsel can advise whether HIPAA permits the disclosure, whether the practice needs to object to the subpoena, and whether other actions should be taken.  On numerous occasions, we have reviewed such subpoenas, determined that they did not comply with HIPAA, and sent a letter objecting to the subpoena, and the practice never heard from the parties again. 

Take away:  If you receive a state court subpoena signed by a lawyer demanding the production of medical records, do not automatically produce the medical records.

BYOD: Five Things To Consider When Creating Your Policy

“BYOD” or “bring your own device” (also known as the “consumerization of IT”) is a fact of life in today’s workplace. BYOD refers to the practice of using personally owned devices—like smartphones, tablets, and laptops—for work purposes and allowing these devices to connect to company networks and applications. According to a Gartner study released in late 2014, 40% of U.S. employees working for large companies use personally owned devices for work purposes. Of those who reported using personally owned devices, only 25% were required by their employers to do so.  And of the 75% who chose to use their personally owned devices for work, half did so without their employers’ knowledge.

If that last statistic doesn’t alarm you a little, it should. 

BYOD can be great for productivity, but it also creates risk for companies. Why? Because personally owned devices are not managed or controlled by company IT, and security of the device is in the hands of the employee. In other words, these devices can be a source of company data loss or data leakage. For example, if an unencrypted personal laptop with company data on it gets lost or stolen, you may have a data breach on your hands. 

So how do you address these concerns?  Start with a written, employee-acknowledged BYOD policy. 

Here are five things to consider as you develop your policy:

1.      Start by building an interdisciplinary team to create the policy. The team should include IT, human resources, legal, compliance, and employees who use their personally owned devices. BYOD is not just an IT issue but also a legal and business issue. Different perspectives will help lead to a policy that fits your organization’s needs and is capable of being followed.


2.      Develop the goals of the policy. Security should be a goal, but productivity is also important. Cost savings may also be an objective. These, and other goals, may be in tension at times. In the end, you want to develop a policy that strikes the right balance for your company. Consider a BYOD policy that ensures the only way enterprise data flows into or out of a device is through enterprise systems (e.g., avoid emailing business information to employee personal email accounts; see the sketch after this list).


3.      Determine which employees are covered and how they are covered. There may be different policies for different types of employees based on job type or function. For example, the policy for exempt employees may be different than the one for non-exempt employees. Consider whether all employees need to be able to use tablets (for example) to access corporate data for work purposes. Be sure to consult with counsel about employment laws and regulations that may apply in this area.


4.      Decide which devices/uses are permitted and how the policy applies to each. For example, the BYOD team should conduct “app reviews” to decide what apps are okay for business use and what apps are not allowed.  Particular types of smartphones, tablets, or laptops may not meet the company’s security requirements and, if not, should not be permitted. Also, the policy for smartphones may be different than the policy for laptops because they are different devices used in different ways and may pose different security risks.


5.      Build in training so that each employee knows the policy and how it applies to him or her. In the end, security is about people just as much as it is about technology.


This isn’t an exhaustive list of considerations, but it will help get you started crafting a BYOD policy tailored to your company. It’s important that you do so, if you haven’t already.
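To make the “enterprise systems only” idea in item 2 concrete, here is a minimal, hypothetical sketch of an outbound-mail rule that flags messages with attachments addressed to personal email domains. The domain list and function names are illustrative assumptions, not a prescribed control.

```python
# Hypothetical outbound-mail rule: flag messages with attachments addressed to
# personal email domains. The domain list and names are illustrative only.
PERSONAL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com", "outlook.com"}

def is_personal_address(recipient: str) -> bool:
    """Return True if the recipient's domain looks like a personal account."""
    return recipient.rsplit("@", 1)[-1].lower() in PERSONAL_DOMAINS

def should_flag(recipients: list[str], has_attachment: bool) -> bool:
    """Flag outbound mail carrying attachments to any personal address."""
    return has_attachment and any(is_personal_address(r) for r in recipients)

# Example: this message would be flagged for review.
print(should_flag(["bob@gmail.com", "alice@acme.com"], has_attachment=True))
```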


You are What You Keep

Suffering a data breach is bad enough. As often as it appears to happen, companies that are affected by a breach still shoulder a considerable burden. Management must stop the trains to identify the cause and scope of the breach—and then prepare for the aftermath. Lawyers are involved. The company’s brand is at risk. And the costs—employee time, legal fees, security consultants—quickly escalate.

But what if you determine that your company didn’t really need the information that was exposed? Suppose you find out that the breach involved a file that contained drivers’ license numbers or even credit card information, but your company had virtually no administrative need for that information? Or suppose the data pertained to transactions by young adults 7 years ago – and it is highly unlikely that any of the information is still relevant (much less accurate). This is the kind of discovery that adds insult to injury. Your company is forced to stop and invest significant funds to respond to a data breach involving data that it no longer has any use for in its marketing or operational efforts.

The surest way to avoid this problem is to review and assess the way you currently collect, retain, and store information. Here are a few items to consider:

·       Collection – Do you really need all of the personal information that you are collecting from consumers? Review your intake procedure and revise it to collect only what you need for operational or marketing purposes. Also, are you even aware of all of the different portals through which your company may be collecting data from consumers? Identify all of them so that you can be sure your assessment is complete. Do you have someone in your organization responsible for tracking the types of data you are collecting and the different processes through which you are collecting the data?

·       Retention – How long are you storing personal information? And for what purposes? Are your practices consistent with PCI standards? What is your current retention policy, and are you following it? There are federal and state laws that may govern the retention, disposal or destruction of your data. Be familiar with those laws. Within the confines of applicable laws, be sure you are not holding on to unnecessary or outdated data that would cause you intolerable frustration in the event it was breached (a simple purge sketch follows this list). Do you have someone in your organization responsible for overseeing retention and disposal?

·       Third Party Partners and Vendors – If you are sharing personal information with other parties (which, of course, needs to be disclosed to consumers in your privacy policy), be sure that your agreements with those parties contain appropriate safeguards. Are you requiring your vendors to secure personal information and prohibit the disclosure of that information? What happens in the event of a breach? Who bears the cost of notification? Are your vendors required to indemnify you if their mistakes lead to actions against your organization?
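On the retention point above, here is a purely illustrative sketch of enforcing a retention window by purging stale records. The table name, column, and two-year period are assumptions made for the sketch; any real retention schedule should be set with counsel and applicable law in mind.

```python
# Illustrative retention sketch: delete records older than a cutoff date.
# The `records` table, `created_at` column, and two-year window are assumptions.
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365 * 2  # assumed two-year retention window

def purge_stale_records(db_path: str) -> int:
    """Delete rows in the hypothetical `records` table older than the cutoff."""
    cutoff = datetime.utcnow() - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM records WHERE created_at < ?", (cutoff.isoformat(),)
        )
        return cur.rowcount  # number of rows purged
```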


There is a simple rule that applies in a data breach: You are what you keep. So be careful with what information you currently collect and retain. Talk to your lawyer about whether certain information that you may consider to be “stale” may be properly and legally disposed of. And, more importantly, consider revising your practices so you don’t continue to collect or retain stale or unnecessary information going forward.

SEC Releases Results of Cybersecurity Exam Sweep

Ed. Note: This entry is cross posted from Cady Bar the Door, David Smyth's blog offering Insight & Commentary on SEC Enforcement Actions and White Collar Crime.

We’re behind on this, but better (a little bit) late than never. Last month the SEC’s Office of Compliance Inspections and Examinations released the first results of its Cybersecurity Examination Initiative, announced in April 2014 (and discussed here). As part of the initiative, OCIE staff examined 57 broker-dealers and 49 investment advisers to better understand how these entities “address the legal, regulatory, and compliance issues associated with cybersecurity.”

What the Exams Looked For

In the exams, the staff collected and analyzed information from the selected firms relating to their practices for:

·       identifying risks related to cybersecurity;

·       establishing cybersecurity governance, including policies, procedures, and oversight processes;

·       protecting firm networks and information;

·       identifying and addressing risks associated with remote access to client information and fund transfer requests;

·       identifying and addressing risks associated with vendors and other third parties; and

·       detecting other unauthorized activity.

Importantly, the report is based on information as it existed in 2013 and through April 2014, so it’s already somewhat out of date.


The Good News

The report includes some good news about how seriously the SEC’s registered entities are taking cybersecurity.

·       The vast majority of examined broker-dealers (93%) and investment advisers (83%) have adopted written information security policies.

·       The vast majority of examined broker-dealers (93%) and investment advisers (79%) conduct periodic risk assessments to identify cybersecurity threats, vulnerabilities, and potential business consequences.

·       The vast majority of examined firms report conducting firm-wide inventorying, cataloguing, or mapping of their technology resources. Smart.

·       Many firms are utilizing external standards and other resources to model their information security architecture and processes. These include standards published by the National Institute of Standards and Technology (“NIST”), the International Organization for Standardization (“ISO”), and the Federal Financial Institutions Examination Council (“FFIEC”).

Encouraging! But the report didn’t bring all good tidings.


The Bad News

Here are some of the less auspicious facts:


·       88% of the broker-dealers and 74% of the advisers reported being the subject of a cyber-related incident.

·       Most of the broker-dealers (88%) require risk assessments of their vendors, but only 32% of the investment advisers do. 

·       Related to that, most of the broker-dealers incorporate requirements relating to cybersecurity risk into their contracts with vendors and business partners (72%), but only 24% of the advisers incorporate such requirements. Fewer of each maintain policies and procedures related to information security training for vendors and business partners authorized to access their networks.

·       A slight majority of the broker-dealers maintain insurance for cybersecurity incidents, but only 21% of the investment advisers do.


The Rest

Almost two-thirds of the broker-dealers (65%) that received fraudulent emails seeking to transfer funds filed a Suspicious Activity Report with FinCEN, as they’re likely required to do. The report then notes that only 7% of those firms reported the incidents to other regulators or law enforcement. It’s curious to me why the SEC would expect other reports to happen. With the SAR obligations in place, those firms probably, and reasonably, think all the necessary reporting has been done once the SAR has been filed. Also, these firms’ written policies and procedures generally don’t address whether they are responsible for client losses associated with cyber incidents.  Along these lines, requiring multi-factor authentication for clients and customers to access accounts might go a long way toward pushing responsibility for those losses onto the users (a minimal sketch of one common approach follows).
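For illustration only, here is a minimal sketch of the server side of one common multi-factor approach, time-based one-time passwords, using the pyotp library. The enrollment and login flow around it is an assumption for the sketch, not anything drawn from the report.

```python
# Minimal sketch of TOTP-based multi-factor authentication using pyotp.
# The enrollment/login flow is a hypothetical stand-in for illustration.
import pyotp

# At enrollment: generate and store a per-user secret (often shown as a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# At login, after the password check, verify the 6-digit code the user submits.
submitted_code = totp.now()          # stand-in for the code the user types in
print(totp.verify(submitted_code))   # True if the code matches the current window
```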

But don’t take my word for it. Read the report yourself, linked above and here.

FTC Commissioner Comments on Consumer Privacy Bill of Rights

Last week, we posted about the Consumer Privacy Bill of Rights “discussion draft” released by the Obama Administration. On Thursday, March 5, at the annual U.S. meeting of the International Association of Privacy Professionals (which I attended), FTC Commissioner Julie Brill answered questions about her take on the bill and other policy issues. Here are just a few comments from that discussion that merit a follow-up post:

  • Commissioner Brill stated in no uncertain terms that the draft bill is not protective enough of consumers. At various times, she said there are “serious weaknesses in the draft,” “there’s no there there,” it needs “some meat,” and “where are the boundaries?” She mentioned a specific example of a more consumer-protective approach relating to consent to certain data practices. She indicated she would like to see the bill require affirmative express consent of the individual for (a) material retroactive changes to a privacy statement and (b) use of sensitive information out of context of the transaction in which it was collected.
  • Although Section 402 of the draft bill provides that the FTC’s unfair and deceptive authority under Section 5 of the FTC Act remains intact, Commissioner Brill expressed concern that it is not clear enough in the bill that the FTC would retain its full authority to enforce the “common law” of privacy as developed in its prior enforcement actions. 
  • Will anything come of the bill this Congress? Commissioner Brill said she expects it’s unlikely given other legislative priorities at this time.

Commissioner Brill commended the Administration for grappling with tough issues and working to improve privacy overall. However, hers is one more voice calling for additional work on broad federal consumer privacy legislation.