By: Aaron Krowne

A heated battle is raging over the general authority of federal regulators over businesses’ privacy and data security practices. We refer to the pending case of FTC v. Wyndham Worldwide Corp., which is much-watched in the data security world. It pits, on one side, the Federal Trade Commission (“FTC”), with its general authority to prevent “unfair or deceptive” trade practices, against Wyndham Worldwide Corp. (“Wyndham”), a hotel chain owner recently hit by a series of high-profile data breaches. The main question to be decided: does the FTC’s general “unfair or deceptive” authority translate into a discretionary (as opposed to regulatory) power over privacy and data security practices?

Background of the Case

On July 30, 2014, the Third Circuit accepted FTC v. Wyndham on appeal, after Wyndham failed in its attempt to have the case dismissed. Notably, Wyndham was granted an interlocutory appeal, meaning the Circuit Court considered the issues it raised important enough to the outcome of the case to warrant hearing an appeal immediately.

 The case stems from a series of data breaches in 2008 and 2009 resulting from the hacking of Wyndham computers. It is estimated that personal information of upwards of 600,000 Wyndham customers was stolen, resulting in over $10 million lost through fraud (i.e., credit card fraud).

The FTC filed suit against Wyndham for the breach under Section 5 of the FTC Act, alleging (1) that the breach was due to a number of inadequate security practices and policies, and was thus unfair to consumers; and (2) that this conduct was also deceptive, as it fell short of the assurances given in Wyndham’s privacy policy and its other disclosures to consumers.

The security inadequacies cited by the FTC present a virtual laundry-list of cringe-worthy data-security faux pas, including:

  • failing to employ firewalls;
  • permitting storage of payment card information in clear, readable text;
  • failing to ensure that Wyndham-branded hotels implemented adequate information security policies and procedures before connecting their local computer networks to Wyndham’s Hotels and Resorts corporate network;
  • permitting Wyndham-branded hotels to connect unsecure servers to the network;
  • utilizing servers with outdated operating systems that could not receive security updates and thus could not remedy known vulnerabilities;
  • permitting servers to retain commonly-known default user IDs and passwords;
  • failing to employ commonly-used methods to require user IDs and passwords that are difficult for hackers to guess;
  • failing to adequately inventory computers connected to the network;
  • failing to monitor the network for malware used in a previous intrusion; and
  • failing to restrict third-party access.

Most people with basic knowledge of data security would agree that these alleged practices are highly disconcerting, fall below commonly-accepted industry standards, and should expose anyone engaging in them to legal liability for any resulting damage. The novel development in this case is the FTC’s construction of such consumer-unfriendly practices as “unfair” under Section 5 of the FTC Act, which brings them under its purview for remedial and punitive action.

Wyndham resisted the FTC’s enforcement action by attempting to dismiss the case, arguing (1) that poor data security practices are not “unfair” under the FTC Act, and that (2) regardless, the FTC must make formal regulations outlining any data security practices to which its prosecutorial power applies, before filing suit.

The District Court resoundingly rejected Wyndham’s dismissal arguments. Its primary rationale was, in effect, that Section 5’s “unfair or deceptive” enforcement power was intentionally written broadly, implying that the FTC has domain over any area of corporate practice significantly impacting consumers. Additionally, this broad drafting makes the power largely discretionary, a quality that would be defeated by requiring that it always be reduced to detailed regulations in advance.

Addressing the “unfairness” question directly, the FTC argued (and the District Court agreed) that, in the data-security context, “reasonableness [of the practices] is the touchstone” for Section 5 enforcement, and that, particularly, “unreasonable data security practices are unfair.” As to defining unreasonable security practices, Wyndham advocated a strict “ascertainable certainty” standard (i.e., specific regulations set out in advance), but the District Court (again, siding with the FTC) shot back that “reasonableness provides ascertainable certainty to companies.” This argument seems almost circular and fails to define what exactly is “reasonable” in this context. But the District Court observed that in other areas of federal enforcement (e.g., the National Labor Relations Board and the Occupational Safety and Health Act), an unwritten “reasonableness” standard is routinely used in the prosecution of cases. Typically, in such cases, reference is made to prevailing industry standards and practices, which, as the District Court observed, Wyndham itself referenced in its privacy policy.

Fears & Concerns

The upshot of the case is that if the FTC’s assertion of the power to enforce “reasonable” data security practices is affirmed, all privacy and data security policies must be “reasonable.” This will in turn mean that such policies must not be “unfair” generally, and also not “deceptive” relative to companies’ privacy policies. In effect, the full force of federal law, policed by the FTC, will appear behind privacy and data security policies – albeit, in a very broad and hard to characterize way. This is in stark contrast to state privacy and data security laws (such as Delaware’s, California’s or Florida’s), which generally consist of more narrowly-tailored, statutorily-delimited proscriptions.

While consumers and consumer advocates will no doubt be heartened by the Court’s broad read on the FTC’s protective power in the area of privacy and data security, not surprisingly, there are fears from both businesses and legal observers about such a new legal regime. Some of these concerns include:

  • Having the FTC “lurking over the shoulders” of companies to “second guess” their privacy and security policies.
  • A situation where the FTC is, in effect, “victimizing the victim” – prosecuting companies after they’ve already been “punished” by the direct costs and public fallout of a data breach.
  • Lack of a true industry standard against which to define “reasonable” privacy and data security policies.
  • A “checklist culture” (as opposed to a risk-based data security approach) as the FTC’s de facto data security requirements develop through litigation.
  • A wave of class-action lawsuits emboldened by FTC “unfair and deceptive” suits.
  • Uncertainty: case-by-case consent orders that provide little or no guidance to non-parties.

These concerns are definitely real, but likely will not result in much (if any) push-back in Wyndham’s favor on appeal. That is because, while the FTC may not have asserted power over data security practices in the past (as Wyndham made sure to point out in its arguments), there is little in the FTC’s governing charter or relevant judicial history to prevent it from doing so now. Simply put, regulatory agencies can change their “minds,” including about what is in their regulatory purview – so long as the field in question is not explicitly beyond it. Given today’s reality of omnipresent social networks and sensitive, cloud-resident consumer data, we can hardly blame the FTC for re-evaluating its late-90s-era stance.

No Going Back

Uncle Sam is coming, in a clear move to regulate privacy and data security and protect consumers. As highlighted recently in the New York Attorney General’s report on data breaches, the pressure is only growing to do something about the problem of dramatically-increasing data breaches. As such, it was only a matter of time until the Federal Government responded to political pressure and “got into the game” already commenced by the states.

Thus, while the precise outcome of FTC v. Wyndham cannot be predicted, it is overwhelmingly likely that the FTC will “get what it wants,” broadly speaking: either through the upholding of its asserted discretionary power, or instead by being required to issue more detailed regulations on privacy and data security.

Either way, this case should be a wake-up call to businesses, many of which are already covered by state privacy and data security laws but perhaps have not felt the inter-jurisdictional litigation risk was significant enough to bring their policies and practices into compliance with the strictest jurisdictions (such as California and Florida, or even other nations, such as Canada).

The precise outcome of FTC v. Wyndham notwithstanding, the federal government will henceforth be looking more closely at all data breaches in the country – particularly major ones – and may be under pressure to act quickly and stringently in response to public outcry. But “smaller” breaches will most certainly be fair game as well; small- and mid-sized businesses should therefore take heed. That means getting in touch with a certified OlenderFeldman privacy and data security attorney to make sure your business’s policies and procedures genuinely protect you and your users and customers… and put you ahead of the blowing “Wynds of change” of federal regulation.

By: Aaron Krowne

On July 1, 2014, Delaware enacted HB 295, which provides for the “safe destruction of records containing personal identifying information” (codified at Chapter 50C, Title 6, Subtitle II, of the Delaware Code). The law goes into effect January 1, 2015.

Overview of Delaware’s Data Destruction Law

In brief, the law requires a commercial entity to take reasonable steps to destroy, or arrange for the destruction of, consumers’ personal identifying information when the entity seeks to dispose of that information.

The core of this directive is the duty to “take reasonable steps to destroy” the data. No specific requirement is given; the law offers only a few suggestions, such as shredding, erasing, and overwriting information, creating some uncertainty as to what steps an entity must take in order to achieve compliance.
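For electronic records, “overwriting” is the suggestion most likely to need translation into practice. Below is a minimal illustrative sketch (the function name and three-pass default are our own, not the statute’s) of an “overwrite then delete” routine; whether any particular technique amounts to “reasonable steps” remains a legal judgment, and overwriting is not reliable on all media:

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes, then remove it.

    Illustrative only: not a guarantee of legal compliance, and not
    effective on all storage media (e.g., SSDs with wear leveling).
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random data
            f.flush()
            os.fsync(f.fileno())       # force the write to disk
    os.remove(path)
```

In practice, many entities dispatch this job to a specialized third-party destruction vendor, which the law appears to permit.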

For purposes of this law, “commercial entity” (CE) is defined to cover almost any type of business entity except governmental entities (in contrast to, say, Florida’s law). Importantly, Delaware’s definition of a CE clearly includes charities and nonprofits.

The definition of personal identifying information (PII) is central to complying with the law. For purposes of this law, PII is defined as a consumer’s first name or first initial and last name, in combination with any one of the individual’s: Social Security number, passport number, driver’s license or state ID card number, insurance policy number, financial/bank/credit/debit account number, tax or payroll information, or confidential health care information. “Confidential health care information” is intentionally defined broadly so as to cover essentially a patient’s entire health care history.

The definition of PII also, importantly, excludes information that is encrypted, meaning, somewhat surprisingly, that encrypted information is deemed not to be “personal identifying information” under this law. This implies that, if any of the above listed data is encrypted, all of the consumer’s data may be retainable forever – even if judged no longer useful or relevant.
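Putting the name-plus-element test and the encryption carve-out together, the statutory definition can be sketched as a simple predicate. The field names below are hypothetical (the statute speaks of data elements, not of any particular record schema), and this is an illustration of the definition’s logic, not compliance advice:

```python
# Hypothetical field names standing in for the statute's data elements.
SENSITIVE_FIELDS = {
    "ssn", "passport_number", "drivers_license", "state_id",
    "insurance_policy_number", "financial_account_number",
    "tax_information", "payroll_information", "health_care_information",
}

def is_delaware_pii(record: dict) -> bool:
    """Sketch of Delaware's PII test: (first name or first initial, plus
    last name) combined with at least one sensitive element, subject to
    the encryption carve-out (encrypted data is excluded entirely)."""
    if record.get("encrypted"):
        return False  # encrypted data is not PII under this law
    has_name = bool(record.get("last_name")) and bool(
        record.get("first_name") or record.get("first_initial"))
    has_sensitive = any(record.get(f) for f in SENSITIVE_FIELDS)
    return has_name and has_sensitive
```

Note how the carve-out dominates: the same record flips from covered to not covered solely by virtue of encryption.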

The definition of “consumer” in the law is also noteworthy, as it is defined so as to expressly exclude employees, and only covers individuals (not CEs) engaged in non-business transactions. Thus, rather surprisingly, an individual engaging in a transaction with a CE for their sole proprietorship business is not covered by the law.

Penalties and Enforcement

The law does not provide for any specific monetary damages in the case of “a record unreasonably disposed of.” But, it does provide a private right of action, whereby consumers may bring suit for an improper record disposal in case of actual damages – however, that violation must be reckless or intentional, not merely negligent. Additionally, and perhaps to greater effect, the Attorney General may bring either a lawsuit or an administrative action against a CE.

Who Is Not Affected?

The law expressly exempts entities covered by pre-existing pertinent regulations, such as health-related companies covered by the Health Insurance Portability and Accountability Act, as well as banks, financial institutions, and consumer reporting agencies. At this point it remains unclear whether CEs without Delaware customers fall within the scope of this law, which is written so broadly that it does not limit its scope to either Delaware CEs or non-Delaware CEs with Delaware customers. Therefore, if your business falls into either category, the safest option is to comply with the provisions of the law.

Implications and Questions

We have already seen above that this facially-simple law contains many hidden wrinkles and leaves some open questions. Some further elaborations and questions include:

  • What are “reasonable steps to destroy” PII? Examples are given, but the intent seems to be to leave the specifics up to the CE’s judgment – including dispatching the job to a third party.
  • The “when” of disposal: the law applies when the CE “seeks to permanently dispose of” the PII. Does, then, the CE judging the consumer information as being no longer useful or necessary count? Or must the CE make an express disposal decision for the law to apply? If it is the latter, can CEs forever-defer applicability of the law by simply never formally “disposing” of the information (perhaps expressly declaring that it is “always” useful)?
  • Responsibility for the information – the law applies to PII “within the custody or control” of the CE. When does access constitute “custody” or “control”? With social networks, “cloud” storage and services, and increasingly portable, “brokered” consumer information, this is likely to become an increasingly tested issue.

Given these considerable questions, as well as the major jurisdictional ambiguity discussed above (and additional ones included in the extended version of this post), potential CEs (Delaware entities, as well as entities who may have Delaware customers) should make sure they are well within the bounds of compliance with this law. The best course of action is to contact an experienced OlenderFeldman attorney, and make sure your privacy and data disposal policies place your business comfortably within compliance of Delaware’s new data destruction law.

By: Aaron Krowne

In a major recent case testing California’s medical information privacy law, the Confidentiality of Medical Information Act, or CMIA (California Civil Code § 56 et seq.), the Third Appellate District of the California Court of Appeal held in Sutter Health v. Superior Court on July 21, 2014 that confidential information covered by the law must be “actually viewed” for the statutory penalty provisions of the law to apply. The implication of this decision is that it just got harder for consumers to sue for a “pure” loss of privacy due to a data breach in California, and possibly beyond.

Not So Strict

Previously, CMIA was assumed to be a strict liability statute, as in the absence of actual damages, a covered party that “negligently released” confidential health information was still subject to a $1,000 nominal penalty. That is, if a covered health care provider or health service company negligently handled customer information, and that information was subsequently taken by a third party (e.g., through the theft of a computer or data device containing such information), that in itself triggered the $1,000 per-instance (and thus, per-customer-record) penalty. There was no suggestion that the thief (or other recipient) of the confidential health information needed to see, or do anything with, such information. Indeed, plaintiffs had previously brought cases under such a “strict liability” theory and succeeded in the application of CMIA’s $1,000 penalty.

 Sutter Health turns that theory on its head, with dramatically different results for consumers and California health-related companies.

Sutter was looking at a potential $4 billion fine, stemming from the October 2011 theft of a computer from its offices containing 4 million unencrypted client records. Sutter’s computer was password-protected, but without encryption of the underlying data this measure is easily defeated. Security at the office was light, with no alarm or surveillance cameras. Believing this to be “negligent,” some affected Sutter customers sued under CMIA in a class action. Given the potential amount of the total fine, the stakes were high.
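The headline exposure figure follows directly from the nominal-penalty theory:

```python
# Sutter's potential CMIA exposure under the pre-Sutter strict-liability reading
records = 4_000_000        # unencrypted client records on the stolen computer
nominal_penalty = 1_000    # CMIA's per-violation nominal penalty, in dollars
potential_exposure = records * nominal_penalty
print(potential_exposure)  # 4000000000, i.e., $4 billion
```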

The Court not only ruled against the Sutter customers, but dismissed the case on demurrer, meaning the Court determined that the case was deficient on the pleadings because the Plaintiffs “failed to state a cause of action.” The main reason, according to the Court, was that Plaintiffs failed to allege that an unauthorized person actually viewed the confidential information; therefore there was no breach of confidentiality, as required under CMIA. The Court elaborated that under CMIA “[t]he duty is to preserve confidentiality, and a breach of confidentiality is the injury protected against. Without an actual confidentiality breach there is no injury and therefore no negligence…”.

The Court also introduced the concept of possession, which is absent in CMIA itself, to delimit its new theory interpreting CMIA, saying: “[t]hat [because] records have changed possession even in an unauthorized manner does not [automatically] mean they have been exposed to the view of an unauthorized person.” So, plaintiffs bringing claims under CMIA will now have to allege, and ultimately prove, that their confidential information (1) changed possession in an unauthorized manner, and that (2) it was actually viewed (or, presumably, used) by an unauthorized party.

The Last Word?

This may not be the last word on CMIA, and certainly not the general issue of the burden of proof of harm in consumer data breaches. The problem is that it is extremely difficult to prove that anything nefarious has actually happened with sensitive consumer data post-breach, short of catching the perpetrator and getting a confession, or actually observing the act of utilization, or sale of the data to a third party. Even positive results detected through credit monitoring, such as attempts to use credit cards by unauthorized third parties, do not conclusively prove that a particular breach was the cause of such unauthorized access.

The Sutter court avers, in support of its ruling, that we do not actually know whether the thief in this case simply stole the computer, wiped the hard drive clean, and sold it as a used computer, in which case there would have been no violation of CMIA. Yet, logically, the opposite may just as well have happened: retrieval of the customer data may very well have been the actual goal of the theft. In an environment where sensitive consumer records can fetch as much as $45 apiece (totaling $180 million for the Sutter customer data), it seems unwise to rely on the assumption that thieves will simply not bother to check for valuable information on stolen corporate computers and digital devices.

Indeed, the Sutter decision perhaps raises as many questions as answers on where to draw the line for “breach of confidential information.” To wit: presumably, a hacker downloading unencrypted information would still qualify for this status under the CMIA, so interpreted. But then, by what substantive rationale does the physical removal of a hard drive in this case not qualify? Additionally, how is it determined whether a party actually looked at the data, and precisely who looked at it?

Further, the final chapter on the Sutter breach may not yet be written – the data may still be (or turn out to have been) put to nefarious use, in which case, the court’s ruling will seem premature. Thus, there is likely to be some pushback to Sutter, to the extent that consumers do not accept the lack of punitive options in “open-ended” breaches of this nature, and lawmakers actually intend consumer data-handling negligence laws to have some “bite.”

Conclusion

Naively, it would seem under the Sutter Court’s interpretation, that companies dealing with consumer health information have a “blank check” to treat that information negligently – so long as the actual viewing (and presumably, use) of that information by unauthorized persons is a remote possibility. We would caution against this assumption. First, as above, there may be some pushback (judicially, legislatively, or in terms of public response) to Sutter’s strict requirement of proof of viewing of breached records. But more importantly, there is simply no guarantee that exposed information will not be released and be put to harmful use, and that sufficient proof of such will not surface for use in consumer lawsuits.

One basic lesson of Sutter is that, while the company dodged a bullet thanks to a court’s re-interpretation of the law, it (and its customers) would have been vastly safer had it simply encrypted the data. More broadly, Sutter should have had, and implemented, a better data security policy. Companies dealing with customers’ health information (in California and elsewhere) should take every possible precaution to secure this information.

Do not put your company and your customers at risk of data breaches: contact a certified privacy attorney at OlenderFeldman to make sure your company’s data security policy addresses all applicable health information laws.

By: Aaron Krowne

On June 20, 2014, Florida enacted SB 1524, the Florida Information Protection Act of 2014 (“FIPA”). The law updates Florida’s existing data breach law, creating one of the strongest consumer data protection laws in the nation through the use of strict transparency requirements. FIPA applies to any entity with customers (or users) in Florida, so businesses with a national reach should take heed.

Overview of FIPA

FIPA requires any covered business to make notification of a data breach within 30 days of when the personal information of Florida residents is implicated in the breach. Additionally, FIPA requires the implementation of “reasonable measures” to protect and secure electronic data containing personal information (such as e-mail address/password combinations and medical information), including a data destruction requirement upon disposal of the data.

Be forewarned: The penalties provided under FIPA pack a strong punch. Failure to make the required notification can result in a fine of up to $1,000 a day for up to 30 days; a $50,000 fine for each 30-day period (or fraction thereof) afterwards; and beyond 180 days, $500,000 per breach. Violations are to be treated as “unfair or deceptive trade practices” under Florida law. Of note for businesses that utilize third party data centers and data processors, covered entities may be held liable for these third party agents’ violations of FIPA.
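One way to make the penalty schedule concrete is to compute the maximum exposure for a given notification delay. The sketch below reflects our reading of the schedule described above (the function name and structure are ours, the statute controls, and actual fines are “up to” these amounts):

```python
import math

def max_fipa_fine(days_late: int) -> int:
    """Maximum fine for a late FIPA breach notification, per the schedule
    described above. Illustrative reading only, not legal advice."""
    if days_late <= 0:
        return 0
    if days_late > 180:
        return 500_000                          # per-breach cap beyond 180 days
    fine = min(days_late, 30) * 1_000           # up to $1,000/day for first 30 days
    if days_late > 30:
        extra_periods = math.ceil((days_late - 30) / 30)
        fine += extra_periods * 50_000          # $50,000 per 30-day period or fraction
    return fine
```

For example, under this reading a notification 45 days late could draw $30,000 for the first 30 days plus $50,000 for one partial subsequent period, or $80,000 in total.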

While the potential fines for not following the breach notification protocols are steep, no private right of action exists under FIPA.

The Notification Requirement

Any covered business that discovers a breach must, generally, notify the affected individuals within 30 days of the discovery of the breach. The business must also notify the Florida Attorney General within 30 days if more than 500 Florida residents are affected.

However, if the cost of sending individual breach notifications is estimated to be over $250,000, or where over 500,000 customers are affected, businesses may satisfy their obligations under FIPA by notifying customers via a conspicuous web site posting and by running ads in the affected areas (as well as filing a report with the Florida AG’s office).

Where a covered business reasonably self-determines that there has been no harm to Florida residents, and therefore notifications are not required, it must document this determination in writing, and must provide such written determination to the Florida AG’s office within 30 days.

Finally, FIPA provides a strong incentive for businesses to encrypt their consumer data, as notification to affected individuals is not required if the personal information was encrypted.
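The notification rules just described reduce to a fairly mechanical decision tree. The following sketch is illustrative only: the names are ours, the thresholds are as described above, and an actual breach response should be driven by counsel and the statute itself:

```python
def fipa_notice_channels(affected_fl_residents: int,
                         individual_notice_cost: float,
                         data_was_encrypted: bool) -> list:
    """Sketch of the FIPA notification channels described above."""
    if data_was_encrypted:
        return []  # encryption exemption: notice to individuals not required
    channels = []
    if individual_notice_cost > 250_000 or affected_fl_residents > 500_000:
        # Substitute notice in lieu of individual notifications
        channels += ["conspicuous website posting",
                     "media notice in affected areas"]
    else:
        channels.append("individual notice within 30 days")
    if affected_fl_residents > 500:
        channels.append("report to Florida AG within 30 days")
    return channels
```

A breach affecting 1,000 Florida residents at modest notification cost, for instance, would require both individual notice and a report to the Florida AG.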

Implications and Responsibilities

 One major take-away of the FIPA responsibilities outlined above is the importance of formulating and writing a data security policy. FIPA requires the implementation of “reasonable measures” to protect and secure personal information, implying that companies should already have such measures formulated. Having a carefully crafted data security policy will also help covered businesses to determine what, if any, harm has occurred after a breach and whether individual reporting is ultimately required.

For all of the above-cited reasons, FIPA adds urgency to a business formulating a privacy and data security policy if it does not have one – and if it already has one, making sure that it meets the FIPA requirements. Should you have any questions do not hesitate to contact one of OlenderFeldman’s certified privacy attorneys to make sure your data security policy adequately responds to breaches as prescribed under FIPA.

New Jersey Law Requires Photocopiers and Scanners To Be Erased Because Of Privacy Concerns

NJ Assembly Bill A-1238 requires the destruction of records stored on digital copy machines under certain circumstances in order to prevent identity theft

By Alice Cheng

Last week, the New Jersey Assembly passed Bill A-1238 in an attempt to prevent identity theft. The bill requires that information stored on photocopy machines and scanners be destroyed before the devices change hands (e.g., when resold or returned at the end of a lease agreement).

Under the bill, owners of such devices are responsible for the destruction, or for arranging the destruction, of all records stored on the machines. Most consumers are not aware that digital photocopy machines and scanners store and retain, on their hard drives, copies of documents that have been printed, scanned, faxed, and emailed. That is, when a document is photocopied, the copier’s hard drive often keeps an image of that document. Thus, anyone who takes possession of the photocopier (e.g., when it is sold or returned) can obtain copies of all documents that were copied or scanned on the machine. This compilation of documents and potentially sensitive information poses a serious threat of identity theft.

Any willful or knowing violation of the bill’s provisions may result in a fine of up to $2,500 for the first offense and $5,000 for subsequent offenses. Identity theft victims may also bring legal action against offenders.

To avoid these consequences, businesses should be mindful of the type of information stored on such devices and ensure that all data is erased before reselling or returning them. Business owners should be especially careful, as digital copy machines may also contain trade secrets and other sensitive business information.

Check Cloud Contracts for Provisions Related to Privacy, Data Security and Regulatory Concerns

“Cloud” Technology Offers Flexibility, Reduced Costs, Ease of Access to Information, But Presents Security, Privacy and Regulatory Concerns

With the recent introduction of Google Drive, cloud computing services are garnering increased attention from entities looking to more efficiently store data. Specifically, using the “cloud” is attractive due to its reduced cost, ease of use, mobility and flexibility, each of which can offer tremendous competitive benefits to businesses. Cloud computing refers to the practice of storing data on remote servers, as opposed to on local computers, and is used for everything from personal webmail to hosted solutions where all of a company’s files and other resources are stored remotely. As convenient as cloud computing is, it is important to remember that these benefits may come with significant legal risk, given the privacy and data protection issues inherent in the use of cloud computing. Accordingly, it is important to check your cloud computing contracts carefully to ensure that your legal exposure is minimized in the event of a data breach or other security incident.

Cloud computing allows companies convenient, remote access to their networks, servers and other technology resources, regardless of location, thereby creating “virtual offices” which give employees remote access to files and data identical in scope to the access they have in the office. The cloud offers companies flexibility and scalability, enabling them to pool and allocate information technology resources as needed, using the minimum amount of physical IT resources necessary to service demand. These hosted solutions enable users to easily add or remove storage or processing capacity to accommodate fluctuating business needs. By utilizing only the resources necessary at any given point, cloud computing can provide significant cost savings, which makes the model especially attractive to small and medium-sized businesses. However, the rush to adopt cloud computing for its various efficiencies often comes at the expense of data privacy and security.

The laws that govern cloud computing are (perhaaps somewhat counterintuitively) based on the physical location of the cloud provider’s servers, rather than the location of the company whose information is being stored. American state and federal laws concerning data privacy and security tend to vary, while servers in Europe are subject to more comprehensive (and often more stringent) privacy laws. However, this may change, as the Federal Trade Commission (FTC) has been investigating the privacy and security implications of cloud computing as well.

In addition to location-based considerations, companies expose themselves to potentially significant liability depending on the types of information stored in the cloud. Federal, state and international laws all govern the storage, use and protection of certain types of personally identifiable information and protected health information. For example, the Massachusetts Data Security Regulations require all entities that own or license personal information of Massachusetts residents to ensure appropriate physical, administrative and technical safeguards for that personal information (regardless of where the companies are physically located), with fines of up to $5,000 per incident of non-compliance. That means that companies are directly responsible for the actions of their cloud computing service providers. OlenderFeldman LLP notes that some information is inappropriate for storage in the cloud without proper precautions: “We strongly recommend against storing any type of personally identifiable information, such as birth dates or social security numbers, in the cloud. Similarly, sensitive information such as financial records, medical records and confidential legal files should not be stored in the cloud where possible, unless it is encrypted or otherwise protected.” In fact, even a data breach related to non-sensitive information can have serious adverse effects on a company’s bottom line and, perhaps more distressing, its public perception.

Additionally, the information your company stores in the cloud will be affected by the rules set forth in the privacy policies and terms of service of your cloud provider. Although these terms may seem like legal boilerplate, they may very well form a binding contract which you are presumed to have read and consented to. Accordingly, it is extremely important to have a grasp of what is permitted and required by your cloud provider’s privacy policies and terms of service. For example, the privacy policies and terms of service will dictate whether your cloud service provider is a data processing agent, which will only process data on your behalf, or a data controller, which has the right to use the data for its own purposes as well. Notwithstanding the terms of your agreement, if the service is being provided for free, you can safely presume that the cloud provider is a data controller who will analyze and process the data for its own benefit, such as to serve you ads.

Regardless, when sharing data with cloud service providers (or any other third-party service providers), it is important to obligate those third parties to process data in accordance with applicable law, as well as your company’s specific instructions — especially when the information is personally identifiable or sensitive in nature. This is particularly important because, in addition to the loss of goodwill, most data privacy and security laws hold companies, rather than service providers, responsible for compliance with those laws. That means that your company needs to ensure the data’s security, regardless of whether it is in a third party’s (the cloud provider’s) control. It is important for a company to agree with the cloud provider as to the appropriate level of security for the data being hosted. Christian Jensen, a litigation attorney at OlenderFeldman LLP, recommends contractually binding third parties to comply with applicable data protection laws, especially where the law places the ultimate liability on you. “Determine what security measures your vendor employs to protect data,” suggests Jensen. “Ensure that access to data is properly restricted to the appropriate users.” Jensen notes that since data protection laws generally do not specify the levels of commercial liability, it is important to ensure that your contract with your service providers allocates risk via indemnification clauses, limitations of liability and warranties. Businesses should also reserve the right to audit the cloud service provider’s data security and information privacy compliance measures, in order to verify that the provider is adhering to its stated privacy policies and terms of service. Such audits can be carried out by an independent third-party auditor, where necessary.

OlenderFeldman LLP was interviewed by Jennifer Banzaca of the Hedge Fund Law Report for a three part series entitled, “What Concerns Do Mobile Devices Present for Hedge Fund Managers, and How Should Those Concerns Be Addressed?” (Subscription required; free two week subscription available.) Some excerpts of the topics Jennifer and Aaron discussed follow. You can read the third entry here.

Preventing Access by Unauthorized Persons

This section highlights steps that hedge fund managers can take to prevent unauthorized users from accessing a mobile device or any transmission of information from a device.  Concerns over unauthorized access are particularly acute in connection with lost or stolen devices.

[Lawyers] recommended that firms require the use of passwords or personal identification numbers (PINs) to access any mobile device that will be used for business purposes.  Aaron Messing, a Corporate & Information Privacy Associate at OlenderFeldman LLP, further elaborated, “We generally emphasize setting minimum requirements for phone security.  You want to have a mobile device lock with certain minimum requirements.  You want to make sure you have a strong password and that there is boot protection, which is activated any time the mobile device is powered on or reactivated after a period of inactivity.  Your password protection needs to be secure.  You simply cannot have a password that is predictable or easy to guess.”

Second, firms should consider solutions that facilitate the wiping (i.e., erasing) of firm data on the mobile device to prevent access by unauthorized users . . . . [T]here are numerous available wiping solutions.  For instance, the firm can install a solution that will facilitate remote wiping of the mobile device if the mobile device is lost or stolen.  Also, to counter those that try to access the mobile device by trying to crack its password, a firm can install software that automatically wipes firm data from the mobile device after a specific number of failed log-in attempts.  Messing explained, “It is also important for firms to have autowipe ability – especially if you do not have a remote wipe capability – after a certain number of incorrect password entries.  Often when a phone is lost or stolen, it is at least an hour or two before the person realizes the mobile device is missing.”
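The auto-wipe behavior Messing describes can be pictured as a simple counter that erases firm data once a threshold of failed password entries is reached. The sketch below is purely illustrative — real mobile platforms implement this natively, and the class, threshold, and `wipe` method here are hypothetical stand-ins, not any vendor’s actual API:

```python
class ManagedDevice:
    """Illustrative model of auto-wipe after repeated failed logins."""

    def __init__(self, pin: str, max_attempts: int = 10):
        self.pin = pin
        self.max_attempts = max_attempts
        self.failed_attempts = 0
        self.wiped = False

    def wipe(self) -> None:
        # In practice this would erase firm data (or the entire device).
        self.wiped = True

    def unlock(self, attempt: str) -> bool:
        if self.wiped:
            return False  # a wiped device no longer grants access
        if attempt == self.pin:
            self.failed_attempts = 0  # successful login resets the counter
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.max_attempts:
            self.wipe()  # too many failures: destroy firm data
        return False
```

The key design point from the excerpt is that this protection works even without network connectivity, which is why it complements (rather than replaces) remote wipe for the window before a loss is noticed.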

Wipe capability can also be helpful when an employee leaves the firm or changes mobile devices. . . Messing further elaborated, “When an employee leaves, you should have a policy for retrieving proprietary or sensitive information from the employee-owned mobile device and severing access to the network.  Also, with device turnover – if employees upgrade phones – you want employees to agree and acknowledge that you as the employer can go through the old phone and wipe the sensitive aspects so that the next user does not have the ability to pick up where the employee left off.”

If a firm chooses to adopt a wipe solution, it should adopt policies and procedures that ensure that employees understand what the technology does and obtain consent to the use of such wipe solutions.  Messing explained, “What we recommend in many cases is that as a condition of enrolling a device on the company network, employees must formally consent to an ‘Acceptable Use’ policy, which defines all the situations when the information technology department can remotely wipe the mobile device.  It is important to explain how that wipe will impact personal device use and data and employees’ data backup and storage responsibilities.”

Third, a firm should consider adopting solutions that prevent unauthorized users from gaining remote access to a mobile device and its transmissions.  Mobile security vendors offer products to protect a firm’s over-the-air transmissions between the server and a mobile device and the data stored on the mobile device.  These technologies allow hedge fund managers to encrypt information accessed by the mobile device – as well as information being transmitted by the mobile device – to ensure that it is secure and protected.  For instance, mobile devices can retain and protect data with WiFi and mobile VPNs, which provide mobile users with secure remote access to network resources and information.

Fourth, Rege suggested hedge fund managers have a procedure for requiring certificates to establish the identity of the device or a user.  “In a world where the devices are changing constantly, having that mechanism to make sure you always know what device is trying to access your system becomes very important.”

Preventing Unauthorized Use by Firm Personnel

Hedge fund managers should be concerned not only by potential threats from external sources, but also potential threats from unauthorized access and use by firm personnel.

For instance, hedge fund managers should protect against the theft of firm information by firm personnel.  Messing explained, “You want to consider some software to either block or control data being transferred onto mobile devices.  Since some of these devices have a large storage capacity, it is very easy to steal data.  You have to worry not only about external threats but internal threats as well, especially when it comes to mobile devices, you want to have system controls that are put in place to record and maybe even limit the data being taken from or copied onto mobile devices.”

Monitoring Solutions

To prevent unauthorized access and use of the mobile device, firms can consider remote monitoring.   However, monitoring solutions raise employee privacy concerns, and the firm should determine how to address these competing concerns.

Because of gaps in expectations regarding privacy, firms are much more likely to monitor activity on firm-provided mobile devices than on personal mobile devices. . . . In addressing privacy concerns, Messing explained, “You want to minimize the invasion of privacy and make clear to your employees the extent of your access.  When you are using proprietary technology for mobile applications, you can gain a great deal of insight into employee usage and other behaviors that may not be appropriate – especially if not disclosed.  We are finding many organizations with proprietary applications tracking behaviors and preferences without considering the privacy implications.  Generally speaking, you want to be careful how you monitor the personal device if it is also being used for work purposes.  You want to have controls to determine an employee’s compliance with security policies, but you have to balance that with a respect for that person’s privacy.  When it comes down to it, one of the most effective ways of doing that is to ensure that employees are aware of and understand their responsibilities with respect to mobile devices.  There must be education and training that goes along with your policies and procedures, not only with the employees using the mobile devices, but also within the information technology department as well.  You have people whose job it is to secure corporate information, and in the quest to provide the best solution they may not even consider privacy issues.”

As an alternative to remote monitoring, a firm may decide to conduct personal spot checks of employees’ mobile devices to determine if there has been any inappropriate activity.  This solution is less intrusive than remote monitoring, but likely to be less effective in ferreting out suspicious activity.

Policies Governing Archiving of Books and Records

Firms should consider both technology solutions and monitoring of mobile devices to ensure that they are capturing all books and records required to be kept pursuant to the firm’s books and records policies and applicable law and regulation.

Also, firms may contemplate instituting a policy to search employees’ mobile devices and potentially copying materials from such mobile devices to ensure the capture of all such information or communications from mobile devices.  However, searching and copying may raise privacy concerns, and firms should balance recordkeeping requirements and privacy concerns.  Messing explained, “In the event of litigation or other business needs, the company should image, copy or search an employee’s personal device if it is used for firm business.  Therefore, employees should understand the importance of complying with the firm’s policies.”

Policies Governing Social Media Access and Use by Mobile Devices

Many firms will typically have some policies and procedures in place that ban or restrict the proliferation of business information via social media sites such as Facebook and Twitter, including with respect to the use of firm-provided mobile devices.  Specifically, such a policy could include provisions prohibiting the use of the firm’s name; prohibiting the disclosure of trade secrets; prohibiting the use of company logos and trademarks; addressing the permissibility of employee discussions of competitors, clients and vendors; and requiring disclaimers.

Messing explained, “We advise companies just to educate employees about social media.  If you are going to be on social media, be smart about what you are doing.  To the extent possible, employees should note their activity is personal and not related to the company.  They also should draw distinctions, where possible, between their personal and business activities.  These days it is increasingly blurred.  The best thing to do is just to come up with common sense suggestions and educate employees on the ramifications of certain activities.  In this case, ignorance is usually the biggest issue.”

Ultimately, many hedge fund managers recognize the concerns raised by mobile devices.  However, many also recognize the benefits that can be gained from allowing employees to use such devices.  In Messing’s view, the benefits to hedge fund managers outweigh the costs.  “Everything about a mobile device is problematic from a security standpoint,” Messing said, “but the reality is that the benefits far outweigh the costs in that productivity is greatly enhanced with mobile devices.  It is simply a matter of mitigating the concerns.”

OlenderFeldman LLP was interviewed by Jennifer Banzaca of the Hedge Fund Law Report for a three part series entitled, “What Concerns Do Mobile Devices Present for Hedge Fund Managers, and How Should Those Concerns Be Addressed?” (Subscription required; Free two week subscription available.) Some excerpts of the topics Jennifer and Aaron discussed follow. You can read the second entry here.

Three Steps That Hedge Fund Managers Should Take before Crafting Mobile Device Policies and Procedures

As indicated, before putting pen to paper to draft mobile device policies and procedures, hedge fund managers should take at least the following three steps.  Managers that already have mobile device policies and procedures in place, or that have other policies and procedures that incidentally cover mobile devices, may take the following three steps in revising the other relevant policies and procedures.

First, Aaron Messing, a Corporate & Information Privacy Lawyer at OlenderFeldman LLP, advised that hedge fund managers should ensure that technology professionals are integrally involved in developing mobile device policies and procedures.  Technology professionals are vital because they can understand the firm’s technological capabilities, and they can inform the compliance department about the technological solutions available to address compliance risks and to meet the firm’s goals.  Such technology professionals can be manager employees, outside professionals or a combination of both.  The key is that such professionals understand how technology can complement rather than conflict with the manager’s compliance and business goals.

Second, the firm should take inventory of its mobile device risks and resources before beginning to craft mobile device policies and procedures.  Among other things, hedge fund managers should consider access levels on the part of its employees; its existing technological capabilities; its budget for addressing the risks of using mobile devices; and the compliance personnel available to monitor compliance with such policies and procedures.  With respect to employee access, a manager should evaluate each employee’s responsibilities, access to sensitive information and historical and anticipated uses of mobile devices to determine the firm’s risk exposure.

With respect to technology, Messing cautioned that mobile device policies and procedures should be supportable by a hedge fund manager’s current technology infrastructure and team.  Alternatively, a manager should be prepared to invest in the required technology and team.  “You should be sure that what you are considering implementing can be supported by your information technology team,” Messing said.  With respect to budgeting, a hedge fund manager should evaluate how much it is willing to spend on technological solutions to address the various risks posed by mobile devices.  Any such evaluation should be informed by accurate pricing, assessment of a range of alternative solutions to address the same risk and a realistic sense of what is necessary in light of the firm’s business, employees and existing resources.  Finally, with respect to personnel, a manager should evaluate how much time the compliance department has available to monitor compliance with any contemplated mobile device policies and procedures.

Third, hedge fund managers should specifically identify their goals in adopting mobile device policies and procedures.  While the principal goal should be to protect the firm’s information and systems, hedge fund managers should also consider potentially competing goals, such as the satisfaction levels of their employees, as expressed through employee preferences and needs.  As Messing explained, “It is not that simple to dictate security policies because you have to take into account the end users.  Ideally, when you are creating a mobile device policy, you want something that will keep end users happy by giving them device freedom while at the same time keeping your data safe and secure.  One of the things that I emphasize the most is that you have to customize your solutions for the individual firm and the individual fund.  You cannot just take a one-size-fits-all policy because if you take a policy and you do not implement it, it can be worse than not having a policy at all.”  OCIE and Enforcement staff members have frequently echoed that last insight of Messing’s.

Aaron and Jennifer also discussed privacy concerns with the use of personal devices for work:

Firm-Provided Devices versus Personal Devices:

As an alternative, some firms have considered adopting policies that require employees to make their personal phones available for periodic and surprise examinations to ensure compliance with firm policies and procedures governing the use of personal phones in the workplace.  However, this solution may not necessarily be as effective as some managers might think because many mobile device functions and apps have been created to hide information from viewing, and a mobile device user intent on keeping information hidden may be able to take advantage of such functionality to deter a firm’s compliance department from detecting any wrongdoing.  Additionally, Messing explained that such examinations also raise employee privacy concerns.  Hedge fund managers should consider using software that can separate firm information from personal information to maximize the firm’s ability to protect its interests while simultaneously minimizing the invasion of an employee’s privacy.

Regardless of the policies and procedures that a firm wishes to adopt with respect to the use of personal mobile devices by firm personnel, hedge fund managers should clearly communicate to their employees the level of firm monitoring, access and control that is expected, especially if an employee decides that he or she wishes to use his or her personal mobile device for firm-related activities.

Jennifer and Aaron also discussed controlling access to critical information and systems:

Limiting Access to and Control of Firm Information and Systems

As discussed in the previous article in this series, mobile devices raise many external and internal security threats.  For instance, if a mobile device is lost or stolen, the recovering party may be able to gain access to sensitive firm information.  Also, a firm should protect itself from unauthorized access to and use of firm information and networks by rogue employees.  A host of technology solutions, in combination with robust policies and procedures, can minimize the security risks raised by mobile devices.  The following discussion highlights five practices that can help hedge fund managers to appropriately limit access to and control of firm information and networks by mobile device users.

First, hedge fund managers should grant mobile device access only to such firm information and systems as are necessary for the mobile device user to perform his or her job functions effectively.  This limitation on access should reduce the risks associated with use of the mobile device, particularly risks related to unauthorized access to firm information or systems.

Second, hedge fund managers should consider strong encryption solutions to provide additional layers of security with respect to their information.  As Messing explained, “As a best practice, we always recommend firm information be protected with strong encryption.”

Third, a firm should consider solutions that will avoid providing direct access to the firm’s information on a mobile device.  For instance, a firm should consider putting its information on a cloud and requiring mobile device users to access such information through the cloud.  By introducing security measures to access the cloud, the firm can provide additional layers of protection over and above the security measures designed to deter unauthorized access to the mobile device.

Fourth, hedge fund managers should consider solutions that allow them to control the “business information and applications” available via a personal mobile device.  With today’s rapidly evolving technology, solutions are now available that allow hedge fund managers to control those functions that are critical to their businesses while minimizing the intrusion on the personal activities of the mobile device user.  For instance, there are applications that store e-mails and contacts in encrypted compartments that separate business data from personal data.  Messing explained, “Today, there is software to provide data encryption tools and compartmentalize business data, accounts and applications from the other aspects of the phone.  There are also programs that essentially provide an encryption sandbox that can be removed and controlled without wiping the entire device.  When you have that ability to segment off that sensitive information and are able to control that while leaving the rest of the mobile device uncontrolled, that really is the best option when allowing employees to use mobile devices to conduct business.  The solutions available are only limited by the firm’s own technology limitations and what is available for each specific device.”  This compartmentalization also makes it easier to wipe a personal mobile phone if an employee leaves the firm, with minimal intrusion to the employee.

Fifth, hedge fund managers should adopt solutions that prohibit or restrict the migration of their information to areas where they cannot control access to such information.  Data loss prevention (DLP) solutions can provide assistance in this area by offering network protection to detect movement of information across the network.  DLP software can also block data from being moved to local storage, encrypt data and allow the administrator to monitor and restrict use of mobile device storage.
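The DLP idea described above reduces to a policy check applied before data moves: is the data firm information, and is the destination one the firm controls? The sketch below is a minimal, hypothetical illustration of that rule — the tag names and destination labels are assumptions for the example, not features of any particular DLP product:

```python
# Destinations the firm controls (illustrative labels).
CONTROLLED_DESTINATIONS = {"firm_server", "encrypted_container"}


def allow_transfer(file_tags: set, destination: str) -> bool:
    """Return True if moving a file with these tags to the destination
    is permitted under a simple DLP rule: firm data may only move to
    controlled destinations."""
    if "firm-data" in file_tags and destination not in CONTROLLED_DESTINATIONS:
        return False  # block firm data from leaving controlled storage
    return True
```

Commercial DLP software layers monitoring, encryption, and administrator alerts on top of this basic allow/deny decision, but the underlying logic is the same classification-plus-destination check.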

OlenderFeldman will be speaking at the SES New York 2012 conference about emerging legal issues in search engine optimization and online behavioral advertising. The panel will discuss Legal Considerations for Search & Social in Regulated Industries:

Search in Regulated Industries
Legal Considerations for Search & Social in Regulated Industries
Programmed by: Chris Boggs
Since FDA letters to pharmaceutical companies began arriving in 2009, and with constantly increasing scrutiny of online marketing, many regulated industries have been forced to look for ways to modify their legal terms for marketing and partnering with agencies and other third-party vendors. This session will address the following:

  • Legal rules for regulated industries such as Healthcare/Pharmaceutical, Financial Services, and B2B, B2G
  • Interpretations and discussion around how Internet Marketing laws are incorporated into campaign planning and execution
  • Can a pharmaceutical company comfortably solicit inbound links in support of SEO?
  • Should Financial Services companies be limited from using terms such as “best rates”?

Looks like it will be a great panel. I will post my slideshow after the presentation.

(Updated on 3.22.12 to add presentation below)

Massachusetts Data Security Regulations

Service Providers Face New Regulations Covering Personal Information

If your company is a service provider (generally any company providing third-party services, ranging from a payroll provider to an e-commerce hosting provider) or your company utilizes service providers, you need to be aware of the Massachusetts Data Security Regulations (the “Regulations”). The Regulations require that by March 1, 2012, all service provider contracts must contain appropriate security measures to protect the personal information (as described below) of Massachusetts residents. See 201 CMR 17.03(2)(f). All companies that “own or license” personal information of Massachusetts residents, regardless of where the companies are physically located, will need to comply with the Regulations. Additionally, all entities that own or license personal information of Massachusetts residents are required to develop, implement and maintain a written information security program (“WISP”), which lists the administrative, technical and physical safeguards in place to protect personal information.

“Personal information” is defined by the Regulations as a Massachusetts resident’s first and last name, or first initial and last name, in connection with any of the following: (1) Social Security number; (2) driver’s license number or state-issued identification card number; or (3) financial account number, or credit or debit card number.
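The Regulations’ definition is a conjunction: a record qualifies as “personal information” only when a name (first and last, or first initial and last) is combined with at least one of the enumerated identifiers. A minimal sketch of that test follows — the field names are illustrative assumptions, not terms from the Regulations themselves:

```python
# Identifiers enumerated by the Regulations (illustrative field names).
SENSITIVE_FIELDS = {
    "ssn",                # Social Security number
    "drivers_license",    # driver's license number
    "state_id",           # state-issued identification card number
    "financial_account",  # financial account number
    "credit_card",        # credit card number
    "debit_card",         # debit card number
}


def is_personal_information(record: dict) -> bool:
    """Return True if the record pairs a resident's name (first name or
    first initial, plus last name) with any enumerated identifier."""
    has_name = bool(record.get("first_name")) and bool(record.get("last_name"))
    has_identifier = any(record.get(field) for field in SENSITIVE_FIELDS)
    return has_name and has_identifier
```

A name alone, or an identifier alone, does not trigger the definition; it is the combination that brings a record (and the company holding it) within the Regulations’ scope.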

If your company uses service providers, you are responsible for your service providers’ compliance with the Regulations as it relates to your business and your customers. The Regulations are clear that if your service provider receives, stores, maintains, processes, or otherwise has access to personal information of Massachusetts residents, you are responsible for making sure that your service providers maintain appropriate security measures to protect that personal information. Therefore, you should make sure that your agreements with service providers contain appropriate language, obligations and indemnifications to protect your interests and ensure compliance by your service providers. If you are a service provider, you need to develop a comprehensive WISP in order to protect yourself from liability.

If you have any questions or concerns regarding the implementation of the Regulations or how it may affect your business, please feel free to contact us.