Technology can impact the way we work, play, communicate and live, and “big data” analysis – the processing of large amounts of data in order to gain actionable insights – has the ability to radically alter society by identifying patterns and traits that would otherwise go undiscovered. This data, however, can raise significant privacy concerns in the context of a merger or acquisition.

Dun & Bradstreet interviewed us regarding various tips for customer data management during a merger or acquisition. We thought the topic was so interesting that we decided to expand on the subject.

As background, it is important to consider that there are three types of M&A transactions affecting data: stock transactions, mergers, and sales of assets. In a stock transaction, there are generally no data issues: the owners of a company sell their stock to a new owner, but the entity itself remains intact. This means business as usual from the entity’s standpoint, and there are no data or confidentiality issues.

By contrast, in a merger (where the target is not the surviving entity) or in an asset transaction, the original entity itself goes away, which means all of the assets in that entity have to be transferred, and there is a change of legal title to those assets (including to any data) which can have legal implications. For example, if a party consents to the use of their data by OldCo, and OldCo sells all of its assets to NewCo, does that party’s consent to use data also transfer to NewCo?

In a merger, data needs to be appropriately assigned and transferred, which often has privacy implications. Companies generally have privacy policies explaining how they collect and use consumers’ personal information. These policies often contain language stating that the company will not give such information to any third party without the consumer’s consent. In such situations, the transfer of data must be done in accordance with the written commitments and representations made by that company (which may vary if different representations were made to different categories of individuals), and may require providing notice to or obtaining consent from consumers (which, depending on the scope of the notice or consent required, can be an arduous task).

Companies also generally maintain employee data and client data in addition to consumer data. This information needs to be handled in accordance with contractual obligations, as well as legal obligations. National and foreign laws may also regulate the transfer of certain information. For example, in transborder transactions, or for transactions involving multinational companies, it is extremely important to ensure that any transfer of data complies with the data privacy and transborder transfer obligations applicable in all of the relevant jurisdictions.

Obligations may arise even during the contemplation of a merger, or during the due diligence process, where laws may impact the ability of companies to disclose certain information and documentation. For example, in the United States, financial companies are required to comply with the Sarbanes-Oxley Act and the Gramm-Leach-Bliley Act, which govern the controls required to protect certain types of data, and companies in the health care and medical fields are often required to comply with the Health Insurance Portability and Accountability Act.

In the multinational / crossborder context, businesses may run into challenges posed by conflicting multi-jurisdictional data protection laws, which may prevent routine data flows (such as phone lists or other employee data) to countries that are deemed to have insufficient data protection laws, or require that centralized databases comply with the laws in multiple jurisdictions. Additionally, employee rights to access and amend data, as well as requirements to obtain consent before collection and limitations on maintenance of data may cause challenges as well.

So what should companies do when contemplating or navigating a merger or acquisition? First, companies should determine what information they have. Next, companies must ensure that they understand what information they have, including the circumstances under which the information was collected, and what rights and obligations they have relative to that information. Companies should determine what ability they have to transfer information, what consents or approvals are necessary to do so, and the potential impact of a transfer on the various stakeholders.

The bottom line? Any technology, and big data in particular, can be put to both good and bad uses. It is important that as companies gather data about individuals, that that information be used in accordance with existing laws and regulations governing data use, as well as in a way that respects the privacy of the individuals to which the data pertains.

By: Aaron Krowne

On July 14, 2014, the New York Attorney General’s office (“NY AG”) released a seminal report on data breaches, entitled “Information Exposed: Historical Examination of Data Breaches in New York State” (the “Report”). The Report presents a wealth of eye-opening (and sobering) information on data breaches in New York and beyond. The Report is primarily based upon the NY AG’s own analysis of data breach reports received during the first eight years (2005 through 2013) of reporting under the State’s data breach notification law (NY General Business Law §899-aa). The Report also cites extensively to outside research, providing a national and international picture of data breaches. The Report’s primary finding is that data breaches, somewhat unsurprisingly, are a rapidly growing problem.

A Growing Menace

The headline statistic of the Report is its finding that data breaches in or affecting New York tripled between 2006 and 2013. During this time frame, 22.8 million personal records of New Yorkers were exposed in nearly 5,000 breaches, affecting more than 3,000 businesses. The “worst” year was 2013, with 7.4 million records exposed, mainly due to the Target and LivingSocial “mega-breaches.” However, while the Report warned that such mega-breaches appear to be a growing trend, businesses of all sizes are affected and at risk.

The Report revealed that hacking was responsible for 43% of breaches and 64% of the total records exposed. Other major causes of breaches include “lost or stolen equipment or documentation” (accounting for 25% of breaches), “employee error” (21% of breaches), and “insider wrongdoing” (11% of breaches). It is thus important to note that the majority of breaches still originate internally. However, since 2009 hacking has grown to become the dominant cause of breaches; not coincidentally, 2009 is the same year that “crimeware” source code was released and began to proliferate. Hacking was responsible for a whopping 96.4% of the New York records exposed in 2013 (again, largely due to the mega-breaches).

The Report notes that retail services and health care providers are “particularly” vulnerable to data breaches. The following breaks down the number of entities in a particular sector that suffered repeated data breaches: 54 “retail services” entities (a “favorite target of hackers”, per the Report), 31 “financial services” entities, 29 “health care” entities, 27 “banking” entities, and 20 “insurance” entities.

The Report also points out that these breach statistics are likely on the low side. One reason is that New York’s data breach law does not cover all breaches. For example, if only one piece of information (out of the two required types: (1) a name, number, personal mark, or other identifier which can be used to identify a natural person, combined with (2) a social security number, government ID or license number, account number, or credit or debit card number along with its security code) is compromised, the reporting requirement is not triggered. Yet the compromise of even one piece of data (e.g., a social security number) can have the same effect as a “breach” under the law, since actual damage to the consumer is still possible (particularly if the breached information can be combined with complementary information obtained elsewhere). Further, the full impact of a specific reported breach may be unknown, leading to the breach being underestimated.
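The statute’s two-part trigger can be illustrated with a short sketch. The following Python snippet (the field names are our own hypothetical labels, not statutory text, and this is an illustration of the rule’s structure rather than legal advice) models why a lone social security number leak, however damaging, does not by itself require notification:

```python
# Illustrative sketch of NY GBL §899-aa's two-part trigger: a breach is
# reportable only when an identifier (part 1) is compromised TOGETHER WITH
# a sensitive data element (part 2). Field names here are hypothetical.

IDENTIFIERS = {"name", "number", "personal_mark", "other_identifier"}
SENSITIVE_ELEMENTS = {
    "social_security_number",
    "government_id_number",
    "drivers_license_number",
    "account_number",
    "card_number_with_security_code",
}

def notification_triggered(compromised_fields: set[str]) -> bool:
    """Return True only if both an identifier and a sensitive element leaked."""
    has_identifier = bool(compromised_fields & IDENTIFIERS)
    has_sensitive = bool(compromised_fields & SENSITIVE_ELEMENTS)
    return has_identifier and has_sensitive

# A lone SSN leak does not trigger the statute, even though the consumer
# may still be harmed -- which is the Report's point about undercounting.
print(notification_triggered({"social_security_number"}))          # False
print(notification_triggered({"name", "social_security_number"}))  # True
```

As the sketch shows, the test keys on the combination of data elements, which is exactly why the Report treats the official breach counts as underestimates.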

Real Costs: Answering To The Market

Though New York’s data breach law allows the AG to bring suit for actual damages and statutory penalties for failure to notify (notification to all affected consumers and the NY AG’s office is required, and, for large breaches, to consumer reporting agencies), such awards are likely to be minor compared with the market impact and direct costs of a breach. The Report estimates that in 2013, breaches cost New York businesses $1.37 billion, based on a per-record cost estimate of $188 (breach cost estimates are from data breach research consultancy the Ponemon Institute). However, in 2014, this per-record estimate has already risen to $201. The cost for hacked records is even higher than the average, at $277. The total average cost of a breach is currently $5.9 million, up from $5.4 million in 2013. These amounts represent only costs incurred by the businesses hit, including expenses such as investigation, communications, free consumer credit monitoring, and reformulation and implementation of data security measures. Costs to the consumers themselves are not included, so this is, once again, an underestimate.
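The arithmetic behind these figures is straightforward per-record multiplication. As a rough illustration, using only the Ponemon per-record averages quoted above (which smooth over wide real-world variation in actual breach costs):

```python
# Back-of-envelope breach cost model using the Ponemon per-record estimates
# quoted in the Report. These are averages; real costs vary widely.

PER_RECORD_2013 = 188    # USD, Ponemon 2013 estimate
PER_RECORD_2014 = 201    # USD, Ponemon 2014 estimate
PER_RECORD_HACKED = 277  # USD, hacked records cost more than the average

def estimated_cost(records_exposed: int, per_record: int) -> int:
    """Estimate aggregate breach cost as records times per-record cost."""
    return records_exposed * per_record

# 7.4 million New York records exposed in 2013, at the 2013 per-record rate,
# lands in the same ballpark as the Report's $1.37 billion statewide estimate.
print(f"${estimated_cost(7_400_000, PER_RECORD_2013):,}")  # $1,391,200,000
```

The same multiplication at the 2014 rate of $201, or the $277 hacked-record rate, shows how quickly the aggregate exposure grows as per-record costs rise.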

These amounts also do not include market costs, for which the Target (2013) and Sony PlayStation (2011) mega-breaches are particularly sobering examples. Target experienced a 46% drop in quarterly profit in the wake of the massive breach of its customers’ data, and Sony estimates it lost over $1 billion. Both also suffered significant contemporaneous declines in their stock prices.

Returning to direct costs, the fallout continues: on August 5, 2014, Target announced that the costs of the 2013 breach would exceed its previous estimates, coming in at nearly $150 million.

Recommended Practices

The Report’s banner recommendation, in the face of all the above, is to have an information security plan in place, especially given that 57% of breaches are primarily caused by “inside” issues (i.e., lost/stolen records, employee error, or wrongdoing) that directly implicate information security practices. An information security plan should specifically include:

  • a privacy policy;
  • restricted and controlled access to records;
  • monitoring systems for unauthorized access;
  • use of encryption, secure access to all devices, and non-internet connected storage;
  • uniform employee training programs;
  • reasonable data disposal practices (e.g., using disk wiping programs).

The Report is not especially optimistic about preventing hacking, but we would note that hacking, or at least its efficacy, can also be reduced by the implementation of an information security plan. For example, the implementation of encryption, and the training of employees to use it uniformly and properly, can be quite powerful.

Whether the breach threat comes to you in the form of employee conduct or an outside hack attempt, don’t be caught wrong-footed without an adequate information security plan. A certified privacy attorney at OlenderFeldman can assist you with your business’s information security plan, whether you need to create one for the first time or simply need help ensuring that your current plan provides the maximum protection to your business.

By: Aaron Krowne

On July 1, 2014, Delaware signed HB 295 into law, providing for the “safe destruction of records containing personal identifying information” (codified at Chapter 50C, Title 6, Subtitle II of the Delaware Code). The law goes into effect January 1, 2015.

Overview of Delaware’s Data Destruction Law

In brief, the law requires a commercial entity to take reasonable steps to destroy or arrange for the destruction of consumers’ personal identifying information when this information is sought to be disposed of.

The core of this directive is to “take reasonable steps to destroy” the data. The law imposes no specific requirements, though it offers a few suggestions, such as shredding, erasing, and overwriting information, leaving some uncertainty as to what steps an entity must take in order to achieve compliance.

For purposes of this law, “commercial entity” (CE) is defined so as to cover almost any type of business entity except governmental entities (in contrast to, say, Florida’s law). Importantly, Delaware’s definition of a CE clearly includes charities and nonprofits.

The definition of personal identifying information (PII) is central to complying with the law. For purposes of this law, PII is defined as a consumer’s first name or first initial and last name, in combination with one of the following: social security number, passport number, driver’s license or state ID card number, insurance policy number, financial/bank/credit/debit account number, tax or payroll information, or confidential health care information. “Confidential health care information” is intentionally defined broadly so as to cover essentially a patient’s entire health care history.

The definition of PII also, importantly, excludes information that is encrypted, meaning, somewhat surprisingly, that encrypted information is deemed not to be “personal identifying information” under this law. This implies that, if any of the above listed data is encrypted, all of the consumer’s data may be retainable forever – even if judged no longer useful or relevant.

The definition of “consumer” in the law is also noteworthy, as it is defined so as to expressly exclude employees, and only covers individuals (not CEs) engaged in non-business transactions. Thus, rather surprisingly, an individual engaging in a transaction with a CE for their sole proprietorship business is not covered by the law.

Penalties and Enforcement

The law does not provide for any specific monetary damages in the case of “a record unreasonably disposed of.” However, it does provide a private right of action, whereby consumers may bring suit for an improper record disposal that causes actual damages – though the violation must be reckless or intentional, not merely negligent. Additionally, and perhaps to greater effect, the Attorney General may bring either a lawsuit or an administrative action against a CE.

Who Is Not Affected?

The law expressly exempts entities covered by pre-existing pertinent regulations, such as all health-related companies, which are covered by the Health Insurance Portability and Accountability Act, as well as banks, financial institutions, and consumer reporting agencies. At this point it remains unclear as to whether CEs without Delaware customers are considered within the scope of this law, as this law is written so broadly that it does not narrow its scope to either Delaware CEs, or to non-Delaware CEs with Delaware customers. Therefore, if your business falls into either category, the safest option is to comply with the provisions of the law.

Implications and Questions

We have already seen above that this facially-simple law contains many hidden wrinkles and leaves some open questions. Some further elaborations and questions include:

  • What are “reasonable steps to destroy” PII? Examples are given, but the intent seems to be to leave the specifics up to the CE’s judgment – including dispatching the job to a third party.
  • The “when” of disposal: the law applies when the CE “seeks to permanently dispose of” the PII. Does, then, the CE judging the consumer information as being no longer useful or necessary count? Or must the CE make an express disposal decision for the law to apply? If it is the latter, can CEs forever-defer applicability of the law by simply never formally “disposing” of the information (perhaps expressly declaring that it is “always” useful)?
  • Responsibility for the information – the law applies to PII “within the custody or control” of the CE. When does access constitute “custody” or “control”? With social networks, “cloud” storage and services, and increasingly portable, “brokered” consumer information, this is likely to become an increasingly tested issue.

Given these considerable questions, as well as the major jurisdictional ambiguity discussed above (and additional ones included in the extended version of this post), potential CEs (Delaware entities, as well as entities who may have Delaware customers) should make sure they are well within the bounds of compliance with this law. The best course of action is to contact an experienced OlenderFeldman attorney, and make sure your privacy and data disposal policies place your business comfortably within compliance of Delaware’s new data destruction law.

By: Aaron Krowne

On June 20, 2014, Florida enacted SB 1524, the Florida Information Protection Act of 2014 (“FIPA”). The law updates Florida’s existing data breach law, creating one of the strongest laws in the nation protecting consumer personal data through the use of strict transparency requirements. FIPA applies to any entity with customers (or users) in Florida – so businesses with a national reach should take heed.

Overview of FIPA

FIPA requires any covered business to make notification of a data breach within 30 days of when the personal information of Florida residents is implicated in the breach. Additionally, FIPA requires the implementation of “reasonable measures” to protect and secure electronic data containing personal information (such as e-mail address/password combinations and medical information), including a data destruction requirement upon disposal of the data.

Be forewarned: The penalties provided under FIPA pack a strong punch. Failure to make the required notification can result in a fine of up to $1,000 a day for up to 30 days; a $50,000 fine for each 30-day period (or fraction thereof) afterwards; and beyond 180 days, $500,000 per breach. Violations are to be treated as “unfair or deceptive trade practices” under Florida law. Of note for businesses that utilize third party data centers and data processors, covered entities may be held liable for these third party agents’ violations of FIPA.

While the potential fines for not following the breach notification protocols are steep, no private right of action exists under FIPA.
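To make the escalation concrete, here is one plausible reading of the fine schedule described above, sketched in Python. This models the maximum statutory fines only; actual penalties are assessed case by case, and the text of the statute itself controls:

```python
import math

# Illustrative sketch of FIPA's maximum late-notification fine schedule as
# described above: up to $1,000/day for the first 30 days, $50,000 for each
# subsequent 30-day period (or fraction thereof), and $500,000 per breach
# once the violation passes 180 days. One plausible reading, not legal advice.

def max_fipa_fine(days_late: int) -> int:
    """Maximum fine (USD) for a notification that is `days_late` overdue."""
    if days_late <= 0:
        return 0
    if days_late <= 30:
        return 1_000 * days_late
    if days_late <= 180:
        extra_periods = math.ceil((days_late - 30) / 30)
        return 30_000 + 50_000 * extra_periods
    return 500_000

print(max_fipa_fine(10))   # 10000
print(max_fipa_fine(45))   # 80000  (first 30 days plus one partial period)
print(max_fipa_fine(200))  # 500000
```

Even under this conservative reading, a notification that slips just a few weeks past the deadline multiplies the exposure several times over, which is the practical force of the schedule.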

The Notification Requirement

Any covered business that discovers a breach must, generally, notify the affected individuals within 30 days of the discovery of the breach. The business must also notify the Florida Attorney General within 30 days if more than 500 Florida residents are affected.

However, if the cost of sending individual breach notifications is estimated to be over $250,000, or where over 500,000 customers are affected, businesses may satisfy their obligations under FIPA by notifying customers via a conspicuous web site posting and by running ads in the affected areas (as well as filing a report with the Florida AG’s office).

Where a covered business reasonably self-determines that there has been no harm to Florida residents, and therefore notifications are not required, it must document this determination in writing, and must provide such written determination to the Florida AG’s office within 30 days.

Finally, FIPA provides a strong incentive for businesses to encrypt their consumer data, as notification to affected individuals is not required if the personal information was encrypted.

Implications and Responsibilities

One major takeaway of the FIPA responsibilities outlined above is the importance of formulating and writing down a data security policy. FIPA requires the implementation of “reasonable measures” to protect and secure personal information, implying that companies should already have such measures formulated. Having a carefully crafted data security policy will also help covered businesses determine what harm, if any, has occurred after a breach and whether individual reporting is ultimately required.

For all of the above-cited reasons, FIPA adds urgency for a business to formulate a privacy and data security policy if it does not have one – and, if it already has one, to make sure that it meets the FIPA requirements. Should you have any questions, do not hesitate to contact one of OlenderFeldman’s certified privacy attorneys to make sure your data security policy adequately responds to breaches as prescribed under FIPA.

Nathan D. Marinoff, Esq., Joins the Firm

Nathan specializes in corporate law and regularly advises domestic and international companies, Boards of Directors and investors in matters of corporate governance, public and private capital markets, venture capital and private equity investments, mergers and acquisitions, joint ventures, bank financings and commercial licensing and employment agreements.

Nathan began his legal career as a law clerk to a federal judge, following which he spent over seven years in private practice with Skadden, Arps, Slate, Meagher & Flom LLP and Morgan, Lewis & Bockius LLP.   Thereafter, he served as Deputy General Counsel at Virgin Mobile USA, overseeing the company’s initial public offering and its merger with Sprint Nextel, and as Senior Director, Legal at a New York private equity firm with over $8 billion in assets, providing counsel to the firm and legal oversight to over 30 portfolio companies. He is deeply involved in the community and serves as a member of the Board of Directors for two charities, The Jewish Education Project and Friends of Firefighters.

Nathan can be reached at: nmarinoff@olenderfeldman.com | 908-964-2432

When should you provide your social security number? State Farm asked us when sharing is required.

State Farm contacted OlenderFeldman LLP to ask when sharing your social security number is appropriate:

Think before revealing your Social Security Number (SSN). Its unauthorized use could lead to privacy invasion and identity fraud. Aaron Messing, an information privacy attorney at OlenderFeldman LLP, says sharing is generally required by law only for:

  • Records of financial transactions in which the IRS is interested (banking, stock market, investment, property, insurance, or other financial transactions)
  • Employment records
  • Driver’s license applications
  • Government benefit applications (Medicaid, student loans, etc.)
  • Joining the armed forces
  • Obtaining some professional or recreational licenses

 

You can see the Fast Tracks article here.

By Alice Cheng

On May 29, 2012, the Internal Revenue Service (IRS) issued proposed regulations (REG-141075-09) under Section 83 to refine and narrow the concept of a substantial risk of forfeiture. Whether a substantial risk of forfeiture exists is based on the facts and circumstances of a property transfer arrangement. The proposed regulations will address the confusion over the appropriate elements of what constitutes a substantial risk of forfeiture.

The Internal Revenue Code (IRC) Section 83 governs property transferred to an employee in connection with the performance of services. Currently, the section states that such transfers of property (typically restricted stock or stock options) are subject to federal income tax when the property is no longer subject to a substantial risk of forfeiture.

The proposed regulations will make clarifications in the following three areas:

  1. Under current regulations, a substantial risk of forfeiture exists where rights in the property are conditioned upon the performance (or non-performance) of substantial services by the employee, or upon the occurrence of a condition related to the purpose of the transfer. The proposed regulation clarifies that a substantial risk of forfeiture arises only through a future service condition or a condition relating to the purpose of the transfer.
  2. Two issues will be considered to determine whether a substantial risk of forfeiture exists—the likelihood that the forfeiture event will occur, and the likelihood that the forfeiture will be enforced.
  3. A transfer restriction (such as lock-up provisions, buyback provisions, and blackout periods) generally does not create a substantial risk of forfeiture for the purposes of the Section. However, there is an exception if the sale of property at a profit could subject a person to suit under Section 16(b) of the Securities Exchange Act of 1934.

The proposed regulations are to go into effect on January 1, 2013, and will apply to property transfers on or after that date.