The “Internet of Things” is rapidly expanding, and most households now have at least one physical object that automatically collects and exchanges data wirelessly. Manufacturers of smart devices need to ensure that security vulnerabilities and privacy concerns are rapidly addressed in order to avoid scrutiny from the Federal Trade Commission.

Law360 interviewed privacy experts, including our very own Mike Feldman, regarding ways for smart device makers to ensure that their information security and privacy practices meet industry standards. Mike recommended that companies maintain robust information security and privacy policies, train employees to identify risks, and prepare to handle data breaches and other disasters. As Mike noted:

Small companies used to believe that hackers wouldn’t be interested in their customers’ data, but reality has shown that is no longer the case, Feldman said.

“Now almost every industry has been hacked,” Feldman said. “The defense that ‘I thought it wouldn’t happen to me’ isn’t really a defense.”

Read the whole article: 3 Ways Internet Of Things Makers Can Avoid The FTC’s Ire (subscription may be required)

The biggest privacy challenges affecting businesses today are regulatory scrutiny from government agencies, media coverage with unintended consequences, and privacy risks that are discovered during corporate transactions.

Rapidly growing eCommerce and technology companies typically focus on creating viable products and services, adapting business models and responding to challenges, and using data in new ways to glean valuable insights and advantages. They often achieve success by disrupting existing industry norms and flouting convention in an attempt to do things better, faster and more cost-effectively. In the tech world, this strategy is often a blueprint for success.  At the same time, this strategy also often raises privacy concerns from regulators and investors.  In fact, three of the biggest privacy challenges affecting businesses today are regulatory scrutiny from government agencies (and potentially, personal liability arising from such scrutiny), media coverage with unintended consequences, and privacy risks that are discovered during corporate transactions.

Regulatory Scrutiny Of Privacy Practices

Government regulators, led by the Federal Trade Commission (“FTC”), have taken an activist role in enforcing privacy protections.  The FTC often does so by utilizing its powers under the FTC Act, which enables the FTC to investigate and prosecute companies and individuals for “unfair or deceptive acts or practices.” Some of the activities the FTC considers to fall under the “unfair or deceptive” umbrella are: a company’s failure to keep its privacy promises; violations of consumers’ privacy rights; and failure to maintain reasonably adequate security for sensitive consumer information.

Though most of the FTC’s investigations are settled privately, those that do become public (usually as a result of a company refusing to cooperate voluntarily or disagreeing with the FTC on the proper resolution) are often instructive. For example, the FTC recently settled charges against Snapchat, the developer of a popular mobile messaging app.  The FTC accused Snapchat of deceiving consumers with promises about the disappearing nature of messages sent through the service, the amount of personal data Snapchat collected, and the security measures taken to protect that data from misuse and unauthorized disclosure.  Similarly, when Facebook acquired WhatsApp, another cross-platform mobile messaging app, the FTC explicitly warned both companies that WhatsApp had made clear privacy promises to consumers, and that WhatsApp would be obligated to continue its current privacy practices ― even if such policies differ from those of Facebook ― or face FTC charges. The takeaway from the FTC’s recent investigations and enforcement actions is clear: (1) businesses should be very careful about the privacy representations that they make to consumers; (2) businesses should comply with the representations they make; and (3) businesses should take adequate measures to ensure the privacy and security of the personal information and other sensitive data that they obtain from consumers.

Sometimes officers and directors of businesses are named in an FTC action along with, or apart from, the company itself.  In such cases, the interests of the individuals and those of the companies often diverge as the various parties try to apportion blame internally.  In certain cases, companies and their officers are held jointly and severally liable for violations.  For example, the FTC sued Innovative Marketing Inc. and three of its owners/officers. A federal court found the business and the owners/officers to be jointly and severally liable for unfair and deceptive actions, and entered a $163 million judgment against them all. The evolving world of regulatory enforcement actions reveals that traditional liability protections (i.e., acting through a corporate entity) do not necessarily shield owners, officers, and/or directors from personal liability when privacy violations are at issue. Officers and directors should keep in mind that knowledge of, or indifference to, an unfair or deceptive practice can put them squarely in the FTC’s crosshairs ― and that the “ostrich defense” of ignoring and avoiding such issues is unlikely to produce positive results.

Unintended Consequences of Publicity

Most businesses crave publicity as a means of building credibility and awareness for their products or services. However, businesses should keep in mind that being in the spotlight can also put a company on regulators’ radar screens, potentially resulting in additional scrutiny where none previously existed. One of our clients, for example, came out with an innovative service that allows consumers to utilize their personal information in unique ways, and received significant positive publicity as a result. Unfortunately, that publicity also caught the interest of a regulatory entity, which misunderstood some of our client’s statements about the service. Ultimately, we were able to clarify for the government, in an efficient and cost-effective manner, how our client’s service works, demonstrating that no wrongdoing had occurred, and the inquiry was resolved to our client’s (and the government’s) satisfaction.  Nonetheless, the process itself caused substantial aggravation for our client, who was forced to focus on an investigation rather than on its business activities. The misunderstanding could have been avoided entirely if the client had checked with us first, before speaking with reporters, to ensure its talking points were appropriate.

Another, more public, example occurred at Uber’s launch party in Chicago. Uber, the car service company that allows users to hail a ride using a mobile app, allegedly demonstrated a “God View” function for its guests which allowed the partygoers (including several journalists) to see, among other information, the name and real-time location of some of its customers (including some well-known individuals) in New York City – information which those customers did not know was being projected onto a large screen at a private party. The resulting publicity backlash was overwhelming. Senator Al Franken wrote Uber a letter demanding an explanation of Uber’s data collection practices and policies, and Uber was forced to retain a major law firm to independently audit its privacy practices and implement changes to its policies, including limiting the availability and use of the “God View.”

Experience has shown us that contrary to the old mantra, all publicity is not necessarily good publicity when it comes to the world of privacy.  Before moving forward with publicity or marketing for your business, consider incorporating a legal review into the planning to avoid any potentially adverse impact of such publicity.

Privacy Concerns Arising During A Corporate Transaction

Perhaps most importantly to company owners, the failure to proactively address privacy issues in connection with corporate transactions can cause significant repercussions, potentially destroying an entire deal.  Most major corporate transactions involve some degree of due diligence.  That due diligence, if properly performed by knowledgeable attorneys and businesspeople, will uncover any existing privacy risks (e.g., violations of privacy-related laws, insufficient data security measures, or compliance issues that become financially overwhelming).  If these issues were not already factored into the financial terms of the transaction or affirmatively addressed from the outset, the entire landscape of the transaction can change overnight once the issues are uncovered – with the worst-case scenario being the collapse of the entire deal.  Therefore, it is critical that businesses contemplating a corporate transaction be prepared to address all relevant privacy issues upfront.  Such preparation should include an internal analysis of the business from a privacy-law perspective (i.e., determining which regulatory schemes apply, and whether the business is currently in compliance) and being prepared to respond quickly to relevant inquiries with, for example, historical policies and procedures related to privacy and data security, diagrams of network/data flow, lists of third parties with whom data has been shared, representations and warranties made to data subjects, and descriptions of complaints, investigations, and litigation pertaining to privacy issues.

Privacy and data security issues can be particularly tricky depending on the nature of the data that the company maintains and the representations that the company has made with respect to that data.  Businesses are well-advised to prepare, in advance of any corporate transaction, a due diligence checklist that includes an assessment of the business’s compliance with applicable information privacy and data security laws, as well as any potential liabilities arising from deficiencies that are discovered.  Addressing these issues proactively will allow the business to be better prepared for the transaction and will mitigate any harm that might otherwise flow from problems that arise.

By: Aaron Krowne

A heated battle is currently raging over the general province of federal regulators with respect to businesses’ privacy and data security practices. We are referring to the pending case of FTC v. Wyndham Worldwide Corp., which is much-watched in the data security world. It pits, on one side, the Federal Trade Commission (“FTC”), with its general authority to prevent “unfair or deceptive trade practices,” against Wyndham Worldwide Corp. (“Wyndham”), a hotel chain-owner recently hit by a series of high-profile data breaches. The main question to be decided is: does the FTC’s general anti-“unfair or deceptive” authority translate into a discretionary (as opposed to regulatory) power over privacy and data security practices?

Background of the Case

On July 30, 2014, FTC v. Wyndham was accepted on appeal by the Third Circuit, after Wyndham failed in its attempt to have the case dismissed. Wyndham was granted an interlocutory appeal, meaning that the Circuit Court considered the issues raised important enough to the outcome of the case to warrant hearing an appeal immediately.

The case stems from a series of data breaches in 2008 and 2009 resulting from the hacking of Wyndham computers. It is estimated that the personal information of upwards of 600,000 Wyndham customers was stolen, resulting in over $10 million in losses through fraud (e.g., credit card fraud).

The FTC filed suit against Wyndham for the breach under Section 5 of the FTC Act, alleging (1) that the breach was due to a number of inadequate security practices and policies, and was thus unfair to consumers; and (2) that this conduct was also deceptive, as it fell short of the assurances given in Wyndham’s privacy policy and its other disclosures to consumers.

The security inadequacies cited by the FTC present a virtual laundry-list of cringe-worthy data-security faux pas, including: failing to employ firewalls; permitting storage of payment card information in clear, readable text; failing to make sure Wyndham-branded hotels implemented adequate information security policies and procedures prior to connecting their local computer networks to the Hotels and Resorts corporate network; permitting Wyndham-branded hotels to connect insecure servers to the network; utilizing servers with outdated operating systems that could not receive security updates and thus could not remedy known vulnerabilities; permitting servers to have commonly-known default user IDs and passwords; failing to employ commonly-used methods to require user IDs and passwords that are difficult for hackers to guess; failing to adequately inventory computers connected to the network; failing to monitor the network for malware used in a previous intrusion; and failing to restrict third-party access.
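Two items on that list, default credentials and easily guessed passwords, are concrete enough to sketch. Below is a minimal Python illustration of the kind of credential check the FTC faulted Wyndham for lacking; the blocklists and the twelve-character minimum are assumptions chosen for this example, not requirements drawn from the complaint or any regulation.

```python
# Illustrative only: a simplistic credential check inspired by two items on
# the FTC's list (commonly-known default IDs/passwords, and passwords that
# are easy to guess). The blocklists and length rule below are assumptions
# for this sketch, not a regulatory or industry standard.

COMMON_DEFAULTS = {("admin", "admin"), ("root", "root"), ("admin", "password")}
WEAK_PASSWORDS = {"password", "123456", "letmein", "qwerty"}

def credential_is_acceptable(user_id: str, password: str) -> bool:
    """Reject vendor defaults and trivially guessable passwords."""
    if (user_id.lower(), password.lower()) in COMMON_DEFAULTS:
        return False  # commonly-known default pair
    if password.lower() in WEAK_PASSWORDS:
        return False  # on a common-password blocklist
    if len(password) < 12:
        return False  # too short to resist guessing
    if user_id.lower() in password.lower():
        return False  # password must not contain the user ID
    return True

print(credential_is_acceptable("admin", "admin"))                  # False
print(credential_is_acceptable("jsmith", "correct-horse-battery"))  # True
```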

Most people with basic knowledge of data security would agree that these alleged practices of Wyndham are highly disconcerting and do fall below commonly-accepted industry standards, and thus, anyone partaking in such practices should be exposed to legal liability for any damage that results from them. The novel development with this case is the FTC’s construction of such consumer-unfriendly practices as “unfair” under Section 5 of the FTC Act, which thus brings them under its purview for remedial and punitive action.

Wyndham resisted the FTC’s enforcement action by moving to dismiss the case, arguing (1) that poor data security practices are not “unfair” under the FTC Act; and (2) that, regardless, the FTC must promulgate formal regulations outlining any data security practices to which its prosecutorial power applies before filing suit.

Wyndham’s motion to dismiss based on these arguments was resoundingly rejected by the District Court. The Court’s primary rationale was, in effect, that the FTC Act’s Section 5 “unfair and deceptive” enforcement power was intentionally written broadly, implying that the FTC has domain over any area of corporate practice significantly impacting consumers. That broad drafting also implies that the power is largely discretionary, a discretion that would be defeated by requiring that it always be reduced to detailed regulations in advance.

Addressing the “unfairness” question directly, the FTC argued (and the District Court agreed) that, in the data-security context, “reasonableness [of the practices] is the touchstone” for Section 5 enforcement, and that, particularly, “unreasonable data security practices are unfair.” As to defining unreasonable security practices, Wyndham advocated a strict “ascertainable certainty” standard (i.e., specific regulations set out in advance), but the District Court (again, siding with the FTC) shot back that “reasonableness provides ascertainable certainty to companies.” This argument seems almost circular and fails to define what exactly is “reasonable” in this context. But the District Court observed that in other areas of federal enforcement (e.g., the National Labor Relations Board and the Occupational Safety and Health Act), an unwritten “reasonableness” standard is routinely used in the prosecution of cases. Typically, in such cases, reference is made to prevailing industry standards and practices, which, as the District Court observed, Wyndham itself referenced in its privacy policy.

Fears & Concerns

The upshot of the case is that if the FTC’s assertion of the power to enforce “reasonable” data security practices is affirmed, all privacy and data security policies must be “reasonable.” This will in turn mean that such policies must not be “unfair” generally, and also not “deceptive” relative to companies’ privacy policies. In effect, the full force of federal law, policed by the FTC, will appear behind privacy and data security policies – albeit, in a very broad and hard to characterize way. This is in stark contrast to state privacy and data security laws (such as Delaware’s, California’s or Florida’s), which generally consist of more narrowly-tailored, statutorily-delimited proscriptions.

While consumers and consumer advocates will no doubt be heartened by the Court’s broad read on the FTC’s protective power in the area of privacy and data security, not surprisingly, there are fears from both businesses and legal observers about such a new legal regime. Some of these concerns include:

  • Having the FTC “lurking over the shoulders” of companies to “second guess” their privacy and security policies.
  • A situation where the FTC is, in effect, “victimizing the victim” – prosecuting companies after they’ve already been “punished” by the direct costs and public fallout of a data breach.
  • Lack of a true industry standard against which to define “reasonable” privacy and data security policies.
  • A “checklist culture” (as opposed to a risk-based data security approach) as the FTC’s de facto data security requirements develop through litigation.
  • A wave of class-action lawsuits emboldened by FTC “unfair and deceptive” suits.
  • Uncertainty: case-by-case consent orders that provide little or no guidance to non-parties.

These concerns are definitely real, but likely will not result in much (if any) push-back in Wyndham’s favor in the District Court. That is because, while the FTC may not have asserted power over data security practices in the past (as Wyndham made sure to point out in its arguments), there is little in the FTC’s governing charter or relevant judicial history to prevent it from doing so now. Simply put, regulatory agencies can change their “minds,” including regarding what is in their regulatory purview – so long as the field in question is not explicitly beyond it. Given today’s new reality of omnipresent social networks and sensitive, cloud-resident consumer data, we can hardly blame the FTC for re-evaluating its late-90s-era stance.

No Going Back

Uncle Sam is coming, in a clear move to regulate privacy and data security and protect consumers. As highlighted recently in the New York Attorney General’s report on data breaches, the pressure to do something about dramatically increasing data breaches is only growing. As such, it was only a matter of time until the Federal Government responded to political pressure and “got into the game” already commenced by the states.

Thus, while the precise outcome of FTC v. Wyndham cannot be predicted, it is overwhelmingly likely that the FTC will, broadly speaking, “get what it wants”: either the upholding of its asserted discretionary power or, instead, a mandate to promulgate more detailed regulations on privacy and data security.

Either way, this case should be a wake-up call to businesses, many of which are in fact already covered by state laws relevant to privacy and data security, but which perhaps haven’t felt the inter-jurisdictional litigation risk was significant enough to ensure their policies and practices comply with those of the strictest states (such as California and Florida) or even other nations (such as Canada).

The precise outcome of FTC v. Wyndham notwithstanding, the federal government will henceforth be looking more closely at all data breaches in the country – particularly major ones – and may be under pressure to act quickly and stringently in response to public outcry. But “smaller” breaches will most certainly be fair game as well; thus, small- and mid-sized businesses should take heed. That means getting in touch with a certified OlenderFeldman privacy and data security attorney to make sure your business’s policies and procedures genuinely protect you and your users and customers… and put you ahead of the blowing “Wynds of change” of federal regulation.

The Federal Trade Commission has proposed revisions that will bring the Children’s Online Privacy Protection Act in line with 21st century technology, largely targeting social networks and online advertisers.

By Alice Cheng

Based on comments solicited last year, the Federal Trade Commission (FTC) has posted proposed revisions to the Children’s Online Privacy Protection Act (COPPA). The Act, which has not been updated since its inception in 1998, may be extended to include social networks and online advertisers.

According to the current regulations, COPPA applies only to website operators who know or have reason to know that users are under the age of 13, requiring those sites to obtain parental consent before any collection of data. In the past decade, an increased ability to harvest consumer information has necessitated revisions. In an FTC staff report issued earlier this year, the Commission addressed a growing need for app stores and app developers to provide parents with more information regarding their data collection practices. With the proposed changes posted today, the FTC plans to update COPPA to respond to modern concerns surrounding social networking sites, advertising networks, and applications. Under the proposed changes, such third parties may be held responsible for unlawful data collection practices when they know or have reason to know that they are connecting to children’s websites. Mixed-audience websites may have to age-screen all visitors so that COPPA regulations can be applied to users under 13. Additionally, restrictions on advertising based on children’s online activity may be tightened.
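To make the screening mechanic concrete, here is a minimal Python sketch of a neutral age gate a mixed-audience site might use to decide when a parental-consent flow applies. The under-13 cutoff comes from COPPA itself; the function names and flow are assumptions for illustration, not anything prescribed by the proposed rules.

```python
from datetime import date
from typing import Optional

COPPA_AGE_CUTOFF = 13  # COPPA protects children under 13

def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def requires_parental_consent(birth_date: date,
                              today: Optional[date] = None) -> bool:
    """True if a visitor reporting this birth date is under 13, in which
    case data collection should wait for verifiable parental consent."""
    today = today or date.today()
    return age_on(birth_date, today) < COPPA_AGE_CUTOFF

# A visitor born in June 2001 is 11 as of August 2012, so consent applies.
print(requires_parental_consent(date(2001, 6, 1), date(2012, 8, 1)))  # True
```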

The FTC will be accepting public comments on the proposed rules via the FTC website. Comments will be accepted until September 10, 2012.

Several House lawmakers have sent letters to nine major data broker firms, seeking transparency on data practices.

By Alice Cheng

Last week, eight House members, including Congressional Bi-Partisan Privacy Caucus chairmen Ed Markey (D-Mass.) and Joe Barton (R-Tex.), sent letters to nine major data broker firms, asking for information on how they collect, assemble, maintain, and sell consumer information to third parties.

The letter references a recent New York Times article profiling data broker Acxiom, which may have spurred the lawmakers’ decision to target the firms. Data brokers are large firms that aggregate information about hundreds of millions of consumers and sell it to third parties for marketing, advertising, and other purposes.  Oftentimes, consumer profiles are created to reflect spending habits, political affiliation, and other behavioral information. As the article explains, the issue with these activities is that they are largely unregulated, largely unknown to the general public, and often difficult to opt out of.

Privacy advocates, lawmakers, and the Federal Trade Commission have made continued moves toward increasing the transparency of data brokers’ activities. A statement explains that, in sending the letter to the nine firms, the lawmakers in the Bi-Partisan Privacy Caucus seek to obtain information on the brokers relating to “privacy, transparency and consumer notification, including as they relate to children and teens.”

Survey finds that only 61.3% of apps have privacy policies, reflecting a perceived need for increased app privacy regulation.

By Alice Cheng

A recent survey conducted by the Future of Privacy Forum (FPF) examined whether popular free and paid mobile apps provide users with access to a privacy policy. The survey found that 61.3% of the 150 apps examined had a privacy policy, with free apps more likely than paid apps to have one. While the number of apps with privacy policies is still low, these findings mark an overall increase from the previous year.

The FPF credits the consumer privacy efforts of various groups, including the Federal Trade Commission and the California Attorney General. The FTC has made continuous efforts to help companies develop best consumer privacy practices, and has been involved in battling privacy violations. In February, California Attorney General Kamala Harris persuaded six major companies with mobile platforms (including Apple, Microsoft, and Google) to ensure that app developers include privacy policies that comply with the California Online Privacy Protection Act. More recently, Harris also announced the formation of the Privacy Enforcement and Protection Unit to oversee privacy issues and to ensure that companies are in compliance with the state’s privacy laws.

Together with the FPF survey results, these recent strides reflect a growing nationwide concern for information privacy. However, mere access to privacy policies does not ensure that consumers are aware of what happens to information collected about them. Many policies are long and onerous, and can be confusing for consumers. As many privacy laws focus on protecting the consumer’s privacy interests, providing a clear privacy policy is oftentimes a best practice for all companies.

Use of internet and social media data for background checks violated the Fair Credit Reporting Act (FCRA)

The Federal Trade Commission fined an online data broker who allegedly sold consumer reports containing internet and social media data in the context of employment screenings without adhering to the Fair Credit Reporting Act’s consumer protections.

By Alice Cheng

Data broker Spokeo recently agreed to pay $800,000 to settle Federal Trade Commission (FTC) charges in what is the FTC’s first Fair Credit Reporting Act (FCRA) case involving the “sale of internet and social media data in the employment screening context.”

Spokeo, a social network aggregator website, has long been notorious for the comprehensive profiles (including name, address, email address, phone number, hobbies, ethnicity, religion, etc.) it compiles and sells to third parties. Personal information of individuals is collected both online and offline, and profiles have been used for employment screening purposes—a practice that the FTC has alleged is in violation of the FCRA.

The FTC recently took legal action against the company after receiving an initial complaint about its practices from the Center for Democracy & Technology in 2010. The alleged FCRA violations include failing to make sure that the information was sold for legally permissible uses only, failing to ensure that the information was accurate, and failing to notify users of the consumer reports about their obligations under the FCRA.

The FCRA is a federal law regulating the collection, dissemination, and use of consumer information (including consumer credit information) to promote the accuracy, fairness, and privacy of such information. In order to avoid violating FCRA regulations, Spokeo says it will no longer build “consumer reports” and will no longer sell its information for employment screening purposes.

Aside from potential FCRA violations, such widespread collection of data by aggregators like Spokeo continues to be an ongoing privacy issue. The collection of personally identifiable information, such as social security numbers or driver’s license numbers, carries obvious concerns, but even the collection of “non-sensitive” information can be problematic. Aggregated data is commonly used by advertisers to target prospective customers or, as in Spokeo’s case, sold to any willing buyer. While it may not always be easy to pinpoint any concrete harm to consumers, many are nevertheless uneasy about such compilations.

While the FTC has been increasingly vigilant regarding big data concerns, little progress is being made in developing data protection regulations. Continual changes in technology, such as the move to cloud computing services, may also invite further complications to developing appropriate regulations.  Consumers need to be aware of what information is being collected and how it is used.  Businesses need to be aware of what laws, rules and regulations govern their collection and use of information so they can assure successful compliance.

Check Cloud Contracts for Provisions Related to Privacy, Data Security and Regulatory Concerns

“Cloud” Technology Offers Flexibility, Reduced Costs, Ease of Access to Information, But Presents Security, Privacy and Regulatory Concerns

With the recent introduction of Google Drive, cloud computing services are garnering increased attention from entities looking to more efficiently store data. Specifically, using the “cloud” is attractive due to its reduced cost, ease of use, mobility and flexibility, each of which can offer tremendous competitive benefits to businesses. Cloud computing refers to the practice of storing data on remote servers, as opposed to on local computers, and is used for everything from personal webmail to hosted solutions where all of a company’s files and other resources are stored remotely. As convenient as cloud computing is, it is important to remember that these benefits may come with significant legal risk, given the privacy and data protection issues inherent in the use of cloud computing. Accordingly, it is important to check your cloud computing contracts carefully to ensure that your legal exposure is minimized in the event of a data breach or other security incident.

Cloud computing allows companies convenient, remote access to their networks, servers and other technology resources, regardless of location, thereby creating “virtual offices” in which employees have remote access to files and data identical in scope to the access they have in the office. The cloud offers companies flexibility and scalability, enabling them to pool and allocate information technology resources as needed, using the minimum amount of physical IT resources necessary to service demand. These hosted solutions enable users to easily add or remove storage or processing capacity to accommodate fluctuating business needs. By utilizing only the resources necessary at any given point, cloud computing can provide significant cost savings, which makes the model especially attractive to small and medium-sized businesses. However, the rush to adopt cloud computing for its various efficiencies often comes at the expense of data privacy and security.

The laws that govern cloud computing are (perhaps somewhat counterintuitively) based on the physical location of the cloud provider’s servers, rather than the location of the company whose information is being stored. American state and federal laws concerning data privacy and security vary considerably from state to state, while servers in Europe are subject to more comprehensive (and often more stringent) privacy laws. However, this may change, as the Federal Trade Commission (FTC) has been investigating the privacy and security implications of cloud computing as well.

In addition to location-based considerations, companies expose themselves to potentially significant liability depending on the types of information stored in the cloud. Federal, state and international laws all govern the storage, use and protection of certain types of personally identifiable information and protected health information. For example, the Massachusetts Data Security Regulations require all entities that own or license personal information of Massachusetts residents to ensure appropriate physical, administrative and technical safeguards for that personal information (regardless of where the companies are physically located), with fines of up to $5,000 per incident of non-compliance. That means companies are directly responsible for the actions of their cloud computing service providers. OlenderFeldman LLP notes that some information is inappropriate for storage in the cloud without proper precautions. “We strongly recommend against storing any type of personally identifiable information, such as birth dates or social security numbers, in the cloud. Similarly, sensitive information such as financial records, medical records and confidential legal files should not be stored in the cloud where possible,” the firm advises, “unless it is encrypted or otherwise protected.” In fact, even a data breach related to non-sensitive information can have serious adverse effects on a company’s bottom line and, perhaps more distressing, its public perception.
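For data that must live in the cloud anyway, one common precaution is client-side encryption, so the provider only ever holds ciphertext. Here is a minimal Python sketch, assuming the third-party cryptography package; the upload function is a hypothetical placeholder, not any particular provider’s API.

```python
# Minimal sketch of "encrypted or otherwise protected": encrypt sensitive
# records locally before they ever reach a cloud provider. Requires the
# third-party "cryptography" package; upload_to_cloud() is a hypothetical
# stand-in for whatever SDK your provider offers.
from cryptography.fernet import Fernet

def upload_to_cloud(name: str, blob: bytes) -> None:
    print(f"uploading {name} ({len(blob)} bytes of ciphertext)")  # placeholder

key = Fernet.generate_key()  # store this key somewhere other than the cloud it protects
fernet = Fernet(key)

record = b"DOB=1980-01-31;SSN=123-45-6789"  # data that should never be stored in the clear
ciphertext = fernet.encrypt(record)
upload_to_cloud("customer-record", ciphertext)

# Only a holder of the key can recover the plaintext later.
assert fernet.decrypt(ciphertext) == record
```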

Additionally, the information your company stores in the cloud will be affected by the rules set forth in the privacy policies and terms of service of your cloud provider. Although these terms may seem like legal boilerplate, they may very well form a binding contract which you are presumed to have read and consented to. Accordingly, it is extremely important to understand what is permitted and required by your cloud provider’s privacy policies and terms of service. For example, those documents will dictate whether your cloud service provider is a data processing agent, which will only process data on your behalf, or a data controller, which has the right to use the data for its own purposes as well. Notwithstanding the terms of your agreement, if the service is being provided for free, you can safely presume that the cloud provider is a data controller that will analyze and process the data for its own benefit, such as to serve you ads.

Regardless, when sharing data with cloud service providers (or any other third-party service providers), it is important to obligate those third parties to process data in accordance with applicable law, as well as your company’s specific instructions — especially when the information is personally identifiable or sensitive in nature. This is particularly important because, in addition to the loss of goodwill, most data privacy and security laws hold companies, rather than their service providers, responsible for compliance. That means your company needs to ensure the data’s security even when it is in a third party’s (the cloud provider’s) control, and should agree with the cloud provider on the appropriate level of security for the data being hosted. Christian Jensen, a litigation attorney at OlenderFeldman LLP, recommends contractually binding third parties to comply with applicable data protection laws, especially where the law places the ultimate liability on you. “Determine what security measures your vendor employs to protect data,” suggests Jensen. “Ensure that access to data is properly restricted to the appropriate users.” Jensen notes that since data protection laws generally do not specify levels of commercial liability, it is important to ensure that your contracts with service providers allocate risk via indemnification clauses, limitations of liability and warranties. Businesses should also reserve the right to audit the cloud service provider’s data security and information privacy compliance measures, in order to verify that the provider is adhering to its stated privacy policies and terms of service. Such audits can be carried out by an independent third-party auditor, where necessary.

Today, the Federal Trade Commission (FTC) issued a final report setting forth best practices for businesses to protect the privacy of American consumers and give them greater control over the collection and use of their personal data, entitled “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers.” The FTC also issued a brief new video explaining the FTC’s positions.  Here are the key take-aways from the final report:

  • Privacy by Design. Companies should incorporate privacy protections in developing their products, and in their everyday business practices. These include reasonable security for consumer data, limited collection and retention of such data, and reasonable procedures to ensure that such data is accurate;
  • Simplified Choice. Companies should give consumers the option to decide what information is shared about them, and with whom. Companies should also give consumers that choice at a time and in a context that matters to people, although choice need not be provided for certain “commonly accepted practices” that the consumer would expect.
  • Do Not Track. Companies should include a Do-Not-Track mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities.
  • Increased Transparency. Companies should disclose details about their collection and use of consumers’ information, and provide consumers access to the data collected about them.
  • Small Businesses Exempt. The above restrictions do not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year, provided they don’t share the data with third parties.

Interestingly, the FTC’s focus on consumer unfairness, rather than consumer deception, was something that FTC Commissioner Julie Brill hinted to me when we discussed overreaching privacy policies and terms of service at Fordham University’s Big Data, Big Issues symposium earlier this month.

If businesses want to minimize the chances of finding themselves the subject of an FTC investigation, they should be prepared to follow these best practices. If you have any questions about what the FTC’s guidelines mean for your business, please feel free to contact us.

OlenderFeldman gave a presentation on Wednesday at the SES New York 2012 conference about emerging legal issues in search engine optimization (SEO) and online behavioral advertising. The presentation, Legal Considerations for Search & Social in Regulated Industries, focused on search and social media strategies in regulated industries. Regulated industries, which include healthcare, banking, finance, pharmaceuticals and publicly traded companies, among others, are subject to various government regulations, but often lack sufficient guidance regarding acceptable practices in social media, search and targeted advertising.

Messing began with a discussion of common methods that search engine optimization companies use to raise their clients’ sites in the rankings. The top search spots are extremely competitive, and the difference between being on the first or second page can make a huge difference to a company’s bottom line. One of the ways that search engines determine the relevancy of a web page is through link analysis. Search engines examine which websites link to that page, and what the text of those links (the anchor text) says about the page, as well as the surrounding content, to determine relevance. In essence, these links and their contents can be considered a form of online citation.
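As a rough illustration of the raw signal involved, the sketch below extracts each link and its anchor text from a snippet of HTML, using the third-party beautifulsoup4 package. Real search engines obviously do far more (crawling at scale, weighting, spam detection); this only shows the “citation” data that link analysis starts from.

```python
# Minimal sketch of link analysis's raw input: the links on a page and their
# anchor text. Requires the third-party beautifulsoup4 package. The HTML and
# URLs below are invented for illustration.
from bs4 import BeautifulSoup

html = """
<p>For rates, see <a href="https://example.com/loans">best mortgage rates</a>
and our <a href="https://example.com/about">about page</a>.</p>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a", href=True):
    # Each (target URL, anchor text) pair is one "citation" that a ranking
    # algorithm could use as a relevance signal for the target page.
    print(link["href"], "->", link.get_text(strip=True))
```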

A typical method used by SEO companies to raise website rankings is to generate content, using paid affiliates, freelance bloggers, or other webpages under the SEO company’s control, in order to increase the website’s ranking on search engines. However, since this content is mostly for search engine spiders rather than human consumption, it is rarely screened, which can lead to issues with government agencies, especially in regulated industries. This content also rarely discloses that the author was paid to create it, which can be unfair and deceptive to consumers. SEO companies dislike disclosing paid links and content because search engines penalize paid links. Messing said, “SEO companies are caught between the search engines, who severely penalize disclosure [of paid links], and the FTC, which severely penalizes nondisclosure.”

The main enforcement agency is the Federal Trade Commission, which has the power to investigate and prevent unfair and deceptive trade practices across most industries, though other regulated industries have additional enforcement bodies. The FTC rules require full disclosure when there is a “material connection” between a merchant and someone promoting its product, such as a cash payment, or a gift item. Suspicious “reviews” or unsubstantiated content can raise attention, especially in regulated industries. “If a FTC lawyer sees one of these red flags, you could attract some very unwanted attention from the government,” Messing noted.

Recently, the FTC has increased its focus on paid links, content and reviews. While the FTC requires mandatory disclosures, it doesn’t specify how those disclosures should be made. This can lead to confusion as to what the FTC considers adequate disclosure, and Messing said he expects the FTC to issue guidance on disclosures in the SEO, social media and mobile devices areas. “There are certain ecommerce laws that desperately need clarification,” said Messing.

Messing stated that clients need to ask what their SEO company is doing, and SEO companies need to tell them, because ultimately both can be held liable for unfair or deceptive content. He recommends ensuring that all claims made in SEO content can be easily substantiated, and recommended building SEO through goodwill. “In the context of regulated industries,” he said, “consumers often visit healthcare or financial websites when they have a specific problem. If you provide them with valuable, reliable and understandable information, they will reward you with their loyalty.”

Messing cautioned companies to be careful of what information they collect for behavioral advertising, and to consider the privacy ramifications. “Data is currency, but the more data a company holds, the more potential liability it is exposed to.” Messing expects further developments in privacy law, possibly in the form of legislation. In the meantime, he recommends using data responsibly, and in accordance with the data’s sensitivity. “Developing policies for data collection, retention and deletion is crucial. Make sure your policies accurately reflect your practices.” Finally, Messing noted that companies lacking a robust compliance program governing the collection, protection and use of personal information may face significant risk of a data breach or legal violation, resulting litigation, and a hit to their bottom lines. He recommends speaking to a law firm that is experienced in privacy and legal compliance to ensure that your practices do not attract regulatory attention.

OlenderFeldman will be speaking at the SES New York 2012 conference about emerging legal issues in search engine optimization and online behavioral advertising. The panel will discuss Legal Considerations for Search & Social in Regulated Industries:

Search in Regulated Industries
Legal Considerations for Search & Social in Regulated Industries
Programmed by: Chris Boggs
Since FDA letters to pharmaceutical companies began arriving in 2009, and with constantly increasing scrutiny towards online marketing, many regulated industries have been forced to look for ways to modify their legal terms for marketing and partnering with agencies and other 3rd party vendors. This session will address the following:

  • Legal rules for regulated industries such as Healthcare/Pharmaceutical, Financial Services, and B2B, B2G
  • Interpretations and discussion around how Internet Marketing laws are incorporated into campaign planning and execution
  • Can a pharmaceutical company comfortably solicit inbound links in support of SEO?
  • Should Financial Services companies be limited from using terms such as “best rates”?

Looks like it will be a great panel. I will post my slideshow after the presentation.

(Updated on 3.22.12 to add presentation below)

I had the pleasure of attending Fordham Law School’s Center on Law & Information Policy (CLIP) Big Data, Big Issues Symposium today, which had a fascinating lineup of many of the best thinkers in privacy. FTC Commissioner Julie Brill delivered a very interesting keynote address about the benefits and dangers of big data, as well as evolving privacy concerns. The address is well worth a read.

I had a chance to chat with Commissioner Brill after her speech, and asked her thoughts about privacy policies and terms of service that allow for unrestricted and unlimited use of data, such as the infamous Skipity policies. Commissioner Brill stated that, given that most users don’t read privacy policies and terms of service, the FTC is very concerned by these types of one-sided policies. She mentioned that  the aggregation and use of data outside of the context of collection is something that the FTC hopes to issue guidance on in the future, and may well be unfair and deceptive regardless of a consumer’s consent.

My takeaway from the chat is that consumer consent will not insulate a website from FTC scrutiny, and that the reasonable expectations of a consumer may dictate the FTC’s considerations of whether a policy is unfair or deceptive, especially given that so little attention is paid to these policies by consumers. However, at the same time, it is important that policies reflect the company’s actual practices.

Navigating the Privacy Minefield - Online Behavioral Tracking

The Internet is fraught with privacy-related dangers for companies. For example, Facebook’s IPO filing contains multiple references to the various privacy risks that may threaten its business model, and it seems like every day a new class action suit is filed against Facebook alleging surreptitious tracking or other breaches of privacy laws. Google has recently faced a resounding public backlash over its new uniform privacy policy, to the extent that 36 state attorneys general are considering filing suit. New privacy legislation and regulatory activities have been proposed, with the Federal Trade Commission (FTC) taking an active role in enforcing compliance with the various privacy laws. The real game changer, however, might be the renewed popularity of “Do Not Track,” which threatens to upend the existing business models of online publishers and advertisers. “Do Not Track” is a proposal that would enable users to opt out of tracking by websites they do not visit, including analytics services, advertising networks, and social platforms.

To understand the genesis of “Do Not Track,” it is important to understand what online tracking is and how it works. If you visit any website supported by advertising (as well as many that are not), a number of tracking objects may be placed on your device. These online tracking technologies take many forms, including HTTP cookies, web beacons (clear GIFs), local shared objects or flash cookies, HTML5 cookies, browser history sniffers and browser fingerprinting. What they all have in common is that they observe web users’ interests (content consumed, ads clicked, search keywords and conversions) to track online movements and build online behavioral profiles, which are used to determine which ads are selected when a particular webpage is accessed. Collectively, these techniques are known as behavioral targeting or advertising. Tracking technologies are also used for purposes beyond behavioral targeting, including site analytics, advertising metrics and reporting, and capping the frequency with which individual ads are displayed to users.
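A web beacon is simple enough to sketch end to end. The Flask example below serves a 1x1 transparent GIF and uses the request to set or read an identifying cookie; the endpoint, cookie name, and logging are illustrative assumptions, not any real ad network’s implementation.

```python
# Minimal sketch of a web beacon ("clear GIF"): a third-party server returns a
# 1x1 image and uses the request to set/read an identifying cookie, recording
# which page embedded the pixel. Requires the third-party Flask package.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

# A 1x1 transparent GIF: the classic tracking-pixel payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

@app.route("/pixel.gif")
def pixel():
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())
    page = request.args.get("page", "unknown")    # embedding page tags itself
    print(f"visitor {visitor_id} viewed {page}")  # stand-in for a profile store
    resp = make_response(PIXEL)
    resp.headers["Content-Type"] = "image/gif"
    resp.set_cookie("visitor_id", visitor_id, max_age=60 * 60 * 24 * 365)
    return resp

# Embedded on any page as:
#   <img src="https://tracker.example/pixel.gif?page=article-42" width="1" height="1">
if __name__ == "__main__":
    app.run(port=8000)
```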

The focus on behavioral advertising by advertisers and ecommerce merchants stems from its effectiveness. Studies have found that behavioral advertising increases the click-through rate by as much as 670% when compared with non-targeted advertising. Accordingly, behavioral advertising can bring in an average of 2.68 times the revenue of non-targeted advertising.

If behavioral advertising provides benefits such as increased relevance and usefulness to both advertisers and consumers, how has it become so controversial? Traditionally, advertisers have avoided collecting personally identifiable information (PII), preferring anonymous tracking data. However, new analytic tools and algorithms make it possible to combine “anonymous” information to create detailed profiles that can be associated with a particular computer or person. Formerly anonymous information can be re-identified, and companies are taking advantage in order to deliver increasingly targeted ads. Some of those practices have led to renewed privacy concerns. For example, Target was recently able to identify that a teenager was pregnant – before her father had any idea. It seems that Target identified certain purchasing patterns among expecting mothers, and assigns shoppers a “pregnancy prediction score.” Apparently, the father was livid when his high-school-age daughter was repeatedly targeted with various maternity items, only to later find out that, well, Target knew more about his daughter than he did (at least in that regard). Needless to say, some PII is more sensitive than other PII, but it is almost always alarming when you don’t know what others know about you.
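The re-identification point can be shown in a few lines. In the pandas sketch below, a supposedly anonymous behavioral profile is joined to an identified record on three quasi-identifiers, re-attaching a name to the browsing data; all of the data and column names are invented for illustration.

```python
# Minimal sketch of re-identification: neither dataset pairs a name with
# browsing data, but a join on shared quasi-identifiers links them.
# All data is invented. Requires the third-party pandas package.
import pandas as pd

# "Anonymous" ad-network profile, keyed only by quasi-identifiers
tracking = pd.DataFrame({
    "zip": ["07901"], "birth_date": ["1995-03-14"], "gender": ["F"],
    "interests": ["prenatal vitamins, maternity wear"],
})

# A separate, identified dataset (e.g., a public or purchased record)
voter_roll = pd.DataFrame({
    "name": ["Jane Doe"], "zip": ["07901"],
    "birth_date": ["1995-03-14"], "gender": ["F"],
})

# The join re-attaches a name to the "anonymous" behavioral profile
reidentified = tracking.merge(voter_roll, on=["zip", "birth_date", "gender"])
print(reidentified[["name", "interests"]])
```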

Ultimately, most users find it a little creepy to learn that Facebook tracks their web browsing activity through its “Like” button, or that detailed profiles of their browsing history exist that could be associated with them. According to a recent Gallup poll, 61% of individuals polled felt the privacy intrusion presented by tracking was not worth the free access to content, and 67% said that advertisers should not be able to match ads to specific interests based upon websites visited.

The wild west of internet tracking may soon be coming to a close. The FTC has issued its recommendations for Do Not Track, which it recommends be implemented as a browser-based mechanism through which consumers can make persistent choices about whether they want to be tracked or receive targeted advertising. However, you shouldn’t wait for an FTC compliance notice to start rethinking your privacy practices.
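Mechanically, the proposal builds on a simple signal: participating browsers send a DNT: 1 request header, which a site can inspect before setting tracking cookies. Here is a minimal Flask sketch, assuming a site that chooses to honor the header; whether and how sites must honor it is exactly what the FTC’s recommendation addresses.

```python
# Minimal sketch: check the "DNT: 1" header that Do Not Track browsers send,
# and skip tracking when it is present. The cookie name is illustrative.
# Requires the third-party Flask package.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def home():
    if request.headers.get("DNT") == "1":
        # Visitor opted out: serve the page without any tracking cookie.
        return "Welcome. Do Not Track detected; no tracking cookie set."
    resp = make_response("Welcome.")
    resp.set_cookie("ad_id", "example-profile-id")  # illustrative tracking cookie
    return resp

if __name__ == "__main__":
    app.run(port=8000)
```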

It goes without saying that companies are required to follow existing privacy laws. But it is important not only to speak with a privacy lawyer to ensure compliance with existing privacy laws and regulations (the FTC’s compliance division also monitors whether companies comply with their posted privacy policies and terms of service), but also to ensure that your tracking and analytics are done in a non-creepy, non-intrusive manner that is clearly communicated to your customers, lets them opt in, and gives them an opportunity to opt out at their discretion. Your respect for your consumers’ privacy concerns will reap long-term benefits beyond anything that surreptitious tracking could ever accomplish.

Privacy and the Communications Decency Act

The Communications Decency Act Provides Immunity For Third Party Submitted Content

We often get questions from both clients and journalists (e.g., here, and here) regarding liability for posting content on the internet, most of it centering around the same basic premise: “Why can Company X post this content on their website? How is that legal? Isn’t that an invasion of privacy?”

In most cases, the answer can be found in Section 230 of the Communications Decency Act of 1996, 47 U.S.C. § 230 (“CDA”). The Act provides immunity for providers of interactive computer services (read: websites, blogs, listservs, forums, etc.) who publish information provided by others, so long as they comply with the Digital Millennium Copyright Act of 1998 (“DMCA”) and take down content that infringes the intellectual property rights of others. In order to understand the CDA and DMCA, it is helpful to understand how each came about.

The United States has historically favored free speech, with certain limitations. Under the law, a writer or publisher of harmful information is treated differently than a distributor of that information. The theory behind this distinction is that the speaker and publisher have the knowledge of and editorial control over the content, whereas a distributor might not be aware of the content, much less whether it is harmful. Thus, if a writer publishes defamatory content in a book, both the writer and the publisher can be held liable, whereas a library or bookstore that distributed the book cannot.

Initially, courts found a distinction in liability based on whether the website was moderated. An unmoderated/unmonitored website was considered a distributor of information, rather than a publisher, because it did not review the contents of its message boards. Conversely, courts found a moderated/monitored website to be a publisher, concluding that the exercise of editorial control over content made it more like a publisher than a distributor – and thus the website was liable for anything that appeared on the site. Unsurprisingly, this created strong disincentives to monitoring or moderating websites, as doing so increased potential liability.

Given the sheer amount of information communicated online, the potential for liability based on third-party content (i.e. user comments on a blog, website or web bulletin board) threatened the viability of service providers and free speech over the internet.

Congress specifically wanted to remove these disincentives to self-moderation and responded by passing the CDA. The CDA immunizes, with limited exceptions, providers and users of “interactive computer services” from publisher’s liability, so long as the information is provided by a third party (“interactive computer service” is defined broadly, and covers blogs). This immunity does not cover intellectual property claims or criminal liability, and of course the original creator of the content is not immune. That means a blogger or commentator is responsible for his or her own comments, though not for the submitted content of others (even if it violates a third party’s privacy, is defamatory, etc.). Generally, the CDA will cover a website that hosts third-party content, and exercising editorial functions, such as deciding whether to publish, remove or edit material, does not affect that immunity unless those actions materially alter the content (e.g., changing “Aaron is not a scumbag” to “Aaron is a scumbag” would be a material alteration, whereas cropping a photo or fixing typos would not).

Accordingly, websites that post only user-submitted content (even if the website encourages or pays third parties to create or submit content) are protected under the CDA, and immune from liability, with two major exceptions: the CDA does not immunize the posting of criminally illegal content (such as underage pornography), and it does not immunize the posting of another’s intellectual property without permission. Tasked with balancing the need to protect intellectual property rights online against the challenges faced by websites that led to the CDA, Congress implemented the DMCA. The DMCA creates a safe harbor against copyright liability for websites, so long as they block access to allegedly infringing material upon receipt of a notification from a copyright holder claiming infringement.

Ultimately, protecting yourself from liability under the CDA and DMCA or protecting your intellectual property rights online can be tricky. If you have any questions, feel free to contact us.

“Putting Privacy First” was originally published in the August 2011 edition of TechNews.

By: Michael J. Feldman

Many businesses view legal compliance as a necessary evil and an obstacle to profits. Thus, compliance is often made a mere formality. Dealing with the complex privacy and data protection rules and regulations is often viewed no differently, be it industry-specific rules such as HIPAA (healthcare), age-specific rules such as COPPA (online marketing to minors), agency-specific rules (i.e., SEC or FTC rules), the rules and regulations of each individual state, or even the various foreign laws such as the Data Protection Act (which applies to businesses that conduct any business with many European nations). However counterintuitive it may be for some, forward-thinking businesses do not view privacy and data protection compliance as a necessary drag on revenue; instead, they use it as a marketing tool to distinguish themselves from the competition and grab an increased market share.

As privacy and data breach issues continue to make front-page news on a near-daily basis, and with the U.S. Congress working on sweeping new privacy laws, such compliance concerns are increasing in magnitude and importance. The reality is that, whether you are aware of it or not, the various privacy and data protection laws impact and govern the operations of almost all businesses. For example, if you can answer “Yes” to any of these questions, there are privacy and data protection laws that govern your operations: Do you accept credit cards for payment? Do you gather any personal information about your customers, patients, employees, members or vendors? Do you electronically store any data on your computers or servers? Do you sell or market on the Internet? Do you conduct any business with, or market your business to, any person or entity located in another country? Are you in the financial industry? Do you seek to conduct any credit checks on potential employees or customers? These questions address only a tiny fraction of the activities that subject you to regulation.

So what can and should a business do to not only survive, but actually thrive in this ever-changing regulatory environment? The answer is quite simple – be compliant and market the advantages of your privacy policies.

As acknowledged by the Washington Post on July 18 in “Tech IPO’s Grapple With Privacy,” Google did not have to deal with online privacy in 2004, as such a concept did not exist. Times have certainly changed. On the same day as the Washington Post article, the New York Times reported in an article entitled “Privacy Isn’t Dead. Just Ask Google+” that “Rather than focus on new snazzy features — although it does offer several — Google has chosen to learn from its own mistakes, and Facebook’s. Google decided to make privacy the No. 1 feature of its new service.” Google+ represents a significant attempt by Google to break Facebook’s near stranglehold on social media. Given Google’s past success, it is no surprise that Google has attacked privacy concerns head-on, and turned consumers’ concern for privacy into a marketing bonanza. Such a strategy has been used successfully in the automobile industry for years by companies such as Volvo, Subaru and Mercedes, each of which turned consumer concern about automobile safety into a marketing opportunity to distinguish itself from the competition by promoting its superior safety features.

The obvious next question is: how does a business use consumers’ privacy concerns as a marketing tool? The answer is to acknowledge your customers’ concerns, explain how and why your business cares about the customer more than your competitors do, and show that you will keep them safe. To accomplish this goal, you must first determine which regulatory scheme(s) govern the operation of your business. Second, you must determine the best method for compliance with the applicable law, and whether it makes business sense to implement privacy and data security policies which go beyond the minimum required by law. Third, you should examine how, if at all, your competitors address and promote their privacy obligations. Fourth, you must develop a strategic plan to promote to your customers the superiority of your privacy and data security policies. Importantly, you must not only inform your customers of what your privacy and data security policies are, but also show how such policies help and protect your customers. For example, Mercedes realized that people were scared of getting injured in car crashes, so its advertisements often explained how Mercedes technology would help avoid accidents (i.e., anti-lock brakes) and how it would protect you if you did crash (i.e., airbags and crumple zones). The same applies to privacy and data protection concerns. In the end, by carefully planning out and implementing each of the above four steps, you will avoid regulatory problems while simultaneously gaining a leg up on the competition.

Yesterday, the Federal Trade Commission (FTC) announced two proposed settlements of complaints filed against Ceridian Corporation and Lookout Services, Inc.  Both proposed consent orders require the companies to implement security measures similar to those in other such settlements, including development and implementation of more robust information security programs, along with biennial security assessments and reporting by qualified personnel for 20 years.

Ceridian provided payroll services allowing input of sensitive employee information such as Social Security numbers.  Lookout provided a tool that allowed employers to create and track immigration status information for employees, which also allowed input and storage of sensitive employee personal information.

Both companies made security representations on their web-pages and/or through customer contracts creating the impression that the companies used industry standard secure technologies and security practices to safeguard their customers’ employee information.

Hackers breached Ceridian’s online perimeter defenses through a SQL injection attack, resulting in the compromise of the sensitive data.
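The complaint does not describe the exact exploit, but the general shape of a SQL injection is easy to illustrate. The following minimal Python sketch is entirely hypothetical (it does not reflect Ceridian’s actual systems) and shows how a query built by string concatenation lets attacker input rewrite the SQL itself, while a parameterized query closes the hole:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (name TEXT, ssn TEXT)")
    conn.execute("INSERT INTO employees VALUES ('Alice', '123-45-6789')")

    # VULNERABLE: user input is spliced directly into the SQL text, so the
    # input "x' OR '1'='1" turns the WHERE clause into a tautology and
    # dumps every row in the table.
    def lookup_unsafe(name):
        query = "SELECT * FROM employees WHERE name = '" + name + "'"
        return conn.execute(query).fetchall()

    # SAFER: a parameterized query keeps the input as data, never as syntax.
    def lookup_safe(name):
        return conn.execute(
            "SELECT * FROM employees WHERE name = ?", (name,)).fetchall()

    print(lookup_unsafe("x' OR '1'='1"))  # every name and SSN in the table
    print(lookup_safe("x' OR '1'='1"))    # an empty list

This is precisely why the FTC faulted the failure to assess “commonly known or reasonably foreseeable attacks”: the defense (parameterization) is readily available and essentially free.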

An employee gained unauthorized access to Lookout’s database by using “predictable resource location,” essentially an educated-guessing attack that reveals hidden files or functionality by trying common naming conventions, allowing the employee to bypass Lookout’s secure log-in page.  In addition, Lookout supposedly allowed a “test” environment to access real data, again enabling the Lookout employee to reach sensitive information by logging in with a “test” username, along with other equally predictable steps.  Lookout allegedly did not use an intrusion detection system, and did not review logs in a timely manner.
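Again purely as an illustration (the complaint does not describe Lookout’s URL scheme), “predictable resource location” amounts to walking a list of guessable paths and seeing what answers without authentication. A hypothetical sketch in Python:

    import urllib.request

    # Hypothetical wordlist; real attacks cycle through thousands of common
    # directory and file names ("admin", "backup", "test", dated reports, etc.).
    COMMON_PATHS = ["admin", "test", "backup", "logs", "reports"]

    def probe(base_url):
        # Only ever run something like this against systems you are
        # authorized to test.
        reachable = []
        for path in COMMON_PATHS:
            try:
                resp = urllib.request.urlopen(base_url + "/" + path, timeout=5)
                if resp.getcode() == 200:
                    reachable.append(path)  # served without any authentication
            except Exception:
                pass  # 404s and connection errors are just wrong guesses
        return reachable

The corresponding defenses are just as simple to state: enforce an authorization check on every route, not merely on the login page, and avoid sequential or guessable names for sensitive resources.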

Lookout allegedly made the following claims in marketing materials:

“Although the data is entered via the web, your data will be encoded and transmitted over secured lines to Lookout Services server. This FTP interface will protect your data from interception, as well as, keep the data secure from unauthorized access. Perimeter Defense – Our servers are continuously monitoring attempted network attacks on a 24 x 7 basis, using sophisticated software tools.”

Ceridian allegedly made the following representations on its web-page and in contracts with customers:

“Worry-free Safety & Reliability . . . When managing employee health and payroll data, security is paramount with Ceridian. Our comprehensive security program is designed in accordance with ISO 27000 series standards, industry best practices and federal, state and local regulatory requirements.

Confidentiality and Privacy: [Ceridian] shall use the same degree of care as it uses to protect its own confidential information of like nature, but no less than a reasonable degree of care, to maintain in confidence the confidential information of the [customer].”

Although there are no admissions of liability in the settlements, the alleged liability in Lookout’s situation seems fairly clear.  As alleged, the interface simply did not protect the information, the company did not monitor its network, and sophisticated software tools were seemingly not in use.

The situation for Ceridian is somewhat more troubling.  Its claims and representations focused on the design of its security program, and using “reasonable care.”   The FTC alleged that Ceridian’s practices were not “reasonable.”  Specifically, the Commission alleged that Ceridian: “(1) stored personal information in clear, readable text; (2) created unnecessary risks to personal information by storing it indefinitely on its network without a business need; (3) did not adequately assess the vulnerability of its web applications and network to commonly known or reasonably foreseeable attacks, such as “Structured Query Language” (“SQL”) injection attacks; (4) did not implement readily available, free or low-cost defenses to such attacks; and (5) failed to employ reasonable measures to detect and prevent unauthorized access to personal information.”

It’s pretty much a given that if a hacker is intent on accessing your network, no amount of security layering will necessarily prevent that unauthorized access.  However, certain things are clear from these cases: companies must assess the sensitivity of the information they hold, and design and implement security programs which correspond to the risk associated with that information.  Even if layers of defense are employed, if you handle sensitive data you must assess the need for encryption, hashing, truncation, tokenization, limitation and minimization, application and network vulnerability testing, and monitoring of network systems, and implement each where appropriate.
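To make that list concrete, here is a minimal Python sketch (illustrative only, and not drawn from either company’s systems) of three of the data-minimization techniques mentioned above, applied to a Social Security number:

    import hashlib
    import hmac
    import os
    import secrets

    ssn = "123-45-6789"

    # Truncation: keep only what the business actually needs (e.g., last four).
    truncated = "***-**-" + ssn[-4:]

    # Keyed hashing: a one-way transform that lets you match records without
    # storing the raw value; the key must live outside the database it protects.
    key = os.urandom(32)  # in practice, fetched from a key-management system
    hashed = hmac.new(key, ssn.encode(), hashlib.sha256).hexdigest()

    # Tokenization: replace the value with a random token, and keep the real
    # mapping in a separate, more tightly controlled vault.
    vault = {}
    token = secrets.token_urlsafe(16)
    vault[token] = ssn  # the vault, not the main database, holds the SSN

    print(truncated, hashed[:12], token)

Which technique fits depends on whether the business needs to display the value, match against it, or recover it later; the common thread, echoed in the Ceridian allegations, is never storing it in clear, readable text without a business need.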

It is also extremely important to use language that accurately reflects what is supported in policies (public-facing and internal), as well as in contracts and privacy and security addenda.  This is not an area to gloss over as an additional exhibit to a master agreement.  The language of privacy and information security addenda or stand-alone contracts, as well as the promises made in marketing materials, SOWs, websites, etc., must be accurate, and should not downplay risks.  In certain cases, more specific contractual obligations are better than broader “reasonable” clauses.  These might clearly define the security requirements to be implemented, and what can be supported.   A corollary to this, particularly in the SaaS service provider context, is accurately advising business customers about the disclosures and consents to be made to the users and data subjects whose information will be processed through the use of the system.

Additionally, merely advising customers about all risks and disclaiming responsibility for everything is not sufficient, because of the negative effects on business and marketing.  There is also no guarantee, even with broad advice and disclaimers concerning security risk, that the FTC would not seek to use its “harm based” rather than “deception based” approach.  That is: “You handle sensitive information under circumstances where the harm may outweigh the benefit; therefore, you have a concomitant responsibility to protect that information.”

Service providers (and others) handling sensitive information must develop, document, manage, and train on their information security architecture.  The risks and obligations clearly extend beyond simple security mechanisms to the whole panoply of security layering and defense in depth.

Do-Not-Track and Online Behavioral Advertising

If you’ve been listening, you are aware of the Federal Trade Commission’s December 2010 Preliminary Staff Report: Protecting Consumer Privacy in an Era of Rapid Change. (Update: The final FTC Privacy Report has been released.) You also know the Commission has challenged providers to create “Do-Not-Track” technology allowing users to opt out of online behavioral advertising. Reportedly, those things are already in the works. This sounds great, especially to a hermit curmudgeon like me (I can’t delete Flash cookies fast enough). But what are some of the implications of this?
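The mechanism most discussed is modest: the browser sends a “DNT: 1” request header, and a cooperating server declines to set a tracking identifier. A minimal Python sketch (my own illustration, not any vendor’s implementation, and assuming the server actually chooses to honor the signal):

    import secrets

    def handle_request(headers, cookies):
        # "DNT: 1" is the proposed do-not-track signal; honor it by setting
        # no behavioral-tracking identifier at all.
        if headers.get("DNT") == "1":
            return cookies
        cookies.setdefault("tracking_id", secrets.token_hex(16))
        return cookies

    print(handle_request({"DNT": "1"}, {}))  # {} -- no tracking cookie set
    print(handle_request({}, {}))            # {'tracking_id': '...'}

Note that nothing in the header compels compliance; the whole scheme depends on advertisers and publishers agreeing to respect it, which is exactly where the questions below come in.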

There’s a funny and intriguing article by Jack Shafer on Slate.com in which he ponders who is in the best position to create a web browser that provides robust security for the user. While Mr. Shafer points out that he is not against advertising, he notes that it is not in developers’ best interest to provide iron-clad browsers that block web-tracking technology, because of their financial connections to advertising revenue. He also, perhaps aptly, notes that while he is in favor of legitimate uses for cookies, “too many Web entrepreneurs observe no limits when they decide to snoop.”

Mr. Shafer postulates there may be a market for such a browser, but includes a quote (sure to become a classic in my book) from his colleague Farhad Manjoo: “I doubt there’s a market for such a browser. People don’t care about privacy. They just say they do. If they did, they wouldn’t use Facebook.”

So, which is it? Are users really ready to give up free content in exchange for privacy? According to a recent Gallup poll, 61% of individuals polled felt the privacy intrusion presented by tracking was not worth the free access to content, and 67% said that advertisers should not be able to match ads to specific interests based upon websites visited.

What about the other 33-39%? Do they really not care, or are they just not willing to give up the Web they know and love?

How about exploring another option? What if I go to Harry’s Widget Shoppe and I decide to tell Harry that I am extremely interested in buying maroon widgets (we all know they’re the best)? Suppose I also tell Harry to contact me immediately if he comes across any maroon widgets (not blue, yellow or green – just maroon). Why should I have to receive 264 e-mails and see 400 ads in the course of 48 hours from Mildred telling me about how great her blue widgets are? I don’t want blue widgets! I had plenty of them, and they’re nothing but trouble. By the same token, I’m not so hip on seeing 918 ads about teeth whitening either (Note to self: make an appointment with the dentist).

Assuming Mildred paid to obtain my “widget” profile from Harry or one of his network servers, what did she really get for her money? Not much. She probably guaranteed that I won’t buy any widgets from her ever. (Well, maybe, if it’s an especially rare maroon widget…you know…like the ones with feathers…and she buys me dinner.) I also might not be talking to Harry anytime soon, either. But, I digress…

Harry has valuable information about me. Information that may well be worth much more to an advertiser than the fact that I visited Harry’s Widget Shoppe.com. What if Harry asked me if it was okay if he provided my information to others who had maroon widgets? What if Harry also told me that these others with whom he shared my information were contractually obligated not to send my information on to anyone else without my permission? Ye Olde Only Maroon Widget Shoppe.com might be willing to pay Harry dearly for that information, I might get my pick of lovely maroon widgets, I won’t see constant ads from other widget sellers in which I have no interest, and my in-box would be much more manageable. Oh, and by the way, I would not feel as if I had totally lost control over information about me.

At its heart, control is a form of choice. While realistically, we have very little real choice left in this world, there are some things we still would like to control. I figure a good proportion of that 33-39% might say the same. I might be willing to share some information, and let you pass it on, if I knew you were not surreptitiously taking it from me, and abiding by my wishes.

So, I suppose the upshot is that it’s time for business to start asking me for my information, and asking what controls can be placed on it. Through that process alone, the real value in the information is revealed, and I don’t feel swindled.

Just some thoughts, but I could be wrong. Let’s take another poll.