Posted on Thursday, July 24 2014 at 8:28 am by

Watershed Event on 21st C. Regulation of Privacy, Technology, Civil Liberties & Cybersecurity


UPDATE: In my humble opinion, this hearing was the watershed we expected. At the very end of the hearing, Ranking Member Cummings appeared genuinely moved by the testimony, called it a “critical moment,” and praised the hearing, which represented extraordinary movement from the party line struck at the beginning of the hearing. Politico, Mother Jones and all the rest failed to note that movement at all, focusing only on the most vitriolic moments of the hearing. Perhaps I am naive, but the final moments of the hearing appeared to open the door to bipartisan investigation in the public interest.

__________________________________________________________________________

Original Post:

One of the most interesting and potentially influential political events on privacy, cybersecurity, civil liberties and technology regulation in the US and beyond — and of course that is saying a lot in the age of Snowden — will take place online, free, now, and you simply cannot miss it. The US House Committee on Oversight and Government Reform is about to hold a hearing entitled:

The Federal Trade Commission and Its Section 5 Authority: Prosecutor, Judge, and Jury

Yes, the event is political, like any Congressional hearing nowadays, and the partisan thunder has been rolling for days before the storm. Yesterday, Senate Commerce Chairman Jay Rockefeller (D-W.Va.) was so “troubled by the impropriety” of the related investigation by House Oversight Committee Chairman Darrell Issa (R-Calif.), which he considers “interference” in the important FTC proceeding against LabMD, that he took the rare step of trying to interfere in the House proceeding himself. The agenda for today’s hearing shows the weakness of Senator Rockefeller’s claim, however. Not only are the CEO of LabMD and another small businessperson on the agenda, but the legal scholars who, as I said in a previous post, have written the most important law review articles on opposite sides of the issue will each testify and take questions.

The ultimate issue at stake is one of the most important facing us in the 21st Century:

How can regulation keep up with exponential rates of change in technology?

The FTC has taken the position that in order to keep up, it needs to be able to enforce regulatory standards without specific notice of those standards. With help from FTC Commissioner Maureen Ohlhausen and the two scholars who will testify today, here’s how I can best express the issue to you:

Why would an agency trying to raise standards for the security of personal information avoid giving notice of its standards? Federal Trade Commissioner Maureen Ohlhausen recently offered remarks[1] that clarify just how important this strategy is to the FTC. In short, her argument is that given widespread innovation and the rate of change in technology, the information regulators need to gather in order to promulgate regulations is so widely dispersed and ephemeral that notice-and-comment rulemaking is stale by the time it is promulgated and carves regulatory categories unfit for their purposes. Her solution is the FTC’s Section 5 “unfairness” jurisdiction, which gathers information only from the parties and makes judgments on those specific facts, calling it “ex post regulation.” She notes that while the results only bind the parties, others can and should look to the results as evidence of how the FTC would regard similar facts, and that “when the FTC weighs that precedent in future cases, it can then consider any changes in the underlying facts.”

If you are trying to run a business, you might find ex post regulation an elegant solution for the regulator but at least worrisome in that the rules regarding your facts are not known in advance. Those who know the FTC’s settlement agreements – almost always involving 20 years of monitoring – find it more troubling. Perhaps most troubling is the knowledge that the consent orders obtained generally involved no admission of wrongdoing, and represent practical business decisions by enterprises wishing to avoid years of ruinous litigation and damage to their reputations, rather than judgments of courts on the merits.

Commissioner Ohlhausen is well aware of the amount of power ex post regulation gives the FTC, and perhaps for that reason starts her speech with “Principle 1: Regulatory Humility.”[2] Professors Solove and Hartzog made the case, in a very thoughtful and influential article written before her remarks and somewhat inconsistent with them, that the FTC has exercised, if not humility, then at least restraint in the actions it has brought, providing justification for the current trend of viewing FTC privacy and information security consent orders under its Section 5 unfairness and deception authorities as the development of a “common law.”[3]

The FTC’s actions may not have lived up to the justification that Professors Solove and Hartzog have developed for them, nor to the principle of humility. For example, when an administrative law judge recently ordered the FTC to disclose its “unfairness” information security standards in the LabMD case,[4] the FTC did not claim that the security provisions mentioned in its more than fifty information security cases constitute precedent; it generally confirmed that every judgment is case-specific.[5] By the same token, the FTC does not ask its experts in the cases it brings to review its settlement agreements; rather it asks only for–and then relies on–a case-specific judgment based on the expert’s (mostly technical) security expertise; that is ex post information security regulation in action.[6]

Here’s the link again. Don’t miss it!

[1] The Procrustean Problem with Prescriptive Regulation, Remarks of Maureen K. Ohlhausen, Commissioner, U.S. Federal Trade Commission, to the Sixth Annual Telecom Policy Conference of the Free State Foundation, Washington, DC, March 18, 2014. Commissioner Ohlhausen noted that “The views expressed in these remarks are my own and do not necessarily reflect the views of the Federal Trade Commission or any other Commissioner.”

[2] For a good article on how fair notice principles could be considered by the FTC, see Stegmaier, Gerard M. and Bartnick, Wendell, Psychics, Russian Roulette, and Data Security: The FTC’s Hidden Data Security Requirements (May 9, 2013). George Mason Law Review, Vol. 20, No. 3, pp. 673-720, 2013. Available at SSRN: http://ssrn.com/abstract=2263037

[3] Solove, Daniel J. and Hartzog, Woodrow, The FTC and the New Common Law of Privacy (August 15, 2013). 114 Columbia Law Review 583 (2014); GWU Legal Studies Research Paper No. 2013-120; GWU Law School Public Law Research Paper No. 2013-120. Available at SSRN: http://ssrn.com/abstract=2312913 or http://dx.doi.org/10.2139/ssrn.2312913

[4] http://www.ftc.gov/system/files/documents/cases/140501labmdordercompel.pdf

[5] Transcript of the Testimony of Daniel Kaufman, May 12, 2014, at http://assets.law360news.com/0543000/543678/LabMD-Kaufman-Transcript.pdf and http://www.phiprivacy.net/wp-content/uploads/LabMD-Kaufman-Transcript.pdf

[6] See, e.g., Expert Report of Raquel Hill, Ph.D., included on p. 19 at http://www.ftc.gov/system/files/documents/cases/140502mtnlimitexpertrpt.pdf

Posted on Saturday, July 12 2014 at 2:42 pm by

The Presentation of Self in the Everyday Workplace

 


How should employers and employees deal with US law’s new recognition that digital life on a phone or cloud is often as intimate as a diary?

Recent big, bold Supreme Court decisions on cellphone privacy have come at about the same time that the best summer business reading presents a cure for the dishonesty of the current employer-employee relationship. The confluence of these two unrelated developments may in fact offer some useful opportunities in connection with the lesser dishonesty of being employed while using the Internet.

The apparent new right in the privacy of cell phone data was immediately recognized as based more on the richness of the personal data than on where the data resides, potentially protecting data of comparable richness in all of its clouds and other travels far beyond the cell phone. Already, we can see ripples of these criminal cases in a civil case involving employees going after other employees’ cell phones.

The first big question for employers and employees is: As these ripples move into the workplace and employers realize that their BYOD policies and employee handbooks may not be clear enough regarding the search of personal cloud repositories or other personal information stores, will they continue to design policies, consents and acknowledgements as broadly as the law permits and to some extent requires? Or will they — and in which ways can they, in view of their obligations to monitor discrimination and harassment — think about zones or counterbalancing principles of privacy, given that, as the Supreme Court recognized, the digital lives to which employees can grant access are as personal as the most intimate diary?

That question brought to mind the source of this post’s title, Erving Goffman, because it goes beyond the privacy of data elements to the composition of the employee self in relation to the employer and other employees. Goffman treated face-to-face interaction as theatrical performance, and distinguished a “backstage” in which people could be themselves and prepare for performance. In some ways, by not (in the US) extending employee privacy rights from private physical spaces (e.g., lockers) to the employer-sponsored electronic media on which many employees live, we have gotten rid of the backstage, and social media only intensifies the self-expression. (Employee self-expression online has long resulted in countless workplace disputes and, more recently, in broad NLRB protection of certain content.)

The big question, restated in Goffman’s terms, is to what extent and how employers will allow employees to have a backstage. Not all of us need a backstage to be creative and productive and authentic, but others are quite clear that they do.  So, as most work becomes more and more temporary and part-time, and employers focus more and more on creating honest, bilateral “alliance” relationships, a concrete question the employer bilingual in Goffman and Hoffman might ask is:

How can I monitor what I need to monitor while still providing enough of a backstage for the ones who need it, enabling the alliances we want?

And the digital workplace privacy policy may even become a document that applicants and employees want to read, because it might speak directly to their ability to have authentic relationships in the workplace.


Posted on Thursday, July 10 2014 at 9:21 am by

Healthcare in 20 Years & 10 Years, & How to Reconnect Your Brain and Body Now

1.  In 20 Years, It’s Plug & Play:

In 20 years, humans will finally attain the status of cars for their medical care. They’ll have wearable and embeddable sensors with predictive analytics, and, most importantly, autonomous driving capabilities. Most cases of cancer will be successfully treated, Alzheimer’s will be substantially delayed or even pre-empted. DNA sequencing will be performed for most individuals at birth (or as a fetus). Hospitals, except for certain key functions like intensive-care units and operating rooms, will be completely transformed to data-surveillance centers. People will look back and laugh about the old physical office visit and the iconic ‘stethoscope’ along with the way so much of health care was rendered in the pre-digital era.

— Eric J. Topol, chief academic officer of Scripps Health and professor of genomics at the Scripps Research Institute, quoted in the Wall Street Journal

2.  OK, So What About in 10 Years?

[Infographic: mHealth]

3.  But What About Now?  Surely you can’t rewire the connection between the brain and the body?

http://www.popsci.com/article/science/how-it-works-system-reverses-paralysis

For many more every day, read

Posted on Saturday, June 28 2014 at 7:20 pm by

Why Healthcare Providers Should Take from Data Brokers, & Why Privacy Advocates & Regulators Shouldn’t Try to Stop Them


Many brilliant people with the very best of intentions felt or expressed dismay this week at a good article with the body-snatchers-invasion-class title, “Your Doctor Knows You’re Killing Yourself. The Data Brokers Told Her.” Surely the marauding hordes of data brokers (the targets of Federal Trade Commission investigations!) should be kept far away from the sacrosanct relationship between doctor and patient! The article ends with an ethicist intoning that the strategy “is very paternalistic toward individuals, inclined to see human beings as simply the sum of data points about them.”

I thought about that article and that ethicist yesterday as I sat with my family in a travel clinic giving us the shots we would need for a cultural exchange program in a village in India. The great doctor and nurse in the clinic knew nothing about us (Well, thanks to an electronic medical record, they knew about 1 pain med I didn’t even take a decade ago, which they dutifully asked me about). They didn’t even know any of the information we had entered on-line for them this week (The nurse explained that “On-line, you’re just a number.”).

My mind flashed, as it often does, between the many years I spent helping friends in the public health world as they tested hypotheses in search of truths applicable generally across vast populations, and my current life working with big data initiatives producing real-time, actionable, individual-focused information. I thought how eager the four of us (and I assure you that my professor wife and clever children are not big data lawyers) would have been for this doctor and nurse to get the scoop on us: in the case of the children, to engage them more, and in the case of the parents, to give us the warnings we most need to hear. And I thought, as I have for many years, about paternalism in medicine.

I got into health care law in 1982, my first year of law school, in part because that was the year Paul Starr released his seminal book, “The Social Transformation of American Medicine.” It was a great study on the creation followed by the corporatization of a profession. There was a lot of talk from leaders as well as scholars in those days about medical costs and the profession of medicine; I for one will never forget Joe Califano’s “Medicine is too important to be left to doctors and politicians.” So many big failures and little successes followed; the constant of massive federal lobbying (in my humble opinion) served as birth control and an occasional abortion against effective health care reform.

If we fast-forward to today, so much has not changed yet in health care. The biggest changes on the immediate horizon may be the role of apps, mobile devices and home monitoring in personal health. As even the U.S. Supreme Court acknowledged this week, mobile devices have our whole lives on them, incredibly rich information so much of which bears directly on our health. The next iterations of iOS and Android will make integration of that information work better than ever. And the data on the fitness apps to date shows that they become effective (like Weight Watchers and AA) when others are watching. When others are watching, a health or fitness app becomes an effective “commitment device”; when others are not watching, those apps can be a little like Odysseus tying himself to the mast with a slip knot, only to crash his ship into the same rocks that have sunk countless New Year’s resolutions.

In our “bowling alone” society distrustful of the unregulated purveyors of health and fitness apps, however, where can you find a trustworthy ship or crew to serve as your commitment device? Oh, look! The big failures to change our health care system into something better have left your doctor–not so deprofessionalized after all–as someone you trust! And look! Your doctor is regulated by relatively stringent privacy and information security rules (including rules against marketing to you without your authorization), and if she enters a relationship with one of those apps in which she shares your information or the app creates information on her behalf, it is subject to most of those same regulations! And then (to get back at last to the beginning of this post) there is all the other information on you both on and off your mobile device, information, e.g., about your food, your activity levels and your stressors; that information, too, when received by your provider, becomes subject to not only those privacy and information security rules, but to more stringent state rules and the rules of professional ethics.

Finally, I ask you this: Which is the most and least paternalistic to you as a patient: (1) to give you the choice about whether your healthcare provider really knows you or knows only your self-reported issues, (2) to force her to know things about you that you haven’t reported, or (3) to make sure that she tells you what to do knowing only the few facts you have given her that day? With electronic medical records and health information exchanges, we have apparently already made the choice for (2) and are spending a great deal of money to try to make it work. Much more inexpensively, health care providers can have much richer information about you and your health than is often available on an EMR or in an HIE, and can be the most trusted repository available for that information. I submit to you for that reason that providers will not be serving their patients well if in the very near future they are not taking and using information from data brokers, at least giving patients the choice described in (1), and privacy advocates and regulators will be preventing important improvements in our health and healthcare systems if they prevent providers from taking and using such information.

Posted on Sunday, June 8 2014 at 3:18 pm by

Privacy for Franchisors: Tough Regulation without Standards or Scalability

 


Where a zealous regulator has a great deal of power, but no published standards or accountability to legislatures or courts, and appears to exercise limited discretion in applying a single, onerous penalty on all entities it regulates, your object as a regulated entity might well be to escape the notice of the regulator.  At the time of this writing, that is the situation faced by franchisors whose privacy and information security practices are regulated by the Federal Trade Commission (FTC).

In the Aaron’s and Wyndham cases, the FTC made it clear that franchisors are in its crosshairs, but the FTC was shooting from opposite directions in the two cases.  Aaron’s[1] is fundamentally about holding the franchisor responsible for software installed by franchisees, and the FTC forges the link in ways that should give franchisors pause, i.e., through the communication and IT support platforms provided to franchisees.  For example, the factors considered by the FTC included the following common “mistakes”: the franchisor allowed franchisees to access the software designer’s website (without which they could not activate the software), the franchisor’s server was used to transmit and store emails containing content obtained with the software, and the franchisor provided franchisees with tech support for the software.

In Wyndham,[2] on the other hand, the fundamental issue is whether the franchisor has exerted enough control, in a multitude of areas of information security, over the franchisees to establish an information security program that the FTC deems “reasonable.”  Aaron’s mistake of commission (and vicarious liability) was a piece of privacy-invasive software; Wyndham’s mistakes of omission were all the things it did not do to create a comprehensive information security program in the FTC’s eyes, as evidenced by its three security breaches in two years (not a large number for a large hospitality chain).  Both cases were built on both of the fundamental areas of authority claimed by the FTC over privacy and information security:  the relatively uncontroversial “deception” authority to enforce privacy and security “promises” in privacy policies, and the more controversial “unfairness” authority to enforce “reasonable security.” Unfairness authority lies at the heart of Wyndham, however, and when Wyndham became the first entity to challenge that authority in court, the FTC received its first judicial affirmation of both its unfairness authority and its ability to enforce that authority without published standards.[3]

Why would an agency trying to raise standards for the security of personal information avoid giving notice of its standards?  Federal Trade Commissioner Maureen Ohlhausen recently offered remarks[4] that clarify just how important this strategy is to the FTC.  In short, her argument is that given widespread innovation and the rate of change in technology, the information regulators need to gather in order to promulgate regulations is so widely dispersed and ephemeral that notice-and-comment rulemaking is stale by the time it is promulgated and carves regulatory categories unfit for their purposes.  Her solution is the FTC’s Section 5 “unfairness” jurisdiction, which gathers information only from the parties and makes judgments on those specific facts, calling it “ex post regulation.” She notes that while the results only bind the parties, others can and should look to the results as evidence of how the FTC would regard similar facts, and that “when the FTC weighs that precedent in future cases, it can then consider any changes in the underlying facts.”

If you are trying to run a business, you might find ex post regulation an elegant solution for the regulator but at least worrisome in that the rules regarding your facts are not known in advance.  Those who know the FTC’s settlement agreements – almost always involving 20 years of monitoring – find it more troubling.  Perhaps most troubling is the knowledge that the consent orders obtained generally involved no admission of wrongdoing, and represent practical business decisions by enterprises wishing to avoid years of ruinous litigation and damage to their reputations, rather than judgments of courts on the merits.

Commissioner Ohlhausen is well aware of the amount of power ex post regulation gives the FTC, and perhaps for that reason starts her speech with “Principle 1: Regulatory Humility.”[5]  Professors Solove and Hartzog made the case, in a very thoughtful and influential article written before her remarks and somewhat inconsistent with them, that the FTC has exercised, if not humility, then at least restraint in the actions it has brought, providing justification for the current trend of viewing FTC privacy and information security consent orders under its Section 5 unfairness and deception authorities as the development of a “common law.”[6]

The FTC’s actions may not have lived up to the justification that Professors Solove and Hartzog have developed for them, nor to the principle of humility.  For example, when an administrative law judge recently ordered the FTC to disclose its “unfairness” information security standards in the LabMD case,[7] the FTC did not claim that the security provisions mentioned in its more than fifty information security cases constitute precedent; it generally confirmed that every judgment is case-specific.[8]   By the same token, the FTC does not ask its experts in the cases it brings to review its settlement agreements; rather it asks only for–and then relies on–a case-specific judgment based on the expert’s (mostly technical) security expertise; that is ex post information security regulation in action.[9]

The LabMD case is a very important one in that there the FTC is applying its ex post standards not to an entity whose information security obligations are uncertain, but to an entity whose obligations regarding consumer information security are covered by one of the most detailed regulatory structures in the country, the rules under the Health Insurance Portability and Accountability Act of 1996 (HIPAA).  Moreover, HIPAA security standards – particularly as they apply to small health care providers like LabMD – are by Congressional design and regulation lower than FTC standards,[10] so the imposition of higher standards frustrates Congressional and HHS choices.  Thus if the FTC can apply its Section 5 authority to LabMD, it can arguably apply that authority to any entity in commerce, regulated or not.

The US health care marketplace resembles the franchise economy in consisting of dispersed networks of large and small entities, the smaller of which have limited resources for information security.  The 1996 HIPAA statute therefore stated that in promulgating information security regulations, the Secretary of HHS must take into account “the needs and capabilities of small health care providers and rural health care providers (as such providers are defined by the Secretary),”[11] and the preamble to the HIPAA Security Rule states accordingly that one of the foundations of the rule is that “it should be scalable, so that it can be effectively implemented by covered entities of all types and sizes.”[12]

This principle of scalability is not only a HIPAA requirement; it is basic to pragmatic information security.  A small entity can only do what it can do, so it needs applications that take care of the security issue as much as possible, by default.  If a small entity takes on a big risk (e.g., a large data file), it cannot do so with the IT staffing of a large entity, so it needs guidance to outsource, e.g., to a secure cloud provider – not guidance to use tools that, even if it could identify them, it would never properly deploy, integrate and effectively use.

FTC Chair Edith Ramirez expresses the starkly contrasting position of the FTC:

The FTC conducts its data security investigations to determine whether a company’s data security measures are reasonable and appropriate in light of the sensitivity and volume of consumer information it holds, the size and complexity of its data operations, and the cost of available tools to improve security and reduce vulnerabilities.[13]

Her statement is quite accurate: The FTC’s standards vary only by the risk associated with the information and the cost of “tools,” not including the availability of knowledge of those tools and not including – and this is critical in the information security area – the cost of implementing and integrating those tools, and the cost of taking action in response to the complex signals of many detection tools, which in fact require large IT staffs not feasible for small entities.  The FTC thus has neither a mandate nor a mission to consider the regulated entity and the feasibility of compliance, scalability, the availability of knowledge of which tools are the best, and the ability to integrate technical tools rather than just buying something off the shelf.

Judge William Duffey of the Northern District of Georgia got a look at this case before deciding that the federal courts have no jurisdiction to do anything about it yet, and offered a lot of advice in open court that underscores the big question of whether the FTC, now apparently “clothed with immense power” by the Wyndham decision, can exercise responsible discretion or Commissioner Ohlhausen’s first principle of “humility” (including whether federal courts can help with those lessons after the FTC’s administrative process is complete).  He said:

I think it’s the responsibility of the government to be fundamentally fair to the people that it’s regulating, and that it would be in your interest and I would hope your motivation as an employee of the government…. [H]ow does any company in the United States operate when they are trying to focus on what HIPAA requires and to have some other agency parachute in and say, well, I know that’s what they require, but we require something different, and some company says, well, tell me exactly what we are supposed to do, and you say, well, all we can say is you are not supposed to do what you did.[14]

Remarking on the notion that a small laboratory about to go out of business should be subject to the 20 years of monitoring that is a universal feature of the FTC’s consent decrees, he suggested that the FTC consider:

a good faith, transparent, authentic discussion about what your concerns are, and trying to get those allayed by some process which would not be a twenty-year monitoring. You know, I have defended people that had twenty-year monitoring responsibilities by an agency, big companies, and it’s very, very expensive, and it’s really intrusive, and in my personal opinion, having been on both sides, they generally are not necessary. But there is never a middle ground. There should be.[15]

For now, however, there is no middle ground.  A franchisor has no option but to act on some difficult decisions.

_____________________________________________________________________________________________________________________

Disclosure: KTS has represented LabMD.

Photo: Japanese-American-owned grocery, 1942.

[1] http://www.ftc.gov/enforcement/cases-proceedings/122-3256/aarons-inc-matter

[2] http://www.ftc.gov/enforcement/cases-proceedings/1023142/wyndham-worldwide-corporation

[3] FTC v. Wyndham Worldwide Corp., No. 2:13-cv-01887-ES-JAD, 2014 BL 94785 (D.N.J. Apr. 7, 2014).

[4] The Procrustean Problem with Prescriptive Regulation , Remarks of Maureen K. Ohlhausen, Commissioner, U.S. Federal Trade Commission to the Sixth Annual Telecom Policy Conference of the Free State Foundation, Washington, DC, March 18, 2014.  Commission Ohlhausen noted that “The views expressed in these remarks are my own and do not necessarily reflect the views of the Federal Trade Commission or any other Commissioner.”

[5] For a good article on how fair notice principles could be considered by the FTC, see  Stegmaier, Gerard M. and Bartnick, Wendell, Psychics, Russian Roulette, and Data Security: The FTC’s Hidden Data Security Requirements (May 9, 2013). George Mason Law Review, Vol. 20, No. 3, pp. 673-720, 2013. Available at SSRN: http://ssrn.com/abstract=2263037

[6] Solove, Daniel J. and Hartzog, Woodrow, The FTC and the New Common Law of Privacy (August 15, 2013). 114 Columbia Law Review 583 (2014); GWU Legal Studies Research Paper No. 2013-120; GWU Law School Public Law Research Paper No. 2013-120. Available at SSRN: http://ssrn.com/abstract=2312913 or http://dx.doi.org/10.2139/ssrn.2312913

[7] http://www.ftc.gov/system/files/documents/cases/140501labmdordercompel.pdf

[8] Transcript of the Testimony of Daniel Kaufman, May 12, 2014, at http://assets.law360news.com/0543000/543678/LabMD-Kaufman-Transcript.pdf and http://www.phiprivacy.net/wp-content/uploads/LabMD-Kaufman-Transcript.pdf

[9] See, e.g., Expert Report of Raquel Hill, Ph.D., included on p. 19 at http://www.ftc.gov/system/files/documents/cases/140502mtnlimitexpertrpt.pdf

[10] See fns. 11 &amp; 12, infra.

[11] 42 U.S. Code § 1320d–2(d)(1)(A)(v).

[12] 68 Fed. Reg. 8,334, 8,335 (Feb. 20, 2003).

[13] PREPARED STATEMENT OF THE FEDERAL TRADE COMMISSION on Protecting Personal Consumer Information from Cyber Attacks and Data Breaches, Before the Committee on Commerce, Science, and Transportation, US Senate, Washington, D.C., March 26, 2014.

[14] http://docs.law.gwu.edu/facweb/dsolove/Information-Privacy-Law/files/LabMD%20Transcript%202014-05-07.pdf, at 94

[15] Ibid., at 89

Posted on Friday, May 30 2014 at 10:35 am by

KT IP Industry Summary: Rockes Get No Personal Jurisdiction Over Pebble Beach from Keyword Ads


Rocke v. Pebble Beach Co., CIV.A. 12-3372, 2014 WL 1725366 (E.D. Pa. Apr. 29, 2014)

On a vacation to California with her husband, Mrs. Rocke decided to get a massage at The Spa at Pebble Beach. After putting on slippers supplied by the spa, Mrs. Rocke tripped and fell, hitting her head. When she returned to her home state of Pennsylvania, the Rockes brought suit against The Spa in district court.

In alleging jurisdiction, the Rockes claimed that the defendant had sent direct mailings and email solicitations to Pennsylvania citizens and had also advertised in national golf magazines and solicited the business of Pennsylvanians through its website.

The district court dismissed the complaint for lack of personal jurisdiction. The Third Circuit agreed that the plaintiffs had established neither general nor specific jurisdiction, but remanded, holding that the plaintiffs were entitled to jurisdictional discovery.

Following discovery, the plaintiffs argued a new basis for jurisdiction—the defendant’s practice of purchasing keyword advertisements from Google. The plaintiffs contended that “[t]he intent behind the advertising campaign [was] to attract customers from around the world, including Pennsylvania, to Pebble Beach Resorts.”

The court rejected the argument, reasoning that “[b]y purchasing AdWords, Pebble Beach (like many other businesses utilizing the same marketing technique) seeks to make itself more visible to anyone in the world who searches Google using certain keywords or search terms.” The court found that the plaintiffs failed to produce any evidence that the defendant “purchased Pennsylvania-specific AdWords in order to solicit Pennsylvania business” and held that “[t]he mere fact that Pennsylvania residents are potentially swept up in the broad ocean of people to whom Pebble Beach is advertising through AdWords is not even a direct contact, much less a continuous and systematic one.”

While the decision leaves open the possibility that keywords targeted to Pennsylvanians might have yielded a different result, the court made clear that simply purchasing keyword advertisements is insufficient to subject a defendant to nationwide jurisdiction.

Posted on Friday, April 11 2014 at 6:14 pm by

DOJ and FTC Release Policy Statement on Antitrust Implications of Information Sharing



Yesterday, the Federal Trade Commission (“FTC”) and the Department of Justice (“DOJ”) – the two federal antitrust enforcement agencies – issued a joint antitrust Policy Statement regarding arrangements through which industry participants, including competitors, share cybersecurity information. The statement outlines the agencies’ enforcement policy and analytical approach to information exchanges focused on cybersecurity issues. The Policy Statement makes clear that the antitrust laws do not stand as “a roadblock to legitimate cybersecurity information sharing.” In fact, the Assistant Attorney General for DOJ’s Antitrust Division, Bill Baer, called the Policy, which gives industry a great deal of leeway to share cybersecurity information, an “antitrust no-brainer.”

The antitrust laws have traditionally treated exchanges of information between and among competitors with a fair amount of suspicion. This Policy Statement is intended to give companies, even if they compete, a green light to share much-needed information to protect against cyber-attacks. It also serves as yet another indicator that the agencies view cyber threats as a major risk to the nation’s economy and security. From an antitrust perspective, the ability of industries to collaborate to prevent attacks is particularly relevant, given the FTC’s recent spate of aggressive investigations and enforcement actions over data breaches that expose consumers’ sensitive personal and financial information to unintentional disclosure or theft by cybercriminals.

The agencies and the Obama administration have recognized the complexities associated with addressing these rapidly evolving threats, which require both companies and government agencies constantly to adapt to defend against new types of attacks. Given both the economic and national security concerns raised by cybersecurity, the Obama administration issued a February 2013 Executive Order on the importance of government/business collaboration on cybersecurity. That order in turn led the National Institute of Standards and Technology (“NIST”), in February 2014, to issue a voluntary cybersecurity framework.

As the FTC and DOJ note in their Policy Statement, public/private collaboration alone cannot solve the cybersecurity issues that U.S. companies face. Companies must also collaborate to share information about emerging threats, as well as to share potential solutions. In fact, the most useful information-sharing is often not from government or from other areas of the economy, but among companies in the same industry–whether energy, financial services, retail, healthcare or hospitality—which tend to be targeted by similar malware and/or the same groups of attackers. The agencies note that some formal and informal private-to-private information sharing mechanisms (like Information Sharing and Analysis Centers (“ISACs”)) do exist in certain industries, but note that some companies have expressed a reluctance to share information with their competitors due to antitrust concerns.

To allay these concerns, the Policy Statement outlines the agencies’ general policy on information exchange, as well as the specific analysis they apply to exchanges of cybersecurity information. The agencies’ approach to information sharing is spelled out in the 2000 Competitor Collaboration Guidelines and the 1996 Health Care Guidelines. Generally speaking, the agencies are primarily concerned with exchanges involving competitively sensitive information – e.g., recent, current and future pricing, cost information, and output information – because such exchanges might facilitate market allocation or price fixing among competitors.

Generally, information exchanges, without more, are not illegal per se. Instead, the antitrust agencies apply a balancing test known as the “rule of reason,” which weighs the potential procompetitive benefits associated with an exchange against the anticompetitive harm that might result. In performing this analysis, the agencies focus on the context in which the information is exchanged, the parties exchanging the information, the nature of the information exchanged, and whether the exchange generates any procompetitive benefits, like increased efficiency, lower costs, or increased output.

In the Policy Statement, the agencies walk through how the general information exchange analysis applies to exchanges of cybersecurity information. First, the agencies note that such exchanges increase efficiency and improve information security, both of which are procompetitive. Second, the agencies address the nature of the information, explaining that cybersecurity information tends to be highly technical. For example, the agencies note that companies might exchange a known source IP address for a denial of service attack or a threat signature for a new type of attack. Information such as this is not the type of competitively sensitive information relating to price, cost, or output that generally concerns the agencies. Accordingly, if companies confine their sharing to technical information that does not reveal information traditionally treated as competitively sensitive, the antitrust risks should be minimal.
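The line the agencies draw can be made concrete. The Python sketch below is purely illustrative (the Policy Statement prescribes no format, and all field names and values here are hypothetical); it shows the kind of filtering counsel might recommend before a record leaves the company: technical indicators are retained, while fields that could reveal competitively sensitive business information are stripped.

```python
# Hypothetical sketch: share only technical threat indicators, never
# competitively sensitive business data. Field names are illustrative.

# Fields the antitrust analysis suggests are safe to share: purely
# technical facts about the threat itself.
SHAREABLE_FIELDS = {"indicator_type", "value", "first_seen", "attack_type"}

def sanitize_indicator(record: dict) -> dict:
    """Keep only technical fields; drop anything business-sensitive
    (customers, costs, pricing) before sharing with peers or an ISAC."""
    return {k: v for k, v in record.items() if k in SHAREABLE_FIELDS}

raw = {
    "indicator_type": "source_ip",
    "value": "203.0.113.45",            # known DDoS source (example address)
    "first_seen": "2014-04-09",
    "attack_type": "denial_of_service",
    "affected_customer": "Acme Corp",   # competitively sensitive: removed
    "remediation_cost_usd": 120000,     # competitively sensitive: removed
}

shared = sanitize_indicator(raw)
```

The design point is simply that the filter is an allowlist, not a blocklist: anything not affirmatively identified as technical stays behind, which mirrors the agencies’ focus on what is exchanged rather than with whom.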

Finally, the agencies consider any potential harm to competition caused by an exchange of cybersecurity information. Due to the fact-specific nature of this inquiry, the agencies reference DOJ’s October 2000 business review letter to the Electric Power Research Institute, Inc. (“EPRI”), in which it analyzed a proposed cybersecurity information exchange program.

EPRI is a nonprofit organization focusing on technological solutions to issues in the energy industry. It proposed exchanging information concerning best practices and information relating to vulnerabilities. In time, EPRI anticipated its members engaging in discussion or analysis of real-time cybersecurity threats. In evaluating the exchange, the DOJ noted that the information exchanged would focus on cyber and physical security, and that EPRI had said it would not allow participants to exchange either price or cost information, or vendor recommendations. Ultimately, the DOJ concluded that:

 [a]s long as the information exchanged is limited…to physical and cybersecurity issues, the proposed interdictions on price, purchasing and future product innovation discussions should be sufficient to avoid any threats to competition. Indeed, to the extent that the proposed information exchanges result in more efficient means of reducing cybersecurity costs, and such savings redound to the benefit of consumers, the information exchanges could be procompetitive in effect.

Both the new Policy Statement and the underlying EPRI business review letter should give companies comfort that collaborating on cybersecurity issues with competitors will not lead to scrutiny from the agencies. Nonetheless, counsel should be careful to remind participants in such exchanges to keep them focused on technical issues, as broadening the scope of the discussion to include vendor recommendations, pricing, or cost will create antitrust risk.

 

Posted on Wednesday, April 2 2014 at 10:33 am by

Message to the White House on Big Data’s Range of Apprehension, Privacy and Unfairness


“The risk reasonably to be perceived defines the duty to be obeyed, and risk imports relation; it is to another or others within the range of apprehension.”

As you know, Helen Palsgraf was standing on a train platform in the early 20th Century when a man, jostled while trying to board a train with a package of fireworks, dropped the package, and the explosion made scales fall on her. In denying her recovery against the railroad, Judge Cardozo’s words above greased the wheels of the dumb mechanical networks and satanic mills of the first industrial revolution (belatedly, as law tends to do), where scales could fall on Ms. Palsgraf unforeseeably. The railroad’s range of apprehension (its ability to anticipate) was limited then, but how can law handle the information networks of this industrial revolution, which expand the range of apprehension as far as we want in almost any direction we want? If we’re smart, we don a blindfold where we don’t need or want to know, and tell people about it, as Google did in not letting Glass do facial recognition. Because “intentions are more valuable than identities” in big data economics, some companies are destroying data that enable or at least facilitate reidentification, and protecting data against reidentification. As the many types of inappropriate or impermissible discrimination associated with big data become clearer, these de-identification or pseudonymization efforts may in some cases have to give way to approaches that protect people by identifying them; for this reason, we can expect some traditional contrasts between US anti-discrimination law and European privacy law to continue in the big data economy.

Again and again, it is one of the most difficult questions for electronic communication networks and websites, and ever more so in the big data economy: To what extent do we have an obligation to know, and then to act on, everything on our servers or in our cloud, to police the content that users enter, to take action to protect something or someone, to report violations? What obligations does the “100% auditability” of big data create? And the obligations keep growing. A recent FTC investigation found a franchisor responsible for privacy violations of its franchisees in part because the franchisees’ actions were documented on the franchisor’s email system. And how many franchisees’ actions are not?

A great example of somebody who just left his blindfold and maybe his brain in the car is the automotive executive who said at a conference that  “We know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing. By the way, we don’t supply that data to anyone.”  When his statement was reported, he immediately retracted it, stating that “We do not track our customers in their cars without their approval or their consent.”  (Note to consumers: Check your purchase agreements, leases, financing agreements and insurance policies/applications.)

There are global and industrial-strength issues, however, in which we cannot get away with a blindfold–cannot avoid the benefits and burdens of foresight–as easily as we can with privacy and fairness.  How much longer will commercial airliners be flying without universal GPS tracking after Malaysia Airlines Flight 370?  Even by 2008, there were more non-human nodes online than people; by 2020, there may be more than 50 billion non-human nodes online.   When people are not the surveillance targets, the technology designer, manufacturer or user has more trouble donning a blindfold.  Imagine inspecting a factory or a construction worksite wearing Glass.  Thousands of pictures have been taken of everything, and have been uploaded into tools capable of predictive pattern detection as never before, as a scan with Glass of the  East New York Long Island Rail Road station in the 1920s might have revealed scales that would fall even if some fireworks exploded far away on the train platform.  Even if exposures could not have been detected, there are still all of those pictures, each potentially more damning than the “smoking gun” email in or out of context in today’s product liability litigation.

And what of the machines, buildings and other products created in the big data economy, sending off information about everything around them for their entire lives?   The scales themselves at the train station would be admitting their proclivity to fall 24/7, perhaps identifying elevated threat levels when packages containing fireworks entered the train station, themselves of course broadcasting 24/7.    Who will be responsible for being on the receiving end of such messages for the entire life of a building, a bridge or an airplane?  Is that the same entity that will keep the software updated and protected for that whole life?   The same entity responsible for incident response and safety?   Or, with no range of apprehension to limit the duty to be obeyed, will we allocate non-consensual duties in new ways?

Palsgraf  may be the most famous U.S. case of the 20th Century on non-consensual duties (torts, but also property and statutes).   FIPPs or no FIPPs, non-consensual duties look like they are multiplying exponentially in the emerging internets of everything, if only because there will be so many interactions with so many different types of intelligent objects in any moment as to make even the best efforts to create meaningful choice incomplete.  So it looks like we’re headed into another industrial revolution in which individuals may be swept up in systems over which they have no control (like the satanic mills), and therefore non-consensual duties will play a big role, both outside and within “trust networks,” in the areas of  both privacy and fairness.   We need norms for privacy and fairness, which is what the current White House big data initiative is about.   We can be so much better than we were in the Palsgraf era, though, because when we take off the blindfold, the probabilities of harms and benefits can be infinitely more contextually informed now than they were then through the use of predictive analytics itself, and better-informed judgments about risk of harm can be the basis for more well-grounded non-consensual duties of privacy and fairness.

Posted on Monday, March 31 2014 at 5:51 pm by

Fifth Circuit Upholds Finding That Overly Broad Confidentiality Agreement Violated the National Labor Relations Act

 


We have reported in previous Legal Alerts that the National Labor Relations Board (“NLRB”) is closely scrutinizing employers’ personnel policies and workplace rules to identify language that unlawfully restricts employees’ rights under the National Labor Relations Act (“NLRA”). On March 24, 2014, the United States Court of Appeals for the Fifth Circuit upheld an NLRB finding that an employee confidentiality agreement’s provision prohibiting employees from disclosing “financial information” and “personnel information and documents” to outsiders violated the NLRA.

The Court’s Decision in Flex Frac Logistics, L.L.C. v. NLRB

The NLRA gives employees the right to act concertedly for the purpose of collective bargaining or other mutual aid and protection. The NLRB and the courts have recognized that this statutory provision gives employees the right to discuss wages and other terms and conditions of employment with each other and with a union. An employer’s interference with this right, whether by disciplining employees who exercise the right, by expressly prohibiting such conduct, or by merely maintaining a workplace rule that chills the exercise of the right, is an unfair labor practice under the NLRA.

In its recent decision in Flex Frac Logistics, L.L.C. v. NLRB, the Fifth Circuit reviewed an NLRB decision finding that an employee confidentiality agreement unlawfully interfered with employees’ right to engage in protected concerted activity. The confidentiality agreement prohibited employees from disclosing confidential information to anyone outside the company and defined “confidential information” to include “financial information” and “personnel information and documents.” Although the agreement did not explicitly prohibit the disclosure of wage information, the Fifth Circuit found sufficient evidence to support the NLRB’s conclusion that the agreement would chill employees’ exercise of their rights under the NLRA because employees would reasonably believe that the agreement prohibited the disclosure of wage information. The court noted that the agreement’s definition of confidential information included “financial information,” which necessarily encompasses wages, and that the reference to “personnel information” was not limited so as to exclude wage information. Moreover, the court stated that the NLRB did not have to base its finding of an unfair labor practice in this regard on evidence that employees did, in fact, interpret the confidentiality agreement as restricting their disclosure of wage information to outsiders. It is sufficient, the court held, that the language of the agreement would reasonably tend to chill employees’ exercise of their NLRA rights. Because the language at issue here could reasonably be interpreted as barring disclosure of wage rates, the court upheld the NLRB’s ruling that the confidentiality agreement violated the NLRA.

Practical Implications

In recent years, the NLRB has applied increasing scrutiny to employee handbooks, workplace rules, and employment contracts to identify provisions that may reasonably be interpreted as prohibiting conduct protected by the NLRA, even going so far as to find a handbook provision requiring employees to be courteous to others unlawful. As the Flex Frac Logistics decision illustrates, employers cannot rely on the courts to rein in the NLRB’s excesses in this area. Employers should therefore be proactive in reviewing their personnel policies and employee agreements to ensure that they do not contain provisions that are so broadly worded as to infringe upon employees’ right to act in concert with respect to the terms and conditions of employment. Often, the addition of only a few clarifying words can make the difference between a lawful rule that furthers the employer’s legitimate business interests and an unlawful rule that can lead to costly unfair labor practice proceedings.

Posted on Tuesday, March 18 2014 at 1:38 pm by

New York Department of Financial Services Begins Accepting Applications for the Establishment of Regulated Virtual Currency Exchanges

 

On March 11, 2014, the New York Department of Financial Services (“NYDFS”) issued a public order announcing that the NYDFS will consider formal proposals and applications for the establishment of regulated virtual currency exchanges operating in New York. It is expected that the NYDFS will expand its oversight to include those virtual currency exchanges doing business with New York residents. The NYDFS stated that formal proposals and applications may be submitted immediately and may be modified by the applicant through discussions with the NYDFS during the application process to ensure strong legal and operational controls, including anti-money laundering (“AML”), cyber security and consumer protections.

The Order is the NYDFS’ latest step toward regulation following its January 2014 public hearing, which explored potential regulatory frameworks for virtual currency-related transactions. The move also responds to recent events, most notably the collapse of Mt. Gox, which exposed vulnerabilities in the virtual currency markets and underscored the need for stronger regulatory oversight. The NYDFS also stated that it continues to work on regulations, including a “BitLicense” specific to virtual currency transactions and activities, and intends to propose a regulatory framework no later than the end of the second quarter of 2014.

Although the Order does not provide specific guidelines for virtual currency exchange applications and proposals, we expect that the requirements will be similar to those imposed on money services businesses, potentially including bond and net worth requirements. Firms should also ensure their proposals include robust internal control systems covering AML, cyber security and consumer protections. Specifically, and similar to other financial institutions covered under the Bank Secrecy Act and its implementing regulations, applicants should establish AML programs tailored to virtual currency transactions that include (i) the development of internal policies, procedures, and controls to combat money laundering; (ii) the designation of a compliance officer; (iii) an ongoing training program; and (iv) an independent audit function to test the program.

For more information, please contact any member of the Financial Institutions team.

Aaron M. Kaslow 202.508.5825 Akaslow@kilpatricktownsend.com

Michael A. Mancusi 202.824.1419 Mmancusi@kilpatricktownsend.com

Stephen F. Donahoe 202.508.5818 Sdonahoe@kilpatricktownsend.com

Erich M. Hellmold 202.639.4734 Ehellmold@kilpatricktownsend.com

Kevin M. Toomey 202.508.5859 Ktoomey@kilpatricktownsend.com