Lessons Learned: Vendor Sued in Class Action Suit for Security Misses

You’re thinking that something about the title of this post sounds familiar, right? Information technology (IT) vendors and third-party service providers have been in the spotlight for security breaches for some time (see, for example, the vendor-based security lapses affecting Target, CVS, and Concentra, to name just a few), and it isn’t surprising that an IT vendor has been sued over a security incident. After all, whether you’re an IT vendor or an IT customer, if you draft or negotiate contracts for a living, these situations are what you try to contract for, right?

Right…but…the recent federal class action suit filed in Pennsylvania against Aetna and its vendor surfaces several new privacy and security considerations for vendors and their customers. The vendor in question was not an IT vendor or service provider. Instead, the plaintiff’s allegations relate to Aetna’s use of a mailing vendor to send notification letters to Aetna insureds about ordering HIV medications by mail. According to the complaint, the vendor used envelopes with large transparent glassine windows – windows that did not hide the first several lines of the enclosed notification letters. The plaintiff asserts that anyone looking at any of the sealed envelopes could see the addressee’s name and mailing address – and that the addressee was being notified of options for filling HIV medications. As a result, the vendor and Aetna are alleged to have violated numerous laws and legal duties related to security and privacy.

For all vendors and service providers, but especially those that don’t focus primarily on privacy and security issues, the Aetna complaint is enlightening. To these vendors and service providers, and to their customers: Do your customer-vendor contracts and contract negotiations contemplate what Aetna and its mailing vendor may not have?

  • Do your contracts for non-IT and non-healthcare services fully consider the risk of privacy and security litigation? A noteworthy facet of the Aetna case is that the mailing vendor was sued for privacy and security violations that were not exclusively due to the customer’s acts or omissions. That is, while the contents of the mailer certainly were key, the vendor’s own conduct as a mailing services provider (not an IT or healthcare provider) was instrumental in the suit being filed against the vendor (and Aetna). Vendor services that previously didn’t, or ordinarily don’t, warrant privacy or security scrutiny may need to be looked at in a new light.
  • Do your contract’s indemnification and limitation of liability clauses contemplate the possibility of class action litigation? Class action litigation creates a path for plaintiffs to bring claims that otherwise could not and would not be brought. Class action litigation against data custodians and owners for security breaches is the norm, and the possibility and expense of such litigation is frequently on the minds of the attorneys and contract managers who negotiate contracts with privacy and security implications. But for vendors and service providers performing arguably non-IT services for these customers, the idea of being subject to class action litigation is often not top-of-mind.
  • Before entering into a contract, have you considered whether the specific vendor services being provided to the particular customer in question implicate laws you hadn’t considered? Vendors that operate in the information technology space – and their customers – generally are well aware of the myriad privacy and security laws and issues that may impact the vendors’ business, including, as a very limited illustration, the EU General Data Protection Regulation, HIPAA, and the New York Cybersecurity Requirements. Vendors that aren’t “IT” vendors (and their customers), on the other hand, may not be. For example, the Aetna mailing vendor may not have contemplated that, as alleged by the Aetna plaintiff, the vendor’s provision of its services to Aetna would be subject to Pennsylvania’s Confidentiality of HIV-Related Information Act and Unfair Trade Practices and Consumer Protection Law.
  • Have you considered which specific aspects of vendor services may directly impact potential legal liability, and have you adequately identified and addressed them in the contract? No, this is not a novel concept, but it nonetheless bears mention. A key fact to be discovered in the Aetna litigation is whether it was Aetna or the vendor that decided to use the large-window envelopes that allegedly disclosed the sensitive and personally identifiable information. Given the breakneck pace at which many legal and contract professionals must draft and negotiate contracts, stating in a contract the details and descriptions of every single aspect of the services to be provided is often impractical (if not impossible). But some contract details are important enough to spell out.

Whether this class action suit proves to be an outlier or is dismissed at some point, consider how your contracts address data security and other privacy issues, and how vendor or service provider conduct may give rise to a security breach or security incident.

The Future of Data Privacy: You Can Run but You Can’t Hide (or Can You?)

In Ernest Cline’s dystopian novel Ready Player One, the world’s population is addicted to a virtual reality game called the OASIS. The villain in the book is a large communications company named IOI that will stop at nothing to rule the world—the OASIS virtual world, that is. IOI’s motivation is, simply put, profit, profit, and more profit as it peddles its goods and services in the digital reality. Through subterfuge, spying, rewards, and an assortment of other tactics, IOI gathers intelligence on its users, competitors, and enemies, and then uses that information to its advantage.

But even in a fully connected, always-on digital world such as the OASIS, people have effective tools against IOI’s tracking. They lie. They throw up roadblocks. They create alternate selves. They create private rooms to hold clandestine chats. They go underground. They disconnect.

In a 2013 survey by Pew Research Center, 86 percent of Internet users stated that they had attempted to minimize their digital footprints by taking affirmative steps such as deleting cookies, using a false name or email address, or using a public computer to mask their identities.1 A 2015 survey by TRUSTe/National CyberSecurity Alliance found that 89 percent of consumers refuse to do business with a company that does not protect their privacy.2 Those are just two of dozens of surveys showing similar metrics.3

In response to users’ privacy concerns over the past decade, consumer-friendly privacy protection tools continue to make their way into the marketplace. For example, VPN privacy protection add-ons are now readily available for web browsers, and some browsers, such as Opera, come with a free VPN built directly into the browser.4 Ad blockers have become so popular that some websites are restricting access if a browser blocks ads on the site.5 And privacy-conscious search engines like DuckDuckGo continue to gain loyal users.6

So what does this have to do with the legal intricacies of data privacy? A lot, actually. As demand increases for privacy tools, more companies are meeting that demand in new and innovative ways. Although the privacy risks inherent in artificial intelligence (AI) are well-documented, we are also seeing companies develop AI applications designed to help protect consumer privacy by creating digital noise, or obfuscation, around a person’s online activities. These tools essentially create new layers of false interests and pretend preferences tied to an individual’s online persona, which makes it more difficult for marketers to know which preferences and opinions are true and which are false.7 Expect to see a variety of AI-powered obfuscation and other related tools and services arriving over the next few years as consumers attempt to distract data collectors from real data.
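To make the obfuscation idea concrete, here is a minimal sketch in the spirit of decoy-query tools like TrackMeNot (discussed in the sources cited above): a script that periodically issues searches on topics the user has no genuine interest in, so real queries blend into the noise. The topic list, endpoint, and timing below are illustrative assumptions, not any actual product’s implementation.

    # Illustrative sketch only: a decoy-query generator in the spirit of
    # obfuscation tools such as TrackMeNot. The topics, timing, and search
    # endpoint below are hypothetical placeholders.
    import random
    import time
    import urllib.parse
    import urllib.request

    DECOY_TOPICS = [
        "vintage camera repair", "best hiking trails", "sourdough starter tips",
        "used sailboat prices", "learn watercolor painting", "jazz vinyl reissues",
    ]

    SEARCH_URL = "https://duckduckgo.com/html/?q="  # example endpoint

    def send_decoy_query():
        """Issue one randomly chosen search so genuine queries blend into noise."""
        topic = random.choice(DECOY_TOPICS)
        url = SEARCH_URL + urllib.parse.quote(topic)
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            resp.read()  # fetch and discard; the request itself is the "noise"
        print("decoy query sent:", topic)

    if __name__ == "__main__":
        # Fire a few decoys at irregular intervals so the pattern is not obvious.
        for _ in range(3):
            send_decoy_query()
            time.sleep(random.uniform(30, 120))

A real tool of this kind would also have to account for rate limits, a site’s terms of service, and the very legal questions discussed next.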

Whether or not these new tools and services are legal will be the subject of much debate, especially by any company being thwarted in its efforts to collect reliable information about a user. Some of these tools will also present novel legal issues related to AI, such as whether an unmonitored chatbot can create a legal contract on behalf of its owner (probably) or whether the owner of an AI tool is always responsible for its activities, even if the AI tool acts contrary to its owner’s instructions (maybe). Then there are the questions of who’s guarding the guards and whether these new privacy tools will eventually be used to collect even more information from consumers.8

In the future, we will certainly see new legislation, regulations, and court holdings affecting how companies and third parties may use individuals’ personal information. But technological innovation moves much faster and is more responsive to consumer demand. As consumers desire better protection for their information, expect to see more privacy tools emerge to help control the types and amounts of data shared with companies and marketers. And as this area develops, these new tools will undoubtedly bring new legal questions and challenges.

This article was originally published in Best Lawyers Business Edition, Summer 2017, p.23.

————————

1 http://www.pewinternet.org/2013/09/05/anonymity-privacy-and-security-online/
2 https://www.truste.com/resources/privacy-research/ncsa-consumer-privacy-index-us/
3 http://www.pewinternet.org/search/?query=privacy; https://epic.org/privacy/survey/; https://www.law.berkeley.edu/research/bclt/research/privacy-at-bclt/berkeley-consumer-privacy-survey/
4 http://www.opera.com/computer/features/free-vpn
5 https://www.pubnation.com/blog/publishers-fight-back-how-the-top-50-websites-combat-adblock
6 http://www.digitaltrends.com/web/duckduckgo-14-million-searches/
7 https://www.wired.com/2017/03/wanna-protect-online-privacy-open-tab-make-noise/; https://www.nyu.edu/projects/nissenbaum/papers/Politicalandethicalperspectivesondataobfuscation.pdf
8 In 2016, a popular browser add-on ironically named “Web of Trust” was discovered to be collecting and selling information about its users (see http://www.pcmag.com/news/349328/web-of-trust-browser-extension-cannot-betrusted). In 2017, an inbox management service called Unroll.me was sued for selling user data gleaned from users’ inboxes (see https://www.cnet.com/news/unroll-me-hit-with-privacy-suit-over-alleged-sale-of-user-data/).