Lessons Learned: Vendor Sued in Class Action Suit for Security Misses

You’re thinking that something about the title of this post sounds familiar, right? Information technology (IT) vendors and third-party service providers have been in the spotlight for security breaches for some time (see, for example, the vendor-based security lapses affecting Target, CVS, and Concentra, to name just a few), and it isn’t surprising that an IT vendor has been sued over a security incident. After all, whether you’re an IT vendor or an IT customer, if you draft or negotiate contracts for a living, these are exactly the situations you try to contract for, right?

Right…but…the recent federal class action suit filed in Pennsylvania against Aetna and its vendor surfaces several new privacy and security considerations for vendors and their customers. The vendor in question was not an IT vendor or service provider. Instead, the plaintiff’s allegations relate to Aetna’s use of a mailing vendor to send notification letters to Aetna insureds about ordering HIV medications by mail. According to the complaint, the vendor used envelopes with large transparent glassine windows – windows that did not hide the first several lines of the enclosed notification letters. The plaintiff asserts that anyone looking at any of the sealed envelopes could see the addressee’s name and mailing address – and that the addressee was being notified of options for filling HIV medications. As a result, the vendor and Aetna are alleged to have violated numerous laws and legal duties related to security and privacy.

For all vendors and service providers, but especially those that don’t focus primarily on privacy and security issues, the Aetna complaint is enlightening. To these vendors and service providers, and to their customers: Do your customer-vendor contracts and contract negotiations contemplate what Aetna and its mailing vendor may not have?

  • Do your contracts for non-IT and non-healthcare services fully consider the risk of privacy and security litigation? A noteworthy facet of the Aetna case is that the mailing vendor was sued for privacy and security violations that were not exclusively due to the customer’s acts or omissions. That is, while the contents of the mailer certainly were key, the vendor’s own conduct as a mailing services provider (not an IT or healthcare provider) was instrumental in the suit being filed against the vendor (and Aetna). Vendor services that previously didn’t, or ordinarily don’t, warrant privacy or security scrutiny may, after all, need to be looked at in a new light.
  • Do your contract’s indemnification and limitation of liability clauses contemplate the possibility of class action litigation? Class action litigation creates a path for plaintiffs to bring claims that otherwise could not and would not be brought. Class action litigation against data custodians and owners for security breaches is the norm, and the possibility and expense of class action litigation is frequently on the minds of their attorneys and contract managers who negotiate contracts with privacy and security implications. But for vendors and service providers providing arguably non-IT services to these customers, the idea of being subject to class action litigation is often not top-of-mind.
  • Before entering into a contract, have you considered whether the specific vendor services being provided to the particular customer in question implicate laws you hadn’t considered? Vendors that operate in the information technology space, and their customers, generally are well aware of the myriad privacy and security laws and issues that may affect the vendors’ business, including, as a very limited illustration, the EU General Data Protection Regulation, HIPAA, and the New York Cybersecurity Requirements. Vendors that aren’t “IT” vendors (and their customers), on the other hand, may not be. For example, the Aetna mailing vendor may not have contemplated that, as alleged by the Aetna plaintiff, its provision of services to Aetna would be subject to Pennsylvania’s Confidentiality of HIV-Related Information Act and Unfair Trade Practices and Consumer Protection Law.
  • Have you considered which specific aspects of vendor services may directly impact potential legal liability, and have you adequately identified and addressed them in the contract? No, this is not a novel concept, but it nonetheless bears mention. A key fact to be discovered in the Aetna litigation is whether it was Aetna, or the vendor, that made the decision to use the large-window envelopes that allegedly disclosed the sensitive and personally identifiable information. Given the breakneck pace at which many legal and contract professionals must draft and negotiate contracts, unequivocally stating in a contract the details of every single aspect of the services to be provided is often impractical (if not impossible). But some contract details are still important.

Whether this class action suit proves to be an outlier or is dismissed at some point, consider how your contracts address data security and other privacy and security issues, and how vendor or service provider conduct may give rise to a security breach or security incident.

Anti Anti-Virus

In July 2017, Bloomberg reported that the anti-virus and security company Kaspersky Lab had been cooperating with the Russian Federal Security Service (FSB), the Russian counterintelligence agency and successor to the KGB, since 2009. On September 13, 2017, the US federal government mandated that all software made by Kaspersky Lab be removed from government computer systems. Retailers such as Best Buy are also taking steps to remove Kaspersky Lab’s products from their retail offerings.

Kaspersky Lab issued a response, claiming that it has done nothing wrong and is merely a pawn in a political game between the US and Russia. Russia, for its part, responded to the report by urging Russian companies to use only Russian software.

Although it’s unlikely we will ever have a definitive answer about whether Kaspersky Lab is gathering data for the Russian FSB, this incident highlights a growing concern that foreign governments might be collaborating with software and hardware companies to spy on other governments, corporate enterprises, and consumers.   How can companies protect themselves in this environment?  Consider five things:

  1. A company should have a plan in place to quickly install a replacement if it discovers that software or hardware in its environment has been compromised.  Often this means maintaining a list of alternative providers and, when possible, having a contact at those alternative providers in case a purchase needs to be made quickly.
  2. Prior to making a purchase, conduct a search of news and industry reports on the brand and product to find any stories that might raise a red flag.
  3. After making a purchase, set an online news alert pairing the product name with “spy,” “spyware,” “malware,” “security issue,” and similar terms in the search field (however, this doesn’t work well for network security or anti-virus products, since nearly every news story about those products contains these terms); a simple automated version of this idea is sketched after the list.
  4. Subscribe to security sites, such as SecureList.com or KrebsOnSecurity.com, that track potential security issues affecting enterprises and consumers.
  5. In particularly egregious circumstances, unplug the software or hardware so it stops collecting and transmitting information, but first be aware of how that will impact your other systems.
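
The alerting idea in item 3 can also be automated. Below is a minimal sketch of that approach, assuming a hypothetical feed URL, a placeholder product name, and an illustrative keyword list; it shows the concept, not a production monitoring tool.

```python
# A minimal sketch of the automated news alert described in item 3.
# The feed URL, product name, and keyword list are hypothetical placeholders.
# Uses only the Python standard library; a real deployment would more likely
# rely on a commercial alerting service or a dedicated feed-parsing library.
import urllib.request
import xml.etree.ElementTree as ET

PRODUCT = "ExampleAV"  # placeholder brand/product name to watch
RED_FLAGS = ["spy", "spyware", "malware", "security issue", "backdoor"]
FEEDS = [
    "https://example.com/security-news/rss",  # hypothetical RSS feed URL
]


def flagged_headlines(feed_url):
    """Return RSS item titles that mention the product alongside a red-flag term."""
    with urllib.request.urlopen(feed_url, timeout=10) as response:
        tree = ET.parse(response)
    hits = []
    for item in tree.iter("item"):  # standard RSS 2.0 <item> elements
        title = item.findtext("title") or ""
        summary = item.findtext("description") or ""
        text = f"{title} {summary}".lower()
        if PRODUCT.lower() in text and any(term in text for term in RED_FLAGS):
            hits.append(title)
    return hits


if __name__ == "__main__":
    for url in FEEDS:
        for headline in flagged_headlines(url):
            print(f"Possible red flag: {headline}")
```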

While there is no surefire way to identify a software or hardware vendor intent on stealing information, these steps can help mitigate the damage by alerting companies to known or suspected issues. In the end, staying current on security risks is one important factor in defending your company, and yourself, against cyber-mischief.

The Future of Data Privacy: You Can Run but You Can’t Hide (or Can You?)

In Ernest Cline’s dystopian novel Ready Player One, the world’s population is addicted to a virtual reality game called the OASIS. The villain in the book is a large communications company named IOI that will stop at nothing to rule the world—the OASIS virtual world, that is. IOI’s motivation is, simply put, profit, profit, and more profit as it peddles its goods and services in the digital reality. Through subterfuge, spying, rewards, and an assortment of other tactics, IOI gathers intelligence on its users, competitors, and enemies, and then uses that information to its advantage.

But even in a fully-connected, always-on digital world such as the OASIS, people have effective tools against IOI’s tracking. They lie. They throw up roadblocks. They create alternate selves. They create private rooms to hold clandestine chats. They go underground. They disconnect.

In a 2013 survey by Pew Research Center, 86 percent of Internet users stated that they had attempted to minimize their digital footprints by taking affirmative steps such as deleting cookies, using a false name or email address, or using a public computer to mask their identities.1 A 2015 survey by TRUSTe/National CyberSecurity Alliance found that 89 percent of consumers refuse to do business with a company that does not protect their privacy.2 Those are just two of dozens of surveys showing similar metrics.3

In response to users’ privacy concerns over the past decade, consumer-friendly privacy protection tools continue to make their way into the marketplace. For example, VPN privacy protection add-ons are now readily available for web browsers, and some browsers, such as Opera, come with a free VPN built directly into the browser.4 Ad blockers have become so popular that some websites are restricting access if a browser blocks ads on the site.5 And privacy-conscious search engines like DuckDuckGo continue to gain loyal users.6

So what does this have to do with the legal intricacies of data privacy? A lot, actually. As demand increases for privacy tools, more companies are meeting that demand in new and innovative ways. Although the privacy risks inherent in artificial intelligence (AI) are well-documented, we are also seeing companies develop AI applications designed to help protect consumer privacy by creating digital noise, or obfuscation, around a person’s online activities. These tools essentially create new layers of false interests and pretend preferences tied to an individual’s online persona, which makes it more difficult for marketers to know which preferences and opinions are true and which are false.7  Expect to see a variety of AI-powered obfuscation and other related tools and services arriving over the next few years as consumers attempt to distract data collectors from real data.
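
To make the obfuscation idea concrete, here is a minimal sketch of the noise-generation approach described above. The decoy topics, the send_query placeholder, and the mixing ratio are all hypothetical; the tools cited in the references are considerably more sophisticated about timing, phrasing, and plausibility.

```python
# A minimal sketch of the "digital noise" idea: interleave a user's real search
# with randomly chosen decoy searches so that an observer cannot easily tell
# genuine interests from manufactured ones. The decoy topics and send_query
# placeholder are hypothetical illustrations only.
import random

DECOY_TOPICS = [
    "vintage typewriter ribbons",
    "marathon training plans",
    "sourdough starter tips",
    "birdwatching binoculars reviews",
    "salsa dancing classes near me",
]


def send_query(query: str) -> None:
    # Placeholder: a real tool would issue this to a search engine or ad network.
    print(f"issued query: {query}")


def search_with_noise(real_query: str, decoys_per_query: int = 3) -> None:
    """Issue the real query mixed with random decoys, in random order."""
    queries = [real_query] + random.sample(DECOY_TOPICS, k=decoys_per_query)
    random.shuffle(queries)
    for query in queries:
        send_query(query)


if __name__ == "__main__":
    search_with_noise("noise-cancelling headphones comparison")
```

The design point is simply that a data collector who sees every issued query cannot easily tell which ones reflect the user’s genuine interests.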

Whether or not these new tools and services are legal will be the subject of much debate, especially by any company being thwarted in its efforts to collect reliable information about a user. Some of these tools will also present novel legal issues related to AI, such as whether an unmonitored chatbot can create a legal contract on behalf of its owner (probably) or whether the owner of an AI tool is always responsible for its activities, even if the AI tool acts contrary to its owner’s instructions (maybe). Then there are the questions of who’s guarding the guards and whether these new privacy tools will eventually be used to collect even more information from consumers.8

In the future, we will certainly see new legislation, regulations, and court holdings affecting how companies and third parties may use personal information of individuals. But technical innovation is much faster and more responsive to consumer demand. As consumers desire better protection for their information, expect to see more privacy tools emerge to help control the types and amounts of data shared with companies and marketers. And as this develops further, these new tools will undoubtedly bring new legal questions and challenges.

This article was originally published in Best Lawyers Business Edition, Summer 2017, p.23.

————————

1. http://www.pewinternet.org/2013/09/05/anonymity-privacy-and-security-online/
2. https://www.truste.com/resources/privacy-research/ncsa-consumer-privacy-index-us/
3. http://www.pewinternet.org/search/?query=privacy; https://epic.org/privacy/survey/; https://www.law.berkeley.edu/research/bclt/research/privacy-at-bclt/berkeley-consumer-privacy-survey/
4. http://www.opera.com/computer/features/free-vpn
5. https://www.pubnation.com/blog/publishers-fight-back-how-the-top-50-websites-combat-adblock
6. http://www.digitaltrends.com/web/duckduckgo-14-million-searches/
7. https://www.wired.com/2017/03/wanna-protect-online-privacy-open-tab-make-noise/; https://www.nyu.edu/projects/nissenbaum/papers/Politicalandethicalperspectivesondataobfuscation.pdf
8. In 2016, a popular browser add-on ironically named “Web of Trust” was discovered to be collecting and selling information about its users (see http://www.pcmag.com/news/349328/web-of-trust-browser-extension-cannot-betrusted). In 2017, an inbox management service called Unroll.me was sued for selling user data gleaned from users’ inboxes (see https://www.cnet.com/news/unroll-me-hit-with-privacy-suit-over-alleged-sale-of-user-data/).

Getting Your Data Back – a Hostage Crisis?

One of the key differences between a cloud computing delivery model and a customer-hosted solution is that, under the cloud model, the service provider, not the customer, possesses the customer’s data. At the end of such a relationship, the customer needs its data returned. Many service providers’ form agreements, however, do not address when and in what format the data will be returned. Given the vital importance of data to a company’s business, a customer should address this issue before entering into such an agreement.

What seems like a relatively simple provision to implement can sometimes lead to surprisingly protracted discussions. Customers often request that their data be returned at expiration or termination of the contract (or during the termination/expiration period) in the format requested by the customer. Service providers’ concern with such a requirement is that the customer might request a format different from the one being used, resulting in expensive and time-consuming file conversion, or that the customer might request some of its data in both paper and electronic format, requiring the service provider to print reams of paper. These concerns lead service providers to counter with a provision requiring the service provider to return the data in its then-current format.

This typically leads to the customer raising its concern that the data could be returned in a format that is no longer compatible with the customer’s systems, requiring the customer to undertake the expensive and time-consuming conversion process and causing a material adverse impact to the customer’s business.

What’s the right answer? Each negotiation will be different depending on factors such as the importance of the data, the leverage of the parties, and the amount of data at issue. Service providers, however, must be sensitive to the customer’s concerns: the data is the customer’s lifeblood, and the customer does not want to be held hostage at the end of the relationship. I’ve seen parties eventually agree that the service provider must return the data upon expiration or termination in a format reasonably usable by the customer at no additional cost to the customer, or in a format reasonably requested by the customer and commonly used in the industry for the type of data at issue.

 

Negotiating Cloud Contracts

We’ve all heard the phrase. Cloud vendors speak it in somber, authoritative tones as frustrated customers grumble and curse. The phrase? “Sorry, we don’t make changes to our standard contract.”

A virtual fight has been brewing over “the phrase” in the last few weeks. The first volley was fired by Bob Warfield in a blog post called “Gartner: The Cloud is Not a Contract” where Mr. Warfield argues that it’s perfectly reasonable for a cloud provider to use “the phrase.” In fact, he says, if a cloud provider doesn’t say it at every opportunity, the provider risks becoming *gasp* a mere datacenter. He says:

What [the Cloud] is about is commoditization through scale and through sharing of resources which leads to what we call elasticity. That’s the first tier. The second tier is that it is about the automation of operations through API’s, not feet in the datacenter cages.

* * *

Now what is the impact of contracts on all that? First, a contract cannot make an ordinary datacenter into a Cloud no matter who owns it unless it addresses those issues. Clouds are Clouds because they have those qualities and not because some contract or marketer has labeled them as such. Second, arbitrary contracts have the power to turn Clouds into ordinary hosted data centers: A contract can destroy a Cloud’s essential “Cloudness”!

* * *

How do we avoid having a contract destroy “Cloudness?” This is simple: Never sign a contract with your Cloud provider that interferes with their ability to commoditize through scale, sharing, and automation of operations. If they are smart, the Cloud provider will never let it get to that stage.

Mr. Warfield goes on to argue that any deviation from a cloud provider’s standard contract that impacts scale, sharing, or automated operations essentially destroys the benefit of cloud computing and turns a cloud contract into a managed data center contract. In other words, if a provider is not a “pure” cloud provider, it is a datacenter provider.

Lydia Leong at Gartner quickly escalated the battle, responding in a blog post entitled “The Cloud and Customized Contracts.” Ms. Leong counters that cloud providers should be careful in using “the phrase” since it might not align with their business goals. At the same time, she also cautions that customers looking for substantive customizations to a cloud offering might undermine the cost savings they are seeking:

[A] cloud provider has to make decisions about how much they’re willing to compromise the purity of their model — what that costs them versus what that gains them. This is a business decision; a provider is not wrong for compromising purity, any more than a provider is right for being totally pure. It’s a question of what you want your business to be, and you can obtain success along the full spectrum. A provider has to ensure that their stance on customization is consistent with who and what they are, and they may also have to consider the trade off between short-term sales and long-term success.

* * *

Customers have to be educated that customization costs them more and may actually lower their quality of the service they receive, because part of the way that cloud providers drive availability is by driving repeatability. Similarly, the less you share, the more you pay.

* * *

. . . I believe that customers will continue to make choices along that spectrum. Most of them will walk into decisions with open eyes, and some will decide to sacrifice cost for customization. They are doing this today, and they will continue to do it. Importantly, they are segmenting their IT portfolios and consciously deciding what they can commoditize and what they can’t. . . . [U]ltimately, the most successful IT managers will be the ones that manage IT to business goals.

So should a customer negotiate a cloud contract or not? As Ms. Leong states, it depends on the customer’s business demands. If the business demands the lowest cost and is willing to bear additional risk, then a non-negotiated “pure” cloud contract might be best. On the other hand, if the business demands that costs and risks be balanced, or that risk mitigation take priority over cost savings, then a negotiated contract is likely the best option.

Fortunately for customers, market forces are already influencing cloud providers to make their contracts more detailed and customer-friendly. In a recent article about cloud predictions for 2011, journalist George Lawton writes:

As cloud providers compete for new customers, many will begin to extend more elaborate guarantees, concrete remedies and better data transit awareness. The guarantees will provide better legal protection on the control of data. Confident providers will also include more detailed service-level agreements (SLAs) and financial remedies, covering all aspects of the cloud service, that could affect the customer’s business performance. Cloud providers will also offer to provide improved visibility into the movement of data to maintain legal requirements.

If this trend continues and cloud providers include reasonable protections for customers in their standard contracts, then hearing “the phrase” might not be so bad after all. In the meantime, customers must continue to balance cost savings with risk mitigation, and negotiate (or not) accordingly.

Is Your Data in the Cloud Backed Up and Recoverable?

In 2011, Acronis, a backup and recovery solutions provider, launched a Global Disaster Recovery Index for small and medium-sized businesses to measure IT managers’ confidence in their backup and recovery operations. Notably, businesses in the United States reported low confidence in their ability to execute disaster recovery and backup operations in the event of a serious incident, whether in their own environment or in a third-party cloud environment.

As companies move various functions to a cloud environment, they can increase their confidence by contractually agreeing to data backup and recovery requirements with their cloud providers. Indeed, customers can specify, as a service level or other contractual requirement, (a) the recovery point objective (“RPO”), which is the point in time to which the provider must recover data, and (b) the recovery time objective (“RTO”), which defines how quickly the provider must restore the data to the RPO.

Too often, however, companies sign cloud agreements without clearly specifying these metrics. Indeed, when a disaster or disruption occurs, many companies are surprised to find their contracts silent on these metrics, and the cloud provider operating under a much less stringent RPO and RTO than the company expected.
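
To make the two metrics concrete, here is a minimal sketch, using hypothetical contract values and incident timestamps, of how a customer might check whether a provider’s recovery performance met a contracted RPO and RTO.

```python
# A minimal sketch of checking a recovery event against a contracted RPO and RTO.
# All contract values and incident timestamps below are hypothetical.
from datetime import datetime, timedelta

# Hypothetical contracted objectives: lose at most 1 hour of data (RPO)
# and restore service within 4 hours of the disruption (RTO).
RPO = timedelta(hours=1)
RTO = timedelta(hours=4)

# Hypothetical facts about an incident.
last_good_backup = datetime(2017, 6, 1, 2, 0)    # most recent recoverable copy of the data
disruption_start = datetime(2017, 6, 1, 2, 45)   # when the outage began
service_restored = datetime(2017, 6, 1, 8, 30)   # when data was restored to the recovery point

data_loss_window = disruption_start - last_good_backup  # data created after the last backup is lost
downtime = service_restored - disruption_start          # how long restoration actually took

print(f"RPO met: {data_loss_window <= RPO} (up to {data_loss_window} of data lost)")
print(f"RTO met: {downtime <= RTO} (service restored after {downtime})")
```

In this illustration, the provider meets the one-hour RPO (only 45 minutes of data falls outside the last recoverable backup) but misses the four-hour RTO, which is exactly the kind of gap a company discovers too late when the contract is silent on these metrics.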