RETURN TO SENDER: Aetna to Pay $17M to Settle Claims Related to Vendor Mailer Data Breach

Aetna has agreed to pay $17.2 million and to implement a “best practices” policy regarding sensitive policyholder data in order to settle class action litigation arising from a mass mailing sent by one of its mailing vendors. As discussed in a blog post last year, federal class action litigation was brought against Aetna and its mailing vendor in 2017 based on the vendor’s use of glassine envelopes to communicate HIV medication information to Aetna insureds. The envelopes revealed that the named addressee was being contacted about options for filling HIV medication prescriptions. The litigation alleged violations by Aetna and its vendor of several laws and legal duties related to security and privacy.

The contract lessons for customers and vendors arising from these events, which were identified in the earlier post, remain the same:

- Do your contracts for non-IT and non-healthcare services fully consider the risk of privacy and security litigation?
- Do your contract’s indemnification and limitation of liability clauses contemplate the possibility of class action litigation?
- Before entering into a contract, have you considered whether the specific vendor services being provided to the particular customer implicate laws you hadn’t considered?
- Have you considered which specific aspects of vendor services may directly affect potential legal liability, and have you adequately identified and addressed them in the contract?

Importantly, the newly announced settlement itself provides three bonus lessons.

Published data breach cost statistics are helpful, to a point. In its 2017 Cost of Data Breach Study, Ponemon Institute reports that the average per capita cost of a data breach in the U.S. for the period of the study was $225. It also reports that, for the same period, the average total organizational cost of a U.S. data breach was $7.35 million.
Somewhat remarkably, as part of its settlement Aetna agreed to pay $17.2 million in connection with the breach in question, a figure roughly $10 million over the average reported by Ponemon Institute. But Aetna’s payment is not out of the ballpark; averages are averages, after all. Much more remarkable, however, is the per capita settlement amount. Aetna’s settlement represents a per capita amount of $1,272, more than five times the reported average. (For reference, that per capita cost would put Equifax’s settlement number for its recent breach at $185 billion.) Bottom line: when considering, or counseling clients on, the financial impacts of data breaches, the average cost figures are only as useful as the recognition that they are just averages, with any number of data security breaches costing more, or less, than those averages.

Data breach cost statistics often do not compare well with litigation settlement amounts. Yes, Aetna agreed to pay $17.2 million as part of the settlement, as compared to Ponemon Institute’s reported $7.35 million average U.S. data breach cost. But while the $7.35 million figure includes forensics costs, customer churn, post-breach costs, and other direct and indirect expenses, the $17.2 million figure is not as comprehensive. It does not include, for example, Aetna’s legal fees incurred to defend and settle the class action litigation, nor other pre-settlement costs and expenses incurred by Aetna. As efficient as it may be to compare published per capita or per breach statistics with litigation settlement amounts, it is also important to identify the full scope of costs and expenses that the published statistics include, as well as what costs and expenses the settlement amounts leave out.

Data breach cost statistics and litigation settlement amounts don’t include non-monetary settlement obligations.
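The comparisons above can be verified with simple arithmetic. The following sketch uses the figures cited in this post; the Equifax population of roughly 145.5 million individuals is an assumption based on widely reported accounts of that breach, and the implied class size is derived, not a figure from the settlement itself.

```python
# Back-of-the-envelope check of the per capita figures discussed above.
AETNA_SETTLEMENT = 17_200_000    # Aetna's settlement payment ($)
PER_CAPITA = 1_272               # reported per capita settlement amount ($)
PONEMON_PER_CAPITA_AVG = 225     # 2017 Ponemon avg. per capita U.S. breach cost ($)
EQUIFAX_AFFECTED = 145_500_000   # assumed individuals affected by the Equifax breach

implied_class_size = AETNA_SETTLEMENT / PER_CAPITA           # ~13,500 people
multiple_of_average = PER_CAPITA / PONEMON_PER_CAPITA_AVG    # ~5.7x the average
equifax_at_that_rate = PER_CAPITA * EQUIFAX_AFFECTED         # ~$185 billion

print(f"Implied class size: {implied_class_size:,.0f}")
print(f"Multiple of Ponemon per capita average: {multiple_of_average:.1f}x")
print(f"Equifax at Aetna's per capita rate: ${equifax_at_that_rate / 1e9:.0f}B")
```

The "more than five times" comparison and the $185 billion Equifax extrapolation both fall out of the same $1,272 per capita rate.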
Cost-per-record, cost-per-breach, and litigation settlement figures can be particularly meaningful and relatable when considering, or counseling clients on, the potential financial impacts of data security breaches. Notably, however, the material obligations of defendants settling data breach litigation typically are not limited to monetary payments. As part of its settlement, for example, Aetna also agreed to develop and implement a “best practices” policy for the use of certain personally identifiable information, to provide policy updates for five years, to provide policy training for certain Aetna personnel for five years, and to require outside litigation counsel to sign business associate agreements, among other commitments. These activities will require Aetna to incur additional costs and expenses, including for internal and, possibly, external resources.

Supplementing the earlier post on this Aetna class action litigation and its lessons, the recent settlement and the bonus lessons above provide an even fuller picture of data security breach and related contract considerations. Not only is it invaluable to consider data privacy and security issues in contracts and the roles of vendors and service providers; it is also important to consider, and counsel clients on, the full potential impacts of data breaches, including potential litigation settlement amounts, the costs and expenses incurred beyond those amounts, and non-monetary settlement obligations.

Lessons Learned: Vendor Sued in Class Action Suit for Security Misses

The recently alleged data privacy and security incident involving Aetna and one of its vendors is now the basis of class action litigation. Both the vendor and Aetna are named defendants in the suit, which raises potentially novel issues for companies and their third-party service providers to consider when contracting for technology and non-technology services that touch on data security and privacy matters.

Anti Anti-Virus

While there is no surefire method of identifying a software or hardware vendor intent on stealing information, taking some simple steps can help mitigate damages by ensuring that companies learn promptly of any known or suspected issues.

The Future of Data Privacy: You Can Run but You Can’t Hide (or Can You?)

In Ernest Cline’s dystopian novel Ready Player One, the world’s population is addicted to a virtual reality game called the OASIS. The villain in the book is a large communications company named IOI that will stop at nothing to rule the world—the OASIS virtual world, that is. IOI’s motivation is, simply put, profit, profit, and more profit as it peddles its goods and services in the digital reality. Through subterfuge, spying, rewards, and an assortment of other tactics, IOI gathers intelligence on its users, competitors, and enemies, and then uses that information to its advantage. But even in a fully connected, always-on digital world such as the OASIS, people have effective tools against IOI’s tracking. They lie. They throw up roadblocks. They create alternate selves. They create private rooms to hold clandestine chats. They go underground. They disconnect.

In a 2013 survey by Pew Research Center, 86 percent of Internet users stated that they had attempted to minimize their digital footprints by taking affirmative steps such as deleting cookies, using a false name or email address, or using a public computer to mask their identities.1 A 2015 survey by TRUSTe/National CyberSecurity Alliance found that 89 percent of consumers refuse to do business with a company that does not protect their privacy.2 Those are just two of dozens of surveys showing similar metrics.3

In response to users’ privacy concerns over the past decade, consumer-friendly privacy protection tools continue to make their way into the marketplace. For example, VPN privacy protection add-ons are now readily available for web browsers, and some browsers, such as Opera, come with a free VPN built directly into the browser.4 Ad blockers have become so popular that some websites are restricting access if a browser blocks ads on the site.5 And privacy-conscious search engines like DuckDuckGo continue to gain loyal users.6

So what does this have to do with the legal intricacies of data privacy?
A lot, actually. As demand increases for privacy tools, more companies are meeting that demand in new and innovative ways. Although the privacy risks inherent in artificial intelligence (AI) are well-documented, we are also seeing companies develop AI applications designed to help protect consumer privacy by creating digital noise, or obfuscation, around a person’s online activities. These tools essentially create new layers of false interests and pretend preferences tied to an individual’s online persona, which makes it more difficult for marketers to know which preferences and opinions are true and which are false.7

Expect to see a variety of AI-powered obfuscation and other related tools and services arriving over the next few years as consumers attempt to distract data collectors from real data. Whether or not these new tools and services are legal will be the subject of much debate, especially by any company being thwarted in its efforts to collect reliable information about a user. Some of these tools will also present novel legal issues related to AI, such as whether an unmonitored chatbot can create a legal contract on behalf of its owner (probably) or whether the owner of an AI tool is always responsible for its activities, even if the AI tool acts contrary to its owner’s instructions (maybe). Then there are the questions of who’s guarding the guards and whether these new privacy tools will eventually be used to collect even more information from consumers.8

In the future, we will certainly see new legislation, regulations, and court holdings affecting how companies and third parties may use personal information of individuals. But technical innovation is much faster and more responsive to consumer demand. As consumers desire better protection for their information, expect to see more privacy tools emerge to help control the types and amounts of data shared with companies and marketers.
And as this develops further, these new tools will undoubtedly bring new legal questions and challenges.

This article was originally published in Best Lawyers Business Edition, Summer 2017, p. 23.

————————
1. http://www.pewinternet.org/2013/09/05/anonymity-privacy-and-security-online/
2. https://www.truste.com/resources/privacy-research/ncsa-consumer-privacy-index-us/
3. http://www.pewinternet.org/search/?query=privacy; https://epic.org/privacy/survey/; https://www.law.berkeley.edu/research/bclt/research/privacy-at-bclt/berkeley-consumer-privacy-survey/
4. http://www.opera.com/computer/features/free-vpn
5. https://www.pubnation.com/blog/publishers-fight-back-how-the-top-50-websites-combat-adblock
6. http://www.digitaltrends.com/web/duckduckgo-14-million-searches/
7. https://www.wired.com/2017/03/wanna-protect-online-privacy-open-tab-make-noise/; https://www.nyu.edu/projects/nissenbaum/papers/Politicalandethicalperspectivesondataobfuscation.pdf
8. In 2016, a popular browser add-on ironically named “Web of Trust” was discovered to be collecting and selling information about its users (see http://www.pcmag.com/news/349328/web-of-trust-browser-extension-cannot-betrusted). In 2017, an inbox management service called Unroll.me was sued for selling user data gleaned from users’ inboxes (see https://www.cnet.com/news/unroll-me-hit-with-privacy-suit-over-alleged-sale-of-user-data/).

Getting Your Data Back – a Hostage Crisis?

One of the key differences between a cloud computing delivery model and a customer-hosted solution is that, under a cloud computing model, the service provider, not the customer, possesses the customer’s data. At the end of the relationship, the customer needs its data returned. Many service providers’ form agreements, however, do not address when, and in what format, the data will be returned. Given the vital importance of data to a company’s business, a customer should address this issue before entering into such an agreement.