“W-2 Phishing Attacks Targeting Businesses to Cash in on Busy Tax Season: 10 Tips to Protect Your Business”

Cyber criminals are taking advantage of tax season to lure valuable W-2 information from vulnerable businesses. A common phishing scheme starts with a scammer posing as a legitimate employee of a company, sending an email that appears to come from an internal email address, often the Human Resources or Finance department, or even the CEO of the company. A cyber criminal may even impersonate an employee using that employee’s stolen personal data. The scammer’s email attempts to trick the recipient into sending W-2s, often creating a sense of urgency to prompt a quick response. As we all know, a W-2 contains valuable information such as an individual’s name, address, social security number, salary and withheld taxes. Cyber criminals can use this information to file fake tax returns and pocket the refunds.

As recently as February 17, 2017, the IRS warned of a new phishing scam in which tax professionals and state tax agencies are sent an email impersonating a software provider with the subject line “Access Locked.” The email tells the recipient that access to the software was suspended due to errors in the recipient’s security details, and directs the recipient to “unlock” the software by clicking on a link that leads to a fake web page prompting for the recipient’s user name and password, which the scammer then uses to steal client information. https://www.irs.gov/uac/newsroom/security-summit-alert-tax-professionals-warned-of-new-scam-to-unlock-their-tax-software-accounts

Other common ways phishing attacks occur are by: (1) embedding a link in an email that redirects the recipient to an unsecured website that asks for sensitive personal information, (2) including with an email a malicious attachment or ad that allows an intruder to use loopholes in security to obtain personal information, or (3) impersonating a known vendor or an employee over the telephone to obtain company information.

We offer some tips to help prevent succumbing to W-2 phishing attacks that are already plaguing this tax season:

  1. Pick up the phone: If you receive an email asking for a W-2 or hear of someone in your company receiving such an email, verify the authenticity of the request. A simple solution is to pick up the telephone and call the apparent author of the email to ask if he/she indeed asked for a W-2. The same rule of thumb should apply if you receive an email asking for a money transfer or other sensitive information.
  2. Check the sender’s email address for discrepancies: Often an email address from a scammer will look almost like a company’s internal email address, but there might be a spelling error with one letter off, or a period added or taken away. Scrutinize the email address from a sender asking for W-2s to see if there are any discrepancies that might provide a clue that the email is fake.
  3. Don’t just reply, forward instead: Instead of automatically hitting reply to an email from what appears to be a known colleague asking for W-2s or credentials that could be used to obtain W-2s, forward the email to the legitimate email address you have on file for the apparent sender and ask him/her to verify that he/she sent the email.
  4. Redact W-2s: If your business is not required to provide or maintain unredacted W-2s, then redact (black out) all but the last 4 digits of social security numbers on W-2s you generate or maintain. This reduces the sensitive personal information available on the W-2 and makes the W-2s much less valuable if a scammer ever was able to obtain them.
  5. Encrypt W-2s (and all sensitive company information): Even if you cannot redact W-2s, all W-2s and sensitive company information should be encrypted, both at rest and when being transmitted (including in the mobile and “own device” environments).
  6. Train your workforce: Regularly educate and train your workforce on phishing attacks. Test your workforce on the training provided. Phishing attacks work because of human error. Training and testing of your workforce to recognize phishing attacks can greatly reduce the risk of success of a phishing attack.
  7. Implement and maintain strong information security: Ensure that your company has robust spam filters that are regularly updated, block malicious websites, enable browser add-ons that prevent users from clicking on malicious links, use antivirus software, and keep all security systems current with updates and patches. Apply all of these security programs to mobile environments and “own devices” to prevent exploitation of vulnerabilities in the mobile environment or from “bring your own device” practices.
  8. Restrict access to W-2 information: Ensure that only key personnel have authority to access personally identifiable information, in this case W-2 tax information. Such access should be restricted to only those who require it to perform their job duties.
  9. Restrict outflow of W-2 information: Restrict internal staff’s ability to copy sensitive data into unapproved channels of transmission, such as email and web browsers, including controlling the ability to copy, paste and print sections of documents. Endpoint data loss prevention technology and application controls are available in this area.
  10. Implement, practice and update a Data Loss Prevention (DLP) Program: Cyber risks present a fast-evolving landscape. Data loss through cybercrime and internal risks represents an increasing business exposure. Prevention is key to mitigation in this area and a better option than facing a breach unprepared. An entity that knows these risks and controls the data that flows within and outside of its walls can best remain competitive in the marketplace. Using this knowledge, a company can most efficiently protect sensitive data and quickly respond to security incidents.
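Tip 2’s advice to scrutinize the sender’s domain can be partially automated. The sketch below, a minimal illustration rather than a production filter, flags look-alike domains using a simple string-similarity check; the company domain, addresses and threshold are hypothetical placeholders.

```python
from difflib import SequenceMatcher

LEGITIMATE_DOMAIN = "examplecorp.com"  # hypothetical internal domain

def looks_suspicious(sender_address: str, threshold: float = 0.8) -> bool:
    """Flag a sender domain that nearly, but not exactly, matches the real one."""
    domain = sender_address.rsplit("@", 1)[-1].lower()
    if domain == LEGITIMATE_DOMAIN:
        return False  # exact match: not a look-alike
    similarity = SequenceMatcher(None, domain, LEGITIMATE_DOMAIN).ratio()
    return similarity >= threshold  # close-but-not-equal domains are suspect

# A one-letter-off domain (zero for the letter "o") is flagged;
# the genuine domain and an unrelated domain are not.
print(looks_suspicious("hr@examplec0rp.com"))  # True
print(looks_suspicious("hr@examplecorp.com"))  # False
```

A check like this only supplements, and never replaces, the out-of-band verification described in Tip 1.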
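Tip 4’s redaction of all but the last four digits of a social security number can be sketched with a short regular expression; this assumes SSNs appear in the standard 123-45-6789 format and is illustrative only.

```python
import re

# Matches SSNs formatted like 123-45-6789, capturing the last 4 digits.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-(\d{4})\b")

def redact_ssn(text: str) -> str:
    """Replace the first five digits of any SSN with X's, keeping the last four."""
    return SSN_PATTERN.sub(r"XXX-XX-\1", text)

print(redact_ssn("Employee SSN: 123-45-6789"))  # Employee SSN: XXX-XX-6789
```

Real W-2 documents may carry SSNs in other formats (or as images), so automated redaction should always be paired with a manual review.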

If you would like more information or assistance in this area, please contact a member of our Cybersecurity & Privacy Team at [email protected]. You can also reach Cinthia Motley at [email protected] or 312-849-1972, and Nora Wetzel at [email protected] or 415-627-3478.

Recent Trends in Bankruptcy Sales of Customer Data

In 2005, Congress amended the Bankruptcy Code to address privacy concerns in connection with sales of customer data in bankruptcy cases. The Code was specifically amended to restrict or prohibit the sale of customers’ personally identifiable information – as defined by the Bankruptcy Code – when in violation of a debtor company’s existing privacy policy.
In practice, the statute mostly has operated to facilitate these sales pursuant to a bankruptcy court approval process, which is conditioned upon satisfaction of certain procedural safeguards.  After quickly reviewing the basic statutory framework, we discuss some recent cases involving bankruptcy sales of customer data.  We then provide our summary of lessons learned and key takeaways.

Statutory Framework
Section 101(41A) of the Bankruptcy Code enumerates the specific items of personal information that constitute Personally Identifiable Information within the meaning of the Bankruptcy Code, if provided by an individual in connection with obtaining a product or service from the debtor primarily for personal, family or household purposes.  They are as follows: first and last name, residence, email address, telephone number, social security number, or credit card account numbers.  In addition, Section 101(41A) provides that Personally Identifiable Information can include a birth date, place of birth or any other item of information concerning an identified individual that, if disclosed, would result in identifying such individual physically or electronically, if such information is identified with one or more of the above enumerated items of personal information.

Section 363(b)(1) of the Bankruptcy Code provides that if the debtor has a privacy policy in effect at the time of the bankruptcy filing, which prohibits the transfer of Personally Identifiable Information (“PII”), the Information cannot be sold in bankruptcy unless additional requirements are satisfied.  If triggered, section 363(b)(1) prohibits the sale of PII unless the bankruptcy court finds that the sale is consistent with the debtor’s privacy policy or the court approves the sale at a hearing after (a) appointing a consumer privacy ombudsman to assist the court in reviewing the facts and circumstances of the sale and (b) finding that the sale of the information would not violate applicable nonbankruptcy law.

The bankruptcy court orders the appointment of the consumer privacy ombudsman pursuant to section 332 of the Bankruptcy Code, who may appear and be heard at the sale hearing.  Section 332 provides a non-exclusive list of the information and topics to be included in the ombudsman’s report and recommendations to the court. They include the potential losses or gains of privacy to consumers if the sale is approved, the potential costs or benefits to consumers if the sale is approved, and the potential alternatives that would mitigate privacy losses or potential costs to consumers.

Recent Bankruptcy Sales of Customer Data

  1. BPS Holdings (2017):  The debtor companies manufactured, distributed and sold sports equipment, accessories and apparel under a number of brand names.  Products were sold in the U.S. and Canada, and the companies operated a number of websites which collected a variety of PII from their customers, in some cases from minors.  After filing bankruptcy, the debtors requested bankruptcy court approval to complete two sales of their businesses:  (1) the sale of their soccer apparel and equipment business (“Soccer Business”) to their co-founder and (2) the sale of their hockey, lacrosse, and baseball businesses (“Other Businesses”) to a newly formed company.

    The bankruptcy court appointed a privacy ombudsman, who examined the debtors’ privacy policies and data collection practices among the various businesses.  The ombudsman recommended court approval of both sales under certain terms and conditions, and both sales were recently approved by the bankruptcy court.

    Sale of Soccer Business:  The ombudsman found that the debtors operated two websites for the Soccer Business, pursuant to which they collected customer names, addresses, phone numbers, email addresses and order histories.  They did not collect any other categories of PII, nor track customer activity via cookies or other tracking technologies.  At the time of the bankruptcy filing, a privacy policy was posted on one of the websites, which promised customers that their PII would not be sold or transferred to any other company for any reason whatsoever.

    The privacy ombudsman recommended that the court approve the sale subject to the following conditions: (1) the buyer must engage in substantially the same line of business, (2) the buyer must adhere to all material terms of the existing privacy policy, (3) the buyer must agree to obtain the customers’ affirmative consent before making any material changes to the privacy practices applicable to the PII collected under the existing privacy policy, and (4) the buyer must agree to comply with applicable privacy and data protection laws.

    The privacy ombudsman did not recommend, nor did the buyer agree, that notice be given to the customers of the proposed sale with an ability to opt out of the sale of their PII to the buyer.  The sale was approved without any required opt-out notice.

    Sale of the Other Businesses:  The ombudsman found that the debtors operated several websites and Instagram pages among the different sports businesses, collecting customer names, mailing addresses, phone numbers, email addresses, birth dates, ages, genders, zip codes, and payment information, in different combinations.  The debtors also collected anonymized customer usage and demographic data from Google and Amazon.  Certain of the websites also collected personally identifiable information from minors.

    The ombudsman reported that some websites for the various businesses posted privacy policies, while others did not.  Most of the privacy policies promised customers that their PII would not be sold without prior notice; one of the websites posted a policy that PII might be shared with affiliated companies or third party service providers for the purpose of conducting business, and promised that PII would not be provided to any third parties for their own marketing purposes. In certain instances, the ombudsman indicated that he had requested, but had not received, any prior or currently applicable privacy policies.

    The ombudsman recommended that the sale be approved on a number of conditions.  As to websites which notified customers that their PII would not be sold without prior notice, the ombudsman recommended (1) email notice of the sale to customers, (2) if the buyer did not agree to be bound by the existing privacy policy, an opt-out opportunity, and (3) the buyer’s agreement to comply with applicable privacy and data protection laws.  As to the website which promised customers that their PII would not be shared, the ombudsman recommended that the buyer obtain the customers’ affirmative consent to the sale of the PII or a showing by the buyer that it would (1) engage in substantially the same line of business, (2) adhere in all material respects to the existing privacy policy, (3) obtain customers’ affirmative consent before making any material changes to privacy practices, and (4) agree to comply with applicable privacy and data protection laws.

    For websites with no privacy policies, the ombudsman did not recommend any conditions other than the buyer’s agreement to comply with applicable privacy and data protection laws.  For websites for which the ombudsman was unable to confirm the existence or absence of any privacy policy, the ombudsman recommended that the debtors obtain consent from the customers before the sale of their PII to the buyer.  Lastly, the ombudsman objected to the debtors’ transfer of any PII of children under the age of 13, consistent with the Children’s Online Privacy Protection Act.

    The court approved the sale without requiring opt-out notices to consumers, but required affirmative customer consent with respect to the sale of PII collected prior to existing privacy policies for certain of the websites.  The court also required the debtors to delete all PII of children prior to the sale.

  2. Aeropostale (2016):  The debtor companies sold clothing in the U.S. and Canada in retail outlets and through two websites under a variety of brands. The websites collected customers’ names and addresses (mailing and email).  Phone numbers also could be collected, for shipping purposes only. Similar PII was collected in the retail stores.  The websites also tracked and collected historical usage and transaction data, and the customers’ IP addresses, browser information and referring site domain names.

    The company also conducted certain contests and sweepstakes, which, in certain instances, required customers to provide their social security numbers, in addition to their names and addresses.  The company did not collect credit card numbers or other payment information.

    At the time of the bankruptcy filing, the posted privacy policy on one of the websites stated that the PII would not be shared with others “except with your consent or as described in this Privacy Policy.” The policy described a number of circumstances for the companies’ sharing of PII with affiliates or marketing or service partners, or where required by law, but the policy did not provide for the sharing of the PII in the event of a bankruptcy or sale of the company or its assets.  On the second website, the posted privacy policy explicitly promised customers that their PII would “never” be sold, rented or given away.

    After filing bankruptcy, the debtors conducted an auction of their operating assets, including the customer PII, and thereafter moved for approval of the sale to the winning bidder.  The court-appointed ombudsman recommended approval of the sale of the customer PII after reporting that, under the terms of the sale, the proposed transfer of PII was subject to a 60-day opt-out notice to customers after the closing of the sale as to any future use of their PII by the buyer. The ombudsman noted that this opt-out provision was not his specific recommendation; rather, it was agreed to between the debtors and the buyer.

    The ombudsman specifically recommended that the sale be further conditioned upon the buyer’s agreement to (1) apply appropriate security controls and procedures to PII, (2) abide by all applicable laws and regulations with respect to PII, (3) abide by the debtor companies’ existing privacy policies and related promises, and (4) respect all prior opt-out requests by customers.  In addition, the ombudsman recommended that, absent prior express consent from customers, the buyer’s future use of PII should be limited to the purposes of continuing the business operations that were purchased and providing goods and services to customers.

    Thereafter, the bankruptcy court approved the sale after adopting the ombudsman’s recommended conditions to the sale of the PII.

  3. Golfsmith (2016):  The affiliated debtors were the largest specialty golf retailer in the world, offering customers an extensive selection of golf equipment and related services.  The debtors operated their business as an integrated multi-channel retailer, with retail stores, catalog sales and e-commerce through a website. After filing bankruptcy, the debtors moved to sell their assets pursuant to a court-supervised auction.  The winning bidder, a large sporting goods retailer, sought to purchase the business as a going concern.

    Included in the purchased assets were all of the Debtors’ customer information including contact information (name, email, mailing address, and phone number), birthday and gender, and transaction history, with the exception of any credit card information or social security number information that might be in the debtors’ possession.  At the time of the bankruptcy filing, the debtors’ privacy policy disclosed that certain PII would be shared with trusted third party service providers, but phone numbers would not be made available to other companies or organizations and email addresses would not be shared or distributed and would remain in the sole possession of the debtors.  An earlier privacy policy also promised customers that their email addresses would not be sold.

    The privacy ombudsman’s report recommended approval of the sale subject to a number of conditions, including the buyer’s agreement to (1) be bound by and succeed to the debtors’ existing privacy policy, (2) be responsible for any violation of the privacy policy after the closing of the sale, (3) notify the customers of the sale and provide them with an opt-out opportunity for the transfer of any customer PII to the buyer, with such notice to be posted both on the debtors’ website and in retail stores, (4) provide further opt-out notice to customers of any attempt to convert the customers to the buyer’s privacy policy, and (5) safeguard all customer PII in a manner consistent with industry-standard data protections and applicable information security laws and best practices.

    In addition, the ombudsman recommended that the buyer agree to destroy all PII for which it determined that there was no reasonable business need and that the debtors destroy all customer PII not transferred to the buyer within 90 days after the closing of the sale.

    The court approved the sale as conditioned by the ombudsman’s recommendations.

  4. RadioShack (2015):  After filing bankruptcy, the debtor proposed a sale of its customer records database along with certain IP on a standalone basis. The data was not part of a sale of the debtor’s business to the buyer as a going concern. The database included customer names, email and mailing addresses, phone numbers and extensive transaction data, including credit and debit card numbers and social security numbers.  The debtor carved out the credit and debit card numbers and social security numbers from the proposed sale.

    The debtor’s pre-bankruptcy privacy policies advised customers that, among other things, the company’s mailing list would not be sold, customer PII would not be used for any purpose other than carrying out services requested from the company, and the company would not “sell or rent customer PII to anyone at any time.”

    The proposed sale drew objections from the Federal Trade Commission and the attorneys general of 38 states.  In addition, the court appointed a consumer privacy ombudsman to review the proposed sale. Thereafter, the FTC, the state attorneys general, the debtor and the successful bidder mediated the dispute and reached a consensual resolution, which was subsequently endorsed by the ombudsman.

    As part of the settlement, the buyer agreed to purchase only a very limited subset of the customer PII, namely (1) email addresses of customers that were active within two years prior to the bankruptcy filing, along with certain limited transaction data collected in the five years prior to the bankruptcy filing, and (2) customer names and mailing addresses, with certain limited associated transaction data from the five-year period prior to bankruptcy.  No customer phone numbers were sold.

    In addition, the buyer agreed to a number of other protections in the mediated settlement, including the buyer’s agreement to (1) become a successor in interest under the debtor’s existing privacy policies, adhering to all material terms and assuming liability for any violations thereof, (2) effectuate an extensive notice and opt-out procedure for affected customers, (3) not make further material changes to the privacy policies without further notice and an opt-out opportunity for affected customers, (4) safeguard all PII in a manner consistent with industry data security protections, applicable information security laws and best practices, and (5) destroy all PII for which it had no reasonable business need. In addition, the debtor agreed to destroy any PII not conveyed to the buyer.

    The court approved the sale as modified by this mediated settlement.

Lessons Learned and Key Takeaways

  • Sales of customer PII on a standalone basis, or which are not part of a sale of the debtor’s business in which the buyer will continue to provide the same or similar products or services, will continue to draw greater judicial scrutiny and likely require more limitations and protections, as a condition to their approval by the bankruptcy court.
  • Absent objections by affected consumers, the bankruptcy courts likely will continue to approve sales of customer PII in bankruptcy cases in accordance with the recommendations of the consumer privacy ombudsmen who are appointed by the courts, in many instances with no opportunity for customer opt-out.
  • Although a number of bankruptcy sales of PII have included some form of opt-out notice to the affected customers, it remains to be seen in future cases whether buyers will continue to agree, or be required, to provide such notices.  Much may depend upon the particular factual circumstances, but consumer privacy ombudsmen have not consistently recommended such restrictions as a condition to the approval of these sales.
  • Some bankruptcy sales of PII have been conditioned upon the buyer assuming certain liability for breaches of the debtor’s privacy policy and/or obligations to safeguard PII in accordance with applicable law or industry standards.  At the same time, the debtor’s assets are often sold to the buyer free and clear of any liens, claims, or interests, including potential successor liability. It remains to be seen whether significant disputes or litigation will arise after the closing of these bankruptcy sales of customer PII in the event of a subsequent discovery of a data security breach or other breach of the debtor’s prior privacy policies.

Chicago Attorneys Cinthia Granados Motley and Ashley Jackson Discuss Ways to Avoid Wrongful Collection of Data Claims

Chicago-based attorneys Cinthia Granados Motley and Ashley Jackson were published on Law360 on February 7, 2017. Their article, “10 Ways To Avoid Wrongful Collection Of Data Claims,” offers practical tips organized around the who, what, where, when and why of consumer data collection.

FTC Report Highlights Privacy Concerns and Best Practices for Cross-Device Tracking

On January 23, 2017, the FTC released a Staff Report (the Report) on cross-device tracking, a commonly used practice that allows companies to associate multiple internet-based devices with the same consumer in order to track behavior across devices.

The Report follows the FTC’s Workshop on cross-device tracking, and alerts companies engaged in cross-device tracking of certain best practices for avoiding potential violations of applicable law and regulations.

Specifically, the Report recommends that companies engaged in cross-device tracking: (1) be transparent about their data collection and use practices; (2) provide choice mechanisms that give consumers control over their data; (3) provide heightened protections for sensitive information, including health, financial, and children’s information; and (4) maintain reasonable security of collected data.

Overview of Cross-Device Tracking

Cross-device tracking allows companies to follow a consumer’s activity across smartphones, tablets, desktop computers, and other connected devices. This provides advertisers with a much stronger understanding of the consumer, which has valuable implications for advertising. For example, a retailer that uses tracking technology would be able to see that a customer made a purchase on her smartphone after seeing an ad on her work computer. It can also help advertisers tailor ads to consumers, for example by sending an advertisement for a belt to match a pair of shoes she previously bought from the retailer.

To engage in cross-device tracking, companies use both “deterministic” and “probabilistic” techniques. Deterministic techniques are used to track consumer behavior based on the affirmative use of a consumer-identifying characteristic, such as the consumer’s login credentials. For example, when a consumer logs in to an online platform on a number of devices, the consumer’s behavior on one device can be used to inform targeted advertising through the same platform on the consumer’s other devices.

Probabilistic approaches, by contrast, involve inferring which consumer is using a device, even when a consumer has not logged in to a service. A common example of this is IP address matching, whereby devices using the same IP address — e.g., a cell phone, laptop, and smart television on the same local network — are presumed to belong to the same consumer. Similarly, if a consumer’s smartphone uses the same IP address as her work computer during business hours, and then uses the same IP address as her home computer during non-business hours, an ad platform might infer that the work computer, smartphone, and home computer belong to the same person. Or if several devices visit the same unusual combination of websites, a platform might infer that the devices belong to the same user.
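The IP-matching heuristic described above can be illustrated with a toy sketch: group device identifiers that have been observed behind the same IP address. The device IDs, addresses, and the two-device threshold below are invented for illustration and are far simpler than the probabilistic models ad platforms actually use.

```python
from collections import defaultdict

# Hypothetical observation log: (device_id, ip_address) pairs seen by an ad platform.
observations = [
    ("phone-1",  "203.0.113.7"),
    ("laptop-1", "203.0.113.7"),
    ("tv-1",     "203.0.113.7"),
    ("phone-2",  "198.51.100.4"),
]

def infer_households(events):
    """Group devices that share an IP address: a crude probabilistic link."""
    by_ip = defaultdict(set)
    for device, ip in events:
        by_ip[ip].add(device)
    # Only IPs seen from 2+ devices suggest a shared household or user.
    return {ip: devices for ip, devices in by_ip.items() if len(devices) > 1}

for ip, devices in infer_households(observations).items():
    print(ip, sorted(devices))
```

In practice, shared IPs (offices, public Wi-Fi) make this inference noisy, which is why platforms combine it with other signals such as browsing patterns.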

Often, companies that collect and use deterministic data — e.g., email providers, social networks, or shopping sites — will work with entities engaged in probabilistic tracking in order to learn even more about the consumer’s behavior.

The FTC’s Report

The FTC Report is based, in part, on FTC research relating to cross-device tracking, which involved testing 100 popular websites on two separate devices. The study found, among other things, that third-party tracking technology was embedded in at least 87 of the 100 websites, and that 861 third parties were observed connecting to both devices. The study also found that 96 of the 100 websites allowed consumers to submit a username or email address, and 16 of the websites shared the username or email with third parties.

The FTC Report recognized that tracking has several benefits, such as giving consumers a more seamless user experience across their devices, providing increased fraud detection and security, and allowing marketers to provide a better experience for consumers by delivering more relevant ads.

The Report focused more heavily on privacy challenges tied to cross-device tracking. For example, many consumers do not realize that they are being tracked across devices, especially by probabilistic approaches. Consumers may also not realize that cross-device tracking is often not limited to cell phones, tablets, and laptops, but that their information may also be tracked from smart televisions, wearable devices, and even in-person purchases made in brick-and-mortar stores. The number and variety of entities with access to consumer information, including third-party advertising networks that have no relationship to the consumer, creates an additional privacy concern. Additionally, data collected through cross-device tracking may include highly private personal information which, if exposed through a security breach, could result in considerable consumer harm. For example, by connecting searches made from a smartphone about baby monitors to a laptop search for maternity clothes, a company could infer that the user is pregnant; an additional search for “preeclampsia” could lead the data aggregator to infer that the user may have a high-risk pregnancy, a medical condition that the user may not have intended to share.

The Report makes a number of recommendations to companies engaged in cross-device tracking, namely:

  • That companies engaged in cross-device tracking fully disclose to consumers their use of cross-device tracking practices and the extent of those practices, including the nature of any data collected.
  • That such companies provide opt-out tools or other ways for consumers to limit cross-device tracking.
  • That companies refrain from engaging in cross-device tracking of sensitive information, including financial, health, children’s information, or precise geolocation data, without first obtaining the express consent of the consumers to whom the information belongs.
  • That companies take necessary security steps to protect the data they collect in the process of tracking consumers’ activity across devices.

The Report recognized that the Network Advertising Initiative (NAI) and Digital Advertising Alliance (DAA) have already taken steps to self-regulate with regard to non-cookie tracking (and for the DAA, cross-device tracking more specifically), but advises that both organizations could strengthen their efforts to address cross-device tracking.

In a concurring statement on the Report, FTC Commissioner Maureen K. Ohlhausen said, “[T]oday’s Report does not alter the FTC’s longstanding privacy principles but simply discusses their application in the context of a new technology.” The Commission voted 3-0 to issue the Report.

Considerations for Companies Engaged in Cross-Device Tracking

In light of the FTC Report, companies engaged in cross-device tracking should review their current practices, and ensure that their privacy policies and other relevant consumer-facing policies adequately describe any cross-device tracking activities and provide a way for customers to opt out of being tracked. Companies that fail to fully, conspicuously, or accurately disclose the extent of tracking activities may face liability. (See my previous post, here.)

FTC Settles Ashley Madison Data Breach Complaint

The operators of Ashley Madison, the dating website for married people that became famous following its massive data breach in 2015, settled claims brought by the Federal Trade Commission (“FTC”) regarding that breach and their security practices and representations. Ruby Corp., Ruby Life Inc., and ADL Media Inc. (collectively, “Ruby”), named as defendants, were responsible for the operation of ashleymadison.com.

Hackers breached Ashley Madison—a site with over 18 million users in the United States alone—in 2014 and 2015, with intruders reportedly gaining access to Ruby’s networks multiple times. Ruby did not detect the breach until July 2015, when an employee noticed large data transfers.

Ruby has agreed to pay $1.6 million to the FTC and state regulators to resolve charges relating to the hack, with several million more of the judgment suspended in light of the company's financial limitations. The FTC, 13 states, and the District of Columbia entered into a settlement to resolve the complaint.  Earlier this year, Ruby also entered into a compliance agreement with the Office of the Privacy Commissioner of Canada and an enforceable undertaking with the Office of the Australian Information Commissioner.  Additionally, multidistrict class action litigation brought by numerous former Ashley Madison customers continues against Ruby.

The Complaint

The FTC filed a complaint in the U.S. District Court for the District of Columbia. Among the allegations raised in the complaint were that Ruby:

  • Failed to have a written organizational information security policy
  • Failed to secure remote access, regularly monitor unsuccessful login attempts, revoke passwords of ex-employees, restrict access to systems based on employee job functions, or implement controls to protect against retention of passwords and encryption keys, and permitted employees to reuse passwords
  • Failed to properly train employees to perform data-security measures related to their jobs
  • Failed to ensure that third-party providers utilized reasonable security measures
  • Failed to monitor their system at random intervals to identify security breaches and to ensure the effectiveness of their protective measures

The complaint also alleged that Ruby falsely:

  • Assured users that their information was private and protected
  • Created fake profiles to attract new users, and consumers had no way to distinguish between real and fake profiles
  • Claimed it had received a “Trusted Security Award,” as well as stating that it was “100% secure,” “risk free,” and “completely anonymous”
  • Required consumers to purchase the right to fully delete their profiles, telling them only after payment that their information would be retained for 6 to 12 months thereafter. Ruby then either retained the information for up to 12 months or failed to remove it entirely.

Settlement Agreement

In addition to enjoining Ruby from misrepresentations as to its security practices and its utilization of fake profiles, the Settlement Agreement set forth a series of data security practices that Ruby is required to implement, with initial and biennial assessments of compliance required. The Settlement Agreement requires Ruby to obtain its assessments from an objective third-party professional that will monitor Ruby and the execution of its new security program. The Settlement Agreement also prevents Ruby from using personal information collected through its online dating sites prior to the entry of the Settlement Agreement, unless it complies with the requirements discussed above regarding the cessation of its misrepresentations to consumers. Ruby must also submit a compliance report to the FTC.

Some Takeaways

The complaint and subsequent Settlement Agreement are only the latest exercise of the FTC's asserted power to investigate and prosecute companies for inadequate data security. The mandated security program outlined in the Settlement, for example, provides a useful roadmap that proactive businesses may utilize to preemptively show that their compliance is in line with FTC expectations. The Settlement Agreement also serves as a warning to those who freely toss around statements and self-awarded seals regarding the security of their platforms. Finally, the complaint and settlement reinforce the importance of restricting service providers' and employees' access to systems based on job function, and the importance of internal employee and vendor controls regarding password usage and retention.

  • The FTC noted that 36 million individuals worldwide were affected, making it one of the largest data breaches it has investigated.
  • Finally, Ruby was ordered to pay $8,750,000 in satisfaction of the judgment, but this amount was suspended. Instead, Ruby will pay $828,500 to the FTC, and $828,500 to the 13 states and DC, for a total of approximately $1.6 million. Should Ruby be found to have misrepresented its financial condition, Ruby will immediately owe the full amount of the judgment.
  • The Settlement Agreement outlines a comprehensive data security program for personal information collected. In doing so, it stated that the program was to be “appropriate to Defendants’ size and complexity, the nature and scope of Defendants’ activities, and the sensitivity of the personal information collected from or about consumers,” signaling that the FTC is not promoting a one-size-fits-all approach to data security. The safeguards ordered, however, are of the type likely to be expected of most companies. For example, the FTC requires that Ruby designate an employee to take responsibility for the program, create protocols to identify and resolve internal and external risks, and conduct risk assessments to evaluate the sufficiency of, necessity for, and implementation of various safeguards. The Settlement also requires the creation of a process to select and retain third-party service providers capable of safeguarding any information they receive from Ruby.
  • The parties quickly settled the matter following the filing of the Complaint. The Settlement Agreement—a stipulated order for permanent injunctive and other relief—was entered into by the FTC, 13 states, and the District of Columbia against Ruby.
  • The FTC brought charges alleging unfair security practices, and misrepresentations regarding network security, user profiles, terms and conditions for deleting profiles, and data security seals.

As New York Attorney General Schneiderman stated: “This settlement should send a clear message to all companies doing business online that reckless disregard for data security will not be tolerated.”  (New York will receive $81,330.94 of the payment being made, since up to 652,627 New York residents were members of Ashley Madison at the time of the security breach.)  Businesses that want to take an active approach to data security compliance can glean much from the FTC’s complaint and settlement here.

One Good Deal After Another – Navy Data Breach, Damages and Sovereign Immunity

“One good deal after another” – This old expression from my time of service in the USN popped into my head as I read news of the latest breach of information regarding Navy personnel. In sum, the Navy reported on November 23 that the laptop of a government contractor supporting a naval contract was “compromised” and that “unknown individuals” accessed sensitive information on over 130,000 current and former sailors, including Social Security numbers.  At last report, there is no evidence the leaked data has been misused.

The facts so far, as reported, are facially similar to those at issue in In re Science Applications Int’l Corp. Litigation, 45 F. Supp. 3d 14 (D.D.C. 2014) (“SAIC”), where an employee of SAIC, an information-technology company that handles data for the federal government, had her car broken into and back-up tapes containing health care information regarding millions of members of the armed services and their families were stolen.  The SAIC court rejected plaintiffs’ claims for increased risk of identity theft and monitoring costs on the grounds set out in Clapper v. Amnesty International USA, 133 S. Ct. 1138 (2013), holding that, in addition to a substantially increased risk of harm resulting from the occurrence, there also had to be “a substantial probability of harm with that increase taken into account.” SAIC, 45 F. Supp. 3d at 16 (emphasis in original).  Because there was little likelihood that the thief involved even knew what information he or she had come into possession of, much less possessed the technology to access it, the SAIC court found no injury-in-fact for the bulk of the plaintiffs.  However, the court allowed two claims to go forward, including a claim under the Privacy Act for previously unreceived unsolicited calls to an unlisted number pitching medical products and services targeted at a specific medical condition listed in the stolen medical records. Id. at 33.

We might note that news of this latest “compromise” came out the same month as the long awaited ruling in Welborn v. IRS, —- F. Supp. 3d —–, 2016 WL 6495399 (D.D.C. 2016), brought, as you will recall, as a result of 330,000 tax-related documents stolen during a cyberattack that extended from mid-February to mid–May 2015 and targeted the IRS’s “Get Transcript” program.  Among other causes of action, the plaintiffs brought suit under the Privacy Act and the Internal Revenue Code.  The Welborn court also rejected plaintiffs’ claims for an increased threat of future identity theft and fraud as a result of the IRS security breach as entirely speculative and depending on the decisions and actions of one or more independent, and unidentified, actors. Id. at *8 (quoting Clapper, 133 S.Ct. at 1150).  However, the Welborn court found that three of the plaintiffs, two of whom alleged that they had suffered actual identity theft when someone filed false tax returns and claimed fraudulent refunds in their names, and one who alleged she had “been the victim of at least two occasions of fraudulent activity in her financial accounts, one of which resulted in the removal of funds from a personal financial account, which occurred after the IRS data breach,” had alleged sufficient injury-in-fact to maintain standing.  The last of these three claims was dismissed for insufficient pleading of causation, as mere sequence in time is not enough to show causation. Id. at *9 – *10.  The court then held that 1) the remaining two plaintiffs’ claims for unauthorized disclosure under the Privacy Act were preempted by the tax code, and 2) plaintiffs’ Privacy Act claims for “failure to safeguard” must be dismissed for failure to allege actual damages (as opposed to injury-in-fact). Id. at *12 (to plead any Privacy Act claim adequately, a plaintiff must plead “actual—that is, pecuniary or material—harm”).
Ultimately, the court also dismissed the plaintiffs’ claims for unauthorized disclosure under the Tax Code on grounds of sovereign immunity.  To allege improper disclosure under the Code, a plaintiff must allege (1) knowing or negligent, (2) disclosure, (3) of a return or return information.  The IRS argued, and the court held, that plaintiffs were attempting to present a “failure to protect” claim couched as an “improper disclosure” claim, and that the Code does not authorize suit against the IRS based on a failure to protect.  That is, the plaintiffs’ attempted expansion of liability would expand the government’s waiver of sovereign immunity to include a claim not contemplated by the Code.

A very similar argument could have been made with regard to the Privacy Act claims, but the court did not reach it in light of its finding of no actual damages; the holding nevertheless has broad implications. Finally, the involvement of a government contractor in the Navy breach could also implicate the recent Supreme Court decision in Campbell-Ewald Co. v. Gomez, No. 14-857, 2016 WL 228345 (2016), regarding “derivative sovereign immunity.”  The Navy had contracted with Campbell-Ewald to develop a recruiting campaign that included sending text messages to young adults, but the contract stated that messages could be sent only to individuals who had “opted in” to receive marketing solicitations.  Campbell-Ewald developed a list of cellular phone numbers for contacting users, and then transmitted the Navy’s message to more than 100,000 people.  Gomez, who had not opted in by consenting to receive messages, received one anyway and filed a nationwide class action seeking damages and alleging that Campbell-Ewald had violated the Telephone Consumer Protection Act (“TCPA”).  Campbell-Ewald argued that, as a contractor acting on the Navy’s behalf, it had acquired (i.e., had “derived”) immunity from suit under the TCPA from the Navy’s sovereign immunity.  However, the Supreme Court held that Campbell-Ewald violated both federal law (the TCPA) and the Government’s explicit contractual instructions that messages were to be sent only to individuals who had “opted in.”  The Court held that when a contractor violates both federal law and the Government’s explicit instructions, there is no “derivative immunity” and the contractor is not shielded from suit.

Proper Handling of Biometric Data — Lessons Learned from a $1.5 Million Illinois Class Action Settlement

In 2008, Illinois passed the Biometric Information Privacy Act, 740 ILCS 14/1 (the Act or BIPA), which requires companies to obtain a person’s consent before collecting that person’s biometric data. Illinois, unlike other states such as Texas, provides a private right of action for individuals whose data was collected without proper notification and consent. Under Section 15 of the Act (Retention; collection; disclosure; destruction), a private entity in possession of biometric identifiers or biometric information must develop a written policy establishing a retention schedule and guidelines for destruction of the data.

In what is being reported as the first settlement under the Illinois statute, on December 1, 2016, an Illinois state court approved a $1.5 million class action settlement between L.A. Tan Enterprises Inc. (L.A. Tan) and a class of its customers. Sekura v. L.A. Tan, Ill. Cir. Ct. 2015-CH-16694. The class plaintiffs alleged that L.A. Tan, which used fingerprint scanning technology rather than a key fob for membership purposes, failed to obtain written consent from its customers to use the data. The complaint also alleged that the company failed to provide information about how it would store the biometric data and the circumstances under which it would destroy the data, i.e., when the customer dropped his or her membership or the franchise closed.

What makes this settlement interesting is the fact that the complaint did not allege that the biometrics data was lost, stolen or sold. Instead, the class plaintiffs alleged that the company did not treat the data as carefully as the law requires. Similar to settlements with the OCR over HIPAA violations, the L.A. Tan settlement also requires the company to take corrective action to ensure compliance with the Illinois statute and to destroy all biometric data it still holds.

The sensitivity of biometric data requires companies that conduct business in Illinois not only to properly collect the data, but also to store and dispose of it as required by law. Failure to do so could expose those companies to unnecessary liability even if the data is not lost, stolen or misused.

Two federal courts, for example, have denied defense motions to dismiss actions brought under BIPA. See In re Facebook Biometric Information Privacy Litigation, Case No. 15-cv-03747, 2016 WL 259385 (N.D. Cal. 5/5/16) (social networking website users brought a putative class action against a website operator under BIPA, alleging that the operator unlawfully collected and stored biometric data derived from their faces; the court denied the defense motions to dismiss and for summary judgment, finding that the users stated a cause of action under BIPA) and Norberg v. Shutterfly, Inc., 152 F. Supp. 3d 1103 (N.D. Ill. 2015) (consumers brought action against operator of several photo sharing websites, seeking statutory damages for alleged violations of BIPA; case dismissed with prejudice on April 15, 2016, pursuant to confidential settlement agreement). More recently, however, another federal court in Illinois granted the defense motion to dismiss a BIPA complaint for lack of jurisdiction under Spokeo. See McCollough v. Smart Carte, Inc., Case No. 16 C 0377, 2016 WL 4077108 (N.D. Ill. 8/1/16).

Strike Three – You’re Out – Data Breach Shareholder Derivative Lawsuit Against Home Depot Dismissed

On November 30, 2016, Judge Thomas W. Thrash dismissed a shareholder derivative action brought against Home Depot as a result of the breach of its security systems and theft of its customers’ personal financial data (“the Breach”) in 2014. In Re The Home Depot, Inc. Shareholder Derivative Litigation, Civ. No. 1:15-CV-2999, 2016 WL 6995676 (N.D. Ga. 2016). In the derivative action, Plaintiffs asserted that Home Depot was harmed as a result of the company’s alleged delay in responding to significant security threats, and thus sought to recover under three primary claims against Home Depot’s current and former directors and officers (“Ds&Os”). These included the following alleged claims: (1) breach of the duty of loyalty by failing to institute internal controls sufficient to oversee the risks in the event of a breach, and for disbanding a Board of Directors committee that was responsible for overseeing those risks; (2) waste of corporate assets; and (3) violation of Section 14(a) of the Securities Exchange Act in connection with Home Depot’s 2014 and 2015 proxy filings. According to Judge Thrash, all of the claims against the Ds&Os “ultimately” related to what they “knew before the Breach and what they did about that knowledge.” Defendants filed a motion to dismiss, which Judge Thrash ultimately granted, applying Delaware law. It was undisputed that no demand was made on the Home Depot Board of Directors. Thus, Plaintiffs had the burden of demonstrating that the demand requirement was excused because it would have been futile.

Judge Thrash analyzed each of the three claims against the Ds&Os. As for the primary claim that the Directors allegedly breached their duty of loyalty and that they failed to provide oversight, Plaintiffs were required to show that the Directors either “knew they were not discharging their fiduciary obligations or that the Directors demonstrated a conscious disregard for their responsibilities[.]” When combined with the general demand futility standard, Plaintiffs essentially needed to show that a majority of the Board faced substantial liability because it consciously failed to act in the face of a known duty to act. Judge Thrash stated that this is “an incredibly high hurdle for the Plaintiffs to overcome[.]”

In finding that Plaintiffs failed to overcome this hurdle, Judge Thrash rejected Plaintiffs’ arguments about the significance of disbanding the Infrastructure Committee charged with oversight of the risks Home Depot faced in the event of a data breach. Plaintiffs alleged that the Board failed to amend the Audit Committee’s charter to reflect the new responsibilities for data security that had been transferred from the Infrastructure Committee, as required by the Company’s Corporate Governance Guidelines. As a result, Plaintiffs alleged that the Board failed to designate anyone with the responsibility to oversee data security, thereby leaving the company without a reporting system. Judge Thrash concluded that “[t]his argument is much too formal.” Regardless of whether the Audit Committee had “technical authority,” both the Committee and the Board believed it did. Given the factual allegations that the Audit Committee received regular reports from management on the state of Home Depot’s data security, and the fact that the Board in turn received briefings from both management and the Audit Committee, the court concluded that “there can be no question that the Board was fulfilling its duty of loyalty to ensure that a reasonable system of reporting existed.”

The court also rejected Plaintiffs’ argument that the Board’s failure “to ensure that a plan was in place to ‘immediately’ remedy the deficiency in [Home Depot’s data security],” supported the breach of the duty of loyalty claim. Plaintiffs acknowledged in the complaint that the Board acted before the Breach occurred, that it had approved a plan that would have fixed many of Home Depot’s security weaknesses, and that it would be fully implemented by February 2015. Under Delaware law, the court held that directors violate their duty of loyalty only if “they knowingly and completely failed to undertake their responsibilities.” Judge Thrash concluded that “as long as the Outside Directors pursued any course of action that was reasonable, they would not have violated their duty of loyalty.”

In addition, Plaintiffs alleged that there was “a plan,” but that “it moved too slowly.” The court held that this was not the standard under which to evaluate demand futility on a duty of loyalty claim. The court noted that with the benefit of hindsight, “one can safely say that the implementation of the plan was probably too slow, and that the plan probably would not have fixed all of the problems Home Depot had with its security.” However, the court also found that “simply alleging that a board incorrectly exercised its business judgment and made a ‘wrong’ decision in response to red flags…is not enough to plead bad faith.”

Based on the foregoing analysis of the demand futility issue, the court had little difficulty discounting the claim of corporate waste. Plaintiffs alleged that the Board’s insufficient reaction to the threats posed by alleged deficiencies in compliance with contractual requirements for data security caused significant losses to the company, which constituted a waste of Home Depot’s assets. Here, the court concluded that the Plaintiffs’ claim was basically a challenge to the Directors’ exercise of their business judgment, and although with hindsight, it “was easy to see that the Board’s decision to upgrade Home Depot’s security at a leisurely pace was an unfortunate one,” the decision nevertheless fell squarely within the discretion of the Board and was protected under the business judgment rule.

Finally, Plaintiffs’ Section 14(a) claims, which were also subject to a demand requirement, alleged that Defendants omitted important information from their 2014 and 2015 Proxy Statements by not disclosing that Home Depot had known of specific threats to its data security, and that the Audit Committee’s charter was not amended to reflect that the responsibility for IT and data security had been transferred to it. The court rejected these arguments, noting that regardless of whether the charter was amended, “everyone believed and acted as if the Committee did have oversight over data security during the relevant time period.” Further, the court found that Plaintiffs failed to specifically identify which statements in the Proxy Statements were false or misleading and also failed to plead with particularity how the omission caused the alleged loss. Thus, the court held that the claim did not demonstrate the necessary duty to disclose required under Section 14(a). Moreover, “because [Plaintiffs] had not demonstrated a substantial likelihood that the Defendants would have been liable for a Section 14(a) violation,” the court found that demand was neither futile for the Section 14(a) claims, nor excused.

This decision is in step with two other recent decisions dismissing shareholder derivative actions against companies arising out of high-profile data breaches. See Palkon v. Holmes, et al., 2014 WL 5341880 (D.N.J. Oct. 20, 2014) (court, applying Delaware law, dismissed a derivative action against Wyndham Hotels brought after that company suffered a large data breach, relying in part on the protections afforded the Ds&Os under the business judgment rule); Davis et al. v. Steinhafel et al., No. 14-cv-203 (D. Minn. July 7, 2016) (court dismissed derivative action against Target because a claim could not be stated in connection with a corporation’s special litigation committee’s decision not to pursue derivative claims against the company’s officers or directors, particularly where it demonstrated that the decision was based on a thorough and impartial investigation).

With the prevalence of security breaches taking place against various corporations, including large retailers, Home Depot is yet another reminder of the potential exposure presented by cyber-liability for the boardroom, including costly litigation even if the corporation prevails. Judge Thrash’s opinion provides guidance on how the business judgment rule can protect Ds&Os for their decision-making with respect to the demands of cybersecurity. Given the numerous references to the “benefits of hindsight,” however, corporate boards should be vigilant in assessing their cybersecurity plans. There may come a time when a court will not so readily apply the “business judgment rule” to a Board’s decision making process in addressing cybersecurity concerns.

Governmental Updates You Need to Know About

In the past few weeks, the government issued alerts and guidance on two noteworthy topics involving data security: phishing and ransomware, discussed below.

  • Don’t Get Phished: OCR Warns of Phishing Scheme Targeting HIPAA Covered Entities & Business Associates

As previously reported in the March 21, 2016 and July 12, 2016 Blog Posts, the 2016 HIPAA Audit Season has been underway for the better part of this past year. As stated on its website, “OCR uses the audit program to assess the HIPAA compliance efforts of a range of entities covered by HIPAA regulations.” The OCR intends to use the audits as a proactive measure, in conjunction with its ongoing complaint investigations and compliance reviews, to identify problems before they result in breaches. On July 12, 2016, the OCR sent emails to 167 Covered Entities, including health plans, healthcare providers, and healthcare clearinghouses, advising that they would be subject to desk audits.

On November 28, 2016, the U.S. Department of Health and Human Services (“HHS”) issued an Alert advising that a phishing email is being circulated on what appears to be HHS Departmental letterhead under the signature of OCR’s Director, Jocelyn Samuels. According to the Alert, this email appears to be an official government communication, and targets employees of HIPAA covered entities and their business associates.

The email prompts recipients to click a link regarding possible inclusion in the HIPAA Privacy, Security, and Breach Rules Audit Program. The link directs individuals to a non-governmental website marketing a firm’s cybersecurity services; the firm is not associated with HHS or the OCR.

As in the case of any possible phishing email, HHS reminds the public that if you or your organization have any questions about whether a communication about a HIPAA audit is legitimate, you should contact the agency directly via email at [email protected].

This advice applies to any suspicious email communication you or your organization may receive. It also serves as a reminder to review your policies and procedures and training materials to ensure that your employees do not fall for the phishing bait and expose your organization to intrusions.

  • FTC Joins the Chorus on Responding to Ransomware

In August 2016, the Office for Civil Rights (“OCR”) issued a Fact Sheet: Ransomware and HIPAA, which was followed by a U.S. Government Interagency Report entitled “How to Protect Your Organizations from Ransomware”. These materials provided “best practices and mitigation strategies focused on the prevention and response to ransomware incidents.”

In early September 2016, the Federal Trade Commission (“FTC”) announced that it too would offer guidance on how to protect against ransomware and would take action against those that failed to protect consumers’ personal data.

Fulfilling its promises, on November 10, 2016, the FTC issued advice on how to defend against ransomware. This followed the FTC’s session on ransomware held as part of its Fall Technology Series. The FTC noted an uptick in ransomware attacks and that 91% of these attacks begin with phishing emails. The FTC also provided guidance on the question that everyone has: do you pay the ransom? Following the advice of law enforcement, the FTC advises not to pay the ransom, but notes that the decision to pay is a business decision. It does caution that payment of the ransom may signal to the hackers that the business does not have a back-up or other access to the hacked data, and may therefore lead the hackers to increase their ransom demand.

If you are concerned that your business may become the victim of a ransomware attack, or if you need assistance developing a plan to respond to one, the Sedgwick Cybersecurity team can assist you in responding to such an attack or preparing a response plan. Contact us at [email protected] or contact Cinthia Motley at 312-849-1972.

New Jersey TCCWNA Developments Affecting Online Retailers

The New Jersey Truth-in-Consumer Contract, Warranty and Notice Act, N.J.S.A. 56:12-14, et seq. (“TCCWNA”) is a unique consumer protection statute that prohibits sellers and other commercial entities from providing consumer contracts or notices containing unenforceable terms. As stated by the sponsor of the Act, the inclusion of unenforceable provisions “deceives a consumer into thinking that they are enforceable and for this reason the consumer often fails to enforce his rights.” Sponsor Statement to Assembly Bill cited in Shelton v. Restaurant.com, Inc., 70 A.3d 544, 551 (N.J. 2013). The statute raises the stakes in drafting consumer contracts because sellers not only have to consider the enforceability of every provision of their contracts under traditional criteria such as contract formation and unconscionability; they must also consider that the very inclusion of an unenforceable term in a contract may violate the Act.

The Act has spawned increasing numbers of class actions and threatened class actions against retailers whose websites contain allegedly unenforceable provisions in their Terms of Use (TOU) and Terms of Sale (TOS). Those complaints center primarily on exculpatory and indemnity provisions in the TOU and TOS. This paper reviews the most recent developments that may affect such claims and considers pending legislation to revise the Act.

Section 15 of the TCCWNA

Section 15 of the Act provides that no seller or other commercial entity shall offer a consumer a contract or notice “which includes a provision that violates any clearly established legal right of a consumer or responsibility of a seller … as established by State or Federal Law.” So, for example, exculpatory and indemnification clauses in consumer contracts may violate the statute if they impose on consumers all risks of using the site or purchasing products from the site and fail to specify that the defendant could still be held liable for its own conduct under certain circumstances. See, e.g., Walters v. YMCA, 437 N.J. Super. 111, 118-119 (2014) (premises liability cannot be disclaimed) and Castro v. Sovran Self Storage, 2015 WL 4380775 (D.N.J. 2015) (self-storage operator cannot sell contents at private sale without notice).

The statute was enacted in 1981, long before the rise of e-commerce, which helps explain why we found only two reported opinions referencing the Act before 2005, but well over a hundred since then. The pace of recent opinions is accelerating as plaintiffs push for broader interpretations of the Act against online retailers. In many such cases, retailers’ website TOU or TOS were drafted based on the law of the jurisdiction specified in their standard terms and conditions, without scrutinizing the enforceability of each provision under New Jersey law. Nevertheless, such online retailers could find themselves in violation of the TCCWNA if any of those terms are deemed contrary to any New Jersey or federal law.

Even in cases where potentially unenforceable provisions have not been enforced against or directly harmed the consumer-plaintiffs, courts have found potential violations of the TCCWNA if overbroad terms “discourage suits, whether or not the provisions are enforceable, and therefore fall directly within the TCCWNA’s ambit.”  Castro, id. at *9. However, in Sauro v. L.A. Fitness International, 2013 WL 978807, at *9 (D.N.J. 2013), the District Court found that an allegedly overbroad exculpatory and indemnification provision did not violate the TCCWNA because the agreement included a savings clause specifying that the limitations were only applicable “to the fullest extent permitted by law” and “as broad and inclusive as is permitted by law in the State of New Jersey.” Accordingly, the District Court interpreted the provision as making the consumer contract co-extensive with New Jersey law, providing the seller with the maximum legal protection offered by New Jersey law without explicitly stating what limitations applied for the consumer.

A more recent case, Kendall v. Cubesmart L.P., 2016 WL 1597245 (D.N.J., Apr. 21, 2016), cast doubt on the effectiveness of a savings clause to prevent violations of the Act. In that case, a leak in plaintiff’s rented storage unit caused water damage to his stored goods. The defendant storage company refused to pay for the lost goods, citing various contract provisions limiting its liability, and then sold the contents of the unit. Plaintiff sued, alleging that his storage facility agreement violated the TCCWNA and the New Jersey Self Service Storage Facility Act (“SSFA”) because a provision stated that in the event of a default by the renter, the owner may sell personal property at a public or private sale without notice to the renter “in the manner permitted by applicable law.”

The court held that the provision violated Section 15 of the Act because a private sale without notice violates the SSFA. The court distinguished Sauro, which relied on the savings clause to dismiss the complaint, because the provision in Kendall’s agreement “does not merely state that a sale may occur [without notice], as permitted by law, leaving it to the consumer to discover that only public sales are permitted under New Jersey law. Instead, [it] unequivocally states that a private sale may occur,” which is impermissible under New Jersey law in those circumstances. Id. at *7 (emphasis by the court). The court added:

TCCWNA permits sellers to expand valid terms of a consumer contract so that they extend to the fullest degree allowed by law. But sellers cannot include invalid terms, discouraging consumers from exercising their clearly established rights and, at the same time, avoid liability under TCCWNA by including general assurances that those terms of the consumer contract would only be exercised in compliance with applicable law.

Id. at *7 (emphasis by the court). Accordingly, even savings clauses may not prevent liability under the Act if a provision in the agreement is clearly unenforceable as a matter of law.

Recent case law based on the Supreme Court’s ruling in Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1548 (2016), suggests a new line of defense against TCCWNA claims. In Spokeo, plaintiff filed a class action alleging violations of the Fair Credit Reporting Act because Spokeo created an inaccurate personal profile of him. The Supreme Court held that Article III standing requires an injury to be both concrete, i.e., one that actually exists, and particularized to the plaintiff. It found that plaintiff failed to allege any injury caused by the inaccurate information, and held that a bare procedural violation of a statute without concrete harm does not satisfy the injury-in-fact requirement of Article III. The Court added that a plaintiff does not automatically satisfy the injury-in-fact requirement “whenever a statute grants a person a statutory right and purports to authorize that person to sue to vindicate that right.” Id. at 1549. The case was remanded to the Ninth Circuit, which had found that plaintiff had standing.

In Candelario v. Rip Curl, Case No. 16-00963 (C.D. Cal., September 7, 2016), the court applied Spokeo to a Section 15 TCCWNA claim alleging unenforceable terms on defendant’s website. Plaintiff alleged that she purchased a tank top through the Rip Curl website that “was not the cut or quality depicted on Defendant’s website.” She then reviewed defendant’s website and found provisions that purportedly barred her from seeking remedies to which she was legally entitled and shielded defendant from liabilities for which it was legally responsible. Citing Spokeo, the court held that plaintiff’s mere allegation that the product she ordered was different from the one depicted on the website did not allege a concrete injury-in-fact. The case is on appeal in the Ninth Circuit.

It should be noted that Article III standing would not bar a claim brought in state court. In New Jersey, “standing … is an element of justiciability rather than an element of jurisdiction.” N.J. Citizen Action v. Riviera Motel Corp., 296 N.J. Super. 402, 411 (App. Div. 1997), appeal dismissed, 152 N.J. 361, 704 A.2d 1297 (1998). However, the statutory penalties under Section 17 of the Act, discussed below, are payable to the “aggrieved consumer,” a term the statute does not define. The courts have not yet ruled on whether an “aggrieved consumer” means a person who has sustained an actual injury. If the two are synonymous, non-injured plaintiffs would lack standing for failure to assert a justiciable claim.

Section 16 of the TCCWNA

Section 16 of the Act prohibits consumer contracts, notices, or signs from stating that “any of its provisions is or may be void, unenforceable or inapplicable in some jurisdictions without specifying which provisions are or are not void, unenforceable or inapplicable within the State of New Jersey…”  In essence, this section requires retailers to note—in any provision stating that enforceability may vary based on state law—the precise extent to which the given section would be enforceable in the State of New Jersey.

This is even more complicated than it sounds, because New Jersey courts have found liability waivers to be unenforceable to the extent they violate public policy. See, e.g., Marcinczyk v. State of N.J. Police Training Commission, 203 N.J. 586 (2010). Accordingly, could a consumer contract provision that is unenforceable as against public policy constitute a violation of the TCCWNA? The answer would depend upon whether the provision is found to violate a “clearly established right,” which is a fact-specific inquiry. These uncertainties make it difficult for a retailer to know whether, and to what extent, a given term is enforceable. If the retailer over-estimates the enforceability of a given provision, it may run afoul of the TCCWNA; if it under-estimates the enforceability of a provision, it risks subjecting itself to claims that the customer otherwise could have waived.

A recent case in New Jersey relied on Spokeo to reject a claim under Section 16 of the TCCWNA. In Hecht v. The Hertz Corporation, Case No. 2:16-cv-01485 (D.N.J., October 20, 2016), plaintiff rented a car through defendant’s website. The rental agreement provision entitled “Void Where Prohibited” recited that all services may not be available in all locations and that restrictions may apply to the use of services in some jurisdictions. Plaintiff alleged that this provision violated Section 16 of the Act because it failed to describe which restrictions applied or were void in New Jersey. The court rejected the claim because plaintiff sustained no real injury. It added that even if the statute gave plaintiff standing to bring the claim under state law, that legislation did not confer Article III standing, which is a separate requirement in federal court. Id., Slip Op. at *4.

Penalties Under the TCCWNA

Section 17 of the TCCWNA imposes substantial penalties, which can escalate quickly in the context of a class action. A seller who violates the TCCWNA is liable “for a civil penalty of not less than $100.00 or for actual damages, or both at the election of the consumer, together with reasonable attorney’s fees and court costs.” In a class action, damages can add up dramatically because putative class action plaintiffs typically seek to represent all visitors to the site, who can number in the tens of thousands over a given period of time. Moreover, plaintiffs argue that this penalty applies to each violation in a website’s terms and conditions. One court has held that a plaintiff stated claims under both Section 15 and Section 16 of the Act, Martinez-Santiago v. Public Storage, 38 F. Supp. 3d 500, 511-512 (D.N.J. 2014), but no court has expressly ruled on whether a separate statutory penalty must be imposed for each provision that violates the Act. Under the argument advanced by plaintiffs, a retailer could be liable for several hundred dollars per unenforceable provision for each visitor to the website, plus attorneys’ fees.
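To put the scale of this exposure in perspective, the arithmetic can be sketched in a few lines. Everything below is an illustrative assumption except the $100 figure, which is the statutory minimum penalty quoted above; the class size, provision count, and the function itself are hypothetical.

```python
# Hypothetical sketch of TCCWNA class-action exposure.
# Only the $100 minimum comes from Section 17 of the Act; courts have
# not resolved whether each offending provision triggers a separate
# penalty, so both theories are shown.

MIN_PENALTY = 100  # statutory minimum civil penalty, in dollars


def estimated_exposure(class_size: int, provisions: int,
                       per_provision: bool = True) -> int:
    """Rough minimum exposure, excluding actual damages and
    attorneys' fees. per_provision=True reflects the plaintiffs'
    theory that each unenforceable provision is a separate violation;
    False assumes one penalty per class member."""
    multiplier = provisions if per_provision else 1
    return class_size * multiplier * MIN_PENALTY


# Example: 50,000 site visitors, 3 allegedly unenforceable terms
print(estimated_exposure(50_000, 3))         # 15000000 (per-provision theory)
print(estimated_exposure(50_000, 3, False))  # 5000000 (one penalty per visitor)
```

Even under the more conservative one-penalty-per-visitor reading, the statutory floor alone reaches seven figures for a modestly trafficked site, which explains the settlement pressure described below.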

Given this potential liability, and the broad remedial interpretations of the Act by the courts, most retailers targeted by TCCWNA claims opt to settle them quickly.

Proposed Legislation to Expand the TCCWNA

There have been several recent attempts to revise the TCCWNA on both sides of the issue. In January 2016, New Jersey Assembly Bill 759 (and its counterpart, Senate Bill 755) were pre-filed for introduction in the 2016 legislative session. The proposed bills sought to prohibit any provision whereby a consumer waives or limits any rights under the TCCWNA “or any other federal or State consumer protection law,” to prohibit any reduction of the time to bring a TCCWNA claim below the otherwise applicable statute of limitations, and to render void and unenforceable any provision that inhibits the ability to bring a TCCWNA claim as a class action. The bills also sought to prohibit consumer contracts from requiring consumers to consent to venue and jurisdiction outside of New Jersey or to waive the right to a jury trial. The sponsors of these bills had proposed the same set of amendments in the 216th legislative session in Assembly Bill 4079, which was not passed.

In September 2016, Assembly Bill 4121 was introduced, seeking to prohibit class certification of TCCWNA claims “in the absence of an ascertainable economic loss resulting from the alleged violation.” The legislation would also require allegedly aggrieved consumers whose economic loss was $250 or less to first request reimbursement from the seller at least 35 days before filing a TCCWNA suit.

All of these bills are still under consideration by various New Jersey legislative committees, and depending upon what is passed, there could be significant consequences for online retailers selling to New Jersey consumers.


Conclusion

The case law under the TCCWNA remains in flux on several important issues besides those discussed above, and the statute may be expanded by the New Jersey legislature. Accordingly, online retailers will continue to be subject to claims under the TCCWNA.  Given the increasing popularity of these suits, retailers should act quickly to make sure that their terms comply with the Act in its present form and as it may be amended.  At a minimum, any terms should be revised to state how a provision would be enforced in New Jersey.  As noted, the enforceability of a given provision may be difficult to ascertain.  Retailers should therefore consult counsel with expertise in this area, to make sure that their terms are compliant.