by Bret Cohen, Jillian Hart, and Matthew Brown
The Supreme Judicial Court (“SJC”) recently issued its anticipated decision in Attorney General v. Facebook, Inc., 487 Mass. 109 (2021), addressing the extent to which the work product doctrine or the attorney-client privilege protects internal investigations from disclosure.
The decision affirmed in part and reversed in part a Superior Court decision (Attorney General v. Facebook, Inc., 2020 WL 742136 (Jan. 17, 2020) (Davis, J.)) that held that the work product doctrine did not apply to documents the Massachusetts Attorney General (“AG”) sought from social media giant Facebook, Inc.’s (“Facebook”) internal investigation into a data privacy breach.
The SJC’s decision provides an important reminder for companies to tread carefully and always consult with counsel before launching an internal investigation to understand what may be discoverable in future litigation.
Facts and Background
After a widely publicized data breach incident involving one of its third-party applications (“apps”), Facebook undertook an internal investigation, led by outside counsel, to determine the extent to which the platform’s apps misused user data and to evaluate associated liability. Facebook intended for the app developer investigation (“ADI”) to identify any other apps that misused user data and to assess Facebook’s potential liability from the incident. Both in-house and outside counsel “designed, managed, and overs[aw]” the ADI and “devised and tailored the ADI’s methods, protocols, and strategies to address the specific risks posed by these legal challenges.” Outside counsel also retained third-party technical experts and investigators to assist in the ADI.
As a result of the data breach incident, the AG opened its own investigation into whether Facebook misrepresented the extent to which it protected or misused user data. In accordance with its authority under M.G.L. c. 93A, the AG issued a series of civil investigative demands to Facebook. Facebook complied in part, but refused on privilege grounds to honor six of the AG’s requests. The first five requests sought information related to the identities of certain apps and app developers that Facebook identified and reviewed during its ADI. The sixth request, in contrast, sought Facebook’s internal communications and correspondence regarding certain apps.
The AG filed a petition to compel compliance with its demands in the Superior Court’s Business Litigation Session. The Superior Court sided with the AG, holding that the work product doctrine did not cover Facebook’s ADI and, even if it did, the AG made the required showing of a substantial need for the information that it could not obtain without undue hardship. As to Facebook’s asserted attorney-client privilege, the Superior Court held that the privilege did not cover the information sought by the AG’s first five requests, and ordered the production of responsive documents. Regarding the sixth request, however, the Superior Court held that it did seek the disclosure of potentially privileged material, and accordingly ordered Facebook to provide the AG with a detailed privilege log.
Facebook appealed and the SJC heard the case on direct appellate review.
The SJC affirmed in part and reversed in part the Superior Court’s decision.
Work Product Doctrine: Addressing the first five requests, the SJC held that the work product doctrine did apply, because Facebook conducted the ADI in anticipation of litigation. The Court, however, carefully distinguished fact work product from opinion work product. It held that as to documents constituting fact work product, the AG demonstrated substantial need and undue hardship requiring production. At the same time, the SJC held that remand was necessary to determine whether any of the responsive documents that Facebook withheld constituted opinion work product. The SJC held that, if any of the withheld documents constituted opinion work product, such documents are only “discoverable, if at all, in rare or extremely unusual circumstances.” Facebook, 487 Mass. at 128 (internal quotations omitted).
Attorney-Client Privilege: Addressing the sixth request, the SJC agreed with the Superior Court that, to the extent Facebook objected on the basis of attorney-client privilege, Facebook must produce a detailed privilege log so that the AG could assess (and potentially contest) the privilege assertions. The SJC reasoned that the request sought information dating back years before the ADI began, as well as communications that did not involve attorneys, both of which might fall outside the scope of the attorney-client privilege. The SJC held that the attorney-client privilege did not cover the first five requests because they merely sought underlying facts rather than attorney-client communications. In doing so, the SJC emphasized that “the attorney-client privilege only protects communications between attorneys and a client about factual information, not the facts themselves,” noting that “this distinction is important and somewhat collapsed by the advocacy in the instant case.” Facebook, 487 Mass. at 123.
In its decision, the SJC identified and discussed three important issues pertaining to employers contemplating or conducting internal investigations.
First, the SJC considered whether the work product doctrine applies to an internal investigation. In the instant case, the SJC held that the work product doctrine applied to the ADI because: (1) documents were prepared; (2) by or for Facebook or its agents; and (3) in anticipation of litigation. The SJC specifically found that, although Facebook had an ongoing compliance program, the ADI was “meaningfully distinct” from the compliance program, with its own distinct methodology focused on past violations, rather than improving ongoing operations in the normal course of business. In short, the mere fact that the ADI also served Facebook’s business purposes did not mean that the work product doctrine was inapplicable.
Second, the SJC discussed whether the information sought by the AG constituted fact work product or opinion work product. As the SJC stated, “the line between fact work product and opinion work product is not always clear.” In this regard, the SJC noted that although Facebook made multiple public statements about the ADI and the investigatory process (statements that Facebook could not then claim constituted opinion work product), any “undisclosed strategic decision-making by counsel, including the assessment of legal risk or liability revealed by the factual analysis” might qualify as opinion work product.
Third, the SJC considered whether, with respect to the fact work product, the party seeking disclosure established both a substantial need for the materials and an undue hardship in obtaining their substantial equivalent by other means, sufficient to warrant discovery. The SJC held that the AG met its burden by demonstrating both. With respect to the AG’s substantial need, the SJC found that the app-related information sought was central to the statutorily authorized c. 93A investigation. Likewise, with respect to the AG’s asserted undue hardship, the SJC distinguished the ADI from a routine internal investigation that “involved simply interviewing key employees and other witnesses or reviewing a manageable number of documents, tasks that can be easily replicated by third parties or government investigators.” Here, the ADI was a years-long investigation involving a vast quantity of information and included analysis of millions of apps by hundreds of outside experts. The SJC therefore ruled that the enormous costs and time required to duplicate the ADI were sufficient to demonstrate undue hardship.
Although this area of law is far from settled, the Facebook decision provides helpful guidance for companies contemplating and conducting internal investigations. Key guideposts include:
- Engage counsel in advance of an internal investigation to discuss the objective and parameters of such investigation.
- Any outside experts involved in the investigation should be retained by outside counsel and should be bound by confidentiality agreements.
- Review what records and files the company develops in the regular course of business and be mindful that these records may be discoverable if not created in anticipation of litigation.
- During the internal investigation, consider carefully what information and documents may be characterized as fact (versus opinion) work product. As the SJC cautions, the line between the two is “not always clear” and, consequently, aspects of internal investigations, especially fact work product, may be discoverable.
Bret Cohen chairs the Labor & Employment and Trade Secrets & Employee Mobility Practice Groups at Nelson Mullins Riley & Scarborough LLP. His practice covers a wide range of areas, including the enforcement of non-compete and employment agreements, complex commercial and trade secrets litigation, and advice and counsel on termination and transition of high-level executives.
Jillian Hart is an associate in the Labor & Employment Group at Nelson Mullins Riley & Scarborough LLP. Jillian focuses her practice on employment and trade secrets litigation and also advises clients on a variety of employment matters, including restrictive covenants and wage and hour issues.
Matthew Brown is an associate in the Labor & Employment Group at Nelson Mullins Riley & Scarborough LLP. Matthew focuses his practice on trade secrets and non-compete litigation and advice and counsel on a variety of issues, including worker classification and employment agreements.
Fair Housing Enforcement in the Age of Digital Advertising: A Closer Look at Facebook’s Marketing Algorithms
Posted: February 19, 2020
The increasing use of social media platforms to advertise rental opportunities creates new challenges for fair housing enforcement. The Fair Housing Act, 42 U.S.C. §§ 3601-19 (“FHA”), makes it unlawful to discriminate in the sale or rental of housing on the basis of race, color, religion, sex, familial status, national origin, and disability (“protected classes”). The FHA also prohibits discriminatory advertising, including distributing advertisements in a way that denies people information about housing opportunities based on their membership in a protected class. Accordingly, advertisers and digital platforms that intentionally or unintentionally cause housing advertisements to be delivered to users based on their membership in a protected class may be liable for violating the FHA.
In March 2018, in response to what they perceived to be discriminatory advertising on Facebook, the National Fair Housing Alliance (“NFHA”) and several housing organizations filed suit in federal court in New York City. The lawsuit alleged that Facebook’s advertising platform enabled landlords and real estate brokers to prevent protected classes from receiving housing ads. Facebook settled the suit on March 19, 2019. As part of the settlement, Facebook agreed to make a number of changes to its advertising portal so that housing advertisers can no longer choose to target users based on protected characteristics such as age, sex, race, or zip code. Facebook also committed to allow experts to study its advertising platform for algorithmic bias. It remains to be seen whether this agreement goes far enough in curtailing discriminatory advertising practices, as Facebook is confronting further enforcement action from a government watchdog with respect to similar issues. Moreover, a recent research study found that Facebook’s digital advertising platform may still lead to discriminatory outcomes despite changes already made.
On August 13, 2018, the Assistant Secretary for Fair Housing and Equal Opportunity filed a complaint with the Department of Housing and Urban Development (“HUD”) alleging that Facebook violated the FHA. The Office of Fair Housing and Equal Opportunity determined in March 2019 (at the same time as the settlement agreement with NFHA) that reasonable cause existed and issued an official Charge against Facebook.
Notwithstanding these suits and administrative actions, the fact remains that, for fair housing claims against media giants like Facebook to survive in court, HUD and future plaintiffs must first successfully argue that Facebook is not protected by the Communications Decency Act (“CDA”).
Communications Decency Act
Congress enacted the CDA, in part, to prohibit obscene or indecent material from reaching children on the internet, and also to safeguard internet ingenuity. What was meant as a protective measure for the young, impressionable, and inventive, however, evolved into a powerful defense tool for web platforms like Facebook. Section 230 of the CDA immunizes providers of interactive computer services against liability arising from content created by third parties. To overcome the CDA hurdle, litigants have to demonstrate that Facebook “materially contributes” to the management of content on its platform. Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1163 (9th Cir. 2008). While many online service providers have successfully used Section 230 in their defense, the protections offered to internet service providers are not absolute.
The CDA contains requirements that restrict the application of Section 230. The language in Section 230 prevents a “provider or user of an interactive computer service” from being “treated as the publisher or speaker of any information” that is exclusively “provided by another information content provider.” The U.S. Court of Appeals for the Ninth Circuit concluded that publishing amounts to “reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content.” Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 (9th Cir. 2009). The idea is that website operators would no longer be liable for deciding to edit or remove offending third-party content.
Based on this reading, the law immunizes only “certain internet-based actors from certain kinds of lawsuits.” The statute, as discussed in Roommates.com, LLC, 521 F.3d at 1162, provides no protection to online content that was created by a website operator or developed – in whole or in part – by the website operator. Courts have reaffirmed the CDA’s limited scope: it protects self-policing service providers that act as “publishers” of third-party content, but it does not bar all categories of third-party claims (e.g., the civil rights claims at issue in this article). Barnes, 570 F.3d at 1105; accord Doe v. Internet Brands, Inc., 824 F.3d 846, 852-53 (9th Cir. 2016). These limitations are crucial. If a plaintiff can show that Facebook developed the content on its platform in whole or in part or, content aside, that Facebook produces discriminatory outcomes via mechanisms on its platform developed by Facebook, Facebook may fall outside Section 230 immunity.
Optimization Discrimination Study
In a recent study (the “Study”), researchers at Northeastern University documented Facebook’s control over ad dissemination, demonstrating how Facebook manages the distribution of information based on headlines, content, and images through a process called “optimization.” In short, the Study set out to determine how advertising platforms themselves play a role in creating discriminatory outcomes. The Study highlighted the mechanisms behind, and impact of, ad delivery, which is a process distinct from ad creation and targeting. For example, the Study found that ads containing musical content stereotypically associated with Black individuals were delivered to an audience that was over 85% Black, while ads containing musical content stereotypically associated with White individuals were delivered to an audience that was over 80% White. The researchers concluded that the “ad delivery process can significantly alter the audience the ad is delivered to compared to the one intended by the advertiser based on the content of the ad itself.” The Study also simulated marketing campaigns and found that Facebook’s algorithms “skewed [ad] delivery along racial and gender lines,” which are protected categories under the FHA. These results suggest that, even if a housing advertiser can no longer choose to explicitly target ads based on attributes like age, gender, and zip code, a housing advertiser could still use Facebook’s marketing platform to steer ads away from protected segments of users by manipulating the content of the ad itself. Moreover, the platform may cause such discriminatory outcomes regardless of whether the advertiser intended such results.
Case Law Interpreting CDA
The Study’s findings set the foundation for evaluating Facebook’s control over the manipulation of content and ad distribution on its platform. Two seminal cases, Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997), and Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008), outline tests to determine when online platforms are considered content managers versus content providers. The Study makes a strong case for why Facebook is a content manager, eliminating immunity under Section 230. Litigants can also persuasively distinguish their arguments against Facebook from a recent decision interpreting Section 230 liability. In Herrick v. Grindr, LLC, 306 F. Supp. 3d 579 (S.D.N.Y. 2018), the district court ruled in favor of Grindr (a same-sex dating application) on all but one of the plaintiff’s claims. The plaintiff had argued that Grindr failed to monitor and remove content created by the plaintiff’s ex-partner, and the court concluded that Section 230 barred several of his claims because they were “inextricably related” to Grindr’s role in editing or removing offending content (which is protected conduct under the CDA). Herrick, 306 F. Supp. 3d at 588. The Supreme Court denied Herrick’s petition for review on October 7, 2019.
A major distinguishing feature between the facts in Herrick and the Study’s findings against Facebook is how the two websites handle third-party content. In Herrick, the claim against Grindr was based on Grindr’s failure to remove content generated by a third party. The issue with Facebook lies in its use of optimization algorithms. The point is that discriminatory outcomes are ultimately a result of Facebook’s manipulation of ad delivery for the purpose of reaching certain groups to the exclusion of others in protected categories. Facebook’s tools go well beyond the function of “neutral assistance,” because its platform directs advertisements to sectors of people using discriminatory preferences created by Facebook, not third parties.
If it can be successfully argued that Facebook is not immune from suit under the CDA, housing advertisers and digital platforms that intentionally or unintentionally target ads to certain groups of users based on their membership in a protected class may be sued for violating the FHA. As described above, the Study determined that housing advertisers may still be able to use Facebook’s marketing platform to steer housing ads away from protected classes of tenants by manipulating the content of the ad. In such circumstances, the housing advertiser who uses the ad’s content as a covert method of discriminatory distribution may be violating the FHA. The digital platform may also be liable, either because it is actively involved in facilitating the selective distribution of ads or as an agent vicariously liable for the advertiser’s conduct.
Even if it cannot be shown that a housing advertiser intended to discriminate, if the ad delivery mechanism has the effect of distributing housing ads in a discriminatory way, the advertiser and platform may still be liable for violating the FHA under a theory of disparate impact. Disparate impact discrimination occurs when a neutral policy or practice has a discriminatory effect on members of a protected class. See Texas Dep’t of Hous. & Cmty. Affairs v. Inclusive Communities Project, Inc., 135 S. Ct. 2507, 2523 (2015); see also 24 C.F.R. § 100.500. A three-part burden-shifting framework is used to evaluate liability. Id. Protected class members have the initial burden of establishing that a practice has a disproportionate adverse effect on a protected class. To meet this initial burden, a plaintiff must “allege facts at the pleading stage or produce statistical evidence demonstrating a causal connection” between the policy and the disparate impact. Inclusive Communities, 135 S. Ct. at 2523.
If a protected class member makes out a prima facie claim of disparate impact, the burden then shifts to the accused party to show that the practice is necessary to achieve a valid interest. See Robert G. Schwemm & Calvin Bradford, Proving Disparate Impact in Fair Housing Cases After Inclusive Communities, 19 N.Y.U. J. Legis. & Pub. Pol’y 685, 696-97 (2016). The protected class members then have an opportunity to show that the interest could be achieved through less discriminatory means. Id.
In the digital advertising context, protected class members would have the initial burden of showing that they were denied equal access to information about a housing opportunity as a result of a housing advertiser’s marketing campaign.
While the Study was able to demonstrate the potential for “skewed” ad delivery based on protected characteristics, further research is needed to determine how a plaintiff might marshal statistical evidence to support a particular claim. As the Study notes, without access to a platform’s “data and mechanisms” it may be difficult to assess whether a particular advertising campaign has led to discriminatory outcomes. Therefore, it may be challenging for adversely affected users to develop the necessary data at the pleading stage to make out a prima facie claim of disparate impact. This might explain why HUD is continuing to pursue its legal challenge against Facebook despite the remedial measures Facebook has already agreed to undertake, including allowing research experts to study its advertising platform for algorithmic bias. In other words, HUD’s intent may be to better understand how Facebook’s ad delivery algorithm works now so that it can limit the algorithm’s discriminatory impact.
Because digital advertising companies play an active role in the ad delivery process, it follows that a discriminatory distribution of ads could be attributed to the platform. While there are few decisions addressing FHA liability and algorithmic decision-making programs, the court in Connecticut Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, 369 F. Supp. 3d 362 (D. Conn. 2019), found that plaintiffs had pled sufficient facts to establish a causal connection between a tenant screening company’s alleged activity and unlawful housing denials to support a claim of disparate impact based on race. Id. at 378-379. The court found that the defendant had created and provided the automated screening process, suggested the categories by which the housing provider could screen potential tenants, made eligibility determinations, and sent out letters to potential tenants notifying them of these decisions. Id.
Digital advertising companies similarly create the marketing platform for housing advertisers to use, provide criteria from which to choose the users, and design and maintain the algorithms that decide to whom the ads will be delivered. Therefore, a sufficient nexus should exist between the advertising platform’s activity and the selective distribution of ads to support a disparate impact claim.
Housing providers and digital advertising platforms arguably have a “valid interest” in being able to effectively market their housing services, and ad delivery algorithms are an efficient way to reach relevant users. However, given the abundance of print and online advertising options available to housing advertisers that do not rely solely on ad delivery algorithms, such as Craigslist, Zillow, Trulia, and Apartments.com, less discriminatory means exist by which housing advertisers can successfully market their services.
HUD recently proposed a new disparate impact rule that would raise the bar even higher for plaintiffs bringing disparate impact claims and provide housing advertisers with a defense if a digital advertising platform’s algorithmic model was the cause of a discriminatory outcome. A number of tenant advocacy groups and other stakeholders, such as Harvard Law School’s Cyberlaw Clinic, have submitted comments opposing the proposed rule, arguing, among other concerns, that it would perpetuate discrimination by “significantly reduc[ing] incentives for algorithm users and vendors to test their tools for bias” contrary to the purpose of the FHA.
The FHA was designed to provide all home-seekers who have the resources with equal access to housing stock and opportunity. It seems clear that online platforms in the business of designing and maintaining their algorithms have an impact on large segments of protected populations. The tension between the need for more information to combat discriminatory algorithms and platforms’ proprietary interests remains. However, one important way to move forward is to balance these interests within the bounds of the FHA, including by creating incentives for platforms to evaluate their ad delivery tools for distribution bias, thereby ensuring more inclusive participation in the housing market for all social media users.
Nadiyah J. Humber is the Assistant Clinical Professor of Law and Director of the Corporate Counsel, Government, and Prosecution Clinical Externship Programs at Roger Williams University School of Law (“RWU”). RWU students earn academic credit externing for in-house legal offices of corporations, offices of prosecution, and government agencies in Rhode Island and beyond. Professor Humber teaches related seminars for each program on the role of counsel in one-client entities and on professional development through practice.
James Matthews is a Clinical Fellow in Suffolk Law School’s Accelerator Practice and Housing Discrimination Testing Program (HDTP), where he supervises law students in housing discrimination, landlord-tenant, and other consumer protection matters related to housing. Attorney Matthews also has significant teaching and professional presenting experience. He helps conduct fair housing trainings and presentations as part of HDTP’s community education and outreach. He also teaches an upper-level landlord-tenant course he developed, which includes instruction on state and federal fair housing law.
Nat’l Fair Hous. Alliance et al. v. Facebook, Inc., No. 18 Civ. 2689, Complaint (detailing allegations), available at https://nationalfairhousing.org/wp-content/uploads/2018/03/NFHA-v.-Facebook.-Complaint-w-Exhibits-March-27-Final-pdf.pdf (last visited Jan. 13, 2020).
 National Fair Housing Alliance, Facebook Settlement, available at https://nationalfairhousing.org/facebook-settlement/ (last visited Jan. 20, 2020).
Assistant Sec’y of Fair Hous. & Equal Opportunity v. Facebook, Inc., No 01-18-0323-8, 1, Charge of Discrimination (detailing procedural history), available at https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf (last visited Dec. 8, 2019).
 47 U.S.C. § 230(c) (2019).
Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 7, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018). See generally 47 U.S.C. § 230(b) (detailing policy goals for freedom on the internet).
 Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 9, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018).
 47 U.S.C. §§ 230(c)(1), (f)(3) (2019).
 Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 4, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018).
Muhammad Ali et al., Discrimination through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes, available at https://www.ccs.neu.edu/home/amislove/publications/FacebookDelivery-CSCW.pdf (last visited Dec. 6, 2019).
 Id. at 7 (explaining optimization on Facebook).
 See Nadiyah J. Humber, In West Philadelphia Born and Raised or Moving to Bel-Air? Racial Steering as a Consequence of Using Race Data on Real Estate Websites, 17 Hastings Race & Poverty L.J. 129, 153-155 (2020) (analyzing pertinent case law precedent for Section 230 immunity). There is a difference between online services that manage content (content provider) on their sites versus those that act more as a store house of information (service provider). Id.
 47 U.S.C. § 230(c) (2019).
Bloomberg Law, available at https://news.bloomberglaw.com/tech-and-telecom-law/grindr-harassment-case-wont-get-supreme-court-review.
 47 U.S.C. § 230(c) (2019) (citing language from the act). Distinguishing O’Kroley v. Fastcase Inc., No. 3-13-0780, 2014 WL 2881526, at *1–2 (M.D. Tenn. June 25, 2014) (finding that providing search returns based on automated algorithms and user inputs does not constitute creating content).
Ali et al., supra note 9.
 See supra note 2.
 See HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42853 (proposed Aug. 19, 2019) (for example, providing a defense where a “(2) plaintiff alleges that the cause of a discriminatory effect is a model, such as a risk assessment algorithm, and the defendant . . . (ii) Shows that the challenged model is produced, maintained, or distributed by a recognized third party that determines industry standards, the inputs and methods within the model are not determined by the defendant, and the defendant is using the model as intended by the third party . . .”)
 See Cathy O’Neil, Comment Regarding Docket No. FR-6111-P-02, http://clinic.cyber.harvard.edu/files/2019/10/HUD-Rule-Comment-ONEIL-10-18-2019-FINAL.pdf (last visited Jan. 20, 2020).