Our topic of discussion for episode 41 of Case of the Week is whether public information on a social media platform, such as Facebook, that is deleted from public view but stored by the social media company is required to be produced. The case we analyze this week is In re Application Pursuant to 28 U.S.C. § 1782 of The Republic of the Gambia presided over by United States Magistrate Judge Faruqui.
Good morning and welcome to our Case of the Week for September 28, 2021. My name is Kelly Twigger. I am the CEO of eDiscovery Assistant and the principal at ESI Attorneys. Thanks so much for joining us this week.
Through our partnership with ACEDS, each week we choose a recent decision in electronic discovery that highlights key issues for litigators and those involved in the ediscovery process, and we talk about the practical implications of that decision for you, your private practice, and for your clients.
The link to the decision that we’re discussing today is available in the comments section on whatever platform you’re viewing us on. You can also view a link to our 2020 Case Law Report that we put out in conjunction with Doug Austin at eDiscovery Today.
We also invite you to pop over and take a look at our newly updated website and to sign up for the blog to be able to receive the weekly case law newsletter, including a link to this video broadcast as well as to the various blog posts and information that we provide over there.
This week’s case touches on a real-life controversy that is incredibly important, but it also provides some useful information for us as ediscovery practitioners and litigators on the scope of information that is available under 28 U.S.C. § 1782, as well as what is precluded, and whether deleted content from social media applications is precluded from production under the Stored Communications Act. Our decision comes from the case In re Application Pursuant to 28 U.S.C. § 1782 of The Republic of the Gambia. This is a decision from September 22, 2021. Very recent, issued just last week by United States Magistrate Judge Faruqui.
Judge Faruqui is relatively new to the bench, and this is the second decision he has issued on 28 U.S.C. § 1782 since he joined the bench in September of 2020. Both of those decisions are included in eDiscovery Assistant.
Many of the applications for content under 28 U.S.C. § 1782 come out of the D.C. Circuit and the D.C. bench, so this is a case from the District of the District of Columbia.
As you know, with eDiscovery Assistant, we apply issue tags to each of our cases to allow you to search them better and faster within the application. The issues this week include Stored Communications Act, 28 U.S.C. § 1782, Facebook, social media, technology assisted review, privacy, third party subpoena, and proportionality.
Let’s get into the facts of our case. If you are not familiar with it, the genocide of the Rohingya population in Myanmar, which took place from roughly 2010 through 2019, is the subject of this decision today. The Rohingya are an ethnic minority in Myanmar, and the Myanmar government refused to recognize the Rohingya’s citizenship within the country and considered them illegal immigrants. For a period of many years, beginning back in 2012, the Myanmar government engaged in a series of violent, horrible events designed to carry out the genocide of the entire Rohingya population within that country.
At issue in this case is information that was contained on the Facebook platform that furthered the Myanmar government’s ability to carry out that genocide. As we dig into the facts of this case, we find that essentially Myanmar was leveraging Facebook as the internal news source for the country to put out all kinds of false information that facilitated the genocide of the Rohingya population.
Within the context of the case, I want to make sure to note what the Court notes, which is that it is not its intent within the actual layout of this decision to vilify Facebook in any way, shape or form. To the contrary, it commends Facebook for maintaining the information that it did maintain. I think that’s important going in here.
Within the United States, we have a couple of different laws that come into play here. One is 28 U.S.C. § 1782, a statute that allows a foreign entity to come into the United States and seek information that is otherwise relevant to foreign proceedings. The second is the Stored Communications Act. The Stored Communications Act is one section of the Electronic Communications Privacy Act. It was enacted in 1986. Yes, 1986, a very, very long time ago, well before Al Gore invented the Internet.
Those are the two statutes that we’re dealing with today for purposes of understanding whether content that Facebook deleted from its platform — once it realized the scope of what the Myanmar government was doing to further the genocide of the Rohingya population — but still retained must be produced. The Gambia has now brought litigation against the Myanmar government and is seeking this information from Facebook to further its case.
Kind of diving into now what the Court is talking about within the case, the Court notes that by Facebook’s own admission, it was too slow to respond to the concerns that were raised about its role in the genocide of the Rohingya. In 2018, roughly six years into the genocide in Myanmar, Facebook began deleting accounts and other content from its platform that were used by the Myanmar government agents to really spark the genocide. Gambia seeks the content that Facebook deleted as part of its investigation for use in the Gambia’s litigation against the Republic of the Union of Myanmar at the International Court of Justice under 28 U.S.C. § 1782. Gambia seeks the records for, “evidence of genocidal intent necessary to support a finding of responsibility for genocide of the ‘Rohingya’.”
Facebook’s arguments in opposition to the subpoena are that The Gambia’s request 1) violates the Stored Communications Act and 2) is unduly burdensome. The Gambia filed the underlying suit in the International Court of Justice in November of 2019 and sought this application to the District of the District of Columbia just recently, a couple of months ago. This decision is essentially the Judge’s ruling on whether or not Gambia is entitled to the deleted information that Facebook has maintained from its investigation into the use of its platform to further the genocide of the Rohingya people.
There was an independent international fact-finding mission on Myanmar by the United Nations Human Rights Council, which found that the Rohingya were in a situation of severe, systemic and institutionalized oppression from birth to death due to state policies and practices implemented over decades in Myanmar. The mission went on to state that the Myanmar military and other security forces committed human rights violations on a colossal scale in violation of all basic tenets of international law. The operations had a devastating impact on the Rohingya civilian population, which was targeted, brutalized and terrorized.
In 2012, just to provide a little bit of context in terms of timing, there was pre-planned violence in the Rakhine state in Myanmar, in which homes of the Rohingya were looted and burned. There were murders that were carried out, summary executions and large scale displacement affecting both Rakhine and Muslims.
In October of 2016, the Myanmar military carried out so-called “clearance operations,” which constituted systematic mass executions, disappearances, detention and torture of Rohingya civilians and the destruction of homes, mosques and Qurans. The UN mission concluded that Myanmar intended to eradicate the Rohingya.
That’s really the background that sets us up for what Facebook’s role really was related to the genocide.
At the time, Facebook was the most common social media platform in use in Myanmar. It was the only platform online for news, and Myanmar officials used the platform to release news and information. Media outlets used Facebook to disseminate articles. Essentially, Facebook was the face of the Myanmar government to the Myanmar population.
According to the Court, undoubtedly, Facebook had, “a powerful democratizing effect in Myanmar by exposing millions of people to concepts like democracy and human rights.” The Court also noted that Facebook content, “contributed to shaping public opinion on the Rohingya and Muslims more generally” and was used to “spread anti-Muslim, anti-Rohingya and anti-activist sentiment.” Specifically, “organized groups made use of multiple fake accounts and news pages to spread hate speech, fake news, and misinformation for political gain.” The viral spread of information on Facebook led to instances of “communal justice, communal violence, and mob justice.”
The UN mission concluded that Myanmar officials weaponized Facebook for “a carefully crafted hate campaign to develop a negative perception of Muslims among the broad population in Myanmar. This hate campaign portrayed the Rohingya and other Muslims as an existential threat to Myanmar and to Buddhism. And in the case of the Rohingya, it went a step further and was accompanied by dehumanizing language and the branding of an entire community as illegal immigrants.”
In August of 2018—six years after this genocide started—Facebook acknowledged that the violence against the Rohingya in Myanmar was horrific and that it was too slow to prevent misinformation and hate. At that time, it deleted and banned accounts of key individuals and organizations in Myanmar, including the commander in chief of Myanmar’s armed forces and the military’s television network. It also deleted, “seemingly independent news and opinion pages that covertly pushed the messages of the Myanmar military” based on what Facebook deemed “coordinated inauthentic behavior” by Myanmar officials in violation of its terms of service.
That’s really important, because you see that they’re talking about public pages, public facing pages that are disseminating this information. These are not private accounts on Facebook. These are public facing pages that the government of Myanmar is using to distribute false information to its citizens.
In October and December of 2018, Facebook deleted an additional 438 pages, 17 groups and 160 Facebook and Instagram accounts for the same reasons. These accounts, when they were deleted, had almost 12 million followers. Now, Facebook did preserve the content that it deleted, and Gambia now seeks three things under 28 U.S.C. § 1782:
- The public and private communications associated with the deleted content.
- Documents associated with Facebook’s internal investigation and the information that was deleted.
- A Rule 30(b)(6) deposition from Facebook on its investigation and the information that was deleted.
What is the Court’s analysis here under 28 U.S.C. § 1782 and the Stored Communications Act?
28 U.S.C. § 1782 provides that “the district court of the district in which a person resides or is found may order him to give his testimony or statement or to produce a document or other thing for use in a proceeding in a foreign or international tribunal.”
Consideration of an application under 28 U.S.C. § 1782 requires a two-step inquiry—1) whether the court has the authority to grant the request and 2) whether it should exercise its discretion to do so. There are also statutory elements of Section 1782 that require 1) that the person resides or is found in the district, 2) that the discovery requested will be used in a proceeding before a foreign or international tribunal, and 3) that the request is made by an interested person. As long as those three mandatory factors are met, the courts have broad discretion in deciding whether to grant or deny these applications.
Now, there are four guidelines that the Court looks at in deciding whether it should exercise its discretion. First, whether the respondent is a participant in the international proceedings, which it is here because Gambia is leading the litigation. Second, whether the tribunal is resistant to using this kind of discovery. Third, whether the application circumvents the tribunal’s proof-gathering restrictions. And finally, whether the requested discovery is unduly intrusive or burdensome.
The Court also notes that district courts must exercise their discretion under § 1782 in light of the twin aims of the statute, which have been described as, “providing efficient means of assistance to participants in international litigation in our federal courts and encouraging foreign countries by example, to provide similar means of assistance to our courts.” Now, that really means we want to help out foreign courts when we can, and we want them to return the favor.
This is a situation that arises a lot. If you go into eDiscovery Assistant and use the 28 U.S.C. § 1782 tag, you’re going to see numerous cases across the country where foreign tribunals have sought information from US participants, US parties.
That’s 28 U.S.C. § 1782. Let’s look at the Stored Communications Act.
As I mentioned, this is one part of the Electronic Communications Privacy Act that was passed in 1986, which was a long, long time ago in technology terms in this country. The SCA is not a catch-all statute designed to protect the privacy of all stored Internet communications. As the Court notes here, “instead, it’s narrowly tailored to provide a set of Fourth Amendment like protections for computer networks.”
What are the parties talking about? Gambia argues that the Myanmar officials who created pages and provided information on Facebook’s platforms were not protected users under the SCA, which defines a user as “any person or entity who uses an ECS and is duly authorized by the provider of such service to engage in the uses of that ECS.” An ECS is an electronic communication service, meaning any service which provides users the ability to send or receive wire or electronic communications. Facebook is and has always been classified as an ECS under the Stored Communications Act.
The question really becomes: does the Stored Communications Act prevent Facebook from divulging communications that are deleted from the platform? Put in terms of the language of the statute, the parties both agreed that the question is whether the communications, because they are offline (i.e., taken off Facebook’s platforms but still preserved), are considered to be in backup storage under the Stored Communications Act.
That’s really the biggest issue because we’re not talking about live data. If you go and subpoena Facebook for a live profile of an active user on Facebook, that content is going to be subject to the Stored Communications Act, and you will not be able to get it from Facebook absent a warrant, a criminal warrant. This situation is completely different because we’re talking about content that was deleted from the platform and kept by Facebook. And the question is whether or not that information is a backup as the Stored Communications Act defines it. Essentially, the Court looks at the definition of backup related to deleted content.
Facebook wants to argue that in fact, this content, although kept, is a backup of the content that was removed from its platform. And Gambia says, “no, it’s not a backup. It’s just content that you took off the platform and you’re storing separately.”
The Court looks at some technical issues related to the deleted content, including that Facebook alone retains offline access to it and that any “archive of deleted messages that Facebook continues to maintain constitutes the only available record of these communications and cannot serve as a backup copy of communication stored elsewhere.” The Court says, this is not a backup of an original copy. This is it. This is the sole copy of the information that you have, and as such, it is not a backup as defined by the Stored Communications Act.
Facebook, of course, argues that it preserves the data as part of an autopsy into its role in the Rohingya genocide. And the Court says, “Well, that’s admirable and great that you kept it for your own self reflection, but it doesn’t mean that you kept it as a backup. It’s not a backup.”
The Court also looks to the case law, which states that ESI is stored for backup protection if it is a copy meant to prevent its destruction, and that, again, is not what happened here. The fact is that Facebook took the data down and banned the associated accounts and maintained a copy of all that information. What we don’t have is a backup. That means that the SCA does not apply here.
The next argument from Facebook is the privacy considerations for those who posted the information on Facebook. This is a pretty standard argument that you’ll find whenever information is sought from a social media platform. First, you’ll hear arguments on the Stored Communications Act, and second, you’ll hear that they’re trying to protect the privacy interest of the users.
Facebook says here that if the Court allows this information to be produced, its holding will have sweeping privacy implications, and that every time a service provider deactivates a user’s account for any reason, the contents of the user’s communications would become available for disclosure to anyone, including the U.S. Government. The Court says: “Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook’s sordid history of privacy scandals.” Essentially, the Court says, no, we’re not buying that argument in this particular case.
The way that users sign up for Facebook, juxtaposed against how users used the platform here for purposes of the genocide of the Rohingya, is completely different. The Court says a user who signs up for Facebook is subject to Facebook’s terms of service, and the failure to abide by those terms can result in Facebook deleting that account. If Facebook deletes the account, it is no longer subject to the Stored Communications Act. Here, the Court says, “this is not a situation where these officials created private accounts that were subject to their own privacy rights. These pages and these groups were created with the sole purpose of disseminating public information.” It’s a very different inquiry.
The Court says that the privacy concerns are minimal given the narrow category of content that Gambia is seeking. Again, it confirmed that the deleted content does not fall within the electronic storage protection of the Stored Communications Act.
The Court then looks at the exceptions to the SCA to determine whether there is anything else that might protect the information from disclosure. The Court really looked at a number of different cases—each of which is in eDiscovery Assistant—in doing this evaluation, but essentially looked at the California Supreme Court case in Facebook v. Superior Court and found that courts can compel production of communications that are excepted from SCA protections.
The Court then looks at the consent exception. The Stored Communications Act was designed to cover private, not public, posts. Because the Court found that much of the content here was posted publicly before Facebook removed it, it falls within an exception to the Stored Communications Act. One, the Court already found the deleted content is not subject to the SCA because it’s not a backup. Now, the Court is saying that we also have an exception to the Stored Communications Act in that this information was posted to be available publicly. It was not meant to be stored privately, and therefore it meets the consent exception of the SCA.
The consent analysis, according to the Court, turns on a fact-intensive inquiry as to “whether the posts have been configured by the user as being sufficiently restricted that they are not readily available to the general public.” Clearly, that is not the case here; all of these posts were made with the intent of their being consumed publicly. The content that’s sought falls within the consent exception of the SCA, and the provider protection is not applicable.
Now we’ve determined that the factors of 1782 have been met. We’ve talked about privacy. Now we’ve got to do the analysis of the Court’s discretion and whether or not the court will provide access to this information.
The Court says it’s uncontested by the parties that the factors of 28 U.S.C. § 1782 were met. Now it comes down to whether or not the requested discovery is unduly burdensome.
The Court looks at that and is essentially going to say no, it’s not. But here’s the analysis breakdown. The records sought by Gambia are a discrete and known universe of records. Gambia specifically identified the communications of 17 individuals, four entities and nine Facebook pages that it wanted. It is not asking Facebook to conduct any further searches for coordinated inauthentic behavior related to the Rohingya genocide, so the traditional discovery burden of searching for content does not apply here. The Court notes that, in fact, Facebook has already produced some of the records at issue.
The parties disagreed on the date range for the information. Gambia wanted information back to 2012, which was the start of the genocide. Facebook wanted to limit its production to 2016 forward. The Court looked at the data, the timing and the facts surrounding the genocide and said that the 2012 data is highly probative of the instigation of the genocide, and that Facebook had not demonstrated that responding would be very difficult for it. Instead, Facebook touted the responsiveness of its Myanmar team in identifying and searching its platform and finding this information. The Court says, “Well, you can’t have it both ways. You either have this team who’s provided all this information and you have it at the ready, or you’re going to have to continue searching.” The Court said, “we think it’s going to be easy for you to pull that information together.” The Court also found that Facebook could leverage technology assisted review to minimize the burden of reviewing and producing information as necessary.
That’s going to be a little bit harder as we know, because traditionally, ediscovery platforms don’t work as well with the type of content that’s available from social media platforms. That’s something I’m sure that Facebook has worked out.
The Court then turns, under its discretion, to look at relevance. That’s pretty much a no-brainer here. The records are highly relevant to the foreign proceeding, which balances out any incremental burden on Facebook.
The Court also states that there really is no alternative avenue of discovery here. All the information was on the Facebook platform, which was shown to be the primary source for disseminating public information in Myanmar around the genocide, and there is no other way for Gambia to conduct that discovery. Based on all of that, the Court decides to exercise its discretion. The Court exercises that discretion with regard to the content that was posted on Facebook and was removed, so that content is to be provided from 2012 forward.
The Court then turns to the requests for internal investigation records from Facebook, and the Court says that Facebook must produce anything that’s not privileged that relates to the internal investigation. The Court gives us a couple of longer quotes here that I think are really important for consideration of this issue:
The investigation records will illuminate how Facebook connected the seemingly unrelated inauthentic accounts to Myanmar government officials. Specifically, these records may show which accounts or pages were operated by the same officials or from the same government locations. Therefore, Facebook’s internal investigation data, to the extent it exists, may be even more significant to the Gambia’s ability to prove genocidal intent. Given that Facebook has already conducted its investigation, the additional burden of production is minimal. Moreover, the requested discovery could not have greater import in the ICJ litigation.
The Court is saying, look, the information you’re going to provide from the public pages that these officials were running, and the information that was being disseminated, is huge, of monumental importance. But what you pieced together at Facebook as a result of what you know was created is of even greater import to the Gambia for purposes of this litigation. And it is. It’s basically like saying we did all the underlying work for the case, and now we’re going to give it to the trial lawyer and let him put it on. That’s essentially what’s happening here. Facebook did a huge internal investigation into its role in this genocide and has all of that information available, and the Court says you have to provide it.
In conclusion, the Court really reconfirms that the Stored Communications Act has a well-established path for disclosure of deleted content, and that we don’t even need a rewrite of the Stored Communications Act to be able to find that here. The Court ends with this quote: “Facebook can act now. It took the first step by deleting the content that fueled a genocide. Yet it has stumbled at the next step, sharing the content. Failing to do so here would compound the tragedy that has befallen the Rohingya. A surgeon that excises the tumor does not merely throw it in the trash. She seeks a pathology report to identify the disease. Locking away the requested content would be throwing away the opportunity to understand how disinformation begat genocide of the Rohingya and would foreclose a reckoning at the ICJ.”
What are our takeaways from this case? I’ve advocated for years, both in presentations and in writing, that we are desperate for Congress to rewrite the ECPA, including the SCA, to bring it in line with where today’s technology lives, the environment in which we live, the Internet, and all things associated with it, including social media platforms. We obviously have issues where we need to draft legislation to protect consumer privacy. We have issues about the intersection of the First Amendment and private company platforms like Facebook, Twitter, LinkedIn, and how those need to be handled from a consumer privacy perspective, as well as gaining access to information.
Here in this case, those issues that I raise are not even on the table. The sole issue in this case was whether deleted content from a platform that still exists is subject to the protections of the Stored Communications Act, and the Court answered that very definitively for the first time: the answer is no. Now, that is completely different than if you are seeking content from a social media provider for a user that is active on the platform. I mentioned a little bit earlier that active content is still going to be subject to the protections of the Stored Communications Act. You need to know and understand that difference, and also that content, once deleted from those platforms, may be lost forever unless the platform has a reason to preserve it.
That’s our Case of the Week for this week. Thanks so much for joining me. I’ll be back next week with another edition of our Case of the Week from eDiscovery Assistant.
If you are an ACEDS member and interested in using eDiscovery Assistant, there is a discount available to current ACEDS members and a trial for folks taking the ACEDS exam. If you’re interested in either of those, you can reach out to us at email@example.com. One of our team will be in touch.
If you’re interested in doing a free trial of our case law and resource database, please sign up for a trial in the upper right hand corner. If you’d like to do something broader for your organization or for multiple seats, please reach out to us at firstname.lastname@example.org.
Thanks so much. Have a great week. Stay safe and healthy and I’ll see you next week.
You’ll be able to read episode summaries and watch the recorded show right here on the eDiscovery Assistant blog or access the recorded versions from the ACEDS social media channels.