On May 29 and May 30, Facebook deactivated the Facebook accounts of scores of people in Tunisia or Tunisians abroad, including prominent bloggers and journalists who were previously censored by state bodies before and during the 2011 uprising, as well as artists and businesspeople who depend on the platform for their jobs.
No public explanation was given at the time by Facebook, yet one week later, on June 5, the company published a note explaining that it deleted 182 accounts of people it claimed were involved in “coordinated inauthentic behavior” – a phrase that appears to mean a propaganda campaign using its platform. Facebook published this note on the same day that the Atlantic Council’s Digital Forensic Research Lab (DFRLab) published a nine-month investigation into a Tunisian propaganda network using Facebook. Neither report clarified how these 182 accounts were connected to the political propaganda detailed in the DFRLab report or what methods Facebook used to determine such links.
Between May 29 and June 5, activists concerned with transparency and digital rights had reached out to Facebook to inquire about what happened, why accounts were deactivated, and in some cases to lodge appeals and lobby Facebook to recover some of the deleted accounts. These efforts succeeded in convincing the company to reverse the deactivation of several accounts; however, some activists appeared to reassess their approach to the deactivated accounts when Facebook declared—without providing detailed evidence to the public—that a group of unspecified, deactivated accounts had been related to a propaganda campaign linked to the 2019 elections.
Meshkal spoke with several people whose accounts were deactivated who claim they have no connection to a propaganda campaign. Some of them had their accounts quickly restored after institutions or individuals with connections to Facebook intervened on their behalf. Yet many of those who did not benefit from these interventions still have their accounts blocked.
In order to document and tell this story clearly, Meshkal has divided its reporting into the following sections:
- Bloggers’, Journalists’ Accounts Deactivated
- Artists’, Businesspeople’s Accounts Deactivated
- Facebook Responds
- Civil Society Groups Step In
- The War Over Content Moderation
Also read Meshkal’s previous reporting on Facebook’s role in Tunisian politics from September 2019 here.
Bloggers’, Journalists’ Accounts Deactivated
“I thought I was hacked,” Sarah Ben Hamadi said about the moment she discovered her Facebook account had been deactivated the night of Friday, May 29.
“My Facebook account is very personal. It’s limited to friends. I don’t talk about politics on Facebook…Facebook for me is always a personal account,” Ben Hamadi told Meshkal in a phone interview.
Ben Hamadi posted a screenshot of the message she had received from Facebook on her Twitter account – where, unlike on her Facebook account, she does post about politics and has over 130,000 followers. That message from Facebook not only informed her that her account had been deactivated, but it also stated that the “decision is final,” and that “unfortunately, for security reasons, we cannot provide more information on the reasons behind the deactivation of your account.”
“The message that we have received this time was a first: for Facebook to say that we no longer have the right to use our accounts and that the decision was final,” Haythem El Mekki, a prominent pundit, blogger, journalist and long-time radio host of one of the most popular national radio shows on current affairs, Mosaique FM’s Midi Show, told Meshkal. Mekki’s Facebook account and his professional page were deactivated and he received the same message from Facebook as Ben Hamadi.
“Usually, when we are subjected to massive reporting campaigns, we are asked for one of two things: either an identity verification, or they tell us that we violated Facebook’s community standards and that our accounts are suspended, with the ability to request a review,” Mekki said.
In January 2011 during the uprising against the governing regime during the time of former President Zine el Abidine Ben Ali, Mekki’s Facebook account was reportedly hacked and deleted, with some evidence that this was done by state authorities.
“I think that Facebook is more relevant than ever in Tunisia today. It is a principal source of information for many Tunisian citizens, and it is a very perverse source of information as it could be easily manipulated,” Mekki told Meshkal. “Facebook has become really key for [getting] information in Tunisia, and it is essential to do everything possible to counter its fake news, propaganda, and manipulation.”
As for Ben Hamadi, she prefers using Twitter over Facebook because of a lack of trust in the latter.
“I don’t trust Facebook, all the security things, issues that everyone knows about it – sharing of personal data etc. That’s why I never used it as much as Twitter,” Ben Hamadi said.
Arabasta is the pseudonym of a blogger who blogged under this name to avoid repression during the pre-2011 Ben Ali era. His account was also deactivated in the recent sweep of deactivations, on either Friday May 29 or Saturday May 30, although he can’t recall which day exactly. Arabasta attempted to ask a contact at Facebook about his account deactivation.
“The communication I had [directly] with Facebook didn’t lead to anything,” Arabasta told Meshkal.
While Arabasta also received an initial account deactivation message from Facebook indicating it was “final,” friends of his did manage to lodge appeals with other Facebook employees. As with Mekki and Ben Hamadi, these appeals appear to have worked, and Arabasta’s account was reactivated by Sunday, May 31. Pundit Haythem El Mekki’s account was also restored very quickly after the intervention of civil society groups like IWatch, but his official page – which had about 70,000 followers – remained deleted at the time this article was published.
French journalist Benoit Delmas, who has been based in Tunisia for ten years, also had his account deactivated on Saturday May 30. Delmas wrote an article for Le Point on May 31 exploring the issue. He told Meshkal that when he and his paper reached out to directors at Facebook’s offices in France for comment on the story, his account was restored within “five to seven minutes.”
Asked in an interview with Meshkal on June 2 if Facebook had offered any reaction or response to his article, Delmas said the response had up to that point been “silence. Total silence.”
According to one activist working to restore some of these deactivated accounts, at least two other journalists were affected by the deactivations including one with DW+ and another with Express FM.
Artists’, Businesspeople’s Accounts Deactivated
While the bloggers and journalists whose accounts were deactivated managed to have their accounts reactivated quickly, others with less media influence have not had their accounts recovered.
One is Hamdi Toukebri, a DJ who also goes by the stage name DJ Hamdi Ryder. Not only was Toukebri’s personal account deactivated at the same time as those of everyone else affected, but he said a professional page he manages called “Downtown Vibes,” and a fan page of his were also deactivated. Toukebri told Meshkal that he had about 4,000 friends on his personal account, 14,000 followers for Downtown Vibes, which represents an artist collective, and 2,000 followers on his fan page.
“I use these pages for work to share any music events I have,” Toukebri said. “Like all people, Facebook is a very important tool to use for work. Even if you have a website, you will still need Facebook.”
Toukebri says that as a result of the deactivations, he and five other musical artists he works with are struggling professionally. One of these friends is a Tunisian but not based in Tunisia, raising yet another question of who was targeted in the May 29 – May 30 account deactivations.
“There were other artists already facing crazy times with the covid-19 for like three months now. And now, when things are going back to normal, we don’t have any pages,” he told Meshkal.
Toukebri reached out to IWatch, which had been collecting the names of people whose accounts had been deactivated, inquiring with Facebook about why they were deleted, and lobbying to have them restored. IWatch had initially been using its connection to Facebook through its status as what Facebook calls a “Trusted Partner,” a partnership that began in January 2020 according to IWatch Projects Coordinator Henda Fellah. However, after the Atlantic Council’s Digital Forensic Research Lab (DFRLab) published its June 5 report on the 2019 election propaganda campaign conducted via Facebook by a company called UReputation, and after Facebook linked its account deactivations to the activity covered in that report in a separate June 5 publication, IWatch appears to have adjusted its approach to the deactivated accounts.
“I got in contact with guys with IWatch. At the beginning they asked me to send URLs of accounts and pages. Then after they saw the article about the UReputation agency, then they decided they don’t want to work on this kind of story anymore,” Toukebri told Meshkal. “IWatch is not responding to me anymore. In the beginning they were helpful, they got in touch with me, and now I saw a post from the president. He said they are not anymore involved in this story. We don’t have anyone.”
Toukebri is likely referring to either a June 5 Facebook post by IWatch president Achref Aouadi or his June 6 post. In the June 5 post, Aouadi states that “we as an organization, we’re not proud to recover the accounts of people who participated in deceiving the will of the people.” The DFRLab report indicated that UReputation had been helping presidential candidate Nabil Karoui, who in the past has used his media network to defame IWatch and harass the family members of IWatch’s leadership.
Toukebri told Meshkal that he has friends of friends who apparently worked for UReputation, but that he personally has no relation to it. He also said that he knows of some people who worked for UReputation whose accounts were deactivated but later restored – something that IWatch’s Aouadi, in a June 8 TV debate, said was not the case for the accounts IWatch had asked Facebook to review.
Another person affected by the account deactivations is Steve Abdelatif, a businessperson who runs a popular restaurant in Tunis. He and three other managers of that restaurant’s Facebook page all had their accounts deactivated at the same time as other people. Unlike some of the other accounts affected, Abdelatif did not receive the same message that his account deactivation was “final.” Abdelatif then tried to appeal the deactivation using Facebook’s online tools, but received an email from Facebook stating:
“…something you posted doesn’t follow the Facebook Community Standards. Unfortunately we can’t provide additional information about why your account was disabled, and we won’t be able to reactivate your account.”
“I rarely post. I got over Facebook when it became too virtual for me,” Abdelatif told Meshkal. “It became a little oversaturated,” he said, explaining that he uses the platform mainly when he needs to find contacts.
Of the four people who are page administrators for his restaurant, Abdelatif said that one of them got their account restored quickly because they were close to people in IWatch.
Asked why he thought his account had been deactivated, Abdelatif told Meshkal on Sunday May 31 that it was likely a mistake.
“I’m sure that they’re gonna open it up. I must be collateral damage because I really don’t use that shit,” Abdelatif said. “I accepted a long time ago that I live in a surveillance state.”
The following day, Abdelatif’s account was restored. He had contacted IWatch the previous night but was unsure whether it was their intervention that got him his account restored since it was restored only half an hour after he spoke with them.
Many other ordinary people also had their accounts deactivated, and most of them still have not had their accounts restored, according to activists.
On June 4, in response to questions from Meshkal, Facebook’s Middle East and North Africa communications team based in Dubai provided a statement by email that they said could only be attributed to a “Facebook company spokesperson.”
“Due to a technical error we recently removed a small number of profiles, which have now been restored. We were not trying to limit anyone’s ability to post or express themselves, and apologize for any inconvenience this has caused,” the statement read.
Facebook declined to specify how many accounts were restored and declined to provide more details on the accounts that were not restored. It also declined to explain what criteria were used to determine which accounts to restore and which ones not to restore. However, a representative from Facebook’s Dubai office indicated that accounts that were not restored were linked to Facebook’s work to limit “coordinated inauthentic behavior” – essentially a propaganda campaign using Facebook.
On June 5, Facebook published more details about this in its “May 2020 Coordinated Inauthentic Behavior Report,” noting it “had removed 446 Pages, 182 Facebook accounts, 96 Groups, 60 events and 209 Instagram accounts. This activity originated in Tunisia and focused on Francophone countries in Sub-Saharan Africa.”
The Facebook report continued:
“This network used fake accounts to masquerade as locals in countries they targeted, post and like their own content, drive people to off-platform sites, and manage Groups and Pages posing as independent news entities. Some Pages engaged in deceptive audience building tactics changing their focus from non-political to political themes including substantial name and admin changes over time. We found this network as part of our internal investigation which linked this activity to a Tunisia-based PR firm Ureputation.”
Facebook did not clarify in this statement what criteria were used to determine accounts’ and pages’ relationship to UReputation. The statement also appears to contradict Facebook’s statement to Meshkal indicating that the deletions of some accounts were a “technical error.”
However, Haythem El Mekki, the journalist, gave an explanation that Facebook itself did not provide. Mekki stated in a June 8 TV debate program that he and others who had their accounts deactivated but who are not related to UReputation had their accounts deactivated because “Facebook considered that each person who is friends with someone who worked for the company, were administrators with them on a page, or was a member of a group with them, connected once with them – they are considered a part of the company.”
Mekki did not clarify how he knew this, and asked by Meshkal later how he knew this he said he had, in part, pieced it together himself and in part depended on a source whom he would not reveal.
“In the absence of any direct explanation from Facebook, we can only speculate that these accounts may in fact be collateral damage for the Operation Carthage clean-up, since Facebook’s report on its investigation makes clear that the company routinely uses ‘automated systems’ to detect and disable ‘fake accounts,’” wrote Rima Sghaier and Marwa Fatafta, two digital rights activists, in a blog post for Access Now, where Fatafta works as Middle East and North Africa Policy Manager. (The term “Operation Carthage” is not used in Facebook’s June 5 report, but it was used to describe the work UReputation was doing in another June 5 report by the Atlantic Council’s DFRLab.)
However, despite Facebook’s report that UReputation had engaged in “coordinated inauthentic behavior,” Facebook restored some of the accounts linked to UReputation that it had initially deactivated, according to one activist working on restoring accounts and one person whose account was deactivated.
Facebook’s June 5 report on a propaganda campaign in Tunisia was released on the same day that the Atlantic Council’s Digital Forensic Research Lab (DFRLab) published a nine-month investigation into a Tunisian propaganda network using Facebook. The report documented a campaign it called “Operation Carthage,” run by the company UReputation, that appeared to try to influence the 2019 elections in Tunisia in favor of presidential candidate and media mogul Nabil Karoui. A similar campaign by the Israel-based Archimedes Group, which also appeared to benefit Karoui, had been taken down by Facebook in May 2019. The DFRLab report also claims that the same company was involved in trying to shape and influence elections in several other countries in Africa.
Meshkal reached out to the lead author of this report, Andy Carvin, a senior fellow at the Atlantic Council, to ask whether Facebook’s deletion of accounts in Tunisia may have been an effort to get ahead of the publication of DFRLab’s report and ahead of potential public concern over election propaganda and manipulation. However, in an email response, Carvin explained that DFRLab had coordinated their “open-source investigation with [Facebook’s] internal one” which is why Facebook and DFRLab “announced [their] findings simultaneously.”
Carvin also commented on the recent deactivation of some accounts in Tunisia that he believes are unrelated to what he refers to as the “Operation Carthage data set.”
“Given how these innocent accounts came down at the exact moment as the accounts that were among the Operation Carthage data set, yet were not listed as part of the data set, I’m almost certain it was an accident,” Carvin wrote to Meshkal.
Meshkal asked to see the data set, but Carvin responded that the data set was “proprietary info” shared by Facebook and that he would need Facebook’s permission to share it.
The DFRLab announced a partnership with Facebook on May 17, 2018, with President and CEO of the Atlantic Council Frederick Kempe announcing the partnership as aiming “to support the world’s largest community in their effort to strengthen democracy.” The Atlantic Council’s website lists Facebook Inc. as one of only five donors who gave over $1,000,000 to the Atlantic Council in 2018.
Asked by Meshkal about the nature of Facebook’s cooperation or coordination with the Tunisian government, Facebook also declined to provide an answer. However, Meshkal spoke with someone who works with the Tunisian government who said the Tunisian government is currently coordinating with Facebook on communication about Covid-19. Meshkal also reported in September 2019 that activists believed Facebook was working with the Tunisian state’s High Independent Election Authority (ISIE by its French initials). At that time, Facebook also did not respond to Meshkal’s questions about the extent of its cooperation with the Tunisian government, and ISIE also did not respond to Meshkal’s requests for comment.
Facebook’s responsiveness to inquiries and appeals appears to differ depending on the employee and the section within Facebook. Meshkal and people affected by the account deactivations reached out to Khaled Koubaa, a Facebook employee in Tunisia responsible for “Public Policy North Africa,” but inquiries made to Koubaa received no response or were dismissed, according to people Meshkal spoke with. However, activists told Meshkal that when they reached out to Facebook employees in California and France, they received better responses. Meanwhile, Meshkal received an official response to one of its questions from Facebook’s Dubai office within two days.
Civil Society Groups Step In
Between May 29 and June 5, activists concerned with transparency and digital rights had been lobbying Facebook to recover some of the deleted accounts and had succeeded in convincing the company to reverse the deactivation of several accounts.
Many of the activists who got involved did so as individuals. However some groups involved in advocating for rights got involved including IWatch, Access Now, Digital Citizenship, and SMEX.
Rima Sghaier, a researcher and digital rights activist, created an online platform allowing people whose accounts were deactivated to submit their information to a database that activists could then use to ask Facebook why the accounts were deactivated and potentially appeal the decisions. Sghaier was quick to clarify that several other activists had already been doing similar work to collect names and lobby Facebook, and several of them have been coordinating their efforts collectively.
Henda Fellah, Projects Coordinator at IWatch, was one of the other activists who began coordinating to try and restore accounts. Fellah told Meshkal that IWatch has been a “Trusted Partner” with Facebook since January 2020 and decided to use this position to make inquiries about those accounts that were deactivated.
Asked why IWatch got involved, Fellah explained that they were concerned that the deactivations happened without explanation, and it could set a precedent for future deletions.
“If it happens and we don’t talk about it and we don’t look into this issue, maybe it happens for us next,” Fellah told Meshkal. “Everyone can be subject to getting our accounts disappeared without any explanation. So it’s more about transparency.”
“If there are no reasons for deletion, it seems to me these people should have their Facebook accounts restored,” Achref Aouadi, president of IWatch said in a June 8 TV debate program. “As an organization, IWatch sent 56 requests…28 accounts were restored in Tunisia, 22 of them were sent by IWatch,” he continued.
Aouadi went on to explain that after the June 5 Facebook and Atlantic Council DFRLab reports, they looked back over the accounts they had asked to be restored and found that 13 of the requests they had lodged were for people who had worked for the company UReputation.
“The good thing is that of these 13 people, zero of their accounts were restored,” Aouadi said.
Fellah explained that with regards to IWatch’s partnership with Facebook, “Facebook reached out to us. They chose us. We didn’t ask for it,” but that now that they have the partnership they are trying to use it “for the good of Tunisia,” especially as IWatch expands its online fact-checking efforts.
The so-called “Trusted Partner Channel” allows Facebook to receive content moderation reports from organizations it has partnered with. Fellah said that this was useful when a contact notified her of a video documenting child abuse that was posted on Facebook. The contact who had reached out to Fellah said she had already flagged the video to Facebook, asking for it to be taken down; however, her request had been rejected. In contrast, when Fellah flagged the video through the Trusted Partner Channel, it was taken down seven days later.
Another organization that asked Facebook to review deleted accounts was Access Now. According to Marwa Fatafta, Middle East and North Africa Policy Manager at Access Now, the lack of transparency surrounding how Facebook deactivated accounts is concerning.
“Access Now is worried about their digital rights, right to access the internet, the right to freedom of expression regardless if they are journalists or not,” Fatafta told Meshkal.
Fatafta also noted that a similarly worrying pattern recently occurred in Syria, where Facebook accounts were deactivated with little public information about how or why.
“It appears to be that Facebook has disabled accounts that belonged to Syria activists,” Fatafta told Meshkal. “One of the biggest problems around this is lack of transparency. The decision was made in an opaque and arbitrary manner, and Facebook did not allow users to appeal …it is extremely worrying. Many of these accounts belong to human rights activists and include documentation of human rights violations.”
Fatafta also noted that some users in Tunisia reported that they couldn’t verify their accounts via Facebook’s support page.
The War Over Content Moderation
Tunisia and Syria may not be the only countries affected by mass deactivations with little transparency or accountability provided by Facebook. In one recent tweet, Jillian C. York of the Electronic Frontier Foundation noted other recent problematic cases of what she called “content moderation…at scale”.
There appears to be a tug of war on how Facebook should operate its content moderation, with one side calling for stricter policing of so-called “fake news,” “inauthentic behavior,” and “election interference,” and another side calling for more circumspection so as not to trample digital rights.
Some state authorities are in the camp calling for Facebook to more strictly police content. This has taken on a geopolitical and military valence as U.S. politicians and some U.S. institutions have asserted varying degrees of Russian interference in the 2016 U.S. elections, including allegations by some U.S. Senators of a disinformation campaign on social media. In 2019, NATO issued a report focused on what it called Manipulation Service Providers based in Russia which provide social media influence through “inauthentic activity” for cheap fees. The NATO report noted that “the openness of the industry was striking,” and called for stronger oversight and regulation of social media platforms.
These calls echo moves by some politicians in Tunisia to more vigorously and directly regulate speech on social media, including a draft law presented to parliament by the Tahya Tounes party in late March which introduced strict criminal penalties for “electronic libel.” That initiative, ostensibly meant to fight “fake news,” was condemned by activists, journalists and politicians as threatening freedom of expression, and it was subsequently withdrawn. However, Tahya Tounes has taken up the initiative again in light of the new DFRLab report on UReputation’s propaganda operations, citing the report as evidence that stronger measures are needed to police social media.
On Wednesday, June 10, Tahya Tounes held a press conference where it called on the public prosecutor to examine the case. The state news agency TAP reported that Tahya Tounes made the argument that it was calling for stricter policing in order “to ensure voters’ free will to express themselves and protect Tunisia’s democratic process.”
On the other side, many digital rights groups have stressed the need for social media companies, and especially Facebook, to be more transparent and accountable in how they moderate content.
In 2018, over 100 institutions working on digital rights and transparency, including international groups, signed an open letter calling on Facebook to be more transparent and accountable in its content moderation. The letter began by noting that a museum, a politician, and a journalist had all been victims of “a misapplication of Facebook’s Community Standards.” At the time, Facebook responded insisting that it already was “in line” with many of the letter’s recommendations. Yet many digital rights organizations remain concerned that Facebook is not sufficiently transparent in its policies.
“Part of the lack of transparency is also what Facebook defines as hate speech, terrorist content. Content moderation policies are 100 percent at the discretion of Facebook. What we’ve been calling for is that these policies should be in line with the international human rights framework,” Fatafta of Access Now told Meshkal. Access Now is a signatory to the 2018 open letter to Facebook.
Facebook’s community standards contain a broad definition of terrorism that includes “any non-state actor that engages in, advocates, or lends substantial support to purposive and planned acts of violence.” This policy is relevant to issues relating to Israel and Palestine, where there is an internationally recognized state on one side and several actors on the other side considered “non-state actors.”
Wajdi Mselmi, who is a journalist with the Tunisia-based Arabic language news website Inhiyez, told Meshkal that both his personal accounts and Inhiyez’s official Facebook page have been blocked in the past due to their editorial line supporting armed Palestinian groups. Mselmi said he was first blocked for 48 hours after a post in September 2019 which referenced the downing of an Israeli drone by a Palestinian armed group.
After that, Mselmi said that every time he wrote a post that even mentioned key words or phrases like “Palestinian resistance,” or the names of armed groups, his account was blocked by Facebook. Mselmi told Meshkal he received a message that he had been blocked due to “incitement to chaos and creating propaganda for terrorist groups.” He has since had his account restored after completing an identity verification check.
The scrutiny from Facebook has also affected the website Inhiyez, which Mselmi said is blocked from using Facebook Live.
“We were banned on various occasions for even the simplest journalism report we made about resistance movement in Palestine,” Mselmi told Meshkal.
Mselmi said he has seen other Arabic-language media organizations that take an editorial line in support of what they see as Palestinian resistance also blocked and banned by Facebook. He told Meshkal that he believes that Facebook’s content moderation policy follows official U.S. policies.
“Obviously it’s an American company and it follows in the general sense U.S.-centric policies,” Fatafta of Access Now noted of Facebook, but she also clarified that national governments and local laws do have effects on policy in different countries.
With regard to Israel and Palestine, Fatafta noted that Israel’s Ministry of Justice has a special cyber unit dedicated to sending requests to social media companies to take down what it considers incitement to violence. “Some content that has been removed from Israel-Palestine is a result of pressure by Israel’s government.”
Others noted that when it comes to North Africa and West Asia, Facebook is less attentive to content moderation challenges.
“What is frustrating for us, the NGOs, is that we think that Facebook is not giving importance to these issues in MENA [Middle East and North Africa] region,” Henda Fellah of IWatch told Meshkal, stressing in particular that Facebook could devote more resources to its Ad Library, a tool that allows the public to trace funding of content. Currently, only limited Facebook content is covered by the Ad Library.
Fellah pointed out that when IWatch was monitoring social media during the 2019 elections, they had a difficult time trying to follow up with Facebook to investigate who is behind or funding political communication.
Some of those whose accounts were deleted were sympathetic to the challenges Facebook faces when trying to moderate content.
“Facebook has more than one billion people on it. They have an obligation to make their platform more and more efficient and help the spread of right information. They are doing it: they are trying to help official accounts appear on timelines. It still has improvement to do,” Arabasta told Meshkal.
Others continue to try and hold Facebook accountable.
“The company has fallen short of its commitments and attempts at improving transparency and accountability in the MENA region,” Fatafta and Sghaier concluded in their blog post.