Facebook, Instagram almost banned from App Store in 2019 over human trafficking content

According to internal documents, the company has known since at least 2018 that human traffickers were using its platforms in this way. The problem grew so severe that in 2019, Apple threatened to pull Facebook's and Instagram's access to the App Store, a platform the social media giant relies on to reach hundreds of millions of users each year. Internally, Facebook employees rushed to remove problematic content and make emergency policy changes to avoid what they called a "potentially serious" consequence for the business. Facebook was able to reassure Apple at the time and avoid removal from the App Store, but the underlying issues remain.

The stakes are high. Facebook documents describe women trafficked in this way being subjected to physical and sexual abuse, deprived of food and pay, and having their travel documents confiscated to prevent them from fleeing. An internal Facebook report dated earlier this year stated that "gaps still exist in our detection of on-platform entities engaged in domestic servitude" and described how the company's platforms are used to recruit and buy "domestic slaves," according to Facebook documents.

Using search terms from Facebook's internal research, SME found active Instagram accounts offering domestic workers for sale, similar to those that Facebook researchers had flagged. After SME inquired about them, Facebook removed the accounts and posts. Facebook spokesperson Andy Stone confirmed that they violated the company's policies.

"We prohibit human exploitation in no uncertain terms," Stone said. "We've been fighting human trafficking since the inception of our platform. Our goal is to keep anyone who exploits others from having a place on our platform."

SME has reviewed internal Facebook documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee turned whistleblower Frances Haugen. A consortium of 17 US news organisations, including SME, obtained the redacted versions. The documents include information about human trafficking on Facebook's apps, and they also give deep insight into the company's approach to misinformation, its hate speech moderation, internal research on its news feed algorithm and communications related to the Capitol riot.

The Apple threat, first reported last month by The Wall Street Journal, shows the potential consequences of Facebook's ongoing difficulties in moderating problematic content, especially in non-English-speaking countries. In a complaint to the SEC, representatives for Haugen cited Facebook's failure to stop human trafficking on its platforms and its near loss of access to the Apple App Store. The revelation comes as tensions between Apple and Facebook have risen in recent months over privacy concerns.

Stone directed SME to a letter Facebook sent last summer to several United Nations representatives, addressing human trafficking and the company's efforts to stop it. The letter notes that domestic servitude content is "rarely" reported to Facebook by users.

"To counter these difficulties… we have also created technology that can proactively locate and take action upon content related to domestic servitude," Facebook wrote. "We have been able to detect and remove over 4,000 pieces of illegal organic content in Arabic, English and French since January 2020."

Facebook has sought to discredit earlier reporting by The Wall Street Journal, as well as Haugen's testimony before a Senate subcommittee earlier this month. In a tweet last week, ahead of the publication of "The Facebook Papers," John Pinette, Vice President of Communications at Facebook, said that a curated selection of millions of documents from Facebook cannot be used to draw fair conclusions.

Facebook’s business faces a ‘severe’ threat

In the fall of 2019, the BBC approached Facebook about an investigation it was soon to publish into an illegal online marketplace for domestic workers, which operated in part through Instagram, and shared the hashtags it had used to locate the content, according to an internal Facebook report. The report states that while Facebook removed 703 Instagram accounts promoting domestic servitude, other domestic servitude content remained on the platform "due to the underreporting and absence of proactive detection."

Apple contacted Facebook in October 2019, following the publication of the BBC investigation, and threatened to remove Facebook's and Instagram's apps from the App Store for hosting content that facilitates human slavery. In a November 2019 internal document titled "Apple Escalation on Domestic Servitude — How we made this," a Facebook employee described the actions the company took over the course of a week to mitigate the threat. These included taking action against more than 130,000 pieces of Arabic-language domestic servitude content on Facebook and Instagram, expanding its policy against domestic servitude, and launching proactive detection tools in Arabic and English.

"Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB," the document states. "To mitigate this risk, we formed a large working group that worked around the clock to create and implement our response strategy."

Despite the chaos of that week, Facebook was well aware that such content existed on its platforms before the BBC reached out. "Was this issue known by Facebook prior to the BBC inquiry and Apple escalation?" According to the internal report, the answer was "Yes."

Internal documents show that Facebook employees in the Middle East and North Africa region flagged reports about Instagram profiles dedicated to selling domestic laborers in March 2018. These reports were not followed up on because the company's policies at the time did not acknowledge the violation, according to an internal report on domestic servitude content from September 2019.

Stone, the Facebook spokesperson, said the company did have a policy against such human exploitation abuses at the time. "We have had such policies for a long time," he said, adding that the policy was strengthened after that point.

Internal Facebook documents show that the company introduced an expanded "Human Exploitation" policy on May 29, 2019, which included a ban on domestic servitude content related to recruitment, facilitation and exploitation.

In September 2019, a Facebook employee posted a summary of an investigation into a transnational human trafficking network that used Facebook's apps to facilitate the sexual exploitation and sale of at least 20 victims. The criminal network used more than 100 fake Instagram and Facebook accounts to recruit female victims, and used Messenger and WhatsApp to coordinate transporting the women to Dubai, where they were forced to work in "massage parlors."

The investigation found that $152,000 had been spent on ads on Facebook's platforms to promote the scheme, including ads targeting men from Dubai. According to the report, the company deleted all pages and accounts related to the trafficking ring. One of the report's recommended "action items" was for Facebook to clarify its policies on ad revenue linked to human trafficking in order to "prevent reputational risks for the company (not profit from ads spent on HT)."

A week later, a second report detailed the problem of domestic servitude abuse on Facebook's platforms more extensively. The document contains samples of Instagram ads offering workers, including one describing a 38-year-old Indian woman being sold for $350. (The company says it has removed the related accounts.)

Ongoing challenges

More recent documents show that, despite the efforts Facebook made immediately after the Apple threat and in the weeks and months that followed, the company has struggled to remove domestic servitude content from its platforms.

A report distributed internally in January 2020 found that "our platform allows all three stages (recruitment, facilitation, exploitation) via complex, real-world networks." It also identified common naming conventions for domestic servitude accounts to aid detection. "Facebook profiles, IG profiles and Pages were used by labor traffickers to exchange victims' documentation… promote victims for sale, and arrange buying, selling and other fees," the document stated.

Researchers found that labor recruitment agencies often communicated with victims through direct messages but rarely posted violating public content, making them difficult to detect. The report stated that Facebook lacked "robust proactive detection methods… of Domestic Servitude in English and Tagalog to prevent recruitment," despite the Philippines being a top source country for victims, and that detection capabilities were not enabled for Facebook Stories. Researchers also identified at least 1.7 million users who could benefit from information about worker rights.

"While our past efforts are a beginning to address the off-platform harm that results from domestic servitude, there are still opportunities for prevention, detection, enforcement," the February report stated. Stone said the company has introduced on-platform interventions to remind job seekers of their rights and provides information through its Help Center for people who come across human trafficking content.

Even though Facebook researchers have investigated the issue extensively, domestic servitude content can still be found easily on Instagram. Last Wednesday, using several of the common account names as a starting point, SME identified multiple Instagram accounts claiming to offer domestic workers for sale, including one called "Offering Domestic Workers" that featured photos of women along with their age, weight, length of contract and other personal information. Facebook removed the posts after SME asked about them.

Stone said that in early 2021, Facebook launched "search interventions" that create friction when users type in keywords related to specific topics the company has vetted with academic experts. He said these interventions were launched for sex trading, sexual solicitation and prostitution in English, and for domestic servitude and labor exploitation in Arabic.

"Our goal was to help deter people searching for this kind of content," Stone said. "We are continually improving this experience to add links to useful resources and expert organisations."

This article is part of an SME series on The Facebook Papers, more than ten thousand pages of leaked internal Facebook documents that provide deep insight into the company's internal culture, its approach to misinformation and its hate speech moderation.

The entire series can be viewed here.
