Apple allegedly threatened to remove Facebook from the App Store after a report revealed the social network's laxity in addressing the human trafficking taking place on its platform

Apple threatened to kick Facebook out of its App Store after a 2019 BBC report detailed how human traffickers were using Facebook to sell victims, according to the Wall Street Journal. The Journal viewed company documents showing that a Facebook investigative team was tracking a Middle East human trafficking market whose organizers were using Facebook's services. What appeared to be employment agencies advertised domestic workers whom they could supply against their will, according to the Journal.

Dozens of Facebook documents reviewed by the Wall Street Journal show that Facebook employees have sounded the alarm about how its platforms are being used in developing countries, where its user base is huge and growing. Employees reported that human traffickers in the Middle East were using the platforms to offer housekeepers for sale under conditions of forced labor: no employment contract, no social protection, and confiscated passports.

Apple threatened to remove the social network's applications from the App Store unless it committed to cracking down on this practice.

Still, an internal memo revealed that Facebook was aware of the practice even before that date: a Facebook researcher asked in a 2019 report whether the issue was known to Facebook before the BBC's investigation and Apple's escalation, according to the Journal.

Below the question, the answer reads: "Yes. Throughout 2018 and the first half of 2019, we conducted a global understanding exercise to fully understand how domestic servitude manifests on our platform throughout its lifecycle: recruitment, facilitation and exploitation."

Time and time again, the documents show, Facebook researchers have identified the platform's adverse effects. And time and time again, despite congressional hearings, its own pledges and numerous media exposés, the company has failed to correct them. The documents offer perhaps the clearest picture to date of how widely Facebook's problems are known within the company, up to the CEO himself.

According to the internal documents, Facebook employees warned that armed groups in Ethiopia were using the site to incite violence against ethnic minorities. They sent their bosses warnings about the use of the platforms for organ sales, pornography and government crackdowns on political dissent. The documents also showed the company's response, which in many cases was inadequate or nonexistent. A Facebook spokesperson said the company has deployed global teams, local partnerships and third-party fact-checkers to keep users safe.

Facebook says its rules apply to everyone. Company documents reveal a secret elite that is exempt

Mark Zuckerberg has said that Facebook allows its users to speak on an equal footing with the elites of politics, culture and journalism, and that its standards apply to everyone. Privately, the company has built a system that exempts high-profile users from some or all of its rules. The program, known as "cross check" or "XCheck", was intended as a quality-control measure for high-profile accounts. Today, it shields millions of VIPs from the company's normal enforcement, according to the documents. Many abuse this privilege, posting material, including harassment and incitement to violence, that would usually result in sanctions. Facebook says the criticism of the program is fair, that it was designed for a good purpose, and that the company is working to fix it.

Facebook knows Instagram is toxic to many teenage girls, company documents show

For years, researchers at Instagram, which is owned by Facebook, have been studying how the photo-sharing app affects its millions of young users. On several occasions, the company has found that Instagram is harmful to a significant percentage of them, especially teenage girls, more so than other social media platforms. In public, Facebook has consistently played down the app's negative effects, including in comments to Congress, and has not made its research public or available to academics or lawmakers who requested it. In response, Facebook says that the negative effects are not widespread, that the mental health research is valuable, and that some of the harmful aspects are not easily addressed.

Facebook has tried to make its platform a healthier place. The result was far from what was expected

In 2018, Facebook announced that it had changed its algorithm to improve its platform and halt signs of declining user engagement. Zuckerberg said his goal was to strengthen bonds between users and improve their well-being by promoting interactions between friends and family. Within the company, the documents show, staff members warned that the change was having the opposite effect: it was making Facebook, and those who used it, angrier. Zuckerberg resisted some of the fixes proposed by his team, according to the documents, because he feared they would lead people to interact less with Facebook. Facebook, in response, says that any algorithm can promote objectionable or harmful content and that the company is doing its best to mitigate the problem.

Moderators who do not understand certain languages leave harmful messages online

Facebook's AI-powered content moderation systems cannot read certain languages used on the platform, raising concerns about how the company polices content in countries where languages other than English are spoken, the Wall Street Journal reported Thursday.

The Journal viewed company documents showing that Facebook does not have enough employees who speak local languages to monitor events in other countries, markets into which the company has expanded to strengthen its non-US user base. More than 90% of monthly Facebook users are located outside of North America, according to the Journal.

The report shows how the lack of human moderators with multilingual skills, combined with the shortcomings of using bots to weed out toxic posts, weakens Facebook's ability to monitor harmful content online, a topic that has brought the company under close scrutiny in several countries in recent years.

Facebook employees have expressed concerns about how the system allowed bad actors to use the site for harmful purposes, according to documents viewed by The Journal.

A former vice president of the company told the Journal that Facebook sees the potential harm in foreign countries as simply the cost of doing business in those markets. He also said that there is very rarely a significant and concerted effort to invest in fixing those problems.

Drug cartels and human traffickers have used Facebook to target victims. One cartel in particular, described by US officials as the biggest criminal drug threat to the United States, has used several Facebook pages to post graphic photos of violent scenes and images of guns. An internal investigation team wanted the cartel banned altogether, but the team tasked with doing so never followed through, the report said.

In Ethiopia, armed groups have used Facebook to incite violence against the Tigrayan people. This content slipped through the cracks due to a lack of moderators who speak the language. The company had also failed to translate its community standards into the languages used in Ethiopia, according to the Journal.

And most of Facebook's Arabic-speaking moderators are Moroccan and cannot understand other Arabic dialects, which has allowed violent content to stay online.

In most cases, Facebook only removed harmful posts once they caught the public's attention, and did not fix the automated systems (known as classifiers) that had allowed the content to be posted in the first place, according to the report.

Sources: BBC, WSJ

See also:

Apple blocked the FlickType Watch keyboard, saying that keyboards are not allowed on the Apple Watch, then announced a clone of it on the device
Epic trial: Apple will no longer be able to force developers to use its payment system, a move that could seriously affect the publisher's App Store revenue
Apple censors the words and phrases that its customers can engrave on its products in China, Hong Kong and Taiwan: human rights, freedom of the press and democracy are among them
Apple posts record revenue of $89.6bn in Q2 2021, up 54% year-on-year, but remains vigilant about the chip shortage


