by Staff Writers
Hong Kong (AFP) May 13, 2020
Facebook has apologised for its role in the deadly communal unrest that shook Sri Lanka two years ago, after an investigation found that hate speech and rumours spread on the platform may have led to violence against Muslims.

The riots in early 2018 erupted as anti-Muslim anger was whipped up on social media, forcing the Sri Lankan government to impose a state of emergency and block access to Facebook.

The tech giant commissioned a probe into the part it may have played, and investigators said incendiary content on Facebook may have led to violence against Muslims.

"We deplore the misuse of our platform," Facebook said in a statement to Bloomberg News after the findings were released Tuesday. "We recognize, and apologize for, the very real human rights impacts that resulted."

At least three people were killed and 20 injured in the 2018 unrest, during which mosques and Muslim businesses were burned, mainly in the central part of the Sinhalese Buddhist-majority nation.

The hate speech and rumours spread on Facebook "may have led to 'offline' violence", according to Article One, the human rights consultancy hired to conduct the investigation.

The consultants also said that before the unrest, Facebook had failed to take down such content, which "resulted in hate speech and other forms of harassment remaining and even spreading" on the platform. Article One said one civil society organisation had tried to engage with the company over the misuse of Facebook as far back as 2009.

In 2018, officials said mobs had used Facebook to coordinate attacks, and that the platform had "only two resource persons" to review content in Sinhala, the language of Sri Lanka's ethnic majority, whose members were behind the violence.

Facebook has 4.4 million daily active users in Sri Lanka, according to the Article One report.

The firm said Tuesday it had taken a number of steps in the past two years to better protect human rights. "In Sri Lanka... we are reducing the distribution of frequently reshared messages, which are often associated with clickbait and misinformation," Facebook said in a statement accompanying the reports, which also looked at Indonesia and Cambodia.

It said it had also hired more staff, including Sinhala speakers, and started using detection technology to protect vulnerable groups.

- Concerns in Indonesia -

Article One also investigated the impact of Facebook's services -- including WhatsApp, Messenger and Instagram -- in Indonesia. It found that in addition to political attacks and attempts to influence elections, vulnerable groups across the sprawling archipelago faced increased risks.

The sharing of images without consent, cyberbullying and sexual exploitation threatened women especially, the consultancy said.

"In some cases, women are blackmailed or even forced into abusive relationships or into situations of rape to avoid the embarrassment of nude photos being made public on Facebook's platform," said the report, released at the same time as the findings on Sri Lanka. "In other cases, Facebook's platforms have been used to connect customers to sex workers, some of whom may be trafficked."

Article One said it also "found evidence of online bullying and child sexual exploitation, including online grooming of children" on Facebook's platforms.

The social media company said that, as in Sri Lanka, it is ramping up efforts to protect its users from harm, including more staff and improved technology to identify hate speech in Indonesian.

Facebook has been rolling out a number of programmes to prevent misuse after coming under increasing pressure in recent years over a series of privacy scandals, as well as criticism of its slow response to human rights concerns.

-- Bloomberg News contributed to this story --