London, Aug 17 : Facebook’s algorithm actively recommends Holocaust denial content, according to a new analysis released on Monday by the Institute for Strategic Dialogue (ISD), a UK-based counter-extremist organisation.
Holocaust denial has long been used as a means of attacking Jewish communities.
The researchers found that when a user follows public pages containing Holocaust denial content, Facebook “actively promotes” further Holocaust denial content to that user.
The investigation revealed that searching the term “Holocaust” on Facebook surfaced Holocaust denial pages and groups.
After clicking through to these Holocaust denial pages, the researchers found that Facebook’s recommendation algorithm led them to further Holocaust denial pages, the report said.
For the study, the researchers examined the extent to which Holocaust denial content is readily accessible across Facebook, Twitter, Reddit and YouTube.
For the research, they analysed the term ‘holohoax’, which is commonly used by Holocaust deniers.
The paper identified 36 Facebook pages and groups that are either dedicated specifically to Holocaust denial or host Holocaust denial content.
Together, these pages and groups have 366,068 followers.
The report comes amid growing demand from Holocaust survivors to remove such content from the social networking platform.
“We take down any post that celebrates, defends or attempts to justify the Holocaust. The same goes for any content that mocks Holocaust victims, accuses victims of lying, spews hate, or advocates for violence against Jewish people in any way,” a Facebook spokesperson was quoted as saying in a statement by The Guardian.
The researchers also identified 2,300 pieces of content mentioning “holohoax” on Reddit, 19,000 pieces of content on Twitter and 9,500 pieces of content on YouTube, all created between June 1, 2018 and July 22, 2020.
The researchers also found that Holocaust denial content decreased significantly on YouTube in the past year.
This demonstrates how appropriately applied content moderation policies can be effective in denying dangerous conspiracy theorists a public platform, the study said.