In December 2019, Ahmed El-Sheikh, a digital media expert and member of the New Media Council at the National Union of Journalists in the United Kingdom and Ireland, started a three-year investigation that probed flaws in the Google and Facebook algorithms.
The investigation found that although the search engine and social network had amended their algorithms to show helplines and prioritise helpful material when a user searched for suicide-related content in English, they failed to do so when the searches were in Arabic.
The investigation, produced and presented by El-Sheikh himself, was broadcast under the title of “The Death Algorithm” by Alghad TV in London and Cairo on 24 August.
El-Sheikh observed how the artificial intelligence (AI) of Google and Facebook responsible for tackling suicide does not work properly in the UK and Egypt when users search in Arabic.
In addition, El-Sheikh detailed the differences when users searched the word “suicide” on both platforms in English from the UK versus doing the same in Arabic from inside or outside the UK.
In the former situation, both websites prioritised showing suicide prevention helplines, followed by articles or posts on discouraging suicide.
Nevertheless, in the latter – when the search was done in Arabic from the UK – the top result was still the helpline, but it was followed by posts or articles explaining methods of suicide in detail.
Shockingly, the results got worse when the search was done in Egypt.
Neither of the websites mentioned the helpline when the search was done in Arabic, only showing it when the search was done in English.
According to the World Health Organisation (WHO), Egypt ranks first in suicide rates across the Arab world, followed by Sudan, Yemen and Algeria.
There were 2,584 suicides in 2021, according to the latest statistics from the National Council for Criminal and Social Research. The council cited social issues that Egyptian young adults regularly face, which contribute to the high number of cases, including bullying, lack of mental health awareness, shaming and sexual coercion.
El-Sheikh, who spent three years investigating the issue, observed minor improvements in Google’s algorithm over the time period. For example, when the investigation started in 2019, the search engine did not show helpline details when the search was done in Arabic from the UK, which it now does.
The documentary interviewed Professor Mohamed Abdel-Muguid, Pro Vice-Chancellor for Science, Technology, Engineering and Medicine (STEM) at Canterbury Christ Church University, who found that Google’s algorithm failed to detect suicidal thoughts in two cases: when he used the female pronoun in Arabic, and when he searched for “I want to die” instead of “I want to suicide” in English.
It also interviewed two psychiatrists, Dr. Gamal Ferwiz from Egypt and Dr. Mahmoud Lawaty from the UK, who both agreed that users could commit or avoid suicide based on the nature of the material they watch online.
El-Sheikh concluded his report by stressing that Google and Facebook should improve their AI in order to provide the same service and quality in all countries and all languages, especially in the Middle East, where suicide is a serious issue.
In Egypt, strides have been made in recent years to raise awareness of mental health.
The General Secretariat of Mental Health at the Ministry of Health and Population launched a support hotline for those experiencing psychological problems or suicidal tendencies.
These hotlines can be reached for psychological inquiries and psychological support on 08008880700 and 0220816831, while the National Council for Mental Health line can be contacted for psychological inquiries on 20818102.