International Day Against Homophobia, Transphobia & Biphobia



May 17th is a day of pride. From 1948 to 1990, the World Health Organisation classified homosexuality as a mental illness. On 17 May 1990, its General Assembly adopted a resolution stressing that “homosexuality is not a mental illness or disorder.” The UN continues to make a significant contribution to defending the rights of the LGBTQI+ community.

More specifically, UNESCO recognises that no country can achieve Sustainable Development Goal 4 – that is, to ensure inclusive and equitable quality education and to promote lifelong learning opportunities for all – if students are discriminated against or experience violence because of their actual or perceived sexual orientation or gender identity.

UNESCO’s objectives in supporting the education sector’s response to homophobic and transphobic violence are:

-the collection of reliable data on the nature, extent and impact of homophobic and transphobic violence in educational settings, particularly in areas and countries where little or no data are available;

-the documentation and sharing of best practices for action;

-the raising of awareness and the building of alliances at national, regional and global levels; and

-the facilitation of actions in selected countries to prevent and address homophobic and transphobic violence in educational settings.

Violence based on sexual orientation and gender identity/expression often targets homosexual, bisexual and transgender people, as well as those whose gender expression does not conform to binary gender norms. It takes many forms, including physical, sexual and psychological violence and bullying.

However, these forms of violence often occur in digital environments as well. More specifically, some platforms that offer electronic communication services create obstacles for individuals of the LGBTQI+ community in exercising their digital rights. According to European Digital Rights (EDRi), the content published on these private platforms is arbitrarily moderated in accordance with their own standards and “Community Guidelines”. These practices include the removal of many LGBTQI+ accounts, posts and themed ads through content-moderation reporting, while homophobic, transphobic and sexist content often remains untouched.

In addition, the requirement for authentication based on official identity documents prevents trans people from using their chosen name and identity. For many of them, especially those living in conservative societies, a legal change of name and gender is difficult to achieve. As a result, they see their accounts being deleted on a regular basis, after a few months of use, losing all their content and contacts. With little chance of recovering their accounts, their freedoms online are severely hampered.

According to Access Now, an international advocacy organisation for digital rights, “from social media to email accounts to dating profiles, oppressed LGBTQI+ people often rely on online platforms to stay connected to their community. However, governments, law enforcement, and others seeking to silence these voices have developed sophisticated cyber-attacks, such as phishing campaigns, to gain access to these accounts, endangering both the account holder and all their contacts. In Egypt, for example, this data has been used to identify and apprehend those who identify as homosexual under a draconian anti-homosexuality law.”

In addition, the LGBTQI+ community faces harassment and discrimination in many forms, such as humiliating personal attacks, constant surveillance by police and local gangs, or sexual harassment online. Access Now reports that in the United States, young people in the LGBTQI+ community are three times more likely to experience cyberbullying and other forms of harassment.

In Jordan, a few years ago, the government ordered the blocking of an online magazine with content from the LGBTQI+ community following critical remarks by a member of parliament in the international media. Other publications have faced outright denial-of-service attacks that have taken down their websites, mass reporting of legal content on social media, and physical intimidation intended to discourage further publication.

Data security is vital for everyone, but for many members of the LGBTQI+ community, the disclosure of their personal information can be life-threatening. Platforms that share their users’ gender identity or sexual orientation with third-party advertisers have led to the unwanted disclosure of this information in both personal and professional environments. It was also recently discovered that Grindr – a popular gay dating app – had revealed its users’ HIV status, as well as personal details including email addresses and GPS locations, to third-party analytics companies, putting users at risk of exposure, discrimination, and abuse.

Moreover, many governments and individuals are using sophisticated social engineering and phishing attacks to infect the LGBTQI+ community’s electronic devices with malware, turning those devices into surveillance tools their owners carry with them everywhere.

Finally, more and more governments and companies are using artificial intelligence (AI) systems to track and profile citizens and consumers. Faster check-in at airports using face recognition or a seamless and personalised shopping experience are just two of the advertised benefits. But this technology is fundamentally flawed when it comes to recognising and categorising people in all their diversity. Many of these artificial intelligence systems work by dividing people into two groups – male or female. The consequences for people whose gender does not match the gender assigned to them at birth can be serious.

According to Access Now, “algorithms are also programmed to reinforce outdated stereotypes about race and gender that are harmful to everyone. This kind of technology is based on the misconception that our gender and sexual orientation can be determined by how we look, the sound of our voice, how we move, or simply how ‘masculine’ or ‘feminine’ our name is. It is a flaw that can easily cause discrimination. Imagine the humiliation you would feel if, in your attempt to catch your flight, you were barred from entering the aircraft by a computer that determined that you did not match the gender marker on your passport. Or imagine not having access to a public toilet just because a face recognition algorithm thinks your face does not look masculine or feminine enough”.
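The structural problem described above can be made concrete with a minimal sketch. Everything below is hypothetical – the `GenderLabel` enum, the toy `classify` function and the `gate_check` helper are illustrative assumptions, not any real vendor’s API – but it shows how an output space with only two values forces every person into one of two classes, and how an automated gate that compares that forced guess against a document marker can refuse someone regardless of how uncertain the model is:

```python
# Hypothetical sketch: why a binary output space is structurally flawed.
# None of these names correspond to a real face-recognition product.
from dataclasses import dataclass
from enum import Enum


class GenderLabel(Enum):
    MALE = "male"
    FEMALE = "female"
    # No other value exists: the system MUST map every person to one of
    # these two classes, however poorly either one fits.


@dataclass
class RecognitionResult:
    label: GenderLabel
    confidence: float


def classify(face_embedding: list[float]) -> RecognitionResult:
    # Toy stand-in for a trained model: thresholds the mean of the features.
    # Real systems are far more complex, but share the same two-value output.
    score = sum(face_embedding) / len(face_embedding)
    label = GenderLabel.MALE if score >= 0.5 else GenderLabel.FEMALE
    return RecognitionResult(label, confidence=abs(score - 0.5) * 2)


def gate_check(result: RecognitionResult, passport_marker: str) -> bool:
    # Automated boarding gate: the passenger is refused whenever the model's
    # binary guess disagrees with the passport marker - even when the model
    # itself reports near-zero confidence in that guess.
    return result.label.value == passport_marker


result = classify([0.51, 0.49, 0.52])
print(result.confidence)                 # barely above zero
print(gate_check(result, "female"))      # prints False: passenger is blocked
```

Note that the refusal in the last line is driven entirely by the shape of the output space: no matter how the thresholds are tuned, a two-class label set has no way to represent a person it cannot fit into either class.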

On 21 April 2021, the European Commission proposed a legal framework – a set of rules and guidelines – for regulating artificial intelligence systems. The proposed Regulation sets out a tiered regulatory structure that prohibits certain uses of artificial intelligence, heavily regulates high-risk uses and lightly regulates less dangerous AI systems. The proposal requires providers and users of high-risk AI systems to comply with rules on data and data governance, documentation and record keeping, transparency and the provision of information to users, human oversight, and accuracy and security.

Its main innovation, foreshadowed in last year’s White Paper on Artificial Intelligence, is a requirement for ex ante conformity assessments to ensure that high-risk artificial intelligence systems meet these requirements before they can be placed on the market or put into service. Despite these innovations, the proposed Regulation seems to have some gaps and omissions.

In particular, it does not focus on those affected by artificial intelligence systems, as it lacks any general information requirement for individuals subject to algorithmic evaluations. Little attention is paid to algorithmic fairness in the text of the Regulation itself, as opposed to its accompanying recitals. And the newly required conformity assessments are merely internal procedures, not documents that could be audited by the public or a regulatory authority. Nevertheless, the proposal could prove to be the basis for transatlantic co-operation to cast a common regulatory net over an emerging technology, as White House National Security Adviser Jake Sullivan has noted.

As DATAWO, we support gender equality in digital environments, and we support the implementation of human rights in the digital world for the LGBTQI+ community as well.
