Our second online event on "Algorithmic bias and gender"



Nowadays, many institutions and organisations in both the public and private sector make decisions based on artificial intelligence (AI) systems using machine learning, whereby a series of algorithms ingests and learns from massive amounts of data to find patterns and make predictions. These systems inform how much credit financial institutions offer to different customers, whom the health care system prioritises for COVID-19 vaccines, and which candidates companies call in for job interviews. Yet gender bias in these systems is pervasive and has profound impacts on women's short- and long-term psychological, economic, and health welfare. It can also reinforce and amplify existing harmful gender stereotypes and prejudices.

With this in mind, we are very pleased to invite you to our second online event, "Algorithmic bias and gender", which will be held in English via Microsoft Teams Meeting on 14 July 2021 at 11:00 am (Athens time). Guest speakers at this event will be Ioanna Papageorgiou (Doctoral Researcher at the NoBIAS project), Dr. Evangelia Balatsou (Founder of Greek Girls Code), Stella Kasdagli (Co-founder of the Women On Top organisation, writer) and Petra Molnar (Migration Tech Monitor at Refugee Law Lab).

The objective of this panel discussion is to raise participants' awareness of the discriminatory use of AI technologies against women and other genders. DATAWO's aim is to approach this issue in an interdisciplinary way, so that we can identify the challenges these technologies pose to the gender equality legal framework and put forward suggestions towards a more inclusive AI regulatory framework.

We are looking forward to e-meeting you next month on 14 July 2021 and discussing with you why AI systems are biased, what impacts gender-biased AI has on human rights, and how we can prioritise gender equity and justice in the AI era. Save the date and stay tuned for our third and last online event of the summer!
