Syntho wins UNESCO’s gender bias challenge at VivaTech 2021 in Paris
We are happy to announce that Syntho won UNESCO’s gender bias challenge at VivaTech 2021. Our starting point is simple: “bias in = bias out”, and we propose to solve imbalances in the input data by balancing it with intelligent synthetic data. At VivaTech, we demonstrated our brand-new ‘data balancing feature’, one of our new value-adding synthetic data features that take your data to the next level!
An introduction to VivaTech, UNESCO and the gender bias challenge
What is VivaTech?
VivaTech is Europe’s biggest startup and tech event, held on June 16–19, 2021. Due to COVID, the organization hosted a hybrid experience this year, in person in Paris and online worldwide, bringing together an even larger community of innovators.
- More information can be found at: www.vivatechnology.com
What is UNESCO?
UNESCO is the United Nations Educational, Scientific and Cultural Organization. UNESCO stands up for freedom of expression and access to information, as a fundamental right and a key condition for democracy and development. Serving as a laboratory of ideas with digital innovation at its heart, UNESCO helps countries develop policies and programs that foster the free flow of ideas and knowledge sharing to tackle the world’s challenges and ensure sustainable development for all.
- More information can be found at: www.unesco.org
What is the gender bias challenge?
The gender bias challenge aims to reduce the gender digital divide by exposing bias in AI. AI feeds on biased datasets, amplifying the existing gender bias in our societies. Evidence shows that by 2022, 85% of AI projects will deliver erroneous outcomes due to bias if AI as a technology and as a sector is not made more inclusive and diverse. How can we make sure datasets are more diverse? UNESCO is looking for innovative solutions that aim to reduce the gender digital divide by exposing bias in AI.
Our winning solution
Solve imbalances in the input data by balancing them with intelligent synthetic data
The challenge in our opinion: bias in = bias out
UNESCO’s seminal report from 2019 showed that AI-powered voice assistant tools like Alexa and Siri were perpetuating harmful stereotypes, and that sexist abuse directed at ‘feminized’ technology was even anticipated by tech companies.
As this example from UNESCO illustrates, bias in the data inevitably causes bias in the output. Hence our statement: ‘bias in = bias out’. And in the shared example, the developers apparently already knew about certain imbalances and biases in the data. So, how can this be overcome?
Our solution: intelligent synthetic data generation to mitigate data biases
We have to rebalance the dataset to solve data bias challenges that could lead to discrimination in algorithms. How does our solution work? In this example, there is a bias and imbalance in the data: where we expect 50% males and 50% females, we see only 33% females and 67% males. We can solve this by generating extra synthetic female (or male) data records to rebalance the dataset to 50% males and 50% females, mitigating biases and imbalances in the data that could result in discrimination. This is how we solve data biases: we tackle the problem at its root and solve the ‘bias in = bias out’ challenge.
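The rebalancing idea above can be sketched in a few lines of Python. This is an illustrative sketch only, not Syntho’s actual product code: the `rebalance` function and its placeholder generator are hypothetical, and a real system would plug in a trained generative model where the placeholder resamples an existing minority record.

```python
import random

def rebalance(records, key="gender", generate=None):
    """Top up the under-represented class with synthetic records
    until both classes are equally represented (50/50)."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r)
    if len(groups) != 2:
        raise ValueError("expected exactly two classes")
    # Sort so the larger (majority) group comes first.
    (_, majority), (_, minority) = sorted(
        groups.items(), key=lambda kv: -len(kv[1])
    )
    deficit = len(majority) - len(minority)
    # Placeholder generator: copy a random minority record and mark
    # it synthetic. A real generative model would go here instead.
    gen = generate or (lambda: dict(random.choice(minority), synthetic=True))
    return records + [gen() for _ in range(deficit)]

# Example: 6 males vs 3 females -> 3 synthetic female records are added.
data = [{"gender": "M"}] * 6 + [{"gender": "F"}] * 3
balanced = rebalance(data)
```

After rebalancing, both classes hold six records each, so downstream models no longer see the 67%/33% skew in the training data.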
[NEW] Value-adding synthetic data features
We support various value-adding synthetic data features to take your data to the next level!
Example features that we support:
- Generate unlimited data
- Subsetting
- Data bias minimization
- Generate new data
- Customized data generation
- Automatic PII detection
- Customized PII generation
- Data quality issue resolution
- Real-time data generation
- Open APIs