[PRESS RELEASE – Please Read Disclaimer]
A first-of-its-kind initiative enabling inclusiveness and fairness
Oasis Labs announced a partnership with Meta, alongside the launch of a platform to assess fairness in Meta’s products while protecting people’s privacy. As Meta’s technology partner, Oasis Labs built a platform that uses Secure Multi-Party Computation (SMPC) to safeguard information as Meta asks users on Instagram to take a survey in which they can voluntarily share their race or ethnicity.
The project aims to advance fairness measurement in AI models, with the goal of benefiting individuals across the globe and society as a whole. This first-of-its-kind platform is a step toward identifying whether an AI model is fair and, where it is not, enabling appropriate mitigation.
How the platform will assess fairness in AI models
Meta’s Responsible AI, Instagram Equity, and Civil Rights teams are introducing an off-platform survey to people who use Instagram. Users will be asked to share their race and/or ethnicity on a voluntary basis.
The data, collected by a third-party survey provider, will be secret-shared with third-party facilitators so that neither the facilitators nor Meta can learn any individual user’s survey responses. The facilitators then compute the measurement using encrypted prediction data from AI models, cryptographically shared by Meta, and Meta reconstitutes the combined, de-identified results from each facilitator into aggregate fairness measurement results. These cryptographic techniques enable Meta to measure for bias and fairness while providing individuals who contribute sensitive demographic measurement data with strong privacy protection.
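The announcement does not specify the exact protocol, but the secret-sharing step it describes can be illustrated with a minimal additive-secret-sharing sketch. Everything below is an assumption for illustration: the binary encoding of responses, the number of facilitators, and the field modulus are not taken from the platform itself.

```python
import secrets

PRIME = 2**61 - 1  # illustrative field modulus; real deployments choose their own parameters

def share(value, n_parties):
    """Split `value` into n additive shares mod PRIME.
    Any subset of fewer than n shares is uniformly random,
    so no single facilitator learns the underlying response."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]

def reconstruct(shares):
    """Summing all shares mod PRIME recovers the shared value."""
    return sum(shares) % PRIME

# Hypothetical survey responses: 1 = respondent selected a given
# demographic category, 0 = did not.
responses = [1, 0, 1, 1, 0]
n_facilitators = 3

# Each response is split into shares; facilitator i receives one share
# per respondent and never sees the response itself.
facilitator_shares = [[] for _ in range(n_facilitators)]
for r in responses:
    for i, s in enumerate(share(r, n_facilitators)):
        facilitator_shares[i].append(s)

# Each facilitator locally sums its shares, producing a partial sum
# that still reveals nothing about any individual.
partial_sums = [sum(col) % PRIME for col in facilitator_shares]

# Combining only the partial sums reconstructs the aggregate count.
aggregate = reconstruct(partial_sums)
print(aggregate)  # → 3
```

Because addition commutes with secret sharing, the facilitators can jointly compute aggregate statistics (here, a simple count) while individual responses remain hidden from every party, which is the property the platform relies on.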
Learn more about the platform, its objectives, and the launch here.
Oasis Labs Work & Mission
Oasis Labs states that responsible data usage and ownership have always been central to its vision. Its vision of a Web3 world holds that no entity should take user data for granted, and to that end, the company is building technologies that keep data ownership and control in the hands of individuals.
Using blockchain, confidential computing, and privacy-preserving technologies, Oasis Labs aims to build platforms and products that advance individual privacy protection, data governance, and responsible data use. Oasis’ technologies focus on making it easier for developers to incorporate privacy-preserving data storage, governance, and computation.
Sponsored Social Media Posts
Breaking: Oasis Labs partners with Meta to assess fairness in its AI models while protecting people’s privacy.
A first-of-its-kind initiative enabling inclusiveness and fairness in AI models.
As Meta’s technology partner, Oasis Labs built the platform that uses Secure Multi-Party Computation (SMPC) to safeguard information as Meta asks users on Instagram to take a survey in which they can voluntarily share their race or ethnicity.
The project will advance fairness measurement in AI models, which will positively impact the lives of individuals across the globe and benefit society as a whole.
“This is an unprecedented use of these techniques for a large-scale measurement of AI model fairness in the real world. We look forward to working with Meta to build towards responsible AI and responsible data use for a fairer and more inclusive society.” — Professor Dawn Song, Founder of Oasis Labs.
Responsible data usage and ownership have always been at the forefront of Oasis Labs’ core vision.
Decentralization and Web3 can reach individuals across the world. Combined with data privacy, they unlock the ability to serve a global audience and build better products that treat everyone equally.
Read more at the link below: