The working group will be exploring how we can help AI builders shift industry norms and build more trustworthy technology leading up to MozFest 2021. If you have ideas on how we can do this, or would like to help, this group is for you!
The Building Trustworthy AI Working Group aims to help our technical community build more trustworthy AI. Our three main goals are to: (1) establish best practices in key areas of trustworthy AI, (2) involve more diverse stakeholders in building AI, and (3) develop new technologies that serve as building blocks for developers.
These three goals are taken from Mozilla’s Trustworthy AI Theory of Change (1.1, 1.3, & 2.1).
- Monthly calls (Zoom, 60 minutes) from August 2020 to March 2021 during this pilot
- Async discussions on the MozFest Slack
- GitHub repo for monthly activity summaries and topic discussions
These six inspiring projects were generated and selected by working group members. The working group will collaborate on these projects leading up to MozFest 2021.
Let us know you’re interested in joining the working group through this form.
Working groups are collections of MozFest community members coming together to focus on a specific topic related to trustworthy AI. These working groups are an extension of the Mozilla Festival: by convening regularly online, they support ongoing work around trustworthy AI. All our work and organizing is done in the open.
Community Participation Guidelines
As MozFest working group members, we will follow the Mozilla Community Participation Guidelines during meetings and throughout all interactions related to our work.
If an issue arises during our time together, please report it to the Mozilla staff co-facilitator and/or the community co-facilitator.
Violating these community participation guidelines may result in consequences up to and including dismissal from the working group.
All contributions to the working group must be shared under an open license (e.g., CC BY 4.0 or MPL-2.0).