In 2016, researchers at Boston University and Microsoft ran a text analysis of Google News and came up with some striking results. When they asked AI software to fill in the "X" in the sentence "man is to computer programmer as woman is to X," it responded with "homemaker." Results like these are illuminating and not uncommon.
Machine-learning software trained on human-generated datasets shows bias, prejudice, and sometimes outright misogyny. Are we training sexism into AI? What causes AI to become sexist? And how do we solve the problem once we have sexist AI?
Join us as we tackle these questions and many more about sexism in AI and Silicon Valley in general.
Falon Fatemi – Founder and CEO of Node
Paul Smith – COO of Botanic Technologies
Sean Captain – Tech Writer at Fast Company