
MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.

For a chance to give your own TED Talk, fill out the Idea Search Application: ted.com/ideasearch.

Interested in learning more about upcoming TED events? Follow these links:

TEDNext: ted.com/futureyou
TEDSports: ted.com/sports
TEDAI Vienna: ted.com/ai-vienna
TEDAI San Francisco: ted.com/ai-sf