
OLDaily

4 human-caused biases we need to fix for machine learning
Glen Ford, TheNextWeb, 2018/10/28


The response to three of the four biases reported in this post is diversity. And I would add that just as machines need diversity in order to avoid bias, so do humans. That's why diversity is valuable. Anyhow, here's the breakdown of the three (all quoted from the text):

The fourth type of bias is prejudice bias, which "is a result of training data that is influenced by cultural or other stereotypes." To fix this, "the humans who label and annotate training data may have to be trained to avoid introducing their own societal prejudices or stereotypes into the training data."
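
Ford's piece stays at the level of prose, but here's a minimal sketch of what auditing labelers for prejudice bias might look like in practice. Everything below is my own illustration, not from the article: the records, the labels, and the 25% review threshold are all hypothetical. The idea is simply to compare each annotator's label distribution against the whole pool and flag large skews for human review.

    from collections import Counter, defaultdict

    # Hypothetical annotation log: (annotator_id, item_id, label).
    annotations = [
        ("ann_1", "img_01", "professional"),
        ("ann_1", "img_02", "casual"),
        ("ann_2", "img_01", "casual"),
        ("ann_2", "img_02", "casual"),
        ("ann_3", "img_01", "professional"),
        ("ann_3", "img_02", "professional"),
    ]

    # Tally label usage per annotator and across the whole pool.
    per_annotator = defaultdict(Counter)
    for annotator, _item, label in annotations:
        per_annotator[annotator][label] += 1
    pool = Counter(label for _a, _i, label in annotations)
    pool_total = sum(pool.values())

    # Flag annotators whose share of a label strays far from the pool's.
    # The 0.25 threshold is arbitrary; tune it for real data.
    for annotator, counts in sorted(per_annotator.items()):
        total = sum(counts.values())
        for label in pool:
            share = counts[label] / total
            baseline = pool[label] / pool_total
            if abs(share - baseline) > 0.25:
                print(f"review {annotator}: '{label}' at {share:.0%}"
                      f" vs pool {baseline:.0%}")

Note that a skew only tells you an annotator disagrees with the pool, not who is right; resolving that still takes the kind of annotator training Ford describes.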

Web: [Direct Link] [This Post]



Copyright 2018 Stephen Downes. Contact: stephen@downes.ca

This work is licensed under a Creative Commons License.