Recommending Dream Jobs in a Biased Real World

05/10/2019
by Nadia Fawaz et al.

Machine learning models learn what we teach them to learn, and machine learning is at the heart of recommender systems. If a model is trained on biased data, the resulting recommender system may reflect those biases in its recommendations. Biases arise at different stages of a recommender system, from existing societal biases in the data, such as the professional gender gap, to biases introduced by the data collection and modeling processes. These biases affect the performance of various components of recommender systems, from offline training to evaluation and online serving of recommendations in production. Specific techniques can help reduce bias at each stage of a recommender system. Reducing bias in our recommender systems is crucial to successfully recommending dream jobs to hundreds of millions of members worldwide, while staying true to LinkedIn's vision: "To create economic opportunity for every member of the global workforce."
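To make the idea of reducing bias at the training stage concrete, here is a minimal, hypothetical sketch of one generic technique: reweighting training examples inversely to the frequency of their demographic group, so that an under-represented group contributes comparably to the training loss. This is only an illustration of the general approach, not the method described in the full paper, and the field names (`gender`, `member_id`, `label`) are invented for the example.

```python
from collections import Counter

def inverse_frequency_weights(examples, group_key="gender"):
    """Assign each training example a weight inversely proportional to
    the frequency of its demographic group, so that under-represented
    groups contribute comparably to the overall training loss."""
    counts = Counter(ex[group_key] for ex in examples)
    total = sum(counts.values())
    return [total / (len(counts) * counts[ex[group_key]]) for ex in examples]

# Toy, imbalanced training set (hypothetical schema).
examples = [
    {"member_id": 1, "gender": "F", "label": 1},
    {"member_id": 2, "gender": "M", "label": 0},
    {"member_id": 3, "gender": "M", "label": 1},
    {"member_id": 4, "gender": "M", "label": 1},
]
weights = inverse_frequency_weights(examples)
# The single "F" example gets weight 2.0 and each "M" example gets ~0.67,
# so both groups carry equal total weight when the weights are passed to
# a standard weighted training loss.
```

Analogous adjustments can be applied at other stages, for example re-ranking candidates at serving time or stratifying evaluation metrics by group, but the right choice depends on where in the pipeline the bias is introduced.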
