Algorithms have taken a lot of heat recently for producing biased decisions. People were outraged when a recruiting algorithm Amazon developed was found to downgrade female job applicants, and they are similarly outraged over predictive policing and predictive sentencing tools that disproportionately penalize people of color. Importantly, race and gender were not included as inputs to any of these algorithms. So should we be outraged by the bias reflected in their output? Read on to find out.