In the early days of computing there was an acronym, GIGO, which stood for “Garbage In, Garbage Out.” It meant that if the programmers didn’t know what they were doing, the answers they got would be a mess as well. It appears the same holds for biases. A Harvard study reported in the NYTimes last week showed that Google’s search algorithms carried biases that seemed to reflect their programmers’ attitudes: higher-paying job ads were shown more often to men than to women, and ads offering arrest records appeared on searches for names perceived as African-American. What systems do you have in place to keep your prejudices out of your marketing?