Will Knight, Senior Editor for AI at MIT Technology Review, wrote an article last week citing the growing concern that algorithms, though intended to help the average consumer, are increasingly biased, and that nearly everyone is turning a blind eye, including the companies deploying them.
Knight reviews the many ways consumers encounter bias at every turn, from applying for a job to getting a loan to insuring an automobile. A new group recently emerged to help combat this growing concern: the AI Now Initiative, founded by Kate Crawford, a researcher at Microsoft, and her colleague Meredith Whittaker, a researcher at Google, is launching research on the social impacts of AI.
“It’s still early days for understanding algorithmic bias,” Crawford and Whittaker said in an e-mail. “Just this year we’ve seen more systems that have issues, and these are just the ones (services & products) that have been investigated.”
You can read more about Kate & Meredith’s new initiative here.
Why This Matters –
One of the 2017 trends we’re tracking is called Tackling Invisible Bias.
Trend Summary: Combating a disease is hard enough without fighting misperceptions. Today brands are tackling the unconscious biases in healthcare that shape how we communicate, prevent and even diagnose key conditions.
This article struck a chord. We are very aware of the invisible bias that occurs between physician and patient, as well as the many social influences each of us experiences. But what happens when healthcare ramps up AI to streamline processes, making them more efficient and, in turn, lowering the cost of medicine for all? AI has the potential to help revolutionize healthcare, but only if it doesn't perpetuate existing healthcare inequities. What will be interesting to watch as this technology plays a bigger role in healthcare is which data is used to build the models that may eventually help decide what treatment is best for you, with the key question being: does that data REALLY represent YOU?