It has happened again. Researchers have discovered that Google's ad-targeting system is discriminatory: male web users were more likely than female users to be shown ads for high-paying executive jobs. The researchers documented their findings in a paper presented at the Privacy Enhancing Technologies Symposium in Philadelphia.
I blogged about the dark side of Big Data almost two years ago. Latanya Sweeney, a Harvard professor, Googled her own name and found an ad next to it for a background check, hinting that she had been arrested. She dug deeper and concluded that so-called black-identifying names were significantly more likely to be the targets of such ads. She documented this in her paper, Discrimination in Online Ad Delivery. Google denied that AdWords was discriminatory in any way then, and it is denying being discriminatory now.
I want to believe Google. I don't think Google believes it is discriminating. And that's the discriminatory dark side of Big Data. I have no intention of painting a gloomy picture or blaming technology, but I find it scary that technology is changing much faster than the ability of the brightest minds to comprehend its impact.
A combination of massively parallel computing, sophisticated algorithms that leverage this parallelism, and the ability of those algorithms to learn and adapt in near real time without any manual intervention, in order to be more relevant, is going to cause many more such issues to surface. As a customer, you simply don't know whether the products or services you are offered, or the price you are offered them at, are based on any discriminatory practices. To complicate this further, in many cases even the companies themselves don't know whether the insights they derive from vast amounts of internal and external data are discriminatory. This is the dark side of Big Data.
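To make the point concrete, here is a minimal, hypothetical sketch (synthetic data, NumPy only, all names and numbers invented) of how a targeting rule that never sees a protected attribute can still produce disparate outcomes: it keys on a proxy feature that happens to correlate with gender in the historical data, so the "high-paying job ad" ends up shown to the two groups at very different rates.

```python
# Illustrative sketch with synthetic data: a model never given gender can still
# discriminate through a correlated proxy feature it learned from skewed history.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (never given to the model): 0 = male, 1 = female.
gender = rng.integers(0, 2, size=n)

# A proxy feature the model *does* see (e.g. visits to certain sites),
# which happens to correlate with gender in the historical data.
proxy = rng.normal(loc=np.where(gender == 0, 1.0, -1.0), scale=1.0)

# Historical clicks were themselves skewed, so the learned rule inherits the skew.
clicked = (proxy + rng.normal(scale=1.0, size=n)) > 0

# "Train" the simplest possible targeting rule: show the ad when the proxy
# exceeds a threshold derived from past clickers.
threshold = proxy[clicked].mean() - 1.0
show_ad = proxy > threshold

for g, label in [(0, "male"), (1, "female")]:
    rate = show_ad[gender == g].mean()
    print(f"{label:6s}: ad shown to {rate:.1%} of users")
```

Nothing in this toy model "intends" to discriminate; the skew is simply baked into the data it learned from, which is exactly why the people running such systems may honestly believe they are not discriminating.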
The challenge with Big Data is not Big Data itself but what companies could do with your data, combined with any other data, without your explicit understanding of how the algorithms work. To prevent discriminatory practices, we audit employment practices to ensure equal opportunity and college admissions to ensure a fair process, but I don't see how anyone is going to audit these algorithms and data practices.
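For what it's worth, a first-pass audit would not have to be exotic. Below is a hedged sketch of one approach: given a log of who was shown a particular ad, compute the show rate per group and a disparate-impact ratio in the spirit of the "80% rule" used in employment auditing. The field names, log format, and threshold are illustrative assumptions, not any real ad platform's data or API.

```python
# Hypothetical audit sketch: per-group ad-show rates and a disparate-impact ratio.
from collections import defaultdict

def disparate_impact(log, group_key="group", shown_key="shown", threshold=0.8):
    """Return per-group show rates, the min/max rate ratio, and a pass/fail flag."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for record in log:
        total[record[group_key]] += 1
        shown[record[group_key]] += int(record[shown_key])
    rates = {g: shown[g] / total[g] for g in total}
    ratio = min(rates.values()) / max(rates.values())
    return rates, ratio, ratio >= threshold

# Tiny invented log: which group a user belonged to and whether the ad was shown.
log = (
    [{"group": "male", "shown": True}] * 180
    + [{"group": "male", "shown": False}] * 820
    + [{"group": "female", "shown": True}] * 60
    + [{"group": "female", "shown": False}] * 940
)

rates, ratio, passes = disparate_impact(log)
print(rates, f"ratio={ratio:.2f}", "passes" if passes else "fails the 80% rule")
```

The hard part, of course, is not the arithmetic but getting access to the logs and agreeing on who does the checking.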
Disruptive technology always surfaces socioeconomic issues that either didn't exist before or were not obvious and imminent. Some people get worked up because they don't quite understand how the technology works; I still remember politicians trying to blame Gmail for "reading" emails to show ads. I believe Big Data is yet another such disruption that is going to cause similar issues, and it is disappointing that so little has changed in the last two years.
It has taken a while for Internet companies to figure out how to safeguard our personal data, and they are not fully there yet; their ability to control how that data gets used is even more questionable. Let's not forget: data does not discriminate, people do. We should not shy away from these issues. Instead, we should work hard, collaboratively, to highlight and amplify what these issues might be and address them, rather than blaming technology for being evil.
Photo courtesy: Kutrt Bauschardt