DailyPost 2054

Algorithms are opinions embedded in code, a point the movie Coded Bias brings out clearly. The startling revelation came from MIT Media Lab researcher Joy Buolamwini, whose studies uncovered that her face was unrecognizable to many facial recognition programs, yet the same programs worked when she wore a white mask. This was the beginning of her search for an answer to what is now popularly known as algorithmic bias, and of the realisation that technology can negatively impact minorities. The objectivity supposedly inherent in technology has its limitations, and these get accentuated when it is handled by pure technologists working on technical rules handed over to them.

From racial bias to credit ratings, the life algorithms are laying out for us is certainly not worth getting wedded to; you don't know who the groom is. Weapons of Math Destruction, published in 2016 and written by Cathy O'Neil, an American mathematician and data scientist, examines the societal impact of algorithms. The book has been on the New York Times best-seller list. O'Neil analyses how big data and algorithms in a variety of areas, including insurance, advertising, education, and policing, can end up in decisions that harm the poor, keep racist disadvantage alive, and amplify already existing inequality. That algorithms mean objectivity is a myth; the earlier it is smashed, the better for both the scientific fraternity and humanity in general.

That a big data set can yield perfect results has been proven wrong any number of times. It might hold in an ideal world, with data that is full, comprehensive, of integrity, and possibly gathered in real time, but not in the complex world we live in. Nowhere does there seem to be an agreed process for how data is to be analysed, turned into a business rule, and finally into a decision or directive. Who takes the call: a group of data scientists and a private company, or a group of experts who know the field inside out, can bring in the required expertise and vision, and have the capability to use it with caution until they find it to be foolproof? Whether that can ever happen is a different story; we leave it for some other day.

The COMPAS software used by US courts has been facing these same issues for quite some time. Let's understand a simple logic: machine learning algorithms are data dependent, and if the data is biased, the models will end up the same way. The biggest challenge with proprietary software like COMPAS is that its algorithms are trade secrets, which cannot be examined by the public or by affected parties. Are trade secrets above human life, liberty, and the right to equal opportunity? Does the due process of law mean making the algorithm immune to any judicial scrutiny? The deeper challenge is the method of their creation, for which no laws, rules, processes, or validations are laid down: an algorithmic law of the jungle that then gains more sanctity than the law of the land. These are problematic mathematical tools; being opaque, unregulated, and difficult to contest is their second skin.
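The point that a model trained on biased data simply inherits that bias can be seen in a toy sketch. The data below is entirely hypothetical (it is not COMPAS data), and the "model" is deliberately the simplest possible one: it just learns historical re-arrest rates per group. If one group was over-policed in the past, its labels are inflated, and the inflated rate comes straight back out as the "risk score".

```python
# Minimal sketch with hypothetical data: group "B" was over-policed,
# so its historical re-arrest labels are inflated relative to "A".
from collections import defaultdict

# Each record is (group, re_arrested_label).
training_data = ([("A", 0)] * 80 + [("A", 1)] * 20
                 + [("B", 0)] * 50 + [("B", 1)] * 50)

def train_rate_model(records):
    """Learn a per-group 'risk score' as the historical positive rate."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, label in records:
        counts[group][0] += label
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

model = train_rate_model(training_data)
print(model)  # {'A': 0.2, 'B': 0.5} -- the bias in the labels becomes the score
```

Nothing in the training step is "racist"; the arithmetic is neutral. The skew lives entirely in the labels, which is why auditing the training data, not just the code, matters.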

Sanjay Sahay
