Can algorithms be biased or harmful? Yes, here’s how

Algorithms have changed modern society for the better in a number of ways, through advances in technology, optimized experiences online and so much more.

But what happens when the algorithm that gets built ends up doing more harm than good? At what point can an algorithm fail?

We asked experts to weigh in on algorithm bias and just how much control a data scientist actually has over their own creation. 

How can algorithms be biased or harmful?

When comparing two effective algorithms producing results for big tech companies, there's really no such thing as a "good" one and a "bad" one.

It’s more complicated than that.

The algorithm itself is an objective tool to get from a problem to a solution. It’s typically built by computer or data scientists to learn from certain data sets and then work to solve a specific problem.

The quality of the data set affects the outcome. This means that if the data is biased, so is the algorithm, even if the algorithm itself is producing the intended result, said Stephen Chen, associate professor of information technology at York University.

A few years back, he noted, a hiring algorithm was found to be filtering out women because it had learned from a data set of resumes that reflected a male-dominated industry. Based on that input, the algorithm had "learned" that male candidates were best suited for the job.

“If you have skewed data on the input, you will basically reinforce your bias,” he said. 
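
To make the idea concrete, here is a minimal Python sketch (not Chen's actual example, and with invented numbers) of how a skew in historical hiring data becomes the preference a model learns:

from collections import Counter

# Hypothetical history: 8 of the last 10 hires were men.
past_hires = ["male"] * 8 + ["female"] * 2

# "Training" here is just counting how often each attribute
# appears among past hires.
counts = Counter(past_hires)
total = sum(counts.values())
learned_preference = {attr: n / total for attr, n in counts.items()}

print(learned_preference)
# {'male': 0.8, 'female': 0.2} -- the skew in the input becomes the
# model's learned preference, even though the code itself has no bug.

The code does exactly what it was built to do; the problem lives entirely in the data it was given.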

Chen said the same dynamic applies to discrimination against racialized communities, pointing to U.S. health-care predictions as an example.

“If racialized people have less access to health care, then all the AI algorithms will predict less access to health care,” he continued. “It codifies past discrimination.”

In addition to perpetuating biases and discrimination, algorithms have the power to predict certain behaviours or circumstances in ways that can hurt specific groups of people, said Salma Karray, a marketing professor at Ontario Tech University.

For example, Karray explained, there was a case in the U.S. in 2012 involving Target that sparked privacy concerns after the retailer used an algorithm to predict that a teenage customer was pregnant and sent her a maternity pamphlet in the mail before her father found out.

As well, she said, an algorithm can also potentially encourage addictive behaviours, such as gambling, by targeting users who appear interested in a certain activity online. 
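
As a rough illustration of how that kind of targeting can work, here is a hypothetical Python sketch; the item list and threshold are invented for this example, not Target's actual model:

# Flag a shopper when their recent purchases overlap with items
# associated with a predicted life event.
PREGNANCY_SIGNAL_ITEMS = {"unscented lotion", "prenatal vitamins", "cotton balls"}

def flag_for_maternity_mailer(purchase_history, threshold=2):
    overlap = PREGNANCY_SIGNAL_ITEMS & set(purchase_history)
    return len(overlap) >= threshold

print(flag_for_maternity_mailer(["unscented lotion", "prenatal vitamins", "bread"]))
# True -- the shopper is flagged for the mailer whether or not they
# wanted anyone else to know.

The same pattern-matching logic can just as easily single out users who show signs of compulsive gambling or other vulnerabilities.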

Can algorithms be controlled?

A self-driving vehicle accelerates at the wrong time and kills its driver. This is a real-life example of an algorithm gone wrong, Chen said.

But from a results perspective, this machine’s algorithm technically learned what it was expected to learn: how to automatically drive a vehicle.

“There’s nothing more dangerous than assuming your code works because it gives you the result you were expecting,” Chen said.
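
A tiny, hypothetical Python example of what Chen is warning about: a function that passes the one check its author thought to run, yet is still wrong:

def average_speed(distance_km, hours):
    # Bug: integer division silently truncates the result.
    return distance_km // hours

# The only test happens to divide evenly, so it "works".
assert average_speed(100, 2) == 50

print(average_speed(100, 3))
# 33 -- not 33.33..., but nothing complains, so the error goes unnoticed.

Getting the expected answer on the cases you checked says nothing about the cases you never thought to check.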

He said machine intelligence differs from human intelligence in ways that we, as humans, may never fully understand.

“If the machines have a different form of intelligence and do things differently, and receive things and understand things differently, then if we give control to the algorithms, to something that we don’t fully understand, do we have control over them still?” he asked.

“Once you train an algorithm to learn outside of the lab, you no longer have any idea what it’s doing because it has learned something that you did not know it was going to learn. Then it can do things that nobody was expecting.” 

What are the ethics behind algorithms?

While there is ongoing work on algorithms to make them smarter and better at making decisions, Chen said there is a lack of ethics in the process as a whole, which he described as more of a historical problem in science in general.

“It’s like, let’s build the bomb first and then talk about the ethical implications about nuclear power after, right?” he said.

Chen said that when it comes to regulation, more robust privacy legislation aimed at big tech companies has emerged recently in some parts of the world, such as the General Data Protection Regulation in the EU; however, algorithms themselves remain difficult to regulate.

“It’s really hard to say that you are not allowed to be exposed to an algorithm because the algorithms – they track everything,” he said. “Data is the cost of connectivity.”