
Researchers design new test to detect discrimination in AI programs


Artificial intelligence isn’t yet conscious, but algorithms can still discriminate, sometimes subtly expressing the hidden biases of the programmers who created them. It’s a big, complicated problem as AI systems become more enmeshed in our everyday lives.

But there may be a fix — or at least a way to monitor algorithms and tell whether they’ve inappropriately discriminated against a demographic group.


“Learned prediction rules are often too complex to understand.”


Proposed by a team of computer scientists from Google, the University of Chicago, and the University of Texas at Austin, the Equality of Opportunity in Supervised Learning approach analyzes the decisions that machine learning programs make — rather than the decision-making processes themselves — to detect discrimination. The very nature of these algorithms is to make decisions on their own, with their own logic, in a black box hidden from human review. As such, the researchers see gaining access to the black boxes as practically futile.

“Learned prediction rules are often too complex to understand,” University of Chicago computer scientist and co-author, Nathan Srebro, told Digital Trends. “Indeed, the whole point of machine learning is to automatically learn a [statistically] good rule…not one whose description necessarily makes sense to humans.  With this view of learning in mind, we also wanted to be able to ensure a sense of non-discrimination while still treating learned rules as black boxes.”

Srebro and co-authors Moritz Hardt of Google and Eric Price of UT Austin developed an approach to analyze an algorithm’s decisions and make sure it didn’t discriminate in the decision-making process. To do this, they led with the anti-prejudicial principle that a decision about a particular person should not be solely based on that person’s demographic. In the case of an AI program, the algorithm’s decision about a person should not reveal anything about that person’s gender or race in a way that would be inappropriately discriminatory.
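The paper's criterion, often called equalized opportunity, can be checked from the outside: among people who genuinely qualify for a favorable decision, the classifier should say "yes" at the same rate regardless of demographic group. A minimal sketch of such a black-box audit (all data and function names below are illustrative, not from the paper's code):

```python
# Sketch of an equality-of-opportunity audit in the spirit of
# Hardt, Price & Srebro (2016): compare true positive rates across
# demographic groups using only the classifier's outputs.

def true_positive_rate(y_true, y_pred, group, g):
    """P(prediction = 1 | truly qualified, member of group g)."""
    preds = [p for t, p, gr in zip(y_true, y_pred, group) if t == 1 and gr == g]
    return sum(preds) / len(preds) if preds else float("nan")

def equal_opportunity_gap(y_true, y_pred, group):
    """Largest pairwise TPR difference across groups; 0 means the criterion holds."""
    rates = [true_positive_rate(y_true, y_pred, group, g) for g in set(group)]
    return max(rates) - min(rates)

# Toy audit: two groups "a" and "b"; y_pred comes from a black-box model.
y_true = [1, 1, 0, 1, 1, 0, 1, 1]
y_pred = [1, 0, 0, 1, 1, 1, 0, 0]
group  = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(round(equal_opportunity_gap(y_true, y_pred, group), 3))  # prints 0.333
```

Because the audit needs only the labels, predictions, and group membership, it treats the learned rule entirely as a black box, which is exactly the setting Srebro describes.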

It’s a test that doesn’t solve the problem directly but helps flag and prevent discriminatory processes. For this reason, some researchers are wary.

“Machine learning is great if you’re using it to work out the best way to route an oil pipeline,” Noel Sharkey, emeritus professor of robotics and AI at the University of Sheffield, told The Guardian. “Until we know more about how biases work in them, I’d be very concerned about them making predictions that affect people’s lives.”

Srebro recognizes this concern but does not consider it a sweeping critique of his team’s approach. “I agree that in many applications with high-stakes impact on individuals, especially by government and judicial authorities, use of black box statistical predictors is not appropriate and transparency is vital,” he said. “In other situations, when used by commercial entities and when individual stakes are lower, black box statistical predictors might be appropriate and efficient. It might be hard to completely disallow them but still desirable to control for specific protected discrimination.”

The paper on Equality of Opportunity in Supervised Learning was one of a handful presented this month at the Neural Information Processing Systems (NIPS) conference in Barcelona, Spain, that offered approaches to detecting discrimination in algorithms, according to The Guardian.

Dyllan Furness
Former Contributor
Dyllan Furness is a freelance writer from Florida. He covers strange science and emerging tech for Digital Trends, focusing…