Wednesday, September 19, 2018

IBM’s New Software Explains Bias And Automated Decisions Taken By AI

IBM has released a software service that detects bias in decisions taken by AI systems and explains the factors behind those automated decisions. The software addresses an important issue with the credibility of AI-based systems: the decisions such systems make are not always fair. For instance, the infamous facial recognition software from […]
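The excerpt does not describe the internals of IBM's service, but the minimal Python sketch below illustrates one of the standard checks a bias-detection tool of this kind can report: a disparate-impact ratio comparing favorable-outcome rates between two groups. The function name, field names, and the loan-approval records are hypothetical examples for illustration only, not IBM's API.

# Minimal sketch (not IBM's actual API): compute a disparate-impact ratio,
# a common fairness metric for automated decisions. All names and data below
# are hypothetical.

def disparate_impact(records, group_key, outcome_key, privileged, unprivileged):
    """Ratio of favorable-outcome rates: unprivileged group vs. privileged group.

    Values well below 1.0 suggest the model favors the privileged group.
    """
    def favorable_rate(group_value):
        group = [r for r in records if r[group_key] == group_value]
        if not group:
            return 0.0
        return sum(1 for r in group if r[outcome_key]) / len(group)

    priv_rate = favorable_rate(privileged)
    unpriv_rate = favorable_rate(unprivileged)
    return unpriv_rate / priv_rate if priv_rate else float("inf")


if __name__ == "__main__":
    # Hypothetical loan-approval decisions produced by some AI model.
    decisions = [
        {"group": "A", "approved": True},
        {"group": "A", "approved": True},
        {"group": "A", "approved": False},
        {"group": "B", "approved": True},
        {"group": "B", "approved": False},
        {"group": "B", "approved": False},
    ]
    ratio = disparate_impact(decisions, "group", "approved",
                             privileged="A", unprivileged="B")
    print(f"Disparate impact ratio: {ratio:.2f}")  # below 0.8 is a common red flag (four-fifths rule)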

The post IBM’s New Software Explains Bias And Automated Decisions Taken By AI appeared first on Fossbytes.



Playing Grand Theft Auto Inside A Neural Network’s Hallucination? It’s Possible!

Ever imagined what a Neural Network's hallucination would look like?