Bias vs Accuracy¶
The Bias vs Accuracy chart shows the tradeoff between predictive accuracy and fairness, so you do not need to manually record each model's accuracy score and fairness score for the protected features. Consider your use case when deciding whether the model needs to be more accurate or more fair. The Bias vs Accuracy display is based on the validation score, using the currently selected metric.
- The Y-axis displays the validation score of each model. To change this metric, switch to the Leaderboard, change the metric via the Metric dropdown, and then return to Bias vs Accuracy.
- The X-axis displays the fairness score of each model, that is, the lowest relative fairness score across the classes of the protected feature.
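As a rough illustration of the X-axis value, the sketch below computes a model's overall fairness score as the lowest per-class fairness score relative to the most favored class. The function name, input shape, and normalization step are assumptions for illustration, not the product's actual implementation.

```python
def fairness_score(per_class_scores):
    """Hypothetical sketch: per_class_scores maps each class of the
    protected feature to its raw fairness metric value. Scores are
    normalized against the most favored (highest-scoring) class, and
    the worst (lowest) relative score is returned as the model's
    overall fairness score."""
    best = max(per_class_scores.values())
    relative = {cls: score / best for cls, score in per_class_scores.items()}
    return min(relative.values())

# Example: group "b" fares half as well as the most favored group "a",
# so the model's fairness score is 0.5.
print(fairness_score({"a": 0.5, "b": 0.25}))
```

A score of 1.0 would mean every class of the protected feature fares equally well on the chosen fairness metric.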
Bias vs Accuracy chart¶
Consider the following when evaluating the Bias vs Accuracy chart:
- You must calculate Per-Class Bias for a model before it can be displayed on the chart.
- Protected features, the fairness metric, and the fairness threshold must be defined in advanced options prior to model building.
- Use the Feature List dropdown to compare models trained on different feature lists.
- The left side highlights models with fairness scores below the fairness threshold, and the right side highlights models with scores above the threshold.
- Hover over any point to view the scores for a specific model.
- Some models may not report scores because there is not enough data (as indicated by a tooltip). Requirements are:
    - More than 100 rows.
    - If between 100 and 1,000 rows, more than 10% of the rows must belong to the majority class (the class with the most rows of data).
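The data requirements above can be sketched as a simple eligibility check. This is an assumed helper for illustration only; the function name and input format are not part of the product.

```python
from collections import Counter

def has_enough_data(target_values):
    """Hypothetical sketch: return True if a partition with these class
    labels is large enough for fairness scores to be reported."""
    n = len(target_values)
    if n <= 100:
        # More than 100 rows are required.
        return False
    if n <= 1000:
        # Between 100 and 1,000 rows: the majority class (the class
        # with the most rows) must hold more than 10% of the rows.
        majority_count = Counter(target_values).most_common(1)[0][1]
        return majority_count / n > 0.10
    return True

# 200 rows, majority class holds 75% of rows: scores can be reported.
print(has_enough_data(["a"] * 150 + ["b"] * 50))
```

Note that the majority-class condition can only fail when the target has many classes, since with few classes the largest one necessarily exceeds 10% of the rows.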
Updated November 30, 2021