Overview
The Bias Detection metric evaluates the presence of bias in text generated by a model. It identifies potential biases, such as cultural, gender, racial, or ideological biases, that could unfairly favor or disfavor a particular group or perspective. This is crucial for ensuring fairness and neutrality in automated text generation, especially in sensitive contexts. BiasDetectionMetric uses the evaluateBias function to assess the text for any indications of bias.
Methods
evaluateBias Function
This function checks the generated text for biases by analyzing individual statements. It takes one parameter:
output: The text generated by the model.
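The statement-by-statement analysis can be sketched as below. This is an illustrative sketch only: in the real metric an LLM-based judge evaluates each statement, so the keyword heuristic and the BiasVerdict shape here are assumptions, not the library's actual implementation.

```typescript
// A verdict for one statement extracted from the generated text.
interface BiasVerdict {
  statement: string;
  biased: boolean;
}

// Illustrative stand-in for the model-based judge: absolute
// generalizations are treated as signals of bias.
const BIAS_MARKERS = ["always", "never", "all of them"];

function evaluateBias(output: string): BiasVerdict[] {
  // Split the generated text into individual statements.
  const statements = output
    .split(/[.!?]/)
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
  // Judge each statement independently.
  return statements.map((statement) => ({
    statement,
    biased: BIAS_MARKERS.some((m) => statement.toLowerCase().includes(m)),
  }));
}
```

Analyzing statements individually, rather than the text as a whole, lets the metric point to the specific sentences that triggered a bias verdict.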
BiasDetectionMetric Class
The BiasDetectionMetric class detects bias within the text provided. It takes one parameter:
output: The text to be evaluated for bias.
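A minimal sketch of how such a class might be shaped is shown below. The method name measure, the keyword heuristic standing in for the LLM judge, and the 0-to-1 score scale are all assumptions for illustration, not the library's actual API.

```typescript
class BiasDetectionMetric {
  // Evaluate `output` (the text to check) and return the fraction
  // of its statements flagged as biased, in the range [0, 1].
  measure(output: string): number {
    const statements = output
      .split(/[.!?]/)
      .map((s) => s.trim())
      .filter((s) => s.length > 0);
    if (statements.length === 0) return 0;
    // Hypothetical judge: absolute generalizations count as biased.
    const biased = statements.filter((s) =>
      /\b(always|never|all of them)\b/i.test(s)
    );
    return biased.length / statements.length;
  }
}
```

Under this sketch, a text where half the statements contain an absolute generalization would score 0.5.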
The evaluateSteps method calls evaluateBias and returns a detailed result, including a bias score. The score quantifies the extent of bias, with a detailed explanation provided for scores indicating significant bias.
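The aggregation step can be sketched as follows. The result shape, the 0.5 significance threshold, and the regex judge are assumptions used only to illustrate how per-statement verdicts might roll up into a score with an explanation.

```typescript
// Assumed result shape: a score plus an explanation.
interface BiasResult {
  score: number;  // fraction of statements judged biased, in [0, 1]
  reason: string; // explanation, detailed when bias is significant
}

function evaluateSteps(output: string): BiasResult {
  const statements = output
    .split(/[.!?]/)
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
  // Stand-in judge: flags absolute generalizations as biased.
  const biased = statements.filter((s) => /\b(always|never|every)\b/i.test(s));
  const score = statements.length === 0 ? 0 : biased.length / statements.length;
  // Hypothetical threshold: above 0.5 counts as significant bias.
  const reason =
    score > 0.5
      ? `Significant bias: ${biased.length} of ${statements.length} statements flagged.`
      : "No significant bias detected.";
  return { score, reason };
}
```

A higher score means a larger share of the text's statements were judged biased; the reason field lets a caller surface why a text failed, not just that it did.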