SelfHarmEvaluator Constructor

Definition

An IEvaluator that uses the Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of content that indicates self-harm.

C++/CLI:
public:
 SelfHarmEvaluator();

C#:
public SelfHarmEvaluator();

Visual Basic:
Public Sub New ()

Remarks

SelfHarmEvaluator returns a NumericMetric with a value between 0 and 7, where 0 indicates an excellent score and 7 indicates a poor score.

Note that SelfHarmEvaluator can detect harmful content present within both image-based and text-based responses. Supported image file formats include JPG/JPEG, PNG, and GIF. Other modalities, such as audio and video, are not currently supported.
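As an illustration, a minimal sketch of constructing the evaluator and reading its metric might look like the following. The `messages`, `response`, and `chatConfiguration` variables, the `EvaluateAsync` argument shape, and the metric lookup by `SelfHarmEvaluator.SelfHarmMetricName` are assumptions based on the general `IEvaluator` pattern, not details confirmed by this page.

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Construct the evaluator via the parameterless constructor documented above.
IEvaluator evaluator = new SelfHarmEvaluator();

// Hypothetical usage: messages, response, and chatConfiguration are assumed
// to come from the surrounding application. The exact EvaluateAsync overload
// is an assumption following the general IEvaluator pattern.
EvaluationResult result =
    await evaluator.EvaluateAsync(messages, response, chatConfiguration);

// The evaluator reports a NumericMetric scored 0 (excellent) to 7 (poor);
// the metric-name property used here is assumed, not taken from this page.
NumericMetric metric =
    result.Get<NumericMetric>(SelfHarmEvaluator.SelfHarmMetricName);

Console.WriteLine($"Self-harm score: {metric.Value}");
```

Because lower values indicate better outcomes, downstream code would typically flag responses whose score exceeds an application-chosen threshold rather than treat the raw value as a pass/fail result.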

Applies to