List Evaluation Metrics
agents.evaluation_metrics.list() -> EvaluationMetricListResponse
GET /v2/gen-ai/evaluation_metrics

To list all evaluation metrics, send a GET request to /v2/gen-ai/evaluation_metrics.
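
The snippet below is a minimal sketch of that raw request, assuming the standard DigitalOcean API base URL (https://api.digitalocean.com) and a bearer token supplied in a DIGITALOCEAN_TOKEN environment variable; the SDK example further down wraps the same call.

import os

import requests

# Issue the GET request described above against the assumed base URL.
resp = requests.get(
    "https://api.digitalocean.com/v2/gen-ai/evaluation_metrics",
    headers={"Authorization": f"Bearer {os.environ['DIGITALOCEAN_TOKEN']}"},
)
resp.raise_for_status()

# Print the name and type of each metric in the response body.
for metric in resp.json().get("metrics", []):
    print(metric.get("metric_name"), metric.get("metric_type"))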

Returns
EvaluationMetricListResponse (class)

metrics (Optional[List[APIEvaluationMetric]], optional)

Each APIEvaluationMetric in the list has the following fields:
description (str, optional)

inverted (bool, optional)

If true, the metric is inverted, meaning that a lower value is better.

metric_name (str, optional)

metric_type (literal, optional)
Optional[Literal["METRIC_TYPE_UNSPECIFIED", "METRIC_TYPE_GENERAL_QUALITY", "METRIC_TYPE_RAG_AND_TOOL"]]

metric_uuid (str, optional)

metric_value_type (literal, optional)
Optional[Literal["METRIC_VALUE_TYPE_UNSPECIFIED", "METRIC_VALUE_TYPE_NUMBER", "METRIC_VALUE_TYPE_STRING", "METRIC_VALUE_TYPE_PERCENTAGE"]]

range_max (float, optional)

The maximum value for the metric.

format: float

range_min (float, optional)

The minimum value for the metric.

format: float

from do_gradientai import GradientAI

client = GradientAI()
evaluation_metrics = client.agents.evaluation_metrics.list()
print(evaluation_metrics.metrics)
200 Example
{
  "metrics": [
    {
      "description": "\"example string\"",
      "inverted": true,
      "metric_name": "\"example name\"",
      "metric_type": "METRIC_TYPE_UNSPECIFIED",
      "metric_uuid": "\"123e4567-e89b-12d3-a456-426614174000\"",
      "metric_value_type": "METRIC_VALUE_TYPE_UNSPECIFIED",
      "range_max": 123,
      "range_min": 123
    }
  ]
}
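
As an illustrative sketch (not part of the reference), the loop below shows one way the returned fields might be combined: a raw score (score_from_run, a hypothetical input produced elsewhere) is scaled into the metric's [range_min, range_max] interval and flipped when inverted is true, so that higher is always better. It assumes each list item exposes its fields as attributes, in the same way the response exposes .metrics in the example above.

from do_gradientai import GradientAI

client = GradientAI()
score_from_run = 0.5  # hypothetical raw score to normalize

for metric in client.agents.evaluation_metrics.list().metrics or []:
    if metric.metric_value_type != "METRIC_VALUE_TYPE_NUMBER":
        continue  # only numeric metrics have a meaningful range
    if metric.range_min is None or metric.range_max is None:
        continue
    span = metric.range_max - metric.range_min
    normalized = (score_from_run - metric.range_min) / span if span else 0.0
    if metric.inverted:
        normalized = 1.0 - normalized  # lower raw values are better, so flip
    print(metric.metric_name, round(normalized, 3))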