Class EvaluationClassificationMetric (1.51.0)

EvaluationClassificationMetric(
    label_name: typing.Optional[str] = None,
    auPrc: typing.Optional[float] = None,
    auRoc: typing.Optional[float] = None,
    logLoss: typing.Optional[float] = None,
    confidenceMetrics: typing.Optional[
        typing.List[typing.Dict[str, typing.Any]]
    ] = None,
    confusionMatrix: typing.Optional[typing.Dict[str, typing.Any]] = None,
)

The evaluation metric response for classification metrics.
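
For orientation, the sketch below constructs the dataclass directly using the fields shown in the signature above and reads back the summary scores. In practice these objects are returned by a model's evaluate() call rather than built by hand; the import path and the field values here are illustrative assumptions, not output from a real evaluation.

from vertexai.language_models import EvaluationClassificationMetric

# Illustrative values only; real instances are produced by evaluate().
metric = EvaluationClassificationMetric(
    label_name="positive",
    auPrc=0.92,
    auRoc=0.95,
    logLoss=0.31,
)

print(metric.auPrc)    # 0.92
print(metric.auRoc)    # 0.95
print(metric.logLoss)  # 0.31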

Parameters

label_name (str)

Optional. The name of the label associated with the metrics. This is only returned when only_summary_metrics=False is passed to evaluate().

auPrc (float)

Optional. The area under the precision-recall curve.

auRoc (float)

Optional. The area under the receiver operating characteristic (ROC) curve.

logLoss (float)

Optional. Logarithmic loss.

confidenceMetrics (List[Dict[str, Any]])

Optional. The confidence metrics. This is only returned when only_summary_metrics=False is passed to evaluate().

confusionMatrix (Dict[str, Any])

Optional. The confusion matrix. This is only returned when only_summary_metrics=False is passed to evaluate().

Properties

input_dataset_paths

The Google Cloud Storage paths to the dataset used for this evaluation.

task_name

The type of evaluation task for the evaluation.
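
The two properties above describe the evaluation run itself rather than a particular label's scores. The sketch below shows how a metric like this is typically obtained and inspected. It is a hedged example: it assumes a text-classification evaluation via TextGenerationModel.evaluate() with a hypothetical EvaluationTextClassificationSpec (the bucket path, class names, and column name are placeholders), and it assumes the call returns a single EvaluationClassificationMetric rather than a per-label list.

from vertexai.language_models import (
    TextGenerationModel,
    EvaluationTextClassificationSpec,
)

# Placeholder dataset and labels; substitute your own Cloud Storage path and classes.
task_spec = EvaluationTextClassificationSpec(
    ground_truth_data=["gs://my-bucket/eval-data.jsonl"],
    class_names=["positive", "negative"],
    target_column_name="label",
)

model = TextGenerationModel.from_pretrained("text-bison@002")

# only_summary_metrics=False requests the detailed fields documented above
# (label_name, confidenceMetrics, confusionMatrix) in addition to the summary scores.
metric = model.evaluate(task_spec=task_spec, only_summary_metrics=False)

print(metric.task_name)            # the evaluation task type
print(metric.input_dataset_paths)  # Cloud Storage paths used for this evaluation
print(metric.auPrc, metric.auRoc, metric.logLoss)
if metric.confusionMatrix is not None:
    print(metric.confusionMatrix)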