AWS SDK for C++  1.8.96
Aws::Glue::Model::FindMatchesMetrics Class Reference

#include <FindMatchesMetrics.h>

Public Member Functions

 FindMatchesMetrics ()
 
 FindMatchesMetrics (Aws::Utils::Json::JsonView jsonValue)
 
FindMatchesMetrics& operator= (Aws::Utils::Json::JsonView jsonValue)
 
Aws::Utils::Json::JsonValue Jsonize () const
 
double GetAreaUnderPRCurve () const
 
bool AreaUnderPRCurveHasBeenSet () const
 
void SetAreaUnderPRCurve (double value)
 
FindMatchesMetrics& WithAreaUnderPRCurve (double value)
 
double GetPrecision () const
 
bool PrecisionHasBeenSet () const
 
void SetPrecision (double value)
 
FindMatchesMetrics& WithPrecision (double value)
 
double GetRecall () const
 
bool RecallHasBeenSet () const
 
void SetRecall (double value)
 
FindMatchesMetrics& WithRecall (double value)
 
double GetF1 () const
 
bool F1HasBeenSet () const
 
void SetF1 (double value)
 
FindMatchesMetrics& WithF1 (double value)
 
const ConfusionMatrix& GetConfusionMatrix () const
 
bool ConfusionMatrixHasBeenSet () const
 
void SetConfusionMatrix (const ConfusionMatrix &value)
 
void SetConfusionMatrix (ConfusionMatrix &&value)
 
FindMatchesMetrics& WithConfusionMatrix (const ConfusionMatrix &value)
 
FindMatchesMetrics& WithConfusionMatrix (ConfusionMatrix &&value)
 

Detailed Description

The evaluation metrics for the find matches algorithm. The quality of your machine learning transform is measured by getting your transform to predict some matches and comparing the results to known matches from the same dataset. The quality metrics are based on a subset of your data, so they are not precise.

See Also:

AWS API Reference

Definition at line 35 of file FindMatchesMetrics.h.

Constructor & Destructor Documentation

◆ FindMatchesMetrics() [1/2]

Aws::Glue::Model::FindMatchesMetrics::FindMatchesMetrics ( )

◆ FindMatchesMetrics() [2/2]

Aws::Glue::Model::FindMatchesMetrics::FindMatchesMetrics ( Aws::Utils::Json::JsonView  jsonValue)

Member Function Documentation

◆ AreaUnderPRCurveHasBeenSet()

bool Aws::Glue::Model::FindMatchesMetrics::AreaUnderPRCurveHasBeenSet ( ) const
inline

The area under the precision/recall curve (AUPRC) is a single number that measures the overall quality of the transform, independent of the choice made for precision versus recall. Higher values indicate a more attractive precision/recall tradeoff.

For more information, see Precision and recall in Wikipedia.

Definition at line 62 of file FindMatchesMetrics.h.

◆ ConfusionMatrixHasBeenSet()

bool Aws::Glue::Model::FindMatchesMetrics::ConfusionMatrixHasBeenSet ( ) const
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 206 of file FindMatchesMetrics.h.

◆ F1HasBeenSet()

bool Aws::Glue::Model::FindMatchesMetrics::F1HasBeenSet ( ) const
inline

The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

For more information, see F1 score in Wikipedia.

Definition at line 175 of file FindMatchesMetrics.h.

◆ GetAreaUnderPRCurve()

double Aws::Glue::Model::FindMatchesMetrics::GetAreaUnderPRCurve ( ) const
inline

The area under the precision/recall curve (AUPRC) is a single number that measures the overall quality of the transform, independent of the choice made for precision versus recall. Higher values indicate a more attractive precision/recall tradeoff.

For more information, see Precision and recall in Wikipedia.

Definition at line 52 of file FindMatchesMetrics.h.

◆ GetConfusionMatrix()

const ConfusionMatrix& Aws::Glue::Model::FindMatchesMetrics::GetConfusionMatrix ( ) const
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 198 of file FindMatchesMetrics.h.

◆ GetF1()

double Aws::Glue::Model::FindMatchesMetrics::GetF1 ( ) const
inline

The maximum F1 metric indicates the transform's accuracy on a scale from 0 to 1, where 1 is the best accuracy.

For more information, see F1 score in Wikipedia.

Definition at line 168 of file FindMatchesMetrics.h.

◆ GetPrecision()

double Aws::Glue::Model::FindMatchesMetrics::GetPrecision ( ) const
inline

The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.

For more information, see Precision and recall in Wikipedia.

Definition at line 92 of file FindMatchesMetrics.h.

◆ GetRecall()

double Aws::Glue::Model::FindMatchesMetrics::GetRecall ( ) const
inline

The recall metric indicates, for an actual match, how often your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.

For more information, see Precision and recall in Wikipedia.

Definition at line 130 of file FindMatchesMetrics.h.

◆ Jsonize()

Aws::Utils::Json::JsonValue Aws::Glue::Model::FindMatchesMetrics::Jsonize ( ) const

◆ operator=()

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::operator= ( Aws::Utils::Json::JsonView  jsonValue)

◆ PrecisionHasBeenSet()

bool Aws::Glue::Model::FindMatchesMetrics::PrecisionHasBeenSet ( ) const
inline

The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.

For more information, see Precision and recall in Wikipedia.

Definition at line 101 of file FindMatchesMetrics.h.

◆ RecallHasBeenSet()

bool Aws::Glue::Model::FindMatchesMetrics::RecallHasBeenSet ( ) const
inline

The recall metric indicates, for an actual match, how often your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.

For more information, see Precision and recall in Wikipedia.

Definition at line 140 of file FindMatchesMetrics.h.

◆ SetAreaUnderPRCurve()

void Aws::Glue::Model::FindMatchesMetrics::SetAreaUnderPRCurve ( double  value)
inline

The area under the precision/recall curve (AUPRC) is a single number that measures the overall quality of the transform, independent of the choice made for precision versus recall. Higher values indicate a more attractive precision/recall tradeoff.

For more information, see Precision and recall in Wikipedia.

Definition at line 72 of file FindMatchesMetrics.h.

◆ SetConfusionMatrix() [1/2]

void Aws::Glue::Model::FindMatchesMetrics::SetConfusionMatrix ( const ConfusionMatrix &  value)
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 214 of file FindMatchesMetrics.h.

◆ SetConfusionMatrix() [2/2]

void Aws::Glue::Model::FindMatchesMetrics::SetConfusionMatrix ( ConfusionMatrix &&  value)
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 222 of file FindMatchesMetrics.h.

◆ SetF1()

void Aws::Glue::Model::FindMatchesMetrics::SetF1 ( double  value)
inline

The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

For more information, see F1 score in Wikipedia.

Definition at line 182 of file FindMatchesMetrics.h.

◆ SetPrecision()

void Aws::Glue::Model::FindMatchesMetrics::SetPrecision ( double  value)
inline

The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.

For more information, see Precision and recall in Wikipedia.

Definition at line 110 of file FindMatchesMetrics.h.

◆ SetRecall()

void Aws::Glue::Model::FindMatchesMetrics::SetRecall ( double  value)
inline

The recall metric indicates, for an actual match, how often your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.

For more information, see Precision and recall in Wikipedia.

Definition at line 150 of file FindMatchesMetrics.h.

◆ WithAreaUnderPRCurve()

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithAreaUnderPRCurve ( double  value)
inline

The area under the precision/recall curve (AUPRC) is a single number that measures the overall quality of the transform, independent of the choice made for precision versus recall. Higher values indicate a more attractive precision/recall tradeoff.

For more information, see Precision and recall in Wikipedia.

Definition at line 82 of file FindMatchesMetrics.h.

◆ WithConfusionMatrix() [1/2]

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithConfusionMatrix ( const ConfusionMatrix &  value)
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 230 of file FindMatchesMetrics.h.

◆ WithConfusionMatrix() [2/2]

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithConfusionMatrix ( ConfusionMatrix &&  value)
inline

The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.

For more information, see Confusion matrix in Wikipedia.

Definition at line 238 of file FindMatchesMetrics.h.

◆ WithF1()

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithF1 ( double  value)
inline

The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

For more information, see F1 score in Wikipedia.

Definition at line 189 of file FindMatchesMetrics.h.

◆ WithPrecision()

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithPrecision ( double  value)
inline

The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.

For more information, see Precision and recall in Wikipedia.

Definition at line 119 of file FindMatchesMetrics.h.

◆ WithRecall()

FindMatchesMetrics& Aws::Glue::Model::FindMatchesMetrics::WithRecall ( double  value)
inline

The recall metric indicates, for an actual match, how often your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.

For more information, see Precision and recall in Wikipedia.

Definition at line 160 of file FindMatchesMetrics.h.


The documentation for this class was generated from the following file: