Statistics: Evaluate models

Classification models:

Accuracy: (TruePositive+TrueNegative)/(TruePositive+FalsePositive+TrueNegative+FalseNegative)

Precision: TruePositive/(TruePositive+FalsePositive)

Recall/Sensitivity: TruePositive/(TruePositive+FalseNegative)

F1-score: 2*Precision*Recall/(Precision+Recall)

Confusion Matrix: table of the counts of TruePositive, FalsePositive, TrueNegative, and FalseNegative predictions; the metrics above can all be computed from it, as in the sketch below
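A minimal Python sketch, using hypothetical confusion-matrix counts, showing how the classification metrics above follow directly from TP, FP, TN, FN:

# Illustrative confusion-matrix counts (hypothetical values)
TP, FP, TN, FN = 80, 10, 95, 15

accuracy  = (TP + TN) / (TP + FP + TN + FN)                # fraction of all predictions that are correct
precision = TP / (TP + FP)                                 # of predicted positives, how many are truly positive
recall    = TP / (TP + FN)                                 # of actual positives, how many are found (sensitivity)
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(accuracy, precision, recall, f1)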

ROC curves: Receiver Operating Characteristic curves, plot the True Positive Rate vs. the False Positive Rate across classification thresholds

AUC: Area Under the (ROC) Curve, the larger the better (0.5 = random guessing, 1 = perfect classifier)
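A short sketch of computing an ROC curve and its AUC, assuming scikit-learn is available; y_true and y_score are made-up labels and predicted probabilities used only for illustration:

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true  = np.array([0, 0, 1, 1, 0, 1, 1, 0])                     # hypothetical true class labels
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.65, 0.3])   # hypothetical predicted probabilities

fpr, tpr, thresholds = roc_curve(y_true, y_score)   # one (FPR, TPR) point per threshold
auc = roc_auc_score(y_true, y_score)                # area under the ROC curve; closer to 1 is better
print(auc)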

Regression models:

RMSE: Root Mean Squared Error, MSE = mean((observed - predicted)^2), RMSE = sqrt(MSE), the lower the better

RSE: Residual Standard Error, an estimate of sigma (the standard deviation of the error term), sqrt(RSS/(n-k-1)), the lower the better

MAE: Mean Absolute Error, mean(abs(observed - predicted)), less sensitive to outliers than RMSE

R-squared: coefficient of determination, 1 - RSS/TSS; in simple linear regression it equals the square of the correlation coefficient between predictor and response; the greater the better
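A minimal sketch of the residual-based regression metrics above, using NumPy and made-up observed/predicted values (k, the assumed number of predictors, only matters for RSE):

import numpy as np

observed  = np.array([3.0, 5.0, 7.5, 9.0, 11.0])   # hypothetical observed responses
predicted = np.array([2.8, 5.4, 7.0, 9.3, 10.5])   # hypothetical model predictions
residuals = observed - predicted
n, k = len(observed), 1                            # sample size and assumed number of predictors

rmse = np.sqrt(np.mean(residuals ** 2))            # Root Mean Squared Error
mae  = np.mean(np.abs(residuals))                  # Mean Absolute Error
rss  = np.sum(residuals ** 2)                      # residual sum of squares
rse  = np.sqrt(rss / (n - k - 1))                  # Residual Standard Error (estimate of sigma)
tss  = np.sum((observed - observed.mean()) ** 2)   # total sum of squares
r2   = 1 - rss / tss                               # R-squared
print(rmse, mae, rse, r2)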

R-squared adj: adjusted R-squared, always less than or equal to R-squared; penalizes model size using the residual sum of squares (RSS), Total Sum of Squares (TSS), number of independent predictors (k), and number of samples (n): R-squared adj = 1 - (RSS/(n-k-1))/(TSS/(n-1))
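A tiny sketch of the adjustment, with illustrative values for R-squared, n, and k:

r2, n, k = 0.91, 5, 1                           # hypothetical R-squared, sample size, predictor count
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # penalizes predictors that add little explanatory power
print(r2_adj)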

Cp: Mallows Cp, commonly Cp = (RSS + 2*k*sigma_hat^2)/n, where sigma_hat^2 is an unbiased estimate of the true error variance sigma^2; the lower the better

AIC: Akaike Information Criterion, the lower the better

BIC: Bayesian Information Criterion, the lower the better
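A sketch of obtaining AIC and BIC from a fitted linear model (assuming statsmodels is installed) and computing Mallows Cp by hand in the common form Cp = (RSS + 2*k*sigma_hat^2)/n; the simulated data are purely illustrative:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=100)                    # hypothetical predictor
y = 2.0 + 3.0 * x + rng.normal(size=100)    # hypothetical response

X = sm.add_constant(x)                      # add the intercept column
fit = sm.OLS(y, X).fit()

n, k = len(y), 1                            # sample size and number of predictors
rss = np.sum(fit.resid ** 2)
sigma2_hat = rss / (n - k - 1)              # unbiased estimate of the error variance
cp = (rss + 2 * k * sigma2_hat) / n         # Mallows Cp; lower is better

print(fit.aic, fit.bic, cp)                 # AIC and BIC reported by statsmodels; lower is better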