Classification loss for neural network classifier – MATLAB loss

Binomial deviance ('binodeviance')

L = \sum_{j=1}^{n} w_j \log\{1 + \exp[-2m_j]\}.

Observed misclassification cost ('classifcost')

L = \sum_{j=1}^{n} w_j c_{y_j \hat{y}_j},

where \hat{y}_j is the class label corresponding to the class with the
maximal score, and c_{y_j \hat{y}_j} is the user-specified cost of
classifying an observation into class \hat{y}_j when its true class is
y_j.
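As an illustration only (not MathWorks code), the observed misclassification cost can be sketched in Python. The function name and argument layout are assumptions; scores is a list of per-class score rows, C is the cost matrix, and the weights w are assumed to sum to 1:

```python
def classifcost_loss(scores, y_true, C, w):
    """Weighted observed misclassification cost (sketch of 'classifcost').

    scores : list of per-class score lists, one row per observation
    y_true : list of true class indices y_j
    C      : cost matrix, C[i][k] = cost of predicting class k when truth is i
    w      : observation weights (assumed to sum to 1)
    """
    L = 0.0
    for s, yj, wj in zip(scores, y_true, w):
        # y_hat is the class with the maximal score
        y_hat = max(range(len(s)), key=lambda k: s[k])
        L += wj * C[yj][y_hat]
    return L
```

With a default-style cost matrix (0 on the diagonal, 1 elsewhere), only misclassified observations contribute their weight to the loss.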

Misclassified rate in decimal ('classiferror')

L = \sum_{j=1}^{n} w_j I\{\hat{y}_j \neq y_j\},

where I\{\cdot\} is the indicator function.
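A minimal Python sketch of this weighted error rate (the function name is an assumption; the weights w are assumed to sum to 1):

```python
def classiferror_loss(y_pred, y_true, w):
    """Weighted misclassification rate (sketch of 'classiferror').

    Sums the weights of observations whose predicted label
    differs from the true label.
    """
    return sum(wj for yh, yj, wj in zip(y_pred, y_true, w) if yh != yj)
```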

Cross-entropy loss ('crossentropy')

'crossentropy' is appropriate only for neural network models.

The weighted cross-entropy loss is

L = -\frac{\sum_{j=1}^{n} \tilde{w}_j \log(m_j)}{Kn},

where the weights \tilde{w}_j are normalized to sum to n instead of 1.
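As a sketch under stated assumptions (the function name is hypothetical; m holds the classification scores m_j for the true class, and K is the number of classes), the weighted cross-entropy loss can be written as:

```python
import math

def crossentropy_loss(m, w, K):
    """Weighted cross-entropy loss (sketch of 'crossentropy').

    m : classification scores m_j for the true class of each observation
    w : raw observation weights; renormalized here to sum to n
    K : number of classes
    """
    n = len(m)
    total_w = sum(w)
    # Normalize the weights to sum to n instead of 1
    w_tilde = [wj * n / total_w for wj in w]
    return -sum(wt * math.log(mj) for wt, mj in zip(w_tilde, m)) / (K * n)
```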

Exponential loss ('exponential')

L = \sum_{j=1}^{n} w_j \exp(-m_j).

Hinge loss ('hinge')

L = \sum_{j=1}^{n} w_j \max\{0, 1 - m_j\}.

Logit loss ('logit')

L = \sum_{j=1}^{n} w_j \log(1 + \exp(-m_j)).
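The margin-based losses (binomial deviance, exponential, hinge, logit, and the quadratic loss below) differ only in the per-observation function applied to the margin m_j. A hedged Python sketch (the function name and dispatch-by-string design are assumptions; w is assumed to sum to 1):

```python
import math

def margin_loss(m, w, kind):
    """Weighted margin-based classification loss (sketch).

    m    : classification margins m_j
    w    : observation weights (assumed to sum to 1)
    kind : 'binodeviance', 'exponential', 'hinge', 'logit', or 'quadratic'
    """
    per_obs = {
        'binodeviance': lambda mj: math.log(1 + math.exp(-2 * mj)),
        'exponential':  lambda mj: math.exp(-mj),
        'hinge':        lambda mj: max(0.0, 1 - mj),
        'logit':        lambda mj: math.log(1 + math.exp(-mj)),
        'quadratic':    lambda mj: (1 - mj) ** 2,
    }[kind]
    return sum(wj * per_obs(mj) for wj, mj in zip(w, m))
```

All five penalize small or negative margins; hinge is zero for any margin of at least 1, while the smooth losses keep shrinking as the margin grows.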

Minimal expected misclassification cost ('mincost')

'mincost' is appropriate only if classification scores are posterior
probabilities.

The software computes the weighted minimal
expected classification cost using this procedure for observations
j = 1,…,n.

  1. Estimate the expected misclassification cost of classifying the
     observation X_j into the class k:

     \gamma_{jk} = \left(f(X_j)' C\right)_k.

     f(X_j) is the column vector of class posterior probabilities for the
     observation X_j. C is the cost matrix stored in the Cost property of
     the model.

  2. For observation j, predict the class label corresponding to the
     minimal expected misclassification cost:

     \hat{y}_j = \operatorname*{argmin}_{k=1,\dots,K} \gamma_{jk}.

  3. Using C, identify the cost incurred (c_j) for making the prediction.

The weighted average of the minimal expected
misclassification cost loss is

L = \sum_{j=1}^{n} w_j c_j.
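The three-step procedure above can be sketched in Python (an illustration, not MathWorks code; the function name and argument layout are assumptions). posteriors holds the rows f(X_j), C is the cost matrix, y_true the true class indices, and w weights assumed to sum to 1:

```python
def mincost_loss(posteriors, y_true, C, w):
    """Weighted minimal expected misclassification cost loss (sketch of 'mincost')."""
    K = len(C)
    L = 0.0
    for f, yj, wj in zip(posteriors, y_true, w):
        # Step 1: expected misclassification cost gamma_jk = (f' C)_k for each k
        gamma = [sum(f[i] * C[i][k] for i in range(K)) for k in range(K)]
        # Step 2: predict the class with minimal expected cost
        y_hat = min(range(K), key=lambda k: gamma[k])
        # Step 3: cost incurred for that prediction, given the true class
        L += wj * C[yj][y_hat]
    return L
```

Note that with a 0/1 cost matrix, C[yj][y_hat] reduces to the indicator of a misclassification, so this sketch then coincides with the 'classiferror' loss.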

Quadratic loss ('quadratic')

L = \sum_{j=1}^{n} w_j (1 - m_j)^2.