# Losses

## LabelSmoothingCrossEntropy

Bases: `Module`
Label smoothing makes the model a little less certain of its decision by softening the target: instead of asking it to predict 1 for the correct class and 0 for all the others, we ask it to predict 1 - ε for the correct class and ε for all the others, with ε a (small) positive number.
`__init__(eps=0.1, reduction='mean')`
Parameters:

Name | Type | Description | Default
---|---|---|---
`eps` | `float` | Weight for the interpolation formula. | `0.1`
`reduction` | `str` | Reduction applied to the loss tensor. | `'mean'`
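To make the interpolation concrete, here is a minimal sketch of how such a loss can be implemented: the final loss blends a uniform term over all classes (weighted by `eps`) with the usual negative log-likelihood (weighted by `1 - eps`). This is an illustrative implementation consistent with the description above, not necessarily the library's exact code.

```python
import torch
import torch.nn.functional as F
from torch import nn

class LabelSmoothingCrossEntropy(nn.Module):
    """Cross-entropy with label smoothing (illustrative sketch)."""
    def __init__(self, eps: float = 0.1, reduction: str = 'mean'):
        super().__init__()
        self.eps, self.reduction = eps, reduction

    def forward(self, output: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        n_classes = output.size(-1)
        log_preds = F.log_softmax(output, dim=-1)
        # Smoothed part: eps is spread uniformly over all classes.
        loss = -log_preds.sum(dim=-1)
        if self.reduction == 'sum':
            loss = loss.sum()
        elif self.reduction == 'mean':
            loss = loss.mean()
        # Usual negative log-likelihood for the correct class.
        nll = F.nll_loss(log_preds, target, reduction=self.reduction)
        return loss * self.eps / n_classes + (1 - self.eps) * nll
```

With `eps=0` this reduces to plain cross-entropy, which is a quick sanity check for the formula.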
## NoneReduce

Force non-reduction on the loss tensor so it can be used later in methods such as `Mixup` or `LabelSmoothing`.
`__init__(loss_func)`
Parameters:

Name | Type | Description | Default
---|---|---|---
`loss_func` | `Callable` | Loss function. | required
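One common way to "force non-reduction" is a context manager that temporarily sets `reduction='none'` on the wrapped loss and restores the original setting on exit. The sketch below assumes `loss_func` is either a loss module with a `reduction` attribute or a plain function accepting a `reduction` keyword; it illustrates the idea rather than the library's exact implementation.

```python
from functools import partial
import torch
from torch import nn

class NoneReduce:
    """Context manager sketch: temporarily disable loss reduction."""
    def __init__(self, loss_func):
        self.loss_func = loss_func
        self.old_red = None

    def __enter__(self):
        if hasattr(self.loss_func, 'reduction'):
            # Loss module with a `reduction` attribute: switch it off in place.
            self.old_red = self.loss_func.reduction
            self.loss_func.reduction = 'none'
            return self.loss_func
        # Plain function: assume it accepts a `reduction` keyword.
        return partial(self.loss_func, reduction='none')

    def __exit__(self, exc_type, exc_value, traceback):
        if self.old_red is not None:
            self.loss_func.reduction = self.old_red
```

Inside the context, a loss such as `nn.CrossEntropyLoss()` yields one value per sample, which is what per-sample schemes like Mixup need before applying their own weighting.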
## reduce_loss

`reduce_loss(loss, reduction=None)`

Reduce the `loss` tensor using the `reduction` method. If `reduction` is `None`, the passed `loss` tensor is returned unchanged.
Parameters:

Name | Type | Description | Default
---|---|---|---
`loss` | `Tensor` | Loss tensor. | required
`reduction` | `str \| None` | Reduction applied to the loss tensor. | `None`

Returns:

Type | Description
---|---
`Tensor` | Reduced loss tensor.
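The documented behavior maps directly onto a small helper; this sketch assumes the standard `'mean'`/`'sum'` reduction names used by PyTorch losses.

```python
import torch
from torch import Tensor
from typing import Optional

def reduce_loss(loss: Tensor, reduction: Optional[str] = None) -> Tensor:
    """Apply `reduction` to `loss`; pass it through when reduction is None."""
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    # reduction is None (or 'none'): return the unreduced tensor as-is.
    return loss
```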