Struct BCEWithLogitsLossImpl
Defined in File loss.h
Inheritance Relationships
Base Type
public torch::nn::Cloneable< BCEWithLogitsLossImpl >
(Template Class Cloneable)
Struct Documentation
- struct BCEWithLogitsLossImpl : public torch::nn::Cloneable<BCEWithLogitsLossImpl>
This loss combines a Sigmoid layer and the BCELoss in one single class. This version is more numerically stable than using a plain Sigmoid followed by a BCELoss because, by combining the operations into one layer, we take advantage of the log-sum-exp trick for numerical stability. See https://pytorch.org/docs/main/nn.html#torch.nn.BCEWithLogitsLoss to learn about the exact behavior of this module.

See the documentation for the torch::nn::BCEWithLogitsLossOptions class to learn what constructor arguments are supported for this module.

Example:
BCEWithLogitsLoss model(BCEWithLogitsLossOptions().reduction(torch::kNone).weight(weight));
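
A fuller usage sketch building on this example (the shapes, values, and the weight tensor below are chosen purely for illustration and are not part of the API):

#include <torch/torch.h>
#include <iostream>

int main() {
  // Hypothetical per-element rescaling weight for a batch of 4 examples.
  torch::Tensor weight = torch::ones({4});

  torch::nn::BCEWithLogitsLoss model(
      torch::nn::BCEWithLogitsLossOptions().reduction(torch::kNone).weight(weight));

  // Raw, unnormalized scores (logits) and binary targets of matching shape.
  torch::Tensor input = torch::randn({4});
  torch::Tensor target = torch::empty({4}).random_(2);

  // With torch::kNone, the per-element losses are returned instead of a scalar.
  torch::Tensor loss = model(input, target);
  std::cout << loss << std::endl;
}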
Public Functions
- explicit BCEWithLogitsLossImpl(BCEWithLogitsLossOptions options_ = {})
- virtual void reset() override
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
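
For orientation only, a hypothetical custom module following the same Cloneable pattern might satisfy this contract as sketched below (assuming <torch/torch.h> is included; this is an illustrative sketch, not the actual BCEWithLogitsLossImpl implementation):

struct WeightedBCEImpl : torch::nn::Cloneable<WeightedBCEImpl> {
  explicit WeightedBCEImpl(torch::Tensor weight = {}) : initial_weight(std::move(weight)) {
    reset();
  }

  void reset() override {
    // Members with reference semantics (tensors, submodules) are
    // (re-)initialized here so that clone() can faithfully recreate them.
    weight = register_buffer("weight", initial_weight);
  }

  torch::Tensor initial_weight;
  torch::Tensor weight;
};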
- virtual void pretty_print(std::ostream &stream) const override
Pretty prints the BCEWithLogitsLoss module into the given stream.
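
Streaming a module (or its holder) to an output stream calls pretty_print() under the hood, so a sketch like the following (again assuming <torch/torch.h>) prints a one-line summary; the exact text shown is only indicative:

torch::nn::BCEWithLogitsLoss model(
    torch::nn::BCEWithLogitsLossOptions().reduction(torch::kSum));
// Prints something along the lines of "torch::nn::BCEWithLogitsLoss()".
std::cout << model << std::endl;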
- Tensor forward(const Tensor &input, const Tensor &target)
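
Because forward() consumes raw logits, it can be checked against the plain Sigmoid-then-BCELoss composition it replaces; the values below are made up for illustration (assuming <torch/torch.h>):

torch::nn::BCEWithLogitsLoss combined;  // default mean reduction
torch::nn::BCELoss plain;

torch::Tensor logits = torch::tensor({0.5f, -1.0f, 2.0f});
torch::Tensor target = torch::tensor({1.0f, 0.0f, 1.0f});

// Numerically stable form, applied directly to the logits.
torch::Tensor stable = combined->forward(logits, target);

// The same quantity computed the less stable way: sigmoid first, then BCELoss.
torch::Tensor naive = plain->forward(torch::sigmoid(logits), target);

std::cout << stable.item<float>() << " vs " << naive.item<float>() << std::endl;

For moderate logits the two results agree closely, while the combined form stays well behaved for large-magnitude logits.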
Public Members
- BCEWithLogitsLossOptions options
The options with which this Module was constructed.
- Tensor weight
A manual rescaling weight given to the loss of each batch element.
- Tensor pos_weight
A weight of positive examples.
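
pos_weight is typically supplied through the options to up-weight the positive term of each label, for example to counteract label imbalance; the shapes and the assumed nine-to-one negative-to-positive ratio below are illustrative (assuming <torch/torch.h>):

// Three labels, each with roughly nine negatives per positive (illustrative).
torch::Tensor pos_weight = torch::full({3}, 9.0);

torch::nn::BCEWithLogitsLoss loss_fn(
    torch::nn::BCEWithLogitsLossOptions().pos_weight(pos_weight));

torch::Tensor logits = torch::randn({8, 3});  // batch of 8, 3 labels
torch::Tensor targets = torch::randint(0, 2, {8, 3}).to(torch::kFloat);

torch::Tensor loss = loss_fn(logits, targets);  // scalar (mean reduction by default)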