
python - Should I use softmax as output when using cross entropy loss in PyTorch?

I am training a fully connected deep neural net with 2 hidden layers to classify the MNIST dataset in PyTorch.

I want to use tanh as the activation in both hidden layers, but at the end, I think I should use softmax.

For the loss, I am choosing nn.CrossEntropyLoss() in PyTorch, which (as I have found out) does not take one-hot encoded labels as true labels, but takes a LongTensor of class indices instead.
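
For reference, a minimal sketch of that target format (the tensors below are made up):

import torch

# One-hot labels for a batch of 3 samples and 10 classes:
one_hot = torch.tensor([[0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
                        [1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
                        [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]])
# nn.CrossEntropyLoss expects the index of the correct class per sample:
targets = one_hot.argmax(dim=1)  # tensor([2, 0, 7]), dtype=torch.int64 (a LongTensor)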

My model is built with nn.Sequential(), and when I use softmax at the end, it gives me worse results in terms of accuracy on the test data. Why?

import torch
from torch import nn

inputs, n_hidden0, n_hidden1, out = 784, 128, 64, 10
n_epochs = 500
model = nn.Sequential(
    nn.Linear(inputs, n_hidden0, bias=True),
    nn.Tanh(),
    nn.Linear(n_hidden0, n_hidden1, bias=True),
    nn.Tanh(),
    nn.Linear(n_hidden1, out, bias=True),
    nn.Softmax(dim=1)  # SHOULD THIS BE THERE?
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.5)

# X_train: flattened MNIST images of shape (N, 784); Y_train: class indices (LongTensor of shape (N,))
for epoch in range(n_epochs):
    y_pred = model(X_train)
    loss = criterion(y_pred, Y_train)
    print('epoch: ', epoch + 1, ' loss: ', loss.item())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

1 Reply


As stated in the torch.nn.CrossEntropyLoss() doc:

This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class.
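
You can check this equivalence directly (a quick sketch with random tensors):

import torch
from torch import nn

logits = torch.randn(4, 10)           # raw model outputs for 4 samples, 10 classes
targets = torch.randint(0, 10, (4,))  # class indices as a LongTensor

ce = nn.CrossEntropyLoss()(logits, targets)
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)
print(torch.allclose(ce, nll))  # True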

Therefore, you should not apply a softmax before the loss; the last layer should output raw logits. With the extra nn.Softmax at the end, the criterion effectively computes a log-softmax of an already-softmaxed output: the values entering the second softmax are squeezed into [0, 1], so its output is close to uniform and the gradients become very small, which is why training is slower and your test accuracy is worse.
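
In your code, that means dropping nn.Softmax() from the nn.Sequential() and applying softmax only at inference time if you need probabilities. A minimal sketch (X_test is assumed to be prepared the same way as X_train):

model = nn.Sequential(
    nn.Linear(inputs, n_hidden0, bias=True),
    nn.Tanh(),
    nn.Linear(n_hidden0, n_hidden1, bias=True),
    nn.Tanh(),
    nn.Linear(n_hidden1, out, bias=True),  # raw logits; CrossEntropyLoss applies log-softmax itself
)

criterion = nn.CrossEntropyLoss()

# At inference time, apply softmax explicitly if you need probabilities:
with torch.no_grad():
    logits = model(X_test)
    probs = torch.softmax(logits, dim=1)  # class probabilities
    preds = logits.argmax(dim=1)          # predicted classes (argmax is the same on logits or probs)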

