Yang Zhang
Nov 12, 2019

From the documentation, it seems cross_entropy is just a convenient shortcut, equivalent to log_softmax + nll_loss:

This criterion combines log_softmax and nll_loss in a single function.

I usually just use cross_entropy because it’s less code, but I’d love to see someone point out their actual differences, if any.
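For what it’s worth, a quick numerical check bears out the equivalence. This is a minimal sketch; the batch size, class count, and dim=1 are my choices for illustration, not anything from the docs:

```python
import torch
import torch.nn.functional as F

# Random logits for a batch of 4 examples over 5 classes,
# plus integer class targets.
logits = torch.randn(4, 5)
targets = torch.randint(0, 5, (4,))

# One-step version: cross_entropy applied directly to raw logits.
loss_ce = F.cross_entropy(logits, targets)

# Two-step version: log_softmax over the class dim, then nll_loss.
loss_two_step = F.nll_loss(F.log_softmax(logits, dim=1), targets)

print(torch.allclose(loss_ce, loss_two_step))  # True
```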
