How do I calculate cross entropy loss using Python and PyTorch?
Cross entropy loss is a measure of how well a model's predicted class scores match the true labels of a data set. It can be calculated using Python and PyTorch by first defining the true labels and the predicted class scores (logits) as tensors.
import torch

true_labels = torch.tensor([1, 0, 0, 1])
# one row of raw class scores (logits) per sample, one column per class
predicted_logits = torch.tensor([[0.1, 0.9], [0.8, 0.2], [0.9, 0.1], [0.2, 0.8]])
Then, the cross entropy loss can be calculated using the torch.nn.functional.cross_entropy function (also available as the torch.nn.CrossEntropyLoss module). It expects raw, unnormalized logits of shape (N, C) and integer class labels of shape (N,), and it applies log-softmax internally, so the scores should not be passed through softmax first.
loss = torch.nn.functional.cross_entropy(predicted_logits, true_labels)
print(loss)
Output example
tensor(0.4043)
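As a sanity check, the same value can be reproduced by hand: cross_entropy is equivalent to log_softmax over the class dimension followed by nll_loss. A minimal sketch, reusing the tensors defined above:

import torch
import torch.nn.functional as F

true_labels = torch.tensor([1, 0, 0, 1])
predicted_logits = torch.tensor([[0.1, 0.9], [0.8, 0.2], [0.9, 0.1], [0.2, 0.8]])

# cross_entropy = log_softmax over classes + negative log-likelihood (mean-reduced)
log_probs = F.log_softmax(predicted_logits, dim=1)
manual_loss = F.nll_loss(log_probs, true_labels)
print(manual_loss)  # tensor(0.4043), matching torch.nn.functional.cross_entropy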
Code explanation
import torch - imports the PyTorch library
true_labels = torch.tensor([1, 0, 0, 1]) - defines the true labels as a tensor of integer class indices
predicted_logits = torch.tensor([[0.1, 0.9], [0.8, 0.2], [0.9, 0.1], [0.2, 0.8]]) - defines the predicted class scores (logits) as a tensor, one row per sample
loss = torch.nn.functional.cross_entropy(predicted_logits, true_labels) - calculates the cross entropy loss
print(loss) - prints out the cross entropy loss
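If the model instead outputs a single probability per sample for a binary problem, torch.nn.functional.binary_cross_entropy is the matching function rather than cross_entropy. A short sketch under that assumption (the probability values here are illustrative):

import torch
import torch.nn.functional as F

true_labels = torch.tensor([1, 0, 0, 1])
predicted_probabilities = torch.tensor([0.9, 0.2, 0.1, 0.8])

# binary_cross_entropy expects probabilities in [0, 1] and float-typed targets
loss = F.binary_cross_entropy(predicted_probabilities, true_labels.float())
print(loss)  # tensor(0.1643)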