How do I calculate cross entropy loss using Python and PyTorch?
Cross entropy loss is a measure of how well a set of predicted class scores matches the true labels of a data set. In PyTorch it can be calculated with torch.nn.functional.cross_entropy, which expects raw, unnormalized scores (logits) with one column per class - not probabilities - together with the true labels as integer class indices. First define both as tensors.
import torch
true_labels = torch.tensor([1, 0, 0, 1])
predicted_logits = torch.tensor([[0.1, 0.9],
                                 [0.8, 0.2],
                                 [0.9, 0.1],
                                 [0.2, 0.8]])
Then, the cross entropy loss can be calculated with the torch.nn.functional.cross_entropy function, which applies log-softmax to the logits internally.
loss = torch.nn.functional.cross_entropy(predicted_logits, true_labels)
print(loss)
Output example
tensor(0.4043)
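As a sanity check, cross_entropy is equivalent to applying log_softmax to the logits and then taking the negative log-likelihood with nll_loss. The sketch below, using the same tensors as above, verifies that both paths give the same value.

```python
import torch
import torch.nn.functional as F

true_labels = torch.tensor([1, 0, 0, 1])
predicted_logits = torch.tensor([[0.1, 0.9],
                                 [0.8, 0.2],
                                 [0.9, 0.1],
                                 [0.2, 0.8]])

# cross_entropy == log_softmax over the class dimension, then nll_loss
log_probs = F.log_softmax(predicted_logits, dim=1)
manual_loss = F.nll_loss(log_probs, true_labels)

builtin_loss = F.cross_entropy(predicted_logits, true_labels)
print(torch.allclose(manual_loss, builtin_loss))  # True
```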
Code explanation
import torch
- imports the PyTorch library
true_labels = torch.tensor([1, 0, 0, 1])
- defines the true labels as a tensor of class indices
predicted_logits = torch.tensor(...)
- defines the predicted logits as a tensor with one row per sample and one column per class
loss = torch.nn.functional.cross_entropy(predicted_logits, true_labels)
- calculates the cross entropy loss
print(loss)
- prints out the cross entropy loss
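If the predictions really are probabilities of a single positive class (as in a binary classifier with a sigmoid output), the matching function is binary_cross_entropy, which takes probabilities in [0, 1] and float targets. A minimal sketch:

```python
import torch
import torch.nn.functional as F

# Float targets and per-sample probabilities of the positive class
true_labels = torch.tensor([1.0, 0.0, 0.0, 1.0])
predicted_probabilities = torch.tensor([0.9, 0.2, 0.1, 0.8])

# binary_cross_entropy averages -[y*log(p) + (1-y)*log(1-p)] over samples
loss = F.binary_cross_entropy(predicted_probabilities, true_labels)
print(loss)  # tensor(0.1643)
```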
More of Python Pytorch
- How can I use Yolov5 with PyTorch?
- How can I use Python, PyTorch, and YOLOv5 to build an object detection model?
- How can I use Python and PyTorch to parse XML files?
- How do I use Pytorch with Python 3.11 on Windows?
- How can I use Python PyTorch with CUDA?
- How can I use the Softmax function in Python with PyTorch?
- How do I use PyTorch with Python version 3.11?
- What is the most compatible version of Python to use with PyTorch?
- How do I check the version of Python and PyTorch I am using?
- How do I determine the version of Python and PyTorch I'm using?