[ Pytorch ] nn.BCELoss, nn.BCEWithLogitsLoss, nn.CrossEntropyLoss, nn.NLLLoss: A Complete Summary
This post is a follow-up to the post linked below. If you want the background, please read that one first!
[ Loss ] Cross-Entropy, Negative Log-Likelihood Explained! ( + Pytorch Code )
Binary Classification (nn.BCELoss, nn.BCEWithLogitsLoss)
nn.BCELoss
- Short for Binary Cross Entropy Loss
- Does not apply a sigmoid function internally
- Input : ( sigmoid(f(x)), target )
- target (float type) must have the same shape as the model's prediction
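As a minimal sketch of these requirements (the linear head and data below are hypothetical, purely for illustration): the sigmoid is applied manually, and the float target is reshaped to match the (N, 1) prediction.

import torch
import torch.nn as nn

torch.manual_seed(0)
head = nn.Linear(4, 1)                  # hypothetical binary classifier head
x = torch.randn(8, 4)                   # hypothetical batch of 8 samples
y = torch.randint(0, 2, (8,)).float()   # BCELoss requires a float target

y_hat = torch.sigmoid(head(x))          # sigmoid applied manually; shape (8, 1)
# the target must match the prediction's shape, hence the unsqueeze
loss = nn.BCELoss()(y_hat, y.unsqueeze(1))
print(loss)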
nn.BCEWithLogitsLoss
- nn.BCELoss with the sigmoid function built in
- Input : ( f(x), target )
- Otherwise behaves like nn.BCELoss
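With nn.BCEWithLogitsLoss, the same hypothetical head from the sketch above is trained directly on the raw logits, with no explicit sigmoid:

import torch
import torch.nn as nn

torch.manual_seed(0)
head = nn.Linear(4, 1)                  # hypothetical binary classifier head
x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,)).float()

logits = head(x)                        # raw scores f(x); no sigmoid here
loss = nn.BCEWithLogitsLoss()(logits, y.unsqueeze(1))
print(loss)

In training code, nn.BCEWithLogitsLoss is generally preferred, both for convenience and for the numerical stability demonstrated after the next code block.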
import torch
import torch.nn as nn
torch.manual_seed(77)
### Binary Setting ###
print(f"{'Setting up binary case':-^80}")
z = torch.randn(5) # z = f(x)
y_hat = torch.sigmoid(z) # y_hat = sigmoid(f(x))
y = torch.tensor([0.,1.,1.,0.,1.]) # ground truth (float type)
print(f"z = {z}\ny_hat = {y_hat}\ny = {y}\n{'':-^80}")
# Negative Log-likelihoods
loss_NLL_scratch = -(y * y_hat.log() + (1 - y) * (1 - y_hat).log())
print(f"Negative Log-likelihoods\n {loss_NLL_scratch}")
# BCELoss from PyTorch
loss_BCE = nn.BCELoss(reduction='none')(y_hat,y) # Input : y_hat, y
print(f"PyTorch BCELoss\n {loss_BCE}")
# BCEWithLogitsLoss from PyTorch
loss_BCEWithLogits = nn.BCEWithLogitsLoss(reduction='none')(z,y) # Input : z, y
print(f"PyTorch BCEWithLogitsLoss\n {loss_BCEWithLogits}")
Multiclass Classification (nn.CrossEntropyLoss, nn.NLLLoss)
nn.CrossEntropyLoss
- Includes LogSoftmax internally
- Input : ( f(x), target )
- In the example below, target (long type): each element holds the integer index of its class (0 ~ C-1)
- target can take various shapes depending on the input's shape, as sketched below
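As a sketch of that last point (the shapes here are arbitrary): for a segmentation-style input of shape (N, C, H, W), the index target has shape (N, H, W), one class index per pixel.

import torch
import torch.nn as nn

z = torch.randn(2, 3, 4, 4)            # batch 2, C=3 classes, 4x4 grid of logits
y = torch.randint(0, 3, (2, 4, 4))     # one class index per pixel, in 0..C-1

loss = nn.CrossEntropyLoss()(z, y)     # per-pixel CE, averaged by default
print(loss)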
nn.NLLLoss
- Does not include LogSoftmax
- Input : ( LogSoftmax(f(x)), target )
- Otherwise behaves like nn.CrossEntropyLoss
import torch
import torch.nn as nn
torch.manual_seed(77)
### Multi class Setting ###
print(f"{'Setting up multiclass case':-^80}")
z = torch.randn(5,3) # z = f(x)
y_hat = torch.softmax(z, dim=1) # y_hat = softmax(f(x))
# ground truth : each element in target has to have 0 <= value < C
y = torch.tensor([0,1,0,2,1])
print(f"z = {z}\ny_hat = {y_hat}\ny = {y}\n{'':-^80}")
# Negative Log-likelihoods
# - pick out the log-probability of the correct class for each sample
loss_NLL_scratch = -y_hat.log()[torch.arange(5),y.long()]
print(f"Negative Log-likelihoods\n {loss_NLL_scratch}")
# CrossEntropyLoss from PyTorch
# - It includes LogSoftmax
# - Input : z, y
loss_CE = nn.CrossEntropyLoss(reduction='none')(z,y)
print(f"PyTorch CrossEntropyLoss\n {loss_CE}")
# NLLLoss from PyTorch
# - It doesn't include LogSoftmax
# - Input : y_hat.log(), y
loss_NLL = nn.NLLLoss(reduction='none')(y_hat.log(),y)
print(f"PyTorch NLLLoss\n {loss_NLL}")
[ Next Posts ]
[ Pytorch ] How to Combine PyTorch Tensors: cat(), stack() ( + the meaning of dim and the difference in merge behavior)
[ PyTorch / torchvision ] Using make_grid()
[ PyTorch / torchvision ] Using draw_bounding_boxes()
[ Reference ]
https://towardsdatascience.com/cross-entropy-negative-log-likelihood-and-all-that-jazz-47a95bd2e81