Custom Loss function #396
-
Hi ✌🏽

**Problem**

I'm struggling to train my model with a custom loss function. I've tested two different ways of passing the loss to `NeuralProphet`.

**Loss function: Python function**

```python
import torch
from torch import Tensor


def MeanPinballLoss(y: Tensor, yhat: Tensor, alpha: float = 0.9) -> Tensor:
    y_diff = y - yhat
    yhat_diff = yhat - y
    loss = (
        alpha * torch.max(y_diff, torch.zeros_like(y_diff))
        + (1 - alpha) * torch.max(yhat_diff, torch.zeros_like(yhat_diff))
    ).mean()  # .sum().mean() in the original returns the sum, since .mean() of a scalar is a no-op
    return loss
```

Using this function during model init raises an error.
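For reference, the function form can be sanity-checked in isolation before handing it to the model. This is a minimal sketch; the example tensors are my own, and I use a plain `.mean()` reduction to match the "mean pinball" name:

```python
import torch
from torch import Tensor


def MeanPinballLoss(y: Tensor, yhat: Tensor, alpha: float = 0.9) -> Tensor:
    # Penalise under-prediction (y > yhat) with weight alpha
    # and over-prediction (yhat > y) with weight 1 - alpha.
    y_diff = y - yhat
    yhat_diff = yhat - y
    return (
        alpha * torch.max(y_diff, torch.zeros_like(y_diff))
        + (1 - alpha) * torch.max(yhat_diff, torch.zeros_like(yhat_diff))
    ).mean()


y = torch.tensor([1.0, 3.0])
yhat = torch.tensor([2.0, 1.0])
loss = MeanPinballLoss(y, yhat, alpha=0.9)
# Per element: 0.1 * 1 (over-prediction) and 0.9 * 2 (under-prediction); mean = 0.95
print(loss.item())  # 0.95
```

A perfect forecast gives zero loss, and with `alpha > 0.5` under-predictions are penalised more heavily, which is the point of a quantile/pinball loss.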
I had a look at the source code and, from my understanding of `neural_prophet/neuralprophet/configure.py` (Lines 40 to 55 in 31051a2), this should not happen, since `callable(MeanPinballLoss)` is `True` (line 50).
**Loss function: PyTorch module loss**

My next attempt was to extend `_Loss`:

```python
import torch
from torch import Tensor
from torch.nn.modules.loss import _Loss


class MeanPinballLoss(_Loss):
    __constants__ = ['reduction']

    def __init__(self, size_average=None, reduce=None, reduction: str = 'mean', alpha: float = 0.5) -> None:
        super(MeanPinballLoss, self).__init__(size_average, reduce, reduction)
        self.alpha = alpha

    def forward(self, input: Tensor, target: Tensor) -> Tensor:
        y_diff = target - input
        yhat_diff = input - target
        loss = (
            self.alpha * torch.max(y_diff, torch.zeros_like(y_diff))
            + (1 - self.alpha) * torch.max(yhat_diff, torch.zeros_like(yhat_diff))
        ).mean()  # .sum().mean() in the original returns the sum, since .mean() of a scalar is a no-op
        return loss
```

Initialising this with `NeuralProphet` also raises an error, this time at Line 52.

**End**

I could not find any similar issues with custom loss functions, and I would appreciate any help to know whether I'm doing something wrong or whether there is a bug. Thanks!
-
Hi @nishai,

Thank you for documenting your issue so nicely!

I think you are assessing the situation right: you seem to be doing everything correctly, but our docstring is different from the actual check.

-> We should relax the check to allow any `loss_func` that is a subclass of `_Loss`, not just the losses predefined in Torch.

We have one test for a callable function as loss: `neural_prophet/tests/test_integration.py` Line 574 in 31051a2. However, we currently do not have a test covering your case. I would appreciate it if you could open a pull request with:

You can find how to do a "dev install" (runs black and tests as hooks):

Hope this helps, and I would love to welcome you as a Contributor to the community!
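One possible shape for the relaxed check, purely illustrative: the function name and the name-to-loss mapping below are placeholders of mine, not the actual `configure.py` code.

```python
import torch
from torch.nn.modules.loss import _Loss


def resolve_loss_func(loss_func):
    """Illustrative sketch: accept a known name, any _Loss instance
    (built-in or user-defined), or a plain callable."""
    named = {"mse": torch.nn.MSELoss(), "huber": torch.nn.SmoothL1Loss()}
    if isinstance(loss_func, str):
        return named[loss_func.lower()]
    if isinstance(loss_func, _Loss):  # covers custom subclasses, not just torch's own
        return loss_func
    if callable(loss_func):
        return loss_func
    raise NotImplementedError(f"Loss function {loss_func} is not supported")


# A user-defined subclass now passes the same gate as the built-ins.
class MyLoss(_Loss):
    def forward(self, input, target):
        return (input - target).abs().mean()


print(type(resolve_loss_func("mse")).__name__)     # MSELoss
print(type(resolve_loss_func(MyLoss())).__name__)  # MyLoss
```

The key line is the `isinstance(loss_func, _Loss)` branch: checking against the abstract base rather than an allow-list of predefined torch losses is what lets user subclasses through.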