I have a dataset on which I am trying to train a regression model. The pattern of the values in the dataset is more or less always the same and can be described by the following images:
[Images: Sample 1 and Sample 2]
On the left are the raw values in the input space, and on the right the same values after I have sorted them (they follow a polynomial pattern). Someone looking at the raw-value plots might take the sparse points at the top for outliers, but they are not: they are quite important for the study I want to carry out, so they should be taken into account. For regressing the values I have tried mainly four different loss functions, i.e. MAE, MSE, RMSE and a custom one that combines MAE and MSE, applied to the low and high error values respectively:
import torch

class CustomLoss(torch.nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, predicted, true):
        errors = torch.abs(predicted - true)
        # MSE-style penalty on large errors (> 250), plain MAE on the rest
        mask = (errors > 250).float()
        return torch.mean(0.5 * mask * errors ** 2 + (1.0 - mask) * errors)
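For reference, this is roughly how the loss is plugged into the training loop (a minimal sketch; the model architecture and the x_train/y_train tensors are placeholders standing in for my actual setup):

import torch

# Dummy data standing in for the real samples (placeholder only)
x_train = torch.linspace(0, 10, 200).unsqueeze(1)
y_train = x_train ** 2 + 50 * torch.randn_like(x_train)

model = torch.nn.Sequential(
    torch.nn.Linear(1, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
criterion = CustomLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(1000):
    optimizer.zero_grad()
    loss = criterion(model(x_train), y_train)
    loss.backward()
    optimizer.step()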
The output for each can be seen below:
[Images: fits obtained with MAE, MSE, RMSE and MAE+MSE]
However, as you can see, all of them fail to capture the sparse high values. I get a similar output for Sample 2. Thus, I would like to ask whether there is a way to address this kind of value with a custom loss or some other approach.
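For example, one idea I was considering is to weight each sample's error by the magnitude of its target, so that the sparse high values contribute more to the loss. A rough sketch of what I mean (the weighting scheme here is just a guess, not something I have validated):

import torch

class WeightedMSELoss(torch.nn.Module):
    """MSE where each sample is weighted by its normalised target magnitude,
    so that errors on the sparse high values count more (hypothetical sketch)."""
    def forward(self, predicted, true):
        weights = 1.0 + true.abs() / true.abs().max()   # weights in [1, 2]
        return torch.mean(weights * (predicted - true) ** 2)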
Thanks in advance.
question from: https://stackoverflow.com/questions/65852301/pytorch-and-optimizing-custom-loss-for-polynomial-regression