
Why does an LSTM fail me on this time series regression problem?

I am trying to regress one dependent variable y from 114 independent variables x at each time step, as in the diagram below. The problem is that the model cannot learn anything about y. There is clearly a period T in y. Is it because the input sequence length I use equals T, so the model only sees a single period in each forward and backward pass?

Maybe an LSTM is simply not suitable for this regression? I tried PLS, which demonstrates that there is period information in x (a rough sketch of that check is below).
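Roughly, the PLS check looked like the following minimal sketch (using scikit-learn's PLSRegression; the shapes, component count, and dummy data here are placeholders, not my real data):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Placeholder data with the same layout as one trial: 299 time steps x 114 features.
X = np.random.randn(299, 114)
y = np.sin(np.linspace(0, 6 * np.pi, 299))  # stand-in periodic target

pls = PLSRegression(n_components=10)
pls.fit(X, y)
y_hat = pls.predict(X).ravel()

# A well-correlated, clearly periodic fit suggests the period information is present in x.
print(np.corrcoef(y, y_hat)[0, 1])
```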

[figure: time series plot of the target y, showing the period T]

My code is:


import torch
import torch.nn as nn

num_layer = 1  # number of stacked LSTM layers

class LSTM(nn.Module):
    def __init__(self, input_size=114, hidden_size=200, output_size=1):
        super().__init__()
        self.hidden_size = hidden_size
        # batch_first=False, so inputs are (seq_len, batch, input_size)
        self.lstm = nn.LSTM(input_size, hidden_size, num_layer)
        self.linear = nn.Linear(hidden_size, output_size)
        self.hidden_cell = (torch.zeros(num_layer, 1, self.hidden_size),
                            torch.zeros(num_layer, 1, self.hidden_size))
        self.relu = nn.ReLU()

    def forward(self, input_seq):
        outputs, self.hidden_cell = self.lstm(input_seq, self.hidden_cell)
        # one prediction per time step, passed through ReLU
        predictions = self.relu(self.linear(outputs.squeeze()))
        return predictions

model = LSTM()
loss_function = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

epochs = 100
for epoch in range(epochs):
    model.train()
    # trials, train, targets, test and test_target come from my data preparation (not shown)
    for trial in range(trials):
        # reshape one trial from (114, 299) to (seq_len, batch, features) = (299, 1, 114)
        input_seq = train[trial].permute(1, 0).unsqueeze(1)  # torch.Size([299, 1, 114])
        target = targets[trial, :]

        # reset the hidden state for every trial
        model.hidden_cell = (torch.zeros(num_layer, 1, model.hidden_size),
                             torch.zeros(num_layer, 1, model.hidden_size))
        y_hat = model(input_seq)
        loss = loss_function(y_hat.squeeze(), target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        # reset the hidden state before evaluating on the test sequence
        model.hidden_cell = (torch.zeros(num_layer, 1, model.hidden_size),
                             torch.zeros(num_layer, 1, model.hidden_size))
        y_test_predict = model(test.unsqueeze(1))
        losst = loss_function(y_test_predict.squeeze(), test_target)

    print(f'epoch: {epoch:3} loss: {loss.item():10.8f}, test loss: {losst.item():10.8f}')
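For reference, `train`, `targets`, `test`, `test_target` and `trials` come from my own preprocessing; a self-contained stand-in with the same shapes (random values, purely illustrative) would be:

```python
import torch

trials = 20                            # number of training trials (placeholder)
train = torch.randn(trials, 114, 299)  # (trials, features, time steps)
targets = torch.randn(trials, 299)     # one target value per time step
test = torch.randn(299, 114)           # test sequence: (time steps, features)
test_target = torch.randn(299)
```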


Question from: https://stackoverflow.com/questions/65885799/why-lstm-fail-me-on-this-time-series-regression-problem


1 Reply

Waiting for answers.
