Welcome to the OGeek Q&A Community for programmers and developers: Open, Learning and Share


python - PyTorch crashes in backward() and freezes the screen, could you help me?

import torch

def forward(x):
    # Linear model: y_pred = x * w
    return x * w

def loss(x, y):
    # Squared error for a single sample.
    y_pred = forward(x)
    return (y_pred - y) ** 2

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

# Learnable scalar weight, initialized to 1.0.
w = torch.tensor([1.0], requires_grad=True)

print("predict (before training)", 4, forward(4).item())

for epoch in range(100):
    for x, y in zip(x_data, y_data):
        l = loss(x, y)
        l.backward()                    # accumulate dl/dw into w.grad
        print('grad:', x, y, w.grad.item())
        with torch.no_grad():           # SGD step outside of autograd
            w -= 0.01 * w.grad
        w.grad.zero_()                  # reset the accumulated gradient
        print("progress:", epoch, l.item())

print("predict (after training)", 4, forward(4).item())
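For what it's worth, the optimization itself is trivial: the same SGD loop written in pure Python (no PyTorch, no GPU) converges to w ≈ 2, which suggests the freeze comes from the environment rather than the math. A minimal sketch of the hand-derived gradient, d/dw (x·w − y)² = 2x(x·w − y):

```python
# Pure-Python version of the same per-sample SGD loop, using the
# hand-computed gradient 2*x*(x*w - y) instead of autograd.
x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

w = 1.0   # same initial weight as the PyTorch version
lr = 0.01

for epoch in range(100):
    for x, y in zip(x_data, y_data):
        grad = 2 * x * (x * w - y)  # d/dw of (x*w - y)**2
        w -= lr * grad

print(round(w, 4))  # → 2.0
```

If this converges on your machine while the PyTorch version freezes it, the code is not the culprit.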

Linux distribution: Ubuntu 18.04

GPU: GeForce RTX 2080 SUPER

GPU driver: NVIDIA UNIX x86_64 Kernel Module 450.80.02

CUDA: Cuda compilation tools, release 11.0, V11.0.194

cuDNN: 8.0.3

PyTorch: 1.7.0 (py3.6_cuda11.0.221_cudnn8.0.3_0)

Python: 3.6

The code is very simple, but it freezes the whole computer when I run it: the screen, keyboard, and mouse stop responding entirely.



1 Reply


A full-system freeze like this usually points to the GPU/driver stack rather than the Python code: reinstalling the NVIDIA driver solved the problem.
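On Ubuntu 18.04, one common way to do a clean reinstall is via apt and `ubuntu-drivers`. This is a sketch that assumes a standard apt-based driver install; adjust accordingly if the driver was installed with NVIDIA's `.run` installer:

```shell
# Remove the existing NVIDIA driver packages (apt-based installs only).
sudo apt-get purge '^nvidia-.*'
sudo apt-get autoremove

# Install the driver version Ubuntu recommends for the detected GPU.
sudo ubuntu-drivers autoinstall

# Reboot so the new kernel module is loaded, then verify with nvidia-smi.
sudo reboot
```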

