python - PyTorch: passing numpy array for weight initialization

I'd like to initialize the parameters of an RNN with NumPy arrays.

In the following example, I want to pass w to the parameters of rnn. I know PyTorch provides many initialization methods such as Xavier and uniform, but is there a way to initialize the parameters by passing NumPy arrays?

import numpy as np
import torch
from torch import nn

input_size, hidden_size, num_layers = 3, 4, 2  # example dimensions

rng = np.random.RandomState(313)
w = rng.randn(input_size, hidden_size).astype(np.float32)

rnn = nn.RNN(input_size, hidden_size, num_layers)

1 Reply


First, note that nn.RNN has more than one weight variable; see the documentation:

Variables:

  • weight_ih_l[k] – the learnable input-hidden weights of the k-th layer, of shape (hidden_size, input_size) for k = 0, and (hidden_size, hidden_size) otherwise
  • weight_hh_l[k] – the learnable hidden-hidden weights of the k-th layer, of shape (hidden_size, hidden_size)
  • bias_ih_l[k] – the learnable input-hidden bias of the k-th layer, of shape (hidden_size)
  • bias_hh_l[k] – the learnable hidden-hidden bias of the k-th layer, of shape (hidden_size)
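
You can verify these shapes on a concrete instance by iterating over named_parameters(); a quick sketch, using the same example dimensions as the solutions below (input_size=3, hidden_size=4, num_layers=2):

from torch import nn

rnn = nn.RNN(input_size=3, hidden_size=4, num_layers=2)
for name, param in rnn.named_parameters():
    # prints, e.g., "weight_ih_l0 (4, 3)" and "weight_hh_l0 (4, 4)"
    print(name, tuple(param.shape))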

Now, each of these variables is a Parameter instance and an attribute of your nn.RNN instance. You can access them, and edit them, in two ways, as shown below:

  • Solution 1: Accessing all the RNN Parameter attributes by name (rnn.weight_hh_l0, rnn.weight_ih_l0, etc.):
import torch
from torch import nn
import numpy as np

input_size, hidden_size, num_layers = 3, 4, 2
use_bias = True
rng = np.random.RandomState(313)

rnn = nn.RNN(input_size, hidden_size, num_layers, bias=use_bias)

def set_nn_parameter_data(layer, parameter_name, new_data):
    # fetch the Parameter by its attribute name and overwrite its data
    param = getattr(layer, parameter_name)
    param.data = new_data

for i in range(num_layers):
    # weight_ih_l0 has shape (hidden_size, input_size); the layers above
    # it take the previous layer's hidden state as input instead
    in_size = input_size if i == 0 else hidden_size
    weights_hh_layer_i = rng.randn(hidden_size, hidden_size).astype(np.float32)
    weights_ih_layer_i = rng.randn(hidden_size, in_size).astype(np.float32)
    set_nn_parameter_data(rnn, "weight_hh_l{}".format(i),
                          torch.from_numpy(weights_hh_layer_i))
    set_nn_parameter_data(rnn, "weight_ih_l{}".format(i),
                          torch.from_numpy(weights_ih_layer_i))

    if use_bias:
        bias_hh_layer_i = rng.randn(hidden_size).astype(np.float32)
        bias_ih_layer_i = rng.randn(hidden_size).astype(np.float32)
        set_nn_parameter_data(rnn, "bias_hh_l{}".format(i),
                              torch.from_numpy(bias_hh_layer_i))
        set_nn_parameter_data(rnn, "bias_ih_l{}".format(i),
                              torch.from_numpy(bias_ih_layer_i))
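
A side note on the param.data assignment used above: writing to .data bypasses autograd's bookkeeping. On recent PyTorch versions, the usually recommended equivalent is an in-place copy_() under torch.no_grad(); a minimal sketch for a single parameter:

with torch.no_grad():
    new_w = torch.from_numpy(rng.randn(hidden_size, hidden_size).astype(np.float32))
    # in-place copy keeps the Parameter object (and any optimizer
    # references to it) intact
    rnn.weight_hh_l0.copy_(new_w)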
  • Solution 2: Accessing all the RNN Parameter attributes through the rnn.all_weights list attribute:
import torch
from torch import nn
import numpy as np

input_size, hidden_size, num_layers = 3, 4, 2
use_bias = True
rng = np.random.RandomState(313)

rnn = nn.RNN(input_size, hidden_size, num_layers, bias=use_bias)

for i in range(num_layers):
    # same shape rule as above: layer 0 consumes input_size features
    in_size = input_size if i == 0 else hidden_size
    weights_hh_layer_i = rng.randn(hidden_size, hidden_size).astype(np.float32)
    weights_ih_layer_i = rng.randn(hidden_size, in_size).astype(np.float32)
    # all_weights[i] is ordered [weight_ih, weight_hh, bias_ih, bias_hh]
    rnn.all_weights[i][0].data = torch.from_numpy(weights_ih_layer_i)
    rnn.all_weights[i][1].data = torch.from_numpy(weights_hh_layer_i)

    if use_bias:
        bias_hh_layer_i = rng.randn(hidden_size).astype(np.float32)
        bias_ih_layer_i = rng.randn(hidden_size).astype(np.float32)
        rnn.all_weights[i][2].data = torch.from_numpy(bias_ih_layer_i)
        rnn.all_weights[i][3].data = torch.from_numpy(bias_hh_layer_i)
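
After either loop, a quick sanity check confirms that the arrays were copied in; weights_hh_layer_i still holds the array from the last iteration (layer num_layers - 1), so:

assert rnn.weight_ih_l0.shape == (hidden_size, input_size)
# the Parameter's values now match the NumPy array we passed in
assert np.array_equal(rnn.weight_hh_l1.detach().numpy(), weights_hh_layer_i)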
