PyTorch: accessing the weights of a specific module inside nn.Sequential()


Question

This should be quick to answer. When I use a predefined module in PyTorch, I can usually access its weights easily. But how do I access them if the module is first wrapped in nn.Sequential()? See the toy example below.

class My_Model_1(nn.Module):
    def __init__(self,D_in,D_out):
        super(My_Model_1, self).__init__()
        self.layer = nn.Linear(D_in,D_out)
    def forward(self,x):
        out = self.layer(x)
        return out

class My_Model_2(nn.Module):
    def __init__(self,D_in,D_out):
        super(My_Model_2, self).__init__()
        self.layer = nn.Sequential(nn.Linear(D_in,D_out))
    def forward(self,x):
        out = self.layer(x)
        return out

model_1 = My_Model_1(10,10)
print(model_1.layer.weight)
model_2 = My_Model_2(10,10)
# How do I print the weights now?
# model_2.layer.0.weight doesn't work.

Answer:

As recommended on the PyTorch forums, index into the Sequential container with square brackets. `model_2.layer.0.weight` fails because Python attribute names cannot start with a digit, but nn.Sequential supports integer indexing:

model_2.layer[0].weight
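A minimal sketch illustrating the indexing above, plus `named_parameters()` as an alternative when you want to enumerate every parameter in the container by its qualified name (the model here is a standalone example, not the classes from the question):

```python
import torch
import torch.nn as nn

# A Linear layer wrapped in nn.Sequential, as in the question
model = nn.Sequential(nn.Linear(10, 10))

# Integer indexing returns the wrapped submodule; .weight is its Parameter
print(model[0].weight.shape)  # torch.Size([10, 10])

# Alternatively, iterate over all parameters with their qualified names;
# submodules inside Sequential are named by their index ("0.weight", "0.bias")
for name, param in model.named_parameters():
    print(name, tuple(param.shape))
```

For deeper nesting the same pattern applies, e.g. `model.layer[0].weight` for a Sequential stored as an attribute, since each index step returns an ordinary nn.Module.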