PyTorch `backward` usage and the `gradient` parameter explained

Scalar-function backward

```python
import torch
from torch.autograd import Variable  # Variable is deprecated; modern tensors track gradients directly
import torch.nn as nn
import torch.nn.functional as F

# backpropagation
x = torch.ones(2, 2, requires_grad=True)
```
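The original snippet is cut off here. As a minimal sketch of the two cases the title refers to (the example values below are my own, not from the source): calling `backward()` on a scalar needs no argument, while calling it on a non-scalar tensor requires passing a `gradient` tensor of the same shape, which `autograd` uses to form a vector-Jacobian product.

```python
import torch

# Case 1: scalar output -- backward() takes no argument.
x = torch.ones(2, 2, requires_grad=True)
y = (x * 3).sum()            # scalar result
y.backward()
print(x.grad)                # each element's gradient is 3.0

# Case 2: vector output -- pass `gradient`, a tensor matching b's shape.
# autograd computes the vector-Jacobian product g^T @ J.
a = torch.ones(3, requires_grad=True)
b = a * 2                    # non-scalar result
g = torch.tensor([1.0, 0.1, 0.01])
b.backward(gradient=g)
print(a.grad)                # -> tensor([2.0000, 0.2000, 0.0200])
```

Passing `gradient` is equivalent to first reducing the output with `(b * g).sum().backward()`; for a scalar output, PyTorch implicitly uses `gradient=torch.tensor(1.0)`.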
Posted on 2021-06-27 12:05 by A2he