torch.randn: generates a tensor of random numbers drawn from the standard normal distribution (mean 0, standard deviation 1 — not values restricted to the range 0~1). Signature: torch.randn(size), where size can be one or more integers or a tuple. Example:

```python
import torch

a = torch.randn(3)     # 1-D tensor with 3 samples
b = torch.randn(3, 4)  # 2-D tensor of shape (3, 4)
print("a:", a)
print("b:", b)
```

The printed values are random and differ on every run.
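A small extra sketch (my addition, not in the original post): because the values are random, fixing the seed with torch.manual_seed is the usual way to make them reproducible.

```python
import torch

torch.manual_seed(0)   # fixing the seed makes the "random" draws reproducible
a = torch.randn(2, 3)
torch.manual_seed(0)   # reset the seed and draw again
b = torch.randn(2, 3)
print(torch.equal(a, b))  # True: same seed, same samples
```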
torch.reshape: changes the shape of a tensor. Signature: torch.reshape(tensor, shape). Example:

```python
import torch

a = torch.tensor([[[1, 2, 3], [4, 5, 6]],
                  [[7, 8, 9], [10, 11, 12]]])
print("a.shape:", a.shape)   # torch.Size([2, 2, 3])
b = torch.reshape(a, (4, 3, 1))
print("b:", b)
print("b.shape:", b.shape)   # torch.Size([4, 3, 1])
```
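One convenience worth noting (my addition, not from the original post): a single dimension in the target shape can be given as -1, and PyTorch infers it from the total number of elements.

```python
import torch

a = torch.arange(12)           # shape (12,)
b = torch.reshape(a, (4, -1))  # -1 is inferred as 3, since 12 / 4 = 3
print(b.shape)  # torch.Size([4, 3])
```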
torch.split: divides a tensor into pieces, either by a fixed piece size or along a chosen dimension. Signature: torch.split(tensor, split_size, dim); split_size can be an integer or a list, and dim defaults to 0 but can be changed. Example:

```python
import torch

a = torch.tensor([[[1, 2, 3], [4, 5, 6]],
                  [[7, 8, 9], [10, 11, 12]]])
print("a.shape:", a.shape)    # torch.Size([2, 2, 3])
b = torch.split(a, 1, dim=0)  # two pieces of size 1 along dim 0
print(b)
```
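Since the text mentions that split_size can also be a list, here is a short sketch of that case (my addition): the list entries give the size of each piece along the chosen dimension, and they must sum to that dimension's length.

```python
import torch

a = torch.arange(10)
parts = torch.split(a, [2, 3, 5])  # uneven pieces of sizes 2, 3, 5 along dim 0
print([p.shape[0] for p in parts])  # [2, 3, 5]
```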
torch.nn.GroupNorm: splits the channels into groups and normalizes within each group. Signature: torch.nn.GroupNorm(num_groups, num_channels), where num_groups is the number of groups and num_channels is the number of channels. Example:

```python
import torch
import torch.nn as nn

a = torch.randn(15, 256, 9, 15)
# split the 256 channels into 8 groups of 32 channels each
m = nn.GroupNorm(8, 256)
out = m(a)
print(out.shape)  # torch.Size([15, 256, 9, 15]) — the shape is unchanged
```
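To make "normalizes within each group" concrete, here is a small check (my addition, with affine=False so no learned scale/shift is applied): after GroupNorm, each group of each sample has mean approximately 0.

```python
import torch
import torch.nn as nn

x = torch.randn(2, 6, 4)
m = nn.GroupNorm(2, 6, affine=False)  # 2 groups of 3 channels each
y = m(x)
g = y[0, :3]  # the first group (channels 0-2) of the first sample
print(round(g.mean().item(), 4))  # ~0.0: the group is normalized
```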
torch.exp(input): computes the element-wise exponential. Example:

```python
import torch
import math

a = torch.tensor([0, math.log(2)])
print(torch.exp(a))
```

Output:

```
tensor([1., 2.])
```
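A quick companion sketch (my addition): torch.log is the element-wise inverse of torch.exp, which is handy for sanity-checking log-space computations.

```python
import torch

x = torch.tensor([0.5, 1.0, 2.0])
y = torch.log(torch.exp(x))  # log undoes exp element-wise
print(torch.allclose(x, y))  # True
```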
torch.nn.ReLU: applies the ReLU activation function. The code example below is from the official PyTorch documentation and doubles as a review of unsqueeze and cat:

```python
import torch
import torch.nn as nn

m = nn.ReLU()
input = torch.randn(2).unsqueeze(0)  # shape (1, 2)
print("input:", input)
print("input.shape:", input.shape)
output = torch.cat((m(input), m(-input)))  # the CReLU-style usage from the docs
print("output:", output)
```
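As an extra note (my addition): the same activation is available as torch.nn.functional.relu, which gives identical results without constructing a module; the module form is convenient inside nn.Sequential.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])
m = nn.ReLU()
print(m(x))       # tensor([0., 0., 2.]) — negatives clamped to 0
print(F.relu(x))  # same result via the functional interface
```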
torch.transpose: swaps two dimensions, often used for matrix transposition. Signature: torch.transpose(tensor, dim0, dim1). Example:

```python
import torch

a = torch.tensor([[[1, 2, 3], [4, 5, 6]],
                  [[7, 8, 9], [10, 11, 12]]])
b = torch.transpose(a, 1, 2)
print("tensor_a:", a)
print("tensor_b:", b)
print("b.shape:", b.shape)  # torch.Size([2, 3, 2])
```
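One caveat worth adding (my addition, not in the original post): transpose returns a view over the same storage rather than a copy, so the result is generally non-contiguous; call .contiguous() when a contiguous layout is required.

```python
import torch

a = torch.tensor([[1, 2, 3], [4, 5, 6]])
b = torch.transpose(a, 0, 1)
print(b.is_contiguous())  # False: transpose returns a non-contiguous view
c = b.contiguous()        # copy into contiguous memory when needed
print(c.is_contiguous())  # True
```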
torch.from_numpy(): converts a NumPy array into a Tensor. Example:

```python
import numpy as np
import torch

a = np.array([1, 2, 3, 4])
print(a)                    # [1 2 3 4]
print(torch.from_numpy(a))  # tensor([1, 2, 3, 4], dtype=torch.int32)
```
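A detail worth knowing (my addition): the returned tensor shares memory with the source array, so modifying one is visible in the other.

```python
import numpy as np
import torch

a = np.array([1, 2, 3, 4])
t = torch.from_numpy(a)
a[0] = 100            # modifying the array also changes the tensor
print(t[0].item())    # 100: the tensor shares the array's memory
```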
torch.index_select: picks out the desired slices of a tensor by index, useful for long tensors. Signature: torch.index_select(tensor, dim, index). Example:

```python
import torch

# shape (2, 2, 3)
a = torch.tensor([[[1, 2, 3], [4, 5, 6]],
                  [[7, 8, 9], [10, 11, 12]]])
# select index 0 and index 2 along the last dimension
indices = torch.tensor([0, 2])
b = torch.index_select(a, 2, indices)
print(b)
print(b.shape)  # torch.Size([2, 2, 2])
```
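The same call works along any dimension; here is a sketch selecting rows instead (my addition, not from the original post).

```python
import torch

a = torch.arange(12).reshape(4, 3)
idx = torch.tensor([0, 2])
rows = torch.index_select(a, 0, idx)  # pick rows 0 and 2
print(rows)  # tensor([[0, 1, 2], [6, 7, 8]])
```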
torch.chunk: splits a tensor into a number of chunks — in short, a way to slice it up, along any chosen dimension. Signature: torch.chunk(tensor, chunks, dim). Example:

```python
import torch

a = torch.tensor([[[1, 2], [3, 4]],
                  [[5, 6], [7, 8]]])
b = torch.chunk(a, 2, 1)  # split into 2 chunks along dim 1
print(a)
print(b)  # a tuple of two tensors, each of shape (2, 1, 2)
```
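One behavior to be aware of (my addition): when the dimension is not evenly divisible, chunk makes each piece as large as possible and lets the last one be smaller.

```python
import torch

a = torch.arange(7)
parts = torch.chunk(a, 3)  # 7 elements into 3 chunks: sizes 3, 3, 1
print([len(p) for p in parts])  # [3, 3, 1]
```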