Summary:
Entropy: a measure of uncertainty (surprise); the higher the entropy, the greater the uncertainty and the less information carried. $$\text{Entropy} = -\sum_i P(i)\log P(i)$$ Lottery example: `import torch; a = torch.full([4], 1/4.)` Read more
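The excerpt above is truncated at the start of the lottery example. A minimal sketch of what it appears to set up, assuming a uniform distribution over four outcomes and a base-2 logarithm (both assumptions, since the original code is cut off):

```python
import torch

# Uniform "lottery" distribution over 4 outcomes, each with probability 1/4
a = torch.full([4], 1 / 4.0)

# Entropy = -sum_i P(i) * log2(P(i)); a uniform distribution over
# 4 outcomes is maximally uncertain and yields 2 bits
entropy = -(a * a.log2()).sum()
print(entropy)  # tensor(2.)
```

A skewed distribution such as `torch.tensor([0.97, 0.01, 0.01, 0.01])` gives a much lower entropy, matching the claim that lower uncertainty means lower entropy.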
Summary:
import torch import torch.nn as nn import torch.nn.functional as F import torch.optim as optim from torch Read more