PyTorch Cross Entropy

Entropy is a measure of uncertainty, i.e. the expected surprise of a distribution: the higher the entropy, the more uncertain the outcome, and the less the distribution tells you about which result will occur.

$$\text{Entropy} = -\sum_i P(i)\log P(i)$$

Lottery example: a fair lottery with four equally likely outcomes.

```python
import torch

# Uniform distribution over 4 outcomes, each with probability 1/4
a = torch.full([4], 1/4.)
a  # tensor([0.2500, 0.2500, 0.2500, 0.2500])
```
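To complete the lottery example, here is a minimal sketch that computes the entropy of a few distributions in bits (base-2 log). The uniform distribution comes from the snippet above; the skewed probability vectors are illustrative values I am assuming for comparison, not from the original post.

```python
import torch

def entropy(p: torch.Tensor) -> torch.Tensor:
    # H(P) = -sum_i P(i) * log2(P(i)), measured in bits
    return -(p * torch.log2(p)).sum()

uniform = torch.full([4], 1/4.)                       # maximum uncertainty
skewed  = torch.tensor([0.1, 0.1, 0.1, 0.7])          # assumed illustrative values
extreme = torch.tensor([0.001, 0.001, 0.001, 0.997])  # assumed illustrative values

print(entropy(uniform))  # tensor(2.) -- log2(4) bits, the maximum for 4 outcomes
print(entropy(skewed))   # ~1.36 bits, less uncertain
print(entropy(extreme))  # ~0.034 bits, the outcome is almost certain
```

The uniform distribution attains the maximum entropy of $\log_2 4 = 2$ bits; as probability mass concentrates on one outcome, the entropy drops toward zero.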