Multi-class semantic segmentation loss for ENet and RefineNet

1. In PyTorch, the semantic segmentation loss can be computed in two ways, and the results are identical.

A. Using CrossEntropyLoss():

    # Initialize ENet
    model = ENet(num_classes).to(device)
    # Check that the network architecture is correct
    print(model)
    # We use CrossEntropyLoss because it is the loss most frequently used
    # in classification problems with multiple classes, which fits
    # segmentation. This criterion combines LogSoftmax and NLLLoss, so it
    # can also be replaced by that combination (see section B below).
    criterion = nn.CrossEntropyLoss(weight=class_weights)

    # Forward propagation
    outputs = self.model(inputs)
    # Loss computation
    loss = self.criterion(outputs, labels)
    # Backpropagation
    self.optim.zero_grad()
    loss.backward()
    self.optim.step()
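As a self-contained sketch of approach A (the tensor shapes here are illustrative assumptions, not taken from the original training loop), `CrossEntropyLoss` accepts raw logits of shape `(N, C, H, W)` and integer label maps of shape `(N, H, W)` directly, with no softmax applied beforehand:

```python
import torch
import torch.nn as nn

# Illustrative shapes (assumed, not from the original code):
# N=2 images, C=3 classes, 4x4 spatial resolution.
N, C, H, W = 2, 3, 4, 4
torch.manual_seed(0)
logits = torch.randn(N, C, H, W)          # raw, unnormalized network outputs
labels = torch.randint(0, C, (N, H, W))   # per-pixel ground-truth class indices

criterion = nn.CrossEntropyLoss()
loss = criterion(logits, labels)          # scalar, averaged over all pixels
print(loss.item())
```

Note that the labels are class indices, not one-hot maps; `CrossEntropyLoss` applies LogSoftmax over the class dimension internally.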

B. Using LogSoftmax together with NLLLoss to compute the loss:

        ## Criterion ##
        # NLLLoss2d is deprecated; use NLLLoss(), which handles 4D inputs.
        segm_crit = nn.NLLLoss(ignore_index=args.ignore_label).cuda()
        # Compute output
        output = segmenter(input_var)
        # Upsample the prediction to the label resolution before the loss
        output = nn.functional.interpolate(output, size=target_var.size()[1:], mode='bilinear', align_corners=False)
        soft_output = nn.LogSoftmax(dim=1)(output)
        # Compute loss and backpropagate
        loss = segm_crit(soft_output, target_var)
        optim_enc.zero_grad()
        optim_dec.zero_grad()
        loss.backward()
        optim_enc.step()
        optim_dec.step()
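The claim at the top, that the two approaches give the same result, can be checked numerically. This sketch (shapes are assumptions for illustration) compares `CrossEntropyLoss` on raw logits against `LogSoftmax` over the class dimension followed by `NLLLoss`:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(2, 3, 4, 4)            # (N, C, H, W) raw logits
labels = torch.randint(0, 3, (2, 4, 4))     # (N, H, W) class indices

# Way A: CrossEntropyLoss on raw logits
loss_a = nn.CrossEntropyLoss()(logits, labels)
# Way B: LogSoftmax over the class dimension (dim=1), then NLLLoss
loss_b = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), labels)

print(torch.allclose(loss_a, loss_b))  # the two losses agree
```

Passing `dim=1` to `LogSoftmax` matters: the class dimension of a segmentation output is dimension 1, and omitting it raises a deprecation warning and may pick the wrong axis.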
