You can use the top-level torch.softmax() function from PyTorch to apply a softmax activation; note that it requires a dim argument specifying the dimension along which the values are normalized.
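A minimal sketch of the call (the example tensor and the choice of dim=-1 are illustrative):

```python
import torch

# A batch of raw scores (logits); shape (1, 3) here for illustration.
logits = torch.tensor([[1.0, 2.0, 3.0]])

# torch.softmax exponentiates and normalizes along the given dimension,
# so each slice along dim sums to 1. dim is required.
probs = torch.softmax(logits, dim=-1)

print(probs)            # values in (0, 1)
print(probs.sum(dim=-1))  # each row sums to 1
```

dim=-1 normalizes over the last dimension, which is the usual choice when the last axis holds per-class scores.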