Tag: machine-learning
Detects if APIs with nondeterministic algorithms are used.
Non-deterministic ops might return different outputs when run with the same inputs.
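A minimal PyTorch sketch of the recommended guard (the flag is real; the tensors are illustrative): `torch.use_deterministic_algorithms` makes ops without a deterministic implementation raise instead of silently varying.

```python
import torch

# Ask PyTorch for deterministic implementations; ops that have no
# deterministic variant raise a RuntimeError instead of silently
# returning different results across runs.
torch.use_deterministic_algorithms(True)

torch.manual_seed(0)
x = torch.randn(8, 8)
y = x @ x.t()  # deterministic here; an offending CUDA kernel would error
```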
Detects if a PyTorch variable is modified in place inside an assignment.
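As an illustration (hypothetical tensors), the antipattern is assigning the result of an in-place op, which hides that the right-hand side is mutated:

```python
import torch

x = torch.ones(3)

# Antipattern: add_ mutates x in place, yet the assignment suggests
# y is an independent result; y and x share the same storage.
y = x.add_(1)

# Clearer: the out-of-place op leaves x untouched.
y = x + 1
```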
Checks if gradient calculation is disabled during evaluation.
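A minimal sketch of the recommended pattern, using a throwaway linear model:

```python
import torch

model = torch.nn.Linear(4, 2)
inputs = torch.randn(8, 4)

# Disable autograd for evaluation: no graph is built, which saves
# memory and prevents accidental gradient updates.
with torch.no_grad():
    outputs = model(inputs)
```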
Checks if eval() is called before validating or testing a model.
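For example (toy model), eval() switches layers such as Dropout and BatchNorm to inference behavior; forgetting it skews validation metrics:

```python
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(4, 4),
    torch.nn.Dropout(0.5),
    torch.nn.Linear(4, 2),
)

model.eval()                 # Dropout now passes inputs through unchanged
with torch.no_grad():
    preds = model(torch.randn(8, 4))
```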
Best practices to improve the maintainability of notebooks.
Detects if Softmax is used with CrossEntropyLoss.
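A sketch of the fix (illustrative shapes): CrossEntropyLoss applies LogSoftmax internally, so the model should emit raw logits.

```python
import torch

logits = torch.randn(8, 5)            # raw model outputs
targets = torch.randint(0, 5, (8,))

loss_fn = torch.nn.CrossEntropyLoss()

# Antipattern: softmax before CrossEntropyLoss normalizes twice
# and distorts the gradients.
# loss = loss_fn(torch.softmax(logits, dim=1), targets)

# Correct: pass the logits directly.
loss = loss_fn(logits, targets)
```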
Detects if Softmax is explicitly computed.
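An illustration of why this matters: a hand-rolled softmax can overflow, while the built-in is numerically stabilized.

```python
import torch

x = torch.randn(8, 5)

# Antipattern: explicit softmax; exp() overflows for large inputs.
probs_manual = torch.exp(x) / torch.exp(x).sum(dim=1, keepdim=True)

# Preferred: the built-in, which subtracts the row max internally.
probs = torch.softmax(x, dim=1)
```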
Not setting seeds for the random number generators in PyTorch can lead to reproducibility issues.
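A sketch of a full seeding preamble (the seed value is arbitrary); seeding torch alone is not enough if numpy or random are also used:

```python
import random

import numpy as np
import torch

SEED = 42
random.seed(SEED)
np.random.seed(SEED)
torch.manual_seed(SEED)
torch.cuda.manual_seed_all(SEED)  # no-op on CPU-only machines
```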
Detects if the notebook uses an uninitialized variable given the cell execution order.
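A two-cell illustration (hypothetical variable; the failing call is left commented so the snippet runs):

```python
# Cell 2 — executed before Cell 3, so `total` does not exist yet:
# print(total)   # NameError: name 'total' is not defined

# Cell 3
total = sum(range(10))
print(total)
```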
Computing BCELoss on Sigmoid outputs can be replaced by a single BCEWithLogitsLoss, which is numerically more stable.
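A sketch of the replacement (illustrative shapes):

```python
import torch

logits = torch.randn(8, 1)
targets = torch.rand(8, 1).round()

# Less stable: Sigmoid followed by BCELoss.
probs = torch.sigmoid(logits)
loss_unstable = torch.nn.functional.binary_cross_entropy(probs, targets)

# More stable: one fused op using the log-sum-exp trick.
loss_stable = torch.nn.functional.binary_cross_entropy_with_logits(logits, targets)
```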
Using DataLoader with num_workers greater than 0 can cause increased memory consumption over time when iterating over native Python objects such as list or dict.
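A common mitigation (a sketch, assuming a map-style dataset) is to hold the samples in a numpy array or tensor instead of a Python list, so forked workers do not dirty copy-on-write pages through reference-count updates:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset

class ArrayDataset(Dataset):
    def __init__(self, n=10_000):
        # A numpy array, not a Python list: reading it from worker
        # processes does not touch per-object refcounts.
        self.data = np.random.rand(n, 4).astype(np.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return torch.from_numpy(self.data[idx])

if __name__ == "__main__":  # required on spawn-based platforms
    loader = DataLoader(ArrayDataset(), batch_size=32, num_workers=2)
    next(iter(loader))
```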
Checks if Softmax is used with the NLLLoss function.
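A sketch of the correct pairing: NLLLoss expects log-probabilities, so it should follow LogSoftmax rather than Softmax.

```python
import torch

logits = torch.randn(8, 5)
targets = torch.randint(0, 5, (8,))

# NLLLoss consumes log-probabilities; plain Softmax output here
# would yield wrong loss values.
log_probs = torch.nn.functional.log_softmax(logits, dim=1)
loss = torch.nn.functional.nll_loss(log_probs, targets)
```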
Zero out the gradients before doing a backward pass.
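A minimal training-loop sketch (toy model and data): PyTorch accumulates gradients across backward passes, so each step must clear them first.

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()

for _ in range(3):                    # stand-in training loop
    optimizer.zero_grad()             # clear gradients from the last step
    loss = loss_fn(model(torch.randn(8, 4)), torch.randn(8, 1))
    loss.backward()                   # gradients would accumulate otherwise
    optimizer.step()
```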
A variable is re-defined in multiple cells with different types.
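An illustration with a hypothetical variable name:

```python
# Cell 1
data = [0.1, 0.2, 0.3]      # data is a list here

# Cell 5, re-run later
data = "results.csv"        # data is silently rebound to a str

# Any earlier cell re-executed now sees a str where it expected a list.
print(type(data))
```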
Detects if nondeterministic TensorFlow APIs are used.
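The TensorFlow-side guard, as a sketch (enable_op_determinism requires a recent TensorFlow, roughly 2.9+):

```python
import tensorflow as tf

# Fix all of TensorFlow's seeds and opt in to deterministic kernels;
# nondeterministic ops then raise instead of varying between runs.
tf.keras.utils.set_random_seed(42)
tf.config.experimental.enable_op_determinism()
```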
Detects if a random seed is set before random number generation.
Creating PyTorch tensors on the CPU and then moving them to the device is inefficient.
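A sketch of both variants (CUDA assumed optional):

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Inefficient: allocate on the CPU, then copy to the device.
x = torch.ones(1024, 1024).to(device)

# Better: allocate directly on the target device.
x = torch.ones(1024, 1024, device=device)
```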