
PyTorch Packages

PyTorch is an optimized tensor library for deep learning on CPUs and GPUs. It provides a rich set of packages that are used to implement deep learning concepts, helping with tasks such as optimization, conversion, and loss calculation. Let's take a brief look at these packages.

S.No Name Description
1. torch The torch package defines data structures for multi-dimensional tensors and mathematical operations over them.
2. torch.Tensor A torch.Tensor is a multi-dimensional matrix containing elements of a single data type (illustrated in a sketch after this table).
3. Tensor Attributes
a) torch.dtype It is an object which represents the data type of a torch.Tensor.
b) torch.device It is an object that represents the device on which a torch.Tensor will be allocated.
c) torch.layout It is an object which represents the memory layout of a torch.Tensor.
4. Type Info The numerical properties of a torch.dtype can be accessed through either torch.finfo or torch.iinfo.
1) torch.finfo It is an object which represents the numerical properties of a floating-point torch.dtype.
2) torch.iinfo It is an object which represents the numerical properties of an integer torch.dtype.
5. torch.sparse Torch supports sparse tensors in COO (coordinate) format, which efficiently stores and processes tensors in which the majority of elements are zero (illustrated in a sketch after this table).
6. torch.cuda Torch supports CUDA tensor types, which implement the same functions as CPU tensors but utilize GPUs for computation.
7. torch.Storage A torch.Storage is a contiguous, one-dimensional array of a single data type.
8. torch.nn This package provides the classes and modules used to build and train neural networks (illustrated in a sketch after this table).
9. torch.nn.functional This package provides functional (stateless) counterparts of the torch.nn modules.
10. torch.optim This package is used to implement various optimization algorithms.
11. torch.autograd This package provides classes and functions implementing automatic differentiation of arbitrary scalar-valued functions.
12. torch.distributed This package supports three backends (Gloo, MPI, and NCCL), each with different capabilities.
13. torch.distributions This package allows us to construct stochastic computation graphs and stochastic gradient estimators for optimization.
14. torch.hub It is a pre-trained model repository which is designed to facilitate research reproducibility.
15. torch.multiprocessing It is a wrapper around the native multiprocessing module.
16. torch.utils.bottleneck It is a tool which can be used as an initial step for debugging bottlenecks in our program.
17. torch.utils.checkpoint It is used to checkpoint parts of a model, trading extra computation in the backward pass for reduced memory use.
18. torch.utils.cpp_extension It is used to build C++ and CUDA extensions for PyTorch.
19. torch.utils.data This package provides the Dataset and DataLoader utilities for loading and batching data (illustrated in a sketch after this table).
20. torch.utils.dlpack It is used to convert tensors to and from the DLPack format.
21. torch.onnx The ONNX exporter is a trace-based exporter: it operates by executing your model once and exporting the operators that were actually run during that execution (illustrated in a sketch after this table).
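
The first few rows of the table can be tied together with a short example. Below is a minimal sketch (the tensor shape and dtypes are chosen only for illustration) that creates a tensor, reads its dtype, device, and layout attributes, and queries torch.finfo / torch.iinfo for numerical limits.

```python
import torch

# Create a tensor and inspect its attributes: dtype, device, and layout.
x = torch.zeros(2, 3, dtype=torch.float32)   # shape and dtype are arbitrary choices
print(x.dtype)    # torch.float32
print(x.device)   # cpu (unless the tensor was created on a GPU)
print(x.layout)   # torch.strided, the default dense memory layout

# Query the numerical limits of a dtype with torch.finfo / torch.iinfo.
print(torch.finfo(torch.float32).eps)   # smallest step above 1.0 for float32
print(torch.iinfo(torch.int32).max)     # largest value an int32 can hold
```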
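
For torch.sparse and torch.cuda, the sketch below builds a small COO tensor from explicit indices and values (the numbers are made up) and moves the result to a GPU only if one is available.

```python
import torch

# A 3x3 sparse tensor in COO format: indices hold [rows; columns], values hold
# the non-zero entries; all other elements are implicitly zero.
indices = torch.tensor([[0, 1, 2],
                        [2, 0, 1]])
values = torch.tensor([3.0, 4.0, 5.0])
sparse = torch.sparse_coo_tensor(indices, values, size=(3, 3))
print(sparse.to_dense())

# Use the GPU only if CUDA is available; otherwise stay on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
dense = sparse.to_dense().to(device)
print(dense.device)
```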
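
The torch.nn, torch.nn.functional, torch.optim, and torch.autograd packages usually appear together in a training step. The sketch below is only illustrative: the TinyNet model, its layer sizes, the learning rate, and the random data are assumptions, not part of any particular workload.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A tiny model built from torch.nn modules (4 -> 8 -> 2, sizes chosen arbitrarily).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        x = F.relu(self.fc1(x))     # stateless op from torch.nn.functional
        return self.fc2(x)

model = TinyNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # torch.optim

inputs = torch.randn(16, 4)               # random stand-in data
targets = torch.randint(0, 2, (16,))

optimizer.zero_grad()
loss = F.cross_entropy(model(inputs), targets)
loss.backward()     # torch.autograd computes gradients of the scalar loss
optimizer.step()    # the optimizer updates parameters from those gradients
print(loss.item())
```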
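
For torch.utils.data, a map-style Dataset only needs __len__ and __getitem__, and a DataLoader then handles batching and shuffling on top of it. The ToyDataset class and its sizes below are hypothetical.

```python
import torch
from torch.utils.data import Dataset, DataLoader

# A minimal map-style dataset over random (feature, label) pairs.
class ToyDataset(Dataset):
    def __init__(self, n=100):
        self.x = torch.randn(n, 4)
        self.y = torch.randint(0, 2, (n,))

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

# The DataLoader batches and shuffles samples drawn from the dataset.
loader = DataLoader(ToyDataset(), batch_size=16, shuffle=True)
for features, labels in loader:
    print(features.shape, labels.shape)   # torch.Size([16, 4]) torch.Size([16])
    break
```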
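
Finally, because the ONNX exporter is trace-based, torch.onnx.export needs an example input so it can run the model once. In the sketch below, the model, the dummy input shape, and the output file name "tiny_model.onnx" are all illustrative.

```python
import torch
import torch.nn as nn

# The exporter traces the model by running it once on the example input,
# then writes the recorded operators to an ONNX file.
model = nn.Sequential(nn.Linear(4, 2), nn.ReLU())
dummy_input = torch.randn(1, 4)                            # example input used for tracing
torch.onnx.export(model, dummy_input, "tiny_model.onnx")   # output file name is illustrative
```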

Reference:

https://pytorch.org/docs/stable/index.html

