
D-VELOP

Description

Introduction to PyTorch

The session covered the basics of deep learning and PyTorch, beginning with linear regression and an explanation of how to train it using forward and backward propagation (illustrated in the code sketch after this list):

  • Forward Propagation: The model processes the input data through its layers to generate predictions.
  • Backward Propagation: The gradient of the loss function with respect to each parameter is computed, and the network's parameters are then adjusted accordingly.
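
To make the two passes concrete, here is a minimal sketch of training a one-dimensional linear regression in PyTorch. The synthetic data, learning rate, and epoch count are illustrative choices, not values from the session.

```python
import torch

# Synthetic data for y = 2x + 1 with a little noise (illustrative values).
x = torch.linspace(0, 1, 100).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# Linear regression: a single fully connected layer, one input and one output.
model = torch.nn.Linear(in_features=1, out_features=1)
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.2)

for epoch in range(500):
    pred = model(x)           # forward propagation: inputs -> predictions
    loss = loss_fn(pred, y)   # measure how far predictions are from the targets

    optimizer.zero_grad()     # clear gradients from the previous step
    loss.backward()           # backward propagation: compute d(loss)/d(parameter)
    optimizer.step()          # adjust the parameters using those gradients

# The learned weight and bias should move toward roughly 2 and 1.
print(model.weight.item(), model.bias.item())
```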

Next, Multi-Layer Perceptrons (MLPs) were introduced, with a discussion of how activation functions add non-linearity to neural networks.
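
As an illustration, a small MLP could be defined as below; the layer sizes are arbitrary choices for the example, not a model from the session.

```python
import torch.nn as nn

# A two-hidden-layer MLP. Without the ReLU activations, the stacked linear
# layers would collapse into a single linear transformation, so the
# non-linearity is what gives the network its extra expressive power.
mlp = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
```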

The session then moved to PyTorch itself, covering basic tensor concepts, model creation, data loading, and model optimization. It concluded with a workflow example for training a model, sketched in code after the list:

  1. Retrieve a batch from the DataLoader.
  2. Obtain predictions from the model for the batch.
  3. Calculate the loss based on the difference between predictions and labels.
  4. Perform backpropagation to calculate gradients for each parameter with respect to the loss.
  5. Update the model parameters based on the gradients.
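
A minimal training loop following these five steps might look like the sketch below. The random dataset, model architecture, loss function, batch size, and learning rate are placeholder assumptions, not details from the session.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset: 1,000 random 20-dimensional inputs with binary labels.
features = torch.randn(1000, 20)
labels = torch.randint(0, 2, (1000,))
loader = DataLoader(TensorDataset(features, labels), batch_size=32, shuffle=True)

model = torch.nn.Sequential(
    torch.nn.Linear(20, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 2),
)
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(5):
    for batch_x, batch_y in loader:     # 1. retrieve a batch from the DataLoader
        preds = model(batch_x)          # 2. obtain predictions for the batch
        loss = loss_fn(preds, batch_y)  # 3. compute the loss against the labels
        optimizer.zero_grad()
        loss.backward()                 # 4. backpropagate: gradients for each parameter
        optimizer.step()                # 5. update the parameters using the gradients
```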

Video