0:00 PyTorch: an open-source deep learning
0:02 framework used to build some of the
0:04 world's most famous artificial
0:05 intelligence products. It was created at
0:07 the Meta AI research lab in 2016, but is
0:10 actually derived from the Lua-based
0:12 Torch library that dates back to 2002.
0:14 Fundamentally, it's a library for
0:16 programming with tensors, which are
0:18 basically just multi-dimensional arrays
0:19 that represent data and parameters in
0:22 deep neural networks. Sounds complicated,
0:23 but its focus on usability will have
0:25 you training machine learning models
0:27 with just a few lines of Python. In
0:28 addition, it facilitates high-performance
0:30 parallel computing on a GPU, thanks to
0:33 NVIDIA's CUDA platform. Developers love
0:35 prototyping with it because it supports
0:37 a dynamic computational graph, allowing
0:39 models to be optimized at runtime. It
0:41 does this by constructing a directed
0:43 acyclic graph consisting of functions
0:45 that keeps track of all the executed
0:47 operations on the tensors, allowing you
0:49 to change the shape, size, and operations
0:50 after every iteration if needed. PyTorch
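That dynamic graph is what powers PyTorch's autograd. As a minimal sketch (not from the video) of how recorded operations get walked backwards, assuming a current PyTorch install:

```python
import torch

# Tensors created with requires_grad=True are tracked in the
# dynamic graph; every operation on them is recorded as a node.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x  # the graph is built as this line executes

# Walking the graph backwards computes gradients: dy/dx = 2x + 2
y.backward()
print(x.grad)  # tensor(8.)
```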
0:53 has been used to train models for
0:54 computer vision AI like Tesla Autopilot,
0:57 image generators like Stable Diffusion,
0:59 and speech recognition models like OpenAI's
1:01 Whisper, just to name a few. To get
1:03 started, install PyTorch and, optionally,
1:05 CUDA if you want to accelerate computing
1:07 on your GPU. Now import it into a Python
1:09 file or notebook. Like I mentioned, a
1:11 tensor is similar to a multi-dimensional
1:13 array. Create a 2D array, or matrix, with
1:15 Python, then use torch to convert it into
1:17 a tensor. Now we can run all kinds of
1:19 computations on it. For example, we might convert
1:21 all these integers into random floating
1:23 points, or perform linear algebra by
1:25 taking multiple tensors and
1:27 multiplying them together. What you came
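In code, those tensor basics might look something like this (a sketch; the exact matrix values here are placeholders, not the ones on screen):

```python
import torch

# Create a 2D array (matrix) with plain Python, then convert it
matrix = [[1, 2], [3, 4]]
tensor = torch.tensor(matrix)

# Convert the integers into random floating points of the same
# shape (rand_like expects a floating-point tensor, so cast first)
rand = torch.rand_like(tensor.float())

# Linear algebra: multiply two tensors together (matrix product)
other = torch.tensor([[5, 6], [7, 8]])
product = tensor @ other  # same as torch.matmul(tensor, other)
print(product)  # values: [[19, 22], [43, 50]]
```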
1:29 here to do, though, is build a deep neural
1:31 network, like an image classifier. To
1:33 handle that, we can define a new class
1:35 that inherits from the neural network
1:37 module class. Inside the constructor, we
1:39 can build it out layer by layer. The
1:41 Flatten layer will take a
1:42 multi-dimensional input, like an image,
1:44 and convert it to one dimension. From
1:46 there, Sequential is used to create a
1:48 container of layers that the data will
1:49 flow through. Each layer has multiple
1:51 nodes, where each node is like its own
1:53 mini statistical model: as each data
1:55 point flows through it, it'll try to
1:56 guess the output and gradually update a
1:58 mapping of weights to determine the
2:00 importance of a given variable. Linear is
2:02 a fully connected layer that takes the
2:04 flattened 28-by-28 image and transforms
2:06 it to an output of 512. This layer is
2:09 followed by a non-linear activation
2:11 function: when activated, it means that
2:13 feature might be important, and it outputs
2:15 the node's value; otherwise it just outputs zero.
2:16 And finally, we finish with a fully
2:18 connected layer that outputs the 10
2:19 labels the model is trying to predict.
2:21 With these pieces in place, the next
2:23 step is to define a forward method that
2:25 describes the flow of data. And now,
2:27 instantiate the model to a GPU and pass
2:29 it some input data; this will
2:31 automatically call its forward method
2:32 for training and prediction.
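Putting it all together, the classifier just described might be sketched like this (modeled on PyTorch's quickstart tutorial; the layer sizes are the ones mentioned, the rest is assumption):

```python
import torch
from torch import nn

# Inherit from the neural network module class, nn.Module
class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        # Flatten: turn a 28x28 image into a 784-element vector
        self.flatten = nn.Flatten()
        # Sequential: a container of layers the data flows through
        self.stack = nn.Sequential(
            nn.Linear(28 * 28, 512),  # fully connected, 784 -> 512
            nn.ReLU(),                # non-linear activation
            nn.Linear(512, 10),       # final layer: 10 output labels
        )

    def forward(self, x):
        # Describes the flow of data through the network
        x = self.flatten(x)
        return self.stack(x)

# Instantiate the model on a GPU if one is available
device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralNetwork().to(device)

# Passing input data calls forward() automatically
image = torch.rand(1, 28, 28, device=device)
logits = model(image)
print(logits.shape)  # torch.Size([1, 10])
```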
2:33 Congratulations, you just built a neural
2:36 network! This has been PyTorch in 100
2:38 seconds. Thanks for watching, and I will
2:40 see you in the next one.