0:01 If you don't have a technical background but you still want to learn the basics of artificial intelligence, stick around, because we're distilling Google's 4-hour AI course for beginners into just 10 minutes. I was initially very skeptical, because I thought the course would be too conceptual (we're all about practical tips on this channel) and, knowing Google, the course might just disappear after 1 hour. But I found the underlying concepts actually made me better at using tools like ChatGPT and Google Bard and cleared up a bunch of misconceptions I didn't know I had about AI, machine learning, and large language models. So, starting with the broadest
0:37 possible question: what is artificial intelligence? It turns out (and I'm so embarrassed to admit I didn't know this) AI is an entire field of study, like physics, and machine learning is a subfield of AI, much like how thermodynamics is a subfield of physics.
0:53 Going down another level, deep learning is a subset of machine learning, and deep learning models can be further broken down into something called discriminative models and generative models. Large language models (LLMs) also fall under deep learning, and right at the intersection of generative models and LLMs is the technology that powers the applications we're all familiar with: ChatGPT and Google Bard. Let me know in
1:17 the comments if this was news to you as well. Now that we have an understanding of the overall landscape and you see how the different disciplines sit in relation to each other, let's go over the key takeaways you should know for each level. In a nutshell, machine learning is
1:32 a program that uses input data to train a model; that trained model can then make predictions based on data it has never seen before. For example, if you train a model based on Nike sales data, you can then use that model to predict how well a new shoe from Adidas would sell, based on Adidas sales data. Two of the most
1:51 common types of machine learning models are supervised and unsupervised learning models. The key difference between the two is that supervised models use labeled data and unsupervised models use
2:04 unlabeled data. In this supervised example, we have historical data points that plot the total bill amount at a restaurant against the tip amount, and here the data is labeled: a blue dot means the order was picked up, and a yellow dot means the order was delivered. Using a supervised learning model, we can now predict how much tip we can expect for the next order, given the bill amount and whether it's picked up or delivered.
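To make that concrete, here's a minimal supervised-learning sketch in Python. The bills, tips, and the "model" itself (just an average tip rate per order type) are made up for illustration; the course doesn't prescribe an algorithm, and a real system would fit something like a regression model instead.

```python
# Toy supervised learning: predict the tip from the bill amount and
# whether the order was picked up or delivered. All numbers are
# hypothetical stand-ins for the course's restaurant example.

def train(examples):
    """examples: (bill, order_type, tip) rows with known tips."""
    totals = {}  # order_type -> (sum of tip/bill ratios, count)
    for bill, order_type, tip in examples:
        s, n = totals.get(order_type, (0.0, 0))
        totals[order_type] = (s + tip / bill, n + 1)
    # The learned "model" is just an average tip rate per order type.
    return {t: s / n for t, (s, n) in totals.items()}

def predict(model, bill, order_type):
    return bill * model[order_type]

training_data = [
    (20.0, "picked_up", 3.0),   # 15% tip
    (40.0, "picked_up", 6.0),   # 15% tip
    (20.0, "delivered", 4.0),   # 20% tip
    (50.0, "delivered", 10.0),  # 20% tip
]
model = train(training_data)
print(round(predict(model, 30.0, "delivered"), 2))  # prints 6.0
```

The point of the sketch is the shape of the workflow: historical labeled data goes in, a trained model comes out, and the model then makes a prediction for a bill it has never seen.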
2:28 For unsupervised learning models, we look at the raw data and see if it naturally falls into groups. In this example, we
2:35 plotted the employee tenure at a company against their income. We see this group of employees has a relatively high income-to-years-worked ratio versus this group. Note that these are all unlabeled data points; if they were labeled, we would see male/female, years worked, company function, etc. We can now ask this unsupervised learning model to solve a problem like: if a new employee joins, are they on the fast track or not? If they appear on the left, then yes; if they appear on the right, then no.
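The grouping in this example is what a clustering algorithm does, and k-means is the classic choice. Below is a minimal pure-Python sketch with made-up (tenure, income) numbers and hand-picked starting centers; real implementations (e.g. scikit-learn's KMeans) handle initialization and edge cases properly.

```python
# Toy unsupervised learning: group employees by (years worked, income)
# with k-means (k=2). No labels are used anywhere.

def kmeans(points, centers, steps=10):
    for _ in range(steps):
        clusters = [[] for _ in centers]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            i = min(range(len(centers)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[i].append(p)
        # Move each center to the mean of its assigned points.
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) for cl in clusters]
    return centers

# (years worked, income in $k): two loose groups, nothing labeled.
employees = [(1, 90), (2, 100), (3, 110),   # high income-to-years ratio
             (8, 60), (9, 65), (10, 70)]    # lower ratio
centers = kmeans(employees, centers=[(1, 90), (10, 70)])
print(centers)  # prints [(2.0, 100.0), (9.0, 65.0)]
```

A new employee can then be assigned to whichever group center they land nearest, which is exactly the "fast track or not" question.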
3:05 Pro tip: another big difference between the two models is that after a supervised learning model makes a prediction, it will compare that prediction to the training data used to train that model, and if there's a difference, it tries to close that gap. Unsupervised learning models do not do this.
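That gap-closing step is the heart of supervised training: measure the error between predictions and the labeled answers, then nudge the model's parameters to shrink it. Here's a toy gradient-descent sketch; the one-parameter model and the data are invented for illustration.

```python
# "Close the gap" in miniature: fit y = w * x to labeled data by
# repeatedly comparing predictions to the known answers and adjusting w.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, true y), here y = 2x
w = 0.0      # the model's single parameter
lr = 0.05    # learning rate: how big each nudge is

for _ in range(200):
    # Gradient of the mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # close the gap a little

print(round(w, 3))  # prints 2.0
```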
3:22 By the way, this video is not sponsored, but it is supported by those of you who subscribe to my paid productivity newsletter on Google tips; link in the description if you want to learn more. Now that we have a basic grasp of machine learning, it's a good time to talk about deep learning, which is just a type of machine learning that uses something called artificial
3:40 neural networks. Don't worry: all you have to know for now is that artificial neural networks are inspired by the human brain and look something like this: layers of nodes, or neurons, and the more layers there are, the more powerful the model. And because we have these neural networks, we can now do something called semi-supervised learning, whereby a deep learning model is trained on a small amount of labeled data and a large amount of unlabeled data. For example, a
4:06 bank might use deep learning models to detect fraud. The bank spends a bit of time to tag, or label, 5% of transactions as either fraudulent or not fraudulent, and leaves the remaining 95% of transactions unlabeled, because it doesn't have the time or resources to label every transaction. The magic happens when the deep learning model uses the 5% of labeled data to learn the basic concepts of the task: okay, these transactions are good and these are bad.
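This recipe (learn from the labeled slice, pseudo-label the rest, then train on everything) is often called self-training or pseudo-labeling. A toy sketch, with a single made-up "amount" feature and a nearest-neighbor rule standing in for the deep learning model:

```python
# Semi-supervised sketch: a few labeled transactions, many unlabeled.

labeled = [(100, "ok"), (120, "ok"), (5000, "fraud")]   # the ~5%
unlabeled = [90, 110, 130, 4800, 5200]                  # the ~95%

def nearest_label(x, examples):
    # Stand-in "model": the label of the closest known example.
    return min(examples, key=lambda e: abs(e[0] - x))[1]

# Step 1: use the labeled data to pseudo-label the unlabeled data.
pseudo = [(x, nearest_label(x, labeled)) for x in unlabeled]

# Step 2: the aggregate data set backs predictions for future transactions.
all_data = labeled + pseudo
prediction = nearest_label(3000, all_data)
print(prediction)  # prints fraud
```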
4:32 Okay, apply those learnings to the remaining 95% of unlabeled data, and using this new aggregate data set, the model makes predictions for future transactions. That's pretty cool, and we're not done, because deep learning can be divided into two types: discriminative and generative models. Discriminative
4:51 models learn from the relationship between the labels of data points and only have the ability to classify those data points: fraud, not fraud. For example, you have a bunch of pictures, or data points, and you purposefully label some of them as cats and some of them as dogs. A discriminative model will learn from the label, cat or dog, and if you submit a picture of a dog, it will predict the label for that new data point: a dog.
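A discriminative model really is that narrow: it learns a boundary between the labeled classes and can only emit a label. Here's a tiny perceptron sketch; the binary features (does it bark? is it heavy?) and the examples are invented:

```python
# Discriminative classification in miniature: learn cat vs. dog from
# labeled examples with (barks, heavy) features; output only a label.

data = [((0, 0), -1), ((0, 1), -1),   # cats -> -1 (one of them is heavy)
        ((1, 1), +1), ((1, 0), +1)]   # dogs -> +1

w, b = [0.0, 0.0], 0.0
for _ in range(20):                   # perceptron training loop
    for (x1, x2), y in data:
        if y * (w[0] * x1 + w[1] * x2 + b) <= 0:    # misclassified?
            w[0] += y * x1; w[1] += y * x2; b += y  # nudge the boundary

def classify(x1, x2):
    return "dog" if w[0] * x1 + w[1] * x2 + b > 0 else "cat"

print(classify(1, 1))  # a new barking, heavy animal: prints dog
```

Notice that it learns to lean on the "barks" feature and ignore weight, since weight doesn't separate the labels here.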
5:18 We finally get to generative AI. Unlike discriminative models, generative models learn about the patterns in the training data; then, after they receive some input (for example, a text prompt from us), they generate something new based on the patterns they just learned. Going back to the animal example, the pictures, or data points, are not labeled as cat or dog, so a generative model will look for patterns: oh, these data points all have two ears, four legs, and a tail, like dog food, and bark. When asked to generate something called a "dog," the generative model generates a completely new image based on the patterns it just learned. There's
5:55 a super simple way to determine if something is generative AI or not: if the output is a number, a classification (spam / not spam), or a probability, it is not generative AI. It is gen AI when the output is natural language (text or speech), an image, or audio. Basically, generative AI generates new samples that are similar to the data it was trained on. Moving on to different generative AI
6:22 model types: most of us are familiar with text-to-text models like ChatGPT and Google Bard. Other common model types include text-to-image models like Midjourney, DALL-E, and Stable Diffusion; these can not only generate images but edit images as well. Text-to-video models, surprise surprise, can generate and edit video footage; examples include Google's Imagen Video, CogVideo, and the very creatively named Make-A-Video. Text-to-3D models are used to create game assets, and a little-known example would be OpenAI's Shap-E model. And finally, text-to-task models are trained to perform a specific task; for example, if you type "Gmail, summarize my unread emails," Google Bard will look through your inbox and summarize your unread emails. Moving over
7:08 to large language models, don't forget that LLMs are also a subset of deep learning, and although there is some overlap, LLMs and gen AI are not the same thing. An important distinction is that large language models are generally pre-trained with a very large set of data and then fine-tuned for specific purposes. What does that mean? Imagine you
7:30 have a pet dog: it can be pre-trained with basic commands like sit, come, down, and stay. It's a good boy and a generalist, but if that same good boy goes on to become a police dog, a guide dog, or a hunting dog, they need to receive specific training so they're fine-tuned for that specialist role. A similar idea
7:51 applies to large language models: they're first pre-trained to solve common language problems like text classification, question answering, document summarization, and text generation. Then, using smaller industry-specific data sets, these LLMs are fine-tuned to solve specific problems in retail, finance, healthcare, entertainment, and other fields. In the real world, this
8:16 might mean a hospital uses a pre-trained large language model from one of the big tech companies and fine-tunes that model with its own first-party medical data to improve diagnostic accuracy from X-rays and other medical tests. This is a win-win scenario, because large companies can spend billions developing general-purpose large language models, then sell those LLMs to smaller institutions (retail companies, banks, hospitals) that don't have the resources to develop their own large language models but do have the domain-specific data sets to fine-tune those models.
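Numerically, pre-training versus fine-tuning can be sketched like this; the important part is that fine-tuning continues from the pre-trained weights instead of starting from scratch. The one-parameter "model", the data sets, and the rates below are invented miniatures, nothing like real LLM training:

```python
# Pre-train on a big general data set, then fine-tune the same weights
# on a small domain-specific data set.

def train(w, data, lr=0.05, steps=300):
    for _ in range(steps):
        # Gradient of mean squared error for the model y = w * x.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

general = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # "large" generic set, roughly y = 2x
domain = [(2.0, 5.0)]                           # small specialist set, roughly y = 2.5x

w_pretrained = train(0.0, general)          # the expensive general step
w_finetuned = train(w_pretrained, domain)   # the cheap specialist step

print(round(w_pretrained, 2), round(w_finetuned, 2))  # prints 2.0 2.5
```

The hospital in the example is doing the second call: it starts from weights someone else paid to learn and only trains on its own small data set.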
8:50 Pro tip: if you do end up taking the full course (I'll link it down below; it's completely free), when you're taking notes you can right-click on the video player and copy the video URL at the current time, so you can quickly navigate back to that specific part of the video. There are five modules total, and you get a badge after completing each module. The content overall is a bit more on the theoretical side, so you definitely want to check out this video on how to master prompting next. See you in the next video!