0:01 hey this is Andrew Brown and I'm
0:03 bringing you another certification
0:05 course and this time it's the Azure AI
0:08 fundamentals also known as the AI 900
0:10 and if you're looking to pass a
0:12 certification we have everything that
0:14 you need here such as Labs lectures and
0:16 a free practice exam so you can go ace
0:19 that exam get that certification put it
0:21 on your resume and LinkedIn to go get that
0:24 job you've been looking for um if you
0:26 want to support more free courses like
0:27 this one the best way is to purchase the
0:29 additional paid materials where you
0:32 can get access to more practice exams
0:35 and other resources if you don't know me
0:37 I've taught a bit of everything
0:39 here on the cloud that's been with AWS
0:42 Azure GCP DevOps Terraform Kubernetes
0:44 you name it I've taught it but you
0:47 know you know the drill here let's get
0:49 into it and learn more about the Azure AI
0:55 fundamentals hey this is Andrew Brown
0:57 from exam Pro and we are at the start of
0:59 our journey here learning about the AI
1:01 900 asking the most important question
1:04 which is what is the AI 900 so the Azure
1:06 AI fundamental certification is for
1:08 those seeking an ml role such as AI
1:10 engineer or data scientist the
1:12 certification will demonstrate if a
1:14 person can Define and understand Azure
1:16 AI services such as Cognitive Services
1:19 and Azure applied AI Services AI
1:22 Concepts knowledge mining responsible AI
1:24 basics of ML pipelines classical ML
1:27 models AutoML generative AI workloads
1:29 which is newly added content and Azure
1:31 AI studio so you don't need to know
1:33 super complicated ml knowledge here but
1:34 it definitely helps to get you through
1:36 there so this certification is generally
1:39 referred to by its course code the AI
1:41 900 and it's the natural path for the
1:43 Azure AI engineer or Azure data
1:45 scientist certification this generally
1:47 is an easy course to pass and it's great
1:49 for those new to cloud or ml related
1:51 technology looking at our road map you
1:54 might be asking okay well what are the
1:55 paths and what should I learn first so
1:58 here are a few suggested routes if you
2:00 already have your AZ-900 that's a
2:01 great starting point before you take
2:03 your AI 900 if you don't have your
2:06 AZ-900 you can jump right into the AI 900
2:08 but I strongly recommend you go get that
2:10 AZ-900 because it gives you general
2:12 foundational knowledge it's just another
2:13 thing that you should not have to worry
2:15 about which is just how to use Azure at
2:17 a fundamental level do you need the
2:21 DP-900 to take the AI 900 no but a lot of
2:22 people seem to like to go this route
2:23 where they want to have that data
2:25 Foundation before they move on to the AI
2:28 900 because they know that the broad
2:30 knowledge is going to be useful so
2:31 it's a pairing that you see a lot of
2:34 people getting the AI 900 and the DP-900
2:37 together for the AI 900 the path is a
2:38 little bit more clear it's either going
2:41 to be data scientists or AI engineer so
2:43 for the AI engineer you have to know how
2:45 to use the AI services in and out for
2:47 data scientists it's more focused on
2:49 setting up actual pipelines and things
2:50 like that within Azure machine learning
2:52 so you just have to decide which path is
2:55 for you the data scientist is definitely
2:56 harder than the AI engineer so if you
2:58 aren't ready for the data scientist some
3:00 people like taking the AI engineer first
3:02 and then doing the data scientist so
3:05 this is kind of like a warmup again it's
3:07 not 100% necessary but it's just based
3:09 on your personal learning style and a
3:10 lot of times people like to take the
3:12 data engineer after the data scientist
3:13 just to round out their complete
3:16 knowledge now if you already have the
3:18 AZ-900 and the Administrator Associate you
3:20 can safely go to the data scientist if
3:22 you want to risk it because this one is
3:24 really hard so if you've passed the
3:26 AZ-104 you're going to probably
3:27 have a lot more confidence learning
3:29 about all the concepts at this level
3:31 here but of course it's always
3:32 recommended to go grab these
3:34 foundational certs because sometimes
3:36 course materials just do not cover the
3:38 information and so the obvious stuff is
3:40 going to get left out okay so moving
3:42 forward here how long should you study
3:44 to pass the AI 900 if you're
3:47 entirely new to ml Ai and Cloud
3:48 providers such as Azure you should
3:50 anticipate dedicating around 15 hours to
3:53 grasp the basics this estimate can vary
3:54 based on your familiarity with these
3:57 concepts for complete beginners the time
3:58 commitment might extend to 20 to 30
4:01 hours for the intermediate level so
4:03 people that have passed the AZ-900 or
4:06 DP-900 you're looking at around 8 to 10
4:08 hours if you have one or more years of
4:10 experience with Azure or another cloud
4:13 service provider like AWS or GCP you're
4:16 looking at about 5 hours or less the
4:18 average study time is about 8 hours
4:19 this is where you should be committing
4:21 50% of the time to the lectures and labs
4:24 and 50% for the practice exams the
4:26 recommended study time is 30 minutes to
4:28 an hour a day for 14 days this should
4:29 get you through it but just don't
4:31 overstudy and just don't spend too
4:33 little time what does it take to pass
4:35 the exam well you got to watch the
4:38 lectures and memorize key information do
4:40 Hands-On labs and follow along with your
4:42 own Azure account I'd say that you could
4:43 probably get away with just watching all
4:45 the videos in this one without having to
4:47 do the labs but again it really does
4:49 reinforce that information if you do
4:51 take the time there is some stuff that
4:53 is in Azure AI Studio or machine
4:55 learning you might be wary of launching
4:57 instances because we do have to run
4:59 instances and they will cost money
5:00 unless you delete the instances after
5:03 use resulting in very small costs so if
5:04 you feel that you're not comfortable
5:06 with that by just watching the videos
5:08 you should be okay but when you get into
5:10 the associate tier you absolutely have
5:11 to expect to pay something to learn and
5:14 take that risk you want to do paid
5:15 online practice exams that simulate the
5:18 real exam as I've mentioned before I do
5:20 provide a free practice exam and have
5:22 paid practice exams that accompany this
5:24 course that are on my platform exam Pro
5:25 and that's how you can help support more
5:28 of these free courses so can you pass
5:30 this certification without taking a
5:32 practice exam well Azure is a little
5:34 bit harder if this were an AWS exam I
5:36 would say yes but for Azure exams like
5:41 AI 900 dp900 and sc900 probably not it's
5:43 kind of risky I think you should do at
5:45 least one practice exam or go through
5:46 the sample one there's a sample one
5:48 probably lying around on the Azure
5:50 website let's take a look at the exam
5:52 guide breakdown here and then in the
5:53 following video we'll look at in more
5:55 detail so it's broken down into the
5:58 following domain so the exam has five
6:00 domains of questions and each domain has
6:01 its own weighting which determines how
6:03 many questions from that domain will
6:06 show up so 15 to 20% will be describe
6:08 AI workloads and considerations 20 to
6:11 25% will consist of describe fundamental
6:13 principles of machine learning on Azure
6:15 15 to 20% will consist of describe
6:17 features of computer vision workloads on
6:20 Azure 15 to 20% will be describe
6:22 features of natural language processing
6:25 workloads on Azure and 15 to 20% will be
6:26 describe features of generative AI
6:29 workloads on Azure I want you to notice
6:31 it says describe these domains this is
6:32 good because that tells you it's not
6:34 going to be super hard if you start
6:36 seeing things that say Beyond describe
6:37 and identify then you know it's going to
6:40 be a bit harder so where do you take
6:42 this exam well you can take it in person
6:44 at a test center or online from the
6:46 convenience of your own home so there's
6:48 two popular test centers there's
6:50 Certiport and there's Pearson VUE you can
6:52 also take it at a local test center if
6:54 there are nearby locations the term
6:56 Proctor means a supervisor or person
6:57 that is monitoring you while you're
6:59 taking the exam if I had the option
7:02 between in person or online I would
7:03 always choose the in person because it's
7:05 a controlled environment and it's way
7:07 less stressful online there are many
7:08 things that can go wrong but it's up to
7:10 your personal preference and your
7:13 situation the passing grade here is 700
7:16 out of 1,000 so that's around 70% I
7:17 would say around because you could
7:19 possibly fail with 70% because these
7:21 things work on scaled scoring for
7:24 response types there's about 37 to 47
7:25 questions and you can afford to get
7:28 about 10 to 13 questions wrong so some
7:30 questions are worth more than one point
7:33 some questions cannot be skipped and the
7:35 format of questions can be multiple
7:37 choice multiple answer drag and drop and
7:39 hot area there shouldn't be any case
7:41 studies for foundational level exams and
7:44 there's no penalty for wrong answers
7:46 so for the duration you get 1 hour that
7:48 means about 1 minute per question the
7:51 time for this exam is 60 minutes your seat
7:53 time is 90 minutes seat time refers to the
7:55 amount of time that you should
7:57 allocate for that exam so this includes
7:59 time to review the instructions read and
8:01 accept the NDA complete the exam and
8:03 provide feedback at the end this
8:04 certification is going to be valid
8:07 forever and it does not expire Microsoft
8:08 fundamental certifications such as the
8:12 AZ-900 or MS-900 do not expire as long as
8:13 the technology is still available or
8:16 relevant so we'll proceed to the full
8:17 exam guide now
8:23 hey this is Andrew Brown from exam
8:25 Pro and what we've pulled up here is the
8:27 official exam outline on the Microsoft
8:29 website if you want to find this yourself
8:31 you just have to type in AI 900 Azure or
8:34 Microsoft you should be able to easily
8:36 find it the page looks like this what I
8:38 want you to do is scroll on down because
8:40 we're looking for the AI 900 study
8:41 guide and from there we're going to
8:43 scroll on down to the skills measured
8:45 section and you might want to bump up
8:47 the text Azure loves updating their
8:49 courses with minor updates that don't
8:50 generally affect the outcome of the
8:52 study here but it does get a lot of
8:54 people worried because they always say
8:56 well is your course out of date so no
8:58 they're just making minor changes
8:59 because they'll do this like five times
9:01 a year and so if there was a major
9:03 revision what would happen is they would
9:05 change it so instead of being the AI 900
9:08 it would be like the AI 9001 or 9002
9:11 similar to how the AI 102 was previously
9:14 AI 100 but now it's the AI 102 so just
9:16 watch out for those and if it's a major
9:17 revision then yes it would probably need
9:20 a completely new course so there aren't
9:22 any major changes with the new update
9:23 other than the update for the generative
9:25 AI workloads on Azure section A couple
9:27 of name changes and a few things being
9:29 removed everything else remains
9:31 relatively the same with very minor
9:33 changes so the concepts and such are
9:35 still up to date overall I think the
9:37 exam is easier than before so let's go
9:39 through some of the topics and work our
9:41 way through here so describe Ai
9:43 workloads and considerations so here
9:44 we're just kind of describing the
9:47 generalities of AI so content moderation
9:49 workloads involve filtering out
9:51 inappropriate or harmful content from
9:53 user generated inputs ensuring a safe
9:55 and positive user experience
9:57 personalization workloads analyze user
9:58 behavior and preferences to tailor
10:01 content recommendations or experiences
10:03 to individual users computer vision
10:05 workloads involve the analysis of images
10:08 and videos to recognize patterns objects
10:11 faces and actions identify natural
10:13 language processing knowledge mining
10:15 document intelligence and features of
10:17 generative AI workloads note that these
10:19 are all just concepts you don't need to
10:21 know how to use the services just know
10:23 them at a high level then you have the responsible AI
10:25 section so Microsoft has these six
10:27 principles that they really want you to
10:28 know and they push it throughout all
10:30 their AI services so those are the six
10:31 you'll need to know and they're not that
10:32 hard to
10:35 learn moving on we have described
10:36 fundamental principles of machine
10:37 learning on
10:40 Azure so here it's just describing
10:42 regression classification clustering and
10:44 features of deep learning we have a lot
10:46 of practical experience with these in
10:48 the course so you will understand at the
10:50 end what these are used for next we have
10:52 core machine learning Concepts we can
10:54 identify features and labels in a data
10:56 set so that's the data labeling service
10:58 describe how training validation data
11:00 sets are used in machine learning so
11:02 we'll touch on that describe
11:04 capabilities of Automated machine
11:06 learning automl simplifies building and
11:08 picking the best models while data and
11:10 compute Services provide the power you
11:12 need for training with Azure machine
11:13 learning it helps with managing and
11:15 deploying your models letting you put
11:17 your machine learning projects into
11:19 action smoothly under computer vision
11:21 workloads we have image classification
11:23 object detection optical character
11:25 recognition facial detection and facial
11:27 analysis solutions
11:29 next we have Azure AI Vision
11:32 Azure AI face detection and Azure AI
11:34 video indexer the Azure AI Services
11:36 Encompass a wide range of tools designed
11:37 to facilitate the development of
11:40 intelligent applications these Services
11:42 used to be called computer vision custom
11:44 vision face service and form recognizer
11:46 but have evolved or been grouped under
11:48 broader service categories to streamline
11:51 their application and integration into
11:53 projects for NLP we have key phrase
11:55 extraction entity recognition sentiment
11:58 analysis language modeling speech
12:00 recognition and synthesis this one doesn't
12:01 really appear much it's kind of a
12:03 concept not so much something we have to
12:05 do and then there's
12:08 translation so now we have Azure tools
12:10 and services for NLP workloads these
12:12 include the Azure AI language service
12:14 Azure AI speech service and Azure AI
12:16 translator service these used to be
12:18 separate Services I believe like the
12:20 Text Analytics service LUIS Speech
12:22 service and Translator Text service but
12:24 they have been added to the Azure AI
12:26 umbrella of AI services and now we'll be
12:28 moving on to the generative AI workloads
12:31 on Azure we'll be covering features of
12:33 generative AI models common scenarios
12:35 for generative Ai and responsible AI
12:38 considerations for generative Ai and
12:39 also some of the cool features that
12:41 Azure OpenAI service has to offer such
12:44 as natural language generation code
12:45 generation and image
12:48 generation so that's about a general
12:51 breakdown of the AI 900 exam guide
12:57 hey this is Andrew Brown from exam
12:58 Pro and we are looking at the layers of
13:00 machine learning so here I have this
13:02 thing that looks like kind of an onion
13:04 and what it is it's just describing the
13:07 relationship between these ML terms
13:09 related to AI and we'll just work
13:10 our way through here starting at the top
13:13 so artificial intelligence also known as
13:15 AI is machines that perform jobs
13:17 that mimic human behavior so it doesn't
13:19 describe how it does that but it's
13:22 just the fact that that's what AI is
13:24 one layer underneath we have machine
13:25 learning so machines that get better at
13:28 a task without explicit programming uh
13:29 then we have deep learning so these are
13:31 machines that have an artificial neural
13:33 network inspired by the human brain to
13:35 solve complex problems and if you're
13:36 talking about someone that actually
13:39 assembles either ML or deep learning
13:41 models or algorithms that's a data
13:42 scientist so a person with
13:44 multi-disciplinary skills in math
13:46 statistics predictive modeling machine
13:48 learning to make future predictions so
13:50 what you need to understand is that AI
13:52 is just the outcome right and so AI
13:55 could be using ml underneath or deep
13:57 learning or a combination of both or
13:59 just if-else statements okay
14:05 all right so let's take a look here at
14:07 the key elements of AI so AI is the
14:08 software that imitates human behaviors
14:10 and capabilities and there are key
14:13 elements according to Azure or Microsoft
14:16 as to what makes up AI so let's go
14:17 through this list quickly here so we
14:18 have machine learning which is the
14:20 foundation of an AI system that can
14:22 learn and predict like a human you have
14:24 anomaly detection so detect outliers or
14:26 things out of place like a human
14:29 computer vision be able to see like a
14:31 human natural language processing also
14:34 known as NLP be able to process human
14:36 languages and infer context you know
14:38 like a human conversational AI be
14:41 able to hold a conversation with a human
14:45 so you know I wrote here according to
14:47 Microsoft and Azure because you know the
14:50 the global definition is a bit different
14:51 but I just wanted to put this here
14:53 because I've definitely seen this as an
14:55 exam question
15:04 let's define what is a data set so a
15:05 data set is a logical grouping of units
15:08 of data that are closely related to or
15:10 share the same data structure and there
15:12 are publicly available data sets that
15:15 are used in the learning of statistics
15:17 data analytics and machine learning I
15:19 just want to cover a couple here so the
15:21 first is the MNIST database so images of
15:24 handwritten digits used to test
15:26 classification clustering and image
15:28 processing algorithms commonly used when
15:30 learning how to build computer vision ML
15:32 models to translate
15:34 handwriting into digital text so it's
15:37 just a bunch of handwritten
15:39 digits and then another very
15:41 popular data set is the Common Objects
15:44 in Context (COCO) data set so this is a
15:45 data set which contains many common
15:48 images using a JSON file (COCO format)
15:50 that identifies objects or segments within
15:52 an image and so this data set has a
15:53 lot of stuff in it so object
15:55 segmentation recognition in
15:57 context superpixel stuff segmentation
16:00 they have a lot of images and a lot of
16:02 objects so there's a lot of stuff in
16:04 there so why am I talking about this and
16:07 in particular COCO data sets well when
16:09 you use Azure Machine Learning Studio
16:12 it has a data labeling service and
16:15 the thing is that it can
16:17 actually export out into COCO format
16:18 that's why I wanted you to get exposure
16:20 to what COCO was and the other thing
16:22 is that when you're building out Azure
16:25 machine learning pipelines
16:26 they actually have open data sets which
16:28 we'll see later in the course that
16:30 show you that you can just use very
16:33 common ones and so you might see MNIST
16:36 and COCO there so I just
16:37 wanted to get you some exposure
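To make the COCO format concrete, here is a minimal sketch of what a COCO-style annotation JSON looks like (the file name, IDs, and box values below are made up for illustration; real COCO files carry many more fields and entries):

```python
import json

# A minimal, hypothetical COCO-format annotation file: images, categories,
# and annotations that tie an object's bounding box to an image.
coco = {
    "images": [{"id": 1, "file_name": "cat_001.jpg", "width": 640, "height": 480}],
    "categories": [{"id": 17, "name": "cat", "supercategory": "animal"}],
    "annotations": [{
        "id": 101,
        "image_id": 1,          # which image the object appears in
        "category_id": 17,      # which category the object belongs to
        "bbox": [120.0, 80.0, 200.0, 150.0],  # [x, y, width, height]
        "area": 30000.0,
        "iscrowd": 0,
    }],
}

# Round-trip through JSON, as a labeling tool export would.
text = json.dumps(coco, indent=2)
parsed = json.loads(text)
print(parsed["annotations"][0]["bbox"])  # [120.0, 80.0, 200.0, 150.0]
```

One detail worth noticing: COCO stores boxes as [x, y, width, height] rather than two corner points, which is an easy thing to trip over.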
16:43 okay let's talk about data labeling so
16:45 this is the process of identifying raw
16:48 data so images text files videos and
16:50 adding one or more meaningful and
16:52 informative labels to provide context so
16:54 a machine learning model can learn so
16:55 with supervised machine learning
16:57 labeling is a prerequisite to produce
16:59 training data and each piece of data
17:01 will generally be labeled by a human the
17:03 reason why I say generally here is
17:06 because with Azure's data labeling
17:08 service they can actually do ML
17:10 assisted labeling so with
17:11 unsupervised machine learning labels
17:14 will be produced by the machine and may
17:16 not be human readable and then one
17:17 other thing I want to touch on is the
17:20 term called ground truth so this is a
17:22 properly labeled data set
17:24 that you can use as the objective
17:26 standard to train and assess a given
17:28 model is often called Ground truth the
17:30 accuracy of your trained model will depend
17:32 on the accuracy of your ground truth now
17:35 using Azure's tools I've never seen them use
17:36 the term ground truth I see that a lot
17:38 in AWS and even this graphic here is
17:41 from AWS but I just want to make sure
17:43 you are familiar with all that stuff
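To make the ground truth idea concrete, here is a tiny illustrative sketch (the file names and labels are made up): the human-verified labels act as the objective standard, and a model's predictions are assessed against them:

```python
# Hypothetical ground truth: human-verified labels for a small image set.
ground_truth = {"img1.jpg": "cat", "img2.jpg": "dog",
                "img3.jpg": "cat", "img4.jpg": "dog"}

# Hypothetical model predictions to assess against that ground truth.
predictions = {"img1.jpg": "cat", "img2.jpg": "cat",
               "img3.jpg": "cat", "img4.jpg": "dog"}

# Count how often the model agrees with the objective standard; the
# measurement is only as trustworthy as the ground truth itself.
correct = sum(1 for name, label in ground_truth.items()
              if predictions[name] == label)
accuracy = correct / len(ground_truth)
print(f"accuracy = {accuracy:.2f}")  # accuracy = 0.75
```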
17:49 okay let's compare supervised
17:50 unsupervised and reinforcement learning
17:52 starting at the top we got supervised
17:54 learning this is where the data has been
17:56 labeled for training and it's considered
17:58 task driven because you are trying to
18:00 make a prediction and get a value back so
18:02 when the labels are known and you want a
18:04 precise outcome when you need a specific
18:06 value returned and so you're going to be
18:08 using classification and regression in
18:10 these cases for unsupervised learning
18:11 this is where data that has not been
18:13 labeled the ML model needs to do its
18:15 own labeling this is considered data
18:17 driven it's trying to recognize a
18:19 structure or a pattern and so this is
18:20 when the labels are not known and the
18:23 outcome does not need to be precise when
18:25 you're trying to make sense of data so
18:27 you have clustering dimensionality
18:29 reduction and association
18:30 if you've never heard this term before the
18:32 idea is it's trying to reduce the amount
18:33 of Dimensions to make it easier to work
18:36 with the data so make sense of the data
18:38 right we have reinforcement learning
18:39 so this is where there is no data
18:41 there's an environment and an ml model
18:44 generates data and makes many
18:45 attempts to reach a goal so this is
18:48 considered decision driven and so
18:50 this is for game AI learning tasks robot
18:53 navigation when you've seen someone code
18:55 a video game that can play itself that's
18:57 what this is if you're wondering this is
19:01 not all the types of machine learning
19:03 and specifically unsupervised
19:05 and supervised are considered classical
19:07 machine learning because they heavily
19:09 rely on statistics and math to produce
19:13 the outcome but there you
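The supervised versus unsupervised contrast can be sketched with toy numbers (the data, the nearest-centroid rule, and the two-cluster split below are made up for illustration; real models are far richer):

```python
# Supervised: labels are known, so learn a rule that predicts them.
labeled = [(1.0, "low"), (1.2, "low"), (8.9, "high"), (9.3, "high")]
groups = {}
for value, label in labeled:
    groups.setdefault(label, []).append(value)
centroids = {label: sum(vals) / len(vals) for label, vals in groups.items()}

def classify(x):
    # Task driven: return a precise value (the label with the nearest centroid).
    return min(centroids, key=lambda label: abs(x - centroids[label]))

print(classify(1.1), classify(9.0))  # low high

# Unsupervised: no labels; group the raw values by structure instead.
unlabeled = [1.0, 1.2, 8.9, 9.3]
clusters = {0: [], 1: []}
for x in unlabeled:
    # Data driven: assign each value to whichever of two seed points is nearer.
    clusters[0 if abs(x - 1.0) < abs(x - 9.0) else 1].append(x)
print(clusters)  # {0: [1.0, 1.2], 1: [8.9, 9.3]}
```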
19:16 go so what is a neural network well it's
19:17 often described as mimicking the brain
19:19 it's a neuron or node that represents an
19:21 algorithm so data is inputted into a
19:22 neuron and based on the output the data
19:25 will be passed to one of many connected
19:27 neurons the connections between neurons
19:28 are weighted I really should have
19:29 highlighted that one that's very
19:31 important the network is organized
19:34 into layers there will be an input layer
19:36 one to many hidden layers and an
19:38 output layer so here's an example of a
19:41 very simple neural network notice the NN
19:43 a lot of times you'll see this in ml as
19:45 an abbreviation for neural networks and
19:46 sometimes neural networks are just
19:48 called neural Nets so just understand
19:50 that's the same term here what is deep
19:52 learning this is a neural network that
19:53 has three or more hidden layers it's
19:54 considered deep learning because at this
19:57 point it's not human readable to
19:59 understand what's going on within those
20:03 layers what is forward feed so neural
20:04 networks where they have connections
20:06 between nodes that do not form a cycle
20:07 they always move forward so that just
20:10 describes a forward pass through
20:12 the network you'll see FNN which stands
20:14 for forward feed neural network just to
20:17 describe that type of network then
20:19 there's backpropagation in forward
20:21 feed networks this is
20:22 where we move backwards through the
20:23 neural net adjusting the weights to
20:25 improve the outcome on next iteration
20:27 this is how a neural net learns the way
20:29 the back propagation knows to do this is
20:30 that there's a loss function so a
20:32 function that compares the ground truth
20:34 to the prediction to determine the error
20:36 rate how bad the network performs so
20:37 when it gets to the end it's going to
20:40 perform that calculation and then it's
20:41 going to do its back propagation and
20:44 adjust the weights then you have
20:46 activation functions I'm just going to
20:50 clear this up here so activation
20:51 functions are an algorithm
20:54 applied to a hidden layer node that
20:56 affects connected output so for this
20:57 entire hidden layer they'll all have the
20:59 same one here and it just kind of
21:02 affects how it learns and how
21:04 the weighting works so it's part of back
21:05 propagation and just the learning
21:08 process there's a concept of dense so when
21:09 the next layer increases the amount of
21:11 nodes and you have sparse so when the
21:13 next layer decreases the amount of nodes
21:14 anytime you see something going from a
21:16 dense layer to a sparse layer that's
21:17 usually called
21:19 dimensionality reduction because you're
21:20 reducing the amount of Dimensions
21:22 because the amount of nodes in your
21:24 network determines the dimensions you have
21:27 okay
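The forward feed, loss function, and backpropagation ideas above can be sketched with a single neuron (a toy illustration with made-up starting values, not how any Azure service implements it):

```python
import math

def sigmoid(z):
    # A common activation function.
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.5, 0.0        # connection weight and bias
x, y_true = 1.0, 1.0   # one training example: input and its ground truth
lr = 0.1               # learning rate

for _ in range(100):
    y_pred = sigmoid(w * x + b)    # forward feed through the "network"
    loss = (y_pred - y_true) ** 2  # loss function: how bad was the guess
    # Backpropagation: the chain rule gives the gradient of the loss
    # with respect to the weight and the bias.
    grad = 2 * (y_pred - y_true) * y_pred * (1 - y_pred)
    w -= lr * grad * x             # adjust the weights to improve
    b -= lr * grad                 # the outcome on the next iteration

# The prediction has moved from about 0.62 toward the ground truth of 1.0.
print(round(sigmoid(w * x + b), 2))
```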
21:31 what is a GPU well it's a graphics
21:33 processing unit that is specially
21:35 designed to quickly render high
21:36 resolution images and videos
21:38 concurrently gpus can perform parallel
21:40 operations on multiple sets of data so
21:42 they are commonly used for non-graphical
21:44 tasks such as machine learning and
21:46 scientific computation so a CPU has an
21:49 average of four to 16 processor cores a
21:51 GPU can have thousands of processor
21:53 cores so something that has 4 to 8 GPUs
21:55 could have as many as 40,000 cores
21:57 here's an image I grabbed right off the
21:59 Nvidia website and so it really
22:02 illustrates very well how this
22:03 would be really good for machine
22:06 learning or neural networks because neural
22:08 networks have a bunch of nodes they're
22:10 very repetitive tasks if you can spread
22:11 them across a lot of cores that's going
22:13 to work out really great so gpus are
22:15 suited for repetitive and highly
22:16 parallel computing tasks such as
22:19 rendering Graphics cryptocurrency mining
22:21 deep learning and machine
22:27 learning before we talk about Cuda
22:29 let's talk about what Nvidia is
22:31 so Nvidia is a company that manufactures
22:33 graphics processing units for gaming and
22:34 professional markets if you play video
22:36 games you've heard of Nvidia so what is
22:39 Cuda it is the Compute Unified Device
22:41 Architecture it is a parallel computing
22:43 platform and API by Nvidia that allows
22:46 developers to use Cuda-enabled GPUs for
22:49 general purpose computing on GPUs so
22:51 GPGPU all major deep learning frameworks
22:53 are integrated with the Nvidia Deep
22:56 Learning SDK the Nvidia Deep Learning
22:58 SDK is a collection of Nvidia libraries
23:00 for deep learning one of those libraries
23:02 is the Cuda deep neural network library
23:07 or cuDNN so cuDNN provides highly
23:09 tuned implementations for standard
23:11 routines such as forward and backward
23:12 convolution (convolution is really great
23:16 for computer vision) pooling
23:20 normalization and activation layers so
23:22 you know in the Azure certification
23:25 for the AI 900 they're not going to
23:26 be talking about Cuda but if you
23:27 understand these two things you'll
23:30 understand why GPUs really matter
23:37 okay all right let's get an easy
23:38 introduction to machine learning
23:40 pipelines so this one is definitely not
23:41 an exhaustive one and we're definitely
23:43 going to see more complex ones uh
23:46 throughout this course but let's get to
23:47 it here so starting on the left hand
23:49 side we might start with data labeling
23:51 this is very important when you're
23:53 doing supervised learning because you need
23:55 to label your data so the ml model can
23:57 learn by example during training this
24:00 stage and the feature engineering stage
24:02 are considered pre-processing because
24:04 we are preparing our data to be trained
24:07 for the model when we move on to
24:08 feature engineering the idea here is
24:10 that ml models can only work with
24:12 numerical data so you'll need to
24:13 translate it into a format that it can
24:15 understand so extract out the important
24:20 data that the ml model needs to focus on
24:22 okay then there's the training step
24:23 so your model needs to learn how to
24:25 become smarter it will perform multiple
24:27 iterations getting smarter with each
24:28 iteration
24:30 you might also have a hyperparameter
24:33 tuning step here the slide says tunning but
24:36 it should say tuning but the ML model
24:37 can have different parameters so you can
24:40 use ml to try out many different
24:41 parameters to optimize the outcome when
24:44 you get to deep learning it's impossible
24:46 to tweak the parameters by hand so you
24:49 have to use hyperparameter tuning then
24:50 you have serving sometimes known as
24:53 deploying but you know when we say
24:54 deploy we talk about the entire pipeline
24:56 not necessarily just the ML model
24:58 step so we need to make an ml model
25:01 accessible so we serve it by hosting in
25:03 a virtual machine or container when
25:06 we're talking about Azure Machine
25:07 learning it's either going to be an
25:09 Azure kubernetes service or Azure
25:11 container instance and you have
25:13 inference so inference is the act
25:16 of requesting to make a
25:18 prediction so you send your payload with
25:21 either CSV or whatever and you get back
25:23 the results you have a real time
25:25 endpoint and batch processing so real
25:27 time is just that batch can be real-time
25:29 as well but generally it's slower but
25:31 the idea is am I making a
25:33 single item prediction or am I giving
25:35 you a bunch of data at once and again
25:37 this is a very simplified ml pipeline
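The stages above (data labeling, feature engineering, training, serving, inference) can be strung together in a toy sketch; the spam-filter data and the trivial threshold "model" below are made up for illustration and stand in for real training:

```python
# 1. Data labeling: supervised learning needs labeled examples.
raw = [("win money now", "spam"), ("lunch at noon?", "ham"),
       ("free money", "spam"), ("meeting notes", "ham")]

# 2. Feature engineering: models only work with numbers, so turn
#    each text into numeric features.
def featurize(text):
    return [text.count("money"), text.count("free"), len(text.split())]

# 3. Training: learn a (deliberately crude) decision boundary.
def train(examples):
    spam_scores = [sum(featurize(t)[:2]) for t, y in examples if y == "spam"]
    return min(spam_scores)  # lowest score seen on any spam example

model = train(raw)

# 4. Serving/inference: the "deployed" model answers single requests,
#    like a real-time endpoint would.
def predict(text, threshold=model):
    return "spam" if sum(featurize(text)[:2]) >= threshold else "ham"

print(predict("get free money"))    # spam
print(predict("see you at lunch"))  # ham
```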
25:39 I'm sure we'll revisit ML pipelines later
25:40 in this course
25:47 so let's compare the terms
25:48 forecasting and prediction so
25:50 forecasting you make a prediction with
25:52 relevant data it's great for analysis of
25:55 Trends uh and it's not guessing and when
25:57 you're talking about prediction this is
25:58 where you make a prediction without
25:59 relevant data you use statistics to
26:01 predict future outcomes it's more of
26:04 guessing and it uses decision Theory so
26:06 imagine you have a bunch of data and the
26:08 idea is you're going to infer from that
26:10 data okay maybe it's A maybe it's B
26:12 maybe it's C and for prediction you
26:14 don't have really much data so you're
26:17 going to have to uh kind of invent it
26:18 and the idea is that you'll figure out
26:19 what the outcome is there these are
26:22 extremely broad terms but you know just
26:24 so you have a high-level view of these two
26:26 things okay
26:31 so what are performance or evaluation
26:32 metrics well they are used to evaluate
26:34 different machine learning algorithms
26:36 the idea is uh you know when your
26:37 machine learning makes a prediction
26:39 these are the metrics you're using to
26:41 evaluate to determine you know is your
26:44 ml model working as you intended so for
26:45 different types of problems different
26:47 metrics matter this is absolutely not an
26:50 exhaustive list I just want to give
26:52 you exposure to these words and
26:53 things uh so that when you see them you
26:55 go okay I'll come back here and refer to
26:57 this uh but lots of these are not
26:58 necessarily ones to
27:00 remember but classification metrics you
27:02 should know so for classification we have
27:05 accuracy precision recall F1 score ROC
27:07 and AUC for regression metrics we have
27:11 MSE RMSE and MAE for ranking metrics we have
27:14 MRR DCG and NDCG for statistical metrics we have
27:16 correlation for computer vision metrics we
27:20 have PSNR SSIM and IoU for NLP metrics we have
27:22 perplexity BLEU METEOR and ROUGE for deep
27:24 learning related metrics we have the
27:26 Inception score and the Fréchet
27:28 Inception distance I cannot pronounce that
27:31 person's name but I'm assuming it's a person
27:33 and there are two categories of
27:34 evaluation metrics we have internal
27:36 evaluations so metrics used to evaluate
27:39 the internals of an ml model so accuracy
27:41 F1 score precision recall I call them
27:44 the famous four used in all kinds of
27:46 models and uh external evaluation
27:48 metrics used to evaluate the final
27:52 prediction of an ml model so yeah uh
27:54 don't get too worked up here I know
27:56 that's a lot of stuff uh the ones that
27:58 matter we will see again
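To make the "famous four" concrete before we move on, here is a small hand-computed example. The labels are toy values of my own, not from the course:

```python
# The "famous four" classification metrics computed by hand on toy labels.
y_true = [1, 0, 1, 1, 0, 1]  # ground truth
y_pred = [1, 0, 0, 1, 0, 1]  # model predictions

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives: 3
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives: 0
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives: 1
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives: 2

accuracy = (tp + tn) / len(y_true)                  # fraction of correct predictions
precision = tp / (tp + fp)                          # of predicted positives, how many were right
recall = tp / (tp + fn)                             # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(accuracy, precision, recall, f1)
```

Libraries like scikit-learn provide these as ready-made functions, but the arithmetic underneath is just this.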
28:04 okay let's take a look at Jupyter
28:05 notebooks so these are web-based
28:07 applications for authoring documents
28:09 that combine live code narrative text
28:12 equations visualizations uh so if you're
28:13 doing data science or you're building ml
28:14 models you absolutely are going to be
28:16 working with jupyter notebooks they're
28:19 always integrated into cloud service
28:22 providers' ml tools so Jupyter
28:23 Notebook actually came about from
28:25 IPython so IPython is the precursor of
28:27 it and they extracted that feature out
28:29 it became Jupyter Notebook and IPython is
28:32 now a kernel used to run Python so when
28:35 you execute Python code here it's
28:37 using IPython which is just a version of
28:39 Python uh Jupyter notebooks were
28:41 overhauled and better integrated into an
28:43 IDE called JupyterLab which we'll talk
28:44 about here in a moment and you generally
28:46 want to open notebooks in JupyterLab the
28:48 legacy web-based interface is known as
28:50 Jupyter classic notebooks so this is
28:51 what the old one looks like you can
28:53 still open them up but everyone uses
28:54 JupyterLab now okay so let's talk
28:56 about JupyterLab so JupyterLab is the
28:58 next generation web-based user interface
29:00 all familiar features of the classic
29:02 Jupyter Notebook are in a flexible
29:05 powerful user interface it has notebooks
29:07 a terminal a text editor a file browser
29:09 rich outputs JupyterLab will
29:11 eventually replace the classic
29:13 Jupyter notebooks so there you
29:19 go we keep mentioning regression but
29:21 let's talk about it in uh more detail
29:22 here so we kind of understand the
29:24 concept so regression is the process of
29:26 finding a function to equate a labeled
29:28 data set notice it says labeled that
29:29 means it's going to be for supervised
29:32 learning into a continuous variable
29:34 number so another way to say it is
29:35 predict this variable in the future so
29:37 the future just means that the
29:39 continuous variable doesn't have to be
29:41 time but that's just a good example of
29:44 regression so what will the temperature
29:47 be next week so will it be 20 Celsius
29:48 how would we determine that well we
29:50 would have vectors so dots that are
29:53 plotted on a graph that has multiple
29:54 Dimensions the dimensions could be
29:56 greater than just X and Y you could have
29:58 many
29:59 and then you have a regression line
30:00 this is the line that's going through
30:03 our data set and uh and that's going to
30:06 help us uh figure out um how to predict
30:08 the value so how would we do that well
30:10 we would need to calculate the distance
30:11 of a vector from the regression line
30:13 which is called an error and so
30:15 different regression algorithms use
30:16 the error to predict
30:18 future variables so just to look at this
30:20 graphic here so here is our regression
30:24 line and here is a dot like a vector
30:26 a piece of information and this distance
30:26 from the line the actual distance is
30:30 what we're going to use in our ml model
30:33 to figure out if we were to plot another
30:35 line up here right you know we would
30:37 compare this line to all the other lines
30:40 okay and that's how we'd find similarity
30:42 and what we'll commonly see for this is
30:44 mean squared error root mean squared
30:48 error mean absolute error so MSE RMSE
30:50 and MAE
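Those three error metrics can be computed by hand from the distances between actual values and the regression line's predictions. The numbers below are my own toy example, not from the course:

```python
# MSE, RMSE and MAE computed by hand from the errors (distances) between
# actual values and the regression line's predictions.
actual = [20.0, 22.0, 19.0, 24.0]     # e.g. observed temperatures
predicted = [21.0, 21.0, 18.0, 26.0]  # values the regression line predicts

errors = [a - p for a, p in zip(actual, predicted)]
mse = sum(e ** 2 for e in errors) / len(errors)  # mean squared error: penalizes large errors more
rmse = mse ** 0.5                                # root mean squared error: same units as the target
mae = sum(abs(e) for e in errors) / len(errors)  # mean absolute error: average distance

print(mse, rmse, mae)
```

Note how the one error of 2 degrees pushes MSE above MAE, because squaring weighs large misses more heavily.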
30:55 okay let's take a closer look at the
30:57 concepts of classification so
30:59 classification is the process of finding
31:01 a function to divide a labeled data set
31:04 so again this is supervised learning
31:07 into classes or categories so predict a
31:10 category to apply to the inputed data so
31:11 will it rain next Saturday will it be
31:14 sunny or rainy so we have our data set
31:15 and the idea is we're drawing through
31:17 this a classification line to divide the
31:19 data set so with regression we're
31:21 measuring the distance from the
31:23 vectors to the line and with this one it's
31:25 just what side of the line is it on if
31:26 it's on this side then it's sunny if
31:29 it's on this side it's rainy okay for
31:31 classification algorithms we have
31:33 logistic regression decision trees
31:36 random forests neural networks naive
31:39 Bayes K nearest neighbors also known as
31:44 KNN and support vector machines SVMs
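The "which side of the line" idea can be shown with a tiny linear decision boundary. The weights and feature names here are hand-picked for illustration, not learned by any of the algorithms above:

```python
# Classification as "which side of the line": a linear boundary
# w·x + b = 0 assigns a class based on the sign of w·x + b.
# The weights below are hypothetical, hand-picked for illustration.
w = [1.0, 1.0]   # pretend learned weights for [humidity, cloud_cover]
b = -100.0       # pretend learned bias

def classify(humidity, cloud_cover):
    score = w[0] * humidity + w[1] * cloud_cover + b
    return "rainy" if score > 0 else "sunny"

print(classify(30, 10))  # sunny: falls on one side of the boundary
print(classify(88, 92))  # rainy: falls on the other side
```

A real classifier like logistic regression learns `w` and `b` from the labeled data instead of having them hard-coded.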
31:49 okay let's take a closer look at
31:51 clustering so clustering is the process
31:53 of grouping unlabeled data so unlabeled
31:56 data means it's unsupervised learning
31:58 based on similarity and differences so
32:00 the outcome could be group data based on
32:02 similarities or differences I guess it's
32:04 the same description up here uh but
32:06 imagine we have a graph and we have data
32:08 and the idea is we draw boundaries
32:10 around that to see uh similar groups so
32:12 maybe we're recommending purchases to
32:14 Windows computers or recommending
32:16 purchase to Mac computers now remember
32:18 this is unlabeled data so the label is
32:20 being inferred or they're just
32:23 saying these things are similar right so
32:25 clustering algorithms we have K-means
32:29 K-medoids density-based and hierarchical
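The grouping-by-similarity idea can be sketched with the assignment step at the heart of K-means. The points and centroids below are my own toy values, and a real K-means run would also re-compute the centroids repeatedly:

```python
# A bare-bones K-means style assignment step: each unlabeled point is
# grouped with its nearest centroid based on squared distance (similarity).
points = [(1, 2), (1, 4), (2, 3), (8, 8), (9, 9), (8, 10)]
centroids = [(1.5, 3.0), (8.5, 9.0)]  # assume two cluster centers are given

def nearest(point):
    # Index of the centroid closest to this point
    dists = [(point[0] - c[0]) ** 2 + (point[1] - c[1]) ** 2 for c in centroids]
    return dists.index(min(dists))

labels = [nearest(p) for p in points]
print(labels)  # the first three points form one group, the last three the other
```

Notice no labels were supplied anywhere, which is what makes this unsupervised learning.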
32:34 okay hey this is Andrew Brown from exam
32:36 Pro and we're looking at the confusion
32:38 matrix and this is a table to visualize
32:39 the model predictions the predicted
32:41 versus the ground truth labels the
32:43 actual also known as an error Matrix and
32:45 they are useful for classification
32:48 problems to determine if our um if our
32:50 classification is working as we think it
32:52 is so imagine we have a question how
32:55 many bananas did this person eat or
32:57 these people eat and so we have this
32:59 kind of a box here where we have
33:01 predicted versus actual and it's really
33:04 comparing the ground truth and what the
33:07 model predicted right and so on the exam
33:10 they'll ask you questions like okay well
33:12 imagine that uh and they might not even
33:15 say yes or no maybe like zero and one
33:17 and so what they're saying is you know
33:20 imagine you have you want to tell us the
33:22 true positives right and so the idea is
33:23 they won't show you the labels here but
33:25 you know one and one would be a true
33:28 positive and zero and zero would be a true
33:31 negative okay another thing they'll ask
33:34 you about these confusion matrices is
33:36 the size of them so the idea is that
33:40 we're looking right now at a
33:43 binary classifier
33:47 because we have just two labels one and two okay but
33:51 you could have three say one two and
33:52 three so how would you calculate that
33:54 well there would be a third row and a
33:56 third column because every actual label
33:57 is compared against every predicted
33:59 label ground truth
34:01 versus prediction and so that's how
34:03 you'll know the size a three by three
34:04 matrix will be nine cells the question
34:06 might not say cells but it'll just say nine
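The whole table can be built by hand by counting the four outcomes. The labels below are my own toy values, not exam data:

```python
# Building a binary confusion matrix by hand: predicted vs actual (ground truth).
y_true = [1, 0, 1, 1, 0, 0]  # ground truth
y_pred = [1, 0, 0, 1, 0, 1]  # model predictions

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # predicted 1, actually 1
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # predicted 0, actually 0
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # predicted 1, actually 0
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # predicted 0, actually 1

matrix = [[tn, fp],
          [fn, tp]]  # rows: actual 0/1, columns: predicted 0/1
print(matrix)
```

Two classes give a 2×2 table with four cells; three classes would give a 3×3 table with nine.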
34:12 okay so to understand anomaly detection
34:14 let's define quickly what is an anomaly
34:17 so an abnormal thing that is marked by
34:20 deviation from the norm or standard so
34:22 anomaly detection is the process of
34:23 finding outliers within a data set
34:25 called an anomaly so detecting when a
34:27 piece of data or access pattern
34:31 appears suspicious or malicious so use
34:32 cases for anomaly detection can be data
34:34 cleaning intrusion detection fraud
34:37 detection system Health monitoring event
34:39 detection and sensory or sensor networks
34:42 ecosystem disturbances detection of
34:45 critical and cascading flaws Anomaly
34:47 detections by hand is a very tedious
34:49 process so using ml for anomaly
34:51 detection is more efficient and accurate
34:53 and Azure has a service called anomaly
34:55 detector it detects anomalies in data to
34:57 quickly identify and troubleshoot
35:05 issues so computer vision is when we use
35:06 machine learning neural networks to gain
35:08 high level understanding of digital
35:11 images or videos so for computer vision
35:13 deep learning algorithms we have
35:15 convolutional neural networks these are
35:17 for image and video recognition they're
35:19 inspired after how the human eye
35:21 actually processes information and sends
35:22 it back to the brain to be processed you
35:25 have recurrent neural networks rnns
35:27 which are generally used for handwriting
35:28 recognition or speech recognition of
35:29 course these algorithms have other
35:31 applications but these are the most
35:34 common use cases for them for types of
35:36 computer vision we have image
35:37 classification so look at an image or
35:40 video and classify its place in a
35:42 category object detection so identify
35:43 objects within an image or video and
35:46 apply labels and location boundaries
35:48 semantic segmentation so identify
35:49 segments or objects by drawing pixel
35:51 masks around them so great for objects
35:54 and movement image analysis so analyze
35:57 uh an image or video to apply
35:59 descriptive context uh labels so maybe
36:01 an employee is sitting at a desk in
36:03 Tokyo would be uh something that image
36:05 analysis would do optical character
36:08 recognition or OCR find texts in images
36:11 or videos and extract them into digital
36:13 text for editing facial detection so
36:16 detect faces in a photo or video and
36:18 draw a location boundary uh and label
36:20 their expression so for computer vision
36:22 to some things around Azure or Microsoft
36:24 Services there's one called seeing AI
36:26 it's an AI app developed by Microsoft
36:29 for iOS and so you use your device
36:31 camera to identify people and
36:33 objects and the app audibly describes
36:34 those objects for people with visual
36:36 impairments it's totally free if you
36:38 have an iOS device I have an Android phone
36:40 so I cannot use it but I hear it's great
36:42 some of the Azure computer vision
36:44 service offerings are computer vision so
36:46 analyze images and videos extract
36:48 descriptions tags objects and texts
36:50 custom Vision so custom uh image
36:52 classification object detection models
36:55 using your own images face so detect and
36:58 identify people and emotions in images
37:01 form recognizer so translate scanned
37:03 documents into key value or tabular editable
37:10 data so natural language processing also
37:12 known as NLP is machine learning that
37:15 can understand the context of a corpus
37:17 Corpus being a body of related text so
37:19 NLP enables you to analyze and interpret
37:21 text within documents and email messages
37:24 interpret or contextualize spoken tokens
37:26 so for example maybe customer sentiment
37:27 analysis whether customer is happy or
37:30 sad synthesize speech so a voice
37:32 assistant talking to you
37:34 automatically translates spoken or
37:35 written phrases and sentences between
37:37 languages interpret spoken or written
37:39 commands and determine appropriate
37:42 actions a very famous example for a
37:43 voice assistant specifically or virtual
37:46 assistant for Microsoft is Cortana uh it
37:48 uses the Bing search engine to perform
37:49 tasks such as setting reminders and
37:51 answering questions for the user uh and
37:53 if you're on a Windows 10 machine uh
37:55 it's very easy to activate Cortana by
37:57 accident uh when we're talking about
37:59 Azure's NLP offering we have Text
38:01 Analytics so sentiment analysis to find
38:04 out what customers think find topic
38:06 relevant phrases using key phrase
38:07 extraction identify the language of the
38:09 text with the language detection detect
38:11 and categorize entities in your text
38:13 with named entity recognition for
38:14 translator we have real-time text
38:17 translation multilanguage support uh for
38:19 speech service we have transcribe
38:21 audible speech into readable searchable
38:23 texts and then we have
38:25 language understanding also known as LUIS
38:29 uh natural language processing service
38:30 that enables you to understand human
38:33 language in your own application website
38:35 chatbots iot device and more when we
38:38 talk about conversational AI it usually
38:40 generally uses NLP so that's where
38:42 you'll see that overlap next
38:47 okay let's take a look here at
38:49 conversational AI which is technology
38:50 that can participate in conversations
38:52 with humans so we have chatbots voice
38:54 assistants and interactive voice
38:56 recognition systems which is like the
38:58 second version of the interactive voice
39:00 response system so you know when you
39:02 call in and they say press these numbers
39:03 that is a response system and a
39:05 recognition system is when they can
39:07 actually take human uh Speech and
39:09 translate that into action so the use
39:11 cases here would be online customer
39:13 support replaces human agents for
39:16 replying about customer FAQs maybe
39:17 shipping questions anything about
39:19 customer support accessibility so voice
39:21 operated UI for those who are
39:24 impaired HR processes so employee
39:25 training onboarding updating employee
39:27 information I've never seen it used like
39:29 that but that's what they say as a use
39:30 case Healthcare so accessible affordable
39:32 healthcare so maybe you're doing a claims
39:34 process I've never seen this but maybe
39:36 in the US where you do your claims
39:38 and everything is privatized it makes
39:40 more sense Internet of Things So iot
39:42 devices so Amazon Alexa Apple Siri
39:44 Google home and I suppose Cortana but it
39:46 doesn't really have a particular device
39:48 so that's why I didn't list it there
39:50 computer software so autocomplete search
39:52 on phone or desktop so that would be
39:54 Cortana something it could do uh for the
39:56 two services that are around
39:58 conversation AI for Azure we have Q&A
40:00 maker so create a conversational
40:01 question and answer bot from your
40:04 existing content also known as a
40:06 knowledge base and Azure bot service
40:08 intelligent serverless bot service that
40:09 scales on demand used for creating
40:12 publishing managing bots so uh the idea
40:14 is you make your Bot here and then you
40:15 deploy it with this
40:21 okay let's take a look here at
40:23 responsible AI which focuses on ethical
40:25 transparent and accountable uses of AI
40:27 technology Microsoft put into practice
40:29 responsible AI via its six Microsoft AI
40:31 principles this whole thing is invented
40:34 by Microsoft uh and so you know it's not
40:35 necessarily a standard but it's
40:37 something that Microsoft is pushing hard
40:40 to uh have people adopt okay so we the
40:42 first thing we have is fairness so this
40:44 is an AI system which should treat all
40:46 people fairly we have reliability and
40:47 safety an AI system should perform
40:50 reliably and safely privacy and security
40:52 AI system should be secure and respect
40:55 privacy inclusiveness AI system should
40:56 Empower everyone and engage people
40:59 transparency AI systems should be
41:00 understandable accountability people
41:03 should be accountable for AI systems and
41:05 we need to know these in uh uh greater
41:07 detail so we're going to have a a short
41:09 little video on each of these
41:14 okay the first on our list is fairness
41:16 so AI systems should treat all people
41:18 fairly so an AI system can reinforce
41:21 existing social
41:24 stereotypes and bias can be introduced
41:27 during the development of a pipeline so
41:29 AI systems that are used to allocate or
41:31 withhold opportunities resources or
41:33 information uh in domains such as
41:35 criminal justice employment and
41:37 hiring finance and credit so an example
41:39 here would be an ml model designed to
41:40 select a final applicant for hiring
41:42 pipeline without incorporating any bias
41:44 based on gender or ethnicity which may result
41:47 in unfair Advantage so Azure ml can tell
41:49 you how each feature can influence a
41:52 model's prediction for bias uh one thing
41:54 that could be of use is Fairlearn so
41:55 it's an open source Python project to
41:57 help data scientists improve fairness
41:59 in AI systems at the time I made
42:01 this course a lot of their stuff is
42:03 still in preview so you know it's the
42:05 fairness component is not 100% there
42:07 but it's great to see that they're working on it
42:14 okay so we are on to our second AI
42:16 principle for Microsoft and this one is
42:18 AI systems should perform reliably and
42:20 safely so AI software must be rigorously
42:22 tested to ensure they work as expected
42:24 before release to the end user if there
42:26 are scenarios where AI is making
42:27 mistakes it is important to release a
42:29 report quantifying risks and harms to end
42:31 users so they are informed of the
42:33 shortcomings of an AI solution something
42:34 you should really remember for the exam
42:36 they'll definitely ask that where
42:38 concern for reliability and safety for
42:40 humans is critically important
42:43 autonomous vehicles health diagnosis
42:45 suggestions prescriptions and autonomous
42:47 weapon systems they didn't mention this
42:48 in their content and I was just
42:50 doing some additional research
42:52 I'm like yeah you really don't want
42:54 mistakes when you have automated weapons
42:55 or ethically you shouldn't have them at
42:57 all but hey that's uh that's just how
42:59 the world works but yeah this is this
43:00 category
43:06 here we're on to our third Microsoft AI
43:09 principle AI system should be secure and
43:11 respect privacy so AI can require vast
43:13 amounts of data to train deep machine ml
43:15 models the nature of an ml model may
43:17 require personally identifiable
43:19 information so
43:21 PII it is important that we ensure
43:23 protection of user data so that it is not
43:25 leaked or disclosed in some cases ml
43:27 models can be run locally on a user's
43:30 device so their PII remains on their
43:32 device avoiding the vulnerability
43:33 this is like Edge
43:36 Computing so that's the concept there AI
43:37 security principles to malicious actors
43:40 so data origin and lineage data use
43:42 internal versus external data corruption
43:44 considerations anomaly detection so
43:45 there you
43:51 go we're on to the fourth Microsoft AI
43:53 principle so AI systems should Empower
43:55 everyone and engage people if we can
43:56 design AI Solutions for the minority of
43:58 users we can design a solution for the
43:59 majority of users so when we're talking
44:00 about minority groups we're talking
44:02 about physical ability gender sexual
44:04 orientation ethnicity other factors this
44:06 one's really simple uh in terms of
44:09 practicality it doesn't 100% make sense
44:11 because if you've worked with um uh
44:13 groups that are deaf and blind
44:14 developing technology for them a lot of
44:17 times they need specialized Solutions uh
44:18 but the approach here is that you know
44:20 if we can design for the minority we can
44:22 design for all that is uh the principle
44:25 there so that's what we need to know okay
44:30 let's take a look here at transparency
44:32 so AI systems should be understandable
44:34 so interpretability and intelligibility
44:36 is when the end user can understand the
44:38 behavior of AI so transparency of AI
44:40 systems can result in mitigating
44:42 unfairness help developers debug their
44:44 AI systems gaining more trust from our
44:47 users those who build AI
44:49 systems should be open about why they're
44:51 using AI open about the limitations of
44:53 the AI systems adopting an open source
44:56 AI framework can provide transparency at
44:57 least from a technical perspective on the
45:06 system we are on to the last Microsoft
45:08 AI principle here people uh should be
45:09 accountable for AI systems so the
45:11 structures put in place to consistently
45:13 enact AI principles and take them
45:15 into account AI systems should work
45:16 within Frameworks of governments
45:18 organizational principles ethical and
45:21 legal standards that are clearly defined
45:22 principles guide Microsoft and how they
45:24 develop sell and Advocate when working
45:27 with third parties and this push towards
45:29 regulation of these principles so this
45:31 is Microsoft saying hey everybody adopt
45:33 our model um there aren't many other
45:34 models so I guess it's great that
45:36 Microsoft is taking the charge there I
45:38 just feel that it needs to be a bit more
45:40 well-developed but what we'll do is look
45:42 at some more practical examples so we
45:44 can better understand how to apply their principles
45:50 okay so if we really want to understand
45:52 how to apply the Microsoft AI principles
45:53 they've created this nice little
45:55 tool via a free web app for practical
45:57 scenarios so they have these cards you
45:58 can read through these cards they're
46:01 color coded for different scenarios and
46:02 there's a website so let's go take a
46:04 look at that and see what we can learn
46:09 okay all right so we're here on the
46:11 guidelines for human AI interaction so
46:14 we can better understand the uh how to
46:16 put into practice the Microsoft AI
46:19 principles they have 18 cards and let's
46:20 work our way through here and see the
46:22 examples the first one our list make
46:24 clear what the system can do help the
46:25 users understand what the AI system is
46:27 capable of doing so here PowerPoint
46:29 QuickStarter builds an
46:31 outline to help you get started
46:34 researching a subject it displays
46:35 suggested topics that help you
46:37 understand the features capability then
46:40 we have the Bing app shows examples of
46:42 types of things you can search for um
46:44 Apple watch displays all metrics it
46:46 tracks and explains how moving on to
46:49 the second card we have make clear how well
46:51 the system can do what it can do so here
46:53 we have office new uh companion
46:55 experience ideas dock alongside your
46:57 work and offers one-click
46:59 assistance with grammar design Data
47:01 Insights richer images and more the
47:03 unassuming term ideas coupled with label
47:05 previews help set expectations and
47:08 presented suggestions the recommender in
47:10 apple music uses language such as we
47:14 think you'll like to communicate
47:16 uncertainty the help page for Outlook
47:18 web mail explains the filtering into
47:20 focused and other and we'll start
47:21 working right away but we'll get better
47:24 with use making clear the mistakes uh
47:26 will happen and you teach the product
47:29 and set overrides onto our red cards
47:32 here we have time Services based on
47:34 context time when to act or interrupt
47:36 based on the user's current task and
47:38 environment when it's time to leave for
47:40 appointments Outlook sends a time to
47:42 leave notification with directions for
47:43 both driving and public transit taking
47:45 into account current location event
47:48 location real-time traffic
47:50 information um and then we have after
47:52 using Apple Maps routing it remembers
47:54 when you're parked your car when you
47:56 open the app after a little while it
47:58 suggests routing to the location of the
48:00 park car all these Apple examples make
48:02 me think that Microsoft has some kind of
48:04 partnership with apple I guess
48:08 Microsoft or Bill Gates did own Apple
48:09 shares so maybe they're closer than we
48:11 think uh show contextually relevant
48:13 information time when to act or
48:15 interrupt based on user's current task
48:16 and environment powered by Machine
48:18 learning acronyms in word helps you
48:21 understand shorthand employed uh in your
48:23 own work environment relative to the current open
48:27 document uh on Walmart.com when the user
48:29 is looking at a product such as gaming
48:31 console recommends accessories and games
48:33 that would go with it when a user
48:36 searches for movies Google shows results
48:38 including showtimes near the user's
48:40 location for the current data onto our
48:43 fifth card here match based uh we didn't
48:46 miss this one right yeah we
48:48 did okay so we're on the fifth one here
48:50 match relevant social norms ensure
48:51 experience is delivered in a way the
48:53 users would expect given the social
48:56 cultural context when editor identifies
48:58 ways to improve writing style presents
49:01 options politely consider using that's
49:04 the Canadian way being polite uh Google
49:06 photos is able to recognize pets and use
49:08 the wording important cats and dogs
49:11 recognizing that for many pets are an
49:13 important part of one's family and you
49:15 know what uh when I uh started renting
49:17 my new house uh I I said you know is
49:18 there a problem with dogs and my
49:21 landlord said well of course pets are
49:22 part of the family and that was
49:24 something I like to hear uh Cortana uses
49:27 semiformal tone apologizing when unable
49:30 to find a contact which is polite and
49:32 socially appropriate I like
49:35 that okay mitigate social biases ensure
49:37 AI system languages and behaviors do not
49:40 reinforce undesirable unfair stereotypes
49:42 and biases MyAnalytics summarizes how you
49:44 spend your time at work then suggest
49:46 ways to work smarter one way to
49:48 mitigate bias is by using gender neutral
49:50 icons to represent important people
49:52 sounds good to me a Bing search for CEO
49:55 or doctor shows images of diverse people
49:56 in terms
49:58 of gender and ethnicity sounds good
50:01 to me the predictive uh keyboard for
50:02 Android suggests both genders when
50:04 typing a pronoun starting with the
50:08 letter H we're on to our yellow cards uh
50:10 so support efficient invocation so make
50:12 it easy to invoke or request system
50:14 Services when needed so flashfill is a
50:16 helpful timesaver in Excel that can be
50:19 easily invoked with on canvas
50:20 interactions and uh that keep you in
50:24 flow on amazon.com oh hey there got
50:26 Amazon in addition to the system giving
50:28 recommendations as you browse you can
50:29 manually invoke additional
50:31 recommendations from the recommender for
50:34 your menu uh design ideas in Microsoft
50:38 PowerPoint can be invoked
50:39 with the press of a button if needed I
50:41 cannot stand it when that pops up I
50:44 always have to tell it to leave me alone
50:47 okay support efficient
50:54 dismissal make it easy to dismiss
50:57 or ignore undesired AI system
50:59 Services okay this sounds good to me
51:00 Microsoft forms allows you to create
51:02 custom surveys quizzes polls
51:04 questionnaires and forms some choices
51:06 questions trigger suggested options
51:09 position beneath the relevant question
51:11 the suggestion can be easily ignored and
51:13 dismissed Instagram allows the user to
51:15 easily hide or report ads that have been
51:18 suggested by AI by tapping the ellipses
51:21 at the top of the right of the ad Siri
51:24 can be easily dismissed by saying
51:27 never mind
51:30 I'm always telling my Alexa never mind
51:32 support efficient correction make it
51:34 easy to edit refine or recover when
51:36 the AI system is
51:39 wrong so alt text automatically
51:41 generates alt text for photographs by
51:43 using intelligent services in the cloud
51:44 descriptions can be easily Modified by
51:46 clicking the alt text button in the
51:49 ribbon once you set a reminder with Siri
51:52 the UI displays a tap to edit link when
51:54 Bing automatically corrects spelling
51:56 errors in search queries it provides the
51:58 option to revert to the query as
52:01 originally typed with one click on to
52:02 card number
52:04 10 Scope Services when in doubt so
52:07 engage in
52:12 disambiguation or gracefully degrade the
52:14 AI system service when uncertain about a
52:17 user's goal so when Auto replacing word
52:19 is uncertain of a correction it engages
52:22 in disambiguation by displaying multiple
52:24 options you can select from Siri will
52:26 let you know it has trouble hearing if
52:28 you don't respond or speak
52:30 too softly Bing Maps will provide
52:33 multiple routing options when
52:35 unable to recommend the best one we're on to
52:38 card number 11 make clear why the system
52:40 did what it did enable users to access
52:42 an explanation of why the AI system
52:45 behaved as it did office online
52:47 recommends documents based on
52:49 history and activity descriptive text
52:51 above each document makes it clear why
52:54 the recommendation is shown product
52:56 recommendations on Amazon.com include a why
52:59 recommended link that shows
53:00 what products in the user's shopping
53:03 history inform the recommendations
53:05 Facebook enables you to access an
53:07 explanation about why you are seeing
53:10 each ad in the news
53:13 feed onto our green cards so remember
53:15 recent interactions so maintain
53:16 short-term memory and allow the user to
53:19 make efficient references to that memory
53:21 when attaching a file Outlook offers a
53:23 list of recent files including recently
53:25 copied file links Outlook also remembers
53:28 people you have interacted with recently
53:30 and displays them when addressing a new
53:34 email Bing search remembers some
53:35 recent queries and search can be
53:37 continued conversationally how old is
53:41 he after a search for Keanu Reeves Siri
53:43 carries over the context from one
53:44 interaction to the next a text message
53:46 is created for the person you told Siri
53:50 to message onto card number 13 lucky
53:52 number 13 learn from user Behavior
53:54 personalize the user experience by
53:56 learning from their actions over time
53:57 tap on a search bar in office
54:00 applications and search lists uh the top
54:02 three commands on your screen that
54:03 you're most likely to need to
54:06 personalize the technology called zero
54:08 query means you don't even need to type in the
54:10 search bar to provide a personalized
54:12 predictive answer amazon.com gives
54:14 personalized product
54:16 recommendations based on previous
54:19 purchases onto card 14 update and adapt
54:22 cautiously limit disruptive
54:24 changes when updating and adapting
54:27 the system's behaviors so PowerPoint
54:29 designer improves slides for Office 365
54:31 subscribers by automatically generating
54:33 design ideas to choose from
54:36 designer has integrated new capabilities
54:37 such as smart graphics icon suggestions
54:40 into the existing user experience ensuring
54:41 the updates are not
54:44 disruptive Office's Tell Me
54:46 feature shows dynamically recommended
54:49 items in a designated tray area to
54:52 minimize disruptive changes onto card
54:55 number 15 encourage granular feedback
54:56 enable the users to provide
54:58 feedback indicating their preferences
55:00 during regular interactions with the AI
55:02 system so Ideas in Excel empowers you
55:04 to understand your data through high
55:05 level visual summaries Trends and
55:07 patterns it encourages feedback on each
55:10 suggestion by asking is this helpful not
55:12 only does Instagram provide the option
55:13 to hide specific ads but it also
55:15 solicits feedback to understand why the
55:18 ad is not relevant and Apple's music app
55:20 love and dislike buttons are prominent and
55:24 easily accessible number 16 convey the
55:26 consequences of user actions immediately
55:27 update or convey how user actions will
55:30 impact future behaviors of the AI system
55:32 you can get stock and geographic data
55:35 types in Excel it is as easy as typing
55:37 text into a cell and converting it to the
55:40 stocks data type or geography
55:41 data type when you perform the
55:43 conversion action an icon immediately
55:46 appears in the converted cells upon
55:47 tapping the like or dislike button for each
55:50 recommendation in Apple Music a
55:52 pop-up informs the user that they'll
55:54 receive more or fewer similar
55:56 recommendations onto card number 17
55:58 we're almost near the end provide Global
56:00 controls allow the user to globally
56:04 customize what the system monitors and
56:06 how it behaves so Editor expands on
56:08 spelling and grammar checking
56:10 capabilities of word to include more
56:11 advanced proofing and editing designed
56:14 to ensure your document is readable Editor
56:16 can flag a range of critique types and
56:18 allows you to customize the thing is that
56:21 Word's spell checking is so awful I
56:23 don't understand like it's been years
56:25 and the spell checking never gets
56:26 better so they've got to employ better
56:29 spell-checking AI I think Bing search
56:31 provides settings that impact the the
56:33 types of results the engine will return
56:36 for example safe
56:38 search uh then we have Google photos
56:40 allows the user to turn location history on
56:42 or off for future photos it's kind of
56:44 funny seeing like Bing in there about
56:46 like using AI because at one point it
56:49 seemed almost certain that Bing was
56:50 just copying Google search indexes to
56:53 learn how to index I don't know that's
56:55 Microsoft for you we're on to card
56:57 18 notify users about changes inform the
57:00 user when the AI system adds or updates its
57:02 capabilities uh the what's new dialogue
57:04 in office informs you about changes by
57:05 giving an overview of the latest
57:07 features and updates including updates
57:10 to AI features in Outlook web the help
57:13 tab includes a what's new section that
57:15 covers updates so there we go we made it
57:18 to the end of the list I hope that was a
57:20 fun listen for you and I hope
57:23 that we could kind of match up
57:25 the responsible AI I kind of wish what
57:27 they would have done is actually mapped
57:28 it out here and said where each matched but
57:30 I guess it's kind of an isolated section
57:32 that kind of ties in so I guess there we go
57:33 [Music]
57:37 [Music]
57:39 okay hey this is Andrew Brown from exam
57:41 Pro and we are looking at Azure
57:42 cognitive services and this is a
57:44 comprehensive family of AI services and
57:46 cognitive apis to help you build
57:48 intelligent apps so create customizable
57:50 pre-trained models built with
57:52 breakthrough AI research I put that in
57:53 quotations I'm kind of throwing some
57:56 shade at uh Microsoft at Azure just
57:58 because it's their marketing material
58:00 right uh deploy cognitive Services
58:02 anywhere from Cloud to the edge uh with
58:04 containers get started quickly no
58:06 machine learning expertise required but
58:08 I think it it helps to have a bit of
58:10 background knowledge uh develop with
58:12 strict ethical standards uh Microsoft
58:15 loves talking about the responsible um
58:17 their responsible AI stuff empowering
58:19 responsible use with industry leading
58:21 tools and guidelines so let's do a quick
58:24 breakdown of the types of services in
58:25 this family so for decision we have
58:27 anomaly detector identify potential
58:30 problems early on content moderator
58:32 detect potentially offensive or unwanted
58:34 content personalizer create Rich
58:36 personalized experiences for every user
58:37 for languages we have language
58:40 understanding also known as LUIS
58:41 pronounced Lewis I don't know why I didn't put the
58:42 initialism there but don't worry we'll
58:45 see it again build natural language
58:46 understanding into apps bots and IoT
58:48 devices QnA Maker create a
58:50 conversational question and answer layer
58:53 over your data text analytics detect
58:55 sentiment so sentiment is like
58:58 whether customers are happy sad or glad
59:00 key phrases and named entities
59:02 translator detect and translate more
59:04 than 90 supported languages for speech
59:06 we have speech to text so transcribe
59:08 audible speech into readable searchable
59:10 text text to speech convert text to
59:13 lifelike speech for natural interfaces
59:16 speech translation so integrate realtime
59:18 speech translation into your apps uh
59:20 speaker recognition identify and
59:23 verify the people speaking based on
59:25 audio for vision we have computer
59:27 vision so analyze content in images and
59:31 videos custom vision so
59:34 customize image recognition to
59:37 fit your business needs Face detect
59:39 and identify people and emotions in
59:41 images so there you [Music]
59:45 [Music]
59:48 go so Azure cognitive Services is an
59:50 umbrella AI service that enables
59:52 customers to access multiple AI services
59:55 with an AI key and API endpoint so what
59:57 you do is you go create a new cognitive
59:59 service and once you're there it's going
60:01 to generate out two keys and an endpoint
60:02 and that is what you're using generally
60:04 for authentication uh with the various
60:06 AI services programmatically and that is
60:08 something that is key to the service
60:09 that you need to
60:11 [Music]
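The two-keys-plus-endpoint setup described above can be sketched in a few lines of Python. This is a hedged illustration rather than an official Azure sample: the endpoint and key values are placeholders, though Cognitive Services REST calls do authenticate with the Ocp-Apim-Subscription-Key header shown here.

```python
# Sketch of authenticating to Cognitive Services with a key and endpoint.
# The endpoint and key below are placeholders, not real credentials.
endpoint = "https://my-resource.cognitiveservices.azure.com"  # hypothetical resource
key = "<one-of-the-two-generated-keys>"

# Cognitive Services REST APIs accept the key via this request header:
headers = {
    "Ocp-Apim-Subscription-Key": key,
    "Content-Type": "application/json",
}

# A call then targets a service-specific path under the shared endpoint,
# e.g. Text Analytics sentiment (path shown for illustration):
url = endpoint + "/text/analytics/v3.0/sentiment"
```

Either of the two generated keys works; having two lets you rotate one while the other stays in service.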
60:12 know so knowledge mining is a discipline
60:15 in AI that uses a combination of
60:16 intelligent services to quickly learn
60:18 from vast amounts of information so it
60:20 allows organizations to deeply
60:22 understand and easily explore
60:23 information uncover hidden insights and
60:25 find relationships and patterns at scale
60:27 so we have ingest enrich and explore as
60:31 our three steps so for ingest content
60:32 from a range of sources using connectors
60:34 to first- and third-party data stores so
60:37 we might have structured data such as
60:38 databases and CSVs the CSVs would more be
60:42 semi-structured but we're not going to
60:43 get into that level of detail
60:45 unstructured data so PDFs videos images
60:47 and audio for enrich the content with AI
60:50 capabilities that let you extract
60:52 information find patterns and deepen
60:54 understanding so cognitive services
60:56 like vision language speech decision and
60:59 search and explore the newly indexed
61:01 data via search bots existing
61:03 business applications and data
61:05 visualizations enrich structured data
61:07 customer relationship management ERP
61:10 systems Power BI this whole knowledge
61:12 mining thing is a thing but I
61:14 believe that the whole model around this
61:16 is so that Azure shows you how you
61:18 can use the cognitive services to solve
61:21 things without having to invent new
61:23 solutions so let's look at a bunch of
61:24 use cases that Azure has and see
61:27 where we can find some useful uses
61:29 so the first one here is for content
61:31 research so when organizations task
61:33 employees with review and research of
61:35 technical data it can be tedious to read
61:37 page after page of dense text knowledge
61:39 mining helps employees quickly review
61:41 these dense materials so you have a
61:43 document and in the enrichment step you
61:46 could be doing printed text recognition
61:48 key phrase extraction sharpener
61:50 skills technical keyword
61:52 sanitation format definition miner large
61:55 scale vocabulary matcher you put it
61:57 through a search service and now you
61:59 have a searchable reference library so it
62:00 makes things a lot easier to work with
62:03 now we have audit risk compliance
62:05 management so developers could use
62:07 knowledge mining to help attorneys
62:08 quickly identify entities of importance
62:10 from discovery documents and flag
62:12 important ideas across documents so we
62:14 have documents so clause extraction
62:17 clause classification GDPR risk named
62:20 entity extraction key phrase
62:22 extraction language detection automated
62:24 translation then you put it back into
62:26 a search index and now you can use it in a
62:28 management platform or a Word
62:31 plugin and so we have business process
62:33 management in industries where bidding
62:35 competition is fierce or when the
62:37 diagnosis of a problem must be quick or
62:39 in near real time companies use
62:41 knowledge mining to avoid costly
62:44 mistakes so the client drilling
62:46 and completion reports document
62:48 processor AI services and custom models
62:50 queue for human validation intelligent
62:53 automation then send it to a backend
62:55 system and/or a data lake
62:57 and then you do your analytics dashboard
63:00 then we have customer support and
63:01 feedback analysis so for many
63:04 companies customer support is costly and
63:06 inefficient knowledge mining can
63:07 help customer support teams quickly find
63:09 the right answers for a customer inquiry
63:12 or assess customer sentiment at scale so
63:15 you have your source data you do your
63:16 document cracking use cognitive skills
63:19 so pre-trained services or custom you
63:21 have enriched documents from here you're
63:23 going to do your projections and have a
63:24 knowledge store you're going to have a
63:26 search index and then do your analytics
63:28 something like Power BI we have
63:30 digital asset management I know
63:31 there's a lot of these but it really
63:32 helps you understand how cognitive
63:34 services are going to be useful given
63:36 the amount of unstructured data created
63:38 daily many companies are struggling to
63:39 make use of or find information within
63:42 their files knowledge mining through a
63:43 search index makes it easy for end
63:45 customers and employees to locate what
63:47 they're looking for faster so you ingest
63:49 the art metadata and the actual
63:50 images themselves for the top layer we're
63:52 doing geopoint extractor biographical
63:54 enricher then down below we're tagging
63:56 with a custom object detector similar
63:58 image tagger we put it in a search index
64:00 they love those search indexes and now
64:02 you have an art
64:04 explorer we have contract management
64:07 this is the last one here many companies
64:09 create products for multiple sectors
64:11 hence the business opportunities with
64:12 different vendors and buyers increase
64:14 exponentially knowledge mining can help
64:16 organizations to scour thousands of
64:18 pages of sources to create accurate bids
64:21 so here we have RFP documents this
64:23 will actually probably come back later
64:25 in the original set but we'll
64:28 do risk extraction printed text
64:30 recognition key phrase extraction
64:32 organizational extraction engineering
64:34 standards we'll create a search index
64:36 and put it here this will bring back
64:38 data also metadata extraction will come
64:40 back here and then this is just like a
64:41 continuous pipeline
64:43 [Music]
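The ingest, enrich, and explore steps that all of these use cases share can be reduced to a toy pipeline. Everything below is a stand-in of my own (the fake key-phrase "skill", the in-memory index); a real solution would use Azure Cognitive Search skillsets backed by Cognitive Services.

```python
# Toy sketch of the ingest -> enrich -> explore flow; all logic is a stand-in.

def ingest(sources):
    """Pull raw documents in from structured/unstructured sources."""
    return [{"id": i, "text": text} for i, text in enumerate(sources)]

def enrich(docs):
    """Apply an AI 'skill' to each document (here: a fake key-phrase extractor)."""
    for doc in docs:
        doc["key_phrases"] = [w for w in doc["text"].split() if len(w) > 6]
    return docs

def explore(docs, term):
    """Query the enriched index for documents mentioning a term."""
    return [d["id"] for d in docs if term in d["key_phrases"]]

index = enrich(ingest(["drilling completion report", "customer feedback survey"]))
print(explore(index, "drilling"))  # -> [0]
```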
64:46 okay hey this is Andrew Brown from Exam
64:49 Pro and we are looking at the Face service
64:51 and the Azure Face service provides AI
64:53 algorithms that can detect recognize and
64:55 analyze human faces in images such
64:58 as a face in an image face with specific
65:00 attributes face landmarks similar faces
65:03 and the same face as a specific identity
65:06 across a gallery of images so here is an
65:08 example of an image that I ran that
65:10 we'll do in the follow along and what
65:13 it's done is it's drawn a bounding box
65:14 around the face and there's this ID and
65:17 this is a unique identifier string
65:19 for each detected face in an image and
65:21 these can be unique across the gallery
65:22 which is really useful as well another
65:24 cool thing you can do is face
65:26 landmarks so the idea is that you have a
65:29 face and it can identify very particular
65:31 components of it and up to 27 predefined
65:34 landmarks is what is provided with this
65:36 Face service another interesting
65:39 thing is face attributes so you can
65:41 check whether they're wearing
65:43 accessories so think like earrings or
65:45 lip rings determine their age the
65:48 blurriness of the image what kind of
65:50 emotion is being experienced the
65:52 exposure of the image you know the
65:54 contrast facial hair gender glasses
65:58 your hair in general the head pose
66:01 there's a lot of information around that
66:02 makeup which seems to be limited like
66:04 when we ran it here in the lab all we
66:07 got back was eye makeup and lip makeup
66:09 but hey we get some information whether
66:11 they're wearing a mask noise so
66:13 whether there's artifacts like visual
66:15 artifacts or occlusion so whether an
66:18 object is blocking parts of the face and
66:20 then they simply have a Boolean value
66:22 for whether the person is smiling or not
66:24 which I assume is a very common
66:26 component so that's pretty much all we
66:28 really need to know about the Face
66:29 service and there you
66:30 [Music]
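The fields covered above — the unique face ID, the bounding box, and the attributes — are easiest to picture as the JSON the service returns. The response below is hand-written for illustration (every value is invented), shaped like a Face detect result.

```python
# Hand-written sample shaped like a Face detect response; values are invented.
sample_response = [{
    "faceId": "c5c24a82-6845-4031-9d5d-978df9175426",   # unique per detected face
    "faceRectangle": {"top": 54, "left": 261, "width": 78, "height": 78},
    "faceAttributes": {
        "age": 31.0,
        "glasses": "NoGlasses",
        "smile": True,  # the Boolean smiling value mentioned above
        "occlusion": {"foreheadOccluded": False, "mouthOccluded": False},
        "makeup": {"eyeMakeup": True, "lipMakeup": True},
    },
}]

# Pull out the bounding box and a couple of attributes per detected face
for face in sample_response:
    box = face["faceRectangle"]
    print(face["faceId"], (box["left"], box["top"], box["width"], box["height"]))
    print("smiling:", face["faceAttributes"]["smile"])
```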
66:34 go hey this is Andrew Brown from Exam
66:36 Pro and we are looking at the speech and
66:38 translate services so Azure's Translate
66:40 service is a translation service as the
66:42 name implies and it can translate 90
66:44 languages and dialects and I was even
66:46 surprised to find out that it can
66:47 translate into Klingon and it uses
66:51 neural machine translation NMT
66:53 replacing its legacy statistical machine
66:56 translation SMT so my guess here is
66:59 that statistical meaning that it used
67:01 classical machine learning back in 2010
67:03 and then they decided to switch it
67:05 over to neural networks which of
67:08 course would be a lot more accurate
67:09 Azure's Translate service can support a
67:12 custom translator so it allows you to
67:13 extend the service for translation based
67:15 on your business domain use cases so if
67:17 you use a lot of technical words and
67:19 things like that then you can fine-tune
67:21 that or particular phrases then there's
67:23 the other service the Azure Speech service
67:26 and this is a speech synthesis
67:29 service so what it can do is speech to
67:31 text text to speech and speech
67:33 translation so it's synthesizing
67:35 creating new voices okay so we have
67:37 speech to text so real-time speech to
67:39 text batching multi-device
67:42 conversation and conversation transcription
67:44 and you can create custom speech models
67:46 then you have text to speech so this
67:48 utilizes a speech synthesis markup
67:51 language so it's just a way of
67:52 formatting it and it can create custom
67:56 voices then you have the voice
67:58 assistant so it integrates with the Bot
68:01 Framework SDK and speaker recognition so
68:03 speaker verification and identification
68:03 so there you
68:04 [Music]
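Both services just covered can be sketched without sending anything over the network: the shape of a Translator v3 /translate request, and a minimal SSML document for text to speech. The Translator endpoint and the tlh-Latn (Klingon) language code do exist in Translator v3, but the key and the voice name are placeholders, so treat the exact values as illustrative.

```python
# Translator: shape of a v3 /translate request (nothing is sent here)
translate_url = "https://api.cognitive.microsofttranslator.com/translate"
params = {"api-version": "3.0", "from": "en", "to": ["fr", "tlh-Latn"]}  # tlh = Klingon
body = [{"text": "Hello, world!"}]
headers = {"Ocp-Apim-Subscription-Key": "<your-key>", "Content-Type": "application/json"}

# Speech: a minimal Speech Synthesis Markup Language (SSML) document;
# the voice name is an example neural voice, treat it as a placeholder
voice = "en-US-JennyNeural"
ssml = (
    '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-US">'
    f'<voice name="{voice}">Welcome to the AI-900 course.</voice>'
    "</speak>"
)
```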
68:07 go hey this is Andrew Brown from Exam
68:10 Pro and we are looking at text analytics
68:12 and this is a service for NLP so natural
68:14 language processing for text mining and
68:17 text analysis so text analytics can
68:19 perform sentiment analysis so find out
68:22 what people think about your brand or
68:23 topics the feature provides sentiment
68:25 labels such as negative neutral positive
68:28 then you have opinion mining which is an
68:30 aspect-based sentiment analysis it's for
68:33 granular information about the opinions
68:34 related to aspects then you have key
68:37 phrase extraction so quickly identify
68:39 the main concepts in text you have
68:41 language detection that detects the
68:42 language an inputted text is
68:45 written in and you have named entity
68:47 recognition so NER identify and
68:50 categorize entities in your text as
68:51 people places objects and quantities and a
68:54 subset of NER is personally identifiable
68:58 information so PII let's just look at a
69:03 few of these in more detail some of them
69:04 are very obvious but for some of these it
69:05 helps to have an example so the first
69:06 one we're looking at is key phrase
69:09 extraction so quickly identify the main
69:10 concepts in text so key phrase
69:12 extraction works best
69:13 when you give it bigger amounts of text
69:15 to work on this is the opposite of
69:17 sentiment analysis which performs better
69:20 on smaller amounts of text so document
69:22 sizes can be 5,000 or fewer characters
69:25 per document and you can have up to a
69:28 thousand items per collection so imagine
69:29 you have a movie review with a lot of
69:31 text in here and you want to extract
69:32 out the key phrases so here it
69:35 identified Borg ship Enterprise
69:37 surface travels things like that then
69:40 you have named entity recognition so
69:42 this detects words and phrases mentioned
69:43 in unstructured data that can be
69:45 associated with one or more semantic
69:48 types and so here's an example I think
69:50 this is medicine based and so the idea
69:52 is that it's identifying
69:56 these words or phrases and then it's
69:58 applying a semantic type so it's saying
70:00 like this is a diagnosis this is a
70:02 medication class and stuff like that
70:04 semantic types can be more broad so
70:06 there's location and event but location's
70:08 in there twice person diagnosis age and
70:10 there is a predefined set I believe that
70:13 is in Azure that you should expect
70:14 but they have a generic one and then
70:16 there's one that's for health we're
70:16 there's one that's for health we're looking at sentiment analysis this
70:18 looking at sentiment analysis this graphic makes it uh make a lot more
70:20 graphic makes it uh make a lot more sense when we're splitting between
70:21 sense when we're splitting between sentiment and opinion mining but the
70:24 sentiment and opinion mining but the idea here is that sentiment analysis
70:25 idea here is that sentiment analysis will apply labels and confidence scores
70:27 will apply labels and confidence scores to text at the sentence and document
70:30 to text at the sentence and document level and so labels can include negative
70:33 level and so labels can include negative positive mixed or neutral and we'll have
70:35 positive mixed or neutral and we'll have a confidence score ranging from 0 to one
70:38 a confidence score ranging from 0 to one and so over here we have a sentiment
70:40 and so over here we have a sentiment analysis of this line here and it's
70:42 analysis of this line here and it's saying that this was a negative
70:43 saying that this was a negative sentiment but look there's something
70:44 sentiment but look there's something that's positive and there's something
70:46 that's positive and there's something that's negative so was it really
70:47 that's negative so was it really negative and that's where opinion mining
70:49 negative and that's where opinion mining gets really useful because it has more
70:51 gets really useful because it has more granular data where we have a subject
70:54 granular data where we have a subject and we have an opinion right and so here
70:56 and we have an opinion right and so here we can see the room was great but the
70:58 we can see the room was great but the staff was unfriendly negative so we have
71:00 staff was unfriendly negative so we have a bit of a split there
71:02 a bit of a split there [Music]
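The split between document-level sentiment and per-target opinion mining described above can be sketched locally like this. This is not the Azure Text Analytics SDK, just a toy illustration of the shape of the result; the data and field names are made up:

```python
# Toy sketch of opinion mining: per-target opinions with confidence scores,
# rolled up into a document-level label. Data and names are illustrative,
# not the real Text Analytics response schema.

def document_sentiment(targets):
    """Label the document from its per-target opinions."""
    labels = {t["sentiment"] for t in targets}
    if labels == {"positive"}:
        return "positive"
    if labels == {"negative"}:
        return "negative"
    return "mixed"  # both positive and negative opinions present

review = [
    {"target": "room",  "sentiment": "positive", "confidence": 0.98},
    {"target": "staff", "sentiment": "negative", "confidence": 0.99},
]

print(document_sentiment(review))  # mixed
```

This is why the hotel review above comes back "mixed" once opinion mining is in play: one target is positive and another is negative, even though plain sentence-level sentiment called the whole line negative.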
71:06 okay hey this is Andrew Brown from exam Pro and we are looking at optical character recognition also known as OCR and this is the process of extracting printed or handwritten text into a digital and editable format
71:16 so OCR can be applied to photos of street signs products documents invoices bills financial reports articles and more
71:26 and so here's an example of us extracting out nutritional facts off the back of a food product
71:34 so Azure has two different kinds of APIs that can perform OCR they have the OCR API and the Read API
71:39 so the OCR API uses an older recognition model it supports only images it executes synchronously returning immediately when it detects text it's suited for less text it supports more languages and it's easier to implement
71:58 and on the other side we have the Read API so this is an updated recognition model it supports images and PDFs executes asynchronously parallelizes tasks per line for faster results it's suited for lots of text it supports fewer languages and it's a bit more difficult to implement
72:16 and so when we want to use this service we're going to be using the Computer Vision SDK okay
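Because the Read API is asynchronous, your code submits the image and then polls an operation until it finishes. Here's a local sketch of that submit-then-poll pattern with a stubbed operation standing in for the service; with the real Computer Vision SDK you would poll the read-result endpoint instead:

```python
import time

# Local sketch of the Read API's asynchronous pattern: submit, then poll an
# operation until it succeeds. FakeReadOperation is a stand-in for the real
# service call, purely for illustration.

class FakeReadOperation:
    def __init__(self, lines):
        self._polls_left = 2   # pretend the service needs two polls to finish
        self._lines = lines

    def status(self):
        if self._polls_left > 0:
            self._polls_left -= 1
            return "running"
        return "succeeded"

    def result(self):
        return self._lines

def read_text(operation, delay=0.0):
    """Poll the operation until it succeeds, then return the extracted lines."""
    while operation.status() != "succeeded":
        time.sleep(delay)      # back off between polls
    return operation.result()

op = FakeReadOperation(["Nutrition Facts", "Calories 200"])
print(read_text(op))  # ['Nutrition Facts', 'Calories 200']
```

The synchronous OCR API, by contrast, returns its text in a single call, which is part of why it's easier to implement for small amounts of text.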
72:28 hey this is Andrew Brown from exam Pro and we're taking a look here at the form recognizer service this is a specialized OCR service that translates printed text into digital and editable content
72:37 it preserves the structure and relationships of the form-like data and that's what makes it so special
72:43 so form recognizer is used to automate data entry in your applications and enrich your document search capabilities
72:48 it can identify key value pairs selection marks and table structures and it can produce output structures such as original file relationships bounding boxes and confidence scores
72:58 and form recognizer is composed of custom document processing models pre-built models for invoices receipts IDs and business cards and the layout model
73:07 let's talk about the layout model here so it extracts text selection marks and table structures along with bounding box coordinates from documents
73:13 form recognizer can extract text selection marks and table structures with the row and column numbers associated with the text using high-definition optical character enhancement models
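The output structures mentioned above (key-value pairs with bounding boxes and confidence scores) might look roughly like this. The shapes and field names here are illustrative stand-ins, not the actual service response; a common pattern is to keep only fields above a confidence threshold:

```python
# Toy sketch of form-recognizer-style output: key-value pairs with bounding
# boxes and confidence scores. Field names and values are made up.

extracted = [
    {"key": "InvoiceId",   "value": "INV-103",    "confidence": 0.97,
     "bounding_box": [(10, 10), (90, 10), (90, 30), (10, 30)]},
    {"key": "InvoiceDate", "value": "2023-01-02", "confidence": 0.55,
     "bounding_box": [(10, 40), (120, 40), (120, 60), (10, 60)]},
]

def confident_fields(fields, threshold=0.8):
    """Keep only fields the model is reasonably sure about."""
    return {f["key"]: f["value"] for f in fields if f["confidence"] >= threshold}

print(confident_fields(extracted))  # {'InvoiceId': 'INV-103'}
```

Low-confidence fields like the date above would typically be routed to a human for review rather than written straight into your system.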
74:52 so let's touch upon custom models so custom models allow you to extract text key value pairs selection marks and tabular data from your forms
75:00 these models are trained with your own data so they're tailored to your forms and you only need five sample input forms to start
75:06 a trained document processing model can output structured data that includes the relationships in the original form document
75:11 after you train the model you can test and retrain it and eventually use it to reliably extract data from more forms according to your needs
75:18 you have two learning options you have unsupervised learning to understand the layout and relationships between fields and entries in your forms and you have supervised learning to extract values of interest using the labeled forms
75:29 so we've covered unsupervised and supervised learning so you're going to be very familiar with these two
75:34 [Music]
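The two training options above, unsupervised (layout only) versus supervised (labeled values of interest), plus the five-sample minimum, can be sketched like this. These structures are hypothetical stand-ins, not the actual training payload the service accepts:

```python
# Sketch of the two custom-model training options described above.
# The data shapes are illustrative only.

unlabeled_forms = ["form1.pdf", "form2.pdf", "form3.pdf",
                   "form4.pdf", "form5.pdf"]       # unsupervised: layout only

labeled_forms = [                                   # supervised: labeled values
    {"file": "form1.pdf", "labels": {"Total": "19.99", "Date": "2023-01-02"}},
    {"file": "form2.pdf", "labels": {"Total": "4.50",  "Date": "2023-01-03"}},
]

def ready_to_train(samples, minimum=5):
    """Custom models need at least five sample input forms to start training."""
    return len(samples) >= minimum

print(ready_to_train(unlabeled_forms))  # True
print(ready_to_train(labeled_forms))    # False, only two labeled samples so far
```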
75:37 okay so the form recognizer service has many pre-built models that are easy to get started with so let's go look at them and see what kind of fields each extracts out by default
75:49 so the first is receipts so sales receipts from Australia Canada Great Britain India and the United States will work great here
75:56 and the fields it will extract are receipt type merchant name merchant phone number merchant address transaction date transaction time total subtotal tax tip and items with name quantity price and total price
76:04 if there's information on a receipt that you're not getting out of these fields that's where you make your own custom model right
76:11 for business cards it's only available for English business cards but we can extract out contact names first name last name company names departments job titles emails websites addresses mobile phones faxes work phones and other phone numbers
76:25 not sure how many people are using business cards these days but hey they have it as an option
76:29 for invoices extract data from invoices in various formats and return structured data so we have customer name customer ID purchase order invoice ID invoice date due date vendor name vendor address vendor address recipient customer address customer address recipient billing address billing address recipient shipping address subtotal total tax invoice total amount due service address remittance address service start date and end date and previous unpaid balance
76:57 and then they even have one for line items so items amount description quantity unit price product code unit date and tax
77:06 and then for IDs which could be worldwide passports US driver licenses things like that you would have fields such as country region date of birth date of expiration document number first name last name nationality sex machine readable zone I'm not sure what that is document type and address and region
77:25 and there are some additional features with some of these models we didn't really cover them it's not that important but yeah there we go
77:33 [Music]
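To make the receipt field list above concrete, here is a sketch of what a pre-built receipt result might look like, with a small cross-check over the line items. The values and exact field shapes are made up for illustration:

```python
# Illustrative receipt-model output using the field names listed above
# (merchant name, transaction date, items, subtotal, tax, total).
# All values are hypothetical.

receipt = {
    "ReceiptType": "Itemized",
    "MerchantName": "Example Grocer",      # hypothetical merchant
    "TransactionDate": "2023-01-02",
    "Items": [
        {"Name": "Apples", "Quantity": 2, "Price": 1.50, "TotalPrice": 3.00},
        {"Name": "Bread",  "Quantity": 1, "Price": 2.25, "TotalPrice": 2.25},
    ],
    "Subtotal": 5.25,
    "Tax": 0.75,
    "Total": 6.00,
}

def items_add_up(r):
    """Cross-check: do the line items plus tax equal the stated total?"""
    items_sum = sum(i["TotalPrice"] for i in r["Items"])
    return round(items_sum + r["Tax"], 2) == r["Total"]

print(items_add_up(receipt))  # True
```

A cross-check like this is a cheap way to catch OCR misreads before the numbers flow into downstream systems.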
77:37 hey this is Andrew Brown from exam Pro and we are looking at Language Understanding or LUIS depending on how you'd like to say it
77:45 and this is a no-code ML service to build natural language into apps bots and IoT devices to quickly create enterprise-ready custom models that continuously improve
77:56 so LUIS is accessed via its own isolated domain at luis.ai and it utilizes NLP and NLU
78:07 so NLU is the ability to transform a linguistic statement into a representation that enables you to understand your users naturally
78:16 and it is intended to focus on intention and extraction okay so what the users want and what the users are talking about
78:26 so the LUIS application is composed of a schema and the schema is autogenerated for you when you use the LUIS AI web interface so you definitely aren't going to be writing this by hand but it just helps to see what's kind of in there
78:37 if you do have some programmatic skills you can obviously make better use of the service than just the web interface
78:42 but the schema defines intents so what the users are asking for and a LUIS app always contains a None intent we'll talk about why that is in a moment
78:50 and entities what parts of the intent are used to determine the answer
78:55 then you also have utterances so examples of the user input that include intents and entities to train the ML model to match predictions against real user input
79:04 so an intent requires one or more example utterances for training and it is recommended to have 15 to 30 example utterances
79:13 to explicitly train it to ignore an utterance you use the None intent
79:15 so intents classify user utterances and entities extract data from utterances so hopefully that makes sense I always get this stuff mixed up and it always takes me a bit of time to understand
79:28 there is more than just these things there's like features and other things but for the AI 900 we don't need to go that deep okay
79:36 just to visualize this and make it a bit easier imagine we have this utterance here these would be the entities so we have two and Toronto and this is the example utterance
79:47 and then the idea is that you'd have the intent and if you look at this keyword here this really helps where it says classify that's what it is it's a classification of this example utterance and that's how the ML model is going to learn
79:59 [Music]
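The schema pieces above (intents including the required None intent, entities, and example utterances) can be sketched as a little dictionary, with a naive keyword matcher standing in for the trained model. This is purely illustrative; the real LUIS service trains an actual ML classifier, and the schema shape here is hypothetical:

```python
# Toy sketch of a LUIS-style schema: intents (including the required None
# intent), entities, and example utterances. The matching below is naive
# keyword overlap purely for illustration, not how LUIS actually classifies.

schema = {
    "intents": {
        "BookFlight": ["book two tickets to Toronto",
                       "I want to fly to Toronto"],
        "None": [],                  # catch-all for utterances to ignore
    },
    "entities": ["number", "destination"],
}

def classify(utterance):
    """Pick the intent whose example utterances best overlap the input."""
    words = set(utterance.lower().split())
    best, overlap = "None", 0
    for intent, examples in schema["intents"].items():
        for ex in examples:
            shared = len(words & set(ex.lower().split()))
            if shared > overlap:
                best, overlap = intent, shared
    return best

print(classify("book tickets to Toronto"))   # BookFlight
print(classify("what's the weather"))        # None
```

Note how anything that doesn't resemble a trained example falls through to None, which is exactly the role the None intent plays in a real LUIS app.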
80:02 okay hey this is Andrew Brown from exam Pro and we are looking at the QnA Maker service and this is a cloud-based NLP service that allows you to create a natural conversational layer over your data
80:14 so QnA Maker is hosted on its own isolated domain at qnamaker.ai and it will help you find the most appropriate answer to any input from your custom knowledge base of information
80:25 so it's commonly used to build conversational clients which include social apps chat bots and speech-enabled desktop applications
80:31 QnA Maker doesn't store customer data all the customer data is stored in the region the customer deploys the dependent service instances within okay
80:41 so let's look at some of the use cases for this so when you have static information you can use QnA Maker in your knowledge base of answers this knowledge base is custom to your needs which you've built with documents such as PDFs and URLs
80:53 when you want to provide the same answer to a repeated question or command when different users submit the same question the same answer is returned
80:59 when you want to filter static information based on meta information so metadata tags provide additional filtering options relevant to your client application users and information common metadata information includes chitchat content type format content purpose and content freshness
81:14 and there's a use case when you want to manage a bot conversation that includes static information so your knowledge base takes a user conversation text or command and answers it
81:24 if the answer is part of a predetermined conversation flow represented in the knowledge base with multiple turns of context the bot can easily provide this flow
81:32 so QnA Maker imports your content into a knowledge base of question and answer pairs and QnA Maker can build your knowledge base from an existing document manual or website URL docx or PDF
81:45 I thought this was the coolest thing so you can just basically have anyone write a docx as long as it has headings and text and I think you can even extract out images and it'll just turn it into the bot it just saves you so much time it's crazy
81:56 it will use ML to extract the question and answer pairs the content of the question and answer pairs includes all the alternate forms of the question metadata tags used to filter choices during the search and follow-up prompts to continue the search refinement
82:10 QnA Maker stores answer text in markdown
82:13 once your knowledge base is imported you can fine-tune the imported results by editing the question and answer pairs as seen here
82:19 there is the chat box so you can converse with your bot through a chat box I wouldn't say it's particularly a feature of QnA Maker but I just want you to know that's how you'd interact with it
82:31 so whether you're using the QnA Maker AI the Azure bot service the bot composer or an embeddable one via channels you'll see this box where you can start typing in your questions and get back the answers to test it
82:43 here an example is a multi-turn conversation so somebody asked a generic question and it said hey are you talking about AWS or Azure which is kind of like a follow-up prompt and we'll talk about multi-turn here in a second but that's something I want you to know about okay
82:55 so chitchat is a feature in QnA Maker that allows you to easily add prepopulated sets of top chitchat into your knowledge base the dataset has about 100 scenarios of chitchat in the voices of multiple personas
83:08 so the idea is like if someone says something random like how are you doing or what's the weather today things that your bot wouldn't necessarily know it has canned answers and they're going to be different based on how you want the response to be okay
83:19 there's a concept of layered ranking so the QnA Maker system uses a layered ranking approach the data is stored in Azure search which also serves as the first ranking layer the top results from Azure search are then passed through QnA Maker's NLP reranking model to produce the final results and confidence score
83:40 touching on multi-turn conversation this uses follow-up prompts and context to manage the multiple turns known as multi-turn for your bot from one question to another when a question can't be answered in a single turn that is when you're using multi-turn conversation
83:52 so QnA Maker provides multi-turn prompts and active learning to help you improve your question and answer pairs and gives you the opportunity to connect question and answer pairs
84:02 the connection allows the client application to provide a top answer and provide more questions to refine the search for a final answer
84:10 after the knowledge base receives questions from users at the published endpoint QnA Maker applies active learning to these real-world questions to suggest changes to your knowledge base to improve the quality all right
84:22 [Music]
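The knowledge-base ideas above, question/answer pairs with alternate question forms and metadata tags used for filtering, can be sketched locally like this. The pairs, tags, and the overlap-based matching are all illustrative stand-ins for QnA Maker's actual search-plus-reranking pipeline:

```python
# Local sketch of a QnA Maker-style knowledge base: question/answer pairs
# with metadata tags used to filter results. Pairs and tags are made up,
# and matching is naive word overlap purely for illustration.

knowledge_base = [
    {"questions": ["how do I reset my password"],
     "answer": "Use the account settings page.",
     "metadata": {"content_type": "faq"}},
    {"questions": ["how are you"],
     "answer": "Doing great, thanks for asking!",
     "metadata": {"content_type": "chitchat"}},
]

def lookup(question, metadata=None):
    """Return the answer whose questions best overlap the input,
    optionally restricted by metadata tags."""
    words = set(question.lower().split())
    best, best_overlap = None, 0
    for pair in knowledge_base:
        if metadata and any(pair["metadata"].get(k) != v
                            for k, v in metadata.items()):
            continue  # metadata filter: skip pairs that don't match the tags
        for q in pair["questions"]:
            overlap = len(words & set(q.lower().split()))
            if overlap > best_overlap:
                best, best_overlap = pair["answer"], overlap
    return best

print(lookup("reset my password"))  # Use the account settings page.
print(lookup("how are you", metadata={"content_type": "chitchat"}))
```

The chitchat entry shows how those prepopulated canned answers sit in the same knowledge base as your FAQ content, just tagged differently so clients can filter on them.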
84:24 Hey, this is Andrew Brown from ExamPro, and we are looking at Azure Bot Service. The Azure Bot Service is an intelligent, serverless bot service that scales on demand, used for creating, publishing and managing bots. You can register and publish a variety of bots from the Azure portal. Here there's a bunch of ones I've never heard of, probably from third-party providers partnered with Azure, and then there's the ones that we would know, like the Azure Health Bot, the Azure Bot, or the Web App Bot, which is more of a generic one.
84:51 The Azure Bot Service can integrate your bot with other Azure, Microsoft, or third-party services via channels, so you can have Direct Line, Alexa, Office 365, Facebook, Kik, LINE, Microsoft Teams, Skype, Twilio and more.
85:10 All right, and two things that are commonly associated with the Azure Bot Service are the Bot Framework and Bot Composer. In fact it was really hard just to make this slide, because the documentation just wasn't very descriptive on the service itself, since they wanted to push these other two things. But let's talk about the Bot Framework SDK. The Bot Framework SDK, which is now version 4, is an open-source SDK that enables developers to model and build sophisticated conversations. The Bot Framework, along with the Azure Bot Service, provides an end-to-end workflow, so we can design, build, test, publish, connect and evaluate our bots. With this framework, developers can create bots that use speech, understand natural language, handle questions and answers, and more. The Bot Framework includes a modular, extensible SDK for building bots, as well as tools, templates and related AI services.
85:57 Then you have Bot Framework Composer, which is built on top of the Bot Framework SDK. It's an open-source IDE for developers to author, test, provision and manage conversational experiences. You can download it as an app on Windows, OS X and Linux; it's probably built using web technology. Here is the actual app, and you can see there's a bit of a flow and things you can do in there. You can use C# or Node to build your bot, you can deploy the bot to Azure Web Apps or Azure Functions, you have templates to build a QnA Maker bot, enterprise or personal assistant bot, language bot, calendar or people bot, you can test and debug via the Bot Framework Emulator, and it has a built-in package manager. There's a lot more to these things, but for the AI-900 this is all we need to know. But yeah, there you go.
86:48 Hey, this is Andrew Brown from ExamPro, and we are looking at Azure Machine Learning Service. I want you to know there's a classic version of the service; it's still accessible in the portal, it's not on the exam, and we are going to 100% avoid it. It has severe limitations, and we cannot transfer anything over from the classic to the new one. So the one we're going to focus on is the Azure Machine Learning Service. You do create studios within it, so when you hear me say Azure Machine Learning Studio, I'm referring to the new one. It's a service that simplifies running AI/ML-related workloads, allowing you to build flexible automated ML pipelines, use Python, or run deep-learning workloads such as TensorFlow.
87:24 We can make Jupyter notebooks in here, so build and document your machine-learning models as you build them, share and collaborate. There's the Azure Machine Learning SDK for Python, an SDK designed specifically to interact with the Azure Machine Learning services. It does MLOps, machine-learning operations: end-to-end automation of ML model pipelines, CI/CD, training, inference. Azure Machine Learning Designer is a drag-and-drop interface to visually build, test and deploy machine-learning models, or technically pipelines, I guess. It has a data-labeling service, so you can assemble a team of humans to label your training data. And there's responsible machine learning: model fairness through disparity metrics, and mitigating unfairness. At the time of this recording the documentation here is not very good, but it's supposed to tie in with the responsible AI that Microsoft is always promoting.
88:16 Okay, so once we launch our own studio within Azure Machine Learning Service, you're going to get this big navigation bar on the left-hand side. It shows you there's a lot of stuff in here, so let's break down what all these things are.
88:24 For authoring we've got Notebooks: Jupyter notebooks, an IDE to write Python code to build ML models. They kind of have their own preview, which I don't really like, but there is a way to bridge it over to Jupyter Notebook or into Visual Studio Code. We have AutoML, a completely automated process to build and train ML models; you're limited to only three types of models, but still, that's great. We have the Designer, a visual drag-and-drop designer to construct end-to-end ML pipelines.
88:47 For assets we have Datasets, data you can upload which will be used for training; Experiments, so when you run a training job it is detailed here; Pipelines, ML workflows that you have built or have used in the Designer; Models, a model registry containing trained models that can be deployed; and Endpoints, so when you deploy a model it's hosted on an accessible endpoint, and you're going to be able to access it via a REST API or maybe the SDK.
89:13 For manage we've got Compute, the underlying computing instances used for notebooks, training and inference; Environments, reproducible Python environments for machine-learning experiments; Datastores, a data repository where your data resides; Data Labeling, so you have humans with ML-assisted labeling to label your data for supervised learning; and Linked Services, external services you can connect to the workspace, such as Azure Synapse Analytics.
89:42 Let's take a look at the types of compute that are available in our Azure Machine Learning Studio. We've got four categories: compute instances, development workstations that data scientists can use to work with data and models; compute clusters, scalable clusters of VMs for on-demand processing of experimentation code; inference clusters, deployment targets for predictive services that use your trained models; and attached compute, links to existing Azure compute resources such as Azure VMs and Azure Databricks clusters.
90:13 What's interesting here is that with this compute you can see you can open it in JupyterLab, Jupyter, VS Code, RStudio and a terminal, but you can also work with your compute instances, your development workstations, directly in the studio, which is the way I do it. What's interesting is that for inference, when you want to make a prediction, you use Azure Kubernetes Service or Azure Container Instances, and I didn't see those show up under here, so I'm kind of confused about where they appear. Maybe we'll discover as we do the follow-alongs that they do appear here, but I'm not sure about that one. But yeah, those are the four there.
90:53 Okay, so within Azure Machine Learning Studio we can do some data labeling. We create data-labeling jobs to prepare your ground truth for supervised learning. We have two options: human-in-the-loop labeling, where you have a team of humans that will apply labels, these being humans you grant access to labeling; and machine-learning-assisted data labeling, where you use ML to perform the labeling.
91:09 You can export the labeled data for machine-learning experimentation at any time. Users often export multiple times and train different models, rather than waiting for all the images to be labeled. Labeled images can be exported in COCO format, which is why we talked about COCO a lot earlier in our datasets section, or as an Azure Machine Learning dataset; this dataset format makes it easy to use for training in Azure Machine Learning, so generally you want to use that format. The idea is you would choose a labeling task type, and that way you would have this UI, and then people would go in and just click buttons and do the labeling.
91:50 Okay, so an Azure ML datastore securely connects you to storage services on Azure without putting your authentication credentials, or the integrity of your original data source, at risk. Here are the data sources that are available to us in the studio, so let's go through them quickly. We have Azure Blob Storage, data that is stored as objects distributed across many machines; Azure File Share, a mountable file share via SMB and NFS protocols; Azure Data Lake Storage Gen2, blob storage designed for vast amounts of big-data analytics; Azure SQL Database, a fully managed MSSQL relational database; Azure Database for PostgreSQL, an open-source relational database, often considered an object-relational database, preferred by developers; and Azure Database for MySQL, another open-source relational database, the most popular one, and considered a purely relational database.
92:40 Okay, so Azure ML datasets make it easy to register your datasets for use with your ML workloads. What you do is add a dataset and you get a bunch of metadata associated with it, and you can also upload the dataset again to have multiple versions, so you'll have a current version and a latest version. It's very easy to get started working with them, because they'll have some sample code for the Azure ML SDK to import the dataset into your Jupyter notebooks.
93:06 For datasets you can generate profiles that will give you summary statistics, distribution of data and more. You will have to use a compute instance to generate that data, so you press Generate Profile and you'd have it stored, I think in blob storage.
93:17 There are open datasets: publicly hosted datasets that are commonly used for learning how to build ML models. If you go to open datasets, you just choose one; this is a curated list of open datasets that you can quickly add to your datastore, great for learning how to use AutoML, Azure Machine Learning Designer, or any kind of ML workload if you're new to it. That's why we covered MNIST and COCO earlier, just because those are some common datasets there. But there you go.
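As a rough idea of what a dataset profile computes, here's a stdlib-only sketch of the kind of per-column summary statistics it reports. This is a conceptual illustration, not the Azure ML SDK; the column values are made up.

```python
import statistics

def profile_column(values):
    """Summary statistics of the kind a dataset profile reports per column."""
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values),
    }

ages = [22, 25, 31, 31, 40]  # hypothetical column data
print(profile_column(ages))
```

A real profile also includes things like missing-value counts and value distributions, which is why generating it requires spinning up a compute instance for anything beyond toy data.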
93:47 Taking a look here at Azure ML experiments: this is a logical grouping of Azure runs, and a run is the act of running an ML task on a virtual machine or container. Here's a list of them, and a run can be various types of ML tasks, so scripts could be pre-processing, AutoML, or a training pipeline. But what it's not going to include is inference: what I mean is, once you've deployed your model or pipeline and you make predictions via a request, that's just not going to show up under here. Okay.
94:19 Okay, so we have Azure ML pipelines, which is an executable workflow of a complete machine-learning task. Not to be confused with Azure Pipelines, which is part of Azure DevOps, or Data Factory, which has its own pipelines; this is a totally separate thing. Subtasks are encapsulated as a series of steps within the pipeline. Independent steps allow multiple data scientists to work on the same pipeline at the same time without over-taxing compute resources. Separate steps also make it easy to use different compute types and sizes for each step.
94:48 When you rerun a pipeline, the run jumps to the steps that need to be rerun, such as an updated training script; steps that do not need to be rerun will be skipped. After a pipeline has been published, you can configure a REST endpoint, which allows you to rerun the pipeline from any platform or stack.
95:03 There are two ways to build pipelines: you can use the Azure ML Designer, or do it programmatically using the Azure Machine Learning Python SDK. Here is an example of some code; just make a note here, it's not that important, but you create steps, and then you assemble all the steps into a pipeline. All right.
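The rerun-skipping behavior can be illustrated with a tiny stdlib sketch: each step is cached under a hash of its script, and a rerun only executes steps whose script changed. This is a conceptual stand-in, not the Azure ML SDK's actual caching mechanism.

```python
import hashlib

cache = {}  # step name -> hash of the script version already run

def run_pipeline(steps):
    """Run (name, script) steps in order, skipping unchanged steps on rerun."""
    executed = []
    for name, script in steps:
        digest = hashlib.sha256(script.encode()).hexdigest()
        if cache.get(name) == digest:
            continue  # same script as last run: step is skipped
        cache[name] = digest
        executed.append(name)
    return executed

print(run_pipeline([("prep", "v1"), ("train", "v1")]))  # ['prep', 'train']
print(run_pipeline([("prep", "v1"), ("train", "v2")]))  # ['train']
```

In the second call only the training script changed, so only the training step reruns, which is exactly the behavior described above for an updated training script.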
95:26 So Azure Machine Learning Designer lets you quickly build Azure ML pipelines without having to write any code. Here is what it looks like: you can see our pipeline is quite visual, and on the left-hand side you have a bunch of pre-built assets you can drag out. It's a really fast way of building a pipeline, though you do have to have a good understanding of ML pipelines end to end to make good use of it.
95:51 Once you've trained your pipeline you can create an inference pipeline: you drop down and say whether you want it to be real-time or batch, and you can toggle between them later. I mean, there's a lot to this service, but for the AI-900 we don't have to go diving too deep.
96:13 Okay, so Azure ML Models, or the model registry, allows you to create, manage and track your registered models as incremental versions under the same name. Each time you register a model with the same name as an existing one, the registry assumes that it's a new version. Additionally, you can provide metadata tags and use those tags when you search for models. So yeah, it's just a really easy way to share, deploy or download your models. Okay.
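The incremental-versioning behavior can be sketched in a few lines of Python (a conceptual stand-in, not the Azure ML registry API): registering under an existing name just bumps the version number.

```python
registry = {}  # model name -> list of artifact versions, oldest first

def register_model(name, artifact):
    """Register a model; re-using a name creates the next incremental version."""
    registry.setdefault(name, []).append(artifact)
    return len(registry[name])  # 1-based version number

print(register_model("churn-model", b"weights-v1"))  # 1
print(register_model("churn-model", b"weights-v2"))  # 2
```

The name "churn-model" and the artifacts are hypothetical; the point is just that the same name accumulates versions rather than overwriting, which is what lets you roll back or compare deployments.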
96:36 Okay, Azure ML endpoints allow you to deploy machine-learning models as a web service. The workflow for deploying a model is: register the model, prepare an entry script, prepare an inference configuration, deploy the model locally to ensure everything works, choose a compute target, redeploy the model to the cloud, and test the resulting web service.
96:54 We have two options here. Real-time endpoints: an endpoint that provides remote access to invoke the ML model service, running on either Azure Kubernetes Service (AKS) or Azure Container Instances (ACI). Pipeline endpoints: an endpoint that provides remote access to invoke an ML pipeline; you can parameterize the pipeline endpoint for managed repeatability in batch-scoring and retraining scenarios.
97:19 So you can deploy a model to an endpoint, and it will be deployed to either AKS or ACI, as we said earlier. The thing is, when you do that, just understand that it's going to be shown under AKS or ACI within the Azure portal; it's not consolidated under Azure Machine Learning Studio. When you've deployed a real-time endpoint, you can test the endpoint by sending either a single request or a batch request; they have a nice form here where it's a single request, or here a CSV that you can send. So there you go.
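Invoking a deployed real-time endpoint is ultimately just an authenticated REST call with a JSON body. Here's a stdlib-only sketch of building such a request; the URL, key, and input schema are all hypothetical, and the request is constructed but deliberately not sent.

```python
import json
import urllib.request

def build_scoring_request(url, api_key, rows):
    """Build (but do not send) a POST request carrying input rows as JSON."""
    body = json.dumps({"data": rows}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,  # hypothetical key
        },
        method="POST",
    )

req = build_scoring_request(
    "https://example.invalid/score",   # hypothetical endpoint URL
    "my-api-key",
    [{"age": 41, "income": 52000}],    # hypothetical input schema
)
# Actually scoring would be: response = urllib.request.urlopen(req)
print(req.get_method(), json.loads(req.data))
```

Because the endpoint is plain REST, any platform or stack that can send an HTTP POST can get predictions, which is the point the slide makes about accessing it via a REST API or the SDK.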
97:55 So Azure has a built-in Jupyter-like notebook editor, so you can build and train your ML models, and here is an example of it. I personally don't like it too much, but that's okay, because we have some other options to make it easier. What you do is choose your compute instance to run the notebook, then choose your kernel, which is a pre-loaded programming language and set of programming libraries for different use cases; that's a Jupyter kernel concept.
98:16 You can open the notebook in a more familiar IDE such as VS Code, Jupyter Notebook classic, or JupyterLab: you go there, drop it down, choose it and open it up, and now you're in more familiar territory. The VS Code one is exactly the same experience as the one in Azure ML Studio; I personally don't like it, and I think most people are going to be using the notebooks, but it's great that they have all those options.
98:44 options so Azure Automated Machine
98:47 Learning also known as AutoML
98:48 automates the process of creating an ML
98:50 model so with Azure AutoML you supply a
98:52 data set choose a task type and then
98:55 AutoML will train and tune your model so
98:57 here are the task types let's quickly go
98:59 through them so we have classification
99:00 when you need to make a prediction based
99:02 on several classes so binary
99:04 classification multiclass classification
99:06 regression when you need to predict a
99:08 continuous number value and then time
99:10 series forecasting when you need to
99:12 predict a value based on time so just
99:14 look at them a little bit more in detail
99:16 so classification is a type of
99:17 supervised learning in which the model
99:18 learns using training data and applies
99:21 those learnings to new data so here is
99:23 an example or this is just the option
99:25 here and so the goal of classification
99:27 is to predict which categories new data
99:29 will fall into based on learning from
99:31 its training data so binary
99:32 classification is where a record is labeled
99:35 out of two possible labels so maybe it's
99:37 true or false zero or one it's just two
99:40 values multiclass classification is where a
99:42 record is labeled out of
99:44 a range of
99:45 labels and so it could be like happy
99:47 sad mad or rad and you know I can
99:50 see there's a spelling mistake there but
99:51 yeah there should be an F so let's just
99:53 correct that there we go you can also
99:56 apply deep learning and so if you turn
99:58 deep learning on you probably want to
100:00 use a GPU compute instance
100:02 or compute cluster because
100:05 deep learning really prefers GPUs
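To make the classification task types concrete, here is a tiny sketch in plain Python (my illustration, not from the course, and far simpler than what AutoML trains): a one-nearest-neighbor classifier that learns from labeled training data and applies those learnings to new data, used once with binary labels and once with multiclass labels.

```python
# Minimal 1-nearest-neighbor classifier: "learns" from labeled training
# data and applies those learnings to new, unseen data points.
def predict(train, new_point):
    # train: list of (feature_value, label) pairs
    nearest = min(train, key=lambda pair: abs(pair[0] - new_point))
    return nearest[1]

# Binary classification: every record gets one of two possible labels.
binary_train = [(1.0, "false"), (2.0, "false"), (8.0, "true"), (9.0, "true")]
print(predict(binary_train, 8.5))   # -> true

# Multiclass classification: one label out of a range of labels.
multi_train = [(1.0, "happy"), (4.0, "sad"), (7.0, "mad"), (10.0, "rad")]
print(predict(multi_train, 6.5))    # -> mad
```

The mechanics are the same either way; only the label set changes, which is why AutoML treats binary and multiclass as variants of one classification task type.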
100:08 okay looking at regression it's also a
100:11 type of supervised learning where the
100:12 model learns using training data and
100:15 applies those learnings to new data but
100:16 it's a bit different where the goal of
100:18 regression is to predict a variable in
100:19 the future then you have time series
100:22 forecasting and this sounds a lot like
100:24 regression because it is so you
100:27 forecast revenue inventory sales or
100:30 customer demand an automated time series
100:32 experiment is treated as a
100:34 multivariate regression problem past
100:36 time series values are pivoted to become
100:38 additional dimensions for the regressor
100:40 together with other predictors and
100:43 unlike classical time series methods it has
100:45 the advantage of naturally incorporating
100:48 multiple contextual variables and their
100:50 relationship to one another during
100:52 training so use cases here or advanced
100:54 configurations I should say holiday
100:56 detection and featurization time series
100:58 deep learning neural networks so
101:01 you've got Auto-ARIMA Prophet ForecastTCN
101:05 many models support through grouping
101:07 rolling origin cross validation
101:09 configurable lags rolling window
101:11 aggregate features so there you go
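That "past values are pivoted to become additional dimensions for the regressor" idea can be sketched in plain Python (my own illustration, not Azure's code): each row gets the previous few observations as lag features, which turns forecasting into an ordinary regression problem.

```python
# Pivot a time series into lagged feature rows so a plain regressor
# can be trained on it: features = the previous `n_lags` values,
# target = the current value.
def make_lag_features(series, n_lags):
    rows = []
    for i in range(n_lags, len(series)):
        features = series[i - n_lags:i]   # past values become dimensions
        target = series[i]
        rows.append((features, target))
    return rows

sales = [10, 12, 13, 15, 18, 21]
rows = make_lag_features(sales, n_lags=2)
# First training row: features [10, 12] predict target 13
print(rows[0])   # -> ([10, 12], 13)
```

Contextual variables such as holiday flags would simply be appended as extra columns alongside the lags, which is what lets this approach use them naturally.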
101:18 so within AutoML we have data guardrails
101:21 and these are run by AutoML when
101:24 automatic featurization is enabled it's
101:26 a sequence of checks to ensure
101:27 high-quality input data is being used to
101:30 train the model so just to show you some
101:32 information here so the idea is it could
101:35 apply validation split handling so the
101:36 input data has been split for validation
101:38 to improve the performance then you have
101:41 missing feature value imputation so
101:44 no missing feature values were detected
101:46 in training data high cardinality
101:48 feature detection your inputs were
101:50 analyzed and no high cardinality
101:51 features were detected high cardinality
101:53 means like if you have too many
101:55 dimensions it becomes very dense or hard
101:56 to process the data so that's
101:59 something good to check against
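Missing feature value imputation can be sketched like this (a plain-Python illustration of the idea, not the actual guardrail code): gaps in a feature column get filled with the column's mean so the model never sees missing values.

```python
# Mean imputation: replace missing values (None) in a feature column
# with the mean of the observed values, a common default strategy.
def impute_mean(column):
    observed = [v for v in column if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in column]

ages = [25, None, 35, None, 40]
# Both gaps are filled with the mean of 25, 35, and 40.
print(impute_mean(ages))
```

Real imputers also support median, mode, or constant fill strategies; mean is just the simplest to show.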
102:04 let's talk about AutoML's
102:07 automatic featurization so during model
102:09 training with AutoML one of the
102:11 following scaling or normalization
102:13 techniques will be applied to each model
102:14 the first is StandardScaleWrapper which
102:17 standardizes features by removing the
102:18 mean and scaling to unit variance MinMaxScaler
102:21 transforms features by scaling
102:23 each feature by that column's minimum and
102:24 maximum MaxAbsScaler scales each
102:27 feature by its maximum absolute value
102:29 RobustScaler scales features by the
102:31 quantile range PCA linear
102:34 dimensionality reduction using singular
102:37 value decomposition of the data to
102:39 project it to a lower dimensional space
102:43 dimensionality reduction is very useful
102:45 if your data is too complex if let's say
102:46 you have data and you have too many
102:49 labels like 20 30 40 labels
102:52 like categories to pick out of you
102:54 want to reduce the dimensions so that
102:56 your machine learning model is not
102:58 overwhelmed so then you have TruncatedSVDWrapper
103:00 so this transformer performs
103:02 linear dimensionality reduction by means
103:04 of truncated singular value
103:07 decomposition contrary to PCA this
103:09 estimator does not center the data
103:11 before computing the singular value
103:12 decomposition which means it can work
103:14 with scipy sparse matrices efficiently
103:18 SparseNormalizer each sample that is
103:20 each row of the data matrix
103:22 with at least one non-zero component is
103:25 rescaled independently of other samples
103:26 so that its norm L1 or L2
103:32 equals one okay so the thing
103:39 is that on the exam they're probably
103:40 not going to be asking these questions
103:42 but I just like to get you exposure
103:43 I just want to show you that AutoML is
103:45 doing all this this is like
103:46 pre-processing stuff you know like this
103:48 is stuff that you'd have to do and so
103:51 it's just taking care of that stuff for
103:52 you
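Two of those techniques are easy to sketch in plain Python (illustrative only; AutoML uses the scikit-learn implementations): min-max scaling maps a feature column into [0, 1], and L2 normalization rescales one row so its norm equals one.

```python
# Min-max scaling: transform a feature column by its min and max
# so every value lands in the range [0, 1].
def min_max_scale(column):
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

# L2 normalization: rescale one sample (row) so its Euclidean norm is 1.
def l2_normalize(row):
    norm = sum(v * v for v in row) ** 0.5
    return [v / norm for v in row]

print(min_max_scale([10, 20, 30]))   # -> [0.0, 0.5, 1.0]
print(l2_normalize([3, 4]))          # -> [0.6, 0.8]
```

Note the difference in direction: scalers work per column (feature), while the normalizer works per row (sample), which is exactly the distinction the SparseNormalizer description is making.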
103:56 okay so within Azure AutoML they have a
103:59 feature called model selection and this
104:01 is the task of selecting a statistical
104:03 model from a set of candidate models
104:05 Azure AutoML will use
104:08 many different ML algorithms and will
104:10 recommend the best performing candidate
104:11 so here is a list and I want to just
104:13 point out down below there's three pages
104:16 there's 53 models it's a lot of models
104:19 and so you can see that what it chose as
104:21 its top candidate was called voting
104:23 ensemble that's an ensemble
104:26 algorithm that's where you take two weak
104:28 ML models and combine them together to make
104:30 a stronger one and notice
104:33 here it will show us the results and
104:34 this is what we're looking for which is
104:36 the primary metric the highest value
104:38 should indicate that that's the model we
104:40 should want to use you can get an
104:42 explanation of the model
104:43 that's known as
104:46 explainability and now if you're a data
104:47 scientist you might be a bit smarter and
104:49 say well I know this one should be
104:51 better so I'll use this and tweak it but
104:53 you know if you don't know what you're
104:54 doing you just go with the top one
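A voting ensemble can be sketched in a few lines (my illustration of the concept, not AutoML's actual ensemble): several weak models each cast a vote, and the majority label wins, which often beats any single voter.

```python
from collections import Counter

# Hard-voting ensemble: combine several weak classifiers by majority vote.
def voting_ensemble(models, x):
    votes = [model(x) for model in models]
    return Counter(votes).most_common(1)[0][0]

# Three deliberately weak "models" (simple threshold rules).
weak_models = [
    lambda x: "spam" if x > 3 else "ham",
    lambda x: "spam" if x > 5 else "ham",
    lambda x: "spam" if x > 7 else "ham",
]

print(voting_ensemble(weak_models, 6))   # 2 of 3 vote spam -> spam
print(voting_ensemble(weak_models, 4))   # 2 of 3 vote ham  -> ham
```

AutoML's voting ensemble weights its members' predicted probabilities rather than counting raw votes, but the combine-weak-learners idea is the same.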
105:00 okay so we just saw that we had a top
105:03 candidate model and there could be an
105:04 explanation to understand the
105:06 effectiveness of it this is called MLX
105:08 machine learning explainability this
105:11 is the process of explaining and
105:12 interpreting ML or deep learning models
105:14 MLX can help machine learning
105:18 developers to better understand and
105:19 interpret a model's behavior so after your
105:21 top candidate model is selected by Azure
105:23 AutoML you can get an explanation of the
105:26 internals of various factors so model
105:27 performance data set explorer
105:29 aggregate feature importance individual
105:31 feature importance so I mean yeah this
105:34 is aggregate so what it's looking at and
105:36 it's actually cut off here but it's
105:37 saying that these are the most important
105:40 features that affect the model's outcome
105:43 so I think this is the diabetes
105:44 data set so BMI would be one that
105:47 would be a huge influencer there okay
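Feature importance can be estimated with a simple permutation trick, sketched here in plain Python (an illustration of the idea, not the explainability implementation Azure uses): scramble one feature's values and see how much the model's error grows; the bigger the jump, the more that feature mattered.

```python
# Permutation importance sketch: break one feature's values and measure
# how much the model's error grows. Toy model: feature 0 dominates.
def model(row):
    return 10 * row[0] + 0.1 * row[1]

data = [[1, 5], [2, 3], [3, 8], [4, 1]]
targets = [model(r) for r in data]  # model is exact on unpermuted data

def mean_abs_error(rows):
    return sum(abs(model(r) - t) for r, t in zip(rows, targets)) / len(rows)

def permutation_importance(feature_idx):
    permuted = [list(r) for r in data]
    column = [r[feature_idx] for r in permuted]
    column = column[1:] + column[:1]  # deterministic shuffle: rotate the column
    for row, value in zip(permuted, column):
        row[feature_idx] = value
    return mean_abs_error(permuted)  # error after breaking that feature

print(permutation_importance(0))  # large: feature 0 drives the outcome
print(permutation_importance(1))  # small: feature 1 barely matters
```

On a diabetes-style data set this is why a feature like BMI surfaces at the top of the aggregate importance chart: permuting it hurts predictions far more than permuting the others.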
105:54 so the primary metric is a parameter
105:56 that determines the metric to be used
105:58 during model training for
105:59 optimization so for classification we
106:01 have a few and for regression and time
106:03 series we have a few but you'll have
106:04 these task types and underneath you'll
106:06 choose the additional configuration and
106:08 that's where you can override the
106:09 primary metric it might just be auto
106:11 detected for you so you don't have to
106:12 because it might sample some of your
106:14 data set to just kind of guess but you
106:16 might have to override it yourself
106:18 just going through some scenarios
106:20 we'll break it down into two categories
106:21 so here we have suited for larger data
106:23 sets that are well balanced well
106:25 balanced means that your data set
106:26 is evenly distributed so if you have
106:29 classifications for A and B let's
106:31 say you have 100 and 100 they're well
106:33 balanced right you don't have one
106:35 labeled subset of your data set much
106:37 larger than the other so
106:39 for accuracy this is great for image
106:41 classification sentiment analysis churn
106:43 prediction for average precision score
106:45 weighted it's for sentiment analysis
106:47 norm macro recall churn prediction for
106:49 precision score weighted I'm uncertain as
106:51 to what that would be good for maybe
106:52 sentiment analysis suited for smaller
106:55 data sets that are imbalanced so that's
106:56 where in your data set you might have
106:58 like 10 records for one label and 500 for the
107:00 other so you have AUC
107:03 weighted fraud detection image
107:04 classification anomaly detection spam
107:07 detection onto regression scenarios
107:10 we'll break it down into ranges so when
107:12 you have a very wide range Spearman
107:14 correlation works really well R2 score
107:16 this is great for airline delay
107:18 salary estimation bug resolution
107:20 time when we're looking at smaller ranges
107:23 we're talking about normalized
107:24 root mean squared error so price
107:26 predictions review tip score
107:28 predictions for normalized mean absolute
107:31 error it's going to be just another
107:33 one here they don't give a description
107:34 for time series it's the same thing it's
107:36 just in the context of time series so
107:39 forecasting all
107:45 right
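Here is how a few of these metrics are actually computed, in plain Python (standard textbook definitions, written out as my own illustration): accuracy, precision, and recall for classification, and R² for regression.

```python
# Standard metric definitions, computed by hand on toy predictions.
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision(y_true, y_pred):  # of all positive predictions, how many were right
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fp)

def recall(y_true, y_pred):  # of all actual positives, how many were found
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn)

def r2_score(y_true, y_pred):  # 1 - residual variance / total variance
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

y_true = [1, 1, 0, 0]
y_pred = [1, 0, 1, 0]
print(accuracy(y_true, y_pred))   # -> 0.5
print(precision(y_true, y_pred))  # -> 0.5
print(recall(y_true, y_pred))     # -> 0.5
```

Accuracy is the one that misleads on imbalanced data (predicting the majority class every time can still score high), which is why the AUC-weighted and macro-recall variants are recommended for imbalanced data sets.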
107:45 another option we can change is
107:47 the validation type when we're setting
107:49 up our ML model so model
107:51 validation is when we compare the
107:52 results of our training data set to our
107:54 test data set model validation occurs
107:56 after we train the model and so you can
107:58 just drop it down there we have some
107:59 options so auto k-fold cross validation
108:02 Monte Carlo cross validation train
108:04 validation split I'm not going to really
108:06 get into the details of that I don't
108:08 think it'll show up on the AI 900 exam
108:10 but I just want you to be aware that you
108:11 do have those options
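K-fold cross validation, one of those options, can be sketched in plain Python (my illustration): the data is split into k folds, and each fold takes one turn as the validation set while the remaining records form the training set.

```python
# Split indices 0..n-1 into k folds; each fold is the validation set
# once while the remaining indices form the training set.
def k_fold_splits(n, k):
    indices = list(range(n))
    fold_size = n // k
    splits = []
    for f in range(k):
        val = indices[f * fold_size:(f + 1) * fold_size]
        train = [i for i in indices if i not in val]
        splits.append((train, val))
    return splits

for train, val in k_fold_splits(6, 3):
    print("train:", train, "validate:", val)
# Every record gets validated exactly once across the 3 folds.
```

Monte Carlo cross validation differs only in that each split samples the validation set at random, so a record can appear in several validation sets or none.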
108:16 okay hey this is Andrew Brown from Exam
108:19 Pro and we are taking a look here at
108:20 Custom Vision and this is a fully
108:22 managed no-code service to quickly build
108:25 your own classification and object
108:27 detection ML models the service is
108:29 hosted on its own isolated domain at
108:31 www.customvision.ai so the first idea is you
108:35 upload your images so bring your own
108:36 labeled images or use Custom Vision to
108:38 quickly add tags to any unlabeled
108:41 images you use the labeled images to
108:44 teach Custom Vision the concepts you
108:46 care about which is training and you use
108:48 a simple REST API call
108:50 to quickly tag images with your new
108:53 custom computer vision model so you can
108:55 evaluate
109:00 okay so when we launch Custom Vision we
109:03 have to create a project and with that
109:04 we need to choose a project type and we
109:06 have classification and object detection
109:09 reviewing classification here you have
109:12 the option between multi-label so when
109:13 you want to apply many tags to an image
109:16 so think of an image that contains both
109:18 a cat and a dog and multiclass so
109:20 when you only have one possible tag to
109:22 apply to an image so it's either an
109:24 apple a banana or an orange it's not
109:26 multiples of these things you have
109:28 object detection this is when we want to
109:30 detect various objects in an image
109:32 and you also need to choose a domain a
109:34 domain is a Microsoft managed data set
109:36 that is used for training the ML model
109:38 there are different domains that are
109:39 suited for different use cases so let's
109:41 go take a look first at image
109:42 classification domains so here is the
109:45 big list the domains being over
109:47 here okay and we'll go through these
109:50 here so General is optimized for a broad
109:52 range of image classification tasks if
109:54 none of the other
109:56 specified domains are appropriate or
109:57 you're unsure of which domain to choose
109:59 select one of the general domains
110:02 A1 is optimized for better
110:04 accuracy with comparable inference time
110:06 to the General domain recommended for larger
110:08 data sets or more difficult user
110:10 scenarios this domain requires more
110:12 training time then you have A2 optimized
110:15 for better accuracy with faster inference
110:17 times than the A1 and General domains
110:20 recommended for most data sets this
110:23 domain requires less training time than
110:25 General and A1 you have Food optimized
110:27 for photographs of dishes as you
110:29 would see them on a restaurant menu if
110:31 you want to classify photographs of
110:33 individual fruits or vegetables use the Food
110:36 domain then we have Landmarks optimized for
110:38 recognizable landmarks both natural and
110:40 artificial this domain works best when the
110:43 landmark is clearly visible in the
110:44 photograph it works even if the
110:46 landmark is slightly obstructed by
110:49 people in front of
110:51 it then you have Retail optimized for
110:54 images that are found in a shopping cart
110:55 or shopping website if you want
110:58 high precision classifying
111:00 between dresses pants and shirts
111:02 use this domain Compact domains are
111:04 optimized for the constraints of
111:05 real-time classification on the edge
111:09 okay then we have the object detection
111:12 domains so this one's a lot shorter so
111:14 we'll get through it a lot quicker so
111:15 General is optimized for a broad range of object
111:17 detection tasks if none of the other
111:19 domains are appropriate or you're unsure
111:21 which domain to choose pick the general one
111:23 A1 is optimized for better accuracy with
111:25 comparable inference time to the
111:26 General domain recommended for more
111:28 accurate region locations larger data
111:30 sets or more difficult use case
111:32 scenarios this domain requires more
111:33 training time and results are not
111:34 deterministic expect a plus or minus 1%
111:38 mean average precision difference
111:40 with the same training data provided you
111:42 have Logo optimized for finding brand
111:45 logos in images and Products on
111:47 Shelves optimized for detecting and
111:49 classifying
111:50 products on shelves so there you go
111:56 [Music] go okay so let's get some uh more
111:58 go okay so let's get some uh more practical knowledge of the service so
112:00 practical knowledge of the service so for image classification you're going to
112:01 for image classification you're going to upload multiple images and apply a
112:03 upload multiple images and apply a single or multiple labels to the entire
112:05 single or multiple labels to the entire image so here I have a bunch of images
112:07 image so here I have a bunch of images uploaded and then I have my tags over
112:09 uploaded and then I have my tags over here and they could either be multi or
112:11 here and they could either be multi or singular for object detection you apply
112:13 singular for object detection you apply tags to objects in an image for data
112:15 tags to objects in an image for data labeling and you hover uh your cursor
112:17 labeling and you hover uh your cursor over the image custom Vision uses ml to
112:19 over the image custom Vision uses ml to show boundaries uh bounding boxes of
112:21 show boundaries uh bounding boxes of possible objects have not yet been
112:22 possible objects have not yet been labeled if it does not detect it you can
112:25 labeled if it does not detect it you can also just click and drag to draw out
112:27 also just click and drag to draw out whatever Square you want so here's one
112:29 whatever Square you want so here's one where I tagged it up quite a bit you
112:31 where I tagged it up quite a bit you have to have at least 50 images on every
112:32 have to have at least 50 images on every tag to train uh so just be aware of that
112:35 tag to train uh so just be aware of that when you are tagging your images uh when
112:38 when you are tagging your images uh when you're training your model is ready when
112:40 you're training your model is ready when you and you have two options so you have
112:41 you and you have two options so you have quick training this trains quickly but
112:43 quick training this trains quickly but it will be less accurate you have
112:45 it will be less accurate you have Advanced Training this increases compute
112:46 Advanced Training this increases compute time to improve your results so for
112:49 time to improve your results so for Advanced Training BAS basically you just
112:50 Advanced Training BAS basically you just have this thing that you move to the
112:51 have this thing that you move to the right uh with each iteration of training
112:54 right uh with each iteration of training our ml model will improve the evaluation
112:56 our ml model will improve the evaluation metric so precision and recall it's
112:58 metric so precision and recall it's going to vary we're going to talk about
112:59 going to vary we're going to talk about the metrics here in a moment but the
113:00 the metrics here in a moment but the probability threshold value determines
113:02 probability threshold value determines when to stop training when our
113:04 when to stop training when our evaluation metric meets our desired
113:06 evaluation metric meets our desired threshold so these are just additional
113:07 threshold so these are just additional options where when you're training you
113:09 options where when you're training you can move this left to right uh and these
113:12 can move this left to right uh and these left to right
113:14 okay and then when we get our results
113:16 back uh we're going to get some
113:18 metrics here so we have evaluation
113:20 metrics we have precision being exact
113:22 and accurate it selects items that are
113:24 relevant recall also known as sensitivity
113:26 or the true positive rate is how many
113:28 relevant items are returned and average
113:30 precision it's important that you
113:31 remember these because they might ask
113:34 you about them on the exam so when
113:37 we're looking at object detection and
113:38 we're looking at the evaluation metric
113:40 outcomes for this one we have precision
113:42 recall and mean average precision
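The two metrics above can be sketched in a few lines of code. This is a minimal illustration of the definitions only, not the Custom Vision SDK; the item names are made up:

```python
# Precision: of the items the model returned, how many were relevant?
# Recall: of the relevant items, how many did the model return?
def precision_recall(predicted, relevant):
    """predicted and relevant are sets of item ids (e.g. detected tags)."""
    true_positives = len(predicted & relevant)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

predicted = {"dog1", "dog2", "cat1"}   # items the model returned
relevant = {"dog1", "dog2", "dog3"}    # items it should have returned
p, r = precision_recall(predicted, relevant)
print(p, r)  # both 2/3: two of three returned items were relevant,
             # and two of three relevant items were found
```

Raising the probability threshold typically shrinks the `predicted` set, which tends to trade recall for precision.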
113:46 uh once we've deployed our pipeline it
113:47 makes sense that we go ahead and give it
113:49 a quick test to make sure it's working
113:50 correctly so you press the quick
113:52 test button and you can upload your
113:54 image and it will tell you so this one
113:56 says it's Worf uh when you're ready to
113:58 publish you just hit the publish button
114:01 and then you'll get a prediction
114:03 URL and information so you can invoke it
114:06 uh one other feature that's kind of
114:08 useful is the smart labeler so once
114:09 you've loaded some training data into it
114:12 it can now make suggestions right so you
114:14 can't do this right away but once it has
114:16 some data it's kind of a
114:19 prediction that is not 100% guaranteed
114:21 right and it just helps you build up
114:22 your training data set a lot faster
114:25 very useful if you have a very large
114:27 data set this is known as ML-assisted
114:29 labeling
114:31 [Music]
114:34 okay hey this is Andrew Brown from Exam
114:36 Pro and in this section we'll be
114:38 covering the newly added section of the
114:40 AI 900 that focuses on generative AI
114:42 generative AI including technologies
114:44 like ChatGPT is becoming more
114:46 recognized outside of tech circles while
114:49 it may seem magical in its ability to
114:51 produce human-like content it's actually
114:53 based on advanced mathematical
114:54 techniques from statistics data science
114:56 and machine learning understanding these
114:58 core concepts can help society envision
115:00 new AI possibilities for the future
115:03 first let's compare the differences
115:04 between regular AI and generative AI
115:07 AI refers to the development of computer
115:09 systems that can perform tasks typically
115:11 requiring human intelligence these
115:13 include problem solving decision-making
115:16 understanding natural language
115:17 recognizing speech and images and more
115:20 the primary goal of traditional AI is to
115:22 create systems that can interpret
115:23 analyze and respond to human actions or
115:25 environmental changes efficiently and
115:27 accurately it aims to replicate or
115:29 simulate human intelligence in machines
115:32 AI applications are vast and include
115:34 areas like expert systems natural
115:36 language processing speech recognition
115:38 and robotics AI is used in various
115:41 industries for tasks such as customer
115:42 service chatbots recommendation systems
115:45 in e-commerce autonomous vehicles and
115:47 medical diagnosis
115:50 on the other hand generative AI is a
115:52 subset of AI that focuses on creating
115:54 new content or data that is novel and
115:56 realistic it does not just interpret or
115:58 analyze data but generates new data
116:00 itself it includes generating text
116:03 images music speech and other forms of
116:05 media it often involves advanced machine
116:07 learning techniques particularly deep
116:09 learning models like generative
116:10 adversarial networks variational
116:13 autoencoders and transformer models like
116:16 GPT generative AI is used in a range of
116:18 applications including creating
116:20 realistic images and videos generating
116:22 human-like text composing music creating
116:24 virtual environments and even drug
116:27 discovery some examples include tools
116:29 like GPT for text generation DALL-E for
116:31 image creation and various deep learning
116:33 models that compose music so let's
116:36 quickly summarize the differences between
116:38 regular AI and generative AI across three
116:40 features functionality data handling and
116:42 applications regular AI focuses on
116:45 understanding and decision-making
116:46 whereas generative AI is about creating
116:48 new original outputs in terms of data
116:51 handling regular AI analyzes and bases
116:53 decisions on existing data while
116:55 generative AI uses the same data to
116:57 generate new previously unseen outputs
117:00 and for applications regular AI's scope
117:02 includes data analysis automation
117:04 natural language processing and
117:06 healthcare in contrast generative AI
117:08 leans towards more creative and
117:09 innovative applications such as content
117:11 creation synthetic data generation
117:14 deepfakes and
117:20 design the next topic we'll be covering
117:21 is what is a large language model a
117:23 large language model such as GPT works
117:26 in a way that's similar to a complex
117:27 automatic system that recognizes
117:29 patterns and makes predictions training
117:32 on large data sets initially the model
117:34 is trained on massive amounts of text
117:36 data this data can include books
117:38 articles websites and other written
117:39 material during this training phase the
117:42 model learns patterns in language such
117:44 as grammar word usage sentence structure
117:46 and even style and tone understanding
117:48 context the model's design allows it to
117:51 consider a wide context this means it
117:53 doesn't just focus on single words but
117:55 understands them in relation to the
117:56 words and sentences that come before and
117:58 after this contextual understanding is
118:00 important for generating coherent and
118:02 relevant text predicting the next word
118:05 when you give the model a prompt which
118:07 is a starting piece of text it uses what
118:09 it has learned to predict the next most
118:10 likely word it then adds this word to
118:13 the prompt and repeats the process
118:14 continually predicting the next word
118:16 based on the extended sequence
118:18 generating text this process of
118:20 predicting the next word continues
118:22 creating a chain of words that forms a
118:24 coherent piece of text the length of
118:26 this generated text can vary based on
118:27 specific instructions or limitations set
118:29 for the model refinement with feedback
118:32 the model can be further refined and
118:34 improved over time with feedback this
118:36 means it gets better at understanding
118:37 and generating text as it is exposed to
118:39 more data and usage in summary a large
118:42 language model works by learning from a
118:44 vast quantity of text data understanding
118:46 the context of language and using this
118:48 understanding to predict and generate
118:50 new text that is coherent and
118:51 contextually appropriate which can be
118:53 further refined with feedback as shown
118:55 in the workflow image
118:56 [Music]
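The predict-and-append loop described above can be sketched as a toy. The "model" here is a hypothetical stand-in (a lookup of bigram counts), nothing like a real LLM, but the loop structure is the same: predict the most likely next word, append it, repeat:

```python
# Made-up bigram counts standing in for a trained model's predictions.
bigram_counts = {
    "I": {"heard": 3, "saw": 1},
    "heard": {"a": 4},
    "a": {"dog": 2, "cat": 2},
    "dog": {"bark": 5},
}

def predict_next(word):
    """Return the most likely next word, or None if we have no data."""
    candidates = bigram_counts.get(word, {})
    return max(candidates, key=candidates.get) if candidates else None

def generate(prompt, max_new_tokens=3):
    """Repeatedly predict the next word and append it to the sequence."""
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        nxt = predict_next(tokens[-1])
        if nxt is None:
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("I"))  # "I heard a dog"
```

A real model conditions on the whole extended sequence rather than just the last word, but the generate-extend-repeat workflow is exactly this.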
118:59 next let's talk about transformer
119:02 models so a transformer model is a type
119:05 of machine learning model that's
119:06 especially good at understanding and
119:08 generating language it's built using a
119:10 structure called the transformer
119:12 architecture which is really effective
119:13 for tasks involving natural language
119:15 processing like translating languages or
119:17 writing text the transformer model
119:20 architecture consists of two components
119:22 or blocks first we have the encoder this
119:24 part reads and understands the input
119:26 text it's like a smart system that goes
119:28 through everything it's been taught
119:29 which is a lot of text and picks up on
119:31 the meanings of words and how they're
119:33 used in different contexts then we have
119:35 the decoder so based on what the encoder
119:37 has learned this part generates new
119:39 pieces of text it's like a skilled
119:41 writer that can make up sentences that
119:43 flow well and make sense there are
119:45 different types of transformer models
119:47 with specific jobs for example BERT is
119:49 good at understanding language it's
119:51 like a librarian who knows where every
119:53 book is and what's inside them Google
119:55 uses it to help its search engine
119:56 understand what you're looking for GPT
119:59 is good at creating text it's like a
120:00 skilled author who can write stories
120:02 articles or conversations based on what
120:04 it has learned so that's an overview of
120:06 a transformer model next we'll be
120:09 talking about the main components of a
120:10 transformer model
120:17 the next component of a transformer
120:19 model we'll be covering is the
120:20 tokenization process tokenization in a
120:23 transformer model is like turning a
120:24 sentence into a puzzle for example you
120:26 have the sentence I heard a dog bark
120:29 loudly at a cat to help a computer
120:31 understand it we chop up the sentence
120:33 into pieces called tokens each piece can
120:35 be a word or even a part of a word so
120:37 for our sentence we give each word a
120:40 number like this I might be 1 heard
120:42 might be 2 a might be 3 dog might be 4
120:45 bark might be 5 loudly might be 6 at
120:48 might be 7 a is already token 3 and cat
120:50 might be 8 now our sentence becomes a
120:53 series of numbers this is like giving
120:55 each word a special code the computer
120:57 uses these codes to learn about the
120:58 words and how they fit together if a
121:01 word repeats like a we use its code
121:02 again instead of making a new one as the
121:05 computer reads more text it keeps
121:07 turning new words into new tokens with
121:09 new numbers if it learns the word meow
121:11 it might call it 9 and skateboard could
121:13 be 10 by doing this with lots and lots
121:15 of text the computer builds a big list
121:17 of these tokens which it then uses to
121:19 understand and generate language it's a
121:21 bit like creating a dictionary where
121:22 every word has a unique number
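The word-numbering scheme above can be sketched as a tiny word-level tokenizer (real transformer tokenizers work on sub-word pieces, but the idea is the same): each new word gets the next id, and repeated words reuse their existing id.

```python
# Minimal word-level tokenizer sketch for the example sentence.
def tokenize(text, vocab):
    """Map each word to a numeric token id, growing vocab as needed."""
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1  # next unused token id
        ids.append(vocab[word])
    return ids

vocab = {}
print(tokenize("I heard a dog bark loudly at a cat", vocab))
# [1, 2, 3, 4, 5, 6, 7, 3, 8] -- the second "a" reuses id 3
print(tokenize("meow", vocab))  # [9] -- new words keep extending the vocab
```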
121:29 the next component of a transformer
121:31 model we'll be covering is embeddings
121:34 so to help a computer understand
121:35 language we turn words into tokens and
121:37 then give each token a special numeric
121:39 code called an embedding these
121:41 embeddings are like a secret code that
121:43 captures the meaning of the word as a
121:45 simple example suppose the embeddings
121:47 for our tokens consist of vectors with
121:50 three elements for example token 4 for
121:53 dog has the embedding vector [10, 3, 2]
121:56 5 for bark has the vector [10, 2, 2]
122:00 8 for cat has the vector [10, 3, 1]
122:03 9 for meow has the vector [10, 2, 1] and
122:06 10 for skateboard has the vector [3, 3, 1]
122:08 which is quite different from the rest
122:10 words that have similar meanings or are
122:12 used in similar ways get codes that look
122:13 alike so dog and bark might have similar
122:15 codes because they are related but
122:17 skateboard might be off in a different
122:18 area because it's not much related to
122:21 these other words this way the computer
122:22 can figure out which words are similar
122:24 to each other just by looking at their
122:26 codes it's like giving each word a home
122:27 on a map and words that are neighbors on
122:30 this map have related meanings the image
122:32 shows a simple example model in which
122:34 each embedding has only three dimensions
122:36 real language models have many more
122:39 dimensions tools such as Word2vec or the
122:40 encoding part of a transformer model
122:42 help AI figure out where each word's
122:50 dot should go on this big map
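The "nearby words mean similar things" idea is usually measured with cosine similarity between the vectors. A quick check using the toy three-dimensional embeddings from the example (real models use hundreds of dimensions):

```python
import math

# Toy embeddings from the example above.
embeddings = {
    "dog": (10, 3, 2),
    "bark": (10, 2, 2),
    "cat": (10, 3, 1),
    "meow": (10, 2, 1),
    "skateboard": (3, 3, 1),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, lower means less alike."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embeddings["dog"], embeddings["bark"]))        # close to 1
print(cosine(embeddings["dog"], embeddings["skateboard"]))  # noticeably lower
```

So just by comparing codes, the computer can tell that dog and bark live near each other on the map while skateboard sits off on its own.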
122:50 let's go over positional encoding in a
122:52 transformer model positional encoding is
122:54 a technique used to ensure that a
122:56 language model such as GPT doesn't lose
122:57 the order of words when processing
123:00 natural language this is important
123:01 because the order in which words appear
123:03 can change the meaning of a sentence
123:05 let's take the sentence I heard a dog
123:07 bark loudly at a cat from our previous
123:09 example without positional encoding if
123:11 we simply tokenize this sentence and
123:13 convert the tokens into embedding
123:15 vectors we might end up with a set of
123:16 vectors that lose the sequence
123:19 information positional encoding adds a
123:21 positional vector to each word in order
123:23 to keep track of the positions of the
123:25 words by adding positional encoding
123:27 vectors to each word's embedding we
123:29 ensure that each position in the
123:31 sentence is uniquely identified the
123:33 embedding for I would be modified by
123:35 adding a positional vector corresponding
123:37 to position 1 labeled I-1 the embedding
123:39 for heard would be altered by a vector
123:42 for position 2 labeled heard-2 the
123:44 embedding for a would be updated with a
123:46 vector for position 3 labeled a-3 its
123:48 token embedding is reused for its second
123:51 occurrence but with the positional vector
123:53 for position 8 this process continues
123:55 for each word token in the sentence with
123:58 dog-4 bark-5 loudly-6 at-7 and cat-9 all
124:00 receiving their unique positional
124:02 encodings as a result the sentence I
124:04 heard a dog bark loudly at a cat is
124:05 represented not just by a sequence of
124:08 vectors for its words but by a sequence
124:09 of vectors that are influenced by the
124:12 position of each word in the sentence
124:13 this means that even if another sentence
124:15 had the same words in a different order
124:17 its overall representation would be
124:18 different because the positional
124:19 encodings differ reflecting the
124:22 different sequence of words so that's an
124:22 overview of positional encoding
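The mechanics above reduce to adding a position-dependent vector to each token's embedding, so the same word at two positions ends up with two different representations. A toy sketch (the positional vectors here are made up for illustration; real models use sinusoidal or learned encodings):

```python
# Add a toy position-dependent vector to each token embedding.
def with_positions(token_embeddings):
    out = []
    for pos, emb in enumerate(token_embeddings, start=1):
        pos_vec = [pos * 0.1] * len(emb)  # made-up positional vector
        out.append([e + p for e, p in zip(emb, pos_vec)])
    return out

# Two occurrences of the same token ("a") share one embedding...
tokens = [[10, 3, 2], [1, 1, 1], [5, 5, 5], [1, 1, 1]]
encoded = with_positions(tokens)
# ...but after positional encoding, their representations differ.
print(encoded[1] != encoded[3])  # True
```

This is why reordering the words of a sentence changes its overall representation even though the token embeddings themselves are unchanged.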
124:30 the next component of a transformer
124:31 we'll be covering is attention attention
124:34 in AI especially in transformer models
124:36 is a way the model figures out how
124:38 important each word or token is to the
124:40 meaning of a sentence particularly in
124:41 relation to the other words around it
124:43 let's reuse the sentence I heard a dog
124:45 bark loudly at a cat to explain this
124:48 better self-attention imagine each word
124:49 in the sentence shining a flashlight on
124:51 the other words the brightness of the
124:53 light shows how much one word should pay
124:54 attention to the others when
124:55 understanding the sentence for bark the
124:58 light might shine brightest on dog
124:59 because they're closely related the
125:02 encoder's role in the encoder part of a
125:04 transformer model attention helps decide
125:06 how to represent each word as a number
125:08 or vector it's not just the word itself
125:10 but also its context that matters for
125:12 example bark in the bark of a tree would
125:13 have a different representation than
125:15 bark in I heard a dog bark because the
125:17 surrounding words are different the
125:19 decoder's role when generating new text
125:21 like completing a sentence the decoder
125:23 uses attention to figure out which words
125:25 it already has are most important for
125:26 deciding what comes next if our sentence
125:29 is I heard a dog the model uses
125:31 attention to know that heard and dog are
125:32 key to adding the next word which might
125:35 be bark multi-head attention this is
125:37 like having multiple flashlights each
125:39 highlighting different aspects of the
125:41 words maybe one flashlight looks at the
125:42 meaning of the word another looks at its
125:44 role in the sentence like subject or
125:46 object and so on this helps the model
125:48 get a richer understanding of the text
125:51 building the output the decoder builds
125:52 the sentence one word at a time using
125:54 attention at each step it looks at the
125:56 sentence so far decides what's important
125:58 and then predicts the next word it's an
126:00 ongoing process with each new word
126:02 influencing the next so attention in
126:05 transformer models is like a guide that
126:07 helps the AI understand and create
126:08 language by focusing on the most
126:10 relevant parts of the text considering
126:12 both individual word meanings and their
126:14 relationships within the sentence
126:15 let's take a look at the attention
126:17 process token embeddings each word in
126:20 the sentence is represented as a vector
126:22 of numbers or its embedding predicting
126:24 the next token the goal is to figure out
126:26 what the next word should be also
126:28 represented as a vector assigning
126:31 weights the attention layer looks at the
126:32 sentence so far and decides how much
126:34 influence each word should have on the
126:35 next one calculating attention scores
126:38 using these weights a new vector for the
126:40 next token is calculated which includes
126:42 an attention score multi-head attention
126:44 does this several times focusing on
126:46 different aspects of the words choosing
126:49 the most likely word a neural network
126:51 takes these vectors with attention
126:52 scores and picks the word from the
126:53 vocabulary that most likely comes next
126:56 adding to the sequence the chosen word
126:58 is added to the existing sequence and
127:00 the process repeats for each new word
127:03 So let's use GPT-4 as an example of how this entire process works, explained in a simplified manner.
127:08 A Transformer model like GPT-4 works by taking a text input and producing a well-structured output. During training it learns from a vast array of text data, understanding how words are typically arranged in sentences.
127:19 The model knows the correct sequence of words but hides future words to learn how to predict them. When it tries to predict a word, it compares its guess to the actual word, gradually adjusting to reduce errors.
127:32 In practice, the model uses its training to assign importance to each word in a sequence, helping it guess the next word accurately.
127:38 The result is that GPT-4 can create sentences that sound like they were written by a human. However, this doesn't mean the model knows things or is intelligent in the human sense; it's simply very good at using its large vocabulary and training to generate realistic text based on word relationships.
127:54 So that's an overview of attention in a Transformer model.
128:02 Hey, this is Andrew Brown from Exam Pro, and in this section we'll be going over an introduction to Azure OpenAI Service.
128:08 Azure OpenAI Service is a cloud-based platform designed to deploy and manage advanced language models from OpenAI. This service combines OpenAI's latest language model development with the robust security and scalability of Azure's cloud infrastructure.
128:22 Azure OpenAI offers several types of models for different purposes.
128:24 GPT-4 models: these are the newest in the line of GPT models and can create text and programming code when given a prompt written in natural language.
128:33 GPT-3.5 models: similar to GPT-4, these models also create text and code from natural language prompts. The GPT-3.5 Turbo version is specially designed for conversations, making it a great choice for chat applications and other interactive AI tasks.
128:48 Embedding models: these models turn written text into number sequences, which is helpful for analyzing and comparing different pieces of text to find out how similar they are.
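That "how similar" comparison is usually done with cosine similarity between the embedding vectors. Here is a minimal sketch; the three-dimensional vectors are invented for illustration, whereas real embedding models return vectors with hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: closer to 1.0 = more similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (made-up numbers, for illustration only).
cat = [0.9, 0.1, 0.2]
kitten = [0.85, 0.15, 0.25]
invoice = [0.1, 0.9, 0.8]

# Related texts should score higher than unrelated ones.
assert cosine_similarity(cat, kitten) > cosine_similarity(cat, invoice)
```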
128:59 DALL-E models: these models can make images from descriptions given in words. The DALL-E models are still being tested and are shown in Azure OpenAI Studio, so you don't have to set them up for use manually.
129:08 Key concepts in using Azure OpenAI include prompts and completions, tokens, resources, deployments, prompt engineering, and various models.
129:18 Prompts and completions: users interact with the API by providing a text command in English, known as a prompt, and the model generates a text response, or completion. For example, a prompt to count to five in a loop results in the model returning appropriate code.
129:30 Tokens: Azure OpenAI breaks down text into tokens, which are words or character chunks, to process requests. The number of tokens affects response latency and throughput. For images, token cost varies with the image size and detail setting, with low-detail images costing fewer tokens and high-detail images costing more.
129:50 Resources: Azure OpenAI operates like other Azure products, where users create a resource within their Azure subscription.
129:56 Deployments: to use the service, users must deploy a model via the deployment APIs, choosing the specific model for their needs.
130:02 Prompt engineering: crafting prompts is crucial, as they guide the model's output. This requires skill, as prompt construction is nuanced and impacts the model's response.
130:13 Models: various models offer different capabilities and pricing. DALL-E creates images from text, while Whisper transcribes and translates speech to text. Each has unique features suitable for different tasks.
130:22 So that's an overview of Azure OpenAI Service.
130:32 The next topic we'll be covering is Azure OpenAI Studio. Developers can work with these models in Azure OpenAI Studio, a web-based environment where AI professionals can deploy, test, and manage LLMs that support generative AI app development on Azure.
130:43 Access is currently limited due to high demand, upcoming product improvements, and Microsoft's commitment to responsible AI. Presently, collaborations are being prioritized for those who already have a partnership with Microsoft, are engaged in lower-risk use cases, and are dedicated to including the necessary safeguards.
131:01 In Azure OpenAI Studio you can deploy large language models, provide few-shot examples, and test them in Azure OpenAI Studio's chat playground.
131:10 The image shows Azure OpenAI's chat playground interface, where users can test and configure an AI chatbot. In the middle there's a chat area to type user messages and see the assistant's replies. On the left there's a menu for navigation and a section to set up the assistant, including a reminder to save changes. On the right, adjustable parameters control the AI's response behavior, like length, randomness, and repetition. Users enter queries, adjust settings, and observe how the AI responds to fine-tune its performance.
131:37 So that's an overview of Azure OpenAI Studio.
131:47 Let's take a look at the pricing for the models in Azure OpenAI Service.
131:49 Starting off with the language models, we have GPT-3.5 Turbo with a context of 4K tokens, which costs $0.0015 for prompts and $0.002 for completions per 1,000 tokens.
132:03 Another version of GPT-3.5 Turbo can handle a larger context of 16K tokens, with prompt and completion costs increased to $0.003 and $0.004 respectively.
132:14 GPT-3.5 Turbo 1106 with a 16K context has no available pricing. GPT-4 Turbo and GPT-4 Turbo with Vision both have an even larger context size of 128K tokens but also have no listed prices.
132:30 The standard GPT-4 model with an 8K token context costs 3 cents for prompts and 6 cents for completions, and the larger-context version of GPT-4 with 32K tokens costs 6 cents for prompts and 12 cents for completions.
132:46 There are other models, such as the base models, fine-tuning models, image models, embedding models, and speech models. They all have their respective pricing, but we won't be going through each of them in a lot of detail. Essentially, they are all on a pay-per-use pricing model; it could be pay-per-hour or pay-per-token, and so on. The higher quality the model, the more expensive it will likely be.
133:08 So that's an overview of Azure OpenAI Service pricing.
133:15 Hey, this is Andrew Brown from Exam Pro, and the next topic we'll be going over is copilots.
133:19 exam Pro and the next topic will be going over co-pilots co-pilots are a new
133:22 going over co-pilots co-pilots are a new type of computing tool that integrates
133:23 type of computing tool that integrates with applications to help users with
133:25 with applications to help users with common tasks using generative AI models
133:28 common tasks using generative AI models they are designed using a standard
133:29 they are designed using a standard architecture allowing developers to
133:31 architecture allowing developers to create custom co-pilots tailored to
133:33 create custom co-pilots tailored to specific business needs and applications
133:35 specific business needs and applications co-pilots might appear as a chat feature
133:37 co-pilots might appear as a chat feature beside your document or file and they
133:39 beside your document or file and they utilize the content within the product
133:41 utilize the content within the product to generate specific results creating a
133:43 to generate specific results creating a co-pilot involves several steps training
133:46 co-pilot involves several steps training a large language model with a vast
133:48 a large language model with a vast amount of data utilizing services like
133:50 amount of data utilizing services like Azure open AI service which provide
133:52 Azure open AI service which provide pre-trained models that developers can
133:54 pre-trained models that developers can either use as his refin tune with their
133:56 either use as his refin tune with their own data for more specific tasks
133:58 own data for more specific tasks deploying the model to make it available
134:00 deploying the model to make it available for use within applications building
134:02 for use within applications building co-pilots that prompt the models to
134:04 co-pilots that prompt the models to generate usable content enabling
134:06 generate usable content enabling business users to enhance their
134:08 business users to enhance their productivity and creativity through AI
134:10 productivity and creativity through AI generated assistance co-pilots have the
134:12 generated assistance co-pilots have the potential to revolutionize the way we
134:14 potential to revolutionize the way we work these co-pilots use generative AI
134:16 work these co-pilots use generative AI to help with first draft information
134:18 to help with first draft information synthesis strategic planning and much
134:21 synthesis strategic planning and much more let's take a look at a few examples
134:23 more let's take a look at a few examples of co-pilot starting with Microsoft
134:26 of co-pilot starting with Microsoft co-pilot so Microsoft co-pilot is
134:28 co-pilot so Microsoft co-pilot is integrated into various applications to
134:30 integrated into various applications to assist users in creating documents
134:32 assist users in creating documents spreadsheets presentations and more by
134:35 spreadsheets presentations and more by generating content summarizing
134:36 generating content summarizing information and aiding in strategic
134:38 information and aiding in strategic planning it is used across Microsoft
134:41 planning it is used across Microsoft Suite of products and services to
134:42 Suite of products and services to enhance user experience and efficiency
134:45 enhance user experience and efficiency next we have the Microsoft being search
134:47 next we have the Microsoft being search engine which which has an integrated
134:48 engine which which has an integrated co-pilot to help users when browsing or
134:50 co-pilot to help users when browsing or searching the Internet by generating
134:52 searching the Internet by generating natural language answers to questions by
134:54 natural language answers to questions by understanding the context of the
134:56 understanding the context of the questions providing a richer and more
134:57 questions providing a richer and more intuitive search experience Microsoft
135:00 intuitive search experience Microsoft 365 co-pilot is designed to be a partner
135:03 365 co-pilot is designed to be a partner in your workflow integrated with
135:05 in your workflow integrated with productivity and communication tools
135:07 productivity and communication tools like PowerPoint and Outlook it's there
135:09 like PowerPoint and Outlook it's there to help you craft effective documents
135:11 to help you craft effective documents design spreadsheets put together
135:12 design spreadsheets put together presentations manage emails and
135:14 presentations manage emails and streamline other tasks GitHub co-pilot
135:17 streamline other tasks GitHub co-pilot is tool that helps software developers
135:19 is tool that helps software developers offering real-time assistance as they
135:21 offering real-time assistance as they write code it offers more than
135:23 write code it offers more than suggesting code Snippets it can help in
135:25 suggesting code Snippets it can help in Thoroughly documenting the code for
135:26 Thoroughly documenting the code for better understanding and maintenance
135:29 better understanding and maintenance additionally co-pilot contributes to the
135:31 additionally co-pilot contributes to the development process by providing support
135:32 development process by providing support for testing code ensuring that
135:34 for testing code ensuring that developers can work more efficiently and
135:36 developers can work more efficiently and with fewer errors so that's an overview
135:39 with fewer errors so that's an overview of
135:40 of [Music]
135:43 [Music] co-pilot hey this is Andrew Brown from
135:46 co-pilot hey this is Andrew Brown from exam Pro and the next topic will be
135:48 exam Pro and the next topic will be covering is prompt engineering prompt
135:50 covering is prompt engineering prompt engineering is a process that improves
135:52 engineering is a process that improves the interaction between humans and
135:53 the interaction between humans and generative AI it involves refining the
135:55 generative AI it involves refining the props or instructions given to an AI
135:57 props or instructions given to an AI application to generate higher quality
135:59 application to generate higher quality responses this process is valuable for
136:02 responses this process is valuable for both the developers who create AI driven
136:04 both the developers who create AI driven applications and the end users who
136:05 applications and the end users who interact with them for example
136:08 interact with them for example developers May build a generative AI
136:09 developers May build a generative AI application for teachers to create
136:11 application for teachers to create multiple choice questions related to
136:13 multiple choice questions related to text students read during the
136:14 text students read during the development of the application
136:16 development of the application developers can add other rules for what
136:17 developers can add other rules for what the program should do with the prompts
136:19 the program should do with the prompts it receives system messages prompt
136:22 it receives system messages prompt engineering techniques include defining
136:23 engineering techniques include defining a system message the message sets the
136:25 a system message the message sets the context for the model by describing
136:27 context for the model by describing expectations and constraints for example
136:30 expectations and constraints for example you're a helpful assistant that responds
136:32 you're a helpful assistant that responds in a cheerful friendly manner these
136:34 in a cheerful friendly manner these system messages determine constraints
136:35 system messages determine constraints and styles for the model's responses
136:38 Writing good prompts: to maximize the utility of AI responses, it is essential to be precise and explicit in your prompts. A well-structured prompt, such as "Create a list of 10 things to do in Edinburgh during August," directs the AI to produce a targeted and relevant output, achieving better results.
136:54 Zero-shot learning refers to an AI model's ability to correctly perform a task without any prior examples or training on that specific task. One-shot learning involves the AI model learning from a single example or instance to perform a task.
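The difference between zero-shot and one-shot is easiest to see in the prompts themselves. A sketch, with an invented sentiment-classification task for illustration:

```python
# Zero-shot: the task is described, but no worked example is given.
zero_shot = (
    "Classify the sentiment of this review as positive or negative.\n"
    "Review: 'The battery died after two days.'\n"
    "Sentiment:"
)

# One-shot: a single worked example precedes the real input,
# showing the model the expected format and behavior.
one_shot = (
    "Classify the sentiment of each review as positive or negative.\n"
    "Review: 'Absolutely love this camera!'\n"
    "Sentiment: positive\n"
    "Review: 'The battery died after two days.'\n"
    "Sentiment:"
)
```

Adding more worked examples in the same pattern is called few-shot prompting, which is what the chat playground's few-shot examples feature supports.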
137:08 Here is an example of prompt engineering with a user query and system response. The user inputs: "Can my camera handle the rainy season if I go to the Amazon rainforest next week?"
137:17 Some of the prompt engineering components could be the weather-resistance feature check, the user's equipment database, rainforest climate data, product specifications, travel tips for photographers, etc.
137:28 For the LLM processing, the AI system integrates the user's question with data about the Amazon's climate, specifically during the rainy season, and the product information about the camera's weather-resistance features. It also references a database of the user's equipment to ensure it's talking about the correct item, and may include travel tips that are useful for photographers heading to similar climates.
137:48 And the output results in: "Your current camera model is designed with a weather-sealed body suitable for high humidity and rain conditions, which matches the expected weather in the Amazon rainforest for next week. However, for added protection during heavy rains, consider using a rain cover."
138:02 Next, let's take a look at the prompt engineering workflow.
138:09 This image describes a simplified step-by-step process for working with AI models and prompt engineering. One, task understanding: know what you want the AI to do. Two, craft prompts: write instructions for the AI. Three, prompt alignment: make sure the instructions match what the AI can do. Four, optimizing prompts: improve the instructions for better AI responses. Five, AI model processing: the AI processes the instructions. Six, generating output: the AI gives an answer or result. Seven, output refinement: fix or tweak the AI's answer. Eight, iterative improvement: keep improving the instructions and answers.
138:44 So that's an overview of prompt engineering.
138:52 The next topic we'll be covering is grounding. Grounding in prompt engineering is a technique used with large language models where you provide specific, relevant context within a prompt. This helps the AI produce a more accurate and related response.
139:05 For example, if you want an LLM to summarize an email, you would include the actual email text in the prompt along with a command to summarize it. This approach allows you to leverage the LLM for tasks it wasn't explicitly trained on, without the need to retrain the model.
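Grounding in the email example amounts to pasting the source text into the prompt alongside the instruction. A minimal sketch; the email text and the delimiter style are invented for illustration:

```python
def grounded_prompt(instruction, context):
    # Grounding: embed the source document directly in the prompt so the
    # model answers from it rather than from its training data alone.
    # The --- delimiters just mark where the pasted context begins and ends.
    return f"{instruction}\n\n---\n{context}\n---"

email = ("Hi team, the release slips to Friday because the load tests "
         "found a memory leak in the export service.")
prompt = grounded_prompt("Summarize this email in one sentence.", email)
```

The same pattern scales up to retrieval-based systems, where the context is fetched from a document store instead of being pasted in by hand.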
139:19 So what's the difference between prompt engineering and grounding? Prompt engineering broadly refers to the art of crafting effective prompts to produce the desired output from an AI model. Grounding specifically involves enriching prompts with relevant context to improve the model's understanding and responses. Grounding ensures the AI has enough information to process the prompt correctly, whereas prompt engineering can also include techniques like format, style, and the strategic use of examples or questions to guide the AI.
139:44 The image outlines a framework for grounding options in prompt engineering within the context of large language models.
139:52 Grounding options: these are techniques to ensure LLM outputs are accurate and adhere to responsible AI principles.
139:58 Prompt engineering: placed at the top, indicating its broad applicability. This involves designing prompts to direct the AI toward generating the desired output.
140:07 Fine-tuning: a step below in complexity, where LLMs are trained on specific data to improve their task performance.
140:13 Training: the most resource-intensive process, at the triangle's base, suggesting it's used for more extensive customization needs.
140:19 LLMOps and responsible AI: these foundational aspects emphasize the importance of operational efficiency and ethical standards across all stages of LLM application development.
140:31 So that's an overview of grounding.
140:36 grounding hey this is Andrew Brown from
140:39 exampro and in this demo we'll be going
140:40 over a short demo on what you can do
140:42 with Copilot with GPT-4 on Microsoft Bing
140:45 so to get here you'll need to search for
140:47 something like Copilot Bing and click
140:49 on Try Copilot and you should be able
140:51 to access this page so on here you have
140:54 some suggested or popular prompts that
140:56 people commonly use such as create an
140:58 image of a concept kitchen generate
141:00 ideas for wacky new products how
141:03 would you explain AI to a sixth grader
141:05 write Python code to calculate all the
141:07 different flavor combinations for my
141:09 ice cream parlor and so on you can
141:12 choose the conversation style ranging
141:13 from more creative for more original and
141:16 imaginative ideas more balanced or more
141:18 precise for more factual information
141:21 we'll be going with somewhere in the
141:22 middle so more balanced just for this
141:24 example on the bottom here you can type
141:27 in any prompt you want so for example we
141:30 can type something simple like summarize
141:32 the main differences between supervised
141:34 and unsupervised learning for the AI 900
141:36 exam you'll see that it will start
141:38 generating an answer for you so for
141:41 supervised learning data labeling in
141:43 supervised learning the training data is
141:45 pre-labeled with the correct output
141:46 values
141:47 and it provides other objectives and
141:49 examples as well and for unsupervised
141:52 learning no labels unsupervised learning
141:54 operates without labeled data it seeks
141:56 to discover patterns structures or
141:58 relationships within the raw data notice
142:01 relationships within the raw data notice how it uses sources from the internet
142:03 how it uses sources from the internet and if you want to learn more you can
142:04 and if you want to learn more you can click on these links that it provides to
142:06 click on these links that it provides to directly go to the source of the
142:07 directly go to the source of the information which is very convenient so
142:10 information which is very convenient so let's quickly check one out and it seems
142:12 let's quickly check one out and it seems like the information we got was pretty
142:14 like the information we got was pretty good and
142:16 good and credible and on the bottom it also
142:18 credible and on the bottom it also provides us some suggestions for
142:20 provides us some suggestions for follow-up questions you may want to ask
142:21 follow-up questions you may want to ask in the future that is related to the
142:23 in the future that is related to the previous
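Copilot's summary above — labeled data for supervised learning, pattern discovery for unsupervised — can be sketched in a few lines of plain Python. Everything here (the data, the function names, the tiny 1-D two-means loop) is invented for illustration and is not from the video:

```python
# Illustrative sketch (not from the video): the same 1-D data handled two ways.
# Supervised: labels are given, so we learn per-class means and classify by
# nearest mean. Unsupervised: no labels, so a tiny 2-means loop discovers the
# two groups on its own.

def supervised_predict(train, x):
    """train: list of (value, label); classify x by nearest class mean."""
    groups = {}
    for value, label in train:
        groups.setdefault(label, []).append(value)
    means = {label: sum(vs) / len(vs) for label, vs in groups.items()}
    return min(means, key=lambda label: abs(means[label] - x))

def two_means(values, iters=10):
    """Unsupervised: split unlabeled values into two clusters (1-D 2-means)."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return sorted(a), sorted(b)

labeled = [(1.0, "small"), (1.2, "small"), (8.9, "large"), (9.3, "large")]
print(supervised_predict(labeled, 1.1))  # prints: small
print(two_means([1.0, 1.2, 8.9, 9.3]))   # clusters recovered without labels
```

The point for the AI-900 is only the contrast: the first function needs the labels up front, the second never sees them.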
142:24 prompt another cool feature of Copilot is
142:27 that it's integrated with DALL-E 3 which
142:29 is an image generation service so for
142:32 example you can say something like
142:34 create an image of a cute dog running
142:36 through a green field on a sunny
142:38 day so now you'll have to wait a little
142:40 bit for it to generate the image that
142:42 you described in your
142:47 prompt and there we go we have an adorable little puppy running through
142:49 the
142:56 fields you also have the power to modify
142:57 images if you're not satisfied with the
143:00 result so they've provided a few options
143:02 for you here so for example we can add a
143:04 rainbow in the background change it into
143:07 a cat or make the sky pink and purple
143:10 let's try changing it to a cat so it's
143:11 going to generate and change it from a
143:14 dog to a cat and there we go it's now a
143:16 cute little cat running through the
143:17 field
143:20 you could also write code using
143:23 Copilot so for example I can type in
143:25 write a Python function to check if a
143:26 given number is
143:29 prime it'll start generating a piece of code for
143:35 me it can write code in multiple languages not just Python so let's try
143:37 out something with
143:39 JavaScript let's try create a JavaScript
143:42 function to reverse a
143:47 string and of course we'll need to wait for the code to
143:53 generate so there we go here's our code for the function to reverse a string
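For reference, here is a sketch of what Copilot might produce for the two coding prompts in the demo. The video asks for the string reversal in JavaScript; it is shown here in Python to keep one language throughout these notes, and the function names are illustrative, not Copilot's actual output:

```python
# Sketch of possible Copilot answers to the demo's two prompts (assumed code,
# not a capture of Copilot's actual response).

def is_prime(n: int) -> bool:
    """Check whether a given number is prime (trial division up to sqrt(n))."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def reverse_string(s: str) -> str:
    """Return the string reversed, using Python's slice syntax."""
    return s[::-1]

print(is_prime(13))             # prints: True
print(reverse_string("Azure"))  # prints: eruzA
```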
143:55 just as we asked for it so that's a
143:57 really quick and general demo for
143:59 Copilot with
144:01 [Music]
144:05 GPT-4 hey this is Andrew Brown from exam
144:08 Pro and in this follow along we're going
144:09 to set up a studio with an Azure Machine
144:12 Learning service so that it will
144:14 be the basis for all the follow alongs
144:16 here so what I want you to do is go all the
144:18 way to the top here and type in Azure
144:20 machine learning and you're looking for
144:22 this one that looks like a science
144:24 bottle here and we'll go ahead and
144:26 create ourselves our machine learning
144:29 studio and so I'll create a new one here
144:31 and I'll just say my
144:36 studio and we'll hit okay and we'll name
144:39 the workspace so we'll say my work
144:52 we'll maybe say ml workspace here for containers there are none so
144:54 it'll create all that stuff for us I'll
144:55 hit
144:57 create and
145:03 so what we're going to do here is just wait for that creation
145:06 okay all right so after a short little
145:08 wait there it looks like our studio is set
145:09 up so we'll go to that resource launch
145:11 the studio and we are now in so
145:15 there's a lot of stuff in here but
145:16 generally the first thing you'll ever
145:17 want to do is get yourself a notebook
145:19 going so in the top left corner I'm
145:20 going to go to notebooks and what we'll
145:22 need to do is load some files in here
145:25 now they do have some sample files like
145:27 how to use Azure ML so if we just
145:30 quickly go through here you know
145:33 maybe we'll want to look at something
145:35 like MNIST here and we'll go ahead
145:37 and open this
145:39 one and maybe we'll just go ahead and
145:41 clone this and we'll just clone it
145:44 over
145:45 here
145:48 okay and the idea is that we want to get
145:50 this notebook running and so notebooks
145:52 have to be backed by some kind of
145:54 compute so up here it says no compute
145:56 found and so on so what we can do here and
145:59 I'm just going to go back to my files oh
146:00 it went back there for me but what I'm
146:02 going to do is go all the way down
146:04 actually I'll just expand this up here
146:05 makes it a bit easier close this tab out
146:08 but what we'll do is go down to
146:09 compute and here we have our four types
146:11 of compute so compute instances is when
146:13 we're running notebooks compute clusters
146:15 is when we're doing training and
146:16 inference clusters is when we have an
146:19 inference pipeline and then attached
146:22 compute is bringing things like HDInsight
146:24 or Databricks into here but
146:26 compute instances are what we need we'll
146:28 go ahead and hit new you'll notice we
146:30 have the option between CPU and GPU GPU
146:32 is much more expensive see it's like 90
146:34 cents per hour for a notebook we do not
146:37 need anything super powerful notice
146:39 it'll say here development on notebooks
146:41 IDEs lightweight testing here it says
146:43 classical ML model training AutoML
146:45 pipelines etc
146:47 so I want to make this a bit cheaper for
146:49 us here because we're going to be
146:51 using the notebook to run Cognitive
146:54 Services and those cost next to nothing
146:56 like they don't take much compute power
146:58 and for some other ones we might do
146:59 something a bit larger for this this is
147:01 good enough so I'll go ahead and hit
147:02 next I'm just going to say my
147:05 notebook instance
147:07 here we'll go ahead and hit
147:09 create and so we're just going to have
147:11 to wait for that to finish creating and
147:13 running and when it is I'll see you back
147:15 here in a moment all right so after a
147:17 here in a moment all right so after a short little wait there it looks like
147:19 short little wait there it looks like our server is running and you can even
147:20 our server is running and you can even see here it shows you you can launch in
147:22 see here it shows you you can launch in jupyter Labs Jupiter vs code R Studio or
147:25 jupyter Labs Jupiter vs code R Studio or The Terminal but what I'm going to do is
147:27 The Terminal but what I'm going to do is go back all the way to our notebooks
147:29 go back all the way to our notebooks just so we have some consistency here I
147:30 just so we have some consistency here I want you to notice that it's now running
147:32 want you to notice that it's now running on this compute if it's not you can go
147:34 on this compute if it's not you can go ahead and select it uh and it also
147:36 ahead and select it uh and it also loaded in Python 3.6 there is 3.8 right
147:39 loaded in Python 3.6 there is 3.8 right now it's not a big deal which one you
147:41 now it's not a big deal which one you use um but that is the kernel like how
147:43 use um but that is the kernel like how it will run this stuff now this is all
147:45 it will run this stuff now this is all interesting but I don't want to uh run
147:47 interesting but I don't want to uh run this right now what I want to do is get
147:49 this right now what I want to do is get those cognitive Services uh into here so
147:53 those cognitive Services uh into here so what we can do is just go up here and
147:56 what we can do is just go up here and we'll choose editors and edit in Jupiter
147:59 we'll choose editors and edit in Jupiter lab and what that should do is open up a
148:02 lab and what that should do is open up a new tab
148:03 new tab here uh is it
148:06 here uh is it opening if it's not opening what we can
148:08 opening if it's not opening what we can do is go to compute sometimes it's a bit
148:10 do is go to compute sometimes it's a bit more responsive if we just click there
148:11 more responsive if we just click there it's the same way of getting to it um I
148:14 it's the same way of getting to it um I don't know why but just sometimes that
148:15 don't know why but just sometimes that link doesn't work uh when you're in the
148:16 link doesn't work uh when you're in the notebook and what we can do is well
148:19 notebook and what we can do is well we're in here now we can see that this
148:20 we're in here now we can see that this is where uh this example project is okay
148:25 is where uh this example project is okay um but what we want to do is get those
148:27 um but what we want to do is get those cognitive services in here so I don't
148:30 cognitive services in here so I don't know if I showed it to you yet but I
148:31 know if I showed it to you yet but I have a repository I just got to go find
148:34 have a repository I just got to go find it it's somewhere on my
148:36 it it's somewhere on my screen um here it is okay so I have a
148:38 screen um here it is okay so I have a repo called the free a a uh the free a
148:42 repo called the free a a uh the free a it should be AI 900 I think I'll go
148:45 it should be AI 900 I think I'll go ahead and change that or that is going
148:47 ahead and change that or that is going to get
148:49 to get confusing okay so what I want you to do
148:52 confusing okay so what I want you to do here is um we'll get this loaded in so
148:55 here is um we'll get this loaded in so this is a public directory I'm just
148:56 this is a public directory I'm just thinking there's a couple ways we can do
148:58 thinking there's a couple ways we can do it we can go and uh use the terminal to
149:00 it we can go and uh use the terminal to grab it what I'm going to do is I'm just
149:02 grab it what I'm going to do is I'm just going to go download the
149:09 zip and this is just one of the easiest ways to install it and we need um to
149:11 ways to install it and we need um to place it somewhere so here are my
149:13 place it somewhere so here are my downloads and I'm just going to drag it
149:16 downloads and I'm just going to drag it out here
149:18 out here okay and uh what we'll do is upload that
149:21 okay and uh what we'll do is upload that there so I can't remember if it lets you
149:23 there so I can't remember if it lets you upload entire folders we'll give it a go
149:25 upload entire folders we'll give it a go see if it lets us maybe rename this to
149:27 see if it lets us maybe rename this to the free a or AI 900 there we'll say
149:31 the free a or AI 900 there we'll say open uh yeah so it's individual file so
149:34 open uh yeah so it's individual file so it's not that big of a deal but we can
149:35 it's not that big of a deal but we can go and ahead and select it like
149:39 go and ahead and select it like that and maybe we'll just make a new
149:41 that and maybe we'll just make a new folder in here we'll say this cognitive
149:43 folder in here we'll say this cognitive [Music]
149:45 [Music] services
149:54 okay and what we'll do here is just keep on uploading some stuff so we have
150:06 assets so I have a couple loose files
150:10 there and I know we have a
150:13 crew oops we'll have
150:15 crew
150:18 oops it's not as responsive
150:20 um we want
150:23 OCR uh I believe we have one called
150:25 movie
150:29 reviews so we'll go into OCR here and
150:31 upload the files that we
150:34 have so we have a few files
150:38 there and we'll go back a directory here
150:40 and I know movie reviews are just
151:03 static objects and then we'll go back to crew and we need a folder called
151:06 Worf a folder called
151:09 Crusher a folder called data and so for
151:13 each of these we have some
151:15 images
151:18 I think we're on Worf right yep we are
151:20 okay great so we will quickly upload all
151:28 these well technically we don't really need to upload any of these well these
151:30 images we don't but I'm going to put
151:31 them here anyway I just remember that
151:34 these we just upload directly to the
151:36 service but because I'm already doing it
151:38 anyway I'm just going to put them here
151:39 even though we're not going to do
151:40 anything with them
151:51 all right and so now we are all set up
151:54 to do some cognitive services so I'll
151:56 see you in the next video all right so
151:57 now that we have our work environment
152:00 set up what we can do is go ahead and
152:02 get Cognitive Services hooked up because
152:04 we need that service in order to
152:06 interact with it because if we open up
152:07 any of these you're going to notice we
152:09 have a cognitive key and endpoint that
152:11 we're going to need so what I want you
152:15 to do is go back to your Azure portal
152:17 and at the top here we'll type in
152:18 cognitive
152:20 services now the thing is that all
152:22 these services are individualized but at
152:24 some point they did group them together
152:25 and you're able to use them through a
152:28 unified key and API endpoint that's what
152:29 this is and that's the way we're going
152:32 to do it so we'll say
152:34 add and it brought us to the marketplace so I'm just going to type in
152:46 services and then just click this one here and we'll hit
152:48 create and we'll make a new one here
152:50 I'm just going to call my Cog
152:53 Services say okay um I prefer to be in
152:57 US East I will leave it in US West it's
152:59 fine and so in here we'll just say my
153:02 Cog
153:04 Services and if it doesn't like that
153:06 I'll just put some numbers in there we
153:08 go we'll do standard so we will be
153:11 charged something for that let's go take
153:12 a look at the
153:15 pricing
153:17 so you can see that the pricing is
153:19 quite variable here but it's like
153:21 you'd have to do a thousand transactions
153:22 before you are billed so I think we're
153:25 going to be okay for billing we'll
153:28 checkbox this here we'll go down below
153:30 it's telling us about responsible AI
153:31 notice sometimes services will
153:33 actually have you checkbox it but in
153:34 this case it just tells us
153:50 and I don't believe this took very long so we'll give it a second here yep it's
153:52 all deployed so we'll go to this
153:53 resource here and what we're looking for
153:55 are our keys and
153:58 endpoints and so we have two keys and
154:00 two endpoints we only need a single key
154:02 so I'm going to copy this endpoint over
154:04 we're going to go over to JupyterLab
154:06 and I'm just going to paste this in here
154:08 I'm just going to put it in all the ones
154:10 that need it so this one needs
154:12 one this one needs
154:15 one this one needs
154:18 one and this one needs
154:20 one and we will show the key here I
154:24 guess it doesn't show but it copies of
154:26 course I will end up deleting my key
154:27 before you ever see it but this is
154:29 something you don't want to share
154:31 publicly and usually you don't want to
154:33 embed keys directly into a notebook but
154:35 this is the only way to do it so it's
154:37 just how it is with Azure so yeah all
154:41 our keys are installed going back to
154:43 Cognitive Services nothing super
154:45 exciting here but it does tell us
154:47 what services work with it you'll see
154:49 there's an asterisk beside Custom Vision
154:50 because we're going to access that
154:51 through another app but yeah
154:54 Cognitive Services is all set up and so
154:56 that means we are ready to start
154:58 doing some of these labs
155:00 [Music]
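The setup video warns against pasting keys straight into notebook cells. One common alternative (a sketch, not from the video; the variable names `COG_SERVICES_KEY` and `COG_SERVICES_ENDPOINT` are hypothetical) is to read the key and endpoint from environment variables so they never land in the saved notebook:

```python
# Sketch: keep the Cognitive Services key/endpoint out of the notebook by
# reading them from environment variables. The variable names are made up
# for this example.
import os

def load_cognitive_config():
    """Read the Cognitive Services key and endpoint from the environment."""
    key = os.environ.get("COG_SERVICES_KEY")
    endpoint = os.environ.get("COG_SERVICES_ENDPOINT")
    if not key or not endpoint:
        raise RuntimeError("Set COG_SERVICES_KEY and COG_SERVICES_ENDPOINT first")
    return key, endpoint

# For demonstration only: normally you would export these in the shell
# before launching JupyterLab rather than setting them in code.
os.environ.setdefault("COG_SERVICES_KEY", "example-key")
os.environ.setdefault("COG_SERVICES_ENDPOINT", "https://example.cognitiveservices.azure.com/")

key, endpoint = load_cognitive_config()
print(endpoint)
```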
155:03 okay all right so let's take a look here
155:06 at computer vision first and computer
155:07 vision is actually used for a variety of
155:09 different services as you will see it's
155:11 kind of an umbrella for a lot of
155:13 different things but the one in
155:14 particular that we're looking at here is
155:16 describe image in stream if we go
155:18 over here to the documentation this
155:20 operation generates a description of an image
155:22 in a human-readable language with complete
155:24 sentences the description is based on a
155:26 collection of content tags which are also
155:28 returned by the operation okay so let's
155:30 go see what that looks like in action so
155:32 the first thing is that we need to
155:34 install this azure-cognitiveservices-vision-computervision
155:36 package now we do have a
155:38 kernel and these aren't installed by
155:40 default they're not part of the
155:44 Azure Machine
155:46 Learning SDK for Python I believe
155:49 that's pre-installed but these AI
155:52 services are not so what we'll do is go
155:53 ahead and run it this way and you'll
155:55 notice where it says pip install that's
155:56 how it knows to install and once that is
155:59 done we'll go run our requirements here
156:01 so we have the os module which is usually for
156:04 handling OS-layer stuff we have
156:07 matplotlib which is to
156:10 visually plot things and we're going to
156:12 use that to show images and draw borders
156:14 and we need to handle images I'm not sure
156:17 if we're using numpy here but I have
156:18 numpy loaded and then here we have the
156:21 Azure Cognitive Services computer
156:23 vision package we're going to load the client
156:25 and then we have the credentials and
156:27 these are the generic credentials for
156:28 Cognitive Services they're
156:30 commonly used for most of these services
156:32 with some exceptions where the APIs do not
156:34 support them yet but I imagine they will
156:36 in the future so just notice that when
156:38 we run something it will show a number
156:39 if there's an asterisk it means it hasn't
156:41 run yet so I'll go ahead and hit play up
156:42 here so it shows an asterisk and we get a
156:44 two and we'll go ahead and hit play
156:47 again and now those are loaded in and so
156:49 we'll go ahead and hit
156:51 we'll go ahead and hit play okay so here we've just packaged
156:54 play okay so here we've just packaged our credentials together so we passed
156:55 our credentials together so we passed our key into here and then we'll now
156:58 our key into here and then we'll now load in the client uh and so we'll pass
157:01 load in the client uh and so we'll pass our endpoint and our key okay so we hit
157:04 our endpoint and our key okay so we hit play and so now we just want to load our
157:06 play and so now we just want to load our image so here we're loading assets
157:08 image so here we're loading assets data.jpg let's just make sure that that
157:11 data.jpg let's just make sure that that is there so we go assets and there it is
157:13 is there so we go assets and there it is and we're going to load it as a stream
157:15 and we're going to load it as a stream because you have to pass streams along
157:16 because you have to pass streams along so we'll hit play and you'll see that it
157:19 so we'll hit play and you'll see that it now ran and so now we'll go ahead and
157:21 now ran and so now we'll go ahead and make that
157:23 make that call okay great and so we're getting
157:25 call okay great and so we're getting some data back and notice we have some
157:27 some data back and notice we have some properties person wall indoor man
157:29 properties person wall indoor man pointing captions it's not showing all
157:32 pointing captions it's not showing all the information sometimes you have to
157:33 the information sometimes you have to extract it out but we'll take a look
157:35 extract it out but we'll take a look here so uh this is a way of showing mat
157:37 here so uh this is a way of showing mat pla lib in line I don't think we have to
157:39 pla lib in line I don't think we have to run it here but I have it in here anyway
157:41 run it here but I have it in here anyway and so what it's going to do is it's
157:43 and so what it's going to do is it's going to um show us the image right so
157:46 going to um show us the image right so it's going to print us the image and
157:48 it's going to print us the image and it's going to grab whatever caption is
157:50 it's going to grab whatever caption is returned so see how there's captions so
157:53 returned so see how there's captions so we're going to iterate through the
157:54 we're going to iterate through the captions it's going to give us a
157:56 captions it's going to give us a confidence score saying it thinks it's
157:58 confidence score saying it thinks it's this so let's see what it comes out
158:01 this so let's see what it comes out with okay and so here it says Brent
158:03 with okay and so here it says Brent spider Spiner looking at a camera so
158:05 spider Spiner looking at a camera so that is the actor who plays data on Star
158:07 that is the actor who plays data on Star Trek has a confidence score of 57. 45%
158:10 Trek has a confidence score of 57. 45% even though it's 100% correct uh they
158:13 even though it's 100% correct uh they probably don't know contextual things
158:14 probably don't know contextual things like um uh in the sense of like pop
158:17 like um uh in the sense of like pop culture like they don't know probably
158:18 culture like they don't know probably start Tre characters but they're going
158:20 start Tre characters but they're going to be able to identify celebrities
158:21 to be able to identify celebrities because it's in their database so that
158:23 because it's in their database so that is um uh uh the first introduction to
158:27 is um uh uh the first introduction to computer computer vision there but the
158:29 computer computer vision there but the key things you want to remember here is
158:30 key things you want to remember here is that we use this describe an image
158:32 that we use this describe an image stream uh and that we get this
158:34 stream uh and that we get this confidence score and we get this
158:36 confidence score and we get this contextual information okay and so
158:38 contextual information okay and so that's the first one we'll move on to um
158:40 that's the first one we'll move on to um maybe custom Vision
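The caption step above can be sketched in plain Python. The dictionary below is a hand-made stand-in for the describe-image response (the real SDK returns an object with attribute access rather than a dict), with numbers echoing the demo:

```python
# Stand-in for a describe-image response: a list of caption candidates,
# each with a confidence score (values invented to mirror the demo).
response = {
    "captions": [
        {"text": "a man looking at a camera", "confidence": 0.5745},
        {"text": "a person indoors", "confidence": 0.31},
    ]
}

def best_caption(resp, threshold=0.5):
    """Return the highest-confidence caption at or above the threshold, or None."""
    candidates = [c for c in resp["captions"] if c["confidence"] >= threshold]
    return max(candidates, key=lambda c: c["confidence"]) if candidates else None

print(best_caption(response)["text"])  # the 57.45% caption wins
```

Iterating all captions and printing each confidence score, as the notebook does, is the same loop without the threshold filter.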
158:46 All right, so let's take a look at Custom Vision, where we can do some classification and object detection. The thing is, it's possible to launch Custom Vision through the marketplace. We're not going to do it that way; if you type "custom vision" in the portal search it never shows up, but if you go to the marketplace and type it in, you can create it that way. The way I like to do it, which I think is a lot easier, is to go up to the top here and type customvision.ai, and you'll come to this website. Go ahead and sign in; it's going to connect to your Azure account, and once you're in you can create a new project. The first one I'm just going to call "Star Trek crew", and we're going to use it to identify different Star Trek crew members. We'll go down here, and since we haven't yet created a resource, we'll create a new one, "my custom vision resource". We'll drop this down, put it in our cognitive services resource group, and stick with US West as much as we can. The F0 tier is blocked for me, so just choose what's available; I think F0 is the free tier, but I don't get it.
160:02 Once we're back here, we'll go down below and choose Standard, and we're going to have a lot of options. We can choose between classification and object detection. Classification is when you have an image and you just want to say what the image is, and there are two modes. Multilabel applies multiple labels, say there were two people in the photo, or a dog and a cat, which I think is the example they use. Multiclass asks what the one thing in this photo is; it can only be one of the particular categories, and that's the mode we're going to do. We also have a bunch of different domains here, and if you want, you can go read about all the different domains and their best use cases, but we're going to stick with General A2, which is optimized to be faster, and that's really good for our demo. So we'll choose General A2 and go ahead and create the project.
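A minimal sketch of the difference between the two classification modes, with invented per-tag scores (Custom Vision returns a probability per tag when you make a prediction):

```python
# Invented per-tag scores, like what a prediction returns for each tag.
scores = {"data": 0.91, "worf": 0.07, "crusher": 0.55}

def multiclass(tag_scores):
    """Multiclass: exactly one label - pick the single highest-scoring tag."""
    return max(tag_scores, key=tag_scores.get)

def multilabel(tag_scores, threshold=0.5):
    """Multilabel: every tag whose score clears the threshold applies."""
    return sorted(t for t, s in tag_scores.items() if s >= threshold)

print(multiclass(scores))   # one winner
print(multilabel(scores))   # possibly several labels
```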
160:52 project and uh so now what we need to do is start labeling our our our content so
160:56 is start labeling our our our content so um what we'll do is I just want to go
160:58 um what we'll do is I just want to go ahead and create the tags ahead of time
160:59 ahead and create the tags ahead of time so we'll say
161:00 so we'll say Warf we'll have uh data and we'll have
161:06 Warf we'll have uh data and we'll have Crusher and now what we'll do is we'll
161:08 Crusher and now what we'll do is we'll go ahead and upload those images so you
161:10 go ahead and upload those images so you know we uploaded the jupyter notebook
161:11 know we uploaded the jupyter notebook but it was totally not necessary so here
161:13 but it was totally not necessary so here is data because we're going to do it all
161:15 is data because we're going to do it all through here and we'll just apply the
161:17 through here and we'll just apply the data tag to them all at once which saves
161:18 data tag to them all at once which saves us a lot of time I love that uh we'll
161:21 us a lot of time I love that uh we'll upload now uh
161:24 upload now uh war and I don't want to upload them all
161:26 war and I don't want to upload them all I have this one quick test image we're
161:27 I have this one quick test image we're going to use to make sure that this
161:29 going to use to make sure that this works
161:30 works correctly and I'm going to choose
161:41 Beverly there she is Beverly
161:44 is Beverly Crusher okay so we have all our our
161:46 Crusher okay so we have all our our images in I don't know how this one got
161:48 images in I don't know how this one got in here but it's under worth it works
161:50 in here but it's under worth it works out totally fine so uh what I want to
161:53 out totally fine so uh what I want to do is uh go ahead and train this small
161:57 do is uh go ahead and train this small because they're all labeled so we have a
161:59 because they're all labeled so we have a ground truth and we'll let it go ahead
162:01 ground truth and we'll let it go ahead and train so we'll go and press train
162:03 and train so we'll go and press train and we have two options quick training
162:04 and we have two options quick training or Advanced Training Advanced Training
162:05 or Advanced Training Advanced Training where we can increase the time for
162:07 where we can increase the time for better accuracy but honestly uh we just
162:10 better accuracy but honestly uh we just want to do quick training so I'll go
162:11 want to do quick training so I'll go ahead and do quick training and it's
162:13 ahead and do quick training and it's going to start its iterative process
162:15 going to start its iterative process notice on the left hand side we have
162:17 notice on the left hand side we have probability threshold the minimum
162:19 probability threshold the minimum probability score for a prediction to be
162:21 probability score for a prediction to be valid when calcul calculating precision
162:23 valid when calcul calculating precision and recall so we uh the thing is is that
162:26 and recall so we uh the thing is is that if it doesn't at least meet that
162:28 if it doesn't at least meet that requirements it will quit out and if it
162:30 requirements it will quit out and if it gets above that then it might quit out
162:32 gets above that then it might quit out early just because it's good enough okay
162:34 early just because it's good enough okay so training doesn't take too long it
162:36 so training doesn't take too long it might take 5 to 10 minutes I can't
162:38 might take 5 to 10 minutes I can't remember how long it takes but uh what
162:40 remember how long it takes but uh what I'll do is I'll see you back here in a
162:41 I'll do is I'll see you back here in a moment okay all right so after waiting a
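Since precision and recall come up on the results screen, here's how they're computed from prediction counts; the numbers below are invented for illustration:

```python
# Precision: of everything the model predicted, what fraction was right?
def precision(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

# Recall: of everything that should have been found, what fraction was found?
def recall(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

# Example: 9 correct predictions, 3 false alarms, 1 missed instance.
print(precision(9, 3))  # 0.75
print(recall(9, 1))     # 0.9
```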
162:44 All right, after waiting a short little while, it looks like our results are in, and we get a 100% match here. These are our evaluation metrics, which say whether the model achieved its goal or not: we have precision, recall, and I believe this is average precision, and it says the model did a really good job. That means it should have no problem matching up an image. In the top-right corner we have a button called Quick Test, which gives us the opportunity to quickly test it. We'll browse our files locally; we'll go here, and we have Worf. I have this quick image we'll test, and we'll see if it actually matches up to Worf, and it says 98.7% Worf. That's pretty good.
163:31 I also have some additional images that I put into the repo to test against, and we'll see what they match up to, because I thought it would be interesting to try something that isn't necessarily these characters but is pretty close to them. So we'll go to the crew folder, and first we'll try Hugh. Hugh is a Borg, so he's kind of like an android, and we can see he mostly matches to Data; that's pretty good. We'll give another one a go: Martok is a Klingon, so he should be matched up to Worf, and there's a very strong match to Worf; pretty good. And then Pulaski: she is a doctor and female, so she should get matched up to Beverly Crusher, and she does. So this works out pretty darn well, and I hadn't even tried that before, so it's pretty exciting.
164:18 Now let's say we want to make predictions. I believe you could do them in bulk here, but I could have sworn that if we didn't already have these images, there was an upload option; it's probably just the quick test, so I'm a bit confused there. Anyway, now that this is ready, we can go ahead and publish it so that it's publicly accessible. We'll just call it the crew model, drop that down, and say Publish. Once it's published, we have a public endpoint that we can hit programmatically. I'm not going to do that here, though we could use Postman. My point is that we've basically figured out classification, so let's go back to the vision page and now do object detection.
165:23 All right, we're still in Custom Vision; let's go ahead and try out object detection. Object detection is when you can identify particular items in a scene. We're going to call this project "combadge", because we're going to try to detect combadges. We have more domains here, and we're going to stick with General A1 and create the project. What we need to do now is add a bunch of images. I'm going to create our tag, which will be called "combadge". You could look for multiple different kinds of labels, but then you need a lot more images, so we're just going to keep it simple and have that one. I'll add some images; we'll go back a couple of steps into our objects folder, where I have a bunch of photos. We need at least 15 to train, so let's count: one, two, three... sixteen. I threw an additional image in here, the badge test, which we'll leave out and use later to see how well it picks things up. We've got them all here, so we'll go ahead and upload those and hit Upload Files.
166:31 Okay, we'll say Done, and we can now begin to label. We'll click into an image, and if you hover over it, it should start suggesting regions; if it doesn't, you can click and drag. We'll click this one; they're all combadges, so we're not going to tag anything else here. Go here, hover over: is it going to give me the combadge? No, so I'm just clicking and dragging to get it. Do we get this combadge? Yes. Do we get this one? Yep, simple as that. It doesn't always get it, but in most cases it does. It didn't get that one, so we'll just drag it out. It's interesting what it picks out and what it doesn't grab: it's not getting this one, probably because the photo doesn't have enough contrast, while this one has a lot, which will hopefully give us more data to work with. I think the higher the contrast, the easier it is for it to detect those. It's not getting that one, or that one... okay, there we go. Yes, there are a lot; I know some of these are packed, but there are only about three photos like this. These have badges, but they're slightly different, so we're going to leave those out. Oops, I think it actually had that one, but we'll just tag it anyway, and hopefully this will be worth the effort. There we go, I think that was the last one.
168:38 Okay, great, so we have all of our tagged photos, and now we can go ahead and train the model. Same options, quick training or advanced training; we're going to do quick training. Notice that the options are slightly different: we have the probability threshold, and then we have the overlap threshold, the minimum percentage of overlap between predicted bounding boxes and ground-truth boxes for a prediction to be considered correct. I'll see you back here when it's done.
169:00 All right, after waiting a little while, it looks like it's done training. Precision is at 75%; precision tells you, when your model predicts a tag, how likely it was to guess right. Then you have recall, which tells you, out of the tags that should have been predicted correctly, what percentage your model actually found; we have 100% there. And then mean average precision tells you the overall object-detector performance across all the tags.
169:34 performance across all the tags okay so what we'll do is we'll go ahead and uh
169:36 what we'll do is we'll go ahead and uh do a quick test on this model and we'll
169:39 do a quick test on this model and we'll see how it does I can't remember if I
169:41 see how it does I can't remember if I actually even ran this so it'll be
169:42 actually even ran this so it'll be curious to see the first one here um
169:45 curious to see the first one here um it's not as clearly visible it's part of
169:47 it's not as clearly visible it's part of their uniform so I'm not expecting to
169:49 their uniform so I'm not expecting to pick it up but we'll see what it does it
169:50 pick it up but we'll see what it does it picks up pretty much all of them with
169:53 picks up pretty much all of them with exception this one is definitely not a
169:55 exception this one is definitely not a comp badge but uh that's okay only show
169:58 comp badge but uh that's okay only show suggests obious the probabilities above
170:00 suggests obious the probabilities above the selected
170:01 the selected threshold so if we increase
170:04 threshold so if we increase it uh we'll just bring it down a bit so
170:07 it uh we'll just bring it down a bit so there it kind of improves it um if we
170:09 there it kind of improves it um if we move it around back and forth okay so I
170:12 move it around back and forth okay so I imagine via the API we could choose that
170:14 imagine via the API we could choose that let's go look at our other sample image
170:17 let's go look at our other sample image here
170:18 here um I'm not seeing
170:23 um I'm not seeing it uh where did I save it let me just
170:26 it uh where did I save it let me just double check make sure that it's in the
170:28 double check make sure that it's in the correct directory here
170:30 correct directory here okay yeah I saved it to the wrong place
170:32 okay yeah I saved it to the wrong place just a
170:34 just a moment
170:37 moment um I will place
170:53 second okay and so I'll just browse here again and so here we have another one
170:56 again and so here we have another one see if it picks up the badge right here
170:59 see if it picks up the badge right here there we go so looks like it worked so
171:01 there we go so looks like it worked so uh yeah I guess custom vision is uh
171:03 uh yeah I guess custom vision is uh pretty easy to use and uh pretty darn
171:05 pretty easy to use and uh pretty darn good so what we'll do is close this off
171:08 good so what we'll do is close this off and make our way back to our Jupiter
171:10 and make our way back to our Jupiter labs to move on to um our our next uh
171:15 labs to move on to um our our next uh lab here
171:16 lab here [Music]
171:20 All right, let's move on to the Face service. Go ahead and double-click there on the left-hand side, and we'll work our way down from the top. The first thing we need to do is make sure we have the computer vision package installed, since the Face service is part of the Computer Vision API. Once that's done, we do our imports, very similar to the last one, but here we're using the FaceClient. We're still using the cognitive services credentials: we populate our keys, make the face client, and authenticate. We're going to use the same image we used before with computer vision, the Data one, and we'll print out the results. We get an object back, so it's not very clear what it is, but if we hit Show, there's Data, and it's identifying the face ID.
172:07 Going through this code: we open the image, set up our figure for plotting, and ask how many faces were detected in the photo; here it says it detected one face. We iterate through the faces, and then we create a bounding box around each one. We can do that because the service returns a face rectangle, so we get a top, left, and so on, and we draw that rectangle on top in magenta. I could change the line width to three if I wanted; I don't know what the other colors are, so I'm not even going to try. Then we annotate with the face ID, the unique identifier for the face, and we show the image. So that's the first one.
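The bounding-box step can be sketched like this. The dict below is a stand-in for the SDK's face rectangle (the real object uses attribute access, e.g. `face.face_rectangle.left`), with invented pixel values:

```python
# Stand-in for the face_rectangle returned per detected face.
face_rectangle = {"top": 50, "left": 120, "width": 80, "height": 80}

def to_corners(rect):
    """Convert top/left/width/height into ((left, top), (right, bottom)) for drawing."""
    left, top = rect["left"], rect["top"]
    return (left, top), (left + rect["width"], top + rect["height"])

print(to_corners(face_rectangle))  # ((120, 50), (200, 130))
```

Those two corner points are what a plotting library needs to draw the rectangle over the image.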
172:47 Then, if we want more detailed information, attributes such as age, emotion, makeup, or gender, the resolution of that image wasn't large enough, so I had to find a different image. That's one thing you need to know: if the image isn't large enough, the service won't process it. So we're loading the large Data image, a very similar process. It's the same detect-with-stream call, but now we're passing in return face attributes, listing the attributes we want; we went through that list in the lecture content. We'll go ahead and run this, and we're getting more information. That magenta line is a bit hard to see, so I'm just going to increase the width to three. Still really hard to see, but that's okay.
173:33 So, approximate age 44; I think the actor was a bit younger than that. Data is technically male-presenting, but he's an android, so he doesn't necessarily have a gender, I suppose. He actually is wearing a lot of makeup, but I guess the service only looks at the lips and the eyes, so it says he doesn't have makeup; maybe with colored eye shadow it would detect that. In terms of emotion, I like how he gets a tiny score like 0.002 on one of them, but overall he's neutral. Going through the code very quickly: again we get the number of faces, so it detected one face, and then we draw a bounding box around the face. The detected attributes are returned in the data, so we just get the face attributes, turn them into a dictionary, and then get those values and iterate over them. That's as complicated as it is, and there we go.
174:32 [Music] go all right so we're on to uh our next
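The attribute loop just described can be sketched like this; the `sample` dictionary, its keys, and its values here are hypothetical stand-ins for what the Face SDK's `as_dict()` might return, not real API output:

```python
# Sketch of the attribute loop from the notebook: take the face attributes as a
# dictionary and walk its values, indenting nested groups like "emotion".
def print_face_attributes(attrs: dict, indent: int = 0) -> list[str]:
    """Recursively flatten a face-attributes dictionary into printable lines."""
    lines = []
    for key, value in attrs.items():
        if isinstance(value, dict):
            lines.append(f"{'  ' * indent}{key}:")
            lines.extend(print_face_attributes(value, indent + 1))
        else:
            lines.append(f"{'  ' * indent}{key}: {value}")
    return lines

# Hypothetical values for illustration only (not actual Face API output)
sample = {"age": 44.0, "gender": "male", "emotion": {"neutral": 0.98, "sadness": 0.02}}
for line in print_face_attributes(sample):
    print(line)
```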
174:35 all right so we're on to uh our next
174:37 cognitive service let's take a look at
174:40 form recognizer all right and so form
174:44 recognizer uh it tries to identify
174:45 like forms and turns them into readable
174:47 things and so they have one for uh
174:49 receipts in particular so at the top
174:52 finally we're not using um the computer
174:53 vision service we actually have a
174:55 different one so this one's Azure AI
174:58 form recognizer so we'll run that there
175:00 but this one in particular isn't up to
175:03 date in terms of using it like um notice
175:06 all the other ones they're using uh the
175:08 cognitive service credential so for this
175:10 we actually had to use the Azure key
175:12 credential which was annoying I tried to
175:14 use the other one to be consistent
175:16 but I couldn't use it okay so what
175:19 we'll do is run our keys like before we
175:21 have a client very similar
175:24 process and this time we actually have a
175:27 receipt and so we have begin recognize
175:28 receipt so it's going to analyze the
175:31 receipt information and then what
175:33 it's going to do is show us the image
175:34 okay just so we have a reference to look
175:37 at now the image isn't actually yellow
175:38 it's a white background I don't know why
175:40 when it renders out here it does that
175:41 but that's just what
175:44 happens and uh it even obscures the
175:47 server name I don't know why um but
175:50 anyway if we go down below this is
175:52 returned as results up here right so we got
175:55 our results and so if we just print out
175:57 the results here we can see we get a
176:00 recognized form back we get fields and
176:01 some additional things and if we go into
176:03 the fields itself we see there's
176:05 a lot more information if you can make
176:06 out like here it says merchant phone
176:09 number form field label value and
176:11 there's the number
176:15 512707 so for these things here like um
176:17 the
176:19 receipts if we can just find the API
176:22 quickly here it has predefined
176:25 fields I'm not sure um yeah business
176:27 card
176:32 etc um like if we just type in
176:33 merchant I'm just trying to see if
176:35 there's a big old list here it's not
176:37 really showing us a full list but these
176:39 are predefined things that are
176:42 returned right so they've defined those
176:44 uh maybe it's over here
176:46 there we go so these are the predefined
176:48 ones that it extracts out so we have uh
176:51 receipt type merchant name etc etc and
176:54 so if we go back here you can see
176:56 I have the field called merchant name
176:59 so we hit there it says Alamo Drafthouse
177:00 Cinema let's say we want to try to get
177:02 that balance maybe we can try to figure
177:04 out which one it is I never ran this
177:06 myself when I made it so we'll see
177:08 what it is but here it has total price
177:10 what's interesting is that this
177:14 has a space so it's kind of unusual
177:15 you'd think it'd be together but let's see
177:17 if that
177:20 works okay doesn't like that maybe
177:23 that's just a typo on their part okay so
177:26 we get none uh let's try
177:28 price see what it picks
177:33 up nope nothing um we know that the
177:34 phone number is there so we'll give the
177:36 phone
177:39 number there we go so you know it's an
177:42 okay service but uh you know
177:45 your mileage will vary based on
177:47 uh what you do there maybe we could try
177:50 total because that makes more sense
177:53 right uh yeah there we go okay great so
177:54 yeah it is pulling out the information
177:57 um and so that's pretty much all you
177:59 need to know about that service there
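The field lookups tried above can be sketched as follows; `FormField` here is a hypothetical stand-in for the SDK's field object and the receipt values are illustrative, so this only shows why a name that isn't one of the predefined fields (like `Total Price` with the space) comes back as None:

```python
# Minimal sketch of the lookup pattern from the notebook: recognized_form.fields
# behaves like a mapping of predefined field names to field objects, and asking
# for a name that wasn't extracted should yield None rather than raise.
class FormField:
    """Hypothetical stand-in for the SDK's form field object."""
    def __init__(self, label, value):
        self.label = label
        self.value = value

# Illustrative receipt data, not real service output
fields = {
    "MerchantName": FormField("MerchantName", "Alamo Drafthouse Cinema"),
    "MerchantPhoneNumber": FormField("MerchantPhoneNumber", "512707"),
}

def get_field_value(fields, name):
    field = fields.get(name)  # .get avoids a KeyError for absent field names
    return field.value if field else None

print(get_field_value(fields, "MerchantName"))  # -> Alamo Drafthouse Cinema
print(get_field_value(fields, "Total Price"))   # not a predefined field -> None
```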
178:02 okay let's take a look at some of our
178:04 OCR capabilities here uh and I believe
178:07 that's in computer vision so we'll go
178:08 ahead and open that up at the top here
178:10 we'll install computer vision as we did
178:12 before very similar to the other
178:15 computer vision task but this time we
178:16 have a couple of new ones here that I'll
178:19 explain as we go through here we'll
178:21 load our keys we'll do our credentials
178:24 we'll load the client okay and then we
178:27 have this um function here called
178:29 printed text so what this function is
178:31 going to do is it's going to uh print
178:34 out the results of whatever text it
178:36 processes okay so the idea is that we
178:39 are going to feed in an image and it's
178:42 going to give us back out the text for
178:43 the image so we'll run this function
178:46 and I have two different images cuz I
178:48 actually ran it on the first one and the
178:50 results were terrible and so I got a
178:52 second image and it was a bit better
178:54 okay so we'll go ahead and run this it's
178:55 going to show us the image okay and so
178:58 this is the photo and it was supposed to
178:59 extract out Star Trek The Next
179:00 Generation but because of the artifacts
179:02 and size of the image we get back uh not
179:05 English okay and so you know maybe with a
179:08 high resolution image it would have
179:10 a better time there um but that
179:13 is what we got back okay so let's go
179:16 take a look at our second image and see
179:18 how it did and this one I'm surprised it
179:20 actually extracts out a lot more
179:21 information you can see it really has a
179:23 hard time with the Star Trek font but we
179:25 get Deep Space 9 Nana Visitor tells all
179:27 Life Death some errors here so it's not
179:30 perfect um but you know you can see that
179:32 it does something here now this is OCR
179:35 where for very simple images and text
179:39 this is where we use recognize
179:40 printed text in stream but uh if we were
179:43 doing this for larger amounts of text
179:45 and we want this
179:47 analyzed asynchronously then we want to
179:49 use the read API and it's a little bit
179:51 more involved um so what we'll do here
179:53 is load a different image and this is a
179:55 script we'll look at the image here in a
179:56 moment um but here we read in stream and
180:00 we create these
180:02 operations okay and what it will do is
180:04 it will asynchronously send
180:07 all the information over okay uh so I
180:10 think this is supposed to be results
180:12 here minor typo
180:14 and um we will go ahead and give that a
180:18 run okay and so here you can see it's
180:21 extracting out the image if we want to
180:23 uh see this image I thought I
180:26 showed this image here but I
180:27 guess I don't yeah it says plot image
180:30 here to show us the
180:32 image
180:34 uh path it's up
180:38 here it doesn't want to show us it's
180:41 funny because this one up here is
180:42 showing us no problem right
180:48 um well I can just show you the image
180:50 it's not a big
180:52 deal but I'm not sure why it's not
180:54 showing up here
180:58 today so if we go to our assets here I
181:00 go to
181:03 OCR uh I'm just going to open this
181:05 up it's opening up in Photoshop and so
181:08 this is what it's transcribing okay so
181:09 this is like a guide to
181:11 Star Trek where they talk about like you
181:13 know what makes Star Trek Star Trek
181:15 so just looking here it's actually
181:17 pretty darn good okay but the read API
181:19 is a lot more uh efficient because it
181:22 can work asynchronously and so when you have a lot of text
181:24 that's what you want to do okay um and
181:27 like it's feeding in each individual
181:28 line right so that it can be more
181:30 effective that way um so let's go look
181:32 at some handwritten stuff so just in
181:34 case the image doesn't pop up we'll go
181:35 ahead and open this one and so this is
181:38 a handwritten note that uh William
181:41 Shatner wrote to a fan of Star Trek and
181:45 it's basically incomprehensible I don't
181:47 know if you can read that here but "see
181:50 was very something he was something
181:53 hospital and healthy was something he
181:56 was something" I can't even read it okay
181:59 so let's see what uh the machine thinks
182:04 here and uh it says image path yeah it's
182:08 called path let's just change that out
182:11 go ahead and run that and run that there
182:15 and we'll go ahead and run it and here
182:17 we get the image so uh "poner us very
182:21 sick he was the hospital his Bey was" etc
182:25 "beat nobody lost his family knew Captain
182:27 Halden" so it reads better than how I could
182:29 read it honestly like it's really
182:31 hard right like if you looked at
182:33 this like that looks like "difficult"
182:38 "was Beady healthy" I could see why it's
182:41 guessing like that right "dying" that
182:42 looks like dying to me you know what I
182:45 mean so it's just poorly
182:47 handwritten but I mean it's pretty good
182:49 for what it is so uh yeah there you
182:51 go
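The asynchronous read flow described above works in two steps: the initial call returns an `Operation-Location` header, and you then poll for results using the operation ID, which is the last segment of that URL. A minimal sketch of the ID extraction (the URL shown is a made-up example, not a real endpoint):

```python
# The Read API's initial response carries an Operation-Location header; the
# operation ID you poll with is the final path segment of that URL.
def operation_id_from_location(operation_location: str) -> str:
    """Extract the operation ID from an Operation-Location header value."""
    return operation_location.split("/")[-1]

# Hypothetical header value for illustration
loc = "https://example.cognitiveservices.azure.com/vision/v3.2/read/analyzeResults/1234-abcd"
print(operation_id_from_location(loc))  # -> 1234-abcd
```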
182:55 all right so let's take a look at
182:57 another cognitive service here and this
182:58 one is text
183:01 analysis and uh so what we'll do is
183:03 install the Azure cognitive services
183:05 language uh text analytics here so we go
183:07 ahead and hit run all right and once
183:11 that's installed uh this one is using
183:14 the cognitive services credentials so
183:15 it's a little bit more standard with our
183:17 other ones here we'll go ahead and run
183:19 that there uh we'll make our credentials
183:22 load our client and this one what we're
183:24 going to do is try to determine
183:26 sentiment and understand why people like
183:28 a particular movie or not so I've loaded
183:31 a bunch of reviews um again I
183:33 can show you the data if it helps uh and
183:38 so I'm just trying to find my right
183:39 folder here and so if we go back look
183:43 our movie reviews here's like a review
183:44 someone wrote "First Contact just works
183:47 it works as a rousing chapter in
183:49 Star Trek to a lesser extent it works as
183:50 mainstream entertainment" so different
183:52 reviews for Star Trek First Contact
183:54 which was a very popular movie back in
183:56 the day um so what we'll
183:59 do is we will load uh the reviews so
184:02 it's just iterating through the text
184:03 files and showing us what the reviews
184:05 are so here we can see all the written
184:07 text I had a lot of trouble getting the
184:09 last one to display but it does get
184:10 loaded in and so here we're using
184:13 the um text analysis to show us uh
184:18 key phrases because maybe that would
184:19 give us an indicator and so that's the
184:21 object back but maybe that gives us an
184:23 indicator as to like what people are
184:25 saying as important things so here we
184:26 see Borg ship Enterprise smaller ship
184:28 escapes neutral zone travels contact
184:31 damage uh co-writer Beautiful Mind
184:33 sophisticated science fiction best
184:36 whales Leonard Nimoy okay uh wealth of
184:40 unrealized potential uh filmmaker
184:43 Jonathan Frakes okay so very interesting stuff
184:46 here Borg ship again you've seen Borg
184:48 ship a lot so that is kind of key
184:50 phrases let's go get uh customer
184:53 sentiment or how people felt about it
184:54 did they like it or not and so here we
184:57 just call sentiment and um what we'll do
184:59 is if it's uh above 0.5 then it's
185:01 positive and if it's below 0.5 then it's a
185:02 negative review I think most people uh
185:06 thought it was a very good film uh so
185:08 this one says it's pretty low so
185:10 let's go take a look at that one uh it
185:12 wasn't actually rendering there
185:14 so maybe we'll have to open it up
185:15 manually see if that's actually accurate
185:18 it's empty so there you go I guess we
185:20 had a blank one in there um I must have
185:22 forgotten to paste it in but that's okay uh
185:25 that's a good indicator that uh you know
185:27 that's what happens if you don't have it
185:28 so let's look at number one then which
185:29 is uh well actually this one is 0.9
185:32 this is 0.4 this one here is 0.8 so
185:36 we'll open up eight "when the Borg
185:38 launched on Earth the Enterprise is sent
185:39 to the neutral zone" etc etc "however a
185:42 smaller ship escapes the
185:43 Enterprise follows back um meanwhile the
185:47 survivors" so like this is a synopsis it
185:49 doesn't say whether they like it or they
185:50 don't but it was just 0.4 I guess so
185:53 there's nothing positive about it right
185:55 um if we look at one that was this one's
185:58 pretty low which is no no it's not it's
186:01 0.1 so it seems like this person
186:03 probably really liked it or no I guess
186:06 that's actually pretty low because it's
186:07 0.1 it's not 0.9 and 0.9's very high let's
186:10 take a look at this one review number
186:12 two uh if we go up
186:15 here "the doo has improved the story mon
186:17 turn the show but there's a wealth of
186:18 unrealized potential" so that's a fair
186:20 one saying maybe they don't like it
186:22 as much I don't know if they'd give it two
186:24 stars right we could probably actually
186:26 correlate it with the actual results
186:27 because I did get these off of IMDb and
186:29 Rotten Tomatoes but uh yeah there you go
186:31 that is text analysis
186:38 all right so now we're on to
186:40 QnA Maker and so we're not going to need
186:43 to do anything programmatically because QnA
186:45 Maker is all about no code or low code
186:48 to build out a questions and answers
186:50 uh bot service so what we'll do is go
186:53 all the way up here and I want you to
186:54 type in qnamaker.ai because as far
186:56 as I'm aware it's not accessible
186:58 through the portal sometimes you can
187:00 find these things um again if we go to
187:03 the
187:04 marketplace I'm just curious I'm going to
187:05 just take a look here really quickly uh
187:08 whenever it decides to log us in here
187:10 okay great so I'll go over to
187:12 Marketplace and probably if we typed in
187:14 Q&A maybe we'd see something here
187:22 yeah so we go here um give it a second
187:25 here seems like Azure is a little bit
187:28 slow right
187:34 now usually it's very fast but uh you know the service
187:36 varies well it's not loading for me
187:38 right now but that's okay because we're
187:39 not going to do it that way anyway um so
187:42 uh again go to qnamaker.ai and what I
187:45 want you to do is go all the way to the top
187:47 right corner and we'll hit sign
187:48 in and what we'll be doing is connecting
187:51 via our single sign-on with our account
187:53 so it already knows I have an account
187:54 there I'm going to give it a moment here
187:57 and I'm going to go ahead and just give
187:59 it a
188:15 second there we go so it says I don't have any
188:17 um knowledge bases which is true so
188:19 let's go ahead and create ourselves a
188:20 new knowledge base and here we have the
188:22 option between stable and preview I'm
188:24 going to stick with stable because I
188:25 don't know what's in preview I'm pretty
188:26 happy with uh that and so we need to
188:28 connect a QnA service
188:32 to our knowledge base and so back over
188:34 here in Azure actually I guess we do
188:36 have to make one now that I remember we
188:37 actually have to create a QnA Maker
188:39 service so I'll go down here and put
188:41 this under my cognitive services we'll say
188:44 my um
188:46 QnA
188:49 service it might complain about the name uh
188:52 yep so I'll just put some numbers here
188:54 we'll pick uh free tier sounds good so
188:57 I'll go free when I actually get the
188:58 option that's what I will choose um down
189:01 below we'll choose free again US East
189:03 sounds great to me uh it generates out
189:05 the name it's the same name as here so
189:07 that's fine uh we don't need app
189:09 insights but I'm going to leave it
189:10 enabled because I think it changes it to
189:12 standard or S0 when you do
189:14 not um have it enabled
189:17 which is unusual and so we will create our QnA
189:20 Maker service give it a moment
189:23 here and I remember it will say
189:26 like even if you try it might have to
189:28 wait 10 minutes for it to create the
189:29 service so even after it's
189:31 provisioned um it'll take some time so
189:33 what we should do is prepare our doc
189:35 because it can take in a variety of
189:36 different files and I just want to show
189:38 you here that uh for Q&A they have a
189:41 whole paper here on formatting
189:42 guidelines
189:43 and basically it's pretty smart about
189:45 knowing where headings and answers are so
189:48 for unstructured data we just have a
189:49 heading and we have some text so let's
189:51 heading and we have some text so let's write some things in here that we can
189:52 write some things in here that we can think of since we're all about
189:53 think of since we're all about certification we should write some stuff
189:55 certification we should write some stuff here so how many adus certifications are
189:59 here so how many adus certifications are there I believe right now there are uh
190:03 there I believe right now there are uh 11 uh adus
190:05 11 uh adus certifications
190:08 certifications okay and maybe if we use our headings
190:10 okay and maybe if we use our headings here this would probably be a good idea
190:12 here this would probably be a good idea here y
190:27 how many fundamental Azure certifications are there — and we'll give
190:38 this a heading. we'll say, um, there are three Azure — I think there's
190:44 three; there's other ones, right, like Power Platform stuff, but just
190:48 being Azure specific — there are three Azure fundamental certifications.
190:56 so we have the DP-900, the AI-900, the AZ-900 — I guess there's four,
191:03 there's the SC-900, right. so there are four,
191:10 okay. we'll say: which is the hardest, um, Azure associate certification?
191:33 and what we'll say here is — I mean, it's my opinion — it's the Azure
191:36 Administrator. had some background noise there, that's why I was pausing
191:40 a bit, but the Azure Administrator, AZ-104 — I would say that's the
191:44 hardest. uh, which is harder,
192:01 AWS or Azure certifications? I would say, uh, Azure certifications are
192:04 harder, because they check exact steps for implementation, where AWS
192:13 focuses on concepts.
192:19 okay, so we have a bit of a, um, knowledge base here, so I'll save it, and —
192:25 assuming that this is ready, because we need a little bit of time to put
192:27 this together — we'll go back to QnA, hit a refresh here, give it a
192:32 moment, drop it down, choose our service,
192:43 and notice here that we have chitchat extraction and only extraction —
192:45 we're going to do chitchat. I will say — this will be the reference name,
192:52 you can change it any time — this will be like, uh, "certification Q&A".
193:00 and so here we want to populate, so we'll go to files here. I'm going to
193:02 go to my desktop — and here it is, I'll open it. we will choose
193:07 professional tone, go ahead and create that, and so I'll see you back
193:12 here in a moment. all right, so after waiting a short little time here, it
193:16 loaded in our data, so you can see that it figured out which is the
193:19 question and which is the answer, and it also has a bunch of defaults. so
193:21 here, if somebody asked something very, uh, silly, like "can you cry",
193:25 it'll say "I don't have a body" — it has a lot of information pre-loaded
193:27 for us, which is really nice. if we wanted to go
193:31 ahead and test this, we could go and say — we'll go here and then we'll
193:36 write in, uh — we say, like...
193:50 it says "good morning", okay. so we'll say, um, "how many
193:53 certifications are there" — we didn't say AWS, but let's just see what
194:05 happens. and so it kind of inferred, even though we didn't say AWS in
194:07 particular. and notice that there's AWS and Azure — so "how many
194:10 fundamental Azure certifications", things like that — and so
194:12 it chose AWS. so it's not like the perfect service, but it's pretty good.
194:17 I wonder what would happen if we placed in one that's, like, Azure — I
194:23 don't know how many Azure certs there are, we'll just say, like, there's
194:25 11, 12 — I can never remember, they're always adding more. but, uh — I
194:26 want to close this here, there we go. so let's just go add a new key pair
194:31 here, and we'll say "how many Azure certification are there" — I should
194:39 have said certifications, I'll probably just fix that in a moment. so:
194:45 "there are 12 Azure certifications" — who knows how many they have, it
194:50 could be like 14 or something. we could say, like, "between 11 and 14" —
194:55 they just update them too frequently, I can't keep track.
195:00 so, uh, we'll go here and we'll just say "certifications", and we will
195:04 save and retrain, so we'll just wait here a moment.
195:12 great, and so now we'll go ahead and test this again. so we'll say "how
195:17 many certifications are there" —
195:25 and see, it's pulling the first answer. if I say, uh, Azure — let's see
195:31 if it gets the right one here: "how many Azure certifications are there" —
195:41 okay, so, you know, maybe you'd have to have a generic one for that
195:45 match. so if we go back here, and we
195:50 say "how many certifications are there", you'd say, you know, like —
195:57 "which certification", uh — "which cloud service provider?"
196:08 here we've got AWS,
196:10 Azure. uh, follow-up prompt: "you can use guides through conversational
196:13 flow; prompts are used to link Q&A pairs and can be displayed..." — um, I
196:17 haven't used this yet, but I mean, it sounds like something that's pretty
196:20 good, um, because there is multi-turn in this. so the idea is that if you
196:24 had to go through multiple steps, you could absolutely do that. um, we'll
196:28 try this a little bit here. uh, follow-up prompt: you can use the guide —
196:30 conversational prompts are used to link Q&A pairs together; text for the
196:34 button for a suggested action — oh, okay. so maybe we just do, like,
196:38 "AWS", link to Q&A, and then search an existing Q&A or create a new one.
196:44 um, so it'd say, like, "how many AWS..." —
196:47 it say like how many eight of us okay we're typing it
196:49 us okay we're typing it in context only this Falls up will not
196:52 in context only this Falls up will not be understood out of the context flow
196:55 be understood out of the context flow sure because it should be within context
196:58 sure because it should be within context right and uh here we can do another one
197:00 right and uh here we can do another one we say like um
197:05 we say like um Azure we'll say how many
197:14 azure context only oops it uh got away from me
197:24 there we'll save that and uh what we'll do is save and
197:37 train so we go back here and we'll say how
197:39 how many uh certifications are there
197:48 enter so we have to choose AWS and so there we go so we got something that
197:49 there we go so we got something that works pretty good there since I'm happy
197:51 works pretty good there since I'm happy with it we can go ahead and go and
197:52 with it we can go ahead and go and publish that so we's say
198:03 and now that it's published, we could use Postman or curl to trigger
198:06 it, but what I want to do is create a bot, because with Azure Bot Service
198:10 we can actually utilize it with other integrations, right — it's a great
198:14 way to, um, use your bot, or to actually host your bot.
198:17 um use your Bot or to actually host your Bot so we'll go over here it'll link it
198:19 Bot so we'll go over here it'll link it over uh if you don't click it it doesn't
198:20 over uh if you don't click it it doesn't preload it in so it's kind of a pain if
198:22 preload it in so it's kind of a pain if you lose it you have to go back there
198:23 you lose it you have to go back there and click it again but uh let's just say
198:25 and click it again but uh let's just say um
198:26 um certification q and
198:29 certification q and day and we will look through here so all
198:32 day and we will look through here so all going to go with free premium messages
198:34 going to go with free premium messages 10K 1K premium message units messages
198:38 10K 1K premium message units messages I'm kind of confused by the pricing but
198:39 I'm kind of confused by the pricing but F0 usually means free so that's what I'm
198:41 F0 usually means free so that's what I'm going to go for that SDK or nodejs I'm
198:43 going to go for that SDK or nodejs I'm going to use NOS not that we're going to
198:44 going to use NOS not that we're going to do anything there with it go ahead and
198:46 do anything there with it go ahead and create
198:48 create that and I don't think this takes too
198:51 that and I don't think this takes too long we'll see
199:02 here, and just go ahead and click on that there. I'll just wait here a
199:04 bit — I'll see you back here in a moment. all right, so
199:06 after waiting — I don't know, about 5 minutes there — it looks like our
199:10 bot service is deployed. we'll go to that resource there. uh, you can
199:12 download the bot source code — I actually never did this, so I don't know
199:17 what it looks like, so I'd be curious to see this, um, just to see what
199:21 the code is. I assume that because we chose Node.js, it would give us,
199:26 um, that as the default there. so, "download your bot source code"...
199:28 "creating the source zip" — not sure how long this
199:32 takes; might be regretting clicking on
199:34 that. but, uh, what we'll do is we'll go on the left-hand side here to
199:36 channels, because I just want to show — uh, here — yeah, it didn't
199:43 download; we'll try here in a second. but, um, what we'll do is we'll go
199:45 back — profile — "unspecified bot"? what are you talking
199:54 about? yeah, maybe it needs some
200:06 time. so, you know, maybe we'll just give the bot a little bit of time
200:07 here — I'm not sure why it's giving us a hard time, because this bot is
200:10 definitely deployed: if we go over to our bot, right, Bot Services, it is
200:13 here. sometimes there's, like, latency, you know, with Azure — oh, there
200:21 we go, okay, see, it works now, fine.
200:23 right, and so I want to show you that there's different channels, and
200:25 these are just easy ways to integrate your bot into different services.
200:27 so whether you wanted to use it with Alexa, GroupMe, Skype, Telephony,
200:32 Twilio, Skype for Business — apparently they don't have that anymore,
200:38 because we've got, uh, Teams now, right. uh, Kik — which, I don't know if
200:40 people still use that — Slack — we should have had Discord — Telegram,
200:44 Facebook, email... um, that's kind of cool, but Teams — Teams is a really
200:48 good one, I use Teams. uh, there's a Direct Line
200:51 channel — I don't know what that means — and there's Web Chat, which is
200:53 just having, like, an embed code. so if we go over, we can go and test it
200:57 over here — just testing our web chat — and so it's the same thing as
201:01 before, but we just say things like, uh, um, "how many
201:06 certifications are
201:14 there" — let's say Azure — and get a clear answer back. we'll go back up
201:18 to our overview; let's try to see if we can download that
201:19 code again — I was kind of curious, uh, what that looks
201:40 like... download... must be a lot of code, eh —
201:46 there we go. so now we can hit download, and so there is the code. I'm
201:47 going to go ahead and open that up. uh, so yeah, I
201:50 guess when we chose JavaScript, that made a lot more sense. let's give it
201:54 a little peek here. I'm just going
201:56 to, uh, drop this on my desktop here, so let me make a new folder here
202:00 and call this, uh, "bot
202:04 code". okay, I know you can't see what I'm doing here, but let's go here,
202:09 and double-click into here, and then just drag that code on
202:20 in. and then what we can do is open this up in VS Code — I should have VS
202:22 Code running somewhere around here. just going to go ahead and open that;
202:27 I'm off screen here, I'll just show you my screen in a
202:29 moment. so — code...
202:32 oops — File, Open Folder... "bot code",
202:38 okay. and, uh, we'll come all the way back
202:41 here, and so we've got a lot of code here —
202:44 never looked at this before, but, you know, I'm a pretty good programmer,
202:46 so it's not too hard for me to
202:48 understand. um, so it looks like you've got API request things, like
202:53 that. I guess it would just be, like, if you needed to
202:54 integrate it into your application, then it
202:55 kind of shows you all the code there. I'm
202:58 just trying to see our dialogue
203:00 choices... nothing super
203:08 exciting, okay. you know, when I go and make the, um — was it the AI-100?
203:12 whatever the data scientist course is —
203:14 I'm sure I'll be a lot more thorough
203:16 here, but I was just curious as to what
203:17 that looks like. now, if we wanted to have
203:20 an easy integration, uh, we can get an embed
203:23 code for this. so if we go back to our
203:24 channels, I
203:27 believe, uh, we can go and — is it
203:31 "edit"? ah, yes — so here we have a code. so
203:34 what I'll do is go back to JupyterLab.
203:36 I'm just going to go make a new empty, um,
203:39 notebook, so we'll just go up here and
203:42 say "notebook", and this can be for our
203:45 Q&A — doesn't really matter what
203:47 kernel. uh, we'll say "Q and A maker", just
203:52 to show, like, if you wanted a very, very
203:54 simple way of integrating your bot, um, we
203:57 would go back over
204:03 to wherever it is — ah, here we are. I'm going to go ahead and copy this
204:04 iframe. I think it's percent-percent
204:07 HTML — %%HTML — so it treats the cell
204:10 as HTML,
204:12 and I don't have any other HTML to render, so
204:15 we will place that in there. and notice
204:16 we have to replace our secret key, so I
204:19 will go back here and I will show my key,
204:21 and we will copy
204:23 that, and we will paste that key in here,
204:27 and then we'll run
204:28 this.
204:32 this and I can type in here where am I just ask silly
204:45 how many Azure certifications are there well I wonder
204:48 certifications are there well I wonder if I just leave the are there off let's
204:49 if I just leave the are there off let's see if it figures it out okay cool
204:52 see if it figures it out okay cool so uh yeah I mean that's pretty much it
204:54 so uh yeah I mean that's pretty much it with Q&A
204:55 with Q&A maker um so yeah that's great so I think
204:59 maker um so yeah that's great so I think we're done here and we can move on to uh
205:02 we're done here and we can move on to uh checking out uh leis or Luis learning
205:05 checking out uh leis or Luis learning understanding to make a more uh robust
205:07 understanding to make a more uh robust bot
205:11 okay [Music]
205:13 all right, so we are on to our last cognitive service, and this one is
205:17 going to be, uh, LUIS — or "Louise", depending on how you like to say it.
205:20 it's LUIS, which is Language Understanding. so you type in
205:24 luis.ai, uh, and that's going to bring us up to this external website —
205:30 still part of, um, Azure, it just has its own domain. and
205:33 so here we'll choose our subscription, and we have no authoring resource,
205:38 so I guess we'll have to go ahead and create one ourselves. so go down
205:39 here, and we'll choose "my cognitive services" — Azure resource name — so
205:44 "my, uh, service", or "my cognitive service"...
205:58 create new cognitive services account — but we already have one, so I
206:00 don't want to make another one, right? it should show up here...
206:05 right — "...are valid in the authoring region". so it's possible that
206:08 we're just in the incorrect region, so we might end up creating two of
206:12 these, and that's totally fine — I don't care, as long as we get this
206:15 working here, because we're going to delete everything at the end anyway.
206:18 and so just say "my cog service 2",
206:22 and, uh, we'll say West US, because I think that maybe we didn't choose
206:27 one of these regions. let's go double-check — uh, if we go back to our
206:32 if we go back to our portal just the limitations of the
206:34 portal just the limitations of the service right so we'll go to my Cog
206:37 service right so we'll go to my Cog Services here um I just want to go uh
206:41 Services here um I just want to go uh cognitive
206:43 cognitive services so just want to see where this
206:45 services so just want to see where this is deployed and this is in um you West
206:50 is deployed and this is in um you West us yes I don't know why it's not shown
206:52 us yes I don't know why it's not shown up there but whatever if that's what it
206:54 up there but whatever if that's what it wants we'll give it what it wants
207:00 okay shouldn't give us that much trouble but hey that's how it
207:03 but hey that's how it goes and so we have an author authoring
207:06 goes and so we have an author authoring service I'm going to refresh here and
207:07 service I'm going to refresh here and see if it added a second one it didn't
207:10 see if it added a second one it didn't so all right
207:12 so all right that's fine so we'll just say uh my
207:14 that's fine so we'll just say uh my sample
207:16 sample bot um we'll use English as our culture
207:19 bot um we'll use English as our culture if nothing shows up here don't worry you
207:21 if nothing shows up here don't worry you can choose it later on I remember the
207:22 can choose it later on I remember the first time I did this it didn't show up
207:24 first time I did this it didn't show up and so now we have my Cog service my
207:26 and so now we have my Cog service my custom vision service we want Cog
207:28 custom vision service we want Cog service
207:31 service so um anyway it tells you about schema
207:34 so um anyway it tells you about schema like how you make a schema animates
207:36 like how you make a schema animates talking about like bot action intent and
207:39 talking about like bot action intent and example utterance but we're just going
207:40 example utterance but we're just going to set up something very simple here so
207:42 to set up something very simple here so we're going to create our attent the one
207:43 we're going to create our attent the one that we always see is uh flight booking
207:47 that we always see is uh flight booking so I'll go here and do
207:49 so I'll go here and do that and what we want to do is write an
207:52 that and what we want to do is write an undering so like uh book May flight to
207:57 undering so like uh book May flight to Toronto okay and so if someone were to
208:00 Toronto okay and so if someone were to type that in then the idea is it would
208:02 type that in then the idea is it would return back the intent this value and
208:04 return back the intent this value and metadata around it and we could
208:06 metadata around it and we could programmatically provide code right so
208:08 programmatically provide code right so what we need is identity identities and
208:10 what we need is identity identities and we can actually just click here and uh
208:12 we can actually just click here and uh make one here so enter named identity
208:15 make one here so enter named identity we'll just call this
208:16 we'll just call this location okay here we have an option
208:19 location okay here we have an option machine learned and list if you flip
208:21 machine learned and list if you flip between it this is like imagine you have
208:22 between it this is like imagine you have a ticket order and you have these values
208:24 a ticket order and you have these values that can uh change or you just have a
208:27 that can uh change or you just have a value that always stays the same like
208:29 value that always stays the same like list so that's our
208:30 list so that's our airport that makes sense we'll do
208:34 airport that makes sense we'll do that if we go over to ENT entities we
208:36 that if we go over to ENT entities we can see it
208:39 can see it here all right so uh nothing super
208:41 here all right so uh nothing super exciting there but what I want to show
208:43 exciting there but what I want to show you is if we go ahead and um we should
208:47 you is if we go ahead and um we should probably add fight booking should be uh
208:51 probably add fight booking should be uh how about book
208:53 how about book flight flight booking fight booking okay
208:57 flight flight booking fight booking okay so we'll go ahead and I know there's
208:58 so we'll go ahead and I know there's only one but we'll go ahead and train
208:59 only one but we'll go ahead and train our
209:07 model because we don't need to know tons right we cover a lot in the lecture
209:09 right we cover a lot in the lecture content uh to build a complex spot is
209:11 content uh to build a complex spot is more for the uh associate level um but
209:14 more for the uh associate level um but now what we can do is go ahead and test
209:16 now what we can do is go ahead and test this and we'll say book me a flight to
209:24 Seattle okay and notice here it says book flight we can go inspect it and we
209:26 book flight we can go inspect it and we get some additional data so top scoring
209:29 get some additional data so top scoring so it says How likely that was the
209:31 so it says How likely that was the intent
209:33 intent um okay so you get kind of an idea there
209:36 um okay so you get kind of an idea there there's additional things here it
209:38 there's additional things here it doesn't really matter um we'll go back
209:40 doesn't really matter um we'll go back here and we will go ahead and publish
209:42 here and we will go ahead and publish our model so we can put it into a
209:45 our model so we can put it into a production slot you can see we have
209:46 production slot you can see we have sentiment analysis speech priming we
209:48 sentiment analysis speech priming we don't care about either of those
209:49 don't care about either of those things we can go and see where our
209:51 things we can go and see where our endpoint is and so now we have uh an
209:55 endpoint is and so now we have uh an endpoint that we can work with um so
209:58 endpoint that we can work with um so yeah I mean that's pretty much all you
210:00 yeah I mean that's pretty much all you really need to learn about Lewis um but
210:03 really need to learn about Lewis um but uh I think we're all done for cognitive
210:04 uh I think we're all done for cognitive services so we're going to keep around
210:06 services so we're going to keep around our our notebook because um we're going
210:09 our our notebook because um we're going to still use our jupyter notebook for
210:10 to still use our jupyter notebook for some other things things but what I want
210:12 some other things things but what I want you to do is make your way over
210:14 you to do is make your way over to um your resource groups because if
210:18 to um your resource groups because if you've been pretty clean it's all within
210:20 you've been pretty clean it's all within here we'll just take a look here so we
210:21 here we'll just take a look here so we have our
210:22 have our Q&A all of our stuff here I'm just
210:25 Q&A all of our stuff here I'm just making sure it's all there and so I'm
210:26 making sure it's all there and so I'm just going to go ahead and delete this
210:28 just going to go ahead and delete this Resource Group and that should wipe away
210:31 Resource Group and that should wipe away everything okay for the cognitive
210:33 everything okay for the cognitive Services
210:36 Services part all right so we're all good here
210:39 part all right so we're all good here and I'm just going to go off and I'll
210:40 and I'm just going to go off and I'll leave leave this open because it's
210:43 leave leave this open because it's always a pain to get back to it and
210:44 always a pain to get back to it and reopen it but let's make our way back to
210:45 reopen it but let's make our way back to the home here in the Azure uh machine
210:48 the home here in the Azure uh machine Learning Studio and now we can actually
210:50 Learning Studio and now we can actually explore building up machine learning
210:53 explore building up machine learning [Music]
210:57 pipelines okay so we are on to the ML
211:01 follow-alongs here so we're going to
211:03 learn how to build some pipelines so
211:04 first I think the easiest would be
211:06 automated ML also known as AutoML
211:08 the idea here is it's going to just
211:11 build out the entire pipeline for us
211:13 so we don't have to do any thinking we
211:14 just say what kind of model we want to
211:16 run and have it make a prediction so
211:19 what we'll do is a new automated ML and
211:21 we're going to need a data set so I
211:22 don't have one but the nicest thing is
211:24 they have these open data sets so if you
211:26 click here you'll see there is a bunch
211:29 here and a lot of these you'll come
211:30 across quite often not just on Azure but
211:33 other places like this diabetes one I've
211:35 seen it like everywhere okay and so
211:39 seen it like everywhere okay uh and so like if we just go click here maybe we
211:40 like if we just go click here maybe we can read a bit more here so diabetes
211:43 can read a bit more here so diabetes data set 422 samples with 10 features
211:45 data set 422 samples with 10 features ideal for getting started with machine
211:47 ideal for getting started with machine learning algorithms it's one of the
211:48 learning algorithms it's one of the popular pyit learn toy data sets it's
211:51 popular pyit learn toy data sets it's probably where I've seen it before
211:53 probably where I've seen it before though it's not showing up there uh you
211:54 though it's not showing up there uh you scroll on down you can see the data uh
211:57 scroll on down you can see the data uh you notice that it's available AZ your
211:58 you notice that it's available AZ your notebooks data bricks and Azure synapse
212:01 notebooks data bricks and Azure synapse uh the thing is we have these values so
212:03 uh the thing is we have these values so age sex BMI BP and the Y is trying to
212:06 age sex BMI BP and the Y is trying to make a prediction it's trying to say
212:08 make a prediction it's trying to say what's the likelihood of you having
212:10 what's the likelihood of you having diabetes or not and so it's not a
212:11 diabetes or not and so it's not a Boolean value so it's not a binary
212:13 Boolean value so it's not a binary classifier it's kind of on a uh well I
212:15 classifier it's kind of on a uh well I guess you would be doing binary classif
212:18 guess you would be doing binary classif classification say do you have di
212:20 classification say do you have di diabetes or you can make a prediction to
212:22 diabetes or you can make a prediction to say what's the likelihood or this value
212:24 say what's the likelihood or this value if you gave another value in there but
212:27 if you gave another value in there but um anyway you this is the predicting
212:29 um anyway you this is the predicting value a lot of times this is X so
212:31 value a lot of times this is X so everything here is X and this is
212:34 everything here is X and this is considered y the actual prediction um so
212:37 considered y the actual prediction um so some sometimes it's why and sometimes
212:38 some sometimes it's why and sometimes it's actually named what it is uh but
212:40 it's actually named what it is uh but that's just what it is here so we'll
212:42 that's just what it is here so we'll close that off and so we'll choose the
212:44 close that off and so we'll choose the diabetes set and it will be data set
212:48 diabetes set and it will be data set one and so we'll worry about feedback
212:51 one and so we'll worry about feedback later so we'll click on Sample uh
212:53 later so we'll click on Sample uh diabetes we'll hit next and here it's
212:55 diabetes we'll hit next and here it's going to try to figure out uh what kind
212:57 going to try to figure out uh what kind of model that we want we have to create
212:59 of model that we want we have to create a new experiment it's a container to run
213:00 a new experiment it's a container to run the model in so we'll just say
213:03 the model in so we'll just say diabetes uh my diabetes it sounds a bit
213:06 diabetes uh my diabetes it sounds a bit odd but that's what it is the target
213:07 odd but that's what it is the target column we want to predict um is the
213:11 column we want to predict um is the train to predict is the Y It's usually
213:13 train to predict is the Y It's usually the Y um we don't have a compute cluster
213:16 the Y um we don't have a compute cluster so I'll go ahead and create a new
213:18 so I'll go ahead and create a new compute we have dedicator or low
213:20 compute we have dedicator or low priority technically we um it is low
213:24 priority technically we um it is low priority but I just want this done low
213:26 priority but I just want this done low priority but don't G to compute nodes
213:29 priority but don't G to compute nodes your job may be pre- emptied um I'm
213:31 your job may be pre- emptied um I'm going to stick with dedicated for the
213:33 going to stick with dedicated for the time being we're going to stick with
213:34 time being we're going to stick with CPU uh if we go with um this it does
213:40 CPU uh if we go with um this it does take about an hour to run so when I ran
213:42 take about an hour to run so when I ran this took about an hour so if you don't
213:44 this took about an hour so if you don't mind it's only going to cost you 15
213:46 mind it's only going to cost you 15 cents but if you want this done a lot
213:47 cents but if you want this done a lot sooner I'm going to try to do something
213:49 sooner I'm going to try to do something a little bit more powerful so I'm just
213:52 a little bit more powerful so I'm just trying to decide here because if it only
213:54 trying to decide here because if it only takes an
213:59 hour uh I might run it on something more powerful that's 90 cents that might be
214:01 powerful that's 90 cents that might be Overkill because it's not really deep
214:04 Overkill because it's not really deep learning uh it's just statistical
214:06 learning uh it's just statistical statistical stuff so try and large data
214:09 statistical stuff so try and large data set I wouldn't say it's large real time
214:11 set I wouldn't say it's large real time inference other latency sensitive
214:14 inference other latency sensitive ones
214:25 why is this one I'm just looking
214:27 here because this one's 29 this one's
214:30 more expensive but it has 32 GB of RAM
214:33 this one is 28 oh 14 GB of RAM oh it's
214:36 storage so this one's our highest in the
214:38 tier again you can choose this one you
214:40 just have to wait a lot longer I
214:41 just want to see if it finishes a lot
214:43 faster okay without having to go to the
214:45 GPU level because I don't think GPU is
214:47 going to help too much here the
214:58 compute name is my diabetes
215:00 machine the minimum number of nodes you want
215:02 to provision if you want dedicated nodes
215:05 you set the count here and the
215:09 maximum I guess I just want one node right we will go ahead and oops the
215:17 compute name must be 2 to 16 characters long is it too long okay there we
215:30 go yeah it's going to spin up the cluster so it does take a little bit of
215:32 time to start this so I'll see you back
215:33 here when this is done
215:35 here when this is done okay great so after a short little wait
215:37 okay great so after a short little wait there it looks like uh our cluster is
215:39 there it looks like uh our cluster is running if we double check it here we
215:40 running if we double check it here we can go to compute I believe that shows
215:42 can go to compute I believe that shows up under here under the compute cluster
215:45 up under here under the compute cluster so there it is notice it's slightly
215:47 so there it is notice it's slightly different this one shows you
215:48 different this one shows you applications and this one is just size
215:50 applications and this one is just size and Etc we can click in here see nodes
215:52 and Etc we can click in here see nodes and run times we'll go make our way back
215:54 and run times we'll go make our way back here uh and we'll go ahead and hit next
215:57 here uh and we'll go ahead and hit next and notice that I think it actually will
215:59 and notice that I think it actually will select what it generally because it'll
216:01 select what it generally because it'll look at your prediction value maybe
216:02 look at your prediction value maybe sample a bit of it and say okay you
216:04 sample a bit of it and say okay you probably want a regression thing so to
216:05 probably want a regression thing so to predict a continuous numeric values so
216:08 predict a continuous numeric values so the thing is that if it was a label like
216:10 the thing is that if it was a label like text or if it was just zero and one it
216:12 text or if it was just zero and one it probably would choose classification
216:14 probably would choose classification because it's um you saw our our y value
216:17 because it's um you saw our our y value was like a number that was all over the
216:18 was like a number that was all over the place it thinks it's regression so I
216:21 place it thinks it's regression so I think that's a good indicator uh uh
216:23 think that's a good indicator uh uh there so let's go with
216:29 you know but you might want it as a binary classifier yeah
216:31 that's another story there so as
216:34 soon as we created it it just started it
216:35 didn't give us the option to say hey I
216:37 want to start running it notice on this
216:39 here it's going to do featurization so
216:41 that means it's automatically going to
216:42 select out features for us which is what
216:43 we wanted it's set up to do
216:45 regression we have some configuration
216:47 here the training time is 3 hours that doesn't
216:50 mean it's going to train for three hours
216:51 that's I guess its timeout
216:54 you could set a metric score
216:56 threshold so it has to meet at least
216:58 this to be successful if it's not going
217:00 to meet it it probably would quit out
217:01 early the number of cross
217:03 validations just makes sure the data is
217:05 good
217:06 good you can see blocked algorithm so tensor flow DNN tensor flow L regression
217:09 tensor flow DNN tensor flow L regression if it was using NN so deep learning
217:11 if it was using NN so deep learning neural network I probably would have
217:13 neural network I probably would have chose the GPU to see if it would go
217:15 chose the GPU to see if it would go faster um look at the primary metric
217:17 faster um look at the primary metric it's normalized root square uh root mean
217:20 it's normalized root square uh root mean Square AED sometimes on the exam they'll
217:22 Square AED sometimes on the exam they'll actually ask you like what's the prim
217:23 actually ask you like what's the prim metric for this thing so it's good to uh
217:26 metric for this thing so it's good to uh take a look and see what they actually
217:28 take a look and see what they actually use for that I'll probably be sure to um
217:30 use for that I'll probably be sure to um highlight that stuff in the actual
217:32 highlight that stuff in the actual lecture content um but this will take
217:34 lecture content um but this will take some time to run uh we have data guard
217:37 some time to run uh we have data guard rails it will actually not populate I
217:39 rails it will actually not populate I guess until We've ran it so so we'll
217:41 guess until We've ran it so so we'll just let it run and I'll see you back
217:42 just let it run and I'll see you back here when it's done okay all right so
217:44 here when it's done okay all right so after a very very very long wait our
217:46 after a very very very long wait our automl job is done it took 60 minutes so
217:49 automl job is done it took 60 minutes so using a larger instance didn't save me
217:51 using a larger instance didn't save me any time I don't know if maybe if I ran
217:53 any time I don't know if maybe if I ran a GPU instance it would be a lot faster
217:56 a GPU instance it would be a lot faster I'd be very curious to try that out but
217:57 I'd be very curious to try that out but not something for uh uh this
218:00 not something for uh uh this certification course so we go into here
218:02 certification course so we go into here and yeah the cheaper instance was the
218:04 and yeah the cheaper instance was the same amount of time so it probably just
218:05 same amount of time so it probably just needs gpus it really depends on the type
218:07 needs gpus it really depends on the type of models it's running so we have a
218:09 of models it's running so we have a bunch of different algorithms in here it
218:10 bunch of different algorithms in here it ran uh about 42 different models I
218:14 ran uh about 42 different models I thought last time I ran it I saw a lot
218:16 thought last time I ran it I saw a lot more but you can see there's all kinds
218:18 more but you can see there's all kinds of models that it's running and then
218:20 of models that it's running and then it's going to choose the top candidate
218:21 it's going to choose the top candidate so it chose voting Ensemble so Ensemble
218:25 so it chose voting Ensemble so Ensemble is um uh we don't cover really in the
218:27 is um uh we don't cover really in the course because it's gets too much into
218:29 course because it's gets too much into ml but Ensemble is when you actually use
218:31 ml but Ensemble is when you actually use two different weaker models and combine
218:34 two different weaker models and combine the results in order to make a more uh
218:37 the results in order to make a more uh uh powerful uh ml model okay um so here
218:41 uh powerful uh ml model okay um so here we'll get some explanation I tried this
218:43 we'll get some explanation I tried this before and I didn't get really good
218:45 before and I didn't get really good information so if we go
218:48 information so if we go here uh so like I don't have anything
218:50 here uh so like I don't have anything under model performance so this tab
218:52 under model performance so this tab requires array of predicted values from
218:54 requires array of predicted values from the model to be supplied we didn't
218:56 the model to be supplied we didn't Supply any so we don't get any data
218:58 Supply any so we don't get any data Explorer so select a cohort of the data
219:01 Explorer so select a cohort of the data that all the data is is we have here um
219:04 that all the data is is we have here um so like here we were seeing
219:06 so like here we were seeing age and I guess it's just giving us an
219:08 age and I guess it's just giving us an indicator about the age information um
219:12 indicator about the age information um use the slider to show descending
219:14 use the slider to show descending feature important select up to three
219:16 feature important select up to three cohorts to see the feature important SL
219:18 cohorts to see the feature important SL by
219:19 by side
219:22 side okay so I guess S5 and BM I don't know
219:26 okay so I guess S5 and BM I don't know what S5 is we'd have to look up the data
219:28 what S5 is we'd have to look up the data set be BMI is your body mass index so
219:30 set be BMI is your body mass index so that's a clear indicator as to what
219:32 that's a clear indicator as to what affects whether you have diabetes or not
219:34 affects whether you have diabetes or not so that makes sense age doesn't seem to
219:36 so that makes sense age doesn't seem to be a huge factor which is kind of
219:39 be a huge factor which is kind of interesting individual feature
219:41 interesting individual feature importance we can go here and just kind
219:42 importance we can go here and just kind of like narrow in and say okay well why
219:44 of like narrow in and say okay well why is this outlier over here and they're
219:45 is this outlier over here and they're like age 79 right so that's kind of
219:49 like age 79 right so that's kind of interesting to see that information so
219:51 interesting to see that information so it does give you some uh explanation as
219:54 it does give you some uh explanation as to to you know why things are why they
219:56 to to you know why things are why they are um over here we have a little bit
219:59 are um over here we have a little bit more different data this is kind of
220:01 more different data this is kind of interesting model
220:02 interesting model performance uh I don't know what I'm
220:04 performance uh I don't know what I'm looking at but like here it's over mean
220:06 looking at but like here it's over mean squared so it's that uh mean squared
220:08 squared so it's that uh mean squared calculation there again
220:21 okay so yeah it's something right but anyway the point is that we
220:23 finally get metrics so I guess we always
220:25 had to click there because that makes
220:27 more sense so yeah there's more
220:30 values here sure data
220:33 transformation illustrates the data
220:36 processing feature engineering scaling
220:37 techniques and the machine learning
220:38 algorithm AutoML used so if you were
220:40 a real data scientist all this stuff
220:42 would make sense to you I think just
220:45 with time it'll make sense but
220:46 even at this point I'm not sure and I
220:49 don't care about the model right if
220:50 you're building something for real I'm
220:51 sure the information becomes a lot
220:54 more valuable so this model is done
220:58 more valuable so this model is done uh and the idea is that we can deploy oops
221:00 and the idea is that we can deploy oops if we go back to the
221:02 if we go back to the actual uh
221:04 actual uh models oh because we actually went into
221:06 models oh because we actually went into them e so we go back to the um autom ml
221:11 them e so we go back to the um autom ml here I think you can deploy any model
221:14 here I think you can deploy any model that you like so I think you go here and
221:16 that you like so I think you go here and deploy this like if you prefer a
221:18 deploy this like if you prefer a different model you could deploy it um
221:20 different model you could deploy it um if we go into Data guard rails we kind
221:22 if we go into Data guard rails we kind of skipped over that this is a way it
221:24 of skipped over that this is a way it does automatic featurization so it's
221:26 does automatic featurization so it's extracting up the feature so it how it
221:28 extracting up the feature so it how it handles the splitting how it handles
221:31 handles the splitting how it handles missing features high card anality is
221:34 missing features high card anality is like if you have too much data it might
221:37 like if you have too much data it might have to do dimensionality reduction so
221:40 have to do dimensionality reduction so that's just saying like hey if this is a
221:42 that's just saying like hey if this is a problem maybe we would do some
221:44 problem maybe we would do some pre-processing or stuff to make it
221:46 pre-processing or stuff to make it easier to work with the data so if we're
221:48 easier to work with the data so if we're happy with this we can go ahead and
221:49 happy with this we can go ahead and deploy it so let's say um
221:53 deploy it so let's say um deploy just say infer my
221:58 deploy just say infer my diabetes here we have AKs and E
222:02 diabetes here we have AKs and E uh um Azure container instance let's do
222:05 uh um Azure container instance let's do Azure kubernetes uh kubernetes service
222:08 Azure kubernetes uh kubernetes service cuz we did the other one here um say uh
222:12 cuz we did the other one here um say uh diabetes prodad
222:15 diabetes prodad maybe um AKs
222:25 diabetes oh compute name sorry one of the inference ones okay so in
222:29 order to deploy this we would have to
222:31 create our pipeline I'm not sure if I
222:34 have enough in my quota here but let's
222:35 give it a go so I think what it's
222:37 wanting is one of these here
222:41 I think we'd want this wherever we
222:44 are right I'm not
222:48 sure where we are whether this is US East or
222:52 West here let's go
222:55 check
222:57 Studio
222:59 Azure Machine
223:07 Learning East US no I never did this before
223:10 I usually just use Azure Container
223:12 Instances but I'm just curious
223:15 here say
223:17 next my
223:24 diabetes-prod we
223:27 will need to choose some
223:34 nodes the number of nodes multiplied by the virtual machine's number of cores
223:35 must be greater than or equal to 12
223:38 must be greater or equal to 12 okay no again if you're not confident
223:41 okay no again if you're not confident like if you're concerned about cost you
223:42 like if you're concerned about cost you can just again watch you don't have to
223:44 can just again watch you don't have to do right um this is again a uh
223:48 do right um this is again a uh fundamental certification it's not super
223:50 fundamental certification it's not super important to get all the hands-on
223:52 important to get all the hands-on experience
223:53 experience yourself um but I'm just trying to
223:54 yourself um but I'm just trying to explore this so we can see right because
223:57 explore this so we can see right because I I don't care about costs it's not a
223:58 I I don't care about costs it's not a big deal to me on my machine here uh so
224:01 big deal to me on my machine here uh so probably I don't
224:08 have Sy pool must use a VM SKU with more than two cores and four gigabytes well
224:10 than two cores and four gigabytes well what did I
224:11 what did I choose did I not choose the right
224:24 again oh I chose three yeah that's
224:26 three yeah that's fair
224:36 um uh what did it want 12 cores said before I
224:48 details because it already exists based on that name a
224:50 on that name a to it's given us all this trouble a this
224:54 to it's given us all this trouble a this one we'll go ahead and delete you think
224:56 one we'll go ahead and delete you think like it wouldn't matter like I wouldn't
224:57 like it wouldn't matter like I wouldn't have to delete it out but that's
225:01 have to delete it out but that's fine this one failed now what's the
225:04 fine this one failed now what's the problem quota exceeded so I can't do it
225:07 problem quota exceeded so I can't do it because I don't I'd have to go make a
225:08 because I don't I'd have to go make a support request in reset so it's not a
225:11 support request in reset so it's not a real big deal um I guess what we could
225:13 real big deal um I guess what we could do is instead of doing it on AKs we
225:16 do is instead of doing it on AKs we could just deploy to container instance
225:18 could just deploy to container instance if it'll let us um notice I don't have
225:20 if it'll let us um notice I don't have to fill anything additional in it'll
225:22 to fill anything additional in it'll just deploy I
225:30 great and so I guess we'll let
225:32 that deploy and I'll see you back here
225:35 in a bit okay all right so I'm back here
225:37 checking up on
225:39 my AutoML here so we go over to compute
225:42 we go to inference clusters we
225:44 don't have anything under there if we go
225:46 over to our
225:50 experiments under our diabetes
226:05 here because we did choose to deploy
226:07 so it should have created an ACI instance let's make our way over to the
226:09 portal the reason it might not be showing
226:11 up is because I'm just running out of
226:15 compute because again it's a quota thing
226:17 it's not a big deal for us to get a
226:18 deploy it's not like we're going to do
226:19 anything with it but yeah so we can
226:22 see that we have a container over here
226:23 and it's
226:26 running so we must be able to see it if
226:29 we go to endpoints here ah here it is
226:31 right I was under models that's my
226:33 problem so pipeline endpoints that
226:35 would be something I think that if we
226:36 had deployed our designer I thought we
226:38 would have seen it under there but here
226:40 we have our binary pipeline or our
226:42 diabetes prod pipeline so if we wanted
226:45 to test data you know we could pass
226:47 stuff in here I think if we wanted to
226:50 kind of just see this in action I'm
226:51 not sure if it's going to work but we'll
226:53 give it a go so if we go into our sample
226:56 diabetes data set and we just explore
226:58 some of the data we should be able to
226:59 kind of select out some values because I
227:01 don't know what these values mean so
227:03 let's just say like
227:07 36 oops 36 but we already know that BMI
227:09 is the major factor here sex is
227:13 either one or two so we'll say two BMI
227:14 we'll say
227:18 25.3 the BP will be
227:46 83 or whatever oops 83
227:48 five
227:50 5.1 oh we're running out of
227:53 metrics here
227:56 82 wonder why it doesn't give us all of
227:58 them oh I guess it does it goes up to
228:01 six okay so let's go ahead and test that
228:02 see what we get and we got a result back
228:06 168 so that is AutoML all
228:07 complete there for
228:10 you yeah so there you
228:14 go
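Outside the Studio's test tab, you'd call the deployed real-time endpoint over HTTP. Here's a hedged sketch of building that request: the scoring URI is a placeholder you'd copy from your endpoint's details, and the `{"data": [...]}` payload shape and exact column names are assumptions you should check against what your deployed model expects.

```python
import json
import urllib.request

# Hypothetical scoring URI — substitute the one shown for your endpoint
scoring_uri = "http://<your-aci-endpoint>/score"

# One row of features, roughly the values entered in the test tab
payload = {"data": [{
    "AGE": 36, "SEX": 2, "BMI": 25.3, "BP": 83,
    "S1": 180, "S2": 105, "S3": 40, "S4": 5.1, "S5": 4.8, "S6": 82,
}]}
body = json.dumps(payload).encode("utf-8")
req = urllib.request.Request(
    scoring_uri, data=body, headers={"Content-Type": "application/json"}
)
# With a live endpoint you would uncomment the next two lines:
# response = urllib.request.urlopen(req)
# print(json.loads(response.read()))  # a predicted value, like the 168 above
```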
228:14 all right so let's take a look here
228:16 at the visual designer because it's a
228:18 great way to get started very
228:21 easily if you don't know what
228:23 you're doing and you want something a
228:25 little bit more advanced than AutoML
228:26 with some customization it's great to
228:28 start with one of these samples let's go
228:29 ahead and expand it and see what we have
228:31 here we have binary classification with
228:33 a custom Python script tune parameters
228:35 for binary
228:37 classification multiclass
228:39 classification so letter recognition
228:42 text classification all sorts of things
228:44 usually binary
228:45 classification is pretty easy I'm
228:47 looking for one that is pretty darn
228:49 simple let's go take a look here so
228:51 this says this sample shows how to use
228:52 filter-based feature selection to
228:54 select
228:56 features binary classification how
228:59 to predict customer
229:01 relationships using binary classes how
229:02 to handle imbalanced data sets using SMOTE
229:05 modules I'm not really worried about
229:07 balancing a customized Python script to
229:09 perform cost-sensitive binary
229:11 classification tune parameters so you
229:14 tune model parameters to find the best models during
229:17 the training process
229:18 the training process let's go with this one this one seems okay to me um and so
229:22 What you can see here is that it's using a sample dataset — I believe this is a sample — and if you wanted to see all of them, you could literally drag them out here and do things with them. I haven't actually built one end to end for this, and again, I don't think that's super important for this level of exam, but this just shows you there's a pre-built pipeline, and once you start to get the hang of ML and the full pipeline, this isn't too confusing.
229:47 At the beginning here we have our classification data, and then it selects columns in the dataset: it says exclude the columns work class, occupation, and native country, so it's doing some pre-processing by excluding that data. It might be interesting to look at that dataset, so if we go over to our Datasets tab it should show up here, I believe... maybe because we haven't submitted this yet, we can't see that dataset, but we'll look at it in a moment.
230:19 Then we want to clean our data. Here it's saying clean all the columns with a custom substitution value — let's see if we can find what it's substituting... it's not saying what. So it's cleaning missing data; I'm not sure exactly what it's cleaning out there, but I would suggest it's using some kind of custom script. I'm not sure where it is, but that's okay.
230:50 Next we have Split Data. It's pretty common to split your data so you have a training and a test dataset, and it's usually really good to randomize it — you want to randomize it, then split it — just so you get better results.
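Outside the designer, those two pre-processing steps — substitute missing values, then shuffle and split — look roughly like this in plain Python. The data is a toy stand-in, and the substitution value of 0 is illustrative, not what the sample's module actually uses:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy feature matrix with missing values (illustrative stand-in for the
# sample's census/income dataset).
X = np.array([[39, 40], [50, np.nan], [38, 40], [np.nan, 40],
              [28, 38], [45, 45], [31, 38], [52, 20]])
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])

# "Clean missing data" with a custom substitution value (here 0).
X = np.nan_to_num(X, nan=0.0)

# Randomize then split: shuffle=True reorders the rows before the 70/30 split,
# which is the "randomize it, then split it" step.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, shuffle=True, random_state=42)
```

With 8 rows and a 0.3 test fraction, this leaves 5 rows for training and 3 for testing.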
231:03 Then it has model hyperparameter tuning, so the idea is that it's going to use ML to figure out the best parameters. Over here we have the two-class decision tree, which does some work there; then it's going to score our model, and then evaluate our model to see if it's successful.
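The Tune Model Hyperparameters → Two-Class Decision Tree → Score Model → Evaluate Model chain corresponds roughly to a grid search over a decision tree in scikit-learn. This is a sketch on synthetic data, not the designer's exact algorithm or parameter grid:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data standing in for the sample's dataset.
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Tune Model Hyperparameters": search a small grid for the best tree settings.
search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [2, 4, 8], "min_samples_leaf": [1, 5, 10]},
    cv=3,
)
search.fit(X_train, y_train)              # tuning picks the best parameters
accuracy = search.score(X_test, y_test)   # "Score" + "Evaluate" on held-out data
```

`search.best_params_` then holds the winning combination, which is the same information the designer surfaces after the tuning module runs.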
231:22 So this is all set up to go. All we have to do is go to the top — there's a settings wheel here — and choose some type of compute. We have this one here, but it's for my diabetes run, so I'm going to go ahead and make a new one. It says "we recommend using a predefined configuration to quickly set up compute for training" — this one looks okay. I don't know if it needs two nodes, but I guess we can do this one, so we'll just name it "binary pipeline" and hit save. Hopefully it's making a good suggestion. We'll have to wait for that to spin up — it's going to take a little bit of time — so I'll see you back here in a moment.
232:09 All right, I got a message saying that it's ready. I think it was here, under my notebook instances... no, that's not it, but I definitely saw a popup on my screen — you might have seen it too; you'd have to be paying close attention. It says it's ready to go, so I'll make my way back over, select our compute — there's our binary pipeline — and there are some other options we're not going to fool around with. We'll go ahead and hit submit. We need a new experiment, so I'll just name it "binary pipeline" and hit submit.
232:51 Okay, so this is now running. After a little while these steps will start going green — this one's not started yet; we'll give it a moment just so we can see some kind of animation... and there it goes, it's off to the races. There's not much to do here, and this is going to take a while — I've never run this one in particular, so I don't know if it's an hour or 30 minutes. It's not that fun to watch, but it's cool that you get a visual illustration, so I'll see you back in a bit.
233:19 I just wanted to peek in here and take a look at how it's progressing, and you can see it's still going — it's just cleaning the data; it's still not done. I'm not sure how long this has been running, but if we go over to our Experiments, into the binary pipeline, and look at the run time, we're about 8 minutes in and it hasn't done a whole lot. It's still cleaning the data — I would have thought that would be a little faster. I'm used to AWS SageMaker, which doesn't usually take this long, but it's nice that it's going. We're almost out of the pre-processing phase; then we'll be on to the model tuning.
233:58 All right, after waiting a little while it looks like our pipeline is done. If we make our way over to Experiments and go to the binary pipeline, we can see it took 14 minutes and 22 seconds. We can go here and see some additional information, but there's nothing else really to see — we already watched all the steps run, and you can see them all here. There's nothing under Metrics; you enable metrics to log data points and compare them within and across runs, and we only did a single run, so there's nothing to compare. So let's say we're happy with this and we want to deploy this model.
234:33 What I'm going to do is go back to the designer and click back here, and now in the top-right corner we can create our inference pipeline. I can't remember if Submit is going to run it again — I don't want to run it again — I just want to create ourselves a real-time or batch pipeline, and we'll say real-time pipeline. What this does is actually create a completely different pipeline — here's a completely new one, specifically designed for deployment. The first one was for training the model; this one is for taking in data and doing inference. So we'll go ahead and submit this, put it under our binary pipeline experiment, and hit submit.
235:28 I believed we'd need a different kind of compute here — I'm surprised it's even running — but no, I guess it has a compute there, so it's going to run, and once it finishes running I believe we can go ahead and deploy it. Let's just wait for that to finish.
235:45 All right, after a little while there we've run our inference pipeline, so it's definitely something that's ready for use. The idea is that requests go through this web service input to this web service output, but that's not so important at this level of certification. Let's see what it looks like to deploy it. We have the option between a new real-time endpoint and an existing endpoint — we don't have an endpoint yet, so we'll just name it "binary pipeline"... oh, it wants it lowercase: "binary-pipeline". And we have the option between Azure Kubernetes Service and Azure Container Instances. I think it's a lot easier to deploy to a container instance, and we'd be waiting forever for Kubernetes to start up, so we're going to do a container instance. We have some options like SSL and things like that — not too worried about those — so we're just going to go ahead and hit deploy.
236:43 Okay, so that's going to go ahead and deploy. While we wait for this real-time inference, if we go over to our Compute — this would be under AKS, so I don't know if it will show up here; I've only seen things under this tab, and I think that one is for Azure Kubernetes Service, so I don't think we're going to see it show up there. However, we don't need the training compute running anymore, so we'll go ahead and delete the binary pipeline cluster, because we don't have any use for it right now and we might need to free it up for something else. Go ahead and delete it — we don't need it.
237:26 Coming back to our designer, I'm just trying to see where we can keep track of it... well, I know it's deploying — it says "waiting for real-time endpoint" — so I'll see you back here when this is done; it just takes a little bit of time.
237:45 All right, I think our deployment is done. If we make our way over to Endpoints, there it is: the binary pipeline. If we wanted to, we could test it right there, and it actually already has some pre-loaded data for us. If we hit Test — it's nice that it fills it in — we get some results back. We see things like Scored Labels, income, and Scored Probabilities, so it's giving back all the results. Note that the test input itself doesn't include Scored Labels or Scored Probabilities — those are the values we want back. So there are our endpoints, and that is the end of our exploration with the designer.
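For a two-class model like this one, the Scored Labels column is essentially the Scored Probabilities column pushed through a decision threshold — 0.5 is the usual default. A quick illustration with made-up probability values:

```python
# Scored Probabilities from a two-class model (values are illustrative).
scored_probabilities = [0.91, 0.18, 0.55, 0.47]

# Scored Labels: probability at or above the 0.5 threshold -> positive class.
scored_labels = [1 if p >= 0.5 else 0 for p in scored_probabilities]
# -> [1, 0, 1, 0]
```

In the income sample, the 1/0 labels would correspond to the two income classes the model was trained on.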
238:33 All right, let's take a look at what it would be like to actually train a job programmatically, through a notebook. Remember we saw these samples over here — this image classification MNIST sample, and MNIST is a very popular dataset for doing computer vision. These samples are really great: if you want to really learn, you should go through them and read them, because they're probably very useful. I've done a lot of this before, so for me it's not too hard to figure out, but I've actually never run this one, so let's run it together. Again, we want to be in JupyterLab, so you can go here and click it, or go to Compute if it's being a bit finicky. We'll get a tab open here and see how this goes. What I want to do is just make sure we're back here, and I'm going to click into this one.
239:20 We have a few notebooks here: there's part one, and then we have the deploy stage. Let's look at training — I don't know if we really need to deploy, but we'll give it a read. In this tutorial you train an ML model on remote compute resources, using the training and deployment workflow of the Azure Machine Learning service in a notebook — there are two parts to this. It uses the MNIST dataset and scikit-learn with the Azure Machine Learning SDK. MNIST is a popular dataset with 70,000 grayscale images; each image is a handwritten digit of 28 by 28 pixels, representing a number from 0 to 9. The goal is to create a multiclass classifier to identify the digit a given image represents.
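The same multiclass-digits idea can be tried locally in seconds with scikit-learn's small bundled digits set — 8x8 images rather than MNIST's 28x28, but the workflow (flattened pixel features in, digit 0-9 out) is the same:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Small stand-in for MNIST: 8x8 digit images, flattened to 64 pixel features.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# A multiclass classifier over the ten digit classes 0-9.
clf = LogisticRegression(max_iter=5000)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # typically well above 0.9 on this set
```

The tutorial's notebook does the same kind of thing, just with the full MNIST data and the training running on remote Azure compute instead of locally.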
240:04 We're going to learn a few things here, but let's just jump into it. The first thing is that we need to import our packages. `%matplotlib inline` just makes sure that when we plot things we actually see them; then we need numpy, matplotlib itself, and azureml.core, from which we import Workspace, since we'll need one. Then it checks the version, making sure we have the right one — this is 1.28.0. It's pretty common, even in AWS, to have a script in the notebook to update the SDK in case it's out of date; I'm surprised it isn't included here, but that's okay. By the way, we're using the Python 3.6 Azure ML kernel — if this is the future, they might have retired the old one and you're using 3.8, but it should generally work; if it's in their samples, I assume they try to maintain it.
240:53 Next, connect to a workspace: this creates a workspace object from the existing workspace by reading the file config.json. We'll run that — I assume it's kind of like a session — and here it says it found our workspace. So really it's not creating a workspace; it's just returning the existing one so that we have it as a variable. Then, create an experiment — that's pretty clear; we saw experiments in AutoML and the designer — so we'll just hit run. We give it a name and create the Experiment object. I wonder if it actually created one yet — let's go over to Experiments and see if it's there. It is there — cool, that was fast. I thought it would print something out, but it didn't do anything there.
241:35 Next: create or attach an existing compute resource, using Azure Machine Learning Compute, a managed service for data scientists, etc., etc. So, create a compute — creation takes about five minutes, so let's see what it's trying to create. There are some environment variables it wants to load in; I'm not sure how those get set in Jupyter or how they get fed in, but it doesn't matter, because they all have defaults. Here it says CPU cluster, zero minimum and four maximum nodes, and it's going to use a Standard_D2_v2, which is the cheapest one we can run. I kind of want something a little more powerful for myself, just because I want this to be done a lot sooner, but if you don't have a lot of money, just stick with what's there.
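The "defaulting" the notebook does is just `os.environ.get` with a fallback value, which is why the cell works even though we never exported anything. A sketch of that pattern — the exact variable names here are assumptions modeled on the tutorial, so treat them as illustrative:

```python
import os

# Read compute settings from environment variables, falling back to
# defaults when they aren't set (variable names are assumed/illustrative).
compute_name = os.environ.get("AML_COMPUTE_CLUSTER_NAME", "cpu-cluster")
compute_min_nodes = int(os.environ.get("AML_COMPUTE_CLUSTER_MIN_NODES", 0))
compute_max_nodes = int(os.environ.get("AML_COMPUTE_CLUSTER_MAX_NODES", 4))
vm_size = os.environ.get("AML_COMPUTE_CLUSTER_SKU", "STANDARD_D2_V2")
```

With nothing set in the environment, this yields the zero-to-four-node Standard_D2_v2 cluster the notebook describes.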
242:32 If we go here, I just want to see what our options are... hmm: "you don't have enough quota for the following VM sizes". That's probably because I'm running more than one VM right now. Yes, I've hit my quota, so I'd probably have to request more. I think this is the one I'm using — what's the difference with this Standard_Dv2? Same vCPUs... it's the same one, right? So, request a quota increase — I don't know if that's instant or not; I'd have to make a support ticket, and that's going to take too long. The reason is that I'm running the AutoML and the designer follow-alongs in the background at the same time, trying to create all the workshops at once. So what I'll do is come back when I'm not running one of those other ones and continue on — we're just here at the step where we want to create a new compute.
243:47 create a a new uh compute okay all right so I'm back and I freed up uh one of my
243:49 so I'm back and I freed up uh one of my compute instances if I go over here now
243:51 compute instances if I go over here now I just have uh the one uh cluster
243:54 I just have uh the one uh cluster instance for my uh automl but what we'll
243:57 instance for my uh automl but what we'll do here is again just read through this
243:58 do here is again just read through this so this will create a CPU cluster 0 to
244:00 so this will create a CPU cluster 0 to four nodes um standard D2 V2 I guess
244:03 four nodes um standard D2 V2 I guess we'll just stick with what what is here
244:06 we'll just stick with what what is here um just reading through here look look
244:08 um just reading through here look look like it tries to find the compute Target
244:10 like it tries to find the compute Target it's going to provision it it will
244:12 it's going to provision it it will create the cluster call Pool for minimum
244:14 create the cluster call Pool for minimum numbers of nodes for specific time so
244:16 numbers of nodes for specific time so wait for completion so we'll go ahead
244:18 wait for completion so we'll go ahead and hit play and so that's going to go
244:22 and hit play and so that's going to go and create us a new cluster so we're
244:24 and create us a new cluster so we're just going to have to wait a little
244:25 just going to have to wait a little while here for it to create about 5
244:27 while here for it to create about 5 minutes and I'll see you back here in a
244:28 minutes and I'll see you back here in a moment all right so uh the cluster
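The cluster creation cell being run here can be sketched with the v1 Azure ML Python SDK. The cluster name and the local fallback below are my assumptions, not from the notebook; the function only talks to Azure when you hand it a real workspace.

```python
# Sketch of provisioning the 0-to-4-node CPU cluster (Azure ML Python SDK v1).
# The cluster name "cpu-cluster" is an assumption, not taken from the notebook.
CLUSTER_CFG = {"vm_size": "STANDARD_D2_V2", "min_nodes": 0, "max_nodes": 4}

def get_or_create_cluster(ws=None, name="cpu-cluster", cfg=CLUSTER_CFG):
    """Reuse an existing cluster, or create one and wait for it.

    With ws=None (no Azure workspace available) it just returns the
    config dict so the sizing can be inspected locally.
    """
    if ws is None:
        return cfg
    from azureml.core.compute import AmlCompute, ComputeTarget
    from azureml.core.compute_target import ComputeTargetException
    try:
        return ComputeTarget(workspace=ws, name=name)  # already provisioned
    except ComputeTargetException:
        pc = AmlCompute.provisioning_configuration(
            vm_size=cfg["vm_size"],
            min_nodes=cfg["min_nodes"],
            max_nodes=cfg["max_nodes"],
        )
        target = ComputeTarget.create(ws, name, pc)
        target.wait_for_completion(show_output=True)  # the roughly 5-minute wait
        return target
```

Starting min_nodes at 0 is what lets the cluster scale down to nothing when idle, which is why it costs nothing between runs.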
244:31 moment all right so uh the cluster started up if we go back over here we
244:32 started up if we go back over here we can see that it's confirmed I don't know
244:34 can see that it's confirmed I don't know why it uh was so quick but uh it went
244:36 why it uh was so quick but uh it went pretty quick there so we're on the next
244:38 pretty quick there so we're on the next section here explore the data so
244:39 section here explore the data so download the mnist data set display some
244:41 download the mnist data set display some sample images so it's just talking about
244:44 sample images so it's just talking about it being the open data set the code
244:46 it being the open data set the code retrieves in the file data set object
244:48 retrieves in the file data set object which is a subass of data set file data
244:50 which is a subass of data set file data set references a single or multiple
244:52 set references a single or multiple files of any format in your data store
244:54 files of any format in your data store the class provides you with the ability
244:56 the class provides you with the ability to download or amount files to your
244:57 to download or amount files to your computer by creating a reference to the
244:59 computer by creating a reference to the data source location Additionally you
245:01 data source location Additionally you register the data set to your workspace
245:03 register the data set to your workspace for easy retrieval during training
245:06 for easy retrieval during training there's a bit more how-tos but we'll
245:07 there's a bit more how-tos but we'll give it good read here so we have the
245:08 give it good read here so we have the open data set mnist it's kind of nice
245:11 open data set mnist it's kind of nice that they have that reference there uh
245:13 that they have that reference there uh so we have a data folder we make the
245:15 so we have a data folder we make the directory we are getting the data set we
245:18 directory we are getting the data set we download it and then we are registering
245:22 download it and then we are registering it so let's go ahead and run that not
245:24 it so let's go ahead and run that not sure how fast that is shouldn't take too
245:26 sure how fast that is shouldn't take too long as it's running we'll go over here
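The download-and-register cell boils down to something like this. Only the local folder creation actually runs without Azure; the registered dataset name here is an assumption.

```python
import os

# Create the local data folder the notebook downloads into.
data_folder = os.path.join(os.getcwd(), "data")
os.makedirs(data_folder, exist_ok=True)

# The Azure-side steps need the azureml-opendatasets package and a
# Workspace `ws`; the name "mnist-opendataset" is an assumption:
#   from azureml.opendatasets import MNIST
#   mnist_file_dataset = MNIST.get_file_dataset()      # a FileDataset
#   mnist_file_dataset.download(data_folder, overwrite=True)
#   mnist_file_dataset = mnist_file_dataset.register(
#       workspace=ws, name="mnist-opendataset",
#       description="MNIST training and test files",
#       create_new_version=True)
```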
245:29 as it's running, we'll go over to the left-hand side, refresh, and see if it appears. Not as of yet... there it is
245:39 going into here, maybe, to explore the data. I'm not sure what it would look like, because these are all images, right? Yeah, they're in ubyte.gz, so they're compressed files; we're not going to be able to see within them, but they're definitely there
245:54 so that is now registered into our dataset. Display some sample images: load the compressed files into numpy arrays, then use matplotlib to plot 30 random images from the dataset
246:07 note this step requires the load_data function, which is included in utils.py, and this file is included in the sample folder. We have it over here, we just double-click, a very simple file, the load_data, and we'll go ahead and run that
246:23 that and it's pretty pretty simple here uh so load
246:26 pretty pretty simple here uh so load data X train X test it are we setting up
246:29 data X train X test it are we setting up our training and testing data here it
246:31 our training and testing data here it kind of looks like it because it says
246:33 kind of looks like it because it says train and test data that's when we
246:34 train and test data that's when we usually see that kind of
246:36 usually see that kind of split um and again it's doing a random
246:38 split um and again it's doing a random split so that sounds pretty good to me
246:41 split so that sounds pretty good to me uh let's show some randomly chosen
246:42 uh let's show some randomly chosen images yeah so I guess they do set up
246:45 images yeah so I guess they do set up the training data here and then down
246:47 the training data here and then down below we're actually showing the images
246:49 below we're actually showing the images so here's some random images train on a
246:51 so here's some random images train on a remote cluster so for this task you
246:53 remote cluster so for this task you submit the job to run on the remote
246:54 submit the job to run on the remote training cluster to set up earlier
246:56 training cluster to set up earlier submit your
246:57 submit your job um create the directory create a
247:00 job um create the directory create a training script create a script for run
247:02 training script create a script for run configuration submit the job so first we
247:04 configuration submit the job so first we will create our
247:06 will create our directory
247:08 directory um and notice it created this directory
247:10 um and notice it created this directory over
247:11 over here because I guess it's going to put
247:13 here because I guess it's going to put the training file in there and so this
247:14 the training file in there and so this will actually write to a training file
247:16 will actually write to a training file this makes uh quite a bit of sense so if
247:19 this makes uh quite a bit of sense so if we click into here it should now have a
247:21 we click into here it should now have a training file it'll just give it a quick
247:23 training file it'll just give it a quick read see what's going on here so a lot
247:25 read see what's going on here so a lot of times when you create these training
247:27 of times when you create these training files you have to do and this is the
247:28 files you have to do and this is the same if you're using AWS like when
247:30 same if you're using AWS like when you're creating tra like or sagemaker um
247:33 you're creating tra like or sagemaker um you create a train file because it's
247:34 you create a train file because it's part of Frameworks it's just how the
247:35 part of Frameworks it's just how the Frameworks work but you'll have uh these
247:37 Frameworks work but you'll have uh these arguments uh so it could be like
247:40 arguments uh so it could be like parameters to run for training um uh and
247:44 parameters to run for training um uh and there could be a whole sorts of ones
247:47 there could be a whole sorts of ones here here they're loading in the
247:49 here here they're loading in the training and testing data so it's the
247:51 training and testing data so it's the same stuff we saw earlier when we were
247:53 same stuff we saw earlier when we were just viewing the
247:56 just viewing the data um here it's doing a logistic
248:00 data um here it's doing a logistic regression it's using Li uh so linear
248:03 regression it's using Li uh so linear maybe linear learning model there it's
248:05 maybe linear learning model there it's doing
248:06 doing multiclass on that there and so what
248:08 multiclass on that there and so what it's going to do is fit so fit is
248:10 it's going to do is fit so fit is actually performing the training and
248:13 actually performing the training and then what it's going to do is make a
248:14 then what it's going to do is make a prediction on the test Set uh then it's
248:17 prediction on the test Set uh then it's going we're going to get accuracy so
248:19 going we're going to get accuracy so we're getting kind of a score so notice
248:20 we're getting kind of a score so notice that it's using accuracy uh as a
248:24 that it's using accuracy uh as a evaluation metric I suppose right and
248:28 evaluation metric I suppose right and then at the end we're going to dump the
248:29 then at the end we're going to dump the data a lot of times like you have to
248:31 data a lot of times like you have to save the model somewhere so they're
248:33 save the model somewhere so they're outputting the actual weights of the
248:35 outputting the actual weights of the neural network and all other stuff it's
248:36 neural network and all other stuff it's a plk file I don't know what that is but
248:39 a plk file I don't know what that is but if you're using like tensor flow you
248:40 if you're using like tensor flow you would use tensor flow serving at the end
248:42 would use tensor flow serving at the end of this a lot of times uh Frameworks
248:44 of this a lot of times uh Frameworks will like Pi P torch or tensor flow or
248:47 will like Pi P torch or tensor flow or mxnet they'll have a serving layer um
248:50 mxnet they'll have a serving layer um but uh since we're just using S kit
248:52 but uh since we're just using S kit learn which is very simple it's just
248:53 learn which is very simple it's just going to dump out uh that file into our
248:56 going to dump out uh that file into our outputs this is going to probably run a
248:58 outputs this is going to probably run a container so this outputs isn't going to
249:00 container so this outputs isn't going to necessarily be on um the outputs into
249:03 necessarily be on um the outputs into here it's more like the outputs of the
249:05 here it's more like the outputs of the container and um
249:08 container and um a lot of times the container will then
249:10 a lot of times the container will then place this somewhere so like it'll be
249:11 place this somewhere so like it'll be saved on The Container but it'll be
249:13 saved on The Container but it'll be passed out to the register or or
249:15 passed out to the register or or something like that like model registry
249:17 something like that like model registry so anyway we ran this and so that
249:18 so anyway we ran this and so that generated the file we don't want to keep
249:20 generated the file we don't want to keep on running this multiple times I
249:21 on running this multiple times I probably would just overwrite the file
249:22 probably would just overwrite the file so it's not a big deal here it says
249:25 so it's not a big deal here it says notice how the script gets saved in the
249:26 notice how the script gets saved in the data model so here it's saying the data
249:28 data model so here it's saying the data uh data folder I guess we didn't look at
249:30 uh data folder I guess we didn't look at that so we go top here um I didn't see
249:35 that so we go top here um I didn't see this is data
249:36 this is data folder was it wasn't really paying
249:38 folder was it wasn't really paying attention to where that
249:39 attention to where that was guess it looks like where more so
249:42 was guess it looks like where more so it's loading the data in so here it
249:44 it's loading the data in so here it saves the data outut anything written to
249:46 saves the data outut anything written to the strory is automatically uploaded to
249:47 the strory is automatically uploaded to your workspace so I guess that's just
249:49 your workspace so I guess that's just how it works so it probably will end up
249:51 how it works so it probably will end up in here then um so util pii reference
249:54 in here then um so util pii reference the training script to load the data set
249:56 the training script to load the data set correctly and copy the file over
249:58 correctly and copy the file over so um we will run this to copy the file
250:04 so um we will run this to copy the file over so I'm guessing did it put it into
250:06 over so I'm guessing did it put it into here I'm just yeah so just put it in
250:08 here I'm just yeah so just put it in there because when it actually uh
250:10 there because when it actually uh packages it for the container it's going
250:12 packages it for the container it's going to bring that file over because it's a
250:14 to bring that file over because it's a dependency
250:16 dependency so configure the training job so create
250:19 so configure the training job so create a script run config the directory that
250:22 a script run config the directory that contains the script the compute Target
250:23 contains the script the compute Target the training script training file Etc
250:26 the training script training file Etc sometimes like in other Frameworks
250:27 sometimes like in other Frameworks they'll just call them estimators but
250:29 they'll just call them estimators but here it's just called a script run
250:31 here it's just called a script run config
250:33 config so uh I'm just trying to see what it's
250:36 so uh I'm just trying to see what it's doing so sidekit learn is the dependency
250:40 doing so sidekit learn is the dependency okay sure we'll just hit
250:42 okay sure we'll just hit run okay and then down below here we
250:46 run okay and then down below here we have script run
250:48 have script run config so it looks like we're passing
250:51 config so it looks like we're passing our arguments so we're saying this is
250:53 our arguments so we're saying this is our data folder which is apparently here
250:56 our data folder which is apparently here we're mounting it and then we're setting
250:58 we're mounting it and then we're setting regularization to
251:00 regularization to 0.5 sometimes you'll pass inde
251:02 0.5 sometimes you'll pass inde dependencies in here as well I guess
251:04 dependencies in here as well I guess these are technically are our parameters
251:06 these are technically are our parameters that are getting configured up here at
251:08 that are getting configured up here at the top right but sometimes you'll have
251:11 the top right but sometimes you'll have dependencies if you're in uh including
251:14 dependencies if you're in uh including other files here uh and I guess that's
251:17 other files here uh and I guess that's up here right so see where it says
251:19 up here right so see where it says environment and then we're saying
251:21 environment and then we're saying include the Azure ml defaults and the
251:23 include the Azure ml defaults and the pyit learn and stuff like that and so
251:26 pyit learn and stuff like that and so then it gets passed in the EnV so that
251:28 then it gets passed in the EnV so that makes sense to me we haven't ran that
251:29 makes sense to me we haven't ran that yet because we don't see any number here
251:32 yet because we don't see any number here submit the job to the Clusters let's go
251:34 submit the job to the Clusters let's go ahead and do
251:36 ahead and do that
251:38 that so it says it returns a preparing or
251:39 so it says it returns a preparing or running State as soon as the job is
251:41 running State as soon as the job is completed so it's in a starting
251:49 State monitor remote run so in total the the first run takes 10 minutes but the
251:51 the first run takes 10 minutes but the second run uh is uh as long as the
251:53 second run uh is uh as long as the dependencies in Azure ml firment don't
251:55 dependencies in Azure ml firment don't change the same images reused and hence
251:57 change the same images reused and hence the start here start time is much faster
252:00 the start here start time is much faster here's what's happening while you wait
252:01 here's what's happening while you wait the image creation a Docker image is is
252:03 the image creation a Docker image is is created matching the python environment
252:05 created matching the python environment specified by the azl environment
252:07 specified by the azl environment the image is built and stored in the ACR
252:10 the image is built and stored in the ACR the Azure container registry associated
252:12 the Azure container registry associated with your workspace let's go take a look
252:14 with your workspace let's go take a look and see if that's the case because
252:16 and see if that's the case because sometimes like resources aren't visible
252:18 sometimes like resources aren't visible to you so I'm just curious do we
252:20 to you so I'm just curious do we actually see
252:21 actually see it
252:23 it okay and yep there it is okay so they
252:26 okay and yep there it is okay so they did not lie
252:29 did not lie um so associated with your workspace
252:32 um so associated with your workspace image creation uploading takes about 5
252:34 image creation uploading takes about 5 minutes this stage happens once for each
252:36 minutes this stage happens once for each python environment since the container's
252:38 python environment since the container's cach subsequent runs during image
252:40 cach subsequent runs during image creation logs are stem to the Run
252:41 creation logs are stem to the Run history you can monitor the image
252:43 history you can monitor the image creation Pro process using these logs
252:46 creation Pro process using these logs wherever those are if you if the remote
252:48 wherever those are if you if the remote cluster requires more nodes to execute
252:50 cluster requires more nodes to execute the Run than currently available
252:51 the Run than currently available additional nodes are out added
252:53 additional nodes are out added automatically scaling T typically takes
252:55 automatically scaling T typically takes about five minutes and I've seen this
252:57 about five minutes and I've seen this before where if you're in your compute
252:58 before where if you're in your compute here uh sometimes it'll just say like
253:00 here uh sometimes it'll just say like scaling because it's just not
253:03 scaling because it's just not enough so uh running into the stage the
253:06 enough so uh running into the stage the necessary Scripts and files are sent to
253:08 necessary Scripts and files are sent to the compute Target then the data stores
253:09 the compute Target then the data stores are amounted copied the entry script is
253:11 are amounted copied the entry script is run so entry script is actually the
253:13 run so entry script is actually the train.py file while the job is running
253:16 train.py file while the job is running SD out and the files is in the logs
253:18 SD out and the files is in the logs directory or stem to the Run history you
253:21 directory or stem to the Run history you can monitor the runs progress using
253:22 can monitor the runs progress using these
253:23 these logs the dot outputs directory of the
253:26 logs the dot outputs directory of the run is copied over to the Run history in
253:28 run is copied over to the Run history in your workspace so you can access these
253:30 your workspace so you can access these results you can check the progress of a
253:32 results you can check the progress of a running job in multiple ways this
253:33 running job in multiple ways this tutorial uses the Jupiter widget so
253:36 tutorial uses the Jupiter widget so looks like we can uh run this watch the
253:39 looks like we can uh run this watch the progress so maybe we'll run that and so
253:42 progress so maybe we'll run that and so it's actually showing us the progress
253:43 it's actually showing us the progress that's kind of cool I really like
253:45 that's kind of cool I really like that so it's just a little widget
253:47 that so it's just a little widget showing us all the things that it's
253:49 showing us all the things that it's doing let's go take a look and see what
253:52 doing let's go take a look and see what we can see under experiments and our run
253:54 we can see under experiments and our run pipeline because it was talking about
253:56 pipeline because it was talking about things like outputs and things like that
253:58 things like outputs and things like that so over here in the outputs and logs I'm
254:00 so over here in the outputs and logs I'm just
254:02 just curious is if this is the same
254:06 curious is if this is the same thing
254:13 I'm not sure if this tails... yeah, it does tail, it just moves, so we can actually monitor it from here. I guess that's what it was talking about
254:18 so here we can see that it's setting up Docker, it's actually building a Docker image
254:26 and then, I'm not sure, did it send it to... I mean, it's on ACR already. It looks like it's still downloading and extracting packages, so maybe it's actually running on the image now, so we'll just wait there
254:39 if we pop back over here, we can see probably the same information. Is it identical? Yeah, it is
254:45 so we're 3 minutes in. It's probably not that fun to watch it in real time and talk about it, so let's just wait until it's done and I'll see you back then
254:54 just wait until it's done I'll see you back then okay all right so I'm uh about
254:57 back then okay all right so I'm uh about 17 minutes in here I'm not seeing any
254:59 17 minutes in here I'm not seeing any more uh movement here so it could be
255:01 more uh movement here so it could be that it is done it does say if you run
255:03 that it is done it does say if you run this next step here it will wait for
255:05 this next step here it will wait for completion um
255:07 completion um specify show output to true for verbose
255:10 specify show output to true for verbose log so here actually did output a moment
255:13 log so here actually did output a moment ago so maybe it actually was done um but
255:16 ago so maybe it actually was done um but I just ran it twice so I'm not sure if
255:19 I just ran it twice so I'm not sure if that's going to cause me uh issues there
255:23 that's going to cause me uh issues there so because I can't run the next step
255:25 so because I can't run the next step unless I stop this um can I individually
255:28 unless I stop this um can I individually cancel this one
255:35 here uh I think I can just hit interrupt the kernel there there we
255:37 hit interrupt the kernel there there we go okay so I think that it's done okay
255:40 go okay so I think that it's done okay because it's 18 minutes in and I don't
255:41 because it's 18 minutes in and I don't see any more logging in here it's just
255:43 see any more logging in here it's just not very clear and also uh the logs we
255:46 not very clear and also uh the logs we just have a lot of stuff going on here
255:48 just have a lot of stuff going on here like it's just so much so you know if
255:51 like it's just so much so you know if we're keeping keeping Pace we probably
255:53 we're keeping keeping Pace we probably would have saw all these created yeah so
255:54 would have saw all these created yeah so another we just had a few more outputs
255:56 another we just had a few more outputs there but uh I think that it's done
256:05 okay it's just there's nothing definitively saying like done
256:07 definitively saying like done do you know what I'm saying and then up
256:08 do you know what I'm saying and then up here it doesn't say oh oh I guess it
256:10 here it doesn't say oh oh I guess it does say that it's done all right so
256:12 does say that it's done all right so yeah I just never ran it with this tool
256:14 yeah I just never ran it with this tool so I just don't know so I guess it does
256:17 so I just don't know so I guess it does definitively say that I already ran this
256:19 definitively say that I already ran this so we don't need to run that again I
256:21 so we don't need to run that again I just feel like we'll get stuck there so
256:23 just feel like we'll get stuck there so let's take a look at the
256:25 let's take a look at the metrics so regularization rate is 0.5
256:28 metrics so regularization rate is 0.5 accuracy is nine so N9 is pretty good
256:31 accuracy is nine so N9 is pretty good the last step is training the script
256:32 the last step is training the script wrote in the output uh uh s SK learn I
256:37 wrote in the output uh uh s SK learn I want to see if it's actually in our
256:38 want to see if it's actually in our environment
256:40 environment here I don't think it is so outputs is
256:43 here I don't think it is so outputs is somewhere it's in our workspace
256:44 somewhere it's in our workspace somewhere but it's just not uh we just
256:47 somewhere but it's just not uh we just don't oh it's right here okay so it
256:49 don't oh it's right here okay so it outputed the actual model right there um
256:52 outputed the actual model right there um and
256:54 and so you can see the associated files that
256:56 so you can see the associated files that are ran okay we'll run
256:59 are ran okay we'll run it register the work model in space so
257:01 it register the work model in space so you can work with other collaborators
257:03 you can work with other collaborators sure so if I click on that here and we
257:06 sure so if I click on that here and we go back over to our models it is now
257:09 go back over to our models it is now registered over here
257:12 registered over here okay and so we're done part one I don't
257:16 okay and so we're done part one I don't want to do all these other parts um
257:17 want to do all these other parts um training is enough as it is but let's
257:19 training is enough as it is but let's just take a look at the deploy stage
257:22 just take a look at the deploy stage okay so for
257:27 prerequisites uh we're setting up a workspace we have our we are loading our
257:30 workspace we have our we are loading our registered
257:32 registered model okay we register it we have to
257:34 model okay we register it we have to import packages we are going to
257:39 import packages we are going to um create scoring
257:42 um create scoring script deploy to an ACI model test the
257:45 script deploy to an ACI model test the model if you want to do this you can go
257:47 model if you want to do this you can go through all the steps it does talk about
257:48 through all the steps it does talk about a confusion Matrix and that is something
257:50 a confusion Matrix and that is something that can show up on the exam is actually
257:52 that can show up on the exam is actually talking about a confusion Matrix but we
257:54 talking about a confusion Matrix but we do cover that in lecture content so you
257:56 do cover that in lecture content so you generally understand what that is but um
257:58 generally understand what that is but um you know I'm just I'm too tired I don't
258:00 you know I'm just I'm too tired I don't want to run through all this there's not
258:01 want to run through all this there's not a whole lot of value other than reading
258:03 a whole lot of value other than reading reading through it yourself here um so I
258:05 reading through it yourself here um so I think we're all done here
258:08 think we're all done here [Music]
258:12 okay, one service we forgot to check out was data labeling, so let's go over there and give that a go. I'm going to go ahead and create ourselves a new project, I'll say "my labeling project"
258:22 and we can say whether we want to classify images or text. We have multi-class, multi-label, bounding box, and segmentation. Let's go with multi-class
258:31 I'll go back here for a second... multi-class, whoops. I don't know if we create a dataset here, but we could probably upload some local files
258:41 let's say "my Star Trek dataset". It doesn't let us choose the image file type here; it'd be nice if these were images
258:58 it's very finicky, this input here. "A FileDataset references a single or multiple files in your datastore or public URLs", okay, so we'll go next
259:08 if we can upload files directly, that'd be nice. Ooh, upload a folder, I like that. So what we'll do is, we do have some images in the free AI stuff here, under cognitive services, in assets
259:23 we'll go back here, and I think objects would be the easiest. Oh, but we just want a folder, right, so yeah, we'll just take objects. Yep, we'll upload the 17 files, and yep, we'll just let it stick to that path, that seems fine to me
259:50 me we will go ahead and create it and so now we have a data set there
259:52 we'll go ahead and select that data set
259:53 we'll say next your data set is
259:55 periodically checked for new data points
259:57 any data points will be added as tasks
259:59 it doesn't matter we're only doing this
260:01 for a test uh enter the list of labels so
260:03 we have um uh TNG
260:08 DS9 uh
260:11 Voyager TOS those are the types of Star
260:15 Trek
260:16 episodes um label which um Star Trek
260:30 series the image is from say next I don't want it enabled but you can
260:33 have the uh assisted labeler enabled
260:36 I'm going to say no we'll create the
260:43 project okay and I'll just wait for that to create and I'll see you back here in
260:45 a moment okay all right so I'm back here
260:47 actually I didn't have to wait long I
260:49 think it instantly runs I just assumed
260:51 like I was waiting for a state that says
260:52 completed but it's not something we have
260:54 to do so uh we have 0 out of 17 progress
260:57 we're going to go in here we're going to
260:58 go label some data we can view the
261:01 instructions it's not showing up here
261:03 but that's fine if we go to tasks we can
261:04 start labeling so which season or
261:07 series is this from this is Voyager we'll hit submit
261:09 this is Voyager we'll hit submit this is
261:12 TOS we'll hit submit this is
261:15 TNG this is
261:17 TNG this is
261:19 DS9
261:21 DS9
261:23 Voyager
261:25 Voyager uh
261:28 TNG
261:29 DS9 you get the idea though and you got
261:32 some options here like change the
261:33 contrast if someone can't see the photo
261:35 or rotate it this is
261:37 Voyager
261:39 Voyager uh
261:41 TNG
261:43 DS9 uh
261:45 Voyager
261:47 Voyager and we're done so we'll go back
261:50 to our labeling job here we'll see we
261:52 have the breakdown there uh and now our
261:55 data set is
261:56 labeled we can export our data set as CSV
261:59 COCO or Azure ML data set I believe that
262:02 means it'll go back into the data sets
262:04 over here which will make our lives a
262:06 little bit easier we go back to data
262:10 labeling okay so if you just granted people
262:13 access to the studio they'd be able to
262:14 just go in here and jump into that
262:16 job okay uh if we go over to the data
262:18 set I believe we should have a labeled
262:20 version of it now so my labeling project
262:23 so I believe that is the uh the labeled
262:26 stuff here
262:28 right yeah so it's labeled so there you
262:31 go we're all done with Azure machine learning
262:33 uh and so all that's left is to do some
262:35 cleanup
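For reference, the COCO export option mentioned above produces a JSON document with `images`, `categories`, and `annotations` arrays. A minimal sketch of reading such an export back and tallying labels per series (the sample document below is a hand-made stand-in for illustration, not an actual Azure export):

```python
from collections import Counter

# Minimal stand-in for a COCO-format classification export:
# "images" lists the files, "categories" the label set, and each
# "annotation" ties an image id to a category id.
coco = {
    "images": [{"id": 1, "file_name": "ep1.jpg"}, {"id": 2, "file_name": "ep2.jpg"}],
    "categories": [{"id": 1, "name": "TNG"}, {"id": 2, "name": "Voyager"}],
    "annotations": [{"image_id": 1, "category_id": 2}, {"image_id": 2, "category_id": 1}],
}

def label_counts(coco_doc: dict) -> Counter:
    """Count how many images were tagged with each label name."""
    names = {c["id"]: c["name"] for c in coco_doc["categories"]}
    return Counter(names[a["category_id"]] for a in coco_doc["annotations"])

print(label_counts(coco))
```

A real export would be loaded with `json.load` from the file the Studio writes out; the structure of the three arrays is what makes COCO easy to feed into other training tooling.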
262:40 [Music] okay so we're all done with Azure
262:42 machine learning uh if we want to we can
262:44 go to our compute and just uh kill the
262:47 services we have here now if we go to
262:49 the resource group and delete everything
262:50 it'll take all these things down
262:51 anyway but I'm just going to be a bit
262:54 paranoid so I'm going to just manually
262:55 do this
263:06 delete okay and so we'll go back to portal.azure.com
263:12 and uh I'm going to go to my resource groups and everything
263:15 should be all contained within my studio
263:17 just be sure to check these other ones
263:18 for that and we can see all the stuff
263:20 that we spun up we'll go ahead and hit
263:22 delete resource group um I don't know if
263:25 it includes like because I don't see
263:28 like container registry right so I know
263:30 like it puts stuff
263:32 there I guess it does it says container
263:34 registry so that's pretty much
263:35 everything right
263:37 and that'll take down everything and
263:39 if you're paranoid what you can do is go
263:40 to all resources and double check over
263:42 here because if there's anything running
263:44 it will show up here okay um but that's
263:47 pretty much it and so just delete and
263:49 we're all done
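If you'd rather script that paranoid double-check than click through All Resources, you could dump the CLI output of `az resource list --output json` and scan it. A hedged sketch (the resource names and resource group name below are made up for illustration; the `resourceGroup` field is part of the standard `az resource list` JSON output):

```python
import json

# Pretend this string came from: az resource list --output json
# After deleting the resource group, nothing should still report it.
sample_output = """
[
  {"name": "mystudio-registry", "type": "Microsoft.ContainerRegistry/registries",
   "resourceGroup": "my-studio"},
  {"name": "unrelated-vm", "type": "Microsoft.Compute/virtualMachines",
   "resourceGroup": "other-group"}
]
"""

def leftovers(az_json: str, resource_group: str) -> list:
    """Names of resources still reported in the given resource group."""
    resources = json.loads(az_json)
    return [r["name"] for r in resources if r["resourceGroup"] == resource_group]

print(leftovers(sample_output, "my-studio"))  # ['mystudio-registry']
```

An empty list back for your resource group means the delete really did take everything down, which is exactly the check being done by hand in the portal here.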