This content is a comprehensive course overview and tutorial for the Azure AI Fundamentals (AI-900) certification, covering essential AI and machine learning concepts on Azure, including practical demonstrations of various Azure AI services.
hey this is Andrew Brown and I'm
bringing you another certification
course and this time it's the Azure AI
fundamentals also known as the AI 900
and if you're looking to pass a
certification we have everything that
you need here such as Labs lectures and
a free practice exam so you can go ace that
exam get that uh certification put it on
your resume and LinkedIn to go get that
job you've been looking for um if you
want to support more free courses like
this one the best way is to purchase the
additional uh paid materials where you
can get access uh to more practice exams
and other resources if you don't know me
um I've taught a bit of everything uh
here on the cloud with AWS
Azure GCP DevOps Terraform Kubernetes
you name it I've taught it but uh you
know you know the drill here let's get
into it and learn more about the Azure AI
fundamentals hey this is Andrew Brown
from exam Pro and we are at the start of
our journey here learning about the AI
900 asking the most important question
which is what is the AI 900 so the Azure
AI fundamental certification is for
those seeking an ml role such as AI
engineer or data scientist the
certification will demonstrate if a
person can Define and understand Azure
AI services such as Cognitive Services
and Azure Applied AI Services AI
concepts knowledge mining responsible AI
basics of ML pipelines classical ML
models AutoML generative AI workloads
which is newly added content and Azure
AI studio so you don't need to know
super complicated ml knowledge here but
it definitely helps to get you through
there so this certification is generally
referred to by its course code the AI
900 and it's the natural path for the
Azure AI engineer or Azure data
scientist certification this generally
is an easy course to pass and it's great
for those new to cloud or ml related
technology looking at our road map you
might be asking okay well what are the
paths and what should I learn first so
here are a few suggested routes if you
already have your AZ-900 that's a
great starting point before you take
your AI 900 if you don't have your
AZ-900 you can jump right into the AI 900
but I strongly recommend you go get that
AZ-900 because it gives you general
foundational knowledge it's just another
thing that you should not have to worry
about which is just how to use Azure at
a fundamental level do you need the
dp900 to take the AI 900 no but a lot of
people seem to like to go this route
where they want to have that data
Foundation before they move on to the AI
900 because they know that the broad
knowledge is going to be useful so
it's an apt pairing that you see a lot of
people getting the AI 900 and the dp900
together for the AI 900 the path is a
little bit more clear it's either going
to be data scientists or AI engineer so
for the AI engineer you have to know how
to use the AI services in and out for
data scientists it's more focused on
setting up actual pipelines and things
like that within Azure machine learning
so you just have to decide which path is
for you the data scientist is definitely
harder than the AI engineer so if you
aren't ready for the data science some
people like taking the AI engineer first
and then doing the data scientists so
this is kind of like a warmup again it's
not 100% necessary but it's just based
on your personal learning style and a
lot of times people like to take the
data engineer after the data scientists
just to round out their complete
knowledge now if you already have the
AZ-900 and the Administrator Associate you
can safely go to the data scientist if
you want to risk it because this one is
really hard so if you've passed the
AZ-104 you know you're going to probably
have a lot more confidence learning up
about all the concepts at this level
here but of course it's always
recommended to go grab these
foundational certs because sometimes
course materials just do not cover the
information and so the obvious stuff is
going to get left out okay so moving
forward here how long should you study
to pass for the AI 900 if you're
entirely new to ml Ai and Cloud
providers such as Azure you should
anticipate dedicating around 15 hours to
grasp the basics this estimate can vary
based on your familiarity with these
concepts for complete beginners the time
commitment might extend to 20 to 30
hours for the intermediate level so
people that have passed the a900 or
dp900 you're looking at around 8 to 10
hours if you have one or more years of
experience with Azure or another cloud
service provider like AWS or GCP you're
looking at about 5 hours or less the
average study time is about 8 hours
this is where you should be committing
50% of the time to the lecture in labs
and 50% for the practice exams the
recommended study time is 30 minutes to
an hour a day for 14 days this should
get you through it but just don't
overstudy and just don't spend too
little time what does it take to pass
the exam well you got to watch the
lectures and memorize key information do
Hands-On labs and follow along with your
own Azure account I'd say that you could
probably get away with just watching all
the videos in this one without having to
do the labs but again it really does
reinforce that information if you do
take the time there is some stuff that
is in Azure AI Studio or machine
learning you might be wary of launching
instances because we do have to run
instances and they will cost money
but if you delete the instances after
use you will only incur very small costs so if
you feel that you're not comfortable
with that by just watching the videos
you should be okay but when you get into
the associate tier you absolutely have
to expect to pay something to learn and
take that risk you want to do paid
online practice exams that simulate the
real exam as I've mentioned before I do
provide a free practice exam and have
paid practice exams that accompany this
course that are on my platform exam Pro
and that's how you can help support more
of these free courses so can you pass
this certification without taking a
practice exam well Azure is a little
bit harder if this was an AWS exam I
would say yes but for Azure exams like
AI 900 DP-900 and SC-900 probably not it's
kind of risky I think you should do at
least one practice exam or go through
the sample one there's a sample one
probably laying around on the Azure
website let's take a look at the exam
guide breakdown here and then in the
following video we'll look at in more
detail so it's broken down into the
following domains so the exam has five
domains of questions and each domain has
its own weighting which determines how
many questions in a domain will
show up so 15 to 20% will be describe
AI workloads and considerations 20 to
25% will consist of describe fundamental
principles of machine learning on Azure
15 to 20% will consist of describe
features of computer vision workloads on
Azure 15 to 20% will be describe
features of natural language processing
workloads on Azure and 15 to 20% will be
describe features of generative AI
workloads on Azure I want you to notice
it says describe these domains this is
good because that tells you it's not
going to be super hard if you start
seeing things that say Beyond describe
and identify then you know it's going to
be a bit harder so where do you take
this exam well you can take it in person
at a test center or online from the
convenience of your own home so there's
two popular test centers there's
Certiport and there's Pearson VUE you can
also take it at a local test center if
there are nearby locations the term
Proctor means a supervisor or person
that is monitoring you while you're
taking the exam if I had the the option
between in person or online I would
always choose the in person because it's
a controlled environment and it's way
less stressful online there are many
things that can go wrong but it's up to
your personal preference and your
situation the passing grade here is 700
out of 1,000 so that's around 70% I
would say around because you could
possibly fail with 70% because these
things work on scaled scoring for
response types there's about 37 to 47
questions and you can afford to get
about 10 to 13 questions wrong so some
questions are worth more than one point
some questions cannot be skipped and the
format of questions can be multiple
choice multiple answer drag and drop and
hot area there shouldn't be any case
studies for foundational level exams and
there's no penalty for wrong questions
so for the duration you get 1 hour that
means about 1 minute per question the
time for this exam is 60 minutes your seat
time is 90 minutes seat time refers to the
amount of time that you should
allocate for that exam so this includes
time to review the instructions read and
accept the NDA complete the exam and
provide feedback at the end this
certification is going to be valid
forever and it does not expire Microsoft
fundamental certifications such as the
AZ-900 or MS-900 do not expire as long as
the technology is still available or
relevant so we'll proceed to the full
exam guide now

hey this is Andrew Brown from exam
Pro and what we've pulled up here is the
official exam outline on the Microsoft
website if you want to find this yourself
you just have to type in AI 900 Azure or
Microsoft you should be able to easily
find it the page looks like this what I
want you to do is scroll on down because
we're looking for the AI 900 study
guide and from there we're going to
scroll on down to the skills measured
section and you might want to bump up
the text Azure loves updating their
courses with minor updates that don't
generally affect the outcome of the
study here but it does get a lot of
people worried because they always say
well is your course out of date so no
they're just making minor changes
because they'll do this like five times
a year and so if there was a major
revision what would happen is they would
change it so instead of being the AI 900
it would be like the AI 9001 or 9002
similar to how the AI 102 was previously
AI 100 but now it's the AI 102 so just
watch out for those and if it's a major
revision then yes it would probably need
a completely new course so there aren't
any major changes with the new update
other than the update for the generative
AI workloads on Azure section A couple
of name changes and a few things being
removed everything else remains
relatively the same with very minor
changes so the concepts and such are
still up to date overall I think the
exam is easier than before so let's go
through some of the topics and work our
way through here so describe Ai
workloads and considerations so here
we're just kind of describing the
generalities of AI so content moderation
workloads involve filtering out
inappropriate or harmful content from
user generated inputs ensuring a safe
and positive user experience
personalization workloads analyze user
behavior and preferences to tailor
content recommendations or experiences
to individual users computer vision
workloads involve the analysis of images
and videos to recognize patterns objects
faces and actions identify natural
language processing knowledge mining
document intelligence and features of
generative AI workloads note that these
are all just Concepts you don't need to
know how to use the services at a high
level then you have the responsible AI
section so Microsoft has these six
principles that they really want you to
know and they push it throughout all
their AI services so those are the six
you'll need to know and they're not that
hard to
learn moving on we have described
fundamental principles of machine
learning on
Azure so here it's just describing
regression classification clustering and
features of deep learning we have a lot
of practical experience with these in
the course so you will understand at the
end what these are used for next we have
core machine learning Concepts we can
identify features and labels in a data
set so that's the data labeling service
describe how training validation data
sets are used in machine learning so
we'll touch on that describe
capabilities of Automated machine
learning automl simplifies building and
picking the best models while data and
compute Services provide the power you
need for training with Azure machine
learning it helps with managing and
deploying your models letting you put
your machine learning projects into
action smoothly under computer vision
workloads we have image classification
object detection optical character
recognition facial detection and facial
analysis solutions next we have Azure AI Vision
Azure AI face detection and Azure AI
video indexer the Azure AI Services
Encompass a wide range of tools designed
to facilitate the development of
intelligent applications these Services
used to be called computer vision custom
vision face service and form recognizer
but have evolved or been grouped under
broader service categories to streamline
their application and integration into
projects for NLP we have key phrase
extraction entity recognition sentiment
analysis language modeling speech
recognition synthesis this one doesn't
really appear much it's kind of a
concept not so much something we have to
do and then there's
translation so now we have Azure tools
and services for NLP workloads these
include the Azure AI language service
Azure AI speech service and Azure AI
translator service these used to be
separate Services I believe like the
Text Analytics service LUIS the Speech
service and the Translator Text service but
they have been added to the Azure AI
umbrella of AI services and now we'll be
moving on to the generative AI workloads
on Azure we'll be covering features of
generative AI models common scenarios
for generative Ai and responsible AI
considerations for generative Ai and
also some of the cool features that
Azure OpenAI Service has to offer such
as natural language generation code
generation and image
generation so that's about a general
breakdown of the AI 900 exam guide

hey this is Andrew Brown from exam
Pro and we are looking at the layers of
machine learning so here I have this
thing that looks like kind of an onion
and what it is it's just describing the
relationship between these uh ml terms
uh uh related to Ai and we'll just work
our way through here starting at the top
so artificial intelligence also known as
AI is when machines that perform jobs
that mimic human behavior so it doesn't
describe uh how it does that but it's
just the fact that that's what AI is uh
one layer underneath we have machine
learning so machines that get better at
a task without explicit programming uh
then we have deep learning so these are
machines that have an artificial neural
network inspired by the human brain to
solve complex problems and if you're
talking about someone that actually
assembles either ml or deep learning
uh models or algorithms that's a data
scientist so a person with
multi-disciplinary skills in math
statistics predictive modeling machine
learning to make future predictions so
what you need to understand is that AI
is just the outcome right and so AI
could be using ml underneath or deep
learning or a combination of both or
just if-else statements okay
all right so let's take a look here at
the key elements of AI so AI is the
software that imitates human behaviors
and capabilities and there are key
elements according to Azure or Microsoft
as to what makes up AI so let's go
through this list quickly here so we
have machine learning which is the
foundation of an AI system that can
learn and predict like a human you have
anomaly detection so detect outliers or
things out of place like a human
computer vision be able to see like a
human natural language processing also
known as NLP be able to process human
languages and infer context you know
like a human and conversational AI be
able to hold a conversation with a human
so you know I wrote here according to
Microsoft and Azure because you know the
the global definition is a bit different
but I just wanted to put this here
because I've definitely seen this as an
exam question and so we're going to have
let's define what is a data set so a
data set is a logical grouping of units
of data that are closely related to or
share the same data structure and there
are publicly available data sets that
are used in uh learning of Statistics
data analytics and machine learning I
just want to cover a couple here so the
first is the MNIST database so images of
handwritten digits used to test classify
cluster image processing algorithms
commonly used when learning uh how to
build computer vision ml models to
translate handwriting into digital text so it's
just a bunch of handwritten uh numbers
and letters and then another very
popular data set is the common objects
in context Coco data set so this is a
data set which contains many common
images using a JSON file the COCO format
that identifies objects or segments within
an image uh and so this data set has a
lot of stuff in it so object
segmentation recognition in context and
superpixel stuff segmentation
they have a lot of images and a lot of
objects uh so there's a lot of stuff in
there so why am I talking about this and
in particular Coco data sets well when
you use um Azure machine Learning Studio
it has a data labeling service and
um the thing is that uh it can
actually export out into Coco formats
that's why I wanted you to get exposure
to what Coco was and the other thing is
is that when you're building out Azure
machine learning uh pipelines you uh
they actually have open data sets which
we'll see later in the course um that
shows you that you can just use very
common ones and so uh you might see MNIST
and uh the other one there uh so I just
wanted to get you some exposure
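since the COCO format keeps coming up here's a minimal hand-written sketch of what a COCO-style JSON file looks like the file name IDs and values are all made up for illustration

```python
import json

# a minimal sketch of a COCO-format annotation file, assuming a single
# made-up image containing one labeled object; field names follow the
# published COCO object detection format
coco = {
    "images": [
        {"id": 1, "file_name": "street.jpg", "width": 640, "height": 480}
    ],
    "categories": [
        {"id": 18, "name": "dog", "supercategory": "animal"}
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,                 # which image this label belongs to
            "category_id": 18,             # which category the object is
            "bbox": [120, 200, 150, 100],  # [x, y, width, height] in pixels
            "area": 15000,
            "iscrowd": 0,
        }
    ],
}

# serializing it like this is roughly what a data labeling export produces
text = json.dumps(coco, indent=2)
print(text[:40])
```

the three top-level lists (images, categories, annotations) joined by IDs are the core of the format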
okay let's talk about data labeling so
this is the process of identifying raw
data so images text files videos and
adding one or more meaningful and
informative labels to provide context so
a machine learning model can learn so
with supervised machine learning
labeling is a prerequisite to produce
training data and each piece of data
will generally be labeled by a human the
reason why I say generally here is
because with azure's uh data labeling
Service uh they can actually do ml
assisted labeling uh so with
unsupervised machine learning labels
will be produced by the machine and may
not be human readable uh and then one
other thing I want to touch on is the
term called Ground truth so this is a
properly labeled data set
that you can use as the objective
standard to train and assess a given
model is often called Ground truth the
accuracy of your trained model will depend
on the accuracy of your ground truth now
using Azure's tools I've never seen them use
the word ground truth I see that a lot
in AWS and even this graphic here is
from AWS but uh I just want to make sure
you are familiar with all that stuff
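to make the labeling and ground truth idea concrete here's a tiny sketch in Python the file names and labels are made up

```python
# a made-up sketch of what labeled training data (ground truth) looks
# like: each piece of raw data gets a meaningful label attached by a human
raw_images = ["img_001.jpg", "img_002.jpg", "img_003.jpg"]

# data labeling: pair every raw item with its label
ground_truth = {
    "img_001.jpg": "cat",
    "img_002.jpg": "dog",
    "img_003.jpg": "cat",
}

# the labeled pairs are what a supervised model trains on
training_data = [(img, ground_truth[img]) for img in raw_images]
print(training_data[0])
```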
okay let's compare supervised
unsupervised and reinforcement learning
starting at the top we got supervised
learning this is where the data has been
labeled for training and it's considered
task driven because you are trying to
make a prediction get a value back so
when the labels are known and you want a
precise outcome when you need a specific
value returned and so you're going to be
using classification and regression in
these cases for unsupervised learning
this is where data that has not been
labeled uh the ml model needs to do its
own labeling this is considered data
driven it's trying to recognize a
structure or a pattern and so this is
when the labels are not known and the
outcome does not need to be precise when
you're trying to make sense of data so
you have clustering dimensionality
reduction and association if you've never
heard the term dimensionality reduction before the
idea is it's trying to reduce the amount
of Dimensions to make it easier to work
with the data so make sense of the data
right uh we have reinforcement learning
so this is where there is no data
there's an environment and an ml model
generates data uh and makes many
attempts to reach a goal so this is
considered uh decision driven and so
this is for game AI learning tasks robot
navigation when you've seen someone code
a video game that can play itself that's
what this is if you're wondering this is
not all the types of machine learning uh
and specifically unsupervised
and supervised are considered classical
machine learning because they heavily
rely on statistics and math to produce
the outcome uh but there you go

so what is a neural network well it's
often described as mimicking the brain
it's a neuron or node that represents an
algorithm so data is inputted into a
neuron and based on the output the data
will be passed to one of many connected
neurons the connections between neurons
is weighted I really should have
highlighted that one that's very
important uh the network is organized
into layers there will be an input layer
uh one to many hidden layers and an
output layer so here's an example of a
very simple neural network notice the NN
a lot of times you'll see this in ml as
an abbreviation for neural networks and
sometimes neural networks are just
called neural Nets so just understand
that's the same term here what is deep
learning this is a neural network that
has three or more hidden layers it's
considered deep learning because at this
point it's uh it's not human readable to
understand what's going on within those layers

what is feed forward so neural
networks where they have connections
between nodes that do not form a cycle
they always move forward so that just
describes uh a forward pass through
the network you'll see FNN which stands
for feed forward neural network just to
describe that type of network uh then
there's back propagation which in feed
forward uh networks this is
where we move backwards through the
neural net adjusting the weights to
improve the outcome on next iteration
this is how a neural net learns the way
the back propagation knows to do this is
that there's a loss function so a
function that compares the ground truth
to the prediction to determine the error
rate how bad the network performs so
when it gets to the end it's going to
perform that calculation and then it's
going to do its back propagation and
adjust the weights um then you have
activation functions I'm just going to
uh clear this up here so activation
functions uh they're an algorithm
applied to a hidden layer uh node that
affects connected output so for this
entire hidden layer they'll all have the
same uh one here and it just kind of
affects uh how it learns and like how
the weighting works so it's part of back
propagation and just the learning
process there's a concept of dense so when
the next layer increases the amount of
nodes and you have sparse so when the
next layer decreases the amount of nodes
anytime you see something going from a
dense layer to a sparse layer that's
usually called dimensionality reduction because you're
reducing the amount of Dimensions
because the amount of nodes in your
network determines the dimensions you have
okay
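the loss function and back propagation ideas above can be sketched with a single made-up neuron one weight and no activation function learning y = 3x by gradient descent

```python
# a minimal sketch of the learn-by-back-propagation loop described above,
# assuming a single neuron with one weight and no activation function:
# forward pass -> loss function -> adjust the weight -> repeat

def forward(w, x):
    return w * x  # forward pass: data flows one way through the "network"

def loss(pred, truth):
    return (pred - truth) ** 2  # compares the prediction to ground truth

# ground truth: we want the neuron to learn y = 3x
x, y_true = 2.0, 6.0
w = 0.0    # starting weight
lr = 0.1   # learning rate: how big each adjustment is

for _ in range(50):
    y_pred = forward(w, x)
    grad = 2 * (y_pred - y_true) * x  # derivative of the loss w.r.t. w
    w = w - lr * grad                 # back propagation step: adjust the weight

print(round(w, 3))  # converges to 3.0
```

each iteration makes the loss smaller which is exactly the "getting smarter with each iteration" idea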
what is a GPU well it's a graphics
processing unit that is specially
designed to quickly uh render high
resolution images and videos
concurrently gpus can perform parallel
operations on multiple sets of data so
they are commonly used for non-graphical
tasks such as machine learning and
scientific computation so a CPU has an
average of four to 16 processor cores a
GPU can have thousands of processor
cores so something that has 4 to 8 GPUs
could have as many as 40,000 cores
here's an image I grabbed right off the
Nvidia website and so it really
illustrates very well uh like how this
would be really good for machine
learning or uh neural networks because neural
networks have a bunch of nodes they're
very repetitive tasks if you can spread
them across a lot of cores that's going
to work out really great so gpus are
suited uh for repetitive and highly
parallel computing tasks such as
rendering Graphics cryptocurrency mining
deep learning and machine learning

before we talk about CUDA let's talk about what Nvidia is
so Nvidia is a company that manufactures
graphical processor units for gaming and
professional markets if you play video
games you've heard of Nvidia so what is
Cuda it is the compute unified device
architecture it is a parallel Computing
platform and API by Nvidia that allows
developers to use CUDA-enabled GPUs for
general purpose computing on GPUs so
GPGPU all major deep learning frameworks
are integrated with Nvidia deep uh
learning SDK the Nvidia uh deep learning
SDK is a collection of Nvidia libraries
for deep learning one of those libraries
is the Cuda deep neural network library
so cuDNN so cuDNN provides highly
tuned implementations for standard
routines such as forward and backward
convolution convolution is really great
for um uh uh computer vision pooling
normalization activation layers uh so
you know in the Azure certification uh
for the AI 900 uh they're not going to
be talking about Cuda but if you
understand these two things you'll
understand why GPUs really matter

okay all right let's get an easy
introduction into machine learning
pipeline so this one is definitely not
an exhaustive one and we're definitely
going to see more complex ones uh
throughout this course but let's get to
it here so starting on the left hand
side we might start with data labeling
this is very uh important when you're
doing supervis learning because you need
to label your data so the ml model can
learn by example during training uh this
stage and the feature engineering stage
are is considered pre-processing because
we are preparing our data to be trained
for the model uh when we move on to
feature engineering the idea here is
that ml models can only work with
numerical data so you'll need to
translate it into a format that it can
understand so extract out the important
data that the ml model needs to focus on
okay uh then there's the training steps
so your model needs to learn how to
become smarter it will perform multiple
iterations getting smarter with each iteration
uh you might also have a hyperparameter
tuning uh step here it says tunning but
it should say tuning um but the ml model
can have different parameters so you can
use ml to try out many different
parameters to optimize the outcome when
you get to deep learning it's impossible
to tweak the parameters by hand so you
have to use hyperparameter tuning then
you have serving sometimes known as
deploying uh but you know when we say
deploy we talk about the entire pipeline
not necessarily just the the ml model
step so we need to make an ml model
accessible so we serve it by hosting in
a virtual machine or container uh when
we're talking about Azure um machine
learning it's either going to be an
Azure kubernetes service or Azure
container instance and you have uh
inference so inference is the active
request uh of requesting to make a
prediction so you send your payload with
either CSV or whatever and you get back
the results you have a real time
endpoint and batch processing so real
time is just that well batch can be real
time as well but generally it's slower
but the idea is am I making a
single item prediction or am I giving
you a bunch of data at once and again
this is a very simplified ml pipeline
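the pipeline stages above can be sketched as plain Python functions everything here the fruit data the one-number threshold "model" and the function names is made up for illustration

```python
# a toy sketch of the simplified ML pipeline above, assuming a tiny
# made-up fruit classifier; real pipelines use real frameworks

def label_data(raw):
    # data labeling: a human attaches a meaningful label to each item
    return [({"weight": w}, label) for w, label in raw]

def engineer_features(labeled):
    # feature engineering: turn each item into the numbers the model needs
    return [([item["weight"]], label) for item, label in labeled]

def train(dataset):
    # training: "learn" a threshold weight that separates the two classes
    apples = [f[0] for f, lbl in dataset if lbl == "apple"]
    melons = [f[0] for f, lbl in dataset if lbl == "melon"]
    return (max(apples) + min(melons)) / 2  # the "model" is one number

def serve(model):
    # serving: expose the trained model behind a callable "endpoint"
    def endpoint(payload):
        # inference: a request that asks the model for a prediction
        return "apple" if payload["weight"] < model else "melon"
    return endpoint

raw = [(150, "apple"), (180, "apple"), (1200, "melon"), (900, "melon")]
endpoint = serve(train(engineer_features(label_data(raw))))
print(endpoint({"weight": 200}))  # real-time endpoint: single item prediction
```

a batch version would just loop that endpoint over a whole file of items at once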
I'm sure we'll revisit ml pipeline later
in this course

so let's compare the uh the terms
forecasting and prediction so
forecasting you make a prediction with
relevant data it's great for analysis of
Trends uh and it's not guessing and when
you're talking about prediction this is
where you make a prediction without
relevant data you use statistics to
predict future outcomes it's more of
guessing and it uses decision Theory so
imagine you have a bunch of data and the
idea is you're going to infer from that
data okay maybe it's a maybe it's B
maybe it's C and for prediction you
don't have really much data so you're
going to have to uh kind of invent it
and the idea is that you'll figure out
what the outcome is there these are
extremely broad terms but you know just
so you have a high-level view of these two things okay
so what are performance or evaluation
metrics well they are used to evaluate
different machine learning algorithms
the idea is uh you know when your
machine learning makes a prediction
these are the metrics you're using to
evaluate to determine you know is your
ml model working as you intended so for
different types of problems different
metrics matter this is absolutely not an
exhaustive list I just want you to get
you exposure to these uh words and
things uh so that when you see them you
go okay I'll come back here and refer to
this uh but lots of these you're just
it's not it's not necessarily to
remember but classification metrics you
should know so classification we have
accuracy precision recall F1 score ROC
and AUC for regression metrics we have MSE
RMSE MAE ranking metrics we have MRR
DCG and NDCG statistical metrics we have
correlation computer vision metrics we
have PSNR SSIM IoU NLP metrics we have
perplexity BLEU METEOR ROUGE deep
learning related metrics we have
Inception Score and I cannot say this
person's name but I'm assuming it's a
person the Fréchet Inception Distance
and there are two categories of
evaluation metrics we have internal
evaluations so metrics used to evaluate
the internals of an ml model so accuracy
F1 score Precision recall I call them
the famous four used across all kinds of
models and uh external evaluation
metrics used to evaluate the final
prediction of an ml model so yeah uh
don't get too worked up here I know
that's a lot of stuff uh the ones that
matter we will see again
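the famous four can be computed from made-up counts of true and false positives and negatives

```python
# a sketch of the "famous four" classification metrics, computed from
# made-up counts of true/false positives and negatives
tp, fp, fn, tn = 8, 2, 1, 9  # e.g. predictions of "rainy" vs "sunny"

accuracy = (tp + tn) / (tp + fp + fn + tn)  # how often we were right overall
precision = tp / (tp + fp)  # of the positives we predicted, how many were real
recall = tp / (tp + fn)     # of the real positives, how many we caught
f1 = 2 * precision * recall / (precision + recall)  # balance of the two

print(accuracy, round(precision, 2), round(recall, 2), round(f1, 2))
```

notice precision and recall pull in different directions which is why F1 exists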
okay let's take a look at Jupyter
notebooks so these are web-based
applications for authoring documents
that combine live code narrative text
equations visualizations uh so if you're
doing data science or you're building ml
models you absolutely are going to be
working with jupyter notebooks they're
always integrated into uh cloud service
providers ml tools um uh so jupyter
notebook actually came about from
IPython so IPython is the precursor of
it and they extracted that feature out
it became jupyter notebook IPython is
now a kernel uh to run uh python so when
you execute out python code here it's
using IPython which is an interactive version of
python uh jupyter notebooks were
overhauled and better integrated into an
IDE called JupyterLab which we'll talk
about here in a moment and you generally
want to open notebooks in Labs the
legacy web-based interface is known as
Jupyter classic notebooks so this is
what the old one looks like you can
still open them up but everyone uses
JupyterLab now okay so let's talk
about JupyterLab JupyterLab is the
next generation web-based user interface
all familiar features of the classic
Jupyter notebook are in a flexible
powerful user interface it has notebooks
a terminal a text editor a file browser
rich outputs JupyterLab will
eventually replace the classic
Jupyter notebooks so there you go

we keep mentioning regression but
let's talk about it in uh more detail
here so we kind of understand the
concept so regression is the process of
finding a function to equate a labeled
data set notice it says labeled that
means it's going to be for supervised
learning into a continuous variable
number so another way to say it is
predict this variable in the future. The future just means that continuous variable; it doesn't have to be time, but that's just a good example for regression. So what will the temperature be next week, will it be 20 Celsius? How would we determine that? Well, we would have vectors, dots that are plotted on a graph that has multiple dimensions; the dimensions could be greater than just X and Y, you could have many more. And then you have a regression line.
This is the line that's going through our data set, and that's going to help us figure out how to predict the value. So how would we do that? Well, we would need to calculate the distance of a vector from the regression line, which is called an error, and different regression algorithms use the error to predict future variables. So just to look at this
graphic here: so here is our regression line, and here is a dot, a vector, a piece of information, and this distance from the line, the actual distance, is what we're going to use in our ml model. If we were to plot another line up here, we would compare this line to all the other lines, and that's how we'd find similarity. What we'll commonly see for this is mean squared error, root mean squared error, and mean absolute error: MSE, RMSE, and MAE.
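Those three error metrics are easy to sketch in plain Python; the actual and predicted temperatures below are made-up numbers, just for illustration:

```python
import math

def regression_errors(actual, predicted):
    # The per-point error is the distance between the ground truth
    # and what the regression line predicted for that point.
    errors = [a - p for a, p in zip(actual, predicted)]
    mse = sum(e ** 2 for e in errors) / len(errors)   # Mean Squared Error
    rmse = math.sqrt(mse)                             # Root Mean Squared Error
    mae = sum(abs(e) for e in errors) / len(errors)   # Mean Absolute Error
    return mse, rmse, mae

# Made-up temperatures: actual vs. what the regression line predicted
mse, rmse, mae = regression_errors([20, 22, 19], [21, 21, 18])
```

Because MSE squares the errors before averaging, it punishes large errors much harder than MAE does, which is the practical difference between these metrics.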
okay let's take a closer look at the
concepts of classification so
classification is the process of finding a function to divide a labeled data set (so again, this is supervised learning) into classes or categories: predict a category to apply to the inputted data. So
will it rain next Saturday will it be
sunny or rainy so we have our data set
and the idea is we're drawing through
this a classification line to divide the
data set. So with regression we were measuring the distance of the vectors to the line, and with this one it's just what side of the line it is on: if
it's on this side then it's sunny if
it's on this side it's rainy okay for
classification algorithms we've got logistic regression, decision trees, random forests, neural networks, naive Bayes, K-nearest neighbors (also known as KNN), and support vector machines (SVMs).
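That "which side of the line" intuition is literally how a linear classifier decides; here's a toy sketch (not any particular Azure API) where the decision line and the sample points are made up:

```python
def classify(point, weights, bias):
    # The decision line is weights[0]*x + weights[1]*y + bias = 0;
    # the sign of the score tells us which side the point falls on.
    score = weights[0] * point[0] + weights[1] * point[1] + bias
    return "sunny" if score >= 0 else "rainy"

# Made-up decision line (the diagonal x = y) and two sample points
label_a = classify((3, 1), weights=(1, -1), bias=0)  # one side of the line
label_b = classify((1, 3), weights=(1, -1), bias=0)  # the other side
```

A real training algorithm like logistic regression just learns good values for `weights` and `bias` from the labeled data instead of us picking them by hand.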
okay let's take a closer look at
clustering so clustering is the process
of grouping unlabeled data so unlabeled
data means it's unsupervised learning
based on similarity and differences so
the outcome could be grouping data based on similarities or differences (I guess it's the same description up here), but
imagine we have a graph and we have data
and the idea is we draw boundaries
around that to see similar groups: so maybe we're recommending purchases to Windows users or recommending purchases to Mac users. Now remember, this is unlabeled data, so the label is being inferred, or they're just saying these things are similar, right? So
for clustering algorithms we've got K-means, K-medoids, density-based, and hierarchical clustering.
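K-means, the first algorithm on that list, can be sketched in a few lines of plain Python; the one-dimensional "purchase amount" data below is made up and falls into two obvious groups:

```python
import random

def kmeans(points, k, iters=10, seed=0):
    """Group unlabeled 1-D points into k clusters by similarity (K-means)."""
    random.seed(seed)
    # Start with k randomly chosen points as the cluster centers
    centroids = random.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

# Made-up purchase amounts that fall into two obvious groups
clusters = kmeans([1, 2, 3, 100, 101, 102], k=2)
```

Notice there are no labels anywhere: the grouping comes purely from the distances between points, which is exactly what makes this unsupervised learning.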
okay hey this is Andrew Brown from exam Pro and we're looking at the confusion matrix. This is a table to visualize the model predictions (the predicted) versus the ground truth labels (the actual), also known as an error matrix, and
they are useful for classification
problems to determine if our classification is working as we think it is. So imagine we have a question: how
many bananas did this person eat or
these people eat and so we have this
kind of a box here where we have
predicted versus actual and it's really
comparing the ground truth and what the
model predicted, right? And so on the exam they'll ask you questions where they might not even say yes or no, maybe just zero and one, and they'll want you to identify, say, the true positives. The idea is they won't show you the labels here, but you know one and one would be a true positive and zero and zero would be a true negative. Okay, another thing they'll ask
you about these confusion matrices is the size of them. So the idea is that right now we are looking at a binary classifier, because we have just two labels, one and two. But you could have three, say one, two, and three. How would you calculate that? Well, there would be a third row and a third column, because we're always plotting ground truth versus prediction, and so that's how you'll know the size: a three-class matrix is three by three, nine cells.
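Tallying the four cells of a binary confusion matrix is simple to do by hand; the ground-truth and predicted labels below are made up:

```python
def confusion_counts(actual, predicted):
    """Count the four cells of a binary confusion matrix (1 = positive)."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)  # true positives
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)  # true negatives
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # false positives
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # false negatives
    return tp, tn, fp, fn

# Made-up ground truth vs. model predictions
tp, tn, fp, fn = confusion_counts([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```

For an n-class problem the same table becomes n by n (so three classes gives nine cells), which is where the matrix-size exam questions come from.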
okay so to understand anomaly detection
let's define quickly what is an anomaly
so an abnormal thing that is marked by
deviation from the norm or standard so
anomaly detection is the process of
finding outliers within a data set
called an anomaly: detecting when a piece of data or access pattern appears suspicious or malicious. So use
cases for anomaly detection can be data cleaning, intrusion detection, fraud detection, system health monitoring, event detection in sensor networks, ecosystem disturbances, and detection of critical and cascading flaws. Anomaly detection by hand is a very tedious process, so using ml for anomaly detection is more efficient and accurate
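A very simple statistical version of the idea (this is not the Anomaly Detector service itself, just an illustration) flags points that sit too many standard deviations from the mean; the latency numbers below are made up:

```python
def find_anomalies(data, threshold=2.0):
    """Flag points that deviate from the mean by more than
    `threshold` standard deviations (a basic outlier test)."""
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    std = variance ** 0.5
    return [x for x in data if abs(x - mean) > threshold * std]

# Made-up request latencies in milliseconds; 900 is the obvious outlier
anomalies = find_anomalies([100, 102, 98, 101, 99, 900])
```

Real anomaly detection services use far more sophisticated models (seasonality, trends, streaming data), but the goal is the same: separate the outliers from the norm automatically.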
and Azure has a service called Anomaly Detector, which detects anomalies in data to quickly identify and troubleshoot issues. So computer vision is when we use
machine learning neural networks to gain
high level understanding of digital
images or videos so for computer vision
deep learning algorithms we have
convolutional neural networks these are
for image and video recognition they're
inspired after how the human eye
actually processes information and sends
it back to the brain to be processed you
have recurrent neural networks, RNNs, which are generally used for handwriting recognition or speech recognition;
course these algorithms have other
applications but these are the most
common use cases for them for types of
computer vision we have image
classification so look at an image or
video and classify its place in a
category object detection so identify
objects within an image or video and
apply labels and location boundaries
semantic segmentation so identify
segments or objects by drawing pixel
masks around them so great for objects
and movement. Image analysis: analyze an image or video to apply descriptive context labels, so maybe "an employee is sitting at a desk in Tokyo" would be something that image
analysis would do optical character
recognition or OCR find texts in images
or videos and extract them into digital
text for editing facial detection so
detect faces in a photo or video and
draw a location boundary uh and label
their expression. So for computer vision, tying it to some Azure and Microsoft services, there's one called Seeing AI. It's an AI app developed by Microsoft for iOS: you use your device camera to identify people and objects, and the app audibly describes those objects for people with visual impairments. It's totally free if you have an iOS device; I have an Android phone so I cannot use it, but I hear it's great
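Before looking at the Azure offerings, here's a minimal sketch of the convolution operation that gives convolutional neural networks their name; the tiny image and the vertical-edge kernel below are made up for illustration:

```python
def convolve2d(image, kernel):
    """Slide a small kernel over an image, producing a feature map:
    the core operation of a convolutional neural network layer."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A made-up 3x4 image with a vertical edge, and a vertical-edge kernel
feature_map = convolve2d(
    [[0, 0, 9, 9],
     [0, 0, 9, 9],
     [0, 0, 9, 9]],
    [[1, -1],
     [1, -1]])
```

The non-zero column in the resulting feature map lines up with the vertical edge in the image; stacking many learned kernels like this is how a CNN's early layers detect low-level visual patterns.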
some of the Azure computer vision service offerings: Computer Vision, analyze images and videos to extract descriptions, tags, objects, and text; Custom Vision, custom image classification and object detection models using your own images; Face, detect and identify people and emotions in images; Form Recognizer, translate scanned documents into key-value or tabular editable data. So natural language processing, also
known as NLP is machine learning that
can understand the context of a corpus
Corpus being a body of related text so
NLP enables you to analyze and interpret
text within documents and email messages
interpret or contextualize spoken tokens
so for example: maybe customer sentiment analysis, whether a customer is happy or sad; synthesize speech, so a voice assistant talking to you; automatically translate spoken or written phrases and sentences between languages; interpret spoken or written commands and determine appropriate
voice assistant specifically or virtual
assistant for Microsoft is Cortana. It uses the Bing search engine to perform tasks such as setting reminders and answering questions for the user, and if you're on a Windows 10 machine, it's very easy to activate Cortana by accident. When we're talking about
Azure's NLP offering, we have Text Analytics: sentiment analysis to find out what customers think, finding topic-relevant phrases using key phrase extraction, identifying the language of the text with language detection, and detecting and categorizing entities in your text with named entity recognition. For Translator we have real-time text translation with multi-language support. For Speech service we have transcription of audible speech into readable, searchable text. And then we have Language Understanding, also known as LUIS, a natural language processing service
that enables you to understand human
language in your own application website
chatbots, IoT devices, and more. When we talk about conversational AI, it generally uses NLP, so that's where you'll see that overlap next.
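Real services learn sentiment from data, but the basic idea can be sketched with a tiny hand-made word list; the lexicon and sentences below are invented for illustration and are not how Text Analytics actually works internally:

```python
# Tiny made-up sentiment lexicon; real services learn these weights from data
LEXICON = {"happy": 1, "great": 1, "love": 1,
           "sad": -1, "terrible": -1, "hate": -1}

def sentiment(text):
    """Score a sentence by summing word polarities, then label it."""
    score = sum(LEXICON.get(word, 0) for word in text.lower().split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

label = sentiment("I love this great product")
```

The cloud service returns the same kind of label (plus confidence scores), just backed by a trained language model instead of a six-word dictionary.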
okay let's take a look here at
conversational AI which is technology
that can participate in conversations
with humans. So we have chatbots, voice assistants, and interactive voice recognition systems, which are like the second version of interactive voice response systems: when you call in and they say press these numbers, that is a response system, and a recognition system is when they can actually take human speech and translate that into action. So the use
cases here would be online customer support, replacing human agents for replying to customer FAQs, maybe
shipping questions anything about
customer support accessibility so voice
operate UI for those who are uh visually
impaired HR processes so employee
training onboarding updating employee
information I've never seen it used like
that but that's what they say as a use
case. Healthcare: accessible, affordable healthcare, so maybe you're doing a claims process; I've never seen this, but maybe in the US, where you do your claims and everything is privatized, it makes more sense. Internet of Things, so IoT
devices so Amazon Alexa Apple Siri
Google home and I suppose Cortana but it
doesn't really have a particular device
so that's why I didn't list it there
computer software so autocomplete search
on phone or desktop so that would be
Cortana something it could do uh for the
two services that are around
conversation AI for Azure we have Q&A
maker so create a conversational
question and answer bot from your
existing content also known as a
knowledge base, and Azure Bot Service, an intelligent serverless bot service that scales on demand, used for creating, publishing, and managing bots. So the idea is you make your bot with QnA Maker and then you deploy it with Azure Bot Service.
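A QnA Maker-style bot is essentially "match the user's question to the closest question in the knowledge base and return its stored answer." This toy version uses plain word overlap instead of real NLP, and the knowledge-base entries are made up:

```python
# Hypothetical knowledge base, the kind QnA Maker builds from existing content
KNOWLEDGE_BASE = {
    "when will my order ship": "Orders ship within 2 business days.",
    "how do i reset my password": "Use the 'Forgot password' link to reset it.",
    "what is your return policy": "Returns are accepted within 30 days.",
}

def answer(question):
    """Return the stored answer whose question shares the most words
    with the user's question (a toy stand-in for real NLP matching)."""
    words = set(question.lower().split())
    best = max(KNOWLEDGE_BASE, key=lambda q: len(words & set(q.split())))
    return KNOWLEDGE_BASE[best]

reply = answer("When does my order ship?")
```

The real service adds language understanding (so "where's my parcel" still matches), confidence thresholds, and follow-up prompts, but the knowledge-base-lookup shape is the same.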
okay let's take a look here at
responsible AI which focuses on ethical
transparent and accountable uses of AI
technology. Microsoft put into practice responsible AI via its six Microsoft AI principles. This whole thing is invented by Microsoft, so it's not necessarily a standard, but it's something that Microsoft is pushing hard to have people adopt. Okay, so the first thing we have is fairness: this
is an AI system which should treat all
people fairly we have reliability and
safety an AI system should perform
reliably and safely privacy and security
AI system should be secure and respect
privacy inclusiveness AI system should
Empower everyone and engage people
transparency AI systems should be
understandable accountability people
should be accountable for AI systems and
we need to know these in greater detail, so we're going to have a short little video on each of these.
okay the first on our list is fairness
so AI systems should treat all people
fairly. So an AI system can reinforce existing societal stereotypes, and bias can be introduced during the development of a pipeline. AI systems are used to allocate or withhold opportunities, resources, or information in domains such as criminal justice, employment and hiring, and finance and credit. So an example
here would be an ml model designed to select the final applicant for a hiring pipeline without incorporating any bias based on gender, ethnicity, or other factors that may result in an unfair advantage. So Azure ML can tell you how each feature influences a model's prediction for bias. One thing
that could be of use is Fairlearn, an open-source Python project to help data scientists improve fairness in AI systems. At the time I made this course a lot of their stuff was still in preview, so the fairness component is not 100% there, but it's great to see that they're working on it.
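To make the fairness idea concrete, here's a toy version of one metric the Fairlearn project provides, the demographic parity difference (the gap in selection rates between groups); the hiring outcomes and group labels below are invented:

```python
def demographic_parity_difference(selected, groups):
    """Difference between the highest and lowest per-group selection rate.
    A toy version of a fairness metric like those Fairlearn provides."""
    by_group = {}
    for sel, grp in zip(selected, groups):
        by_group.setdefault(grp, []).append(sel)
    rates = {g: sum(v) / len(v) for g, v in by_group.items()}
    return max(rates.values()) - min(rates.values())

# Made-up hiring-pipeline outcomes (1 = selected) and applicant groups
gap = demographic_parity_difference(
    [1, 1, 0, 1, 0, 0, 0, 1],
    ["a", "a", "a", "a", "b", "b", "b", "b"])
```

A gap of zero would mean both groups are selected at the same rate; a large gap is the kind of signal that should prompt you to inspect the model for bias.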
okay so we are on to our second AI
principle for Microsoft and this one is
AI systems should perform reliably and
safely so AI software must be rigorously
tested to ensure they work as expected
before release to the end user if there
are scenarios where AI is making
mistakes it is important to release a
report Quantified risks and harms to end
users so they are informed of the
shortcomings of an AI solution something
you should really remember for the exam
they'll definitely ask that AI wear
concern uh for reliability safety for
humans is critically important
autonomous vehicles a health diagnosis a
suggestion prescriptions and autonomous
weapon systems they didn't mention this
in their content; I was just doing some additional research, and yeah, you really don't want mistakes when you have automated weapons, or ethically you shouldn't have them at all, but that's just how the world works. So that is this category here. We're on to our third Microsoft AI
principle AI system should be secure and
respect privacy. So AI can require vast amounts of data to train deep ml models, and the nature of an ml model may require personally identifiable information, PII. It is important that we ensure
protection of user data that it is not
leaked or disclosed. In some cases ml models can be run locally on a user's device so their PII remains on their device, avoiding the vulnerability; this is like edge computing, so that's the concept there. AI
security principles against malicious actors: data origin and lineage, data use internal versus external, and data corruption considerations like anomaly detection. So there you go. We're on to the fourth Microsoft AI
principle so AI systems should Empower
everyone and engage people. If we can design AI solutions for the minority of users, we can design a solution for the majority of users. So when we're talking
about minority groups we're talking
about physical ability gender sexual
orientation ethnicity other factors this
one's really simple. In terms of practicality it doesn't 100% make sense, because if you've worked with groups that are deaf or blind, developing technology for them, a lot of times they need specialized solutions. But the approach here is that if we can design for the minority, we can design for all; that is the principle there, so that's what we need to know. Okay,
let's take a look here at transparency
so AI systems should be understandable
so interpretability and intelligibility
is when the end user can understand the
behavior of UI so transparency of AI
systems can result in mitigating
unfairness help developers debug their
AI systems ging more trust from our
users those build a those who build AI
systems should be open about why they're
using AI open about the limitations of
the AI systems adopting an open source
AI framework can provide transparency at
least from a technical perspective on
system we are on to the last Microsoft
AI principle here people uh should be
accountable for AI systems. So structures should be put in place to consistently enact AI principles and take them into account. AI systems should work within frameworks of governance, organizational principles, and ethical and legal standards that are clearly defined. These principles guide Microsoft in how they develop, sell, and advocate when working with third parties, and push towards regulation of AI principles. So this is Microsoft saying, hey everybody, adopt our model. There aren't many other models, so I guess it's great that Microsoft is taking the charge there; I just feel it needs to be a bit more well-developed. But what we'll do is look at some more practical examples so we can better understand how to apply their principles.
okay so if we really want to understand how to apply the Microsoft AI principles, they've created this nice little tool via a free web app for practical scenarios. They have these cards; you can read through them, they're color-coded for different scenarios, and there's a website, so let's go take a look at that and see what we can learn.
[Music]
okay, all right, so we're here on the guidelines for human-AI interaction, so we can better understand how to put into practice the Microsoft AI principles. They have 18 cards; let's work our way through and see the examples. The first one on our list: make clear what the system can do, help the users understand what the AI system is capable of doing. So here, PowerPoint QuickStart builds an online outline to help you get started researching a subject; it displays suggested topics that help you understand the feature's capabilities. Then we have the Bing app, which shows examples of the types of things you can search for, and the Apple Watch displays all the metrics it tracks and explains how. Going on to the second card, we have make clear how well
the system can do what it can do. So here we have Office's new companion experience, Ideas, which docks alongside your work and offers one-click assistance with grammar, design, data insights, richer images, and more; the unassuming term "ideas", coupled with labeled previews, helps set expectations for presented suggestions. The recommender in Apple Music uses language such as "we think you'll like" to communicate uncertainty. The help page for Outlook web mail explains the filtering into Focused and Other, and that it will start working right away but will get better with use, making clear that mistakes will happen and that you can teach the product and set overrides. On to our red cards:
here we have time services based on context: time when to act or interrupt based on the user's current task and environment. When it's time to leave for appointments, Outlook sends a time-to-leave notification with directions for both driving and public transit, taking into account current location, event location, and real-time traffic information. And then we have: after
using Apple Maps routing, it remembers where you parked your car, and when you open the app a little while later it suggests routing to the location of the parked car. All these Apple examples make
me think that Microsoft has some kind of
partnership with apple I guess I guess
Microsoft or or Bill Gates did own Apple
shares so maybe they're closer than we
think. Show contextually relevant information: display information relevant to the user's current task and environment. Powered by machine learning, Acronyms in Word helps you understand shorthand employed in your own work environment relative to the current open document. On Walmart.com, when the user
is looking at a product such as gaming
console recommends accessories and games
that would go with it when a user
searches for movies Google shows results
including showtimes near the users's
location for the current date. On to our fifth card here, match relevant social norms: ensure
experience is delivered in a way the
users would expect given the social
cultural context. When Editor identifies ways to improve writing style, it presents options politely, "consider using..."; that's the Canadian way, being polite. Google Photos is able to recognize pets, using wording that recognizes that for many, pets are an important part of one's family. And you
know what uh when I uh started renting
my new house uh I I said you know is
there a problem with dogs and my
landlord said well of course pets are
part of the family and that was
something I liked to hear. Cortana uses a semi-formal tone, apologizing when unable to find a contact, which is polite and socially appropriate; I like that. Okay, mitigate social biases: ensure
AI system languages and behaviors do not
reinforce undesirable unfair stereotypes
and biases. MyAnalytics summarizes how you spend your time at work, then suggests ways to work smarter; one way to
mitigate bias is by using gender neutral
icons to represent important people
sounds good to me. A Bing search for CEO or doctor shows images of diverse people in terms of gender and ethnicity, sounds good to me. The predictive keyboard for Android suggests both genders when typing a pronoun starting with the letter H. We're on to our yellow cards:
so support efficient invocation so make
it easy to invoke or request system
services when needed. So Flash Fill is a helpful time-saver in Excel that can be easily invoked with on-canvas interactions that keep you in flow. On Amazon.com (oh hey, there's Amazon), in addition to the system giving recommendations as you browse, you can manually invoke additional recommendations from the recommender. Design Ideas in Microsoft PowerPoint can be invoked with the press of a button if needed; I cannot stand it when that pops up, I always have to tell it to leave me alone.
Okay, support efficient dismissal: make it easy to dismiss or ignore undesired AI system services. This sounds good to me:
Microsoft forms allows you to create
custom surveys quizzes polls
questionnaires and forms some choices
questions trigger suggested options
position beneath the relevant question
the suggestion can be easily ignored and
dismissed Instagram allows the user to
easily hide or report ads that have been
suggested by AI by tapping the ellipsis at the top right of the ad. Siri can be easily dismissed by saying never mind; I'm always telling my Alexa never mind.
Support efficient correction: make it easy to edit, refine, or recover when the AI system is wrong. So Auto Alt Text automatically generates alt text for photographs by using intelligent services in the cloud, and descriptions can be easily modified by clicking the alt text button in the ribbon. Once you set a reminder with Siri
the UI displays a tap to edit link when
Bing automatically corrects spelling
errors in search queries it provides the
option to revert to the query as
originally typed with one click on to
card number
10 Scope Services when in doubt so
engage in disambiguate
disambiguate
disambiguation or gracefully degrade the
AI system service when uncertain about a
user's goal so when Auto replacing word
is uncertain of a correction it engages
in disambiguation by displaying multiple
options you can select from Siri will
let you know it has trouble hearing if
you don't respond or talk or or speak
too softly big Maps will provide
multiple routing options when uh when
unable to recommend best one we're on to
card number 11 make clear why the system
did what it did enable users to access
an explanation of why the AI system
behaved as it did. Office Online recommends documents based on history and activity; descriptive text above each document makes it clear why the recommendation is shown. Product recommendations on Amazon.com include a "why recommended" link that shows what products in the user's shopping history inform the recommendations.
Facebook enables you to access an
explanation about why you are seeing
each ad in the news
feed. On to our green cards. Remember recent interactions: maintain short-term memory and allow the user to make efficient references to that memory. When attaching a file, Outlook offers a list of recent files, including recently copied file links; Outlook also remembers people you have interacted with recently and displays them when addressing a new email. Bing search remembers some
recent queries and search can be
continued conversationally: "how old is he" after a search for Keanu Reeves. Siri carries over the context from one interaction to the next: a text message is created for the person you told Siri to message. On to card number 13, lucky
number 13 learn from user Behavior
personalize the user experience by
learning from their actions over time
Tap on a search bar in Office applications and search lists the top three commands on your screen that you're most likely to need, personalized; the technology, called zero query, doesn't even need you to type in the search bar to provide a personalized predictive answer. Amazon.com gives personalized product recommendations based on previous purchases. On to card 14: update and adapt
cautiously, so limit disruptive changes when updating and adapting the system's behaviors. PowerPoint Designer improves slides for Office 365 subscribers by automatically generating design ideas to choose from; Designer has integrated new capabilities such as smart graphics and icon suggestions into the existing user experience, ensuring the updates are not disruptive. Office's Tell Me feature shows dynamically recommended items in a designated try area to minimize disruptive changes. On to card
number 15: encourage granular feedback, enable the users to provide
feedback indicating their preferences
during regular interactions with the AI
system. So Ideas in Excel empowers you to understand your data through high-level visual summaries, trends, and patterns, and encourages feedback on each suggestion by asking "is this helpful?". Not only does Instagram provide the option to hide specific ads, but it also solicits feedback to understand why the ad is not relevant. And Apple's Music app's love and dislike buttons are prominent and easily accessible. Number 16: convey the
consequences of user actions, so immediately update or convey how user actions will impact future behaviors of the AI system. You can get Stocks and Geography data types in Excel; it is as easy as typing text into a cell and converting it to the Stocks or Geography data type, and when you perform the conversion action, an icon immediately appears in the converted cells. Upon tapping the like or dislike button for each recommendation in Apple Music, a pop-up informs the user that they'll receive more or fewer similar recommendations. On to card number 17,
we're almost near the end: provide global controls, allow the user to globally customize what the system monitors and how it behaves. So Editor expands on the spelling and grammar checking capabilities of Word to include more advanced proofing and editing designed to ensure the document is readable; Editor can flag a range of critique types and allows you to customize them. The thing is that Word's spell checking is so awful, I don't understand, it's been years and the spell checking never gets better, so they've got to employ better spell-checking AI, I think. Bing search provides settings that impact the types of results the engine will return, for example SafeSearch. Then we have Google Photos, which
allows the user to turn location history on or off for future photos. It's kind of funny seeing Bing in there about using AI, because at one point it was almost certain that Bing was copying Google search indexes to learn how to index; I don't know, that's Microsoft for you. We're on to card
18: notify users about changes, inform the user when the AI system adds or updates its capabilities. The What's New dialog in Office informs you about changes by giving an overview of the latest features and updates, including updates to AI features. In Outlook web, the Help tab includes a What's New section that covers updates. So there we go, we made it
to the end of the list. I hope that was a fun listen for you, and I hope that we could kind of match it up with the responsible AI principles. I kind of wish what they would have done is actually mapped it out here and said where each matches, but I guess it's kind of an isolated resource that ties in, so I guess there we go.
okay hey this is Andrew Brown from exam
Pro and we are looking at Azure
cognitive services and this is a
comprehensive family of AI services and
cognitive apis to help you build
intelligent apps. So, "create customizable pre-trained models built with breakthrough AI research"; I put that in quotations, I'm kind of throwing some shade at Microsoft, at Azure, just because it's their marketing material, right. Deploy Cognitive Services
anywhere from cloud to the edge with containers; get started quickly, no machine learning expertise required (but I think it helps to have a bit of background knowledge); develop with strict ethical standards (Microsoft loves talking about their responsible AI stuff), empowering responsible use with industry-leading tools and guidelines. So let's do a quick
breakdown of the types of services in
this family so for decision we have
anomaly detector identify potential
problems early on content moderator
detect potentially offensive or unwanted
content personalizer create Rich
personalized experiences for every user
for language we have Language Understanding, also known as LUIS (I don't know why I didn't put the initialism there, but don't worry, we'll see it again): build natural language understanding into apps, bots, and IoT devices. QnA Maker: create a
conversational question and answer layer
over your data. Text Analytics: detect sentiment (sentiment is like whether customers are happy, sad, glad), key phrases, and named entities. Translator: detect and translate more than 90 supported languages. For speech
we have Speech to Text: transcribe audible speech into readable, searchable text. Text to Speech: convert text to lifelike speech for natural interfaces. Speech Translation: integrate real-time speech translation into your apps. Speaker Recognition: identify and verify the people speaking based on audio. For vision we have Computer Vision: analyze content in images and videos. Custom Vision: customize image recognition to fit your business needs. Face: detect and identify people and emotions in images. So there you
go. So Azure Cognitive Services is an umbrella AI service that enables customers to access multiple AI services with an API key and an API endpoint. So what you do is you go create a new cognitive service, and once you're there it's going to generate two keys and an endpoint, and that is what you're using generally
and that is what you're using generally for authentication uh with the various
for authentication uh with the various AI services programmatically and that is
AI services programmatically and that is something that is key to the service
something that is key to the service that you need to
that you need to [Music]
So knowledge mining is a discipline in AI that uses a combination of intelligent services to quickly learn from vast amounts of information. It allows organizations to deeply understand and easily explore information, uncover hidden insights, and find relationships and patterns at scale. We have ingest, enrich, and explore as our three steps. For ingest: content from a range of sources, using connectors to first- and third-party data stores. We might have structured data such as databases and CSVs (the CSVs would more be semi-structured, but we're not going to get into that level of detail), and unstructured data, so PDFs, videos, images, and audio. For enrich: the content with AI capabilities that let you extract information, find patterns, and deepen understanding, so Cognitive Services like Vision, Language, Speech, Decision, and Search. And for explore: the newly indexed data via search, bots, existing business applications, and data visualizations. Enriched structured data can feed customer relationship management, ERP systems, and Power BI. This whole knowledge mining thing is a thing, but I believe the model around it is so that Azure shows you how you can use the Cognitive Services to solve things without having to invent new solutions. So let's look at a bunch of use cases that Azure has and see where we can find some useful ones. The first one here is content research: when organizations task employees with review and research of technical data, it can be tedious to read page after page of dense text. Knowledge mining helps employees quickly review these dense materials. So you have a document, and in the enrichment step you could be doing printed text recognition, key phrase extraction, and other detection skills.

On to regression scenarios: we'll break it down into ranges. When you have a very wide range, Spearman correlation works really well. R2 score is great for things like airline delay, salary estimation, and bug resolution time. We're looking at smaller ranges when you're talking about normalized root mean squared error, so price predictions and review tip score predictions. Normalized mean absolute error is just another one here; they don't give a description for it. For time series it's the same thing, just in the context of time series, so forecasting.
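As a rough illustration of the smaller-range metric mentioned above, here is a sketch of normalized root mean squared error on made-up price data (the numbers are invented for demonstration):

```python
import math

# Illustrative sketch: root mean squared error and its normalized form
# (divided by the range of the actual values), the metric suggested above
# for smaller-range targets like price prediction.

def rmse(actual, predicted):
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def normalized_rmse(actual, predicted):
    # Dividing by the range puts the error on a roughly 0-1 scale, so models
    # trained on differently scaled targets become comparable.
    return rmse(actual, predicted) / (max(actual) - min(actual))

prices_actual = [100.0, 150.0, 200.0, 250.0]
prices_pred = [110.0, 140.0, 210.0, 240.0]
print(round(rmse(prices_actual, prices_pred), 4))             # 10.0
print(round(normalized_rmse(prices_actual, prices_pred), 4))  # 0.0667
```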
All right, another option we can change is the validation type when we're setting up our ML model. So model validation is when we compare the results of our training data set to our test data set; model validation occurs after we train the model. You can just drop it down, and there we have some options: Auto, k-fold cross-validation, Monte Carlo cross-validation, and train-validation split. I'm not going to really get into the details of that; I don't think it'll show up on the AI-900 exam, but I just want you to be aware that you do have those options.
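To make those options a little more concrete, here is a hedged sketch of what a train-validation split and k-fold cross-validation do with your rows. Azure ML handles this internally; the function names here are just for illustration:

```python
# Illustration of two validation strategies: a simple train-validation split
# and k-fold cross-validation index generation.

def train_validation_split(data, validation_fraction=0.25):
    """Hold out the last fraction of rows as a validation set."""
    cut = int(len(data) * (1 - validation_fraction))
    return data[:cut], data[cut:]

def k_fold_indices(n_samples, k):
    """Yield (train_indices, validation_indices) for each of k folds, so
    every sample is used for validation exactly once."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    for fold in range(k):
        val = indices[fold * fold_size:(fold + 1) * fold_size]
        train = [i for i in indices if i not in val]
        yield train, val

train, val = train_validation_split(list(range(8)))
print(train, val)          # first 6 samples train, last 2 validate
for tr, va in k_fold_indices(8, 4):
    print(va)              # each fold validates a distinct pair of samples
```

Monte Carlo cross-validation is the same idea as the split above, repeated several times with different random holdouts.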
Okay, hey, this is Andrew Brown from Exam Pro, and we are taking a look here at Custom Vision. This is a fully managed no-code service to quickly build your own classification and object detection ML models. The service is hosted on its own isolated domain at www.customvision.ai. So the first idea is you upload your images: bring your own labeled images, or use Custom Vision to quickly add tags to any unlabeled images. You use the labeled images to teach Custom Vision the concepts you care about, which is training, and you use a simple REST API call to quickly tag images with your new custom computer vision model, so you can evaluate.
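That REST call can be sketched roughly as follows; the project URL, iteration name, and key are placeholder values, and this only shows the shape of the request rather than a definitive client. The Prediction-Key header is the one Custom Vision uses for prediction calls:

```python
# Hedged sketch of the "simple REST API call" for tagging an image with a
# trained Custom Vision model. All values below are placeholders.

def tag_image_request(prediction_url: str, prediction_key: str, image_bytes: bytes) -> dict:
    return {
        "url": prediction_url,
        "headers": {
            "Prediction-Key": prediction_key,
            "Content-Type": "application/octet-stream",  # raw image upload
        },
        "body": image_bytes,
    }

req = tag_image_request(
    # Placeholder prediction URL of the shape the portal publishes for you.
    "https://example.cognitiveservices.azure.com/customvision/v3.0/Prediction"
    "/project-id/classify/iterations/Iteration1/image",
    "placeholder-key",
    b"\x89PNG...",  # the image file contents would go here
)
print(req["headers"]["Prediction-Key"])
```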
Okay, so when we launch Custom Vision we have to create a project, and with that we need to choose a project type: classification or object detection. Reviewing classification here, you have the option between multi-label, when you want to apply many tags to an image, so think of an image that contains both a cat and a dog; and multi-class, when you only have one possible tag to apply to an image, so it's either an apple, a banana, or an orange, not multiples of these things. You have object detection; this is when we want to detect various objects in an image. And you also need to choose a domain. A domain is a Microsoft-managed data set that is used for training the ML model, and there are different domains suited for different use cases. So let's take a look first at image classification domains; here is the big list, the domains being over here, and we'll go through these. General is optimized for a broad range of image classification tasks; if none of the other specified domains are appropriate, or you're unsure of which domain to choose, select one of the General domains. General [A1] is optimized for better accuracy with comparable inference time to the General domain; it's recommended for larger data sets or more difficult user scenarios, and this domain requires more training time. Then you have General [A2], optimized for better accuracy with faster inference times than A1 and General; it's recommended for most data sets, and this domain requires less training time than General and A1. You have Food, optimized for photographs of dishes as you would see them on a restaurant menu; if you want to classify photographs of individual fruits or vegetables, use the Food domain. Then we have Landmarks, optimized for recognizable landmarks, both natural and artificial; this domain works best when the landmark is clearly visible in the photograph, and it works even if the landmark is slightly obstructed by people in front of it. Then you have Retail, optimized for images that are found in a shopping catalog or shopping website; if you want high precision classifying between dresses, pants, and shirts, use this domain. Compact domains are optimized for the constraints of real-time classification on the edge.
Okay, then we have the object detection domains; this one's a lot shorter, so we'll get through it a lot quicker. General is optimized for a broad range of object detection tasks; if none of the other domains are appropriate, or you're unsure of which domain to choose, choose the General one. General [A1] is optimized for better accuracy with comparable inference time to the General domain; it's recommended for more accurate region locations, larger data sets, or more difficult use case scenarios. This domain requires more training, and results are not deterministic: expect a plus or minus 1% mean average precision difference with the same training data provided. You have Logo, optimized for finding brand logos in images, and Products on Shelves, optimized for detecting and classifying products on shelves. So there you go.
Okay, so let's get some more practical knowledge of the service. For image classification you're going to upload multiple images and apply a single label or multiple labels to the entire image; so here I have a bunch of images uploaded, and then I have my tags over here, and they could either be multi or singular. For object detection you apply tags to objects in an image. For data labeling, when you hover your cursor over the image, Custom Vision uses ML to show bounding boxes of possible objects that have not yet been labeled; if it does not detect something, you can also just click and drag to draw out whatever box you want. So here's one where I tagged it up quite a bit. You have to have at least 50 images on every tag to train, so just be aware of that when you are tagging your images. When you're ready to train your model, you have two options: quick training, which trains quickly but will be less accurate, and advanced training, which increases compute time to improve your results. For advanced training you basically just have a slider that you move to the right. With each iteration of training, our ML model will improve the evaluation metrics, so precision and recall; it's going to vary, and we're going to talk about the metrics here in a moment. But the probability threshold value determines when to stop training, when our evaluation metric meets our desired threshold. These are just additional options where, when you're training, you can move these left to right. And then when we get our results back, we're going to get some metrics. For evaluation metrics we have precision, being exact and accurate, selecting items that are relevant; recall, also known as sensitivity or true positive rate, how many relevant items are returned; and average precision. It's important that you remember these because they might ask you about them on the exam. When we're looking at object detection, the evaluation metric outcomes are precision, recall, and mean average precision. Once we've deployed our pipeline, it makes sense that we go ahead and give it a quick test to make sure it's working correctly; you press the quick test button, upload your image, and it will tell you, so this one says it's Warf. When you're ready to publish, you just hit the publish button, and then you'll get a prediction URL and information so you can invoke it. One other feature that's kind of useful is the smart labeler: once you've loaded some training data, it can now make suggestions. You can't do this right away, but once it has some data, it's kind of a prediction that is not 100% guaranteed, and it just helps you build up your training data set a lot faster; very useful if you have a very large data set. This is known as ML-assisted labeling.
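Those metric definitions can be made concrete with a small worked example; the counts below are invented for illustration:

```python
# Worked sketch of the evaluation metrics above, computed from counts of
# true positives (TP), false positives (FP), and false negatives (FN).

def precision(tp, fp):
    # Of everything the model selected, how much was actually relevant?
    return tp / (tp + fp)

def recall(tp, fn):
    # Of everything that was actually relevant, how much did the model find?
    return tp / (tp + fn)

# Say a tag was predicted on 10 images: 8 correct (TP), 2 wrong (FP),
# and 4 genuinely tagged images were missed entirely (FN).
print(precision(8, 2))          # 0.8
print(round(recall(8, 4), 3))   # 0.667
```

Mean average precision, used for object detection, averages the precision achieved across different recall levels (and across tags), rather than taking a single snapshot like the two numbers above.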
Okay, hey, this is Andrew Brown from Exam Pro, and in this section we'll be covering the newly added section of the AI-900 that focuses on generative AI. Generative AI, including technologies like ChatGPT, is becoming more recognized outside of tech circles. While it may seem magical in its ability to produce humanlike content, it's actually based on advanced mathematical techniques from statistics, data science, and machine learning. Understanding these core concepts can help society envision new AI possibilities for the future. First, let's compare the differences between regular AI and generative AI. AI refers to the development of computer systems that can perform tasks typically requiring human intelligence. These include problem solving, decision-making, understanding natural language, recognizing speech and images, and more. The primary goal of traditional AI is to create systems that can interpret, analyze, and respond to human actions or environmental changes efficiently and accurately; it aims to replicate or simulate human intelligence in machines. AI applications are vast and include areas like expert systems, natural language processing, speech recognition, and robotics. AI is used in various industries for tasks such as customer service chatbots, recommendation systems in e-commerce, autonomous vehicles, and medical diagnosis. On the other hand, generative AI is a subset of AI that focuses on creating new content or data that is novel and realistic. It does not just interpret or analyze data but generates new data itself, including text, images, music, speech, and other forms of media. It often involves advanced machine learning techniques, particularly deep learning models like generative adversarial networks, variational autoencoders, and Transformer models like GPT. Generative AI is used in a range of applications, including creating realistic images and videos, generating human-like text, composing music, creating virtual environments, and even drug discovery. Some examples include tools like GPT for text generation, DALL-E for image creation, and various deep learning models that compose music. So let's quickly summarize the differences between regular AI and generative AI across three features: functionality, data handling, and applications. Regular AI focuses on understanding and decision-making, whereas generative AI is about creating new, original outputs. In terms of data handling, regular AI analyzes and bases decisions on existing data, while generative AI uses the same data to generate new, previously unseen outputs. And for applications, regular AI's scope includes data analysis, automation, natural language processing, and healthcare; in contrast, generative AI leans towards more creative and innovative applications such as content creation, synthetic data generation, deepfakes, and design.
The next topic we'll be covering is: what is a large language model? A large language model such as GPT works in a way that's similar to a complex automatic system that recognizes patterns and makes predictions. Training on large data sets: initially, the model is trained on massive amounts of text data. This data can include books, articles, websites, and other written material. During this training phase, the model learns patterns in language, such as grammar, word usage, sentence structure, and even style and tone. Understanding context: the model's design allows it to consider a wide context. This means it doesn't just focus on single words but understands them in relation to the words and sentences that come before and after. This context understanding is important for generating coherent and relevant text. Predicting the next word: when you give the model a prompt, which is a starting piece of text, it uses what it has learned to predict the next most likely word. It then adds this word to the prompt and repeats the process, continually predicting the next word based on the extended sequence. Generating text: this process of predicting the next word continues, creating a chain of words that forms a coherent piece of text. The length of this generated text can vary based on specific instructions or limitations set for the model. Refinement with feedback: the model can be further refined and improved over time with feedback. This means it gets better at understanding and generating text as it is exposed to more data and usage. In summary, a large language model works by learning from a vast quantity of text data, understanding the context of language, and using this understanding to predict and generate new text that is coherent and contextually appropriate, which can be further refined with feedback, as shown in the workflow image.
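The predict-append-repeat loop described above can be sketched with a toy model. This is a deliberately simplified illustration using bigram counts instead of a neural network; a real LLM learns vastly richer patterns, but the generation loop has the same shape:

```python
from collections import Counter, defaultdict

# Toy illustration of "predict the next word, append it, repeat" using
# simple bigram counts as a stand-in for a trained language model.

training_text = "the cat sat on the mat and the cat slept on the mat"
words = training_text.split()

# "Training": count which word follows which.
follows = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def generate(prompt: str, length: int) -> str:
    out = prompt.split()
    for _ in range(length):
        candidates = follows[out[-1]].most_common(1)  # most likely next word
        if not candidates:
            break  # nothing ever followed this word in training
        out.append(candidates[0][0])
    return " ".join(out)

print(generate("the", 4))
```

Each step conditions only on the last word here; the "understanding context" property of real LLMs comes from conditioning on the whole extended sequence instead.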
Next, let's talk about Transformer models. A Transformer model is a type of machine learning model that's especially good at understanding and generating language. It's built using a structure called the Transformer architecture, which is really effective for tasks involving natural language processing, like translating languages or writing text. The Transformer model architecture consists of two components or blocks. First we have the encoder: this part reads and understands the input text. It's like a smart system that goes through everything it's been taught, which is a lot of text, and picks up on the meanings of words and how they're used in different contexts. Then we have the decoder: based on what the encoder has learned, this part generates new pieces of text. It's like a skilled writer that can make up sentences that flow well and make sense. There are different types of Transformer models with specific jobs. For example, BERT is good at understanding language; it's like a librarian who knows where every book is and what's inside them, and Google uses it to help its search engine understand what you're looking for. GPT is good at creating text; it's like a skilled author who can write stories, articles, or conversations based on what it has learned. So that's an overview of a Transformer model. Next we'll be talking about the main components of a transformer
model.

The next component of a transformer model we'll be covering is the tokenization process. Tokenization in a Transformer model is like turning a sentence into a puzzle. For example, you have the sentence "I heard a dog bark loudly at a cat". To help a computer understand it, we chop up the sentence into pieces called tokens. Each piece can be a word or even a part of a word. So for our sentence, we give each word a number, like this: "I" might be 1, "heard" might be 2, "a" might be 3, "dog" might be 4, "bark" might be 5, "loudly" might be 6, "at" might be 7, "a" is already token 3, and "cat" might be 8. Now our sentence becomes a series of numbers. This is like giving each word a special code. The computer uses these codes to learn about the words and how they fit together. If a word repeats, like "a", we use its code again instead of making a new one. As the computer reads more text, it keeps turning new words into new tokens with new numbers. If it learns the word "meow" it might call it 9, and "skateboard" could be 10. By doing this with lots and lots of text, the computer builds a big list of these tokens, which it then uses to understand and generate language. It's a bit like creating a dictionary where every word has a unique number.
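The numbering scheme described above can be sketched in a few lines of Python. This is a toy word-level tokenizer for intuition only; real models such as GPT use subword schemes like byte-pair encoding.

```python
# Toy tokenizer: each distinct word gets an incrementing numeric ID,
# and repeated words reuse their existing ID instead of getting a new one.

def build_vocab_and_tokenize(text, vocab=None):
    """Assign an ID to each new word; reuse IDs for words already seen."""
    vocab = {} if vocab is None else vocab
    token_ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1  # first unseen word becomes token 1
        token_ids.append(vocab[word])
    return vocab, token_ids

vocab, ids = build_vocab_and_tokenize("I heard a dog bark loudly at a cat")
print(ids)  # [1, 2, 3, 4, 5, 6, 7, 3, 8] -- the second 'a' reuses token 3

# Reading more text grows the same vocabulary, as described above:
vocab, _ = build_vocab_and_tokenize("meow skateboard", vocab)
print(vocab["meow"], vocab["skateboard"])  # 9 10
```

Running this reproduces the lecture's numbering: the repeated "a" gets code 3 both times, and new words like "meow" and "skateboard" extend the dictionary.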
The next component of a transformer model we'll be covering is embeddings. To help a computer understand language, we turn words into tokens and then give each token a special numeric code called an embedding. These embeddings are like a secret code that captures the meaning of the word. As a simple example, suppose the embeddings for our tokens consist of vectors with three elements: 4 for "dog" has the embedding vector [10, 3, 2]; 5 for "bark" has the vector [10, 2, 2]; 8 for "cat" the vector is [10, 3, 1]; 9 for "meow" the vector is [10, 2, 1]; and 10 for "skateboard" has the vector [3, 3, 1], which is quite different from the rest. Words that have similar meanings or are used in similar ways get codes that look alike. So "dog" and "bark" might have similar codes because they are related, but "skateboard" might be off in a different area because it's not much related to these other words. This way the computer can figure out which words are similar to each other just by looking at their codes. It's like giving each word a home on a map, and words that are neighbors on this map have related meanings. The image shows a simple example model in which each embedding has only three dimensions; real language models have many more dimensions. Tools such as Word2Vec or the encoder part of a transformer model help AI figure out where each word should go on this big map.
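The "words that are neighbors have related meanings" idea can be checked numerically with cosine similarity, using the toy three-dimensional vectors from the example above:

```python
# Measure how "close" two embedding vectors are using cosine similarity:
# 1.0 means the same direction, lower values mean less related.
import math

embeddings = {
    "dog":        [10, 3, 2],
    "bark":       [10, 2, 2],
    "cat":        [10, 3, 1],
    "meow":       [10, 2, 1],
    "skateboard": [3, 3, 1],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words sit close together on the "map"; skateboard is far away.
print(cosine_similarity(embeddings["dog"], embeddings["bark"]))        # high, ~0.996
print(cosine_similarity(embeddings["dog"], embeddings["skateboard"]))  # lower, ~0.885
```

With these example vectors, "dog" and "bark" score noticeably higher than "dog" and "skateboard", matching the intuition in the lecture.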
Let's go over positional encoding in a Transformer model. Positional encoding is a technique used to ensure that a language model such as GPT doesn't lose the order of words when processing natural language. This is important because the order in which words appear can change the meaning of a sentence. Let's take the sentence "I heard a dog bark loudly at a cat" from our previous example. Without positional encoding, if we simply tokenize this sentence and convert the tokens into embedding vectors, we might end up with a set of vectors that lose the sequence information. Positional encoding adds a positional vector to each word in order to keep track of the positions of the words. By adding positional encoding vectors to each word's embedding, we ensure that each position in the sentence is uniquely identified. The embedding for "I" would be modified by adding a positional vector corresponding to position 1, labeled I-1. The embedding for "heard" would be altered by a vector for position 2, labeled heard-2. The embedding for "a" would be updated with a vector for position 3, labeled a-3, and reused with the same positional vector for its second occurrence. This process continues for each word token in the sentence, with dog-4, bark-5, loudly-6, at-7, and cat-8 all receiving their unique positional encodings. As a result, the sentence "I heard a dog bark loudly at a cat" is represented not just by a sequence of vectors for its words, but by a sequence of vectors that are influenced by the position of each word in the sentence. This means that even if another sentence had the same words in a different order, its overall representation would be different, because the positional encodings differ, reflecting the different sequence of words. So that's an overview of positional encoding.
The next component of a transformer we'll be covering is attention. Attention in AI, especially in Transformer models, is a way the model figures out how important each word or token is to the meaning of a sentence, particularly in relation to the other words around it. Let's reuse the sentence "I heard a dog bark loudly at a cat" to explain this better. Self-attention: imagine each word in the sentence shining a flashlight on the other words. The brightness of the light shows how much one word should pay attention to the others when understanding the sentence. For "bark", the light might shine brightest on "dog" because they're closely related. Encoder's role: in the encoder part of a transformer model, attention helps decide how to represent each word as a number or vector. It's not just the word itself but also its context that matters. For example, "bark" in "the bark of a tree" would have a different representation than "bark" in "I heard a dog bark", because the surrounding words are different. Decoder's role: when generating new text, like completing a sentence, the decoder uses attention to figure out which words it already has are most important for deciding what comes next. If our sentence is "I heard a dog", the model uses attention to know that "heard" and "dog" are key to adding the next word, which might be "bark". Multi-head attention: this is like having multiple flashlights, each highlighting different aspects of the words. Maybe one flashlight looks at the meaning of the word, another looks at its role in the sentence, like subject or object, and so on. This helps the model get a richer understanding of the text. Building the output: the decoder builds the sentence one word at a time using attention. At each step it looks at the sentence so far, decides what's important, and then predicts the next word. It's an ongoing process, with each new word influencing the next. So attention in Transformer models is like a guide that helps the AI understand and create language by focusing on the most relevant parts of the text, considering both individual word meanings and their relationships within the sentence.
Let's take a look at the attention process. Token embeddings: each word in the sentence is represented as a vector of numbers, its embedding. Predicting the next token: the goal is to figure out what the next word should be, also represented as a vector. Assigning weights: the attention layer looks at the sentence so far and decides how much influence each word should have on the next one. Calculating attention scores: using these weights, a new vector for the next token is calculated, which includes an attention score. Multi-head attention does this several times, focusing on different aspects of the words. Choosing the most likely word: a neural network takes these vectors with attention scores and picks the word from the vocabulary that most likely comes next. Adding to the sequence: the chosen word is added to the existing sequence, and the process repeats for each new word.
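The scoring-and-weighting steps above can be sketched as a single attention head. The tiny two-dimensional token vectors below are invented for illustration; real models use hundreds or thousands of dimensions and learned projection matrices.

```python
# Stripped-down attention: score each token against the query, turn the
# scores into weights with softmax, and build a weighted sum of the tokens.
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, keys_values):
    """Single attention head over toy vectors (keys and values shared)."""
    # Dot-product score: how much each word "shines its flashlight" here.
    scores = [sum(q * k for q, k in zip(query, kv)) for kv in keys_values]
    weights = softmax(scores)  # weights sum to 1.0
    dim = len(query)
    # The attended representation is the weighted sum of the token vectors.
    return [sum(w * kv[i] for w, kv in zip(weights, keys_values))
            for i in range(dim)]

tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # toy embeddings
context = attend(query=[1.0, 1.0], keys_values=tokens)
print(context)  # a blend of all token vectors, weighted by relevance
```

Multi-head attention would run `attend` several times with different learned projections of the same tokens and concatenate the results; the sketch above shows just one head.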
the process repeats for each new word so let's use gp4 is an example for how this
let's use gp4 is an example for how this entire process works explained in a
entire process works explained in a simplified manner a Transformer model
simplified manner a Transformer model like gp4 works by taking a text input
like gp4 works by taking a text input and producing a well structured output
and producing a well structured output during training it learns from a vast
during training it learns from a vast array of text Data understand
array of text Data understand understanding how words are typically
understanding how words are typically arranged in sentences the model knows
arranged in sentences the model knows the correct sequence of words but hides
the correct sequence of words but hides future words to learn how to predict
future words to learn how to predict them when it tries to predict a word it
them when it tries to predict a word it Compares its guess to the actual word
Compares its guess to the actual word gradually adjusting to reduce errors in
gradually adjusting to reduce errors in practice the model uses its training to
practice the model uses its training to aside importance to each word in a
aside importance to each word in a sequence helping it guess the next word
sequence helping it guess the next word accurately the result is that gp4 can
accurately the result is that gp4 can create sentences that sound like they
create sentences that sound like they were written by a human however this
were written by a human however this doesn't mean the model knows things or
doesn't mean the model knows things or is intelligent in the human sense it's
is intelligent in the human sense it's it's simply very good at using its large
it's simply very good at using its large vocabulary and training to generate
vocabulary and training to generate realistic text base on word
realistic text base on word relationships so that's an overview of
relationships so that's an overview of attention in a Transformer
attention in a Transformer [Music]
[Music] model hey this is Andrew Brown from exam
Hey, this is Andrew Brown from Exam Pro, and in this section we'll be going over an introduction to Azure OpenAI Service. Azure OpenAI Service is a cloud-based platform designed to deploy and manage advanced language models from OpenAI. This service combines OpenAI's latest language model development with the robust security and scalability of Azure's cloud infrastructure. Azure OpenAI offers several types of models for different purposes. GPT-4 models: these are the newest in the line of GPT models and can create text and programming code when given a prompt written in natural language. GPT-3.5 models: similar to GPT-4, these models also create text and code from natural language prompts. The GPT-3.5 Turbo version is specially designed for conversations, making it a great choice for chat applications and other interactive AI tasks. Embedding models: these models turn written text into number sequences, which is helpful for analyzing and comparing different pieces of text to find out how similar they are. DALL-E models: these models can make images from descriptions given in words. The DALL-E models are still being tested and are shown in the Azure OpenAI Studio, so you don't have to set them up for use manually. Key concepts in using Azure OpenAI include prompts and completions, tokens, resources, deployments, prompt engineering, and various models. Prompts and completions: users interact with the API by providing a text command in English known as a prompt, and the model generates a text response, or completion. For example, a prompt to count to five in a loop results in the model returning appropriate code. Tokens: Azure OpenAI breaks down text into tokens, which are words or character chunks, to process requests. The number of tokens affects response latency and throughput. For images, token cost varies with image size and detail setting, with low-detail images costing fewer tokens and high-detail images costing more. Resources: Azure OpenAI operates like other Azure products, where users create a resource within their Azure subscription. Deployments: to use the service, users must deploy a model via the deployment APIs, choosing the specific model for their needs. Prompt engineering: crafting prompts is crucial, as they guide the model's output. This requires skill, as prompt construction is nuanced and impacts the model's response. Models: various models offer different capabilities and pricing. DALL-E creates images from text, while Whisper transcribes and translates speech to text. Each has unique features suitable for different tasks. So that's an overview of Azure OpenAI Service.
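The prompt/completion round trip described above typically looks like the sketch below when using the official `openai` Python SDK against an Azure OpenAI resource. The endpoint, key, deployment name, and API version are all placeholders, not values from the course.

```python
# Building the role-tagged message list a chat model expects: the system
# message sets constraints, the user message carries the actual prompt.

def build_messages(system_message, user_prompt):
    return [
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You are a helpful assistant.",
    "Write Python code to count to five in a loop.",
)

# The actual call needs a provisioned Azure OpenAI resource and a deployed
# model (all values below are placeholders/assumptions):
#
#   from openai import AzureOpenAI
#   client = AzureOpenAI(
#       azure_endpoint="https://<your-resource>.openai.azure.com",
#       api_key="<your-key>",
#       api_version="2024-02-01",
#   )
#   response = client.chat.completions.create(
#       model="<your-deployment-name>",  # the deployment, not the model family
#       messages=messages,
#   )
#   print(response.choices[0].message.content)  # the completion

print(messages[0]["role"], "->", messages[1]["role"])
```

Note that with Azure OpenAI the `model` argument refers to your deployment name, which is why the "deployments" concept above matters: you deploy a model first, then address it by that name.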
The next topic we'll be covering is Azure OpenAI Studio. Developers can work with these models in Azure OpenAI Studio, a web-based environment where AI professionals can deploy, test, and manage LLMs that support generative AI app development on Azure. Access is currently limited due to the high demand, upcoming product improvements, and Microsoft's commitment to responsible AI. Presently, collaborations are being prioritized for those who already have a partnership with Microsoft, are engaged in lower-risk use cases, and are dedicated to including the necessary safeguards. In Azure OpenAI Studio you can deploy large language models, provide few-shot examples, and test them in Azure OpenAI Studio's chat playground. The image shows Azure OpenAI's chat playground interface, where users can test and configure an AI chatbot. In the middle, there's a chat area to type user messages and see the assistant's replies. On the left, there's a menu for navigation and a section to set up the assistant, including a reminder to save changes. On the right, adjustable parameters control the AI's response behavior, like length, randomness, and repetition. Users enter queries, adjust settings, and observe how the AI responds to fine-tune its performance. So that's an overview of Azure OpenAI Studio.
Let's take a look at the pricing for the models in Azure OpenAI Service, starting off with the language models. We have GPT-3.5 Turbo with a context of 4K tokens, costing $0.0015 for prompts and $0.002 for completions per 1,000 tokens. Another version of GPT-3.5 Turbo can handle a larger context of 16K tokens, with prompt and completion costs increased to $0.003 and $0.004 respectively. GPT-3.5 Turbo 1106, with a 16K context, has no available pricing. GPT-4 Turbo and GPT-4 Turbo with Vision both have an even larger context size of 128K tokens but also have no listed prices. The standard GPT-4 model with an 8K token context costs 3 cents for prompts and 6 cents for completions, and a larger-context version of GPT-4 with 32K tokens costs 6 cents for prompts and 12 cents for completions. There are other models, such as the base models, fine-tuning models, image models, embedding models, and speech models. They all have their respective pricing, but we won't be going through each of them in a lot of detail. Essentially, they are all on a pay-per-use pricing model; it could be pay-per-hour or pay-per-token and so on. The higher quality the model, the more expensive it will likely be. So that's an overview of Azure OpenAI Service pricing.
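Because prompt and completion tokens are billed at different rates, a quick cost estimate multiplies each count by its per-1,000-token price. The figures below are the ones quoted above, treated as illustrative; Azure pricing changes over time, so check the official price list before relying on them.

```python
# Back-of-the-envelope cost estimate from per-1,000-token prices (USD).
# Prices are assumptions taken from the lecture's examples.

PRICES_PER_1K = {  # (prompt_price, completion_price)
    "gpt-3.5-turbo-4k":  (0.0015, 0.002),
    "gpt-3.5-turbo-16k": (0.003, 0.004),
    "gpt-4-8k":          (0.03, 0.06),
    "gpt-4-32k":         (0.06, 0.12),
}

def estimate_cost(model, prompt_tokens, completion_tokens):
    prompt_price, completion_price = PRICES_PER_1K[model]
    return (prompt_tokens / 1000) * prompt_price \
         + (completion_tokens / 1000) * completion_price

# e.g. a 500-token prompt with a 1,000-token completion on GPT-4 (8K):
print(round(estimate_cost("gpt-4-8k", 500, 1000), 4))  # 0.075
```

This also shows why completions usually dominate the bill: for these models the completion rate is roughly double the prompt rate.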
Hey, this is Andrew Brown from Exam Pro, and the next topic we'll be going over is copilots. Copilots are a new type of computing tool that integrates with applications to help users with common tasks using generative AI models. They are designed using a standard architecture, allowing developers to create custom copilots tailored to specific business needs and applications. Copilots might appear as a chat feature beside your document or file, and they utilize the content within the product to generate specific results. Creating a copilot involves several steps: training a large language model with a vast amount of data; utilizing services like Azure OpenAI Service, which provide pre-trained models that developers can either use as-is or fine-tune with their own data for more specific tasks; deploying the model to make it available for use within applications; and building copilots that prompt the models to generate usable content, enabling business users to enhance their productivity and creativity through AI-generated assistance. Copilots have the potential to revolutionize the way we work. These copilots use generative AI to help with first drafts, information synthesis, strategic planning, and much more. Let's take a look at a few examples of copilots, starting with Microsoft Copilot. Microsoft Copilot is integrated into various applications to assist users in creating documents, spreadsheets, presentations, and more by generating content, summarizing information, and aiding in strategic planning. It is used across Microsoft's suite of products and services to enhance user experience and efficiency. Next we have the Microsoft Bing search engine, which has an integrated copilot to help users when browsing or searching the internet by generating natural language answers to questions and understanding the context of the questions, providing a richer and more intuitive search experience. Microsoft 365 Copilot is designed to be a partner in your workflow. Integrated with productivity and communication tools like PowerPoint and Outlook, it's there to help you craft effective documents, design spreadsheets, put together presentations, manage emails, and streamline other tasks. GitHub Copilot is a tool that helps software developers, offering real-time assistance as they write code. It offers more than suggesting code snippets: it can help in thoroughly documenting the code for better understanding and maintenance. Additionally, Copilot contributes to the development process by providing support for testing code, ensuring that developers can work more efficiently and with fewer errors. So that's an overview of copilots.
Hey, this is Andrew Brown from ExamPro, and the next topic we'll be covering is prompt engineering. Prompt engineering is a process that improves the interaction between humans and generative AI. It involves refining the prompts or instructions given to an AI application to generate higher-quality responses. This process is valuable for both the developers who create AI-driven applications and the end users who interact with them. For example, developers may build a generative AI application for teachers to create multiple-choice questions related to text students read. During the development of the application, developers can add other rules for what the program should do with the prompts it receives.

System messages: prompt engineering techniques include defining a system message. The message sets the context for the model by describing expectations and constraints — for example, "You're a helpful assistant that responds in a cheerful, friendly manner." These system messages determine constraints and styles for the model's responses.
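As a rough sketch, in the chat-message format used by OpenAI-style APIs (which Azure OpenAI also follows), the system message is simply the first entry in the list of messages sent to the model. The helper function and example prompt below are illustrative, not from the course materials:

```python
# Sketch: how a system message frames every request in the chat format.
# The build_messages helper and example prompts are illustrative only.

def build_messages(system_message: str, user_prompt: str) -> list[dict]:
    """Assemble a chat-completions-style message list.

    The system message sets expectations and constraints;
    the user message carries the actual prompt.
    """
    return [
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You're a helpful assistant that responds in a cheerful, friendly manner.",
    "Create a list of 10 things to do in Edinburgh during August.",
)

# The first entry is always the system message.
assert messages[0]["role"] == "system"
print(messages[1]["content"])
```

The same system message is resent with every request, which is how it keeps constraining the model's style across a whole conversation.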
Writing good prompts: to maximize the utility of AI responses, it is essential to be precise and explicit in your prompts. A well-structured prompt, such as "Create a list of 10 things to do in Edinburgh during August," directs the AI to produce a targeted and relevant output, achieving better results. Zero-shot learning refers to an AI model's ability to correctly perform a task without any prior examples or training on that specific task. One-shot learning involves the AI model learning from a single example or instance to perform a task.

Here is an example of prompt engineering with a user query and system response. The user inputs: "Can my camera handle the rainy season if I go to the Amazon rainforest next week?" Some of the prompt engineering components could be: a weather-resistance feature check, the user's equipment database, rainforest climate data, product specifications, travel tips for photographers, etc. For the LLM processing, the AI system integrates the user's question with data about the Amazon's climate, specifically during the rainy season, and the product information about the camera's weather-resistance features. It also references a database of the user's equipment to ensure it's talking about the correct item, and may include travel tips that are useful for photographers heading to similar climates. The output results in: "Your current camera model, the Mark V, is designed with a weather-sealed body suitable for high humidity and rain conditions, which matches the expected weather in the Amazon rainforest for next week. However, for added protection during heavy rains, consider using a rain cover."

Next, let's take a look at the prompt engineering workflow. This image describes a simplified step-by-step process for working with AI models and prompt engineering: one, task understanding — know what you want the AI to do; two, craft prompts — write instructions for the AI; three, prompt alignment — make sure instructions match what the AI can do; four, optimizing prompts — improve the instructions for better AI responses; five, AI model processing — the AI thinks about the instructions; six, generating output — the AI gives an answer or result; seven, output refinement — fix or tweak the AI's answer; eight, iterative improvement — keep improving the instructions and answers. So that's an overview of prompt engineering.
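The zero-shot versus one-shot distinction above comes down to how many worked examples the prompt carries. A minimal sketch (the task and review texts here are made up for illustration):

```python
# Sketch: zero-shot vs. one-shot prompts differ only in whether the
# prompt includes a worked example before the actual task.

task = "Classify the sentiment of this review as positive or negative."
review = "The battery life on this camera is fantastic."

# Zero-shot: the model gets the task with no prior examples.
zero_shot_prompt = f"{task}\n\nReview: {review}\nSentiment:"

# One-shot: the model first sees a single worked example.
example = "Review: The lens cap broke after one day.\nSentiment: negative"
one_shot_prompt = f"{task}\n\n{example}\n\nReview: {review}\nSentiment:"

print(zero_shot_prompt)
print("---")
print(one_shot_prompt)
```

Few-shot prompting extends the same idea by including several worked examples instead of one.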
The next topic we'll be covering is grounding. Grounding in prompt engineering is a technique used with large language models where you provide specific, relevant context within a prompt. This helps the AI produce a more accurate and related response. For example, if you want an LLM to summarize an email, you would include the actual email text in the prompt along with a command to summarize it. This approach allows you to leverage the LLM for tasks it wasn't explicitly trained on, without the need to retrain the model.
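The email-summarization example above can be sketched as simple prompt construction — the grounding step is just pasting the source text into the prompt. The template wording and email text below are illustrative assumptions:

```python
# Sketch: grounding a summarization prompt by embedding the source email.
# The template and email text are made-up illustrations.

def grounded_summary_prompt(email_text: str) -> str:
    """Build a prompt that grounds the model in the actual email text."""
    return (
        "Summarize the following email in one sentence.\n\n"
        "Email:\n"
        f"{email_text}"
    )

email = (
    "Hi team, the quarterly review is moved to Friday at 2pm. "
    "Please update your slides before Thursday."
)
prompt = grounded_summary_prompt(email)
print(prompt)
```

Because the email is carried in the prompt itself, the model needs no prior knowledge of it — that is the essence of grounding.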
So, what's the difference between prompt engineering and grounding? Prompt engineering broadly refers to the art of crafting effective prompts to produce the desired output from an AI model. Grounding specifically involves enriching prompts with relevant context to improve the model's understanding and responses. Grounding ensures the AI has enough information to process the prompt correctly, whereas prompt engineering can also include techniques like format, style, and the strategic use of examples or questions to guide the AI.

The image outlines a framework for grounding options in prompt engineering within the context of large language models. Grounding options: these are techniques to ensure LLM outputs are accurate and adhere to responsible AI principles. Prompt engineering: placed at the top, indicating its broad applicability; this involves designing prompts to direct the AI toward generating the desired output. Fine-tuning: a step below in complexity, where LLMs are trained on specific data to improve their task performance. Training: the most resource-intensive process, at the triangle's base, suggesting it's used for more extensive customization needs. LLMOps and responsible AI: these foundational aspects emphasize the importance of operational efficiency and ethical standards across all stages of LLM application development. So that's an overview of grounding.
Hey, this is Andrew Brown from ExamPro, and in this demo we'll be going over a short demo of what you can do with Copilot with GPT-4 on Microsoft Bing. To get here, you'll need to search for something like "Copilot Bing" and click on "Try Copilot," and you should be able to access this page. On here you have some suggested or popular prompts that people commonly use, such as "create an image of a concept kitchen," "generate ideas for wacky new products," "how would you explain AI to a sixth grader," "write Python code to calculate all the different flavor combinations for my ice cream parlor," and so on. You can choose the conversation style, ranging from More Creative for more original and imaginative ideas, More Balanced, or More Precise for more factual information. We'll be going with somewhere in the middle, so More Balanced, just for this example.

On the bottom here you can type in any prompt you want. For example, we can type something simple like "summarize the main differences between supervised and unsupervised learning for the AI-900 exam," and you'll see that it will start generating an answer for you. For supervised learning — data labeling: in supervised learning, the training data is pre-labeled with the correct output values — and it provides other objectives and examples as well. For unsupervised learning — no labels: unsupervised learning operates without labeled data; it seeks to discover patterns, structures, or relationships within the raw data. Notice how it uses sources from the internet, and if you want to learn more, you can click on the links it provides to go directly to the source of the information, which is very convenient. So let's quickly check one out — and it seems like the information we got was pretty good and credible. On the bottom, it also provides us some suggestions for follow-up questions you may want to ask that are related to the previous prompt.

Another cool feature of Copilot is that it's integrated with DALL-E 3, which is an image generation service. For example, you can say something like "create an image of a cute dog running through a green field on a sunny day." Now you'll have to wait a little bit for it to generate the image that you described in your prompt — and there we go, we have an adorable little puppy running through the fields. You also have the power to modify images if you're not satisfied with the result, so they've provided a few options for you here. For example, we can add a rainbow in the background, change it into a cat, or make the sky pink and purple. Let's try changing it to a cat. So it's going to generate and change it from a dog to a cat — and there we go, it's now a cute little cat running through the field.

You can also write code using Copilot. For example, I can type in "write a Python function to check if a given number is prime," and it'll start generating a piece of code for me.
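For reference, a prime-checking function like the one Copilot generates in the demo might look something like this sketch (an illustrative version, not Copilot's exact output):

```python
# Sketch of a prime-checking function similar to what Copilot produces.
import math

def is_prime(n: int) -> bool:
    """Return True if n is a prime number."""
    if n < 2:
        return False
    # Only trial-divide up to the integer square root of n.
    for divisor in range(2, math.isqrt(n) + 1):
        if n % divisor == 0:
            return False
    return True

print([x for x in range(20) if is_prime(x)])  # → [2, 3, 5, 7, 11, 13, 17, 19]
```

Limiting trial division to the square root is the standard optimization: any factor larger than √n would pair with one smaller than √n that was already checked.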
It can write code in multiple languages, not just Python, so let's try out something with JavaScript. Let's try "create a JavaScript function to reverse a string," and of course we'll need to wait for the code to generate. So there we go — here's our code for the function to reverse a string, just as we asked for. So that's a really quick and general demo for Copilot with GPT-4.
Hey, this is Andrew Brown from ExamPro, and in this follow-along we're going to set up a studio with the Azure Machine Learning service, so that it will be the basis for all the follow-alongs here. So what I want you to do is go all the way to the top here and type in "Azure Machine Learning" — you're looking for the one that looks like a science bottle — and we'll go ahead and create ourselves our Machine Learning studio. So I'll create a new one here, and I'll just say "my studio," and we'll hit OK, and we'll name the workspace — we'll maybe say "ml workplace" here. For containers there are none, so it'll create all that stuff for us. I'll hit Create, and what we're going to do here is just wait for that creation.

All right, so after a short little wait there, it looks like our studio is set up, so we'll go to that resource, launch the studio, and we are now in. There's a lot of stuff in here, but generally the first thing you'll ever want to do is get yourself a notebook going. So in the top-left corner I'm going to go to Notebooks, and what we'll need to do is load some files in here. Now, they do have some sample files, like how to use Azure ML, so if we just quickly go through here, maybe we'll want to look at something like MNIST, and we'll go ahead and open this one, and maybe we'll just go ahead and clone it over here.

Okay, and the idea is that we want to get this notebook running, and notebooks have to be backed by some kind of compute — so up here it says "no compute found," etc. So what we can do here — I'm just going to go back to my files; oh, it went back there for me — is go all the way down. Actually, I'll just expand this up here, which makes it a bit easier, and close this tab out. What we'll do is go down to Compute, and here we have our four types of compute: compute instances are for when we're running notebooks, compute clusters are for when we're doing training, inference clusters are for when we have an inference pipeline, and attached compute is for bringing things like HDInsight or Databricks in here. Compute instances are what we need, so we'll go ahead and hit New. You'll notice we have the option between CPU and GPU — GPU is much more expensive; see, it's like 90 cents per hour for a notebook. We do not need anything super powerful; notice it says here "development on notebooks, IDEs, lightweight testing," and here it says "classical ML model training, AutoML pipelines," etc. So I want to make this a bit cheaper for us, because we're going to be using the notebook to run Cognitive Services, and those cost next to nothing — they don't take much compute power — and for some other ones we might do something a bit larger. For this, this is good enough, so I'll go ahead and hit Next. I'm just going to say "my notebook instance" here, we'll go ahead and hit Create, and we're just going to have to wait for that to finish creating and running — when it is, I'll see you back here in a moment.

All right, so after a short little wait there, it looks like our server is running, and you can even see here it shows you can launch in JupyterLab, Jupyter, VS Code, RStudio, or the terminal. But what I'm going to do is go back all the way to our Notebooks, just so we have some consistency here. I want you to notice that it's now running on this compute — if it's not, you can go ahead and select it — and it also loaded in Python 3.6. There is 3.8; right now it's not a big deal which one you use, but that is the kernel — how it will run this stuff. Now, this is all interesting, but I don't want to run this right now; what I want to do is get those Cognitive Services files in here. So what we can do is just go up here and choose Editors > Edit in JupyterLab, and what that should do is open up a new tab here. Is it opening? If it's not opening, what we can do is go to Compute — sometimes it's a bit more responsive if we just click there; it's the same way of getting to it. I don't know why, but sometimes that link just doesn't work when you're in the notebook. Now that we're in here, we can see that this is where this example project is, but what we want to do is get those cognitive services files in here. So, I don't know if I showed it to you yet, but I have a repository — I just have to go find it; it's somewhere on my screen — here it is. So I have a repo called "the free a" — it should be "AI 900"; I think I'll go ahead and change that, or that is going to get confusing.

So what I want you to do here is get this loaded in — this is a public repository. There are a couple of ways we can do it: we could use the terminal to grab it, but what I'm going to do is just download the zip, which is one of the easiest ways to install it. We need to place it somewhere, so here are my downloads, and I'm just going to drag it out here, and what we'll do is upload that there. I can't remember if it lets you upload entire folders — we'll give it a go and see if it lets us. Maybe rename this to "the free a" or "AI 900" there, and we'll say Open. Yeah, so it's individual files, so it's not that big of a deal, but we can go ahead and select them like that, and maybe we'll just make a new folder in here — we'll call it "cognitive services." Okay, and what we'll do here is just keep on uploading some stuff. So we have assets — I have a couple of loose files there — and I know we have "crew" (oops, it's not as responsive), we want "OCR," and I believe we have one called "movie reviews." So we'll go into OCR here and upload the files that we have — we have a few files there — and we'll go back a directory, and I know the movie reviews are just static objects. Then we'll go back into crew, and we need a folder called "Worf," a folder called "Crusher," and a folder called "Data," and for each of these we have some images. I think we're on Worf, right? Yep, we are — okay, great. So we will quickly upload all of these. Well, technically we don't really need to upload these images — we just upload them directly to the service — but because I'm already doing it anyway, I'm just going to put them here even though we're not going to do anything with them. All right, and so now we are all set up
to do some cognitive services so I'll see you in the next video all right so
see you in the next video all right so now that we have our work environment
now that we have our work environment set up what we can do is go ahead and
set up what we can do is go ahead and get cognitive Services hooked up because
get cognitive Services hooked up because um we need that service in order to
um we need that service in order to interact with it because if we open up
interact with it because if we open up any of these you're going to notice we
any of these you're going to notice we have a cognitive key and endpoint that
have a cognitive key and endpoint that we're going to need so what I want you
we're going to need so what I want you to do is go back to your Azure portal
to do is go back to your Azure portal and at the top here we'll type in
and at the top here we'll type in cognitive
cognitive Services now the thing is is that all
Services now the thing is is that all these services are individualized but at
these services are individualized but at some point they did group them together
some point they did group them together and you're able to use them through
and you're able to use them through unified um key and API Point that's what
unified um key and API Point that's what this is and that's the way we're going
this is and that's the way we're going to do it so we'll say
to do it so we'll say add and uh it brought us to the
add and uh it brought us to the marketplace so I'm just going to type in
services and then just click this one here here and we'll hit
here here and we'll hit create and uh we'll make a new one here
create and uh we'll make a new one here I'm just going to call my uh Cog
I'm just going to call my uh Cog Services say Okay um I prefer to be in
Services say Okay um I prefer to be in Us East I will leave in US West it's
Us East I will leave in US West it's fine and so in here we'll just say my
fine and so in here we'll just say my Cog
Cog services and if it doesn't like that
services and if it doesn't like that I'll just put some numbers in there we
I'll just put some numbers in there we go we'll do standard so we will be
go we'll do standard so we will be charged something for that let's go take
charged something for that let's go take a look at the
a look at the pricing
pricing so you can see that the pricing is uh
so you can see that the pricing is uh quite variable here but uh it's like
quite variable here but uh it's like you'd have to do a thousand transactions
you'd have to do a thousand transactions before you are build uh so I think we're
before you are billed uh so I think we're
going to be okay for billing uh we'll
checkbox this here we'll go down below
it's telling us about responsible AI
notice uh sometimes services will
actually have you checkbox it but in
this case it just tells us
and I don't believe this took very long
so we'll give it a second here yep it's
all deployed so we'll go to this
resource here and what we're looking for
are our keys and
endpoints uh and so we have two keys and
two endpoints we only need a single key
so I'm going to copy this endpoint over
we're going to go over to JupyterLab
and I'm just going to paste this in here
I'm just going to put it in all the ones
that need it so this one needs
one this one needs
one this one needs
one and this one needs
one and we will show the key here I
guess it doesn't show but it copies of
course I will end up deleting my key
before you ever see it but this is
something you don't want to share
publicly and usually you don't want to
embed keys directly into a notebook but
uh this is the only way to do it so it's
just how it is with Azure um so yeah all
our keys are installed going back to
Cognitive Services uh nothing super
exciting here but it does tell us
what services work with it you'll see
there's an asterisk beside Custom Vision
because we're going to access that
through another app um but uh yeah
Cognitive Services is all set up and so
that means we are ready to uh start
doing some of these labs
[Music]
okay all right so let's take a look here
at computer vision first and computer
vision is actually used for a variety of
different services as you will see it's
kind of an umbrella for a lot of
different things but the one in
particular that we're looking at here is
describe image in stream if we go
over here to the documentation this
operation generates a description of an image
in a human-readable language with complete
sentences the description is based on a
collection of content tags which are also
returned by the operation okay so let's
go see what that looks like in action so
the first thing is that um we need to
install this Azure Cognitive Services
Vision computer vision package now we do have a
kernel and these aren't installed by
default they're not part of the um uh
Azure Machine
Learning uh SDK for Python I believe
that's pre-installed but uh these AI
services are not so what we'll do is go
ahead and run it this way and you'll
notice where it says pip install that's
how it knows to install and once that is
done we'll go run our requirements here
so we have the OS module which is usually for
handling OS-layer stuff we have
matplotlib which is to
visually plot things and we're going to
use that to show images and draw borders
we need to handle images and I'm not sure
if we're using NumPy here but I have
NumPy loaded and then here we have the
Azure Cognitive Services Vision computer
vision we're going to load the client
and then we have the credentials and
these are the generic credentials for
Cognitive Services they're
commonly used for most of these services
with some exceptions where the APIs do not
support them yet but I imagine they will
in the future so just notice that when
we run something it will show a number
and if there's an asterisk it means it hasn't
run yet so I'll go ahead and hit play up
here so it goes to an asterisk and we get a
two and we'll go ahead and hit play
again and now those are loaded in and so
we'll go ahead and hit
play okay so here we've just packaged
our credentials together so we passed
our key into here and then we'll now
load in the client uh and so we'll pass
our endpoint and our key okay so we hit
play and so now we just want to load our
image so here we're loading assets/
data.jpg let's just make sure that that
is there so we go to assets and there it is
and we're going to load it as a stream
because you have to pass streams along
so we'll hit play and you'll see that it
now ran and so now we'll go ahead and
make that
call okay great and so we're getting
some data back and notice we have some
properties person wall indoor man
pointing captions it's not showing all
the information sometimes you have to
extract it out but we'll take a look
here so uh this is a way of showing
matplotlib inline I don't think we have to
run it here but I have it in here anyway
and so what it's going to do is it's
going to um show us the image right so
it's going to print us the image and
it's going to grab whatever caption is
returned so see how there's captions so
we're going to iterate through the
captions and it's going to give us a
confidence score saying it thinks it's
this so let's see what it comes out
with okay and so here it says Brent
Spiner looking at a camera so
that is the actor who plays Data on Star
Trek it has a confidence score of 57.45%
even though it's 100% correct uh they
probably don't know contextual things
um uh in the sense of like pop
culture they probably don't know
Star Trek characters but they're going
to be able to identify celebrities
because they're in their database so that
is um uh the first introduction to
computer vision there but the
key things you want to remember here are
that we use this describe image in
stream operation uh and that we get this
confidence score and we get this
contextual information okay and so
that's the first one we'll move on to um
maybe Custom Vision
next
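Before moving on, here's a rough sketch of what those notebook cells are doing with the azure-cognitiveservices-vision-computervision SDK. The variable names and the `best_caption` helper are mine for illustration, not the course's exact code:

```python
def best_caption(analysis):
    """Pick the highest-confidence caption from a describe-image result.

    Uses plain dicts (the shape of the raw JSON response) so the helper
    is easy to test offline; the SDK's result object exposes the same
    fields as attributes instead.
    """
    captions = analysis.get("captions", [])
    if not captions:
        return None, 0.0
    top = max(captions, key=lambda c: c["confidence"])
    return top["text"], top["confidence"]


def describe_image(image_path, endpoint, key, max_candidates=1):
    """Call Describe Image in Stream through the Python SDK.

    Requires the azure-cognitiveservices-vision-computervision package;
    endpoint and key come from the resource's Keys and Endpoint blade.
    """
    # Imported here so the pure helper above works without the SDK installed.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as image_stream:
        return client.describe_image_in_stream(
            image_stream, max_candidates=max_candidates)
```

The takeaway from the lab is visible right in the shape of the response: a caption plus a confidence score, built from the returned content tags.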
all right so let's take a look at Custom
Vision so we can do some um
classification and object detection so
um the thing is that it's possible
to launch Custom Vision
through the marketplace so if we go
we're not going to do it this way if you
type in Custom Vision it never shows up
here but if you go to the marketplace
here and type in custom
vision and you go here you can create it
this way but the way I like to do it I
think it's a lot easier to do is we go
up to the top here and type in
customvision.ai and you'll come to this website
and what you'll do is go ahead and sign
in it's going to connect to your Azure
account and once you're in you can go
ahead here and create a new project so
the first one here is I'm just going to
call this the Star Trek crew we're going
to use this to identify different Star
Trek members we'll go down here and uh
we haven't yet created a resource so
we'll go create
new my custom vision
resource we'll drop this down we'll put
this in our Cog Services uh we'll go
stick with um US West as much as we can
here we have F0 and S0 uh F0 is blocked
for me so I'll just choose S0 I think F0 is
the free tier but I don't get
it and um once we're back here we'll go
down below and choose our standard and
we're going to have a lot of options
here so we have between classification
and object detection so classification
is when you have an image and you just
want to say what is this image
right and so we have two modes where we
can say
let's apply multiple labels so let's say
there were two people in the photo or
whether there was a dog and a cat I think
that's the example they use a dog and a cat
or you just have a single class where
it's like what is the one thing that is
in this photo it can only be one of
the particular categories this is the
one we're going to do multiclass and
we have a bunch of different domains
here and if you want to you can go ahead
and read about all the different domains
and their best use cases but we're going
to stick with A2 this is optimized
so that it's faster right and that's
really good for our demo so we're going
to choose General A2 I'm going to go
ahead and create this
project and uh so now what we need to do
is start labeling our content so
um what we'll do is I just want to go
ahead and create the tags ahead of time
so we'll say
Worf we'll have uh Data and we'll have
Crusher and now what we'll do is we'll
go ahead and upload those images so you
know we uploaded them to the Jupyter notebook
but it was totally not necessary so here
is Data because we're going to do it all
through here and we'll just apply the
Data tag to them all at once which saves
us a lot of time I love that uh we'll
upload now uh
Worf and I don't want to upload them all
I have this one quick test image we're
going to use to make sure that this
works
correctly and I'm going to choose
Beverly there she is Beverly
Crusher okay so we have all our
images in I don't know how this one got
in here but it's under Worf it works
out totally fine so uh what I want to
do is uh go ahead and train this model
because they're all labeled so we have a
ground truth and we'll let it go ahead
and train so we'll go and press train
and we have two options quick training
or advanced training advanced training is
where we can increase the time for
better accuracy but honestly uh we just
want to do quick training so I'll go
ahead and do quick training and it's
going to start its iterative process
notice on the left hand side we have
probability threshold the minimum
probability score for a prediction to be
valid when calculating precision
and recall so uh the thing is that
if it doesn't at least meet that
requirement it will quit out and if it
gets above that then it might quit out
early just because it's good enough okay
so training doesn't take too long it
might take 5 to 10 minutes I can't
remember how long it takes but uh what
I'll do is I'll see you back here in a
moment okay all right so after waiting a
short little while here it looks like our
results are in we get a 100% um match
here so these are our evaluation metrics
to say whether uh the model
achieved its actual goal or not so we
have precision recall and I believe this
is average precision uh and so it says
that it did a really good job so that
means that it should have no problem um
matching up an image so in the top right
corner we have this button that's called
quick test and this is going to give us
the opportunity to uh quickly test these
so what we'll do is browse our files
locally here and uh actually I'm going
to go to uh yeah we'll go here and we
have Worf uh and so I have this quick
image here we'll test and we'll see if
it actually matches up to
Worf and it says 98.7% Worf that's pretty
good I also have some additional images
here I just put into the repo to test
against and we'll see what it matches up
to because I thought it'd be interesting to
do something that is not necessarily uh
them but it's something pretty close to
um you know it's pretty close to what
those are okay so we'll go to crew here
and first we'll try
Hugh okay and Hugh is a Borg so he's
kind of like an android and so we can
see he mostly matches to Data so that's
pretty good uh we'll give another one a go
Martok is a Klingon so he should be
matched up to Worf very strong match to
Worf that's pretty good and then Pulaski
she is a doctor and female so she should
get matched up to Beverly Crusher and
she does so this works out pretty darn
well uh and I hadn't even tried that so
it's pretty exciting so now let's say we
want to go ahead and well if we
wanted to um make predictions we could
do them in bulk here um I believe that
you could do them in bulk but
anyway yeah I guess I always thought
this was like I could have sworn yeah if
we didn't have these images before I
think that it actually has an upload
option it's probably just the quick test
so I'm a bit confused there um but
anyway so now that this is ready what we
can do is go ahead and publish it uh so
that it is publicly accessible so we'll
just say here a crew
model okay and we'll drop that down say
publish and once it's published now we
have this uh public URL so this is an
endpoint that we can go hit
programmatically uh I'm not going to do
that I mean we could use Postman to do
that um but my point is that we've
basically uh figured it out for um
classification so now that we've done
classification let's go back here to uh
the vision site here and now let's go
ahead and do object detection
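For reference, hitting that published endpoint programmatically looks roughly like this. It's a sketch of the Custom Vision prediction REST route rather than anything shown in the video, and the project ID, published model name, and helper names are placeholders:

```python
def classify_url(endpoint, project_id, published_name):
    """Build the Custom Vision prediction URL for an image-classification
    iteration published under `published_name` (e.g. a crew model)."""
    return (f"{endpoint.rstrip('/')}/customvision/v3.0/Prediction/"
            f"{project_id}/classify/iterations/{published_name}/image")


def predict(endpoint, project_id, published_name, prediction_key, image_path):
    """POST raw image bytes to the published model and return the parsed
    JSON, which contains a `predictions` list of tagName/probability pairs."""
    import json
    import urllib.request

    with open(image_path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(
        classify_url(endpoint, project_id, published_name),
        data=body,
        headers={"Prediction-Key": prediction_key,
                 "Content-Type": "application/octet-stream"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is essentially what Postman would do for you: one POST with the image bytes and the Prediction-Key header.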
[Music]
okay all right so we're still in Custom
Vision let's go ahead and try out object
detection so object detection is when
you can identify particular items in a
scene um and so this one's going to be
combadge that's what we're going to call
it because we're going to try to detect
combadges we have more domains here we're
going to stick with the General
A1 and we'll go ahead and create this
project
here and so what we need to do is add a
bunch of images I'm going to go ahead
and create our tag which is going to be
called combadge uh you could look for
multiple different kinds of labels but
then you need a lot of images so we're
just going to keep it simple and have
that there I'm going to go ahead and add
some images and we're going to go back
um a couple steps here into our objects
and here I have a bunch of photos and we
need at least 15 to train so we got one
two 3 4 5 6 7 8 9 10 11 12 13 14 15 16
and so I threw an additional image in
here this is the badge test so we'll
leave that out and we'll see if that
picks up really well and yeah we got
them all here and so we'll go ahead and
upload those and we'll hit upload
files
okay and we'll say done and we can now
begin to label so we'll click into here
and what I want to do if you hover over
it it should start detecting things if it
doesn't you can click and drag we'll
click this one they're all combadges so
we're not going to tag anything else
here okay okay so go here hover over is
it going to give me the combadge no so
I'm just clicking and dragging to
get it
okay okay do we get this combadge
yes do we get this one
yep so simple as
that okay it doesn't always get it but
uh in most cases it
does okay it didn't get that one so we'll
just drag it
out
it's interesting like that one's
pretty clear but uh it's interesting
what it picks out and what it
does not grab eh so it's not getting
this one probably because the photo
doesn't have enough
contrast and this one has a lot hoping
that that gives us more data to work
with here yeah I think the higher the
contrast the easier it is for it to uh
um detect those
it's not getting that
one it's not getting that one okay there
we
go yes there are a lot I know I have
some of these ones that are packed but
there's only like three photos that are
like
this yeah they have badges but they're
slightly different so we're going to
leave those
out oops I think it actually had that
one but we'll just tag it
anyway and hopefully this will be worth
the uh the effort
here there we go I think that was the last
one okay great so we have all of our
tagged photos and what we can do is go
ahead and train the model same options
quick training advanced training we're
going to do a quick training here and
notice that the options are slightly
different we have probability threshold and
then we have overlap threshold the
minimum percentage of overlap between
predicted bounding boxes and ground
truth boxes to be considered a correct
prediction so I'll see you back here
when it is done all right so after
waiting a little while here it
looks like um it's done it's trained and
so precision is at 75% so precision this
number will tell you when a tag is
predicted uh by your model how likely
that prediction is to be right so how likely did
it guess right then you have recall so
this number will tell you out of the tags
which should be predicted correctly what
percentage did your model correctly find
so we have 100% uh and then you have
mean average precision this number will
tell you the overall object detector
performance across all the tags okay so
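Those metrics are easy to sketch by hand. Below is an illustrative computation (not Azure's internal code): a predicted box only counts as a true positive when its overlap with a ground-truth box, measured as intersection-over-union, meets the overlap threshold, and precision and recall then follow from the counts:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (left, top, width,
    height), the same shape Custom Vision uses for bounding boxes."""
    ax1, ay1, aw, ah = box_a
    bx1, by1, bw, bh = box_b
    ax2, ay2 = ax1 + aw, ay1 + ah
    bx2, by2 = bx1 + bw, by1 + bh
    ix = max(0.0, min(ax2, bx2) - max(ax1, bx1))  # overlap width
    iy = max(0.0, min(ay2, by2) - max(ay1, by1))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0


def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: of the predictions made, how many were right.
    Recall: of the objects that exist, how many were found."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall
```

With numbers like the transcript's: 3 correct predictions out of 4 made gives 75% precision, and finding all 3 real badges gives 100% recall.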
what we'll do is we'll go ahead and uh
do a quick test on this model and we'll
see how it does I can't remember if I
actually even ran this so it'll be
curious to see the first one here um
it's not as clearly visible it's part of
their uniform so I'm not expecting it to
pick it up but we'll see what it does it
picks up pretty much all of them with
the exception of this one which is definitely not a
combadge but uh that's okay it only shows
suggestions with probabilities above
the selected
threshold so if we increase
it uh we'll just bring it down a bit so
there it kind of improves um if we
move it around back and forth okay so I
imagine via the API we could choose that
let's go look at our other sample image
here
um I'm not seeing
it uh where did I save it let me just
double check make sure that it's in the
correct directory here
okay yeah I saved it to the wrong place
just a
moment
um I will place it
one second okay and so I'll just browse here
again and so here we have another one
see if it picks up the badge right here
there we go so it looks like it worked so
uh yeah I guess Custom Vision is uh
pretty easy to use and uh pretty darn
good so what we'll do is close this off
and make our way back to our Jupyter
labs to move on to um our next uh
lab here
[Music]
okay all right so let's move on to the
face service so just go ahead and double
click there on the left hand side and
what we'll do is work our way from the
top so the first thing we need to do is
make sure that we have the computer
vision package installed so the face service is
part of the computer vision API and once
that is done we'll go ahead and uh do
our imports very similar to the last one
but here we're using the face client
we're still using the cognitive
service credentials we will populate our
keys we make the face client and
authenticate and we're going to use the
same image we used um uh prior with our
computer vision so the Data one there
and we'll go ahead and print out the
results and so we get an object back so
it's not very clear what it is but here
if we hit
show okay here it's Data and it's
identifying the face ID so going through
this code we're just saying open the
image we're going to uh set up our
figure for plotting uh it's going to say
well how many faces did it detect in the
photo and so here it says detected one
face it will iterate through it and then
we will create a bounding box around the
images we can do that because it returns
back the face rectangle so we get a top
left right etc and uh we will draw that
rectangle on top so we have magenta I
could change it to like three if I
wanted to uh I don't know what the other
colors are so I'm not even going to try
but yeah there it is and then we
annotate with the face ID that's the
unique identifier for the face and then
we show the image okay so that's one
we show the image okay so that's one and then if we wanted to get more detailed
then if we wanted to get more detailed information like attribute such as age
information like attribute such as age emotion makeup or gender uh this
emotion makeup or gender uh this resolution image wasn't large enough so
resolution image wasn't large enough so I had to find a different image and and
I had to find a different image and and do that so that's one thing you need to
do that so that's one thing you need to know as if it's not large enough it
know as if it's not large enough it won't process it so we're just loading
won't process it so we're just loading data
data large very similar process but it
large very similar process but it is uh the same thing detect with stream
is uh the same thing detect with stream but now we're passing
but now we're passing in um return face attributes and so here
in um return face attributes and so here we're saying the attributes we want uh
we're saying the attributes we want uh and there's that list and we went
and there's that list and we went through it in the lecture content and so
through it in the lecture content and so here we'll go ahead and run this and so
here we'll go ahead and run this and so we're getting more information so that
we're getting more information so that magenta line is a bit hard to see I'm
magenta line is a bit hard to see I'm just going to increase that to
just going to increase that to three okay still really hard to see but
three okay still really hard to see but that's okay so approximate age 44 I
that's okay so approximate age 44 I think the actor was a bit younger than
think the actor was a bit younger than that uh uh data technically is male
that uh uh data technically is male presenting but he's an Android so he
presenting but he's an Android so he doesn't necessarily have a gender I
doesn't necessarily have a gender I suppose he actually is wearing a lot of
suppose he actually is wearing a lot of makeup but all it detects is it I guess
makeup but all it detects is it I guess it's only particular on the lips and the
it's only particular on the lips and the eyes so it says he doesn't have makeup
eyes so it says he doesn't have makeup so maybe there's a color you know like
so maybe there's a color you know like ey Shadow stuff maybe we would detect
ey Shadow stuff maybe we would detect that in terms of personality I like how
that in terms of personality I like how it's he's a a 002 Point per SB but he's
it's he's a a 002 Point per SB but he's neutral right uh so just going through
neutral right uh so just going through the code here very quickly so again it's
the code here very quickly so again it's the number of faces so it detected one
the number of faces so it detected one face uh and then we draw a bounding box
face uh and then we draw a bounding box around the face for the detected
around the face for the detected attributes it's uh return back in the
attributes it's uh return back in the data here so we just say get the pH
data here so we just say get the pH attributes turn it into a dictionary and
attributes turn it into a dictionary and then we can just uh get those values and
then we can just uh get those values and uh iterate over it so that's as
uh iterate over it so that's as complicated as it is um and so there we
complicated as it is um and so there we [Music]
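The bounding-box step can be sketched like this. The small helper and the placeholder ENDPOINT/KEY names are my own illustration, not code from the lab notebook, and the commented calls assume the azure-cognitiveservices-vision-face package:

```python
def rect_to_patch_args(face_rectangle):
    """Convert a Face API face_rectangle (top/left/width/height) into the
    ((x, y), width, height) arguments matplotlib.patches.Rectangle wants."""
    r = face_rectangle
    return ((r["left"], r["top"]), r["width"], r["height"])

# Shape of the rectangle the service returns for one detected face:
print(rect_to_patch_args({"top": 54, "left": 86, "width": 120, "height": 120}))
# ((86, 54), 120, 120)

# The SDK calls, roughly as in the notebook (ENDPOINT/KEY are placeholders):
# from azure.cognitiveservices.vision.face import FaceClient
# from msrest.authentication import CognitiveServicesCredentials
# face_client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))
# with open("data.jpg", "rb") as image_stream:
#     faces = face_client.face.detect_with_stream(
#         image_stream,
#         return_face_attributes=["age", "emotion", "makeup", "gender"])
# for face in faces:
#     xy, w, h = rect_to_patch_args(face.face_rectangle.as_dict())
#     print(face.face_id, xy, w, h)
```

The same helper works for the attributes pass, since detect-with-stream returns the face rectangle either way.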
[Music] go all right so we're on to uh our next
go all right so we're on to uh our next cognitive service let's take a look at
cognitive service let's take a look at form recognizer all right and so form
form recognizer all right and so form recognizer uh it tries to identify um
recognizer uh it tries to identify um like forums and turns them into readable
like forums and turns them into readable things and so they have one for uh
things and so they have one for uh receipts in particular so at the top
receipts in particular so at the top finally we're not using um the computer
finally we're not using um the computer uh computer vision we actually have a
uh computer vision we actually have a different one so this one's Azure AI
different one so this one's Azure AI form recognizer so we'll run that there
form recognizer so we'll run that there but this one in particular isn't up to
but this one in particular isn't up to date in terms of using it like um notice
date in terms of using it like um notice all the other ones they're using uh the
all the other ones they're using uh the cognitive service credential so for this
cognitive service credential so for this we actually had to use the Azure key uh
we actually had to use the Azure key uh credential which was annoying I tried to
credential which was annoying I tried to use the other one to be consistent um
use the other one to be consistent um but I I couldn't use it okay so what
but I I couldn't use it okay so what we'll do is run our keys like before we
we'll do is run our keys like before we have a client very similar
have a client very similar process and this time we actually have a
process and this time we actually have a receipt and so we have begin recognize
receipt and so we have begin recognize receipt so it's going to analyze the
receipt so it's going to analyze the receipt information and then it's what
receipt information and then it's what it's going to do is show us the image
it's going to do is show us the image okay just so we have a reference to look
okay just so we have a reference to look at now the image isn't actually yellow
at now the image isn't actually yellow it's a white background I don't know why
it's a white background I don't know why when it renders out here it does that
when it renders out here it does that but that's just what
but that's just what happens and uh it even obscures the
happens and uh it even obscures the server name I I don't know why um but
server name I I don't know why um but anyway if we go down below um this is
anyway if we go down below um this is return results up here right so we got
return results up here right so we got our results and so if we just print out
our results and so if we just print out uh the results here we can see we get a
uh the results here we can see we get a recognized form back we get fields and
recognized form back we get fields and some additional things and if we go into
some additional things and if we go into the uh the fields itself we see there's
the uh the fields itself we see there's a lot more information if you can make
a lot more information if you can make out like here it says Merchant phone
out like here it says Merchant phone number form field label value and
number form field label value and there's the number
there's the number 512707 so for these things here like um
512707 so for these things here like um the
the receipts if we can just find the API
receipts if we can just find the API quickly here it has predefined
quickly here it has predefined Fields I'm not sure um yeah business
Fields I'm not sure um yeah business card
card Etc um like if we just type in
Etc um like if we just type in merchant I'm just trying to see if
merchant I'm just trying to see if there's a big old list here it's not
there's a big old list here it's not really showing us a full list but these
really showing us a full list but these are are predefined um things that are
are are predefined um things that are returned right so they've defined those
returned right so they've defined those uh maybe it's over here
uh maybe it's over here there we go so these are the predefined
there we go so these are the predefined ones that extracts out so we have uh
ones that extracts out so we have uh receipt type Merchant name etc etc and
receipt type Merchant name etc etc and so if we go back to here you can see um
so if we go back to here you can see um I I have the field called Merchant name
I I have the field called Merchant name so we hit there it says Alm draft out
so we hit there it says Alm draft out Cinema let's say we want to try to get
Cinema let's say we want to try to get that balance maybe we can try to figure
that balance maybe we can try to figure out which one it is I never ran this
out which one it is I never ran this myself when I I made it so we'll see
myself when I I made it so we'll see what it is but here it has total price
what it is but here it has total price what's interesting is that these this
what's interesting is that these this has a space so it's kind of unusual ual
has a space so it's kind of unusual ual you think it'd be together but let's see
you think it'd be together but let's see if that
if that works okay doesn't like that maybe
works okay doesn't like that maybe that's just a typo on their part okay so
that's just a typo on their part okay so we get none uh let's try
we get none uh let's try price see what it picks
price see what it picks up nope nothing um we know that the
up nope nothing um we know that the phone number is there so we'll give the
phone number is there so we'll give the phone
phone number there we go so you know it's an
number there we go so you know it's an okay service but uh you know uh you know
okay service but uh you know uh you know you're you're mileage will vary based on
you're you're mileage will vary based on uh what you do there maybe we could try
uh what you do there maybe we could try total because that makes more sense
total because that makes more sense right uh yeah there we go okay great so
right uh yeah there we go okay great so yeah it is pulling out the information
yeah it is pulling out the information um and so that's pretty much all you
um and so that's pretty much all you need to know about that service there
need to know about that service there [Music]
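A tiny helper like this would have shortened the field-name guessing: list the keys that actually came back before indexing into them. `FormField` here is a plain stand-in for the SDK's field objects returned by begin-recognize-receipts, and the sample values are made up for illustration:

```python
class FormField:
    """Stand-in for azure-ai-formrecognizer's FormField (value only)."""
    def __init__(self, value):
        self.value = value

def field_value(fields, name):
    """Return the value of a recognized field, or None if the
    field name (e.g. 'MerchantName', 'Total') isn't present."""
    field = fields.get(name)
    return field.value if field is not None else None

# Illustrative stand-in for recognized_form.fields:
fields = {"MerchantName": FormField("Alamo Drafthouse Cinema"),
          "Total": FormField(14.5)}

print(sorted(fields))                      # see the real keys first
print(field_value(fields, "Total"))        # 14.5
print(field_value(fields, "Total Price"))  # None, not a real key
```

Printing `sorted(fields)` up front avoids the trial-and-error of guessing "Total Price" versus "Total".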
Okay, let's take a look at some of our OCR capabilities here, and I believe that's in Computer Vision, so we'll go ahead and open that up. At the top we'll install Computer Vision as we did before. It's very similar to the other Computer Vision tasks, but this time we have a couple of extra pieces that I'll explain as we go through. We'll load our keys, set up our credentials, and load the client. Then we have this function called printed text; what this function does is print out the results of whatever text it processes. The idea is that we feed in an image and it gives us back the text from the image. We'll run this function, and I have two different images, because I actually ran it on the first one and the results were terrible, so I got a second image and it was a bit better.

Okay, so we'll go ahead and run this; it's going to show us the image, and this is the photo. It was supposed to extract "Star Trek: The Next Generation", but because of the artifacts and the size of the image, what we get back is not English. Maybe with a higher-resolution image it would have a better time, but that is what we got back. So let's take a look at our second image and see how it did, and with this one I'm surprised: it actually extracts a lot more information. You can see it really has a hard time with the Star Trek font, but we get "Deep Space 9", "Nana Visitor tells all", "Life", "Death", with some errors, so it's not perfect, but you can see that it does something.

Now, this recognize-printed-text-in-stream call is the OCR path for very simple images and text. If we were doing this for larger amounts of text, and we wanted it analyzed asynchronously, then we want to use the Read API, and it's a little bit more involved. So what we'll do here is load a different image; we'll look at the image in a moment, but here we read in stream and we create these operations, and what it will do is asynchronously send all the information over. I think this is supposed to say "results" here, a minor typo, and we'll go ahead and give that a run. Here you can see it's extracting the text from the image. If we want to see this image, I thought I showed it here, but I guess I don't; it says plot the image here to show it, the path is up here, but it doesn't want to show it. It's funny, because the one up above is showing with no problem. Well, I can just show you the image, it's not a big deal, but I'm not sure why it's not showing up here today. So if we go to our assets here, I go to OCR and just open this up; it's opening in Photoshop, and this is what it's transcribing. This is like a guide to Star Trek, where they talk about what makes Star Trek Star Trek. Just looking here, it's actually pretty darn good. The Read API is a lot more efficient because it can work asynchronously, so when you have a lot of text, that's what you want to use. It's feeding in each individual line so that it can be more effective that way.

So let's go look at some handwriting. Just in case the image doesn't pop up, we'll go ahead and open this one. This is a handwritten note that William Shatner wrote to a fan of Star Trek, and it's basically incomprehensible. I don't know if you can read it: "he was very something, he was something hospital, and healthy was something", I can't even read it. So let's see what the machine thinks. It says image path; yeah, the variable is called path, so let's just change that, run that there, and run it, and here we get the image. So we get "very sick he was the hospital", "his Bey was", and so on, "beat nobody lost his family knew Captain Halden". It reads better than I could read it, honestly. It's really hard; if you looked at this, that looks like "difficult", and "was Beady healthy", I can see why it's guessing like that, and "dying" looks like "dying" to me, you know what I mean? It's just poorly handwritten, but I mean, the service is pretty good for what it is. So yeah, there you go.
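The Read API flow described above can be sketched with a stand-in result object. The commented client calls assume the azure-cognitiveservices-vision-computervision package; names like `client` and `image_stream` are placeholders, and only the flattening helper below actually runs here:

```python
from types import SimpleNamespace

def extract_lines(analyze_result):
    """Flatten Read API read_results pages into a flat list of line strings."""
    return [line.text
            for page in analyze_result.read_results
            for line in page.lines]

# Stand-in for a finished analyze_result, just to show the shape:
demo = SimpleNamespace(read_results=[
    SimpleNamespace(lines=[SimpleNamespace(text="DEEP SPACE 9"),
                           SimpleNamespace(text="NANA VISITOR TELLS ALL")])])
print(extract_lines(demo))  # ['DEEP SPACE 9', 'NANA VISITOR TELLS ALL']

# The notebook's asynchronous flow, roughly:
# import time
# read_op = client.read_in_stream(image_stream, raw=True)
# # The operation ID is the last segment of the Operation-Location header
# operation_id = read_op.headers["Operation-Location"].split("/")[-1]
# result = client.get_read_result(operation_id)
# while result.status in ("notStarted", "running"):   # poll until terminal
#     time.sleep(1)
#     result = client.get_read_result(operation_id)
# if result.status == "succeeded":
#     for text in extract_lines(result.analyze_result):
#         print(text)
```

The polling loop is what makes the Read API "a little bit more involved" than the simple synchronous OCR call: you submit the image, get an operation ID back, and check on it until the analysis finishes.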
All right, so let's take a look at another Cognitive Service, and this one is text analytics. What we'll do is install the Azure Cognitive Services Language Text Analytics package, so we go ahead and hit run. Once that's installed, this one uses the Cognitive Services credentials, so it's a little bit more standard with our other ones. We'll go ahead and run that, make our credentials, and load our client. With this one, what we're going to do is try to determine sentiment and understand why people like a particular movie or not. So I've loaded a bunch of reviews; I can show you the data if it helps, I'm just trying to find the right folder here. If we look at our movie reviews, here's a review someone wrote: "First Contact just works; it works as a rousing chapter in Star Trek, and to a lesser extent it works as mainstream entertainment." So, different reviews for Star Trek: First Contact, which was a very popular movie back in the day.

So what we'll do is load the reviews; it's just iterating through the text files and showing us what the reviews are, so here we can see all the written text. I had a lot of trouble getting the last one to display, but it does get loaded in. And here we're using text analytics to show us key phrases, because maybe that would give us an indicator; that's the object back, but maybe it gives us an indicator as to what people are saying the important things are. Here we see "Borg ship", "Enterprise", "smaller ship escapes", "neutral zone", "travels", "contact", "damage", "co-writer", "Beautiful Mind", "sophisticated science fiction", "best", "whales", "Leonard Nimoy", "wealth of unrealized potential", "filmmaker Jonathan Frakes". Very interesting stuff; you've seen "Borg ship" a lot, so that is key phrases.

Now let's get customer sentiment, or how people felt about it, did they like it or not. Here we just call sentiment, and what we'll do is: if the score is above 0.5 then it's a positive review, and if it's below 0.5 it's a negative review. I think most people thought it was a very good film. This one says it's pretty low, so let's take a look at that one. It wasn't actually rendering there, so maybe we'll have to open it up manually and see if that's actually accurate. It's empty, so there you go, I guess we had a blank one in there; I must have forgotten to paste it in, but that's okay, and it's a good indicator of what happens if you don't have any text. So let's look at number one then; well, actually, this one is 0.9, this is 0.4, and this one here is 0.8, so we'll open up the 0.8 one: when the Borg launch an attack on Earth, the Enterprise is sent to the Neutral Zone, et cetera; however, a smaller ship escapes and the Enterprise follows it back; meanwhile, the survivors... So this is a synopsis; it doesn't say whether they like it or not, but it was just 0.4, I guess because there's nothing positive in it, right? If we look at one that was pretty low, which is, no, it's not, it's review one. So it seems like this person probably really liked it, or, no, I guess that's actually pretty low, because it's 0.1, not 0.9; 0.9 is very high. Let's take a look at this one, review number two. If we go up here, the review says the show has improved, but there's a wealth of unrealized potential. So that's a fair one; maybe they don't like it as much, I don't know, maybe they'd give it two stars. We could probably actually correlate it with the actual ratings, because I did get these off of IMDb and Rotten Tomatoes, but yeah, there you go, that is text analysis.
[Music] analysis all right so now we're on to
analysis all right so now we're on to Q&A maker and so we're not going to need
Q&A maker and so we're not going to need to do anything pragmatically because Q&A
to do anything pragmatically because Q&A maker is all about no code or low code
maker is all about no code or low code to build out a questions and answers uh
to build out a questions and answers uh uh bot service so what we'll do is go
uh bot service so what we'll do is go all the way up here and I want you to
all the way up here and I want you to type in Q andm maker. a because as far
type in Q andm maker. a because as far as I'm aware of it's not accessible
as I'm aware of it's not accessible through the portal sometimes you can
through the portal sometimes you can find these things um again if we go to
find these things um again if we go to the
the marketplace I'm just curious I'm going
marketplace I'm just curious I'm going just take a look here really quickly uh
just take a look here really quickly uh whenever it decides to log Us in here
whenever it decides to log Us in here okay great so I'll go over to
okay great so I'll go over to Marketplace and probably if we typed in
Marketplace and probably if we typed in Q&A maybe we'd see something here
Q&A yeah so we go here um give it a second
here um give it a second here seems like Azure is a little bit
here seems like Azure is a little bit slow right
now usually varies fast but uh you know the service
the service varies well it's not loading for me
varies well it's not loading for me right now but that's okay because we're
right now but that's okay because we're not going to do it that way anyway um so
not going to do it that way anyway um so uh again go to Q&A maker. and what I
uh again go to Q&A maker. and what I want you to do is go all way to the top
want you to do is go all way to the top in the right corner and we'll hit sign
in the right corner and we'll hit sign in and what we'll be doing is connecting
in and what we'll be doing is connecting via our single sign on with our account
via our single sign on with our account so it already knows I have an account
so it already knows I have an account there I'm going to give it a moment here
there I'm going to give it a moment here and I'm going to go ahead and just give
and I'm going to go ahead and just give it a
second there we go so it says I don't have any
there we go so it says I don't have any um knowledge bases which is true so
um knowledge bases which is true so let's go ahead and create ourselves a
let's go ahead and create ourselves a new knowledge base and here we have the
new knowledge base and here we have the option between stable and preview I'm
option between stable and preview I'm going to stick with stable because I
going to stick with stable because I don't know what's in preview I'm pretty
don't know what's in preview I'm pretty happy with uh that and so we need to
happy with uh that and so we need to connect uh Q&A Service uh uh Q&A service
connect uh Q&A Service uh uh Q&A service to our knowledge base and so back over
to our knowledge base and so back over here in Azure actually I guess we do
here in Azure actually I guess we do have to make one now that I remember we
have to make one now that I remember we actually have to create a Q&A maker
actually have to create a Q&A maker service so I'll go down here and put
service so I'll go down here and put this under my Cog services we'll say
this under my Cog services we'll say my um
my um Q&A Q&A
Q&A Q&A service might complain about the name uh
service might complain about the name uh yep so I'll just put some numbers here
yep so I'll just put some numbers here we'll pick uh free tier sounds good so
we'll pick uh free tier sounds good so I'll go free when I actually get the
I'll go free when I actually get the option that's what I will choose um down
option that's what I will choose um down below we'll choose free again usse
below we'll choose free again usse sounds great to me uh it generates out
sounds great to me uh it generates out the name it's the same name as here so
the name it's the same name as here so that's fine uh we don't need app
that's fine uh we don't need app insights but I'm going to leave it
insights but I'm going to leave it enabled because I think it changes it to
enabled because I think it changes it to standard or s zero when you uh do
standard or s zero when you uh do not um have it enabled
not um have it enabled unusually and so we will create our Q&A
unusually and so we will create our Q&A maker service give it a moment
maker service give it a moment here and it says I remember it will say
here and it says I remember it will say like even if you try it might have to
like even if you try it might have to wait 10 minutes for it to create the
wait 10 minutes for it to create the service so even though even after it's
service so even though even after it's provisioned um it'll take some time so
provisioned um it'll take some time so what we should do is prepare our doc
what we should do is prepare our doc because it can take in a variety
because it can take in a variety different files and I just want to show
different files and I just want to show you here that uh the Q&A they have a
you here that uh the Q&A they have a whole paper here formatting the
whole paper here formatting the guidelines
guidelines and basically it's pretty smart about
and basically it's pretty smart about knowing where headings and answers is so
knowing where headings and answers is so for unstructured data we just have a
for unstructured data we just have a heading and we have some text so let's
heading and we have some text so let's write some things in here that we can
write some things in here that we can think of since we're all about
think of since we're all about certification we should write some stuff
certification we should write some stuff here so how many adus certifications are
here so how many adus certifications are there I believe right now there are uh
there I believe right now there are uh 11 uh adus
11 uh adus certifications
Okay, and maybe if we use our headings here, that would probably be a good idea. Next: How many fundamental Azure certifications are there? For the answer I'll say there are three. I think there are three; there are other ones, like the Power Platform stuff, but being Azure-specific we have the DP-900, the AI-900, the AZ-900... I guess there are four, because there's the SC-900 too, right? So: there are four fundamental Azure certifications.

Next: Which is the hardest Azure associate certification? And what we'll say here is, in my opinion, it's the Azure Administrator (there was some background noise there, which is why I was pausing a bit), the Azure Administrator, AZ-104. I would say that's the hardest.

And one more: Which certifications are harder? I would say Azure certifications are harder, because they check exact steps for implementation, whereas AWS focuses on concepts.
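Since the formatting guidelines boil down to "a question-style heading followed by answer text, with blank lines between blocks," here's a rough sketch of that layout. The parsing function is my own naive illustration, not anything QnA Maker actually runs; the real extraction happens server-side.

```python
# Sketch: a QnA-Maker-style unstructured FAQ file, plus a naive parser
# to show how "heading + answer text" blocks map to Q&A pairs.
# (The service's own extractor is far smarter than this.)

faq_text = """\
How many AWS certifications are there?
There are 11 AWS certifications.

How many fundamental Azure certifications are there?
There are four fundamental Azure certifications.

Which is the hardest Azure associate certification?
The Azure Administrator (AZ-104) is the hardest.
"""

def parse_faq(text):
    """Split blank-line-separated blocks into (question, answer) pairs."""
    pairs = []
    for block in text.strip().split("\n\n"):
        lines = block.splitlines()
        question, answer = lines[0], " ".join(lines[1:])
        pairs.append((question, answer))
    return pairs

pairs = parse_faq(faq_text)
for q, a in pairs:
    print(q, "->", a)
```

This is just to make the file layout concrete before we upload it.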
Okay, so we have a bit of a knowledge base document here, so I'll save it. Assuming the service is ready, since we needed a little time to put this together, we'll go back to QnA Maker, hit refresh, give it a moment, drop the list down, and choose our service. Notice that we have chit-chat options here alongside extraction. For the name, I'll say (this is just a reference name you can change any time) something like "certification Q&A". Then we want to populate it, so we'll go to files; I'm going to browse to my desktop, and here it is, so I'll open it. We'll choose the Professional chit-chat tone, go ahead and create, and I'll see you back here in a moment.
All right, so after waiting a short little time, it loaded in our data. You can see it figured out which part is the question and which is the answer, and it also has a bunch of chit-chat defaults. So if somebody asks something silly like "can you cry," it'll say "I don't have a body." It has a lot of information pre-loaded for us, which is really nice. If we want to test it, we can open the test panel and type something in; say "good morning," and it answers. Okay, so now we'll ask: how many certifications are there? We didn't say AWS, but let's just see what happens.
And so it kind of inferred the AWS answer even though we didn't say AWS in particular. Notice that there are both AWS and Azure entries, like "how many fundamental Azure certifications," and it chose the AWS one. So it's not a perfect service, but it's pretty good. I wonder what would happen if we added an Azure-specific pair. I don't know how many Azure certifications there are; 11, 12, I can never remember, they're always adding more. Let me close this... there we go. So let's add a new QnA pair: "How many Azure certification are there?" (I should have said "certifications"; I'll fix that in a moment.) And the answer: there are 12 Azure certifications. Who knows how many they really have; it could be 14 or something, we could say between 11 and 14. They add and update them too frequently for me to keep track. So we'll go back and fix the question to say "certifications," then save and retrain, and just wait here a moment. Great.
So now let's test this again. We'll say: how many certifications are there? And see, it's pulling the first answer. If I add "Azure," let's see if it gets the right one: how many Azure certifications are there? Okay, so maybe you'd have to have a generic entry for that match. So if we go back to "how many certifications are there," we could follow up by asking something like "which cloud service provider?" and offer AWS and Azure as choices. That's what follow-up prompts are for. As the description says, follow-up prompts guide the user through a conversational flow; prompts are used to link QnA pairs together and can be displayed as buttons for suggested actions. I haven't used this yet, but it sounds pretty good, because there's multi-turn support here; the idea is that if you had to go through multiple steps, you could absolutely do that.
you could absolutely do that um we try this a little bit here uh follow prompt
this a little bit here uh follow prompt you can use the guide use convert
you can use the guide use convert prompts are used to link Q&A pairs
prompts are used to link Q&A pairs together texture button for suggested
together texture button for suggested action oh okay so maybe we just do like
action oh okay so maybe we just do like AWS link to Q&A and then so search an
AWS link to Q&A and then so search an existing Q&A or create a new one um so
existing Q&A or create a new one um so it say like how many eight of
it say like how many eight of us okay we're typing it
us okay we're typing it in context only this Falls up will not
in context only this Falls up will not be understood out of the context flow
be understood out of the context flow sure because it should be within context
sure because it should be within context right and uh here we can do another one
right and uh here we can do another one we say like um
we say like um Azure we'll say how many
azure context only oops it uh got away from me
there we'll save that and uh what we'll do is save and
So we go back to the test panel and say: how many certifications are there? Hit enter, and now we're prompted to choose AWS or Azure. And there we go, we've got something that works pretty well. Since I'm happy with it, we can go ahead and publish; we'll say publish. Now that it's published, we could use Postman or curl to trigger it, but what I want to do next is create a bot.
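Before moving on, here's a sketch of what that direct Postman/curl-style call looks like from Python. The host, knowledge-base ID, and endpoint key are placeholders; the real values come from the Publish page in the QnA Maker portal. This only builds the request without sending it.

```python
# Sketch of calling a published QnA Maker knowledge base over REST,
# the same thing Postman or curl would do. Host, KB id, and key below
# are placeholders, not real values.
import json
import urllib.request

def build_generate_answer_request(host, kb_id, endpoint_key, question):
    """Build (but don't send) the generateAnswer POST request."""
    url = f"{host}/qnamaker/knowledgebases/{kb_id}/generateAnswer"
    headers = {
        "Authorization": f"EndpointKey {endpoint_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

req = build_generate_answer_request(
    "https://my-bot.azurewebsites.net",          # placeholder host
    "00000000-0000-0000-0000-000000000000",      # placeholder KB id
    "<endpoint-key>",                            # placeholder key
    "How many Azure certifications are there?",
)
print(req.full_url)
# Sending it with urllib.request.urlopen(req) returns JSON whose
# "answers" list contains the matched answer and a confidence score.
```

With real values filled in, the response's top answer is the same one the test panel shows.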
it but what I want to do is create a bot because with Azure bot Services then we
because with Azure bot Services then we can actually utilize it um with other
can actually utilize it um with other IND ations right it's a great way to uh
IND ations right it's a great way to uh um use your Bot or to actually host your
um use your Bot or to actually host your Bot so we'll go over here it'll link it
Bot so we'll go over here it'll link it over uh if you don't click it it doesn't
over uh if you don't click it it doesn't preload it in so it's kind of a pain if
preload it in so it's kind of a pain if you lose it you have to go back there
you lose it you have to go back there and click it again but uh let's just say
and click it again but uh let's just say um
um certification q and
certification q and day and we will look through here so all
day and we will look through here so all going to go with free premium messages
going to go with free premium messages 10K 1K premium message units messages
10K 1K premium message units messages I'm kind of confused by the pricing but
I'm kind of confused by the pricing but F0 usually means free so that's what I'm
F0 usually means free so that's what I'm going to go for that SDK or nodejs I'm
going to go for that SDK or nodejs I'm going to use NOS not that we're going to
going to use NOS not that we're going to do anything there with it go ahead and
do anything there with it go ahead and create
create that and I don't think this takes too
that and I don't think this takes too long we'll see
I'll just click on that there and wait a bit; I'll see you back here in a moment. All right, after waiting, I don't know, about 5 minutes, it looks like our bot service is deployed, so we'll go to that resource. You can download the bot source code; I've actually never done this, so I don't know what it looks like, and I'd be curious to see the code. I assume that because we chose Node.js, it would give us that as the default. "Download your bot source code... creating the source zip." Not sure how long this takes; I might be regretting clicking on that. But in the meantime, let's go to Channels on the left-hand side, because I just want to show you something here. Yeah, the download didn't happen; we'll try again in a second. But hmm, it's saying "unspecified bot"... what are you talking about? Yeah, maybe it needs some time.
So you know, maybe we'll just give the bot a little bit of time here. I'm not sure why it's giving us a hard time, because this bot is definitely deployed; if we go over to Bot Services, it's right there. Sometimes there's latency with Azure... oh, there we go, okay, see, it works now, fine. So I want to show you that there are different channels, and these are just easy ways to integrate your bot into different services: Alexa, GroupMe, Skype, Telephony, Twilio, Skype for Business (apparently they don't have that anymore, because we've got Teams now, right?), Kik (I don't know if people still use that), Slack (I wish they had Discord), Telegram, Facebook, email. That's kind of cool, but Teams is a really good one; I use Teams. There's a Direct Line channel, which I don't know much about, and there's Web Chat, which just gives you an embed code. So we can go and test it over here in Web Chat, and it's the same thing as before: we ask things like "how many certifications are there," pick Azure, and get a clear answer back. We'll go back up to our overview and try to download that source code again; I was kind of curious what it looks like. Must be a lot of code, eh?
There we go, now we can hit download, and there's the code. I'm going to go ahead and open that up. I guess when we chose JavaScript, that made a lot more sense. Let's give it a little peek: I'm just going to drop this on my desktop, make a new folder called "bot code" (I know you can't see what I'm doing here), double-click into it, and drag the code in. Then we can open it up in VS Code; I should have VS Code running somewhere around here. I'm off-screen, so I'll show you my screen in a moment: File, Open Folder, bot code. Okay, so we've got a lot of code here. I've never looked at this before, but I'm a pretty good programmer, so it's not too hard for me to understand. It looks like you've got API request handling, things like that; I guess the point is that if you needed to integrate it into your own application, it shows you all the code. I'm just trying to find our dialog choices... nothing super exciting.
Okay. You know, when I go and make the AI-100, or whatever the data scientist course is, I'm sure I'll be a lot more thorough here, but I was just curious what it looks like. Now, if we wanted an easy integration, we can get an embed code. So if we go back to Channels, I believe, and hit edit... ah yes, here we have the code. What I'll do is go back to JupyterLab and make a new empty notebook; we'll just go up here and say notebook, and this can be for our Q&A. It doesn't really matter which kernel. We'll call it "Q and A maker," just to show you a very, very simple way of embedding your bot. We go back over to... wherever it is... ah, here we are, and I'm going to copy this iframe. I think it's the %%html cell magic, which treats the cell as HTML, and I don't have any other HTML to render, so we'll paste that in. Notice we have to replace the secret key, so I'll go back, show my key, copy it, paste the key in, and then we'll run the cell.
this and I can type in here where am I just ask silly
how many Azure certifications are there well I wonder
certifications are there well I wonder if I just leave the are there off let's
if I just leave the are there off let's see if it figures it out okay cool
see if it figures it out okay cool so uh yeah I mean that's pretty much it
so uh yeah I mean that's pretty much it with Q&A
with Q&A maker um so yeah that's great so I think
maker um so yeah that's great so I think we're done here and we can move on to uh
we're done here and we can move on to uh checking out uh leis or Luis learning
checking out uh leis or Luis learning understanding to make a more uh robust
understanding to make a more uh robust bot
okay [Music]
All right, so we are on to our last cognitive service, and this one is going to be LUIS, or "Louise," depending on how you like to say it; it's Language Understanding. You type in luis.ai, and that brings us up to an external website; it's still part of Azure, it just has its own domain. Here we'll choose our subscription, and we have no authoring resource, so I guess we'll have to go ahead and create one ourselves. Go down here and choose the cognitive services Azure resource, so "my Cog service," or create a new cognitive services account; but we already have one, so I don't want to make another one, right? It should show up here. Hmm, only certain resources are valid in the authoring region, so it's possible we're just in the incorrect region, which means we might end up creating two of these, and that's totally fine, I don't care, as long as we get this working, because we're going to delete everything at the end anyway. So we'll just say "my Cog service 2" and choose West US, because I think maybe we didn't pick one of the supported regions. Let's go double-check: back in the portal (this is just a limitation of the service, right?), we'll go to my Cog Services, into the cognitive services resource, just to see where it's deployed... and it's in West US? I don't know why it's not showing up there, but whatever; if that's what it wants, we'll give it what it wants. Okay, it shouldn't give us this much trouble.
But hey, that's how it goes. And so now we have an authoring resource. I'm going to refresh here and see if it added a second one; it didn't, so, all right, that's fine. We'll create the app, say "my sample bot," and we'll use English as our culture. If nothing shows up in the resource dropdown, don't worry; you can choose it later on. I remember the first time I did this, it didn't show up. And now we have my Cog service and my custom vision service; we want the Cog service. So, anyway, it tells you about the schema, how you make a schema, with animations talking about bot actions, intents, and example utterances, but we're just going to set up something very simple here. We're going to create our intent; the one we always see is flight booking, so I'll go here and do that. And then what we want to do is write an utterance, like "book me a flight to Toronto."
undering so like uh book May flight to Toronto okay and so if someone were to
Toronto okay and so if someone were to type that in then the idea is it would
type that in then the idea is it would return back the intent this value and
return back the intent this value and metadata around it and we could
metadata around it and we could programmatically provide code right so
programmatically provide code right so what we need is identity identities and
what we need is identity identities and we can actually just click here and uh
we can actually just click here and uh make one here so enter named identity
make one here so enter named identity we'll just call this
we'll just call this location okay here we have an option
location okay here we have an option machine learned and list if you flip
machine learned and list if you flip between it this is like imagine you have
between it this is like imagine you have a ticket order and you have these values
a ticket order and you have these values that can uh change or you just have a
that can uh change or you just have a value that always stays the same like
value that always stays the same like list so that's our
list so that's our airport that makes sense we'll do
airport that makes sense we'll do that if we go over to ENT entities we
that if we go over to ENT entities we can see it
can see it here all right so uh nothing super
All right, so nothing super exciting there, but what I want to show you next: we should probably rename the intent. "Flight booking" should be... how about "BookFlight"? Okay. So we'll go ahead, and I know there's only one utterance, but we'll go ahead and train our model; we don't need to know tons here, since we cover a lot in the lecture content, and building a complex bot is more for the associate level. But now what we can do is go ahead and test this. We'll say: "book me a flight to Seattle." Okay, and notice here it says BookFlight. We can go inspect it and get some additional data, like the top-scoring intent, which says how likely it was that this was the intent.
Okay, so you get kind of an idea there. There are additional things here, but they don't really matter. We'll go back and publish our model, so we can put it into a production slot. You can see there are options for sentiment analysis and speech priming; we don't care about either of those things. Then we can go and see where our endpoint is, and so now we have an endpoint that we can work with.
endpoint that we can work with um so yeah I mean that's pretty much all you
yeah I mean that's pretty much all you really need to learn about Lewis um but
really need to learn about Lewis um but uh I think we're all done for cognitive
uh I think we're all done for cognitive services so we're going to keep around
services so we're going to keep around our our notebook because um we're going
our our notebook because um we're going to still use our jupyter notebook for
to still use our jupyter notebook for some other things things but what I want
some other things things but what I want you to do is make your way over
you to do is make your way over to um your resource groups because if
to um your resource groups because if you've been pretty clean it's all within
you've been pretty clean it's all within here we'll just take a look here so we
here we'll just take a look here so we have our
have our Q&A all of our stuff here I'm just
Q&A all of our stuff here I'm just making sure it's all there and so I'm
making sure it's all there and so I'm just going to go ahead and delete this
just going to go ahead and delete this Resource Group and that should wipe away
Resource Group and that should wipe away everything okay for the cognitive
everything okay for the cognitive Services
Services part all right so we're all good here
All right, so we're all good here. I'm going to leave this open, because it's always a pain to get back to it and reopen it, but let's make our way back to the home of Azure Machine Learning Studio, and now we can actually explore building machine learning pipelines.

Okay, so we are on to the ML follow-alongs here, where we're going to learn how to build some pipelines. First, I think the easiest would be Automated ML, also known as AutoML. The idea here is that it builds out the entire pipeline for us, so we don't have to do any thinking: we just say what kind of model we want to run and have it make a prediction.
what we'll do is a new automated ML and we're going to need a data set so I
we're going to need a data set so I don't have one but the nicest thing is
don't have one but the nicest thing is they have these open data sets so if you
they have these open data sets so if you click here you'll see there is a bunch
click here you'll see there is a bunch here and a lot of these you'll come
here and a lot of these you'll come across quite often not just on Azure but
across quite often not just on Azure but other places like this diabetes one I've
other places like this diabetes one I've seen it like everywhere okay uh and so
seen it like everywhere okay uh and so like if we just go click here maybe we
like if we just go click here maybe we can read a bit more here so diabetes
can read a bit more here so diabetes data set 422 samples with 10 features
data set 422 samples with 10 features ideal for getting started with machine
ideal for getting started with machine learning algorithms it's one of the
learning algorithms it's one of the popular pyit learn toy data sets it's
popular pyit learn toy data sets it's probably where I've seen it before
probably where I've seen it before though it's not showing up there uh you
though it's not showing up there uh you scroll on down you can see the data uh
scroll on down you can see the data uh you notice that it's available AZ your
you notice that it's available AZ your notebooks data bricks and Azure synapse
notebooks data bricks and Azure synapse uh the thing is we have these values so
uh the thing is we have these values so age sex BMI BP and the Y is trying to
age sex BMI BP and the Y is trying to make a prediction it's trying to say
make a prediction it's trying to say what's the likelihood of you having
what's the likelihood of you having diabetes or not and so it's not a
diabetes or not and so it's not a Boolean value so it's not a binary
Boolean value so it's not a binary classifier it's kind of on a uh well I
classifier it's kind of on a uh well I guess you would be doing binary classif
guess you would be doing binary classif classification say do you have di
classification say do you have di diabetes or you can make a prediction to
diabetes or you can make a prediction to say what's the likelihood or this value
say what's the likelihood or this value if you gave another value in there but
if you gave another value in there but um anyway you this is the predicting
um anyway you this is the predicting value a lot of times this is X so
value a lot of times this is X so everything here is X and this is
everything here is X and this is considered y the actual prediction um so
considered y the actual prediction um so some sometimes it's why and sometimes
some sometimes it's why and sometimes it's actually named what it is uh but
it's actually named what it is uh but that's just what it is here so we'll
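If you want to poke at the same dataset locally, here's a quick sketch. I'm assuming scikit-learn is installed; this is the same toy dataset, just loaded in code rather than through the Azure open datasets page:

```python
# Load the classic diabetes toy dataset from scikit-learn.
# X holds the feature columns (age, sex, bmi, bp, s1..s6);
# y is the continuous disease-progression value the model predicts.
from sklearn.datasets import load_diabetes

data = load_diabetes()
X, y = data.data, data.target

print(X.shape)            # 442 samples, 10 features
print(data.feature_names)
print(y[:3])              # continuous values, so a regression target
```

Because y is continuous rather than 0/1, this naturally sets up as a regression problem, which matters in a moment when AutoML guesses the task type.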
So we'll close that off, choose the diabetes set, and it will be dataset one; we'll worry about feedback later. We'll click on the sample diabetes dataset and hit next, and here it's going to try to figure out what kind of model we want. We have to create a new experiment, which is a container to run the model in, so we'll just call it my-diabetes. It sounds a bit odd, but that's what it is. The target column we want to train to predict is the Y; it's usually the Y. We don't have a compute cluster, so I'll go ahead and create a new compute. We have dedicated or low priority. Technically this job is low priority, but with low priority you don't get dedicated compute nodes and your job may be preempted, and I just want this done, so I'm going to stick with dedicated for the time being, and we're going to stick with CPU. If we go with the small size, it takes about an hour to run; when I ran this it took about an hour. If you don't mind waiting, it's only going to cost you 15 cents, but if you want this done a lot sooner, you can try something a little more powerful. I'm just trying to decide here, because if it only takes an hour I might run it on something more powerful at 90 cents. That might be overkill, because it's not really deep learning, it's just statistical stuff. This one says it's for training on large datasets, and I wouldn't say ours is large; the others are for real-time inference and latency-sensitive workloads. I'm just comparing: this one's 29, this one's more expensive but it has 32 GB of RAM, this one is 28, oh, 14 GB of RAM, oh, that's storage. So this one's the highest in the tier. Again, you can choose the cheap one, you just have to wait a lot longer; I just want to see if it finishes a lot faster without having to go up to the GPU level, because I don't think a GPU is going to help too much here. The compute name is my-diabetes-machine. The minimum number of nodes is how many you want to provision if you want dedicated nodes standing by; for the maximum, I guess I just want one node. We will go ahead and, oops, it's complaining about the compute name length. Is it too long? Okay, there we go. It's going to spin up the cluster, and it does take a little bit of time to start, so I'll see you back here when this is done.
Okay, great, so after a short little wait it looks like our cluster is running. If we double-check under compute, I believe it shows up under the compute clusters tab, and there it is. Notice it's slightly different: one tab shows you applications and this one is just size and so on. We can click in here and see nodes and run times. We'll make our way back and hit next, and notice that it will actually preselect the task type, because it looks at your prediction column, maybe samples a bit of it, and says you probably want regression, to predict a continuous numeric value. The thing is, if the target was a label like text, or if it was just zero and one, it probably would choose classification. Since you saw our Y value was a number that was all over the place, it thinks it's regression, and I think that's a good indicator. So let's go with regression.
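The studio's guess can be pictured as a simple heuristic. To be clear, this is my own toy illustration of the idea, not Azure's actual logic: string labels or only a handful of distinct values look like classification, while a spread of numeric values looks like regression:

```python
def guess_task_type(target_values, max_distinct_for_classes=10):
    """Toy heuristic: few distinct values or string labels suggest
    classification; many distinct numeric values suggest regression."""
    distinct = set(target_values)
    if any(isinstance(v, str) for v in distinct):
        return "classification"
    if len(distinct) <= max_distinct_for_classes:
        return "classification"
    return "regression"

print(guess_task_type([0, 1, 1, 0, 1]))    # zero/one labels -> classification
print(guess_task_type(list(range(100))))   # spread-out numbers -> regression
```

Our diabetes Y column is "a number all over the place", so a rule like this lands on regression, which matches what the studio suggested.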
You might want it as a binary classifier, but that's another story. As soon as we created it, it just started; it didn't give us the option to say, hey, I want to start running it now. Notice that it's going to do featurization, which means it's automatically going to select out features for us, which is what we wanted. It's set up to do regression, and we have some configuration here. Training time is 3 hours, which doesn't mean it's going to train for three hours; I guess that's the timeout for it. You could set a metric score threshold, so it has to meet at least that value to be successful, and if it's not going to reach it, it would probably quit out early. The number of cross-validations just makes sure the data is good. You can see blocked algorithms, like TensorFlow DNN and TensorFlow linear regression; if it was using a DNN, a deep learning neural network, I probably would have chosen the GPU to see if it would go faster. Look at the primary metric: it's normalized root mean squared error. Sometimes on the exam they'll actually ask you what the primary metric for this thing is, so it's good to take a look and see what they actually use. I'll be sure to highlight that stuff in the actual lecture content.
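Since the primary metric comes up on the exam, it's worth seeing what it actually computes. Normalized RMSE is just the root mean squared error divided by the range of the true values, so scores are comparable across targets of different scales. A minimal pure-Python sketch:

```python
import math

def normalized_rmse(y_true, y_pred):
    """Root mean squared error divided by the range of the true values."""
    rmse = math.sqrt(
        sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    )
    return rmse / (max(y_true) - min(y_true))

print(normalized_rmse([0, 10], [0, 10]))  # perfect predictions -> 0.0
print(normalized_rmse([0, 10], [0, 5]))   # rmse = sqrt(25/2), / 10 ≈ 0.3536
```

Lower is better, and 0.0 means the predictions matched exactly.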
This will take some time to run. We have data guardrails, which I guess won't populate until we've run it, so we'll just let it run and I'll see you back here when it's done. All right, so after a very, very, very long wait, our AutoML job is done. It took 60 minutes, so using a larger instance didn't save me any time. I don't know if a GPU instance would be a lot faster; I'd be very curious to try that out, but that's not something for this certification course. So we go in here, and yeah, the cheaper instance took the same amount of time, so it probably just needs GPUs; it really depends on the type of models it's running. We have a bunch of different algorithms in here; it ran about 42 different models. I thought last time I ran it I saw a lot more, but you can see there are all kinds of models it's running, and then it's going to choose the top candidate. It chose voting ensemble. Ensembles are something we don't really cover in the course because it gets too deep into ML, but an ensemble is when you combine two or more weaker models and merge their results in order to make a more powerful ML model.
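The voting idea can be sketched in a few lines. This is a toy illustration of the general technique, not what AutoML actually ran: average the predictions of two deliberately weak regressors whose errors cancel out:

```python
# Two weak "models": one consistently over-predicts, one under-predicts.
def model_a(x):
    return 2.0 * x + 5.0

def model_b(x):
    return 2.0 * x - 5.0

def voting_ensemble(x, models):
    """Average the member predictions; for classifiers you would take
    a (weighted) majority vote instead."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# The opposite biases cancel, so the ensemble recovers y = 2x.
print(voting_ensemble(10.0, [model_a, model_b]))  # 20.0
```

Real voting ensembles (like scikit-learn's VotingRegressor) do the same averaging over fitted models rather than hand-written functions.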
Here we'll get some explanation. I tried this before and I didn't get really good information. If we go here, I don't have anything under model performance: that tab requires an array of predicted values from the model to be supplied, and we didn't supply any, so we don't get any. Under data explorer you can select a cohort of the data; all we have here is all the data. Here we were seeing age, and I guess it's just giving us an indicator about the age information. You can use the slider to show descending feature importance, and select up to three cohorts to see the feature importance side by side. So I guess S5 and BMI are the top features. I don't know what S5 is, we'd have to look up the dataset, but BMI is your body mass index, so that's a clear indicator of what affects whether you have diabetes or not, so that makes sense. Age doesn't seem to be a huge factor, which is kind of interesting. Under individual feature importance we can narrow in and ask, okay, why is this outlier over here? And they're age 79, right? So that's kind of interesting to see. It does give you some explanation as to why things are the way they are.
Over here we have a little bit of different data, which is kind of interesting: model performance. I don't exactly know what I'm looking at, but it's over mean squared error, so it's that mean squared calculation again. So yeah, it's something. But anyway, the point is that we finally get metrics; I guess we always had to click there, because that makes more sense. There are more values here, sure.
Data transformation illustrates the data processing, feature engineering, scaling techniques, and machine learning algorithm that AutoML used. If you were a real data scientist, all this stuff would make sense to you; I think with time it'll make sense, but even at this point I'm not sure, and I don't really care about this particular model. If you're building something for real, I'm sure the information becomes a lot more valuable. So this model is done, and the idea is that we can deploy it. Oops, if we go back to the actual models, oh, because we actually went into them. So we go back to the AutoML run here. I think you can deploy any model that you like, so you'd go here and deploy it; if you prefer a different model, you could deploy that one instead. If we go into data guardrails, which we kind of skipped over, this is how it does automatic featurization, extracting the features: how it handles the splitting, and how it handles missing features. High cardinality is when a column has a huge number of distinct values, so it might have to do dimensionality reduction. That's just saying, hey, if this is a problem, maybe we would do some pre-processing or other steps to make it easier to work with the data.
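One common mitigation for high cardinality, sketched below, is hashing categorical values into a fixed number of buckets instead of one-hot encoding every distinct value. This is my own illustration of the general technique, not what Azure's guardrails do internally:

```python
import hashlib

def hash_bucket(value, n_buckets=16):
    """Map an arbitrary categorical value into one of n_buckets columns.
    md5 is used instead of the built-in hash() so results are stable
    across Python processes."""
    digest = hashlib.md5(str(value).encode("utf-8")).hexdigest()
    return int(digest, 16) % n_buckets

# Millions of distinct IDs collapse into a fixed-width feature space.
print(hash_bucket("user-000001"))
print(hash_bucket("user-938211"))
```

The trade-off is that unrelated values can collide into the same bucket, but the feature space stays a manageable, fixed size.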
If we're happy with this, we can go ahead and deploy it. So let's deploy and call it infer-my-diabetes. Here we have AKS and Azure Container Instance; let's do Azure Kubernetes Service, because we did the other one before. We'll say diabetes-prod, maybe aks-diabetes, oh, compute name, sorry, one of the inference ones. Okay, so in order to deploy this we would have to create our inference cluster. I'm not sure if I have enough in my quota here, but let's give it a go. I think what it's wanting is one of these here, and I think we'd want it wherever we are, right? I'm not sure where we are, whether this is US East or West, so let's go check the studio: Azure Machine Learning, East US. I never did this before; I usually just use Azure Container Instance, but I'm curious. Say next, my-diabetes-prod, and we need to choose some nodes. The number of nodes multiplied by the virtual machine's number of cores must be greater than or equal to 12.
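That validation message is simple arithmetic, and it's worth internalizing: the node count times the cores per VM has to reach 12. A tiny sketch of the check:

```python
def aks_cores_ok(nodes, cores_per_vm, minimum=12):
    """The studio's AKS inference validation: total cores across the
    cluster (nodes * cores per VM) must be at least the minimum."""
    return nodes * cores_per_vm >= minimum

print(aks_cores_ok(3, 2))  # 6 total cores -> False, rejected
print(aks_cores_ok(3, 4))  # 12 total cores -> True
```

So three 2-core VMs fail the check, while three 4-core VMs just squeak by.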
Okay. Again, if you're not confident, or if you're concerned about cost, you can just watch; you don't have to do this yourself. This is a fundamentals certification, so it's not super important to get all the hands-on experience yourself, but I'm just trying to explore this so we can see it, because I don't care about the cost; it's not a big deal to me on my machine here. So probably I don't have... "pool must use a VM SKU with more than two cores and four gigabytes." Well, what did I choose? Did I not pick the right one? Oh, I chose three. Yeah, that's fair. What did it want, 12 cores, it said before? And now it's complaining that the name already exists. Oh, it's given us all this trouble. This one we'll go ahead and delete; you'd think it wouldn't matter, that I wouldn't have to delete it first, but that's fine. This one failed, and now what's the problem? Quota exceeded. So I can't do it, because I'd have to go make a support request to get the quota raised, and it's not a real big deal. I guess what we could do instead of AKS is just deploy to a container instance, if it'll let us. Notice I don't have to fill anything additional in; it'll just deploy, I think.
Great, so I guess we'll let that deploy and I'll see you back here in a bit. All right, so I'm back here checking up on my AutoML. We go over to compute, then to inference clusters, and we don't have anything under there. If we go over to our experiments, under our diabetes experiment, we did choose to deploy, so it should have created an ACI instance. Let's make our way over to the portal. The reason it might not be showing up is that I'm just running out of compute, because again it's a quota thing. It's not a big deal for us to get a deploy; it's not like we're going to do anything with it. But we can see that we have a container over here and it's running, so we must be able to see it. If we go to endpoints, ah, here it is, right? I was under models, that was my problem. Pipeline endpoints would be where I thought we would have seen it if we had deployed from the designer, but here we have our diabetes-prod endpoint. So if we wanted to test data, we could pass stuff in here. If we want to see this in action, I'm not sure it's going to work, but we'll give it a go. If we go into our sample diabetes dataset and explore some of the data, we should be able to select out some values, because I don't know what these values mean. So let's just say age 36, and we already know that BMI is the major factor here. Sex is either one or two, so we'll say two; BMI we'll say 25.3; the BP will be 83; and then some values like 5, 5.1, 82. I wonder why it doesn't give us all of them. Oh, I guess it does, it goes up to S6. Okay, so let's go ahead and test that and see what we get. We got a result back: 168. So that is AutoML all complete there for you.
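If you'd rather hit the endpoint from code instead of the portal's test form, the call looks roughly like this. To be clear, the URL, key, and the exact input schema are placeholders and assumptions on my part; copy the real ones from the endpoint's consume tab, and check the schema your deployed model expects:

```python
import json
import urllib.request

# Hypothetical values; replace with the real scoring URI and key.
SCORING_URI = "https://<your-endpoint>.azurecontainer.io/score"
API_KEY = "<your-key>"

# One row loosely matching the values typed in the demo; the column
# names here are an assumption about the deployed model's schema.
payload = json.dumps({
    "data": [
        {"AGE": 36, "SEX": 2, "BMI": 25.3, "BP": 83,
         "S1": 5, "S2": 5.1, "S3": 82, "S4": 4, "S5": 4.9, "S6": 92}
    ]
})

request = urllib.request.Request(
    SCORING_URI,
    data=payload.encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {API_KEY}"},  # only if key auth is enabled
)
# Uncomment once the placeholders are filled in:
# with urllib.request.urlopen(request) as response:
#     print(json.loads(response.read()))  # a predicted value, like the 168 seen above
```

This only builds the request; the actual network call stays commented out until you have a live endpoint.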
All right, so let's take a look here at the visual designer, because it's a great way to get started very easily. If you don't know what you're doing and you want something a little more advanced than AutoML, with some customization, it's great to start with one of these samples. Let's go ahead and expand it and see what we have: binary classification with a custom Python script, tuned parameters for binary classification, multiclass classification like letter recognition, text classification, all sorts of things. Usually binary classification is pretty easy, so I'm looking for one that is pretty darn simple. Let's take a look. This sample shows how to use filter-based feature selection to select features for binary classification; this one shows how to predict customer relationship outcomes using binary classification; this one shows how to handle imbalanced datasets using SMOTE modules, and I'm not really worried about balancing; this one uses a customized Python script to perform cost-sensitive binary classification; and this one tunes model parameters to find the best models during the training process. Let's go with this one; it seems okay to me. What you can see here is that it's using a sample dataset, I believe, and if you wanted to see all of them, you could literally drag them out here and do things with them. I haven't actually built one end to end for this; again, I don't think it's super important for this level of exam, but this just shows you that there's a pre-built one. If you've started to get the hang of ML and you know the full pipeline, this isn't too confusing.
confusing so at the beginning here we have our classification data and then
have our classification data and then what it's going to do is say select
what it's going to do is say select columns in the data set so it says
columns in the data set so it says exclude column names work class
exclude column names work class occupation native country so it's doing
occupation native country so it's doing some pre-processing excluding that data
some pre-processing excluding that data might be interesting to go look at that
might be interesting to go look at that data set so if we go over to our data
data set so if we go over to our data sets tab it should show up here I
sets tab it should show up here I believe maybe because we haven't um uh
believe maybe because we haven't um uh uh committed or submitted this we we
uh committed or submitted this we we can't see that data set yet but we'll
can't see that data set yet but we'll look at it for a moment then we want to
look at it for a moment then we want to clean our data so here it's saying clean
clean our data so here it's saying clean all the columns so uh custom
all the columns so uh custom substitution
substitution value see if we can see what it's
value see if we can see what it's substituting
out uh it's not saying what so clean missing
missing data so I'm not sure what it's cleaning
data so I'm not sure what it's cleaning out there
but because I would suggest that it's using some kind of custom script um I'm
using some kind of custom script um I'm not sure where it is but that's okay we
we have split data, which is pretty common: you split your data so that you have a training and a test data set. it's usually really good to randomize it, so you want to randomize it and then split it, and that's just so you get better results.
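as a rough sketch of that randomize-then-split idea (my own illustration in plain python, not what the Split Data module actually runs):

```python
import random

def shuffle_split(rows, train_fraction=0.7, seed=42):
    """Randomize row order, then slice into train and test sets."""
    shuffled = rows[:]                      # copy so the caller's list is untouched
    random.Random(seed).shuffle(shuffled)   # the "randomize it" part
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]   # the "then split it" part

rows = list(range(10))            # ten toy rows standing in for a real data set
train, test = shuffle_split(rows) # 7 rows for training, 3 held out for testing
```

scikit-learn's train_test_split does the same thing in one call; the point is just that shuffling happens before the slice so neither set is biased by row order.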
then it has model hyperparameter tuning, so the idea is that it's going to use ml to figure out the best parameters. over here we have the two-class decision tree, where it's going to do some work, and then it's going to score our model and evaluate our model to see if it's successful.
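conceptually, that tuning step just tries candidate hyperparameter values and keeps whichever one scores best on held-out data. here's a tiny hand-rolled illustration of that loop (my own sketch with a one-feature threshold classifier, not what the Tune Model Hyperparameters module actually runs):

```python
# toy held-out data: (feature value, label); class 1 tends to have larger x
held_out = [(0.25, 0), (0.35, 0), (0.45, 0), (0.65, 1), (0.75, 1), (0.85, 1)]

def accuracy(threshold, data):
    """Score the rule 'predict class 1 when x > threshold'."""
    return sum((x > threshold) == bool(label) for x, label in data) / len(data)

# the tuning loop: evaluate every candidate value, keep the best scorer
# (a real tuner would retrain the model for each candidate first)
candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
best = max(candidates, key=lambda t: accuracy(t, held_out))
```

real tuners search the same way, just over many parameters at once and with retraining per candidate, which is why the step is slow.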
so this is all set up to go; all we've got to do is go to the top here, where there's a settings wheel, and choose some type of compute. I'm going to go here, and we have this one here, but I'm going to go create; that one's for my diabetes follow-along, so I'm going to go ahead and make a new one. it says we recommend using a predefined configuration to quickly set up compute for training, and this one looks okay. I don't know if it needs two nodes, but I guess we can do this one, so we'll just name it binary pipeline. okay, hit save, and hopefully it's making a good suggestion. we'll have to wait for that to spin up; it's going to take a little bit of time, so I'll see you back here in a moment. all right, so I got a
message saying that it is ready. so, what we can do... I think it was here, my notebook instance? no, that's not it, but I definitely saw a popup on my screen; you might have seen it too, though you'd have to be paying close attention. if you go over, it says that it's ready to go. so what I'm going to do is make my way back over here, and we're going to select our compute: there is our binary pipeline, so I'm going to select that. there are some other options, but we're not going to fiddle around with those; we're going to go ahead and hit submit. we need a new experiment, so I'm going to just name it binary pipeline, and we'll hit submit. okay, and so this is now running.
so after a little while here, we're going to start seeing these steps go green. this one's not started; we'll give it a moment just so we can see some kind of animation... and there it goes, it's off to the races. there's not much to do here, and this is going to take a while; I've never run this one in particular, so I don't know if it's an hour or 30 minutes, so I'll see you back when it's done running. it's not that fun to watch, but it's cool that you get a visual illustration, so I'll see you back in a bit. I just
wanted to peek in here and take a look at how it's progressing, and you can see it's still going; it's just cleaning the data, still not done. I'm not sure how long this has been running for, but if we go over to our experiments, go into our binary pipeline, and look at the run time, we're about 8 minutes in and it hasn't done a whole lot; it's still cleaning the data. I would have thought it would be a little bit faster; I'm kind of used to AWS SageMaker, and it doesn't usually take this long, but I mean, it's nice that it's going. so we're almost out of the pre-processing phase, and we'll be on to the model tuning.
okay, all right: after waiting a little while, it looks like our pipeline is done. if we make our way over to experiments and go to binary pipeline, we can see that it took 14 minutes and 22 seconds. we can go here and see some additional information, but there's nothing really else to see; all the steps already ran, and you can see them all here. there's nothing under metrics: you can enable metrics to log data points and compare them within and across runs, but we only did a single run, so there's nothing to compare. so let's say we're happy with this and we want to deploy this model; well, what I'm going to do is go back to
the designer and click back here, and now in the top right corner we can create our inference pipeline. I can't remember if submit is going to run it again, and I don't want to run it again; I just want to go ahead and create ourselves a real-time or batch pipeline. we'll say real-time pipeline here, and what this will do is actually create a completely different pipeline. so here's a completely new one, but it's specifically designed for deployment: the first one was for training the model, while this one is actually for taking in data and doing inference. so what we can do is go ahead and just submit this; we'll put it under our binary pipeline experiment and hit submit. I believed we'd need a different kind of compute here, so I'm surprised that it's even running... no, I guess it has a compute there. so it's going to run, and once it finishes running, I believe we can go ahead and deploy it, so let's just wait for that to finish. all right,
after a little while there, we've run our inference pipeline, so it's definitely something that is ready for use. the idea is that when we actually use it, requests will go through this web service input to this web service output, but that's not so important at this level of certification. let's see what it looks like to go ahead and deploy it. we have the option between a new real-time endpoint and an existing endpoint; we don't have an endpoint yet, so we'll just say binary pipeline... okay, it wants to lowercase it to binary-pipeline. and we have the option between Azure Kubernetes Service and Azure Container Instances; I think it's a lot easier to deploy to a container instance, and we'd be waiting forever for kubernetes to start up, so we're going to do container instance. we have some options like SSL and things like that, but I'm not too worried about those, so we're just going to go ahead and hit deploy. okay, and so that is going to go ahead
and deploy that, so we'll wait for this real-time inference. if we go over to our compute, it should spin up... though this section is for AKS, so I don't know if it will show up here; I've only seen things under here before, and I think this one will be for Azure Kubernetes Service, so I don't think we're going to see it show up under there. however, we do not need to be running this compute anymore, so we'll go ahead and delete the binary pipeline cluster, because we don't have any use for it right now and we might need to free it up for something else. so go ahead and delete it; we don't need it. and coming back to our pipeline, our designer here, I'm just trying to see where we can keep track of it... well, I know it's deploying: waiting for real-time endpoint. so I'll see you back here when this is done; okay,
it just takes a little bit of time. all right, so I think our pipeline is done; if we make our way over to endpoints, there it is, the binary pipeline. if we wanted to, we could go ahead and test it there, and it actually already has some pre-loaded data for us; if we hit test, it's nice that it fills that in, and we get some results back. we see things like scored label, income, and scored probability, and that is useful: it's giving back all of the results. it doesn't pull out just the scored labels and scored probabilities, though, which is the value we want
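for context, a real-time endpoint like this returns JSON, so pulling those values out in code is only a couple of lines. the payload shape below is a hypothetical example for illustration, not the exact schema the designer endpoint returns:

```python
import json

# hypothetical response body from a real-time scoring endpoint
body = """
{
  "Results": {
    "WebServiceOutput0": [
      {"income": "<=50K", "Scored Labels": "<=50K", "Scored Probabilities": 0.12}
    ]
  }
}
"""

rows = json.loads(body)["Results"]["WebServiceOutput0"]
label = rows[0]["Scored Labels"]               # the predicted class
probability = rows[0]["Scored Probabilities"]  # the model's confidence score
```

in practice you'd check the endpoint's swagger/schema page for the real field names before writing a client like this.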
coming back out of here: so, there are endpoints, and that is the end of our exploration with the designer.
okay, all right, so let's take a look at what it would be to actually train a job programmatically through the notebook. remember we saw these samples over here; we saw this image classification MNIST one, and MNIST is a very popular data set for doing computer vision. these samples are really great: if you want to really learn, you should go through them and read through them, because they're probably very useful. I've done a lot of this before, so for me it's not too hard to figure out, but I've actually never run this one, so let's run it together. again, we want to be in JupyterLab, so you can go here and click it there, or go through the compute tab if it's being a bit finicky, and we'll get a tab open here and see how this goes. so what I want to do is just make sure we're back here; I'm going to click into this
one, and we have a few notebooks: there's part one, and then we have the deploy stage. let's look at training; I don't know if we really need to deploy, but we'll give it a read here. so, in this tutorial you train an ml model on a compute resource, working through the training and deployment workflow via the Azure Machine Learning service in a notebook, and there are two parts to it. it's using the MNIST data set and scikit-learn with the Azure Machine Learning SDK. MNIST is a popular data set with 70,000 grayscale images; each image is a handwritten digit of 28 by 28 pixels, representing a number from 0 to 9. the goal is to create a multiclass classifier to identify the digit a given image represents.
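to make "multiclass classifier" concrete before we run anything, here's a tiny scikit-learn sketch in the same spirit as the tutorial's training script, but on small synthetic data instead of MNIST (my own illustration using the default solver; the tutorial's script opts for liblinear instead):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# synthetic stand-in for flattened image rows: 300 samples, 4 features
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
# three classes (0, 1, 2) decided by thresholding a linear score
y = np.digitize(X[:, 0] + X[:, 1], bins=[-1.0, 1.0])

clf = LogisticRegression(max_iter=500)  # multinomial logistic regression
clf.fit(X, y)                 # fit() is the actual training step
accuracy = clf.score(X, y)    # fraction of rows predicted correctly
```

with MNIST the only real differences are scale: 784 features per row (28 by 28 flattened) and ten classes instead of three.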
so we're going to learn a few things here, but let's just jump into it. the first thing is that we need to import our packages. here it does the %matplotlib inline bit, which just makes sure that when we plot things we actually see them; then we're going to need numpy, matplotlib itself, and azureml.core, and we're going to import Workspace since we'll need one. then I guess it just checks the version, making sure we have the right one; okay, so this is 1.28.0. it's pretty common, even in AWS, to have a script in here to update the SDK in case it is out of date; I'm surprised it didn't include one, but that's okay, we'll scroll on down. by the way, we're using the python 3.6 Azure ml kernel; if this is the future, they might have retired the old one and you're using 3.8, but it should generally work if it's in their sample set. I assume
they try to maintain that. okay, so: connect to a workspace. this creates a workspace object from an existing workspace by reading the file config.json. so what we'll do is go run that; I assume it's kind of like a session, and here it says it found our workspace. so really it's not creating a workspace, it's just returning the existing one so that we have it as a variable here. next up: create an experiment.
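a quick aside on that connect-to-workspace step: the config.json it reads is just a small JSON file that points the SDK at your workspace, generally something like this (the values here are placeholders):

```json
{
  "subscription_id": "<your-subscription-id>",
  "resource_group": "<your-resource-group>",
  "workspace_name": "<your-workspace-name>"
}
```

on an Azure ML compute instance this file is typically pre-generated for you, which is presumably why the from_config call just works here.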
that's pretty clear; we saw experiments in the automl and designer follow-alongs, so we'll just hit run there. okay, so we named it and created the experiment... I wonder if it actually got created yet; let's go over to experiments and see if it's there. it is there, cool, that was fast; I thought it would print something out, but it didn't do anything there. next, create or attach an existing compute resource by using Azure Machine Learning Compute, a managed service that lets data scientists... etc., etc., yada yada yada. so, create a compute; creation of a compute takes about five
minutes. so let's see what it's trying to create. we have some environment variables that it wants to load in; I'm not sure how these are getting in here, where environment variables are set in Jupyter, or even how they get fed in, but apparently they're somewhere. it doesn't matter, though, because these all have defaults: here it says a CPU cluster, min zero and max four nodes, and it's going to use a Standard_D2_v2, which is the cheapest one we can run. I kind of want something a little bit more powerful just for myself, because I want this to be done a lot sooner, but again, if you don't have a lot of money, just stick with what's there. okay, and this is a CPU cluster, so if we go here, I just want to see what our options are...
hm: you don't have enough quota for the following VM sizes. that's probably because I'm running more than one VM right now... yes, I've hit my quota, so I'd probably have to request more. I think this is the one I'm using; what's the difference with this standard Dv2? the vcpus... it's the same one, right? so, request a quota increase: I don't know if this is instant or not, and I'd have to make a support ticket; oh, that's going to take too long. the reason is that I'm running the automl and the designer in the background here, trying to create all the workshops, the follow-alongs, at the same time. so what I'll do is just come back when I'm not running one of those other ones and continue on; we're just here at the step where we want to create a new compute. okay, all right,
so I'm back, and I've freed up one of my compute instances; if I go over here now, I just have the one cluster instance for my automl run. what we'll do here is, again, just read through this: it will create a CPU cluster with 0 to 4 nodes, Standard_D2_v2, and I guess we'll just stick with what's here. reading through, it looks like it tries to find the compute target, otherwise it's going to provision it: it will create the cluster, poll for a minimum number of nodes for a specific time, and wait for completion. so we'll go ahead and hit play, and that's going to go and create us a new cluster; we're just going to have to wait a little while here for it to create, about 5 minutes, and I'll see you back here in a moment. all right, so the cluster
started up; if we go back over here, we can see that it's confirmed. I don't know why it was so quick, but it went pretty quick there. so we're on to the next section here, explore the data: download the MNIST data set and display some sample images. it's just talking about it being an open data set; the code retrieves it as a FileDataset object, which is a subclass of Dataset. a FileDataset references single or multiple files of any format in your data store, and the class provides you with the ability to download or mount files to your compute by creating a reference to the data source location. additionally, you register the data set to your workspace for easy retrieval during training. there are a few more how-tos, but we'll give it a good read here. so we have the open data set MNIST, and it's kind of nice that they have that reference there; we have a data folder, we make the directory, we get the data set, we download it, and then we register it. so let's go ahead and run that; I'm not
sure how fast that is; it shouldn't take too long. as it's running, we'll go over here to the left-hand side, refresh, and see if it appears... not as of yet... there it is. going into here, maybe we can explore the data; I'm not sure what it would look like, because these are all images, right? yeah, they're in ubyte.gz, so they're compressed files, and we're not going to be able to see inside them, but they're definitely there; we know they're there. so that is now registered into our
data sets. next up, display some sample images: load the compressed files into numpy arrays, then use matplotlib to plot 30 random images from the data set from above. note that this step requires the load_data function, which is included in utils.py; this file is included in the sample folder. we have it over here; we just double click, and it's a very simple file with the load_data function, and we'll go ahead and run that.
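a load_data helper like that is essentially a reader for the gzipped IDX files that MNIST ships in; here's a sketch of what that kind of loader looks like (my own approximation of the idea, not Azure's actual utils.py):

```python
import gzip
import os
import struct
import tempfile

import numpy as np

def load_idx_images(path):
    """Read a gzipped IDX image file (the MNIST *-ubyte.gz format)
    into a (n_samples, rows*cols) float array scaled to 0..1."""
    with gzip.open(path, "rb") as f:
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an IDX image file"
        pixels = np.frombuffer(f.read(), dtype=np.uint8)
    return pixels.reshape(n, rows * cols) / 255.0

# tiny self-contained demo: write a fake 2-image, 3x3 IDX file and read it back
raw = struct.pack(">IIII", 2051, 2, 3, 3) + bytes(range(18))
demo_path = os.path.join(tempfile.mkdtemp(), "demo-images-ubyte.gz")
with gzip.open(demo_path, "wb") as f:
    f.write(raw)
X = load_idx_images(demo_path)   # shape (2, 9)
```

the label files use a different magic number (2049) and have no row/column fields, so their reader is even shorter.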
and it's pretty simple here. so, load_data, X_train, X_test... are we setting up our training and testing data here? it kind of looks like it, because it says train and test data, and that's when we usually see that kind of split. and again, it's doing a random split, which sounds pretty good to me. let's show some randomly chosen images... yeah, so I guess they do set up the training data here, and then down below we're actually showing the images; so here are some random images. train on a remote cluster: for this task, you submit the job to run on the remote training cluster you set up earlier. to submit your
job, you create the directory, create a training script, create a script run configuration, and submit the job. so first we will create our directory... and notice it created this directory over here, because I guess it's going to put the training file in there. so this cell will actually write out the training file into that directory, which makes quite a bit of sense.
this makes uh quite a bit of sense so if we click into here it should now have a
we click into here it should now have a training file it'll just give it a quick
training file it'll just give it a quick read see what's going on here so a lot
read see what's going on here so a lot of times when you create these training
of times when you create these training files you have to do and this is the
files you have to do and this is the same if you're using AWS like when
same if you're using AWS like when you're creating tra like or sagemaker um
you're creating tra like or sagemaker um you create a train file because it's
you create a train file because it's part of Frameworks it's just how the
part of Frameworks it's just how the Frameworks work but you'll have uh these
Frameworks work but you'll have uh these arguments uh so it could be like
arguments uh so it could be like parameters to run for training um uh and
parameters to run for training um uh and there could be a whole sorts of ones
there could be a whole sorts of ones here here they're loading in the
here here they're loading in the training and testing data so it's the
training and testing data so it's the same stuff we saw earlier when we were
same stuff we saw earlier when we were just viewing the
just viewing the data um here it's doing a logistic
data um here it's doing a logistic regression it's using Li uh so linear
regression it's using Li uh so linear maybe linear learning model there it's
maybe linear learning model there it's doing
doing multiclass on that there and so what
multiclass on that there and so what it's going to do is fit so fit is
it's going to do is fit so fit is actually performing the training and
actually performing the training and then what it's going to do is make a
then what it's going to do is make a prediction on the test Set uh then it's
prediction on the test Set uh then it's going we're going to get accuracy so
going we're going to get accuracy so we're getting kind of a score so notice
we're getting kind of a score so notice that it's using accuracy uh as a
that it's using accuracy uh as a evaluation metric I suppose right and
evaluation metric I suppose right and then at the end we're going to dump the
then at the end we're going to dump the data a lot of times like you have to
data a lot of times like you have to save the model somewhere so they're
save the model somewhere so they're outputting the actual weights of the
outputting the actual weights of the neural network and all other stuff it's
neural network and all other stuff it's a plk file I don't know what that is but
a plk file I don't know what that is but if you're using like tensor flow you
if you're using like tensor flow you would use tensor flow serving at the end
would use tensor flow serving at the end of this a lot of times uh Frameworks
of this a lot of times uh Frameworks will like Pi P torch or tensor flow or
will like Pi P torch or tensor flow or mxnet they'll have a serving layer um
mxnet they'll have a serving layer um but uh since we're just using S kit
but uh since we're just using S kit learn which is very simple it's just
learn which is very simple it's just going to dump out uh that file into our
going to dump out uh that file into our outputs this is going to probably run a
outputs this is going to probably run a container so this outputs isn't going to
container so this outputs isn't going to necessarily be on um the outputs into
necessarily be on um the outputs into here it's more like the outputs of the
here it's more like the outputs of the container and um
container and um a lot of times the container will then
a lot of times the container will then place this somewhere so like it'll be
place this somewhere so like it'll be saved on The Container but it'll be
saved on The Container but it'll be passed out to the register or or
passed out to the register or or something like that like model registry
something like that like model registry so anyway we ran this and so that
so anyway we ran this and so that generated the file we don't want to keep
generated the file we don't want to keep on running this multiple times I
on running this multiple times I probably would just overwrite the file
probably would just overwrite the file so it's not a big deal here it says
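Stripped to its essentials, a train.py like the one being read here usually looks something like the following. This is a hedged sketch: the Azure-specific run logging is left out, the argument names mirror the tutorial's style but are assumptions, and a tiny inline dataset stands in for the mounted MNIST folder:

```python
import argparse
import os

import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

# Parameters the job can pass in (names assumed for illustration)
parser = argparse.ArgumentParser()
parser.add_argument("--regularization", type=float, default=0.5,
                    help="regularization rate")
args, _ = parser.parse_known_args()

# Stand-in for load_data(): a tiny linearly separable dataset
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0, 0, 1, 1])
X_test, y_test = X_train, y_train  # toy shortcut; real code uses a held-out set

# liblinear is a linear solver; C is the inverse of the regularization rate
clf = LogisticRegression(C=1.0 / args.regularization, solver="liblinear")
clf.fit(X_train, y_train)                  # fit = the actual training

y_hat = clf.predict(X_test)                # predict on the test set
acc = float(np.average(y_hat == y_test))   # accuracy as the evaluation metric
print("accuracy:", acc)

# Dump the trained model to ./outputs as a pickle (.pkl) file
os.makedirs("outputs", exist_ok=True)
joblib.dump(clf, os.path.join("outputs", "sklearn_model.pkl"))
```

The `./outputs` folder is the important convention: whatever the script writes there inside the container is what gets collected after the run.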
Here it says: notice how the script gets saved — into the data folder, I guess. We didn't look at that, so we'll go to the top here. I didn't see where this data folder was; I wasn't really paying attention to that. It looks more like it's where the data gets loaded in. So here it saves the data output — anything written to this directory is automatically uploaded to your workspace — so I guess that's just how it works, and it will probably end up in here. Next, utils.py: the training script references it to load the dataset correctly, so we copy the file over. We will run this to copy the file over — so I'm guessing, did it put it into here? Yeah, it just put it in there, because when it actually packages things up for the container, it's going to bring that file over because it's a dependency.
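That "copy the file over" step is a plain file copy into the script folder so it ships with the job; a minimal sketch (folder and file names assumed from earlier):

```python
import os
import shutil

script_folder = "sklearn-mnist"      # assumed folder name from earlier
os.makedirs(script_folder, exist_ok=True)

# (for this sketch, make sure a utils.py exists to copy)
if not os.path.exists("utils.py"):
    open("utils.py", "w").close()

# utils.py must sit next to train.py so the remote job can import it
shutil.copy("utils.py", script_folder)
```

Anything the entry script imports locally has to live in that source directory, or the container won't find it.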
Configure the training job. Create a ScriptRunConfig: the directory that contains the script, the compute target, the training script, etc. Sometimes in other frameworks they'll just call these estimators, but here it's just called a script run config. So I'm just trying to see what it's doing — scikit-learn is the dependency, okay, sure, we'll just hit run. And then down below here we have the ScriptRunConfig, and it looks like we're passing our arguments: we're saying this is our data folder — which is apparently here, we're mounting it — and then we're setting regularization to 0.5. Sometimes you'll pass in dependencies here as well — I guess these are technically our parameters that get configured up at the top — but sometimes you'll have dependencies if you're including other files, and I guess that's up here: see where it says environment, and we're saying include the azureml-defaults and scikit-learn and stuff like that, and then it gets passed in as the env. So that makes sense to me. We haven't run that yet, because we don't see any number here. Submit the job to the cluster — let's go ahead and do that. So it says it returns a Preparing or Running state as soon as the job is started — it's in a starting state.
Monitor the remote run. In total, the first run takes about 10 minutes, but for subsequent runs, as long as the dependencies in the Azure ML environment don't change, the same image is reused and hence the start time is much faster. Here's what's happening while you wait. Image creation: a Docker image is created matching the Python environment specified by the Azure ML environment. The image is built and stored in the ACR — the Azure Container Registry — associated with your workspace. Let's go take a look and see if that's the case, because sometimes resources aren't visible to you, so I'm just curious — do we actually see it? Okay, and yep, there it is — so they did not lie. So: image creation and upload takes about 5 minutes, and this stage happens once for each Python environment, since the container is cached for subsequent runs. During image creation, logs are streamed to the run history, and you can monitor the image creation process using those logs — wherever those are. Scaling: if the remote cluster requires more nodes to execute the run than are currently available, additional nodes are added automatically; scaling typically takes about five minutes. And I've seen this before — if you're in your compute here, sometimes it'll just say "scaling" because there just aren't enough nodes.
enough so uh running into the stage the necessary Scripts and files are sent to
necessary Scripts and files are sent to the compute Target then the data stores
the compute Target then the data stores are amounted copied the entry script is
are amounted copied the entry script is run so entry script is actually the
run so entry script is actually the train.py file while the job is running
train.py file while the job is running SD out and the files is in the logs
SD out and the files is in the logs directory or stem to the Run history you
directory or stem to the Run history you can monitor the runs progress using
can monitor the runs progress using these
these logs the dot outputs directory of the
logs the dot outputs directory of the run is copied over to the Run history in
run is copied over to the Run history in your workspace so you can access these
your workspace so you can access these results you can check the progress of a
results you can check the progress of a running job in multiple ways this
running job in multiple ways this tutorial uses the Jupiter widget so
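The Jupyter widget mentioned here comes from the azureml-widgets package; watching the run (and then blocking until it finishes) is just a couple of calls — sketched under the assumption that `run` is the submitted run from earlier:

```python
from azureml.widgets import RunDetails

RunDetails(run).show()                      # live-updating progress widget in the notebook
run.wait_for_completion(show_output=True)   # block until done, streaming the logs
```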
So it looks like we can run this and watch the progress — maybe we'll run that. And it's actually showing us the progress; that's kind of cool, I really like that. It's just a little widget showing us all the things that it's doing. Let's go take a look and see what we can see under Experiments and our run pipeline, because it was talking about things like outputs. So over here in the outputs and logs, I'm just curious if this is the same thing.
I'm not sure if this tails — yeah, it does tail, it just moves, so we can actually monitor it from here; I guess that's what it was talking about. So here we can see that it's setting up Docker — it's actually building a Docker image — and then, I'm not sure, did it send it anywhere? I mean, it's on ACR already, I think. It looks like it's still downloading and extracting packages, so maybe it's actually running on the image now. So we'll just wait. If we pop back over here, we can see probably the same information — is it identical? Yeah, it is. We're 3 minutes in, and it's probably not that fun to watch it in real time and talk about it, so let's just wait until it's done; I'll see you back then.
just wait until it's done I'll see you back then okay all right so I'm uh about
back then okay all right so I'm uh about 17 minutes in here I'm not seeing any
17 minutes in here I'm not seeing any more uh movement here so it could be
more uh movement here so it could be that it is done it does say if you run
that it is done it does say if you run this next step here it will wait for
this next step here it will wait for completion um
completion um specify show output to true for verbose
specify show output to true for verbose log so here actually did output a moment
log so here actually did output a moment ago so maybe it actually was done um but
ago so maybe it actually was done um but I just ran it twice so I'm not sure if
I just ran it twice so I'm not sure if that's going to cause me uh issues there
that's going to cause me uh issues there so because I can't run the next step
so because I can't run the next step unless I stop this um can I individually
unless I stop this um can I individually cancel this one
here uh I think I can just hit interrupt the kernel there there we
hit interrupt the kernel there there we go okay so I think that it's done okay
go okay so I think that it's done okay because it's 18 minutes in and I don't
because it's 18 minutes in and I don't see any more logging in here it's just
see any more logging in here it's just not very clear and also uh the logs we
not very clear and also uh the logs we just have a lot of stuff going on here
just have a lot of stuff going on here like it's just so much so you know if
like it's just so much so you know if we're keeping keeping Pace we probably
we're keeping keeping Pace we probably would have saw all these created yeah so
would have saw all these created yeah so another we just had a few more outputs
another we just had a few more outputs there but uh I think that it's done
okay it's just there's nothing definitively saying like done
definitively saying like done do you know what I'm saying and then up
do you know what I'm saying and then up here it doesn't say oh oh I guess it
here it doesn't say oh oh I guess it does say that it's done all right so
does say that it's done all right so yeah I just never ran it with this tool
yeah I just never ran it with this tool so I just don't know so I guess it does
so I just don't know so I guess it does definitively say that I already ran this
definitively say that I already ran this so we don't need to run that again I
so we don't need to run that again I just feel like we'll get stuck there so
just feel like we'll get stuck there so let's take a look at the
let's take a look at the metrics so regularization rate is 0.5
metrics so regularization rate is 0.5 accuracy is nine so N9 is pretty good
The last step in the training script wrote the output — the sklearn model. I want to see if it's actually in our environment here... I don't think it is. So outputs is somewhere — it's in our workspace somewhere — but we just don't... oh, it's right here, okay. So it output the actual model right there, and you can see the associated files from the run. Okay, we'll run it: register the model in the workspace so you can work with other collaborators — sure. So if I click on that here and we go back over to our models, it is now registered over here.
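Registering the model from the run's outputs is essentially a one-liner; a sketch assuming `run` is the completed run and that the script dumped the model to outputs/sklearn_model.pkl (both names are assumptions):

```python
# Register the pickled model from the run's outputs into the workspace,
# so collaborators (and the deploy step) can pull it by name and version.
model = run.register_model(model_name="sklearn_mnist",
                           model_path="outputs/sklearn_model.pkl")
print(model.name, model.version)
```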
Okay, and so we're done part one. I don't want to do all these other parts — training is enough as it is — but let's just take a look at the deploy stage.
Okay, so for prerequisites: we're setting up a workspace, and we're loading our registered model — we register it, we have to import packages, we're going to create a scoring script, deploy the model to ACI, and test the model. If you want to do this, you can go through all the steps. It does talk about a confusion matrix, and that is something that can show up on the exam — but we do cover that in the lecture content, so you generally understand what that is.
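Since the confusion matrix can show up on the exam, here's a quick hedged refresher with scikit-learn — rows are the actual classes, columns are the predicted ones, so correct predictions sit on the diagonal:

```python
from sklearn.metrics import confusion_matrix

y_true = [0, 0, 1, 1, 2, 2]   # actual labels
y_pred = [0, 1, 1, 1, 2, 0]   # model's predictions

cm = confusion_matrix(y_true, y_pred)
print(cm)
# [[1 1 0]    row 0: one 0 classified correctly, one mistaken for class 1
#  [0 2 0]    row 1: both 1s correct
#  [1 0 1]]   row 2: one 2 mistaken for class 0, one correct
```

Reading it this way also makes the related metrics obvious: the diagonal sum over the total is the accuracy.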
But, you know, I'm just too tired — I don't want to run through all this, and there's not a whole lot of value beyond reading through it yourself. So I think we're all done here.
Okay, one service we forgot to check out was data labeling, so let's go over there and give that a go. I'm going to go ahead and create ourselves a new project — I'll say "my labeling project" — and we can say whether we want to classify images or text. We have multi-class, multi-label, bounding box, and segmentation; let's go with multi-class. I'll go back here for a second — multi-class, whoops. I don't know if we create a dataset here, but we could probably upload some local files — let's say "my Star Trek dataset". It doesn't let us choose the image file type here; it'd be nice if these were images. It's very finicky, this input here: a file dataset references a single file or multiple files in your datastores or public URLs — okay, so we'll go next. If we can upload files directly, that'd be nice — ooh, upload a folder, I like that. So what we'll do — we do have some images in the free course materials here, under the Cognitive Services assets. We'll go back here, and I think objects would be the easiest — oh, but we just want a folder, right? So yeah, we'll just take objects. Yep, we'll upload the 17 files, and we'll just let it stick to that path — that seems fine to me. We will go ahead and create it, and so now we have a dataset there. We'll go ahead and select that dataset and say next. "Your dataset is periodically checked for new data points; any new data points will be added as tasks" — it doesn't matter, we're only doing this for a test. Enter the list of labels: we have TNG, DS9, Voyager, and TOS — those are the Star Trek series. For the labeling instructions: label which Star Trek series the image is from. Say next. You can have ML-assisted labeling enabled, but I don't want it — I'm going to say no. We'll create the project, okay, and I'll just wait for that to create, and I'll see you back here in a moment.
Okay, all right, so I'm back here — I actually didn't have to wait long. I think it runs instantly; I just assumed I was waiting for a state that says completed, but that's not something we have to do. So we have 0 out of 17 progress. We're going to go in here and label some data. We can view the instructions — they're not showing up here, but that's fine. If we go to tasks we can start labeling. So, which series is this from? This is Voyager — we'll hit submit. This is Voyager, submit. This is TOS, submit. This is TNG. This is TNG. This is DS9. DS9. Voyager. Voyager. TNG. DS9. You get the idea — and you've got some options here, like changing the contrast if someone can't see the photo, or rotating it. This is Voyager. Voyager. TNG. DS9. Voyager. Voyager. And we're done. So we'll go back to our labeling job here, we'll see we have the breakdown there, and now our dataset is labeled. We can export our dataset as CSV, COCO, or an Azure ML dataset — I believe that last option means it'll go back into the datasets over here, which will make our lives a little bit easier. We go back to data labeling — okay, so if you granted people access to the Studio, they'd be able to just go in here and jump into that job. Okay, and if we go over to the dataset, I believe we should have a labeled version of it now — "my labeling project" — and I believe that is the labeled data here. Right, yeah, so it's labeled. So there you go — we're all done with Azure Machine Learning, and all that's left is to do some cleanup.
Okay, so we're all done with Azure Machine Learning. If we want to, we can go to our compute and just kill the services we have there. Now, if we go to the resource group and delete everything, it'll take all of these things down anyway, but I'm going to be a bit paranoid and do this manually — delete. Okay, and so we'll go back to portal.azure.com, and I'm going to go to my resource groups — everything should be contained within my studio resource group, but just be sure to check these other ones too. And we can see all the stuff that we spun up, so we'll go ahead and hit delete resource group. I don't know if it includes things like the container registry — I know it puts stuff there — but I guess it does, it says container registry, so that's pretty much everything, right? And that'll take down everything. If you're paranoid, what you can do is go to All Resources and double-check over here, because if there's anything running it will show up there. Okay, but that's pretty much it — so just delete, and we're all done.