Support Vector Machine (SVM) is a supervised machine learning algorithm used for classification and regression, aiming to find an optimal hyperplane with the maximum margin to separate data points into different classes.
Welcome back. In this video I will discuss the Support Vector Machine in machine learning, the different types of SVM algorithms, and how the SVM algorithm works.
Support Vector Machine, or SVM, is one of the most popular supervised machine learning algorithms, and it is used to solve both classification and regression problems. First, what does "supervised" mean here? It means we must give labeled data as input to the algorithm. And what is classification? Classification problems are those where the target takes a discrete number of possible values. For example, if we want to classify an email as spam or not spam, there are only two possibilities, so such problems are called classification problems.
But if the target label contains continuous values, such problems are called regression problems. For example, suppose we want to predict the increase in an employee's salary based on their performance. The salary increase is continuous in nature, so this is a regression problem.
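As a minimal sketch of this distinction (assuming scikit-learn is available; the data below is made up for illustration), the SVM family offers `SVC` for classification and `SVR` for regression:

```python
from sklearn.svm import SVC, SVR

# Classification: discrete target (1 = spam, 0 = not spam), toy feature values.
X_cls = [[0.1], [0.2], [0.8], [0.9]]
y_cls = [0, 0, 1, 1]
clf = SVC(kernel="linear").fit(X_cls, y_cls)
print(clf.predict([[0.15], [0.85]]))  # predicts discrete class labels

# Regression: continuous target (toy "salary increase" vs. performance score).
X_reg = [[1.0], [2.0], [3.0], [4.0]]
y_reg = [1.1, 2.0, 2.9, 4.2]
reg = SVR(kernel="linear").fit(X_reg, y_reg)
print(reg.predict([[2.5]]))  # predicts a continuous value
```

The classifier returns one of the discrete labels, while the regressor returns a real number in between the training targets.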
SVM algorithms are most commonly used to solve classification problems in machine learning. The goal of the SVM algorithm is to draw a decision boundary that segregates the given dataset into classes. Once that decision boundary is drawn, we can use it to classify any new example into one of the classes. In SVM terminology, this decision boundary is called the hyperplane.
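To make the idea of a decision boundary concrete, here is a small sketch with hand-picked (not learned) values of w and b: points on one side of the line w·x + b = 0 get one label, points on the other side get the other.

```python
import numpy as np

# Hand-picked separating line for two toy clusters (illustrative values only):
# the sign of w.x + b tells us which side of the boundary a point falls on.
w = np.array([1.0, 1.0])
b = -3.0

blue  = np.array([[0.5, 0.5], [1.0, 0.8]])   # expected: w.x + b < 0
green = np.array([[2.5, 2.5], [3.0, 2.2]])   # expected: w.x + b > 0

def classify(points):
    """Return -1 or +1 depending on which side of the boundary each point lies."""
    return np.sign(points @ w + b)

print(classify(blue))   # -> [-1. -1.]
print(classify(green))  # -> [ 1.  1.]
```

A trained SVM does exactly this at prediction time; training is about choosing the best w and b.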
Now the question in front of us is: how do we draw this hyperplane for a given dataset? Suppose we are given this dataset, with two classes, one drawn in blue and the other in green. This data is linearly separable: we can draw a straight line to separate it into two classes. But which line is the best decision boundary? In this diagram I have drawn two candidate lines, one red and one green; between these two, which should we select? What we do is consider all possible separating lines, and the one that separates the classes with the largest margin is the one we select. That line is the hyperplane.
The dimension of the hyperplane depends on the number of features in the dataset. For example, if we have only two features, say X and Y, we can draw a straight line to classify the data, so the hyperplane is a line. But if we have three or more features, a straight line is not enough: with three features the separator is a two-dimensional plane, and with more features it is a higher-dimensional hyperplane. The very important thing is that we must draw the hyperplane in such a way that we get the maximum margin; I will discuss this maximum margin in detail later.
While drawing the hyperplane, we pay attention to the nearest data points on both sides. For example, in this dataset, these points are the nearest ones to the hyperplane, and they are called the support vectors. For each candidate hyperplane, the data points closest to it on either side are its support vectors, and we will choose between candidate hyperplanes based on the maximum margin. But in the end, for any given hyperplane, the nearest data points to it are called its support vectors.
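A fitted SVM actually exposes these nearest points. In this sketch (assumes scikit-learn; the toy data is mine, and the large C value approximates a hard margin), the two points closest to the boundary come back as the support vectors:

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data: two points per class, where [1,1] and [3,3]
# are the nearest points across the class gap.
X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # very large C ~ hard margin
print(clf.support_vectors_)  # the nearest points from each class
```

The outer points [0,0] and [4,4] do not constrain the boundary, so they are not support vectors.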
Now, coming back to the next part of our discussion: what are the different types of SVM? There are mainly two types: linear SVM and non-linear SVM. A linear SVM is one where we can separate the data with the help of a straight line; if we are given data and a straight line can divide it into classes, that is a linear SVM. If we are unable to draw a straight line to divide the dataset into classes, we need a non-linear SVM. To understand linear SVM in detail, consider this dataset: one class is drawn with stars and the other class is drawn with circles.
This data can be separated with a straight line, hence it is a case for linear SVM. But the question in front of us is: when we can draw multiple straight lines to separate this data, which one should be considered the hyperplane? That can be understood with the help of this diagram. Suppose we consider line D as the hyperplane; then these nearest points become its support vectors. We draw lines parallel to the hyperplane that touch these support vectors, and we calculate the distance between them; call this distance M1. Similarly, we do the same for line B: for B, this point becomes one support vector and this point becomes another, we draw the parallel lines through them, and we calculate that margin; call it M2. Now if we compare M1 and M2, the margin M1 is smaller than M2, so between these two candidates, B is the better line to divide this dataset into two classes, and B would be considered the hyperplane. But that may not be the end of it: we need to do this for all possible lines, and the one that gives the maximum margin is the one we should consider.

The maximum margin can be expressed mathematically like this. Assume this is the hyperplane, with one support vector on each side. For the hyperplane we write the equation w·x + b = 0. The parallel line through the support vector on one side satisfies w·x + b = -1, and on the other side it satisfies w·x + b = +1. Points beyond the first line give w·x + b < -1, and points beyond the second give w·x + b > +1. The distance from the hyperplane to each of these supporting lines is 1/||w||, so the total margin between them is 2/||w||. What we need, therefore, is the hyperplane for which 2/||w|| is maximized; the w that achieves this maximum defines the final hyperplane.
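The 2/||w|| formula can be checked numerically. In this sketch (assumes scikit-learn; toy data of my own, with a large C to approximate a hard margin), the support vectors [1,1] and [3,3] are 2√2 apart along the normal direction, and the fitted model recovers that margin:

```python
import numpy as np
from sklearn.svm import SVC

# Same toy data as before: support vectors are [1,1] and [3,3],
# which sit 2*sqrt(2) apart along the boundary's normal direction.
X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # very large C ~ hard margin
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)  # distance between the planes w.x + b = +/-1
print(margin)
```

The printed margin should be close to 2√2 ≈ 2.83, matching the geometric distance between the two support vectors.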
The next type of SVM is the non-linear SVM. A non-linear SVM is needed when we cannot draw a straight line to divide the data into two classes. If you look at this dataset, no matter where I draw a straight line, I will not be able to classify the data into two classes. So what I need to do is transform this data into a linearly separable form, and for that we can use a mapping function. In this case I have used the mapping z = x² + y². When we apply this function to all the data points, the data looks like this: all the green points land in one region and all the blue points in another. Applying this function gives us a new axis, Z, and in the resulting (x, z) space we can definitely apply the linear SVM. So again we draw all possible lines that separate the data into two classes, identify the straight line that gives the maximum margin, and take that as the hyperplane. This is how the Support Vector Machine works in machine learning.
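The mapping from the video can be sketched directly. Here the two classes are points on an inner circle and an outer ring (my own toy construction): no line separates them in (x, y), but after adding the feature z = x² + y², a simple threshold on z separates them perfectly.

```python
import numpy as np

# Two concentric classes: inner circle (radius 1) vs. outer ring (radius 3).
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
inner = np.c_[np.cos(angles), np.sin(angles)]          # class A
outer = np.c_[3 * np.cos(angles), 3 * np.sin(angles)]  # class B

# Mapping function from the video: z = x^2 + y^2 (the squared radius).
z_inner = inner[:, 0]**2 + inner[:, 1]**2   # all equal 1
z_outer = outer[:, 0]**2 + outer[:, 1]**2   # all equal 9

# In the new axis, a line at z = 5 separates the classes perfectly.
print(z_inner.max() < 5 < z_outer.min())  # -> True
```

In practice this idea is what SVM kernels (for example the RBF kernel) automate, so the mapping never has to be computed by hand.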
I hope you understood what a Support Vector Machine is, what the different types of Support Vector Machines are, and how the Support Vector Machine works in machine learning.
I hope the concept is clear.