0:00 I did a project on Bangalore property
0:02 price prediction, and in that project I
0:05 used AWS for model deployment. But this
0:08 time, in this project, we are going to use
0:10 Google Cloud, specifically Google Cloud
0:13 Functions, or GCF, and we'll see how the
0:17 whole thing works. So let's get started.
0:24 At the beginning of the project, in the
0:26 intro video, I said that I would be
0:28 converting the regular model
0:31 into TFLite, and then the TFLite model would be
0:34 deployed on Google Cloud. But I'm going
0:37 to change that plan a little bit,
0:39 because
0:40 I think it's better if we
0:43 deploy TFLite to a mobile application directly,
0:46 and that
0:47 video I will do in the future, if I find
0:49 the time. But for now,
0:51 we will just deploy our regular model
0:54 directly to Google Cloud.
0:57 Our React Native app will call
1:00 that function using HTTP, so that will be
1:03 the change in the architecture.
1:06 Let's look at the installation requirements.
1:08 Go to this GitHub page; of course, the video
1:11 description has a link to it.
1:14 It has installation instructions for
1:17 Google Cloud. So follow this section,
1:20 and here go to
1:22 the GCP account.
1:25 When you click on this,
1:27 it will ask you to
1:28 enter your
1:30 Gmail ID. Now, if you already have a GCP
1:33 account, then you can skip some of these
1:35 steps, but let's say you don't have an
1:37 account: you enter your email ID here.
1:40 Here you enter,
1:42 let's say, "personal project",
1:45 then continue.
1:46 Then it will ask you for a phone number.
1:48 So you enter the phone number; it will send a
1:50 code. You verify it.
1:53 Upon verification,
1:56 you can enter
1:58 your payment method. Now, here, when you
2:01 enter your
2:04 credit card, do not worry: you get some
2:08 credit for free, $300 or something like
2:10 that, and then
2:12 you also
2:14 don't have to worry about the auto
2:16 recharge, because it will not charge you
2:19 until
2:20 you give permission. So don't worry,
2:22 just enter your credit card information.
2:24 They're not going to charge anything.
2:25 You're getting some credit for free,
2:27 which will be enough for this project.
2:29 When you say "start my free trial", it will
2:32 ask you a couple of questions. You can just
2:34 say, okay, I'm here for...
2:38 I'm not sure yet. You can just give
2:40 anything. I'm interested in an AI
2:44 engineer/developer role. Done. Just a couple
2:46 of
2:47 basic questions and you are done. Now you
2:49 can create a project. So see,
2:51 first you created
2:53 a GCP account. Now you're creating a
2:56 project from the GCP account. There is a
2:57 documentation link here. You can read
2:59 through the documentation, but it's really
3:02 very easy. You just go here
3:04 and you click on new project
3:06 and you will say
3:09 potato disease classification
3:13 and
3:15 just say create.
3:18 So now it's creating the project. In
3:20 this project, we're going to deploy our
3:23 Google Cloud function, so this
3:25 project is kind of an area where
3:29 our Google Cloud function will be
3:31 deployed. So here you can click at the
3:33 top,
3:34 select "potato disease classification" as
3:36 the project, and here is the project ID
3:39 that you will be using later on.
3:43 Now, create a GCP bucket.
3:47 So a GCP bucket is like Amazon S3; you
3:49 create a...
3:51 it's like Google Drive, you know. Like, you
3:53 have some area in the cloud where you
3:55 are storing your objects.
3:58 Okay, so that's what it is. So here I can
4:01 say create a bucket,
4:03 and
4:05 let's say
4:07 codebasics-tf-models. Now, it has to be
4:09 globally unique. I have already taken
4:12 this name, so it's not working, but I will
4:14 just say codebasics-tf-models-2. You
4:16 need to give your own unique name, and in
4:19 your project you use your bucket name. If
4:22 you start using my bucket name, probably
4:24 it's not going to work. So
4:26 make sure you create your own bucket.
4:28 For the region, I will select US, but
4:31 you can select from
4:33 London, Belgium, Mumbai, whatever country;
4:36 just select the appropriate
4:38 region, so that your
4:40 physical server, the data center, is,
4:43 you know, in that region, and you
4:46 don't have issues with speed. So just
4:49 continue with pretty much all the
4:50 default
4:52 values here,
4:54 and now you can
4:58 create a folder. So in this bucket I'm
5:00 going to create a folder called "models";
5:04 in that, I will upload my model. So here
5:07 you can say upload files,
5:10 and potatoes.h5 is
5:12 the h5
5:13 object I will upload.
5:15 Okay now I have my model
5:18 available here
5:19 And the next step...
5:22 see, this is the step which we
5:24 already performed. You need to now install
5:27 the Google Cloud SDK. So go here
5:32 and
5:35 click here.
5:37 Click on this link. See, I'm on Windows;
5:39 macOS, whatever OS you are on, you need
5:42 to follow that instruction. I'm on
5:43 Windows, so I will
5:45 click on this;
5:47 it will
5:49 download that file. See, it's downloading
5:51 here,
5:52 and then
5:54 you will go through the
5:55 installation here.
5:58 I already have it installed, actually,
6:00 but I'm
6:02 going to probably
6:03 override it, just to kind of show
6:05 you all the steps.
6:08 It is still going. It's going to take some
6:10 time, so just be patient;
6:12 it will finish at some point for sure.
6:17 So here it's almost done. You can click on
6:19 next;
6:20 just tick all these checkboxes
6:22 so that, you know, it starts the Google Cloud SDK
6:25 shell itself. So when you click
6:27 finish here,
6:30 it's going to start
6:33 the Google Cloud SDK shell.
6:36 I already have this project selected. You
6:38 might get a different screen
6:40 but I already have this project selected.
6:42 So
6:43 I will just say create a new
6:45 configuration here,
6:48 and I will just say,
6:50 whatever,
6:52 "codebasics two".
6:59 I will log in with a new account, just to
7:01 show you,
7:02 you know, how it works.
7:05 So you give your credentials, and then it
7:08 will say you are
7:10 authenticated.
7:12 Okay.
7:13 Now here
7:15 you need to
7:17 pick your project. So what was my project?
7:21 So let's go
7:28 here.
7:29 "gifted-pillar", that's my
7:32 project ID. So I will select gifted-pillar
7:36 number two as my project ID.
7:40 Okay.
7:45 I think we have already done this, but
7:46 let me just make sure: gcloud
7:49 auth login.
7:53 Okay, I'm logged in. Now,
7:55 all right,
7:57 so I'm
7:58 authenticated and I'm ready to deploy my
8:01 code to Google Cloud. But hey, which
8:04 code should we deploy? We need to write
8:06 code that we can deploy to Google Cloud.
8:09 So here I will create, in my project
8:11 structure, a new directory called GCP,
8:14 okay?
8:15 And in that
8:16 I will
8:17 add a Python file called main.py,
8:21 and this file will have the code that will
8:24 be deployed to Google Cloud.
8:27 You also need to have a requirements.txt
8:30 file. So, requirements.txt.
8:34 Why do we need this file? Because when
8:36 you deploy this particular code to
8:38 Google Cloud, you need to tell it
8:40 which
8:42 libraries should be installed first,
8:45 because these libraries are installed on
8:46 your local computer; they're not
8:48 installed on Google Cloud yet. So
8:50 you need to specify all of that, and once
8:52 you have that here, when you deploy your
8:54 code from here, it will first install the
8:57 libraries from the
8:59 requirements.txt file. Now I'm going to go
9:01 into my
9:03 main.py,
9:05 all right,
9:06 main.py,
9:08 and let's start writing some code.
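As a sketch, the requirements.txt might look like the following; the exact package list is an assumption based on the imports used in this video (google-cloud-storage for the bucket download, tensorflow for the model, numpy and pillow for image handling), and you should pin versions that match your local environment:

```
google-cloud-storage
tensorflow
numpy
pillow
```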
9:12 We'll import some essential libraries.
9:14 I'm importing google.cloud storage,
9:17 this particular module, to access that
9:21 Google Cloud bucket
9:23 object, you know, the model that we have
9:26 uploaded.
9:28 And my bucket name
9:31 is
9:32 basically codebasics-tf-models...
9:36 okay, what is my bucket now? I
9:38 forgot...
9:40 it's
9:42 codebasics-tf-models-2.
9:49 and then
9:52 Okay, my class names,
9:54 they are the same as what I had in my
9:56 notebook,
9:58 and I'm going to create a variable for the
10:00 model. Here I will,
10:02 you know, I will use this variable
10:05 to store my model. Now, first,
10:07 let's download the model, and we are
10:10 calling it a blob:
10:11 binary large object,
10:15 you know. So
10:17 this is a generic function we are writing,
10:18 where you specify the bucket name,
10:21 your source
10:23 blob name, and your
10:25 destination
10:26 file name.
10:29 So source blob name means the blob in
10:32 your bucket,
10:34 but destination file name means...
10:36 this function will be running on a
10:38 different server in Google Cloud,
10:41 and that server will be downloading the
10:44 model from the bucket.
10:56 We'll see it in a minute; don't worry too
10:59 much about this, it's not that
11:00 complicated. So now storage_client is
11:04 basically storage.Client().
11:08 We are doing this to download our object,
11:11 you know, the potatoes.h5 file, from
11:13 the cloud,
11:15 and then
11:16 storage_client dot
11:19 get_bucket.
11:21 What is my bucket? Well, whatever you pass
11:23 in the function
11:24 parameter.
11:26 And there goes your bucket.
11:29 Once you have the bucket, you do bucket dot
11:31 blob,
11:33 source blob name,
11:36 and that is your blob,
11:38 and you download this blob
11:41 with download_to_filename,
11:44 so you download it to
11:45 this particular destination
11:49 file path.
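A minimal sketch of the download helper described here, assuming the standard google-cloud-storage client API (Client, get_bucket, blob.download_to_filename); the destination_path helper is hypothetical, added just to show the /tmp convention:

```python
import os


def destination_path(source_blob_name):
    # Cloud Functions instances can only write under /tmp,
    # so the model gets downloaded there
    return os.path.join("/tmp", os.path.basename(source_blob_name))


def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Download a blob from a GCS bucket to a local file path."""
    # imported lazily so the rest of this sketch runs without GCP installed;
    # requires the google-cloud-storage package
    from google.cloud import storage

    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
```

A call for this project would then look like `download_blob("codebasics-tf-models-2", "models/potatoes.h5", "/tmp/potatoes.h5")`, with the bucket name replaced by your own.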
11:51 And I just...
11:53 I'll just now
11:55 write a predict function. You know how we
11:57 wrote it in our API folder;
12:00 and in predict we'll get a request as an
12:03 input.
12:06 First thing you do is, of course, you want
12:08 to download
12:11 the model,
12:13 and the first thing is the bucket name.
12:17 So the bucket name is
12:19 this one,
12:23 and the source
12:24 blob name is this.
12:28 Why this "models/potatoes.h5"? Well, we
12:31 already saw it, right?
12:33 The models directory; inside that
12:35 there is a file called potatoes dot
12:39 h5,
12:41 and that...
12:43 so this is
12:44 in the bucket on the cloud, somewhere on some
12:47 machine.
12:48 I want to download it locally into
12:51 the
12:52 /tmp directory
12:54 as this file name,
12:58 okay?
13:01 And then
13:03 you can just say tf, tf.keras
13:07 dot load_model.
13:11 So now you are loading this model,
13:13 because once this call is executed, the
13:15 model is available here. You are loading
13:17 it here.
13:20 Now, this cloud function will be executed
13:21 multiple times, and you need to execute
13:24 this download only on the first call.
13:28 Hence I will say: if model is not None,
13:33 then do this;
13:35 otherwise, if the model is downloaded already,
13:37 why do you want to download it again?
13:41 And
13:43 this particular variable is global;
13:47 this is how Python works, basically.
13:52 Now, the request
13:54 will have
13:55 a parameter called files. Don't ask me
13:57 why; that's how it is. And it will have a
14:00 key called "file".
14:01 If you remember from our Postman request,
14:04 that is your image. So from Postman,
14:07 when you upload a file,
14:08 or from a mobile app,
14:10 it will come in the image variable.
14:14 And that
14:15 image could be a big file, you know, 800 by
14:18 600; you need to resize it to 256 by 256.
14:23 You need to convert it
14:25 to a numpy array as well.
14:28 So
14:32 do
14:33 Image
14:34 dot
14:37 open — open what? The image, okay.
14:39 Now I've got the image. I need to convert it
14:42 into RGB,
14:45 because, remember, our images are in RGB
14:47 format; you know, they have three channels.
14:50 And you need to resize it to
14:54 our standard size that the model was
14:57 trained on, which is 256 by 256.
15:02 And when you wrap
15:04 this whole thing into np.array,
15:08 what you get back
15:10 is a numpy array representing that image,
15:16 and that image
15:19 needs to be divided by 255, because
15:22 the values are in the 0 to 255 range;
15:25 you want to convert it to the 0 to 1 range.
15:28 I have the image, but when I do
15:31 model dot predict, it expects a batch of
15:35 images.
15:36 So it expects an image array, and to convert
15:41 the single image into an image array,
15:43 something like this, you want to do
15:45 something like this, right?
15:49 It
15:50 is
15:58 np.expand_dims,
15:58 okay?
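The scaling and batching steps above can be sketched with plain numpy; the array below is just a stand-in for a decoded 256 by 256 RGB image:

```python
import numpy as np

# stand-in for a decoded 256x256 RGB image with pixel values 0-255
image = np.full((256, 256, 3), 255, dtype=np.float32)

image = image / 255.0                 # rescale 0-255 down to the 0-1 range
img_batch = np.expand_dims(image, 0)  # add a batch dimension for model.predict

print(img_batch.shape)  # (1, 256, 256, 3)
```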
15:58 And
15:59 you get your predictions back. I will
16:01 just print my prediction,
16:04 just for debugging purposes, you
16:06 know.
16:08 And then the remaining code is the same.
16:10 We already looked at it: these
16:12 predictions will be a
16:14 prediction for each of the three classes.
16:16 We have
16:17 early blight, late blight, and healthy.
16:20 Let's say the values come out to be, like,
16:23 0.05, 0.05, 0.9; then this is the
16:28 class, you know.
16:29 Let's say this is healthy; so then this
16:31 is the class, and when you do np
16:35 dot argmax on this type of array,
16:38 it will return you the index of the
16:40 maximum element. The maximum element is 0.9,
16:42 and the index is 2. So that's what it
16:44 will do. So we will
16:48 just pass the prediction here. And by the way,
16:50 predictions are for multiple images. See,
16:53 it's an image array; my image array has
16:55 only one.
16:57 Hence, I always take the zeroth image.
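The argmax step can be sketched like this; the probability values are made up for illustration:

```python
import numpy as np

class_names = ["Early Blight", "Late Blight", "Healthy"]

# model.predict returns a batch of predictions, so take the zeroth entry
predictions = np.array([[0.05, 0.05, 0.9]])
index = int(np.argmax(predictions[0]))  # index of the maximum element

print(class_names[index])  # Healthy
```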
17:00 I hope this is clear.
17:02 And once you get an index, you can use
17:05 these class names. So let's say the index is
17:07 two; if you use the class names, you get
17:09 "Healthy".
17:10 So here
17:11 I do this,
17:14 okay,
17:17 and my confidence is this; we have looked
17:19 into this before, so I'm not going to go
17:22 into too much detail.
17:23 And then you return this back.
17:26 That's it: my cloud function is ready, and
17:29 I'm going to deploy this particular
17:31 cloud function. This predict function
17:34 is the one
17:35 that I will be
17:37 deploying to my cloud.
17:39 And how do I deploy it? Well, there is a
17:40 command here.
17:42 So...
17:43 But you have to remember, you need to use
17:46 the Google Cloud SDK shell here, okay? Don't use
17:48 Git Bash.
17:49 So, in the Google Cloud SDK shell,
17:52 I will...
17:58 I'll go to this directory, and then go to
18:01 the GCP directory, you know,
18:04 and here
18:05 I will just say
18:10 gcloud
18:12 functions deploy
18:15 predict. What is predict? Predict is the
18:17 function name. See this function name?
18:20 That's this particular thing here,
18:23 and I will deploy
18:27 that. Wait,
18:29 whatever,
18:30 memory 512 megabytes. Okay, fine.
18:34 What is my project ID? The project ID is this,
18:37 here,
18:39 and enter.
18:44 Okay... so,
18:46 all right, for the project ID, I need to say
18:52 --project.
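Putting the pieces together, the deploy command looks roughly like this; the runtime version is an assumption (use one your SDK supports), and the project ID is a placeholder for your own:

```
gcloud functions deploy predict \
  --runtime python38 \
  --trigger-http \
  --memory 512MB \
  --project your-project-id
```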
18:58 It will take some time, okay.
19:00 What is this? "Would you like to enable
19:02 and retry?" Yes.
19:04 So the Google Cloud API needs to be enabled,
19:07 and if it is not enabled,
19:09 you will get this prompt. You should just
19:11 say yes.
19:17 We want anyone in the world
19:19 to be able to call our function.
19:25 Usually, if you're working in a corporate
19:26 environment, you will have some kind of
19:28 authentication. But since this is a study
19:30 project, we are saying, okay, anyone can
19:32 call the function. Now, here we got an
19:34 error because the Cloud Build API wasn't enabled.
19:36 Don't worry; you can just copy this
19:39 particular
19:41 link here, and you just have to click on
19:43 the button. You just say okay, enable, and
19:46 then retry that command.
19:48 So I'm going here.
19:50 I'll say enable; it will take a few seconds.
19:52 So the Cloud Build API will be enabled; it
19:55 will bootstrap your environment,
19:57 and then you can run the same command
19:59 again.
20:01 Perfect, it got enabled.
20:03 Now,
20:04 I will go here and
20:06 run the same command again.
20:14 It takes some time, you know, up to two
20:16 minutes, three minutes,
20:18 to deploy, because it will
20:21 install those modules, it will
20:24 prepare a server, you know, start a
20:26 process. So there is some
20:29 setup involved here; that's why it
20:30 takes time.
20:33 Wonderful, it is deployed now, and here is
20:36 the URL
20:37 that we can use to call the function.
20:41 You can
20:42 get an error; it's quite possible,
20:45 and in case of an error,
20:47 I would suggest this:
20:49 go here,
20:52 go to, let's say, your home page here.
20:56 This is the home page,
20:58 and
21:00 here in the error reporting you will
21:03 find
21:04 the errors you faced, the errors during
21:06 deployment. Okay.
21:08 Now, in our case,
21:10 we did not see any error.
21:13 So let me just refresh this real quick.
21:21 Here you will see Cloud Functions, and
21:25 there you will see our function running.
21:26 It's green: the predict function is up
21:30 and running.
21:31 You can click on it.
21:33 You can see the activity, you can see the
21:35 logs, and so on.
21:38 We'll use Postman now to call that
21:41 function. So I'm just going to create a
21:43 new tab. We'll be making a POST request,
21:47 and the request will have
21:49 this URL.
21:52 This is the URL that you're calling.
21:59 Copy.
22:04 Okay.
22:06 You can get the URL
22:08 by
22:09 going
22:10 here, into the function detail.
22:15 Okay, go to the function detail
22:19 and
22:22 the trigger.
22:23 Yeah, see, the same URL. You can just click on
22:25 this and it will copy.
22:27 In the body, we know it's form-data; the
22:30 key is "file"; we have looked into this
22:32 before.
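Outside Postman, the same form-data call can be made with curl; the URL and file name here are placeholders, so use the trigger URL from your own function's detail page:

```
curl -X POST -F "file=@potato_leaf.jpg" \
  https://REGION-PROJECT_ID.cloudfunctions.net/predict
```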
22:34 And the file: now I'm going to
22:36 select some files which I downloaded
22:38 from the internet. Now, these are files
22:40 which have different,
22:43 you know, X and Y, different sizes, basically.
22:47 So, for example, this one is 800 by 539,
22:51 so I wanted to test my function on an
22:53 entirely different image.
22:57 Okay, it's saying "could not handle the request".
23:01 What happened here?
23:03 Let's check it out.
23:09 Okay, the function is deployed. Okay, let's
23:12 check the logs.
23:19 So it looks like it crashed. So here are the
23:22 crash details:
23:28 "predictions = model.predict(img_array)":
23:30 AttributeError: 'NoneType' object has
23:33 no attribute 'predict'. Okay, so model, this
23:39 variable,
23:40 is
23:41 None.
23:42 Looks like that.
23:45 So model came out
23:47 to be None.
23:51 Oh, because I think I made a mistake. I should
23:55 have put "model is None" here. If model is
23:58 None, then you do that. Okay, cool.
24:01 So I made this code
24:02 change,
24:04 and
24:06 I will deploy it again. So just
24:08 rerun the same command; that's it.
24:10 You need to wait for
24:12 two more minutes. I know it's
24:15 kind of annoying, but
24:17 you've got to wait.
24:20 Perfect, it is deployed again.
24:23 I will go to Postman and make a call,
24:27 hoping it will work this time. If it
24:30 doesn't, then you can go to the logs again
24:32 and figure out what the error is.
24:36 The first call usually takes more time,
24:38 because it's loading the model from
24:40 the Google bucket and then
24:42 doing the prediction.
24:44 There is a concept of cold and hot
24:46 invocations.
24:48 Okay, again we faced an error. So let's
24:51 see what's going on.
24:53 I'm just going to refresh the logs here.
25:04 All right, why am I getting the same
25:07 error again?
25:09 There's something wrong in the code.
25:14 So let's see: global model, if model is
25:16 None, then download_blob with models/potatoes.h5
25:20 and the /tmp destination,
25:24 and I want to make sure it's codebasics-
25:27 tf-models-2. So codebasics-
25:30 tf-
25:32 models-2, models/potatoes.h5.
25:40 Here you can say print,
25:43 "I will download the
25:46 model now",
25:56 "model downloaded",
25:59 and I will
26:03 do this,
26:06 because the model is coming out to be None, I
26:09 think, right? That's what the
26:12 error
26:13 is.
26:18 So in predict, the model is coming out to be
26:22 None; the model is None.
26:27 Oh, I made a big mistake, actually. It
26:30 should be tf.keras.models.load_model.
26:34 I think that was the issue.
26:38 Okay, again, rerun the same command.
26:44 Okay, after waiting for some time,
26:46 again it is
26:48 up and active.
26:50 Let's try one more time.
26:54 Wonderful, so it's working fine now. You
26:57 can see
26:58 I'm able to call the Google Cloud function;
27:00 the response came back fine. I will try
27:03 some other files. This one is for
27:05 late blight,
27:09 and
27:10 let's see what happens with this one...
27:15 Okay, that worked as well. So overall,
27:18 my function is ready. In the next video,
27:20 we will be using the mobile app to call this
27:23 function.
27:25 I have all the instructions
27:27 on my
27:29 GitHub page, and
27:31 all the code, everything, is available. So
27:33 try it out. If you like this series so
27:36 far, please give it a thumbs up and share
27:37 it with your friends. Thank you.