0:02 hello hi everyone welcome to Cloud
0:04 Sprint in the last video we learned how
0:06 to choose the location and the classes of
0:09 the buckets and we also did a case study
0:11 if you have not watched that video the
0:13 link is in the description do watch it
0:15 as well in this video we are going to
0:17 learn how to protect our data in Google
0:19 Cloud Storage and how to save cost
0:22 by enabling bucket versioning policies
0:25 we'll also do some use cases so watch it
0:26 till the end
0:31 while working on Google Cloud Storage
0:33 your main job is to protect your data
0:36 and Google Cloud Storage offers us some
0:38 techniques to protect it the first
0:41 one is the object versioning feature it
0:43 basically supports the retrieval of
0:46 objects that are deleted or replaced
0:48 with this feature enabled your versions
0:51 one two three four are maintained
0:53 and if you delete version one you still
0:55 have version two available for your users
0:57 this will protect you from any
1:00 accidental deletion the second option is
1:02 the retention policy a retention policy
1:04 prevents the deletion or modification of
1:07 a bucket's objects for a specified minimum
1:09 period of time after they are
1:11 uploaded which means that you cannot
1:15 delete or modify an object or file once
1:17 you have uploaded it to the bucket this
1:20 will help you to protect your data for a
1:21 specific period of time by ensuring
1:24 it's not deleted let's go ahead to
1:27 the console and find that out in detail
1:30 over here we will come to cloud storage
1:32 and we'll click on create this will
1:34 create a bucket for us once you click
1:37 on it you'll be asked to give a
1:39 name which is globally unique so I'll
1:42 say cloud Sprint
1:44 continue in the last video we learned
1:46 how to choose a location so let's say
1:49 I say regional continue we also
1:51 understood when to choose which
1:53 storage class for now let's keep
1:55 it default which is standard we also
1:57 learned about what uniform and
1:59 fine-grained access are and how to enforce
2:02 public access prevention on this bucket
2:05 the major part which we are talking about in
2:06 this video is how to protect your
2:10 data okay your data is always protected
2:11 with Cloud Storage but you need to
2:13 choose additional data protection
2:15 options to prevent data loss and GCP
2:17 offers us two options object versioning
2:19 and retention policy if you want to
2:22 store your data for data recovery if you
2:24 want that when something is deleted I should
2:27 be able to recover it then you will be
2:29 using object versioning you can configure
2:31 the maximum number of versions kept per
2:33 object suppose
2:35 you just want to save the last five versions
2:38 and expire noncurrent versions a
2:40 noncurrent version means one which is not
2:43 the live object that's the difference so
2:46 suppose A is your current version
2:49 and you overwrite it with B so A becomes
2:51 a noncurrent version and B becomes the
2:52 live version that's the difference so
2:54 you want A to be deleted after
2:56 seven days this will help you to save
2:58 cost otherwise you will have you
3:01 know n number of versions per object and
3:03 that will add to the cost and you
3:04 don't want that to happen
3:07 that's all about object versioning this
3:09 will help you to survive any accidental
3:10 deletion
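The setup described above can be sketched with gsutil and a lifecycle config. This is a hedged sketch: the bucket name `cloud-sprint` is a placeholder, and the gsutil calls that need real credentials are commented out; the JSON field names follow the documented lifecycle configuration format.

```shell
# Enable object versioning, then cap cost with lifecycle rules:
# keep at most 5 noncurrent versions, and delete noncurrent
# versions 7 days after they are replaced.
cat > versioning-lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"isLive": false, "numNewerVersions": 5}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"daysSinceNoncurrentTime": 7}
    }
  ]
}
EOF

# The actual gsutil calls (need a real bucket and credentials):
# gsutil versioning set on gs://cloud-sprint
# gsutil lifecycle set versioning-lifecycle.json gs://cloud-sprint

# Sanity-check the config is valid JSON before uploading it.
python3 -m json.tool versioning-lifecycle.json > /dev/null && echo "config OK"
```

With both rules attached, Cloud Storage itself prunes old versions, so the recovery window stays bounded and so does the bill.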
3:13 the second option is retention policy as the
3:16 title says it is best for compliance if
3:18 you want to prevent the deletion or
3:20 modification of a bucket's objects for a
3:23 specific minimum duration of time after
3:25 being uploaded which means that you want
3:28 that an object should not be deleted for 30
3:31 days nobody should be able to modify or
3:34 delete the objects within this bucket
3:36 because you have some compliance to be
3:39 fulfilled that X data must be available
3:42 for 30 days in a bucket you don't want
3:45 anything to happen to it because if
3:47 someone deletes the data you lose that
3:48 compliance and you might have to pay
3:50 some fines so in those kinds of scenarios
3:52 you will be using the retention policy
3:53 if you don't want to protect your data
3:54 you can just choose none I hope these
3:56 two options are clear now versioning is
3:58 for data recovery retention policy is
4:00 mostly for compliance and you
4:03 configure it while creating the bucket
4:05 one very very important point to note
4:08 here if the bucket itself is deleted
4:12 if an owner deletes the bucket these
4:14 policies do not work
4:17 okay so be very very careful about whom you
4:19 are giving what kind of access that is
4:22 why the earlier five six videos which I
4:25 have made on IAM are very very important
4:26 if you have not checked them out go
4:28 ahead and check them out as well this is
4:30 how you're gonna protect your data
4:33 using these options and this will
4:35 help you to work more efficiently and
4:37 you can choose as per your need while
4:40 creating the buckets
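For completeness, the retention policy seen in the console can also be set from the CLI. This is a sketch with a placeholder bucket name; the gsutil calls need real credentials, so only the local duration check runs here.

```shell
# Sketch: attach a 30-day retention policy to a bucket.
# gsutil retention set 30d gs://cloud-sprint   # set the policy
# gsutil retention get gs://cloud-sprint       # read it back

# "30d" means 30 days; the policy is stored internally in seconds,
# so it helps to know what that works out to.
seconds=$((30 * 24 * 3600))
echo "30d = ${seconds} seconds"
```

Until the 30 days are up, every object in the bucket rejects deletes and overwrites, which is exactly the compliance behaviour described above.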
4:42 let's go back to the presentation I hope
4:43 it is clear now how you can
4:46 protect your data on GCP storage the
4:47 next option which Google Cloud Storage
4:50 offers is lifecycle policies lifecycle
4:53 rules let you apply actions to a bucket's
4:55 objects when certain conditions are met
4:58 for example switching objects to a
5:00 colder storage class when they reach or
5:03 pass a certain age this lifecycle will
5:06 help you on many fronts these policies
5:08 will help you to save costs and also help
5:11 you to optimize your utilization
5:13 just a quick recap on the
5:16 storage classes we have standard
5:18 we have Nearline
5:21 for 30 days we have Coldline for 90
5:23 days and we have Archive for 365 days
5:26 now imagine a situation you have
5:30 some data a shopping history data which is
5:33 just used for 30 days suppose after 30
5:37 days you might or might not use it for the
5:39 next one year and after one year you
5:42 want that data to be deleted
5:44 in that case these lifecycle policies
5:46 will work on your behalf you can just
5:49 create a JSON or you can configure it
5:51 from the console or from your Terraform
5:52 however you're going to create the
5:55 bucket what you'll do is you'll create
5:58 lifecycle policies that keep the
6:00 objects in this bucket as a standard
6:04 resource for 30 days after that move them
6:05 to Archive
6:08 and once the object's age completes 365
6:11 days you can delete it
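The JSON route mentioned above can be sketched like this (bucket name is a placeholder, and the upload call is commented out since it needs credentials); the field names follow the documented lifecycle configuration format.

```shell
# Lifecycle config matching the example: objects stay in STANDARD,
# move to ARCHIVE at age 30 days, and are deleted at age 365 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Attach it to the bucket (needs a real bucket and credentials):
# gsutil lifecycle set lifecycle.json gs://cloud-sprint

# Validate the config locally before uploading it.
python3 -m json.tool lifecycle.json > /dev/null && echo "config OK"
```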
6:14 see how powerful this tool can be
6:17 you don't have to take
6:19 care of each object Google Cloud
6:22 Storage will take care of it itself it
6:24 will change the classes for
6:27 the objects and it will delete them as
6:28 well as and when needed how do you
6:30 configure it
6:32 let's go ahead and check this on the
6:34 console as well let's go ahead and see
6:36 how the lifecycle policies work
6:39 while understanding the
6:40 protection of data
6:43 we checked these options okay let's create
6:49 confirm
6:52 it will create a bucket for me quickly
6:55 over here you need to go to lifecycle
6:57 we already have two rules existing
7:00 because we selected them earlier delete
7:03 noncurrent objects with five
7:05 plus newer versions and seven plus days
7:08 after they become noncurrent
7:11 now I'll go ahead and add a rule because
7:14 I want to move something to Coldline
7:15 storage
7:19 okay I want that whenever X
7:22 happens that condition we'll
7:25 choose here my job is to move the
7:28 objects to Coldline after the object
7:30 conditions are met which we'll select in the
7:33 next window so my first job is to select
7:35 what kind of action I want to take I
7:37 want to move the objects to Coldline
7:40 storage click on continue
7:42 when do you want this to happen you need
7:44 to choose that condition you can base it
7:46 on the age on created before on
7:49 storage class matches on whether any new
7:53 version becomes available on the live state
7:56 what is your ask that's the main thing so I'll
7:59 say okay let's choose it over age
8:04 as soon as it becomes 90 days
8:08 okay clear I'm saying as any object's age
8:11 becomes 90 days move it to Coldline
8:14 storage click on create
8:17 you'll see this rule is created
8:21 and it will be attached to this bucket
8:23 which is the cloud Sprint bucket
8:26 this is a third rule now when a file
8:30 is 90 days old the class of that
8:32 particular object will be set to Coldline
8:35 let's add another rule
8:44 this time I want to choose a lifecycle
8:45 policy where I want to delete an object
8:49 click on continue when
8:53 say when the age of the file becomes
8:54 365 days
8:58 so whenever any object or file's
9:00 age becomes 365 days it should get
9:03 deleted automatically I'll say continue
9:05 I'll say create
9:09 you can see delete object 365 plus
9:11 days since the object was updated means
9:16 that any object which is 365
9:19 plus days old will be automatically
9:20 deleted because you don't need that file
9:22 right that's the point of creating a
9:25 lifecycle policy that you can take
9:28 leverage of the classes available and
9:30 take the benefits out of it you
9:34 can protect your data you can save cost
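The two rules just clicked together in the console correspond to a lifecycle JSON like the following sketch (placeholder bucket name; the gsutil read-back call is commented out since it needs credentials).

```shell
# Equivalent JSON for the two console rules: move objects to
# COLDLINE at age 90 days, delete them at age 365 days.
cat > console-rules.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# You can read back whatever rules a bucket already has with:
# gsutil lifecycle get gs://cloud-sprint

python3 -m json.tool console-rules.json > /dev/null && echo "config OK"
```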
9:38 okay that's how you configure a life
9:41 cycle you can delete all rules
9:43 in a single click also check protection
9:45 here that object versioning is enabled
9:48 we already enabled it you can also
9:51 manage the rules note that object versioning and
9:53 retention policy don't work hand in
9:54 hand because of course it's
9:56 contradictory at one place
9:59 you're saying that you cannot modify
10:00 something and at the other place you're
10:02 saying you want to save multiple
10:04 versions so they can't work together
10:07 either you'll enable this one or you'll
10:09 enable that one
10:13 okay this is how you configure lifecycle
10:15 and protect your data and your
10:17 objects can be uploaded here
10:18 all right
10:21 let's go back to the presentation okay I
10:24 hope that was helpful when to use which
10:26 class how to move it how to create a
10:28 life cycle and how to attach it to a
10:30 bucket this concludes all four concepts
10:32 you need to know the
10:36 location the classes the protection and
10:38 the lifecycle let's go ahead and do a
10:40 case study now the case study says
10:42 the cost of your application logs
10:44 is exceeding the project bill which
10:46 means whatever logs you are generating
10:48 from your application are exceeding the
10:51 project bill these logs are used by
10:54 teams regularly for 30 days and might be
10:56 used for some audit purpose once
10:58 every quarter the development team is using
11:00 these logs for a maximum of 30 days to test
11:02 or to check how your application has
11:04 behaved after that they're not used
11:07 frequently but they might be needed for
11:09 some audit purpose every quarter which
11:13 is every 90 days the logs are still
11:15 being saved in the bucket forever
11:18 resulting in higher bills which means
11:21 there is no policy we are just
11:23 dumping the logs again and again and
11:25 again and it is increasing your
11:28 bills exponentially
11:31 the task the major task is you need to
11:33 reduce the cost of storage by keeping
11:36 the logs only for 90 days how are you
11:39 going to do it pause the video think it
11:40 through
11:43 let's check the answer
11:45 the answer is we'll create lifecycle
11:48 policies the first step will be moving
11:50 the objects to Coldline after 30 days
11:52 and the second will be
11:55 deleting the objects after 90 days we
11:57 need these logs for 30 days in
11:58 standard storage as they are frequently
12:01 accessed by the development team so we
12:03 are not going to touch them for 30 days
12:05 after 30 days we'll directly move them into
12:07 Coldline storage because we need them
12:09 every quarter
12:11 you know that if you need something only
12:13 every quarter Coldline which is built for
12:16 90-day access is the fit
12:18 we also need to address the third line
12:19 which is that the data is being saved in the
12:22 bucket forever so we need to delete
12:24 the objects as the last line suggests
12:26 we only need these logs for 90
12:28 days so after 90 days we are going to
12:31 delete those logs okay I hope this
12:34 helps you understand
12:35 how all four things put together
12:37 work that's how you are going to
12:40 configure this case study
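One possible lifecycle config for the case study answer (the bucket name `app-logs-bucket` is a placeholder, and the upload call is commented out since it needs credentials):

```shell
# Case study: logs stay in STANDARD for the 30 days the dev team
# uses them, move to COLDLINE for the quarterly audits, and are
# deleted after 90 days to stop the bill from growing forever.
cat > logs-lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 90}
    }
  ]
}
EOF

# gsutil lifecycle set logs-lifecycle.json gs://app-logs-bucket

python3 -m json.tool logs-lifecycle.json > /dev/null && echo "config OK"
```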
12:42 I hope this is clear and if you have
12:44 any question please comment and let
12:47 me know I'll try to answer them one last
12:49 point the Cloud Storage CLI is called
12:52 gsutil it's a Python application which lets
12:53 you access Cloud Storage from the
12:56 command line by using gsutil you can
12:59 create and delete buckets upload
13:01 download list move copy anything all
13:05 operational work you do via it gsutil
13:06 is a very very powerful and very very
13:10 reliable CLI utility and by using this
13:13 CLI you can do anything okay I really
13:15 recommend you to go through the
13:17 documentation link because it has a lot
13:20 of options this is the gsutil
13:22 documentation and you can see all the
13:25 commands that gsutil can do for you
13:28 for example one of them is gsutil cp
13:31 all the commands have a lot of other
13:32 possibilities all these possibilities
13:34 you can find here
13:36 and I really recommend you to go ahead
13:39 and check a few of them because this is
13:41 relevant for the exam as well and when
13:43 you work you should know all these
13:46 commands to work with
13:47 gsutil
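As a quick reference, a few everyday gsutil commands look like the sketch below. All of them need credentials and a real bucket, so they are shown commented out; the bucket and file names are placeholders.

```shell
# gsutil mb gs://cloud-sprint                          # make a bucket
# gsutil cp report.csv gs://cloud-sprint/              # upload a file
# gsutil ls gs://cloud-sprint                          # list objects
# gsutil cp gs://cloud-sprint/report.csv .             # download a file
# gsutil mv gs://cloud-sprint/a gs://cloud-sprint/b    # move or rename
# gsutil rm gs://cloud-sprint/report.csv               # delete an object
# gsutil rb gs://cloud-sprint                          # remove the bucket
# gsutil help                                          # full command list
```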
13:50 on Google Cloud Storage alright that sums
13:53 up our Google Cloud Storage topic and
13:55 thank you for your time thanks for