SAP-C02 Amazon AWS Certified Solutions Architect Professional – New Domain 3 – Migration Planning part 1
- Migration Strategies
Hey everyone and welcome back. In today’s video we will be discussing migration strategies. Now, there are six major migration strategies that we need to understand. The first one is rehost. Second, you have replatform. Third is refactor, or re-architect. Fourth is repurchase. Then you have retire and retain. Now, many times in exams you might not be asked a question on what exactly rehost is, but you will sometimes see questions that mention “lift and shift”, so you need to have a high-level overview, specifically of this lift-and-shift migration strategy. The simplest one is rehost. It is also referred to as lift and shift, and it basically deals with moving the application without any changes. So let’s say you have a server on premise which has MySQL installed.
So what you do during the migration is create a new instance and install the same version of MySQL there. No changes are involved; all you are doing is moving from an on-premise server to a cloud server. Now, the second migration strategy is replatform. Replatform basically deals with using some cloud-based optimization to achieve a certain tangible benefit. It’s like moving an on-premise database to RDS: you make certain cloud-based optimizations to take advantage of the cloud technology. It is also referred to as “lift, tinker and shift”. The third one is refactor, or re-architect. In this type of approach you basically redesign your application in such a way that it takes advantage of cloud-native features. It’s like migrating your on-premise application completely to serverless.
This is called refactor and re-architect because if you are planning to migrate to serverless, a lot of re-architecting and refactoring would be needed. All right, so that is the third one. The fourth one is repurchase. Repurchase basically states that you move from a perpetual license to a software-as-a-service model. A simple example I can give you is Nessus. Nessus is basically a vulnerability scanner; if you have Nessus, you can move to Amazon Inspector, which follows a complete SaaS model here. The next one is retire. Retire basically states that you remove the applications which are no longer needed. Typically in enterprises you will see that there are a lot of applications that keep on running, and no one has any idea what those applications are about.
So you just remove the applications that are no longer needed. Let’s say you have an on-premise FTP server which has not been used; you can go ahead and retire it. And the last one is retain. Retain basically means that if the application is critical, and you find that a lot of refactoring would be needed for it to be migrated and that is not the best choice, then you can just retain it and think about migration at a later stage. So that is the high-level overview of the migration strategies. Again, this is quite simple to understand. Now, when we talk about migration in terms of AWS and in terms of the exam, we need to remember that migration is not always related to servers. An organization might want to continue to use their on-premise servers and only migrate their data, which is lying on magnetic tapes, to AWS Glacier.
So it can be a data-level migration as well. Now, throughout the migration domain, it’s not only migration related to servers that we should understand, but migration in its various aspects. There are a lot of services that help us migrate into AWS. Some of them include AWS Snowball, the Server Migration Service and the Database Migration Service. You also have the Application Discovery Service, which helps here, AWS Snowmobile, and various others. These are some of the services that we will be discussing throughout this domain of the certification exam.
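The six strategies above can be summarized in a quick reference table. The sketch below is purely illustrative (the one-line examples are paraphrased from this lesson, not an official AWS table):

```python
# Illustrative quick reference for the six migration strategies from this
# lesson; the example text is paraphrased, not official AWS wording.
strategies = {
    "rehost":     "lift and shift: move the app unchanged, e.g. MySQL on-premise to EC2",
    "replatform": "lift, tinker and shift: e.g. on-premise database to RDS",
    "refactor":   "re-architect the app for cloud-native features, e.g. serverless",
    "repurchase": "swap a perpetual license for a SaaS product",
    "retire":     "remove applications that are no longer needed",
    "retain":     "keep the app on-premise for now, migrate later",
}

for name, example in strategies.items():
    print(f"{name:12} {example}")
```

This is just a memory aid for the exam; the key pair to remember is rehost = lift and shift.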
- AWS Import/Export
Hey, everyone, and welcome back to the Knowledge Portal video series. Today we’ll be speaking about a very interesting service provided by AWS called AWS Import/Export. So let’s go ahead and understand the use case for which the service was launched. In simple terms, AWS Import/Export allows us to import or export large amounts of data between AWS and portable storage devices. Let me give you one example. I have a portable storage device: a Western Digital My Passport of two terabytes. Now, I want to upload all the contents of my portable hard disk drive.
Let’s assume I want to upload the entire contents of this hard disk drive to an S3 bucket. Now, if you really want to upload two terabytes of data with a normal Internet connection, it can take you two to three months. And when the data is large, say for a small or medium-sized organization with ten terabytes of data, uploading that much data via a normal Internet connection will take a very long time. This is the reason AWS decided to make it simple. AWS said, okay, you send us the data on a hard disk drive, and we will import it from our side into the S3 bucket. So you don’t really have to involve an Internet connection in uploading the data.
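To see where the “two to three months” figure comes from, a quick back-of-the-envelope calculation helps. The sketch below assumes a sustained uplink of 2 Mbps, which was a plausible “normal Internet connection” for the scenario described; the numbers scale linearly with link speed:

```python
def upload_days(size_bytes, mbps):
    """Rough transfer time in days at a sustained link speed in megabits/s."""
    bits = size_bytes * 8
    seconds = bits / (mbps * 1_000_000)
    return seconds / 86_400  # seconds per day

two_tb = 2 * 10**12  # 2 TB, decimal

print(round(upload_days(two_tb, 2), 1))    # ~92.6 days on a 2 Mbps link
print(round(upload_days(two_tb, 100), 1))  # ~1.9 days on a 100 Mbps link
```

So at a slow sustained uplink, two terabytes really does take around three months, which is exactly the gap the Import/Export service was designed to close.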
So that is the import feature. You also have an export feature. Let’s assume you have a lot of data, say five terabytes, in an S3 bucket and you want that data in your organization. Manually downloading the data via the Internet will again take a lot of time. With AWS export, you can actually ask AWS to send the data to you at your postal address, and they will ship it to you. So it becomes much simpler for an organization to import and export large amounts of data to and from AWS. There are three functionalities provided by AWS Import/Export. The first one is import to S3: if you want to import data from a portable device to S3, this is possible.
Second is export from S3: if you have a large amount of data in S3 that you want, you can do that as well with the AWS Import/Export service. And third is import to AWS EBS, where you want to put the data in the form of an EBS volume; this is also something you can do with the service. Perfect. So now that we understand the basics of what the AWS Import/Export service is all about, let’s go ahead and look into how we can implement it. There are three steps involved. The first step is downloading the AWS Import/Export tool.
The next step is to save the credentials to a file. So let’s assume you want to do an import to S3. The first thing you’ll need is an S3 bucket where AWS will be importing the data to. Once you have an S3 bucket, you have to put credentials that can access that bucket into a file. This is very important: the credentials must have the specific permissions to do that. Once you put the access key and secret key in the file, you then create either an import job or an export job. So that is the theoretical part. Let’s do one thing: let’s go ahead and do the practical aspect related to the steps we have discussed.
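The credentials file the tool reads is a plain properties file with two entries. A minimal sketch of generating it is below; the key names follow the tool’s sample file, the values are obviously placeholders, and you should never commit real keys anywhere:

```python
# Sketch of the AWSCredentials.properties file the Import/Export tool reads.
# Key names are taken from the tool's sample file; values are placeholders.
credentials = (
    "accessKeyId:AKIAEXAMPLEEXAMPLE\n"
    "secretKey:exampleSecretKeyDoNotUseAnywhere\n"
)

with open("AWSCredentials.properties", "w") as f:
    f.write(credentials)

print(open("AWSCredentials.properties").read())
```

The IAM user behind these keys needs permission on the specific bucket you are importing to or exporting from, as discussed below.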
So I am in the documentation, and I’ll share the link with you; in the documentation they have given the link to download the Import/Export web service tool. I’ll go ahead and download the tool. Let me just try that again; I’ll start my download. Perfect, it is downloaded in the form of a zip file. I’ll open it up and extract it to my desktop. Let me create a folder called import-export, and I’ll extract it inside this folder. So within the demo folder, let me take the full path and paste it over here. Perfect. Now that we have done this, you will see there are a few files which have been extracted here. One is the AWSCredentials.properties file; this is where we have to put our access key and secret key.
So let’s do one thing: let’s open up this folder in a text editor so that it becomes much clearer. What we have is our import-export folder, and there is a file called AWSCredentials.properties. This specific file has two empty values: one is the access key ID and one is the secret key. As we have already discussed, we have to put the access key ID and the secret key here. Now, this access key ID and secret key should be able to connect to the bucket in AWS S3. I have a sample bucket called Kplabs Myinternal. So whatever access key and secret key you provide should have permission on the bucket you want to import to, or export from.
This is very important to remember, and there is one more important thing to do. Once you have configured this and saved your credentials to the file, the next thing you need to do is create an import job or an export job. Let me show you the format for that. Within the examples directory, they have given us the sample format for import and export. If you go over here, this is the sample format where you have to fill in the details: the bucket name and the device ID. The device ID is basically the ID of the storage device that you will be sending to AWS. Then you have to put the notification email, followed by the return address.
Because if you’re sending your storage device to AWS, once they import the data they will send the storage device back to you, so you have to specify the address to which they should ship it. You can even encrypt your data and put an encryption password as well. Once you fill in all of this, you have to move this file to the root of the directory. So let’s do one thing. If you see over here, I have already filled in both of these files, and I’ll paste them in the root. Let me show you. I just filled them in beforehand so that time would be saved. So this is the access key and secret key ID; don’t worry, these will be deleted by the time the video is uploaded. So this is the AWSCredentials.properties file, and the second one is the manifest file. If you see over here, I have filled in all the details. I’ll just check the bucket name: you have Kplabs Myinternal; let’s just verify, Kplabs Myinternal. Perfect. Along with that I have filled in the notification email address and a random return address. Okay, since this is just a demo, these are the details I have filled in, and I’ll click on save. Once you do that, you have to run a Java command which will process the file and give you a signature value. So let me just show you. If you go into the documentation, it gives you step-by-step instructions.
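The manifest fields just discussed (bucket, device ID, notification email, return address) fit into a small YAML-style file. A minimal sketch of an import manifest is below; the field names are modeled on the sample shipped with the tool, and every value is a placeholder:

```python
# Minimal import manifest sketch, modeled on the tool's sample manifest.
# All values here are placeholders, not real credentials or addresses.
manifest = """\
manifestVersion: 2.0
bucket: kplabs-myinternal
deviceId: WD-EXAMPLE-1234
eraseDevice: no
notificationEmail: ops@example.com
returnAddress:
    name: KP Labs
    street1: Example Street
    city: Mumbai
    country: India
    postalCode: 400001
    phoneNumber: 0000000000
"""

with open("MyManifest.txt", "w") as f:
    f.write(manifest)

print("deviceId" in open("MyManifest.txt").read())  # → True
```

The device ID identifies the physical drive you are shipping, and the return address is where AWS sends the drive back after the import completes.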
The first step is downloading the tool. The second step is saving your credentials in a file, which is something we have already done. And the next step is to create an import job or an export job. Within the import jobs you have import to S3 and import to EBS, and for the export job you have export from S3, which is something we have already discussed. So let’s click on import to S3. If you go down over here, it tells us to fill in the manifest file, which we have already filled in. After that you have to run a command, which is specified over here. Let me just show you. So this is the command that you have to type in. In my case, the manifest file name, if you see it, is newmanifest.txt.
So this is the command we have to run. Let’s do one thing: let’s open up the command prompt, and I’ll go to Desktop, demo, and import-export. Perfect. Now that we are here, we’ll copy the command, paste it over here, and press Enter. What is happening is that it is taking the data from the files: it reads our AWSCredentials.properties as well as newmanifest.txt, and then it generates a signature file. If you see over here, let me show you, a new SIGNATURE file got generated. Now you are supposed to copy this signature file to the root of the storage device that you will be sending. So if I’m going to send this specific portable storage device to Amazon, I have to copy this specific file to the root of the device. Amazon will verify this file; it’s very important.
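The copy step itself is trivial but easy to forget. The sketch below illustrates it with temporary directories standing in for the working folder and the drive root (the real device root would be something like `E:\` or `/media/usb`; the paths here are hypothetical placeholders):

```python
import os
import shutil
import tempfile

# Illustrative sketch only: after the tool generates a SIGNATURE file, it
# must be copied to the root of the storage device you ship to AWS.
workdir = tempfile.mkdtemp()  # stands in for the tool's working directory
signature = os.path.join(workdir, "SIGNATURE")
with open(signature, "w") as f:
    f.write("dummy-signature-value\n")  # stand-in for the tool's real output

device_root = tempfile.mkdtemp()  # stands in for e.g. E:\ or /media/usb
shutil.copy(signature, device_root)

print(os.path.exists(os.path.join(device_root, "SIGNATURE")))  # → True
```

AWS matches this file against the job you created; without it on the drive, they ship the device back untouched.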
Do not forget to copy this file to the root of your storage device; otherwise, they’ll just send you the storage device back without doing anything. Once this process is completed, the last step is to generate the label. So let me show you. That was the file generation part. Now, after you generate the file, you need to create a package and send that package to AWS. In order to do that, you need a shipping label as well, and this is why the documentation tells us to call a shipping label API to retrieve the shipping address of AWS. Perfect. So let’s go to “Shipping your storage device” and click on “Generate your prepaid shipping label”. Now, within this, you have one more command.
So this is the command. Just copy it and paste it into the terminal. Oops, I think it did not copy; let’s try it once again. It still did not take it, so I’ll copy it manually and paste it. Perfect. Now it is asking us for the job ID. The job ID is the one which got created when the signature was generated, so just copy that job ID, paste it, and press Enter. Put in your name; the company would be KP Labs; for the street let’s say I just put Mumbai; then city; country, India; postal code; and a phone number, which I’ll just put as something random. Once you fill in all these details, what this command will do is generate a shipping label inside our Kplabs Myinternal bucket.
So let’s go ahead and go to our bucket, and you’ll see what a shipping label basically is. Let’s do one thing: let’s download this file. Once it is downloaded, you will know what a shipping label is. So this is what it does: if you see over here, this is my address, and in the “ship to” column you have the address of AWS. You also have various other barcode-related information within this. You have to print this out, and on the package that you will be sending to AWS, just attach this label as well, so that AWS will know who has sent it and the other details. So these are the basics of AWS Import/Export. I hope you got a basic understanding of what AWS Import/Export is and how you can actually send your own portable storage device to AWS.