
Are you planning to migrate your workloads to AWS? Do you have large databases, document repositories, media files, etc.? If you have not done so already, you will at some point need to create an import/export job to send or receive data to/from Amazon.

The process itself is not difficult, but I found that the information is a bit scattered across various documents.

At the time of writing, AWS Import/Export Snowball (details here: https://aws.amazon.com/importexport/) is still limited to the US, so we will be using the disk option:

https://aws.amazon.com/importexport/disk/

As the name suggests, AWS Import/Export lets you send/receive data to/from your AWS environment. This is particularly useful when you are required to transfer large amounts of data that would otherwise take a long time to upload/download over the internet.

For import jobs you have the following options:

  • Import to S3
  • Import to EBS
  • Import to Glacier

These are pretty self-explanatory, but the actual steps may differ slightly between them. More information is available here:

http://docs.aws.amazon.com/AWSImportExport/latest/DG/CreatinganImportJob.html

For exporting data you only have the option of setting up an Export from S3 job. So if your data resides on an EBS volume or in Glacier, you will first have to move it to an S3 bucket, as shown in the sketch below the list.

  • Export from Amazon S3
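For example, if the data currently lives on an EBS volume, a minimal sketch using the AWS CLI from the instance the volume is attached to could look like this (the bucket name and mount path are placeholders, and the AWS CLI is assumed to be installed and configured):

aws s3 cp /mnt/ebs-data s3://my-staging-bucket/data --recursive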

In this article I will be concentrating on the most commonly used option for importing data into AWS, which is importing into S3.

Here are the steps required to import data into S3.

Step 1:

Download the AWS Import/Export Web Service Tool and unzip the contents to a folder:

http://awsimportexport.s3.amazonaws.com/importexport-webservice-tool.zip


Step 2:

Create an S3 bucket that you would like to use for the import job. Depending on the nature of the data, you may want to restrict the S3 bucket so that only certain IAM users have access to it. This can be achieved by using an IAM policy or a bucket policy. Bucket policies give more granular control, as you apply them to a single bucket. With appropriate policies in place you can even deny administrators access to an S3 bucket, which they can otherwise reach by default without any bucket-specific permissions.
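As an aside, the bucket can also be created from the command line with the AWS CLI (a minimal sketch; the bucket name and region below are placeholders):

aws s3 mb s3://my-import-bucket --region ap-southeast-2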


For all import and export jobs, the IAM user must have the following access permissions on the Amazon S3 log bucket:

  • s3:GetBucketLocation
  • s3:PutObject
  • s3:AbortMultipartUpload
  • s3:ListMultipartUploadParts
  • s3:ListBucketMultipartUploads

For import to Amazon S3, the IAM user must have the following access permissions on the Amazon S3 import bucket:

  • s3:GetBucketLocation
  • s3:PutObject
  • s3:AbortMultipartUpload
  • s3:ListMultipartUploadParts
  • s3:ListBucketMultipartUploads
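Since the two permission sets are identical, a single IAM policy attached to the import user can cover both buckets. A minimal sketch (the bucket names are placeholders; point the Resource entries at your own log and import buckets):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation",
        "s3:PutObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": [
        "arn:aws:s3:::my-import-bucket",
        "arn:aws:s3:::my-import-bucket/*",
        "arn:aws:s3:::my-log-bucket",
        "arn:aws:s3:::my-log-bucket/*"
      ]
    }
  ]
}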

Here's a sample policy that denies everyone (including administrators) access to the bucket, except the IAM user that will be used for the import job. Keep in mind that the root account will still be able to edit the permissions on the bucket; this is by design, so that you don't lock yourself out of a resource.

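A sketch of such a policy, using a Deny statement with NotPrincipal (the placeholders match the values mentioned below; the root principal is excluded from the deny so the account owner retains control):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyAllExceptImportUser",
      "Effect": "Deny",
      "NotPrincipal": {
        "AWS": [
          "arn:aws:iam::<Account Number>:user/<IAM User>",
          "arn:aws:iam::<Account Number>:root"
        ]
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::<Bucket Name>",
        "arn:aws:s3:::<Bucket Name>/*"
      ]
    }
  ]
}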

Replace the Account Number, IAM User and Bucket Name with the values specific to your environment.

Step 3:

Create a set of access and secret keys for the IAM user that will be used for the upload.

Now update the AWSCredentials.properties file, which is part of the AWS Import/Export Web Service Tool.

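The file holds just the two credentials; it should end up looking along these lines (the values are placeholders, and the tool ships with a template to fill in):

accessKeyId:<Your Access Key ID>
secretKey:<Your Secret Access Key>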

Step 4:

The next step is to create an import manifest file, which holds details such as the S3 bucket name, the storage device serial number, notification email addresses, etc. A full list of options with descriptions is available here:

http://docs.aws.amazon.com/AWSImportExport/latest/DG/ManifestFileRef_Import.html

Here's a sample manifest file:

manifestVersion: 2.0
generator: Text editor
bucket: <Bucket Name>
deviceId: <Storage Device S/N>
eraseDevice: yes
notificationEmail: xyz@abc.com
pinCode: <Device Pincode>
acl: private
serviceLevel: standard
returnAddress:
    name:
    company:
    street1:
    street2:
    street3:
    city:
    stateOrProvince:
    postalCode:
    country:
    phoneNumber:

Please note that in this example we are using a PIN-protected storage device. There are two options for encrypting data for import/export:

  • PIN-code encryption: Hardware-based device encryption that uses a physical PIN pad for access to the data.
  • TrueCrypt software encryption: Disk encryption using TrueCrypt, which is an open-source encryption application.

For export jobs AWS always encrypts the data using whichever of the above options you choose. There is no such restriction on import jobs, but encrypting your data is a highly recommended best practice. Make sure the appropriate details are added to the import/export manifest file.

Important note: If you are using a Windows machine to create the import/export job, ensure that you save the import manifest file in ANSI format; otherwise you will receive the error message below.

ERROR: Your manifest is not well-formed (Service: AmazonImportExport; Status Code: 400; Error Code: MalformedManifestException)
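If your editor saved the manifest as UTF-8 or Unicode, one way to convert it on Windows is with PowerShell (a sketch; it assumes the manifest contains only ASCII characters and writes the converted copy to a new file):

Get-Content MyImportManifest.txt | Set-Content -Encoding Ascii MyImportManifest-ansi.txt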


Step 5:

To create the import job we are going to use the AWS Import/Export Web Service Tool; there are in fact two more ways to achieve this: using the AWS SDK for Java, or the AWS REST API.

More details are available here: http://docs.aws.amazon.com/AWSImportExport/latest/DG/SendingCreateJobRequests.html

Ensure that the AWSCredentials.properties and MyImportManifest.txt files have been populated. Java must also be installed on the machine.

Open a command prompt, navigate to the directory for the AWS Import/Export Web Service Tool, and run the command below.

CmdPrompt>java -jar lib/AWSImportExportWebServiceTool-1.0.jar CreateJob Import MyImportManifest.txt .


This will create a SIGNATURE file in the directory given as the last argument (the trailing "." above, i.e. the current directory). You are required to copy this file to the root of the storage device before shipping it to AWS.
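Copying it over is a one-liner; for example, assuming the device mounts as drive E: on Windows:

CmdPrompt>copy SIGNATURE E:\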

If your email address is part of the notification email list in the manifest, you will receive an email confirming that the job has been created successfully.


Jobs will expire after 30 days if you fail to send the device to AWS. You cannot extend an expired job, so the only option would be to create a new job.

Step 6:

Generate the pre-paid shipping label for the storage device. You will need to print this out and attach it to the shipping package.

Open a command prompt, navigate to the directory for the AWS Import/Export Web Service Tool, and run the command below.

CmdPrompt>java -jar lib/AWSImportExportWebServiceTool-1.0.jar GetShippingLabel


And you are all done! There are some considerations if you are sending multiple devices in a single shipment; see this link for details.

http://docs.aws.amazon.com/AWSImportExport/latest/DG/shipping-multiple-devices.html

General guidance regarding packaging can be found at the link below. Make sure that you include the device's power supply, power cable, interface cables, etc. in the shipment.

http://docs.aws.amazon.com/AWSImportExport/latest/DG/PackingYourMedia.html

Arrange for UPS to pick up your package by calling them directly, or take your package to a UPS drop-off facility to be shipped to AWS. You will not have to pay UPS for the shipping charges; instead, AWS will add these charges to the fee charged for processing your device.

Please note:

“If you ship your storage device without using the pre-paid shipping label, AWS Import/Export will return your storage device without performing the job and you will be charged for any applicable return shipping charges and device-handling fees.”