Different ways to upload your objects to AWS S3

When you upload a file to Amazon S3, it is stored as an S3 object. Objects consist of the file data and metadata that describes the object. You can have an unlimited number of objects in a bucket. Before you can upload files to an Amazon S3 bucket, you need write permissions for the bucket. For more information about access permissions, see Identity and access management in Amazon S3.

You can upload any file type—images, backups, data, movies, etc.—into an S3 bucket. The maximum size of a file that you can upload by using the Amazon S3 console is 160 GB. To upload a file larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.
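For files beyond the console limit, the AWS CLI is usually the simplest option: a plain aws s3 cp is enough, because the CLI automatically switches to multipart upload for large files. A minimal sketch (the bucket and file names below are placeholders):

$ aws s3 cp ./backup.tar.gz s3://mybucket/backups/backup.tar.gz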

Tools

If you are not a fan of the CLI (command line interface) and you prefer a graphical user interface (GUI), then I advise you to use these tools:

S3 Browser


S3 Browser is a freeware Windows client for Amazon S3 and Amazon CloudFront. Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. Amazon CloudFront is a content delivery network (CDN). It can be used to deliver your files using a global network of edge locations.

You just need to create an IAM user, grant it the necessary S3 permissions, and create security credentials (an AWS access key) to use in S3 Browser.
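If you already have the AWS CLI configured with an administrative profile, those credentials can be created in a few commands (you can also do all of this from the IAM console). This is only a sketch: the user name is a placeholder, and the AmazonS3FullAccess managed policy is broader than most setups need, so consider attaching a more restrictive policy.

$ aws iam create-user --user-name s3-browser-user
$ aws iam attach-user-policy --user-name s3-browser-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
$ aws iam create-access-key --user-name s3-browser-user

The last command prints the access key ID and secret access key that you paste into S3 Browser.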

Filezilla Pro


Aimed at professional users, FileZilla Pro adds support for cloud storage protocols such as Amazon S3, in addition to all the protocols supported by FileZilla, like FTP, FTP over SSL/TLS (FTPS), and SSH File Transfer Protocol (SFTP).


Using the S3 console

Amazon S3 is easy to use through its web-based management console: uploading, downloading, and modifying objects takes only a few clicks, and the console offers many other features.


AWS CLI

The AWS Command Line Interface (AWS CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.

The AWS CLI v2 offers several new features, including improved installers, new configuration options such as AWS IAM Identity Center (successor to AWS SSO), and various interactive features.
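Before running commands, you typically configure your credentials once with aws configure. A minimal sketch, using the placeholder access key values from the AWS documentation and an example region:

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFIE/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json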

Example:

This lets you view the contents of your S3 bucket:

$ aws s3 ls s3://mybucket/
      LastWriteTime            Length Name
      -------------            ------ ----
                                  PRE myfolder/
2013-09-03 10:00:00              1234 myfile.txt

You can perform recursive uploads and downloads of multiple files in a single folder-level command. The AWS CLI will run these transfers in parallel for increased performance.

$ aws s3 cp myfolder s3://mybucket/myfolder --recursive

upload: myfolder/file1.txt to s3://mybucket/myfolder/file1.txt
upload: myfolder/subfolder/file1.txt to s3://mybucket/myfolder/subfolder/file1.txt

A sync command makes it easy to synchronize the contents of a local folder with a copy in an S3 bucket.

$ aws s3 sync myfolder s3://mybucket/myfolder --exclude "*.tmp"

upload: myfolder/newfile.txt to s3://mybucket/myfolder/newfile.txt
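By default, sync only copies new or changed files. If you also want files that were deleted locally to be removed from the bucket, the CLI supports a --delete flag; a sketch with the same placeholder names:

$ aws s3 sync myfolder s3://mybucket/myfolder --delete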

AWS SDK

A software development kit (SDK) is a set of tools, usually provided by the maker of a hardware platform, operating system (OS), or programming language, that helps developers build applications for it.

You can use the AWS SDK to upload objects in Amazon S3. The SDK provides wrapper libraries for you to upload data easily. For information, see the List of supported SDKs.

Here are a few examples using some of the SDKs:

Java

import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;

import java.io.File;
import java.io.IOException;

public class UploadObject {

    public static void main(String[] args) throws IOException {
        Regions clientRegion = Regions.DEFAULT_REGION;
        String bucketName = "*** Bucket name ***";
        String stringObjKeyName = "*** String object key name ***";
        String fileObjKeyName = "*** File object key name ***";
        String fileName = "*** Path to file to upload ***";

        try {
            //This code expects that you have AWS credentials set up per:
            // https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/setup-credentials.html
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withRegion(clientRegion)
                    .build();

            // Upload a text string as a new object.
            s3Client.putObject(bucketName, stringObjKeyName, "Uploaded String Object");

            // Upload a file as a new object with ContentType and title specified.
            PutObjectRequest request = new PutObjectRequest(bucketName, fileObjKeyName, new File(fileName));
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentType("text/plain");
            metadata.addUserMetadata("title", "someTitle");
            request.setMetadata(metadata);
            s3Client.putObject(request);
        } catch (AmazonServiceException e) {
            // The call was transmitted successfully, but Amazon S3 couldn't process 
            // it, so it returned an error response.
            e.printStackTrace();
        } catch (SdkClientException e) {
            // Amazon S3 couldn't be contacted for a response, or the client
            // couldn't parse the response from Amazon S3.
            e.printStackTrace();
        }
    }
}
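For large files, the same v1 SDK also provides a TransferManager that handles multipart uploads and parallel transfers for you. A minimal sketch under the same assumptions as above (placeholder bucket, key, and file path; credentials resolved from the default provider chain):

import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

import java.io.File;

public class UploadLargeObject {

    public static void main(String[] args) throws Exception {
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.DEFAULT_REGION)
                .build();

        // TransferManager splits big files into parts and uploads them in parallel.
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3Client)
                .build();

        Upload upload = tm.upload("*** Bucket name ***",
                "*** Large object key name ***",
                new File("*** Path to large file ***"));

        // Block until the (possibly multipart) upload finishes.
        upload.waitForCompletion();

        // Release TransferManager's threads; keep the underlying S3 client alive.
        tm.shutdownNow(false);
    }
}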


JavaScript

// Import required AWS SDK clients and commands for Node.js
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { s3Client } from "./libs/s3Client.js"; // Helper function that creates an Amazon S3 service client module.
import path from "path";
import fs from "fs";

const file = "OBJECT_PATH_AND_NAME"; // Path to and name of object. For example '../myFiles/index.js'.
const fileStream = fs.createReadStream(file);

// Set the parameters
export const uploadParams = {
  Bucket: "BUCKET_NAME",
  // Add the required 'Key' parameter using the 'path' module.
  Key: path.basename(file),
  // Add the required 'Body' parameter
  Body: fileStream,
};


// Upload file to specified bucket.
export const run = async () => {
  try {
    const data = await s3Client.send(new PutObjectCommand(uploadParams));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
run();
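The example imports s3Client from ./libs/s3Client.js, a helper module that is not shown here. A minimal sketch of what it might contain (the region value is an assumption; use the region of your bucket):

// libs/s3Client.js
import { S3Client } from "@aws-sdk/client-s3";

const REGION = "us-east-1"; // Assumption: replace with your bucket's region.
export const s3Client = new S3Client({ region: REGION });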



FTP/SFTP access to an Amazon S3 Bucket

This will be for the next article ^^

For more information about S3:
