AWS Tutorial: How To Upload Content Directly to AWS S3 Bucket From Frontend (Node.js, Express, AWS-SDK)
How to upload media directly to S3 from the frontend


In this article you will learn the basics of AWS and how to create a simple HTML front end that lets users upload content directly to your AWS S3 bucket.

user -> Node.js server -> GET signed URL -> direct upload to S3

I recently passed my AWS Certified Cloud Practitioner exam and wanted to practice my new skills by digging into the AWS console and seeing how stuff worked.

This was good timing because I have a Nova Scotia Community College - NSCC Capstone group project in IT - Programming where I am the AWS guy. So what better way to learn how to hook stuff up than by jumping in! (and hoping that I don't make an infinite loop and end up with a $69,000 bill on my AWS dashboard!!!)

I've spent the last two years redesigning my life away from the oilfield and into software development, and in about a month I will graduate into a new field. I will be looking for backend development positions, since I like hooking stuff up and figuring out how things work more than polishing a frontend. So AWS fits with what I like to do.

What is AWS???

Have you ever wondered what happens behind the apps on your phone when you use them? How it is all hooked up? How it can work so smoothly no matter where you are in the world? How stuff communicates with other stuff? How you can communicate with people across the world through an app in seconds?

All of that logic is most likely running on AWS.

AWS, or Amazon Web Services, is the cloud computing branch of Amazon, and it's responsible for roughly 75% of Amazon's operating profit.

AWS market share

In layman's terms, AWS lets companies rent computers to do the work they need done instead of owning their own datacenters. This reduces fixed costs and lets a company scale up for Christmas shopping and scale back down in the off season. All of this is handled on AWS, where you have access to over 200 services to build any application you can think of: from EC2 server instances, to RDS databases, to Ground Station (a satellite communication service).

Since I mentioned passing the AWS Certified Cloud Practitioner exam, here is a picture of my badge, because articles that are all text will bore people.

https://www.instagram.com/reel/C2rybQXrPj9/?utm_source=ig_web_copy_link&igsh=MzRlODBiNWFlZA==

Click that link to see a funny joke about LinkedIn.

super duper cool cloud practitioner badge

Enough jabbering, get to the code!

our crazy front end

Above you can see our super simple HTML. When the user chooses a file and clicks upload, our code will run and the file will end up in our S3 bucket.
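The screenshot doesn't show the front-end code itself, so here is a minimal sketch of what it could look like. The element ids and the inline script are my own assumptions; only the /s3Url endpoint and the PUT request need to match the server below.

<!-- a bare-bones upload form: one file input and one button -->
<input id="fileInput" type="file" />
<button id="uploadButton">Upload</button>

<script>
  document.getElementById('uploadButton').addEventListener('click', async () => {
    const file = document.getElementById('fileInput').files[0]
    if (!file) return

    // 1. ask our express server for a one-time signed URL
    const { url } = await fetch('/s3Url').then(res => res.json())

    // 2. PUT the file straight to S3 -- the file never touches our server
    await fetch(url, {
      method: 'PUT',
      headers: { 'Content-Type': file.type },
      body: file
    })

    // everything before the query string is the object's URL in the bucket
    console.log('uploaded to:', url.split('?')[0])
  })
</script>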

First

// make express server

// Importing the Express framework
import express from 'express'

// Importing the generateUploadURL function from './s3.js' file
import { generateUploadURL } from './s3.js'

// Creating an instance of the Express application
const app = express()

// Configuring Express to serve static files from the 'front' directory
app.use(express.static('front'))

// Handling GET requests to the '/s3Url' endpoint
app.get('/s3Url', async (req, res) => {
  // Generating an upload URL asynchronously
  const url = await generateUploadURL()

  // Sending the generated URL as a JSON response
  res.send({ url })
})

// Starting the Express server, listening on port 8080
app.listen(8080, () => console.log("listening on port 8080"))
        
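With the server running, you can sanity-check the endpoint from a terminal (the response below is illustrative; the query string will differ every time):

curl http://localhost:8080/s3Url
# {"url":"https://super-cool-s3-direct-upload-bucket.s3.amazonaws.com/0a1b2c...?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Expires=60&..."}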

Second

// make S3 instance to get signed URL - you'll need your own keys in your .env file

// Importing necessary modules: dotenv for environment variables, aws-sdk for AWS operations,
// crypto for cryptographic functions, and promisify from util for converting callback-based
// functions to promise-based functions.
import dotenv from 'dotenv'
import aws from 'aws-sdk'
import crypto from 'crypto'
import { promisify } from "util"

// Promisifying the randomBytes function from the crypto module to generate random bytes asynchronously.
const randomBytes = promisify(crypto.randomBytes)

// Loading environment variables from the .env file using dotenv.
dotenv.config()

// Extracting AWS configuration variables from environment variables.
const bucketregion = process.env.AWS_REGION
const bucketName = process.env.AWS_BUCKET_NAME
const accessKeyId = process.env.AWS_ACCESS_KEY_ID
const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY

// Creating an instance of the AWS S3 service using the provided credentials and region.
const s3 = new aws.S3({
  credentials: { 
    accessKeyId, 
    secretAccessKey 
  },
  region: bucketregion,
  signatureVersion: 'v4'
})

// Function to generate a signed upload URL for a file to be uploaded to the S3 bucket.
export async function generateUploadURL() {
  // Generating random bytes to create a unique image name for the file to be uploaded.
  const rawBytes = await randomBytes(16)
  const imageName = rawBytes.toString('hex')

  // Setting up parameters for the S3 upload, including the bucket name, key (file name), and expiration time.
  const params = ({
    Bucket: bucketName,
    Key: imageName,
    Expires: 60
  })
  
  // Generating a signed upload URL with a validity period of 60 seconds using the putObject operation.
  const uploadURL = await s3.getSignedUrlPromise('putObject', params)
  
  // Returning the generated upload URL.
  return uploadURL
}
        
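A heads-up: the aws-sdk v2 package used above is now in maintenance mode. If you'd rather use the modular v3 SDK, a rough equivalent of generateUploadURL (same .env variables, using the @aws-sdk/client-s3 and @aws-sdk/s3-request-presigner packages) could look like this:

import dotenv from 'dotenv'
import crypto from 'crypto'
import { promisify } from 'util'
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'

dotenv.config()
const randomBytes = promisify(crypto.randomBytes)

// v3 client -- same credentials and region as the v2 version above
const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  }
})

export async function generateUploadURL() {
  // random hex string as a unique object key
  const rawBytes = await randomBytes(16)
  const imageName = rawBytes.toString('hex')

  // sign a PutObject request that is valid for 60 seconds
  const command = new PutObjectCommand({
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: imageName
  })
  return getSignedUrl(s3, command, { expiresIn: 60 })
}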

Third

// set up your S3 bucket and IAM user

To make this work we have a few things going on. On the AWS side you'll need to set up a:

S3 bucket

  • With this bucket policy, which lets anyone Get (read) the objects in the bucket so uploads can be viewed afterwards (the signed URL itself is generated by the server with the IAM credentials below, not by this policy):

{
    "Version": "2012-10-17",
    "Id": "Policy1710024644835",
    "Statement": [
        {
            "Sid": "Stmt1710024634772",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::super-cool-s3-direct-upload-bucket/*"
        }
    ]
}        

  • and this CORS policy, which tells the browser which cross-origin requests the bucket will accept. We need PUT for the direct upload; in production you would tighten AllowedOrigins from "*" to your own domain:

[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "PUT",
            "HEAD",
            "GET"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": []
    }
]        

IAM

The server signs the URL using the credentials of a specific IAM user with a specific IAM policy that we declare. The signed URL carries that user's permissions, so the browser can upload directly to the S3 bucket instead of pushing the file through our server.

The IAM policy (the Resource below matches every bucket; for least privilege you can narrow it to arn:aws:s3:::your-bucket-name/*):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        }
    ]
}        

Make a new user that uses the policy above:

  • When you create the user you'll need to attach the policy above.
  • You also need to create an ACCESS_KEY_ID and SECRET_ACCESS_KEY from the user's Security credentials tab.
  • These are the credentials the server signs with, which is what makes the signed URL valid for uploading to S3.

create the 2 keys and put them in your .env file
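For reference, the .env file that s3.js reads could look like this (placeholder values; the bucket name just matches the one in the bucket policy above, and the region is only an example):

AWS_REGION=us-east-1
AWS_BUCKET_NAME=super-cool-s3-direct-upload-bucket
AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx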

Here is some of the original code that I found online from Sam Meech-Ward. He also has a really good programming YouTube channel.

https://youtu.be/yGYeYJpRWPM?si=lM2RagUcp-6bLD3k

https://github.com/Sam-Meech-Ward/s3-direct-upload

Between my explanation and these links, I'm sure you can figure it out.

Thanks for reading!

#halifax #novascotia #softwaredevelopment #aws #backenddevelopment #devops #software
