AWS Tutorial: How To Upload Content Directly to AWS S3 Bucket From Frontend (Node.js, Express, AWS-SDK)
In this article you'll learn the basics of AWS and how to create a simple HTML frontend that lets users upload content directly to your AWS S3 bucket.
I recently passed my AWS Certified Cloud Practitioner exam and wanted to practice my new skills by digging into the AWS console and seeing how stuff worked.
This was good timing because I have a Nova Scotia Community College - NSCC Capstone group project in IT - Programming where I am the AWS guy. So what better way to learn how to hook stuff up than by jumping in! (and hoping that I don't make an infinite loop and end up with a $69,000 bill on my AWS dashboard!!!)
I've spent 2 years redesigning my life away from the oilfield into software development and in pretty much 1 month I will graduate into a new field. I will be looking for backend software development positions as I like hooking stuff up and figuring out how stuff works rather than the display of the frontend. So AWS fits in with what I like to do.
What is AWS???
Have you ever wondered what happens with the apps on your phone when you use them? How it is all hooked up? How can it work so smoothly no matter where you are in the world? How stuff communicates with other stuff? How can you communicate with people across the world through an app in seconds?
All of that logic is most likely maintained by AWS.
AWS, or Amazon Web Services, is the cloud computing branch of Amazon, and it generates the majority of Amazon's operating income.
In layman's terms, AWS lets companies rent computers to do the work they need instead of owning their own data centers. This reduces fixed costs and lets a company scale up for Christmas shopping and scale back down in the off season. All of this is handled on AWS, where you have access to over 200 services to build any application you can think of: from EC2 server instances, to RDS databases, to AWS Ground Station (a satellite communication service).
I recently passed the AWS Certified Cloud Practitioner exam so here is a picture of my badge because articles with all text will bore people.
Enough jabbering, get to the code!
Our frontend is a super simple HTML page. When the user chooses a file and clicks upload, our code will run and the file will end up in our S3 bucket.
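The original screenshot isn't reproduced here, so here's a minimal sketch of what such a page could look like. The element ids and the front/index.html path are my own illustration; it assumes the Express server shown next serves the front folder and exposes the /s3Url endpoint:

```html
<!-- front/index.html — a minimal sketch, not the original page -->
<!DOCTYPE html>
<html>
<head><title>Direct S3 Upload</title></head>
<body>
  <input type="file" id="fileInput" accept="image/*" />
  <button id="uploadButton">Upload</button>
  <script>
    document.getElementById('uploadButton').addEventListener('click', async () => {
      const file = document.getElementById('fileInput').files[0]
      if (!file) return
      // 1. Ask our server for a one-time signed URL
      const { url } = await fetch('/s3Url').then(res => res.json())
      // 2. PUT the file straight to S3 using that URL
      await fetch(url, {
        method: 'PUT',
        headers: { 'Content-Type': file.type },
        body: file
      })
      // The uploaded object's public URL is everything before the query string
      console.log('Uploaded to:', url.split('?')[0])
    })
  </script>
</body>
</html>
```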
First: the Express server
// make express server
// Importing the Express framework
import express from 'express'
// Importing the generateUploadURL function from './s3.js' file
import { generateUploadURL } from './s3.js'
// Creating an instance of the Express application
const app = express()
// Configuring Express to serve static files from the 'front' directory
app.use(express.static('front'))
// Handling GET requests to the '/s3Url' endpoint
app.get('/s3Url', async (req, res) => {
  // Generating an upload URL asynchronously
  const url = await generateUploadURL()
  // Sending the generated URL as a JSON response
  res.send({ url })
})
// Starting the Express server, listening on port 8080
app.listen(8080, () => console.log("listening on port 8080"))
Second: the S3 helper (s3.js)
// make s3 instance to get signedURL - you'll need your own Keys in your .env file
// Importing necessary modules: dotenv for environment variables, aws-sdk for AWS operations,
// crypto for cryptographic functions, and promisify from util for converting callback-based
// functions to promise-based functions.
import dotenv from 'dotenv'
import aws from 'aws-sdk'
import crypto from 'crypto'
import { promisify } from "util"
// Promisifying the randomBytes function from the crypto module to generate random bytes asynchronously.
const randomBytes = promisify(crypto.randomBytes)
// Loading environment variables from the .env file using dotenv.
dotenv.config()
// Extracting AWS configuration variables from environment variables.
const bucketregion = process.env.AWS_REGION
const bucketName = process.env.AWS_BUCKET_NAME
const accessKeyId = process.env.AWS_ACCESS_KEY_ID
const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY
// Creating an instance of the AWS S3 service using the provided credentials and region.
const s3 = new aws.S3({
  credentials: {
    accessKeyId,
    secretAccessKey
  },
  region: bucketregion,
  signatureVersion: 'v4'
})
// Function to generate a signed upload URL for a file to be uploaded to the S3 bucket.
export async function generateUploadURL() {
  // Generating random bytes to create a unique image name for the file to be uploaded.
  const rawBytes = await randomBytes(16)
  const imageName = rawBytes.toString('hex')
  // Setting up parameters for the S3 upload, including the bucket name, key (file name), and expiration time.
  const params = {
    Bucket: bucketName,
    Key: imageName,
    Expires: 60
  }
  // Generating a signed upload URL with a validity period of 60 seconds using the putObject operation.
  const uploadURL = await s3.getSignedUrlPromise('putObject', params)
  // Returning the generated upload URL.
  return uploadURL
}
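A quick sanity check on the key generation: 16 random bytes always serialize to a 32-character hex string, so every upload gets an effectively collision-free object name. This sketch reproduces just that one step:

```javascript
import crypto from 'node:crypto'

// Same technique as s3.js: 16 random bytes, hex-encoded
const imageName = crypto.randomBytes(16).toString('hex')

console.log(imageName.length)              // 32
console.log(/^[0-9a-f]+$/.test(imageName)) // true
```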
Third: AWS setup (S3 and IAM)
// set up your s3 and IAM
To make this work we have a few things going on. On the AWS side you'll need to set up a:
S3 bucket
The bucket needs a policy that allows public reads of the uploaded objects, plus a CORS configuration that lets the browser PUT files to it directly. The bucket policy:
{
"Version": "2012-10-17",
"Id": "Policy1710024644835",
"Statement": [
{
"Sid": "Stmt1710024634772",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::super-cool-s3-direct-upload-bucket/*"
}
]
}
The CORS configuration:
[
{
"AllowedHeaders": [
"*"
],
"AllowedMethods": [
"PUT",
"HEAD",
"GET"
],
"AllowedOrigins": [
"*"
],
"ExposeHeaders": []
}
]
IAM
We want the server to act as a specific IAM user with a specific IAM policy that we declare. That way, the signed URL it generates lets the user upload directly to the S3 bucket, instead of the file passing through our server.
The IAM policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObject"
],
"Resource": "arn:aws:s3:::*/*"
}
]
}
Make a new IAM user that uses the policy above, then put that user's access keys in your .env file:
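For reference, the environment variables read in s3.js map to a .env file shaped like this (the values below are placeholders, not real keys; the bucket name matches the one used in the bucket policy above):

```
AWS_REGION=us-east-1
AWS_BUCKET_NAME=super-cool-s3-direct-upload-bucket
AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
```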
Here is some of the original code that I found online from Sam Meech-Ward. He also has a really good programming YouTube channel.
Between my explanation and these links, I'm sure you can figure it out.
Thanks for reading!
#halifax #novascotia #softwaredevelopment #aws #backenddevelopment #devops #software