Creating S3 Signed URLs in Node.js
Let's create something fun: presigned URLs to upload files directly to Amazon S3 buckets. We use this method here on Codú, and it is a very common way of handling image uploads.
At the end of the article, I've included a few benefits of using presigned URLs in case you're new to them (if not, jump in and steal the code).
Let's dive in!
Setup
First things first, make sure you've got the AWS SDK installed:
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
Getting a Signed URL for Uploading
Here's the code to create a signed URL:
```javascript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Set up the S3 client
const s3Client = new S3Client({ region: "eu-west-1" }); // Replace with your region

/*
  If you are not hosting the app on AWS with permission to access the bucket,
  you'll need to pass the client keys to use. Example:

  export const s3Client = new S3Client({
    region: "eu-west-1",
    credentials: {
      accessKeyId: process.env.ACCESS_KEY || "",
      secretAccessKey: process.env.SECRET_KEY || "",
    },
  });
*/

async function getSignedUrlForUpload(bucket, key, contentType) {
  const command = new PutObjectCommand({
    Bucket: bucket,
    Key: key,
    ContentType: contentType,
  });

  try {
    // Generate a signed URL that expires in 15 minutes
    const signedUrl = await getSignedUrl(s3Client, command, { expiresIn: 900 });
    return signedUrl;
  } catch (err) {
    console.error("Oops! Something went wrong:", err);
    throw err;
  }
}

// Let's take it for a spin
const bucket = "my-bucket-name";
const key = "uploads/epsteins-list.jpg";
const contentType = "image/jpeg";

getSignedUrlForUpload(bucket, key, contentType)
  .then((url) => console.log("Your magical upload URL:", url))
  .catch((err) => console.error("Well, that didn't work:", err));
```
- We're importing the necessary AWS SDK bits, but this time we're using `PutObjectCommand` instead of `GetObjectCommand`.
- We set up an S3 client with our region.
- Our `getSignedUrlForUpload` function takes a bucket name, a key (the file's path in your bucket), and the content type of the file.
- We create a `PutObjectCommand` with our bucket, key, and content type.
- Then, we use `getSignedUrl` to generate our URL, setting it to expire in 15 minutes (900 seconds).
Using it
Now that you've got this signed URL for uploading, here's how you might use it:
```javascript
const file = document.querySelector('input[type="file"]').files[0];

fetch(signedUrl, {
  method: 'PUT',
  body: file,
  headers: { 'Content-Type': file.type }
})
  .then(response => {
    if (response.ok) {
      console.log('Upload successful!');
    } else {
      console.error('Upload failed.');
    }
  });
```
Remember, these URLs are temporary, so generate them only when needed, ideally just before the upload starts. The workflow I usually use: when the user clicks "upload," I fetch the presigned URL and immediately start uploading to it.
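That sign-then-upload workflow can be sketched as a single helper. This is just an illustration: the `/api/upload-url` endpoint is hypothetical, standing in for whatever route on your server hands out presigned URLs, and the injectable `fetchImpl` parameter is only there to make the helper easy to test.

```javascript
// Browser-side sketch of the "sign, then immediately upload" workflow.
async function uploadFile(file, { endpoint = "/api/upload-url", fetchImpl = fetch } = {}) {
  // 1. Ask our server for a fresh presigned URL just before uploading.
  const res = await fetchImpl(`${endpoint}?contentType=${encodeURIComponent(file.type)}`);
  if (!res.ok) throw new Error("Could not get an upload URL");
  const { signedUrl } = await res.json();

  // 2. PUT the file straight to S3 while the URL is still valid.
  const upload = await fetchImpl(signedUrl, {
    method: "PUT",
    body: file,
    headers: { "Content-Type": file.type },
  });
  if (!upload.ok) throw new Error("Upload failed");

  return signedUrl;
}
```

Wire this to your file input's change or submit handler and the URL is never sitting around long enough to expire mid-flow.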
The Benefits of Using Presigned URLs
In case you are new to presigned URLs, I wanted to look at some of the benefits:
Reduced Server Load: By enabling direct uploads to S3, presigned URLs decrease the load on your application servers. This can lead to improved performance and reduced server costs.
Improved Scalability: S3's robust infrastructure can handle high volumes of concurrent uploads more efficiently than most application servers, making your system more scalable.
Granular Access Control: Presigned URLs allow you to specify exact permissions (e.g., upload, download), duration of access, and even set conditions such as maximum file size.
Simplified Client-Side Implementation: Clients only need to perform a standard HTTP PUT request to the pre-signed URL, simplifying the upload process on the front end.
Happy uploading, and may your S3 buckets always have room for one more file!