In this final post of a two-part series, we’ll go over an additional script that uploads all of a local static site’s contents to the Amazon Web Services S3 bucket we previously configured.
It turns out that writing this script is much easier; we’re going to use the same AWS S3 CLI command to upload all of our site’s contents. This leaves us with multiple, very similar lines, where the only difference is the file extension and content type for each kind of file we upload (HTML, CSS, JS, and so on). We’ll walk through how the bulk of the script works by breaking down the command it uses. The complete script is shown below:
[CmdletBinding()]
Param(
    # Name of the bucket being configured
    [Parameter(Mandatory=$true,Position=1)]
    [String] $Name,

    # Path on the local computer to copy files from; assumes the current working directory by default
    [String] $Path = $pwd.Path
)

# [https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html] :: Upload HTML files
aws s3 cp $Path s3://$($Name) --exclude "*" --include "*.html" --recursive --metadata-directive REPLACE `
    --content-type text/html --cache-control public,max-age=604800

# Upload stylesheets
aws s3 cp $Path s3://$($Name) --exclude "*" --include "*.css" --recursive --metadata-directive REPLACE `
    --content-type text/css --cache-control public,max-age=604800

# Upload JavaScript files
aws s3 cp $Path s3://$($Name) --exclude "*" --include "*.js" --recursive --metadata-directive REPLACE `
    --content-type text/javascript --cache-control public,max-age=604800

# Upload PNG images
aws s3 cp $Path s3://$($Name) --exclude "*" --include "*.png" --recursive --metadata-directive REPLACE `
    --content-type image/png --cache-control public,max-age=604800

# Upload JPG images
aws s3 cp $Path s3://$($Name) --exclude "*" --include "*.jpg" --recursive --metadata-directive REPLACE `
    --content-type image/jpeg --cache-control public,max-age=604800

# Set the website configuration for the bucket, setting the index and error pages.
aws s3 website s3://$($Name) --index-document index.html --error-document error.html
The command aws s3 cp copies files to, from, and between S3 buckets. Beyond the copy itself, a variety of parameters control how the operation behaves, as well as the attributes and metadata applied to each copied file.
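For illustration, its three basic forms look like this (the bucket and file names here are placeholders):

# Upload a local file to a bucket
aws s3 cp index.html s3://my-bucket/index.html

# Download an object from a bucket to a local file
aws s3 cp s3://my-bucket/index.html index.html

# Copy an object from one bucket to another
aws s3 cp s3://source-bucket/index.html s3://destination-bucket/index.html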
The simplest example we could start with copies the files in the current directory to a bucket we previously provisioned, where the name of the bucket is substituted from the $Name parameter (via PowerShell’s $() subexpression syntax) into the required s3:// destination format. To copy the files inside the directory, we also need to include the --recursive parameter:
aws s3 cp . s3://$($Name) --recursive
On its own, that command would copy everything, so two filter parameters narrow it down. The first of these is --exclude, which accepts a wildcard pattern; passing "*" causes the copy command to exclude everything, that is, all the files inside the current directory. The second parameter, --include, is then used with a more specific wildcard pattern, and any file matching that pattern is picked back up and copied to the bucket. The net effect is that only the HTML files inside the current directory are copied:
aws s3 cp . s3://$($Name) --exclude "*" --include "*.html" --recursive
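As an aside, --include may be supplied more than once; the CLI evaluates filters in the order given, with later ones taking precedence. Patterns that share a content type could therefore be combined into a single command; a hypothetical example:

# Pick up both .html and .htm files in a single copy operation
aws s3 cp . s3://$($Name) --exclude "*" --include "*.html" --include "*.htm" --recursive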
Returning to our HTML command: as a minor formatting concern, we’ll break it across two lines using the backtick (`) character, which PowerShell interprets as a line continuation. We also add the remaining parameters: --metadata-directive REPLACE tells S3 to apply the metadata specified on the command line rather than copying it from the source, while --content-type and --cache-control set that metadata, as explained below:
aws s3 cp . s3://$($Name) --exclude "*" --include "*.html" --recursive --metadata-directive REPLACE `
    --content-type text/html --cache-control public,max-age=604800
The --content-type parameter is set to the MIME type that corresponds to an HTML page; the required value can be found on MDN’s “Incomplete list of MIME Types” page. Finally, the --cache-control parameter is set to public with an expiration of 7 days, expressed in seconds (7 × 24 × 60 × 60 = 604,800). MDN also has a page describing the possible values for this header. Repeating the command for each remaining file type, substituting the appropriate include pattern and MIME type, gives us the bulk of the script:
# [https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html] :: Upload HTML files
aws s3 cp . s3://$($Name) --exclude "*" --include "*.html" --recursive --metadata-directive REPLACE `
    --content-type text/html --cache-control public,max-age=604800

# Upload stylesheets
aws s3 cp . s3://$($Name) --exclude "*" --include "*.css" --recursive --metadata-directive REPLACE `
    --content-type text/css --cache-control public,max-age=604800

# Upload JavaScript files
aws s3 cp . s3://$($Name) --exclude "*" --include "*.js" --recursive --metadata-directive REPLACE `
    --content-type text/javascript --cache-control public,max-age=604800

# Upload PNG images
aws s3 cp . s3://$($Name) --exclude "*" --include "*.png" --recursive --metadata-directive REPLACE `
    --content-type image/png --cache-control public,max-age=604800

# Upload JPG images
aws s3 cp . s3://$($Name) --exclude "*" --include "*.jpg" --recursive --metadata-directive REPLACE `
    --content-type image/jpeg --cache-control public,max-age=604800
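Since these five commands differ only in their include pattern and MIME type, the repetition could be collapsed into a loop over a lookup table. A minimal sketch, assuming the same $Name and $Path parameters as the full script:

# Map each include pattern to its MIME type (a sketch, not part of the original script)
$contentTypes = @{
    "*.html" = "text/html"
    "*.css"  = "text/css"
    "*.js"   = "text/javascript"
    "*.png"  = "image/png"
    "*.jpg"  = "image/jpeg"
}

# Run one copy operation per file type
foreach ($pattern in $contentTypes.Keys) {
    aws s3 cp $Path s3://$($Name) --exclude "*" --include $pattern --recursive `
        --metadata-directive REPLACE --content-type $($contentTypes[$pattern]) `
        --cache-control public,max-age=604800
}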
To finish, the aws s3 website command sets the static website hosting configuration on the bucket, specifying the index and error documents:

aws s3 website s3://$($Name) --index-document index.html --error-document error.html
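With the script saved locally, publishing a site becomes a one-liner. Assuming a file name of Publish-Site.ps1 (a name chosen here purely for illustration):

# Hypothetical script name; uploads the contents of the current directory
.\Publish-Site.ps1 -Name my-bucket-name

# Or point it at an explicit source directory
.\Publish-Site.ps1 -Name my-bucket-name -Path C:\Sites\my-site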
Once the script has run, the uploaded files appear in the bucket in the S3 management console, with the appropriate access configured.
And visiting the S3 bucket’s website URL works as well, with both the index and error pages functioning.
Additional Resources

AWS CLI reference for aws s3 cp: https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
MDN: “Incomplete list of MIME Types”
MDN: Cache-Control