# cdk-hugo-pipeline
This is an AWS CDK Construct for deploying Hugo static websites to AWS S3 behind SSL/CloudFront with cdk-pipelines, giving you an all-in-one infrastructure-as-code deployment on AWS, meaning:

- self-contained: all resources are on AWS
- a blog with `hugo` and a nice theme (in my opinion)
- built with `cdk` and `cdk-pipelines`, running in a monorepo with all the code components
- with a development stage on a `dev.your-domain.com` subdomain
Take a look at the blog post *My blog with hugo - all on AWS*, in which I write about all the details and learnings.
## Prerequisites

- binaries: `brew install node@16 hugo docker`
- a `Route53 Hosted Zone` for `your-domain.com` in the AWS account you deploy into

If you use hugo modules, add them as git submodules in the `themes` directory, so they can be pulled by the same git command in the `codepipeline`.
## Usage

In this demo case we will use the blist theme (https://github.com/apvarun/blist-hugo-theme), however you can use any other hugo theme. Note that you need to adapt the branch of the theme you use.

### With a projen template (recommended)

Using the blist theme:
```sh
mkdir my-blog && cd my-blog
npx projen new \
  --from @mavogel/projen-cdk-hugo-pipeline@~0 \
  --domain your-domain.com \
  --projenrc-ts

npm --prefix blog install
# and start the development server on http://localhost:1313
npm run dev
```
### By hand (more flexible)

#### Set up the repository

```sh
# create the surrounding cdk-app
npx projen new awscdk-app-ts

# add the desired hugo theme as a git submodule into the 'blog' folder
git submodule add https://github.com/apvarun/blist-hugo-theme.git blog/themes/blist
# pin the theme to a fixed version in the .gitmodules file
git submodule set-branch --branch v2.1.0 blog/themes/blist
```
#### Configure the repository

Depending on the theme you use (here blist):

- copy the example site:

```sh
cp -r blog/themes/blist/exampleSite/* blog/
```

- fix the config URLs, as we need 2 stages: development & production. Note: internally the module has the convention of a `public-development` & `public-production` output folder for the hugo build.

```sh
# create the directories
mkdir -p blog/config/_default blog/config/development blog/config/production
# and move the standard config into the _default folder
mv blog/config.toml blog/config/_default/config.toml
```
- adapt the config files:

```sh
## file: blog/config/development/config.toml
cat << EOF > blog/config/development/config.toml
baseurl = "https://dev.your-domain.com"
publishDir = "public-development"
EOF

## file: blog/config/production/config.toml
cat << EOF > blog/config/production/config.toml
baseurl = "https://your-domain.com"
publishDir = "public-production"
EOF
```
- ignore the output folders in the file `blog/.gitignore`:

```sh
cat << EOF >> blog/.gitignore
public-*
resources/_gen
node_modules
.DS_Store
.hugo_build.lock
EOF
```
- additionally copy the `package.json` files. Note: this depends on your theme

```sh
cp blog/themes/blist/package.json blog/package.json
cp blog/themes/blist/package-lock.json blog/package-lock.json
```
- Optional: add the script to the `.projenrc.ts`. Note: the command depends on your theme as well

```ts
project.addScripts({
  dev: 'npm --prefix blog run start',
  // below is the more general command
  // dev: 'cd blog && hugo server --watch --buildFuture --cleanDestinationDir --disableFastRender',
});
```

and update the project via the following command:

```sh
npm run projen
```
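For context, a minimal sketch of how the surrounding `.projenrc.ts` might look with this script added, assuming the default project generated by `npx projen new awscdk-app-ts` (class and option names are projen's; the exact values in your generated file may differ):

```ts
import { awscdk } from 'projen';

const project = new awscdk.AwsCdkTypeScriptApp({
  cdkVersion: '2.1.0',
  defaultReleaseBranch: 'main',
  name: 'my-blog',
  projenrcTs: true,
  // the construct used later in main.ts
  deps: ['@mavogel/cdk-hugo-pipeline'],
});

// convenience script to start the local hugo dev server (command depends on your theme)
project.addScripts({
  dev: 'npm --prefix blog run start',
});

project.synth();
```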
#### Use Typescript and deploy to your AWS account

Add this to the `main.ts` file:

```ts
import { App, Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { HugoPipeline } from '@mavogel/cdk-hugo-pipeline';

export class MyStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);
    // we only need 1 stack as it creates the dev and prod stage in the pipeline
    new HugoPipeline(this, 'my-blog', {
      domainName: 'your-domain.com', // <- adapt here
    });
  }
}
```
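The surrounding app wiring is already generated by projen's `awscdk-app-ts` template; for reference it looks roughly like this (stack id and env handling may differ in your generated `main.ts`):

```ts
// generated by projen (shown for reference): the account/region the stack is deployed into
const devEnv = {
  account: process.env.CDK_DEFAULT_ACCOUNT,
  region: process.env.CDK_DEFAULT_REGION,
};

const app = new App();
new MyStack(app, 'my-blog-dev', { env: devEnv });
app.synth();
```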
Then adapt the `main.test.ts` (yes, known issue, see #40):

```ts
test('Snapshot', () => {
  expect(true).toBe(true);
});
```
#### Deploy it

Deploy into the AWS account which has a `Route53 Hosted Zone` for `your-domain.com`:

```sh
# build it locally via
npm run build
# deploy the repository and the pipeline once via
npm run deploy
```
- This will create the `codecommit` repository and the `codepipeline`. The pipeline will fail at first, so now commit and push the code:

```sh
# add the remote, e.g. via GRC (git-remote-codecommit) over https
git remote add origin codecommit::<aws-region>://your-blog
# rename the branch to master (will fix this)
git branch -m main master
# push the code
git push origin master
```
- ... wait until the pipeline has deployed to the `dev stage`, go to your url `dev.your-domain.com`, enter the basic auth credentials (default: `john:doe`) and look at your beautiful blog :tada:
## Customizations

### Redirects

You can add customizations such as HTTP `301` redirects, for example:

- from `/talks/` to `/works/`:
  - from `https://your-domain.com/talks/2024-01-24-my-talk`
  - to `https://your-domain.com/works/2024-01-24-my-talk`
- or more complex ones, such as `/post/2024-01-25-my-blog/gallery/my-image.webp` to `/images/2024-01-25-my-blog/my-image.webp`, which is represented by the regexp `'/(\.\*)(\\\/post\\\/)(\.\*)(\\\/gallery\\\/)(\.\*)/'` and the capture group replacement `'$1/images/$3/$5'`. Here as a full example:
  - from `https://your-domain.com/post/2024-01-25-my-blog/gallery/my-image.webp`
  - to `https://your-domain.com/images/2024-01-25-my-blog/my-image.webp`
```ts
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { HugoPipeline } from '@mavogel/cdk-hugo-pipeline';

export class MyStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);
    // Note: test your regexps upfront,
    // e.g. here https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/replace
    // and escape them.
    new HugoPipeline(this, 'my-blog', {
      domainName: 'your-domain.com', // <- adapt here
      cloudfrontRedirectReplacements: { // <- all regexps need to be escaped!
        '/\\\/talks\\\//': '/works/', // /talks/ -> /\\\/talks\\\//
        // /(.*)(\/post\/)(.*)(\/gallery\/)(.*)/
        '/(\.\*)(\\\/post\\\/)(\.\*)(\\\/gallery\\\/)(\.\*)/': '$1/images/$3/$5',
      },
    });
  }
}
```
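Before escaping them, it can help to verify the (unescaped) regexp and its capture groups locally with plain Node/TypeScript, e.g.:

```ts
// quick local sanity check of the redirect regexp and the capture group replacement
const re = /(.*)(\/post\/)(.*)(\/gallery\/)(.*)/;
const uri = '/post/2024-01-25-my-blog/gallery/my-image.webp';

// $1 = '', $3 = '2024-01-25-my-blog', $5 = 'my-image.webp'
console.log(uri.replace(re, '$1/images/$3/$5'));
// -> /images/2024-01-25-my-blog/my-image.webp
```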
However, you can also pass in a whole custom function, as the next section shows.

### Custom Cloudfront function
For the `VIEWER_REQUEST`, where you can also achieve `Basic Auth` or redirects the way you want:

```ts
import { Stack, StackProps } from 'aws-cdk-lib';
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
import { Construct } from 'constructs';
import { HugoPipeline } from '@mavogel/cdk-hugo-pipeline';

export class MyStack extends Stack {
constructor(scope: Construct, id: string, props?: StackProps) {
super(scope, id, props);
const customCfFunctionCodeDevelopment = `
function handler(event) {
var request = event.request;
var uri = request.uri;
var authHeaders = request.headers.authorization;
var regexes = [/\/talks\//, /\/post\//];
if (regexes.some(regex => regex.test(request.uri))) {
request.uri = request.uri.replace(/\/talks\//, '/works/');
request.uri = request.uri.replace(/\/post\//, '/posts/');
var response = {
statusCode: 301,
statusDescription: "Moved Permanently",
headers:
{ "location": { "value": request.uri } }
}
return response;
}
var expected = "Basic am9objpkb2U=";
if (authHeaders && authHeaders.value === expected) {
if (uri.endsWith('/')) {
request.uri += 'index.html';
}
else if (!uri.includes('.')) {
request.uri += '/index.html';
}
return request;
}
var response = {
statusCode: 401,
statusDescription: "Unauthorized",
headers: {
"www-authenticate": {
value: 'Basic realm="Enter credentials for this super secure site"',
},
},
};
return response;
}
`
const customCfFunctionCodeProduction = `
function handler(event) {
var request = event.request;
var uri = request.uri;
if (uri.endsWith('/')) {
request.uri += 'index.html';
}
else if (!uri.includes('.')) {
request.uri += '/index.html';
}
return request;
}
`
    // we do the escapes here so it is passed in correctly
const escapedtestCfFunctionCodeDevelopment = customCfFunctionCodeDevelopment.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
const escapedtestCfFunctionCodeProduction = customCfFunctionCodeProduction.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
new HugoPipeline(this, 'my-blog', {
domainName: 'your-domain.com', // <- adapt here
// Note: keep in sync with the basic auth defined in the function
// echo -n "john:doe"|base64 -> 'am9objpkb2U='
basicAuthUsername: 'john',
basicAuthPassword: 'doe',
cloudfrontCustomFunctionCodeDevelopment: cloudfront.FunctionCode.fromInline(escapedtestCfFunctionCodeDevelopment),
cloudfrontCustomFunctionCodeProduction: cloudfront.FunctionCode.fromInline(escapedtestCfFunctionCodeProduction),
    });
  }
}
```
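If you change the credentials, the `expected` value inside the development function has to match; a quick way to compute it (the TypeScript equivalent of the `echo -n "john:doe"|base64` hint above):

```ts
// compute the value the Authorization header must carry for Basic Auth
const user = 'john';
const password = 'doe';
const expected = `Basic ${Buffer.from(`${user}:${password}`).toString('base64')}`;
console.log(expected); // -> Basic am9objpkb2U=
```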
## Known issues

- If with `npm test` you get the error `docker exited with status 1`,
  - then clean the docker layers and re-run the tests via `docker system prune -f`
  - and if it happens in `codebuild`, re-run the build
## Open todos

- [ ] a local development possibility in `docker`
## Resources / Inspiration

- cdk-hugo-deploy: however, here you need to build the static site with `hugo` locally beforehand
- CDK-SPA-Deploy: same as above
## 🚀 Unlock the Full Potential of Your AWS Cloud Infrastructure
Hi, I’m Manuel, an AWS expert passionate about empowering businesses with scalable, resilient, and cost-optimized cloud solutions. With MV Consulting, I specialize in crafting tailored AWS architectures and DevOps-driven workflows that not only meet your current needs but grow with you.
### 🌟 Why Work With Me?
✔️ Tailored AWS Solutions: Every business is unique, so I design custom solutions that fit your goals and challenges.
✔️ Well-Architected Designs: From scalability to security, my solutions align with the AWS Well-Architected Framework.
✔️ Cloud-Native Focus: I specialize in modern, cloud-native systems that embrace the full potential of AWS.
✔️ Business-Driven Tech: Technology should serve your business, not the other way around.
### 🛠 What I Bring to the Table
🔑 12x AWS Certifications
I’m an AWS Certified Solutions Architect and DevOps Engineer – Professional and hold numerous additional certifications, so you can trust I’ll bring industry best practices to your projects. Feel free to explore my badges.
⚙️ Infrastructure as Code (IaC)
With deep expertise in AWS CDK and Terraform, I ensure your infrastructure is automated, maintainable, and scalable.
📦 DevOps Expertise
From CI/CD pipelines with GitHub Actions and GitLab CI to container orchestration with Kubernetes and others, I deliver workflows that are smooth and efficient.
🌐 Hands-On Experience
With over 7 years of AWS experience and a decade in the tech world, I’ve delivered solutions for companies large and small. My open-source contributions showcase my commitment to transparency and innovation. Feel free to explore my GitHub profile
### 💼 Let’s Build Something Great Together
I know that choosing the right partner is critical to your success. When you work with me, you’re not just contracting an engineer – you’re gaining a trusted advisor and hands-on expert who cares about your business as much as you do.
✔️ Direct Collaboration: No middlemen or red tape – you work with me directly.
✔️ Transparent Process: Expect open communication, clear timelines, and visible results.
✔️ Real Value: My solutions focus on delivering measurable impact for your business.
## 🙌 Acknowledgements
Big shoutout to the amazing team behind Projen!
Their groundbreaking work simplifies cloud infrastructure projects and inspires us every day. 💡