
Cross-Account Deployment using AWS CodePipeline and CDK

Guy Morton

--

I recently released an open-source project that aims to provide a solid foundation for anyone needing to build a cross-account deployment solution. It follows the dedicated “DevOps account” model recommended by AWS, in which code is deployed into different environments (dev/staging/prod) from a single, centralised account.

The key benefit of isolating the deployment tools into a DevOps account is that it allows an organisation to centralise management of deployments, and the permissions required to do so, in one place. Developers never have to be given access to environments as the pipelines do all the deployment work. Minimising human access to systems is a key principle of the Well-Architected Framework.

This project uses the AWS CDK to craft a “pipeline factory” that can be used to repeatably instantiate common sets of configured infrastructure for a given repository and branch combination.
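
To give a feel for how the factory hangs together, here is a minimal sketch of a CDK app entry point that reads the repo and branch from context and names its stacks accordingly (the stack class names and file paths are illustrative, not the project’s actual code):

```typescript
// bin/app.ts (illustrative): one set of stacks per repo+branch combination
import * as cdk from '@aws-cdk/core';
// hypothetical stack classes standing in for the project's real ones
import { CrossAccountRoleStack } from '../lib/cross-account-role-stack';
import { CreatePipelineStack } from '../lib/create-pipeline-stack';

const app = new cdk.App();

// supplied on the command line with -c repo=... -c branch=...
const repo = app.node.tryGetContext('repo');
const branch = app.node.tryGetContext('branch');

// stack names embed the repo and branch, so each combination gets its own infrastructure;
// each stack is then deployed individually with cdk deploy <stack-name> --profile <account>
new CrossAccountRoleStack(app, `create-cross-account-role-${repo}-${branch}`);
new CreatePipelineStack(app, `s3-create-pipeline-${repo}-${branch}`);
```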

Here is what we will end up with (shown here with the optional GitLab source channel):

[Diagram: the moving parts of the solution]

The CDK stacks

Three CDK stacks are used to build the pipelines and associated infrastructure:

  1. A shared infrastructure stack
  2. A cross-account roles stack
  3. A pipeline stack

There are also two “bonus” stacks that are optional-but-maybe-useful:

  1. A repo stack (creates a CodeCommit repo to which you can mirror your GitLab repo so you can use this project to deploy it; the readme file contains detailed instructions on how to achieve that)
  2. A parameter stack (creates Parameter Store entries you can use in your build process)

As there are dependencies between these stacks, and cross-account/cross-region sharing of CloudFormation outputs is not possible, each stack after the first receives some CLI-supplied configuration taken from the previous stack’s outputs. The entire process of building a pipeline for a given repo and branch takes about 20 minutes.

What the stacks do

The pipeline-infra stack creates a customer master key (CMK) in KMS in the DevOps account and applies a key policy that allows the target account to use it to decrypt pipeline artefacts.
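
Conceptually, the key and its cross-account policy amount to something like the following sketch (the stack class, construct IDs and the target_account_id context variable are assumptions for illustration):

```typescript
import * as cdk from '@aws-cdk/core';
import * as kms from '@aws-cdk/aws-kms';
import * as iam from '@aws-cdk/aws-iam';

export class PipelineInfraStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // the target account id would come from CLI context in the real stack
    const targetAccountId = this.node.tryGetContext('target_account_id');

    // CMK used to encrypt the pipeline's artefacts
    const artefactKey = new kms.Key(this, 'PipelineArtefactKey', {
      enableKeyRotation: true,
    });

    // allow the target account to decrypt artefacts produced by the pipeline
    artefactKey.grantDecrypt(new iam.AccountPrincipal(targetAccountId));

    // the key ARN is then supplied to later stacks via -c pipeline_key_arn=...
    new cdk.CfnOutput(this, 'PipelineKeyArn', { value: artefactKey.keyArn });
  }
}
```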

The project’s cross-account-role stack creates all the roles and trust relationships you will need, and sets up a least-privilege policy for CloudFormation to use when creating your stacks. It ensures your CodePipeline will be able to operate as required on S3 buckets in different accounts.

Note that the least-privilege CloudFormation policy may need additions if you use resources and actions it does not currently include. Some suggestions on how you might refine those permissions are available in the readme.
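
As a rough sketch of the shape of those roles (the names and action lists are illustrative; the real stack scopes permissions more carefully):

```typescript
import * as cdk from '@aws-cdk/core';
import * as iam from '@aws-cdk/aws-iam';

export class CrossAccountRoleStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const devopsAccountId = this.node.tryGetContext('devops_account_id');

    // role the pipeline (in the DevOps account) assumes in this target account
    const crossAccountRole = new iam.Role(this, 'CrossAccountRole', {
      assumedBy: new iam.AccountPrincipal(devopsAccountId),
    });

    // role CloudFormation itself uses when creating your stacks;
    // the attached policy is where least-privilege refinement happens
    const cfnDeployRole = new iam.Role(this, 'CloudFormationDeployRole', {
      assumedBy: new iam.ServicePrincipal('cloudformation.amazonaws.com'),
    });
    cfnDeployRole.addToPolicy(new iam.PolicyStatement({
      actions: ['s3:*', 'lambda:*', 'apigateway:*', 'iam:PassRole'], // extend as your templates require
      resources: ['*'], // scope these down for your own resources
    }));

    // the pipeline role needs to read artefacts and hand the deploy role to CloudFormation
    crossAccountRole.addToPolicy(new iam.PolicyStatement({
      actions: ['cloudformation:*', 's3:Get*', 's3:List*', 'kms:Decrypt', 'iam:PassRole'],
      resources: ['*'],
    }));
  }
}
```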

The create-pipeline stacks will create the CodePipeline instance, configure it with the roles that the pipeline will need to assume, point it to your source repo and configure the trigger that will cause it to run.
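
Reduced to a sketch, the heart of that stack looks something like this (the class name, the packaged.yaml template path and the default stack naming are assumptions for illustration):

```typescript
import * as cdk from '@aws-cdk/core';
import * as iam from '@aws-cdk/aws-iam';
import * as codecommit from '@aws-cdk/aws-codecommit';
import * as codebuild from '@aws-cdk/aws-codebuild';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as actions from '@aws-cdk/aws-codepipeline-actions';

export class CreatePipelineStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // values supplied via -c on the command line
    const repoName = this.node.tryGetContext('repo');
    const branch = this.node.tryGetContext('branch');
    const crossAccountRoleArn = this.node.tryGetContext('cross_account_role_arn');
    const stackName = this.node.tryGetContext('stack_name') || `${repoName}-${branch}`;

    const sourceOutput = new codepipeline.Artifact();
    const buildOutput = new codepipeline.Artifact();

    // role created by the cross-account-role stack in the target account
    const crossAccountRole = iam.Role.fromRoleArn(
      this, 'CrossAccountRole', crossAccountRoleArn, { mutable: false });

    // runs the buildspec.yml found at the root of the source repo
    const buildProject = new codebuild.PipelineProject(this, 'Build');

    // in the real project the artefact bucket is encrypted with the CMK
    // created by the shared infrastructure stack
    new codepipeline.Pipeline(this, 'Pipeline', {
      stages: [
        {
          stageName: 'Source',
          actions: [new actions.CodeCommitSourceAction({
            actionName: 'Source',
            repository: codecommit.Repository.fromRepositoryName(this, 'Repo', repoName),
            branch,
            output: sourceOutput,
          })],
        },
        {
          stageName: 'Build',
          actions: [new actions.CodeBuildAction({
            actionName: 'Build',
            project: buildProject,
            input: sourceOutput,
            outputs: [buildOutput],
          })],
        },
        {
          stageName: 'Deploy',
          actions: [new actions.CloudFormationCreateUpdateStackAction({
            actionName: 'Deploy',
            stackName,
            templatePath: buildOutput.atPath('packaged.yaml'), // assumes the build emits packaged.yaml
            adminPermissions: false,
            role: crossAccountRole, // assumed in the target account for the deployment
          })],
        },
      ],
    });
  }
}
```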

How the stacks are deployed

This is a standard CDK v1 project, and is deployed using cdk deploy on the command line. A typical command looks something like this:

cdk deploy create-cross-account-role-<reponame>-<branch> \
-c repo=<reponame> \
-c branch=<branch> \
-c region=<region> \
-c devops_account_id=<devops-account-id> \
-c pipeline_key_arn=<pipeline-key-arn> \
--profile <target-account-profile>

Examples for deploying each stack can be found in the readme.

Anatomy of a pipeline

This project creates a fairly standard pipeline consisting of source, build and deploy stages. Optionally, you can add a manual approval stage that sends an email to nominated addresses notifying them that an approval is required.

The Source stage

Multiple Git sources can be used: GitLab, GitHub and CodeCommit are supported out of the box, and Bitbucket support can be added with little effort.

If you want a solution for using GitLab with AWS CodePipeline, this project supplies you with one. The readme file contains the details on how to set this up.

In all cases, the pipeline is triggered by commits to the repo branch you have configured it for.

The Build stage

Whichever source you use, a buildspec.yml at the root of the repo is executed in CodeBuild during the build stage, so you have full control over how your source code is built. Sample buildspec.yml files are included to get you started, including one that builds a SAM project.
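
As a rough guide, a minimal buildspec for a SAM project might look something like this (the packaging bucket name is a placeholder you would replace with your own):

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.8
    commands:
      - pip install aws-sam-cli
  build:
    commands:
      # $ENVIRONMENT is injected by the pipeline from the build_env context (see below)
      - echo "Building for $ENVIRONMENT"
      - sam build
      # my-packaging-bucket is a placeholder; use a bucket the build role can write to
      - sam package --s3-bucket my-packaging-bucket --output-template-file packaged.yaml

artifacts:
  files:
    - packaged.yaml
```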

A context variable called build_env, if supplied, will be passed into the build as an environment variable $ENVIRONMENT — this allows you to pass a dev/test/prod build flag into the build process.
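
The wiring for that is small; here is a sketch of the idea (the helper function and the dev default are illustrative):

```typescript
import * as cdk from '@aws-cdk/core';
import * as codebuild from '@aws-cdk/aws-codebuild';

// sketch: pass -c build_env=<dev|test|prod> through to CodeBuild as $ENVIRONMENT
export function makeBuildProject(scope: cdk.Construct): codebuild.PipelineProject {
  const buildEnv = scope.node.tryGetContext('build_env') || 'dev'; // default is an assumption
  return new codebuild.PipelineProject(scope, 'Build', {
    environmentVariables: {
      ENVIRONMENT: { value: buildEnv }, // available in buildspec.yml as $ENVIRONMENT
    },
  });
}
```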

The Manual Approval stage

This optional stage can be added easily by passing an approvers context variable to the stack deployment command:

cdk deploy s3-create-pipeline-<reponame>-<branch> \
-c target_bucket=<deploy-bucket-name> \
-c repo=<reponame> \
-c region=<region> \
-c branch=<branch> \
-c build_env=<build-env> \
-c approvers="guy.morton@versent.com.au,you@yours.com" \
-c cross_account_role_arn=<cross-account-role> \
--profile <devops-account>
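
Under the hood this maps to a standard CodePipeline manual approval action; here is a sketch of how the approvers context could be wired in (the helper function and stage name are illustrative):

```typescript
import * as cdk from '@aws-cdk/core';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as actions from '@aws-cdk/aws-codepipeline-actions';

// sketch: add an Approve stage when -c approvers=... is supplied
export function addApprovalStage(scope: cdk.Construct, pipeline: codepipeline.Pipeline): void {
  const approvers: string | undefined = scope.node.tryGetContext('approvers');
  if (!approvers) {
    return; // the stage is optional: omitted when no approvers context is given
  }
  pipeline.addStage({
    stageName: 'Approve',
    actions: [
      new actions.ManualApprovalAction({
        actionName: 'ManualApproval',
        notifyEmails: approvers.split(','), // each address receives an email notification
      }),
    ],
  });
}
```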

The Deploy stage

There are two different deploy actions currently supported by this project:

  1. CloudFormation (works for SAM and single-stack CDK projects)
  2. S3 (ideal for static web projects such as PWAs and SPAs)

Support for additional deploy types could be added in future by creating new stacks that implement other deployment actions.
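
For reference, the S3 variant’s deploy action reduces to something like this sketch (the helper function and construct IDs are illustrative):

```typescript
import * as cdk from '@aws-cdk/core';
import * as iam from '@aws-cdk/aws-iam';
import * as s3 from '@aws-cdk/aws-s3';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as actions from '@aws-cdk/aws-codepipeline-actions';

// sketch: deploy the built artefact into a bucket in the target account
export function s3DeployAction(
  scope: cdk.Construct,
  buildOutput: codepipeline.Artifact,
): actions.S3DeployAction {
  const bucketName = scope.node.tryGetContext('target_bucket');        // -c target_bucket=...
  const roleArn = scope.node.tryGetContext('cross_account_role_arn');  // -c cross_account_role_arn=...
  return new actions.S3DeployAction({
    actionName: 'Deploy',
    input: buildOutput,
    bucket: s3.Bucket.fromBucketName(scope, 'TargetBucket', bucketName),
    extract: true, // unzip the build artefact into the bucket
    role: iam.Role.fromRoleArn(scope, 'DeployRole', roleArn, { mutable: false }),
  });
}
```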

Design Decisions

One pipeline per repo branch

The project assumes a mapping of one repo+branch to one pipeline, and uses context variables to pass parameters at stack creation time. It supports setting region as an override (necessary when your CDK or SAM project creates CloudFront resources in us-east-1, for instance).

It’s advised that you log your deploy commands so that they can be replayed if you need to update a pipeline or the cross-account roles stack.

How the stacks get named

As you can see from the CLI snippet above, each stack is uniquely identified using the reponame and branch values you supply. This ensures the resulting CloudFormation stacks have unambiguous names (so you know which infra belongs to which repo and branch), and it also allows you to create as many instances of each stack as you like.

In the case of the CloudFormation deploy pipeline you can pass stack_name as a context variable to override the default stack name that will otherwise be used.
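
In CDK terms that override is just a context lookup with a fallback, roughly (the default naming shown is an assumption):

```typescript
import * as cdk from '@aws-cdk/core';

const app = new cdk.App();

// -c stack_name=... overrides the default name; otherwise fall back to repo+branch
const repo = app.node.tryGetContext('repo');
const branch = app.node.tryGetContext('branch');
const deployStackName: string =
  app.node.tryGetContext('stack_name') || `${repo}-${branch}`;
```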

In conclusion

I hope you find this project useful — please feel free to fork it and modify it to suit your own requirements. If you come up with improvements or bug fixes, submit a pull request and I will be happy to review it.
