Bitbucket Pipelines is quickly becoming a favourite with organisations that are already entrenched in the Atlassian suite of products. People with prior experience of Bamboo may not be thrilled at the prospect, but hey, it’s worth a shot.


The Problem

One of the key problems in establishing a CI/CD flow is access control. Specifically for Bitbucket Pipelines, in the past you would need to provide access via repository variables. These would include an access_key and secret_key, which would grant the pipeline the permissions needed to make changes as part of the Terraform run.

The Solution

Bitbucket Pipelines now supports an OIDC role-based access flow for AWS, so you don’t need to manage a robot user and its key rotation. In this post, I’ll describe how I went about enabling this flow in our project.

Step 1 - Setting up the basics in AWS

First up, you need to create the basics for use with Bitbucket Pipelines. I use a CloudFormation template to deploy the resources necessary for a Terraform run, such as an S3 bucket and a DynamoDB table. To this template we will add the required OIDC provider and an IAM role as well.

I use the following template; you can modify it for any further customizations you need.
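A trimmed-down sketch of the template is below. The resource and policy names match the ones discussed in this post, but the policy bodies are illustrative placeholders (deliberately broad), and you should tighten them to what your Terraform code actually manages:

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Terraform backend plus OIDC trust for Bitbucket Pipelines (sketch)

Resources:
  TerraformStateBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled

  TerraformLockTable:
    Type: AWS::DynamoDB::Table
    Properties:
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: LockID
          AttributeType: S
      KeySchema:
        - AttributeName: LockID
          KeyType: HASH

  # Replace AUDIENCE, IDENTITY PROVIDER URL and THUMBPRINT with the values
  # shown in Bitbucket under Repository settings.
  BBOidc:
    Type: AWS::IAM::OIDCProvider
    Properties:
      Url: "IDENTITY PROVIDER URL"
      ClientIdList:
        - "AUDIENCE"
      ThumbprintList:
        - "THUMBPRINT"

  PipelineRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Federated: !Ref BBOidc
            Action: sts:AssumeRoleWithWebIdentity
      Policies:
        - PolicyName: CBPolicy            # manage the CodeBuild project
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action: codebuild:*
                Resource: "*"
        - PolicyName: IAMPolicy           # e.g. pass a service role to CodeBuild
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - iam:GetRole
                  - iam:PassRole
                Resource: "*"
        - PolicyName: S3Policy            # backend access: state bucket + lock table
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action: s3:*
                Resource:
                  - !GetAtt TerraformStateBucket.Arn
                  - !Sub "${TerraformStateBucket.Arn}/*"
              - Effect: Allow
                Action: dynamodb:*
                Resource: !GetAtt TerraformLockTable.Arn

Outputs:
  StateBucketName:
    Value: !Ref TerraformStateBucket
  LockTableName:
    Value: !Ref TerraformLockTable
  PipelineRoleArn:
    Value: !GetAtt PipelineRole.Arn
```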

As you can see, the template creates -

  1. An S3 bucket which will be used to store the Terraform state
  2. A DynamoDB table for Terraform state locking
  3. An OIDC provider for Bitbucket Pipelines to use
  4. An IAM role and policies to provide the necessary access to Bitbucket Pipelines
  5. The policies CBPolicy, IAMPolicy and S3Policy provide the permissions needed to create and manage a CodeBuild project. These will of course vary, depending on what you plan to manage with Terraform.
  6. For the BBOidc resource, replace the values AUDIENCE, IDENTITY PROVIDER URL and THUMBPRINT. These can be obtained from Bitbucket under Repository settings. Details are here

Once this template is deployed, we have the basics in place to start setting up the pipeline on Bitbucket. Bear in mind, though, that this is more of an academic exercise. If you plan to roll this out in production, you should look at securing the role further by defining conditions in the role’s IAM trust policy to tie its scope down to a particular repository or workspace, and possibly by restricting it by IP.
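For reference, a tightened trust policy might look roughly like the sketch below. The account ID, provider path, repository UUID and CIDR range are all placeholders; the exact format of the token’s sub claim and Bitbucket’s published IP ranges are described in the Bitbucket OIDC documentation, so verify against that before using this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::ACCOUNT_ID:oidc-provider/IDENTITY_PROVIDER_HOST_AND_PATH"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringLike": {
          "IDENTITY_PROVIDER_HOST_AND_PATH:sub": "{REPOSITORY_UUID}*"
        },
        "IpAddress": {
          "aws:SourceIp": ["BITBUCKET_CIDR_RANGE"]
        }
      }
    }
  ]
}
```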

Step 2 - Terraform Configs

I will just share the provider block to keep it short; you can build on this to create any AWS resources you need. Be sure to modify the CloudFormation stack above to grant the proper permissions.
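The backend and provider configuration ends up looking roughly like this (the required_providers block is my addition; pin a provider version if you prefer):

```hcl
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }

  backend "s3" {
    bucket         = "BUCKET_FROM_CFTEMPLATE"
    key            = "DYNAMODB_KEY"            # path of the state file in the bucket
    region         = "REPLACE_WITH_REGION_TO_USE"
    dynamodb_table = "DYNAMODB_FROM_CFTEMPLATE"
  }
}

provider "aws" {
  region = "REPLACE_WITH_REGION_TO_USE"
}
```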

You will need to replace BUCKET_FROM_CFTEMPLATE, DYNAMODB_FROM_CFTEMPLATE, DYNAMODB_KEY and REPLACE_WITH_REGION_TO_USE. The DynamoDB table and S3 bucket are created in Step 1; just copy over the values from there.

Step 3 - Configure Bitbucket Pipeline

Right, now to the heart of the matter - configuring the Bitbucket pipeline. It is fairly simple: you just need to add a file bitbucket-pipelines.yml in the same Bitbucket repo as your Terraform code. The one I used is included below.
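Here is a sketch of the bitbucket-pipelines.yml. The image tag, region and role ARN are placeholders, and the anchor placement under definitions may need adjusting to match Bitbucket’s current schema:

```yaml
image: hashicorp/terraform:latest   # placeholder: pin to the version you use

definitions:
  # Reusable script (via a YAML anchor) that wires the OIDC token issued by
  # Bitbucket into the environment variables the AWS SDK and CLI understand.
  oidc-setup: &oidc-setup |
    export AWS_REGION=REPLACE_WITH_REGION_TO_USE
    export AWS_ROLE_ARN=ROLE_ARN_FROM_CFTEMPLATE
    export AWS_WEB_IDENTITY_TOKEN_FILE=$(pwd)/web-identity-token
    echo "${BITBUCKET_STEP_OIDC_TOKEN}" > "${AWS_WEB_IDENTITY_TOKEN_FILE}"

pipelines:
  default:
    - step:
        name: terraform plan
        oidc: true                  # ask Bitbucket to issue the OIDC token
        script:
          - *oidc-setup
          - terraform init
          - terraform plan -out=plan.out
        artifacts:
          - plan.out                # hand the plan output to the next step
    - step:
        name: terraform apply
        oidc: true
        trigger: manual             # apply only after reviewing the plan
        script:
          - *oidc-setup
          - terraform init
          - terraform apply plan.out
```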

There is a lot going on in there, so I’ll try to break it down a bit.

First up are the definitions - resources here can be reused in your pipeline using YAML anchors, so this is a good place to declare stuff you know you will be calling often. In this case, I have declared a script which sets up the environment variables necessary for the OIDC role flow to work. Yes, you need to attach this to EVERY step which needs access to AWS via the OIDC role.

Second, notice the flag oidc: true that is applied to all the steps that need access to AWS resources. This tells Bitbucket to issue an identity token for the step and expose it in the BITBUCKET_STEP_OIDC_TOKEN environment variable, which the script above then wires into the AWS environment variables.

Third, I use artifacts to pass the plan output from one step to the next. This is reused when you run the apply step.

Fourth, I have a flag trigger: manual on the terraform apply step. This ensures the apply step is only triggered after a careful review of the plan output above. But you can always remove it to trigger the apply as soon as the plan succeeds.

And that’s it, really! Now every time you commit a change to the repo, you should see a pipeline run triggered.

Conclusion

I hope this post proves useful for setting up OIDC role-based access control for Bitbucket Pipelines. There is a plethora of customization options available, including running the builds on self-hosted runners. Feel free to go through the docs to find out more; I have included the useful links below.

References

  1. Configure Your Pipeline
  2. Integrate Pipelines with Resource Servers Using OIDC
  3. Deploy on AWS Using Bitbucket Pipelines OpenID Connect