Introduction

Bitbucket Pipelines is often the CI/CD tool of choice for organisations that are already entrenched in the Atlassian suite of products. No one is thrilled at the prospect, but hey, you've got to play the hand you've been dealt. So, here we go.

Problem

One of the key problems in establishing a CI/CD flow is access control. Specifically for Bitbucket Pipelines, in the past you would need to provide access via repository variables. These would include an access_key and secret_key granting the pipeline the permissions needed to make changes as part of the Terraform run.

Solution

Bitbucket Pipelines now supports an OIDC role-based access flow for AWS, so you don't need to manage a robot user and its key rotation. In this post, I'll describe how I went about enabling this flow in our project.

Step 1 - Setting up the basics in AWS

First up, you need to create the basic resources for use with Bitbucket Pipelines. I use a CloudFormation template to deploy the resources necessary for a Terraform run, such as the S3 bucket and DynamoDB table. To this template we will add the required OIDC provider and an IAM role as well.

I use the following template; you can modify it for any further customizations you need.

AWSTemplateFormatVersion: 2010-09-09
Description: Basic resources for Terraform
Resources:
  TFBucket:
    Type: AWS::S3::Bucket
    Properties:
      AccessControl: Private
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: AES256
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true
      VersioningConfiguration:
        Status: Enabled
      Tags:
        - Key: Purpose
          Value: "Terraform state file remote storage"
  TFDynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      BillingMode: PAY_PER_REQUEST
      SSESpecification:
        SSEEnabled: true
      TableName: terraform-remote-state
      AttributeDefinitions:
        - AttributeName: LockID
          AttributeType: S
      KeySchema:
        - AttributeName: LockID
          KeyType: HASH
      Tags:
        - Key: Purpose
          Value: "Terraform state file remote storage"
  BBOidc:
    Type: AWS::IAM::OIDCProvider
    Properties:
      ClientIdList:
        - 'AUDIENCE'
      Tags:
        - Key: Purpose
          Value: "Bitbucket pipelines to assume IAM role"
      ThumbprintList:
        - 'THUMBPRINT'
      Url: 'IDENTITY PROVIDER URL'
  TFRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Federated:
                - !Ref BBOidc
            Action:
              - 'sts:AssumeRoleWithWebIdentity'
      RoleName: terraform-iam-role
      Tags:
        - Key: Purpose
          Value: "Access for Bitbucket Pipelines"
  TFPolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: terraform-base-policy
      Roles:
        - !Ref TFRole
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
            - 's3:ListBucket'
            - 's3:GetObject'
            - 's3:PutObject'
            - 's3:PutObjectAcl'
          Resource:
            - !Sub 'arn:aws:s3:::${TFBucket}'
            - !Sub 'arn:aws:s3:::${TFBucket}/*'
        - Effect: Allow
          Action:
            - 'dynamodb:GetItem'
            - 'dynamodb:PutItem'
            - 'dynamodb:DeleteItem'
          Resource:
            - !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${TFDynamoDBTable}'
  CBPolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: codebuild-access-policy
      Roles:
        - !Ref TFRole
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
            - 'codebuild:CreateProject'
            - 'codebuild:DeleteProject'
            - 'codebuild:UpdateProject'
            - 'codebuild:CreateWebhook'
            - 'codebuild:DeleteWebhook'
            - 'codebuild:UpdateWebhook'
            - 'codebuild:BatchGetProjects'
          Resource:
            - !Sub 'arn:aws:codebuild:${AWS::Region}:${AWS::AccountId}:project/*'
        - Effect: Allow
          Action:
            - 'codebuild:ImportSourceCredentials'
            - 'codebuild:DeleteSourceCredentials'
            - 'codebuild:ListProjects'
            - 'codebuild:ListCuratedEnvironmentImages'
          Resource: '*'
  IAMPolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: iam-access-policy
      Roles:
        - !Ref TFRole
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
            - 'iam:List*'
            - 'iam:PassRole'
            - 'iam:GetRole*'
          Resource:
            - !Sub 'arn:aws:iam::${AWS::AccountId}:role/*'
  S3Policy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: s3-access-policy
      Roles:
        - !Ref TFRole
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
            - 's3:List*'
            - 's3:GetObject*'
            - 's3:PutObject'
            - 's3:PutObjectAcl'
          Resource:
            - 'arn:aws:s3:::bucket'
            - 'arn:aws:s3:::bucket/*'

As you can see, the template creates:

  1. An S3 bucket which will be used to store the Terraform state
  2. A DynamoDB table for Terraform state locking
  3. An OIDC provider for Bitbucket Pipelines to use
  4. An IAM role and policies to provide the necessary access to Bitbucket Pipelines
  5. The policies CBPolicy, IAMPolicy and S3Policy, which grant the permissions needed to create and manage a CodeBuild project. These will of course vary depending on what you plan to manage with Terraform.
  6. For the BBOidc resource, replace the values AUDIENCE, IDENTITY PROVIDER URL and THUMBPRINT. These can be obtained from Bitbucket under Repository settings; details are in the Bitbucket documentation linked in the references below.

Once this template is deployed, we have the basic resources in place to start setting up the pipeline on Bitbucket. Bear in mind, though, that this is more of an academic exercise. If you plan to roll this out in production, you should look at securing the role further: define conditions in the role's IAM trust policy to tie its scope down to a particular repository or workspace, and possibly restrict it by IP as well.
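
As an illustration, a tighter trust policy for TFRole might look like the sketch below. Treat it as a sketch under assumptions rather than a drop-in: WORKSPACE is a placeholder for your Bitbucket workspace, REPO_UUID for your repository's UUID, and the condition key must be your actual identity provider URL with the https:// prefix stripped.

      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Federated:
                - !Ref BBOidc
            Action:
              - 'sts:AssumeRoleWithWebIdentity'
            # Only honour tokens whose sub claim starts with this repository's UUID.
            # The condition key is the identity provider URL minus the scheme, plus ":sub".
            Condition:
              StringLike:
                'api.bitbucket.org/2.0/workspaces/WORKSPACE/pipelines-config/identity/oidc:sub': 'REPO_UUID:*'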

Step 2 - Terraform Configs

To keep it short, I will just share the backend and provider configuration; you can build on this to create any AWS resources you need. Be sure to modify the CloudFormation stack above to grant the proper permissions.

# Backend config for remote state
terraform {
  backend "s3" {
    bucket         = "BUCKET_FROM_CFTEMPLATE"
    key            = "PATH_TO_STATE_FILE"
    region         = "REPLACE_WITH_REGION_TO_USE"
    encrypt        = true
    dynamodb_table = "DYNAMODB_FROM_CFTEMPLATE"
  }
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.0"
    }
  }
}

# Configure the AWS provider
provider "aws" {
  region = "REPLACE_WITH_REGION_TO_USE"
}

You will need to replace BUCKET_FROM_CFTEMPLATE, DYNAMODB_FROM_CFTEMPLATE and REPLACE_WITH_REGION_TO_USE; the S3 bucket and DynamoDB table were created in Step 1, so just copy the values over from there. PATH_TO_STATE_FILE is the object key under which the state file will be stored in the bucket (for example, terraform.tfstate).
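
As an example of building on this, the CBPolicy from Step 1 is enough to let Terraform manage a basic CodeBuild project. The sketch below is illustrative only: the project name, account ID and service role name are hypothetical, and the service role must already exist (creating it would need broader IAM permissions than the template grants).

resource "aws_codebuild_project" "example" {
  name = "example-project"
  # Hypothetical pre-existing role; iam:PassRole on it is covered by IAMPolicy.
  service_role = "arn:aws:iam::123456789012:role/codebuild-service-role"

  artifacts {
    type = "NO_ARTIFACTS"
  }

  environment {
    compute_type = "BUILD_GENERAL1_SMALL"
    image        = "aws/codebuild/standard:5.0"
    type         = "LINUX_CONTAINER"
  }

  # With no external source, the buildspec is supplied inline.
  source {
    type      = "NO_SOURCE"
    buildspec = <<-EOT
      version: 0.2
      phases:
        build:
          commands:
            - echo "hello from CodeBuild"
    EOT
  }
}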

Step 3 - Configure Bitbucket Pipeline

Right, now to the heart of the matter: configuring the Bitbucket pipeline. It is fairly simple; you just need to add a file named bitbucket-pipelines.yml to the same Bitbucket repository as your Terraform code. The one I used is included below.

image: hashicorp/terraform:1.0.7
definitions:
  scripts:
    - script: &aws-context
        export AWS_REGION=REPLACE_WITH_REGION_TO_USE;
        export AWS_WEB_IDENTITY_TOKEN_FILE=$(pwd)/web-identity-token;
        export AWS_ROLE_SESSION_NAME=build-session;
        export AWS_ROLE_ARN=REPLACE_WITH_ROLE_ARN_TO_USE;
        echo $BITBUCKET_STEP_OIDC_TOKEN > $(pwd)/web-identity-token
  steps:
    - step: &validate
        name: Validate Terraform config
        oidc: true
        script:
          - terraform init -backend=false
          - terraform validate
    - step: &plan
        name: Terraform Plan
        oidc: true
        script:
          - *aws-context
          - terraform init
          - terraform plan -input=false -out=tfplan.out
        artifacts:
          - tfplan.out
    - step: &apply
        name: Terraform Apply
        oidc: true
        trigger: manual
        script:
          - *aws-context
          - terraform init
          - terraform apply -input=false -auto-approve tfplan.out
pipelines:
  branches:
    main:
      - step: *validate
      - step: *plan
      - step: *apply
    default:
      - step: *validate

There is a lot going on in there, so I’ll try to break it down a bit.

First up are the definitions: resources declared here can be reused in your pipeline using YAML anchors, so this is a good place to declare things you know you will be calling often. In this case, I have declared a script which sets up the environment variables necessary for the OIDC role flow to work. Yes, you need to attach this to EVERY step which needs access to AWS via the OIDC role.
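
If YAML anchors are new to you, the mechanics are simply that &name labels a node and *name pastes it back in wherever it appears. A minimal illustration, with a made-up step name:

definitions:
  steps:
    - step: &my-step
        name: Do the thing
        script:
          - echo "doing the thing"
pipelines:
  default:
    - step: *my-step   # expands to the full step defined above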

Second, notice the flag oidc: true that is applied to all the steps that need access to AWS resources. This tells Bitbucket to generate an OIDC token for the step and expose it via the BITBUCKET_STEP_OIDC_TOKEN environment variable, which the aws-context script above then writes to the web identity token file.
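
If you want to sanity-check that the role assumption actually works, a throwaway step along the lines of the sketch below would print the assumed identity. Note the assumed image: the step needs the AWS CLI, which the hashicorp/terraform image used above does not ship.

    - step:
        name: Check AWS identity
        image: amazon/aws-cli   # any image with the AWS CLI and a shell will do
        oidc: true
        script:
          - *aws-context
          - aws sts get-caller-identity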

Third, I use artifacts to pass the plan output from one step to the next. This is reused when you run the apply step.

Fourth, I have the flag trigger: manual on the Terraform apply step. This ensures the apply step is only triggered after a careful review of the plan output from the step above. You can always remove it to trigger the apply as soon as the plan succeeds.

And that's it, really! Now every time you commit a change to the repo, you should see a pipeline run triggered.

Conclusion

I hope this post proves useful for setting up OIDC role-based access control for Bitbucket Pipelines. There are a plethora of customization options available, including running builds on self-hosted runners, so feel free to go through the docs to find out more. I will include some useful links below.

References

  1. Configure Your Pipeline
  2. Integrate Pipelines with Resource Servers Using OIDC
  3. Deploy on AWS Using Bitbucket Pipelines OpenID Connect