What is AWS CodePipeline?

AWS CodePipeline is a continuous integration and continuous delivery (CI/CD) service that automates the build, test, and deploy phases of your release process.

Key Concepts

Pipeline

A workflow that describes how code changes move through the release process. A pipeline consists of multiple stages executed sequentially.

Stage

A logical unit within a pipeline (e.g., Source, Build, Test, Deploy). Each stage contains one or more actions.

Action

A task performed on artifacts (e.g., download from GitHub, compile code, deploy to EC2). Actions within the same stage can run in parallel or in sequence, depending on their runOrder.

Artifact

Files/data passed between stages. Artifacts are stored in an S3 bucket and encrypted.

Transition

Connection between stages. Can be disabled to pause pipeline flow.
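
For example, the inbound transition into a stage can be disabled from the CLI (pipeline and stage names here are placeholders):

aws codepipeline disable-stage-transition \
  --pipeline-name MyPipeline \
  --stage-name Deploy \
  --transition-type Inbound \
  --reason "Pausing deployments during maintenance"

Re-enable it later with enable-stage-transition.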

Action Categories

1. Source

Get code/artifacts from repository.

Providers:

  • S3 - Download from S3 bucket
  • CodeCommit - AWS Git repository
  • GitHub - GitHub repository
  • Bitbucket - Bitbucket repository
  • ECR - Docker images

Example:

{
  "name": "Source",
  "actions": [{
    "name": "GetSource",
    "actionTypeId": {
      "category": "Source",
      "owner": "ThirdParty",
      "provider": "GitHub",
      "version": "1"
    },
    "configuration": {
      "Owner": "myusername",
      "Repo": "my-app",
      "Branch": "main"
    },
    "outputArtifacts": [{"name": "SourceOutput"}]
  }]
}
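
The GitHub (version 1) provider above also requires an OAuthToken in its configuration. The newer, connection-based GitHub integration uses the CodeStarSourceConnection provider instead; a rough sketch, with a placeholder connection ARN:

{
  "name": "Source",
  "actions": [{
    "name": "GetSource",
    "actionTypeId": {
      "category": "Source",
      "owner": "AWS",
      "provider": "CodeStarSourceConnection",
      "version": "1"
    },
    "configuration": {
      "ConnectionArn": "arn:aws:codestar-connections:us-east-1:111111111111:connection/example-id",
      "FullRepositoryId": "myusername/my-app",
      "BranchName": "main"
    },
    "outputArtifacts": [{"name": "SourceOutput"}]
  }]
}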

2. Build

Compile code, run tests, create artifacts.

Providers:

  • CodeBuild - AWS build service
  • Jenkins - Jenkins build server

Example:

{
  "name": "Build",
  "actions": [{
    "name": "CompileCode",
    "actionTypeId": {
      "category": "Build",
      "owner": "AWS",
      "provider": "CodeBuild",
      "version": "1"
    },
    "configuration": {"ProjectName": "my-build-project"},
    "inputArtifacts": [{"name": "SourceOutput"}],
    "outputArtifacts": [{"name": "BuildOutput"}]
  }]
}

3. Test

Run automated tests.

Providers:

  • CodeBuild - Run test suites
  • AWS Device Farm - Mobile app testing
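
A Test stage typically reuses CodeBuild with a separate project that runs the test suite; a minimal sketch (the project name is a placeholder):

{
  "name": "Test",
  "actions": [{
    "name": "RunTests",
    "actionTypeId": {
      "category": "Test",
      "owner": "AWS",
      "provider": "CodeBuild",
      "version": "1"
    },
    "configuration": {"ProjectName": "my-test-project"},
    "inputArtifacts": [{"name": "BuildOutput"}]
  }]
}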

4. Deploy

Deploy application to environments.

Providers:

  • CodeDeploy - Deploy to EC2/Lambda/ECS
  • CloudFormation - Deploy infrastructure
  • ECS - Deploy containers
  • Elastic Beanstalk - Deploy to Beanstalk
  • S3 - Deploy static website
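
For instance, the S3 provider can publish a built site straight to a bucket; a minimal sketch (the bucket name is a placeholder):

{
  "name": "Deploy",
  "actions": [{
    "name": "DeployWebsite",
    "actionTypeId": {
      "category": "Deploy",
      "owner": "AWS",
      "provider": "S3",
      "version": "1"
    },
    "inputArtifacts": [{"name": "BuildOutput"}],
    "configuration": {
      "BucketName": "my-static-website-bucket",
      "Extract": "true"
    }
  }]
}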

5. Approval

Manual approval gate before proceeding.

Provider:

  • Manual - Human approval required

Example:

{
  "name": "Approval",
  "actions": [{
    "name": "ProductionApproval",
    "actionTypeId": {
      "category": "Approval",
      "owner": "AWS",
      "provider": "Manual",
      "version": "1"
    },
    "configuration": {
      "CustomData": "Please review and approve deployment to production"
    }
  }]
}

6. Invoke

Trigger external functions.

Providers:

  • Lambda - Invoke Lambda function
  • Step Functions - Start Step Functions execution
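
A Lambda invoke action passes optional UserParameters to the function, which must report success or failure back to CodePipeline; a minimal sketch (the function name is a placeholder):

{
  "name": "Invoke",
  "actions": [{
    "name": "RunCustomStep",
    "actionTypeId": {
      "category": "Invoke",
      "owner": "AWS",
      "provider": "Lambda",
      "version": "1"
    },
    "configuration": {
      "FunctionName": "my-pipeline-hook",
      "UserParameters": "{\"environment\": \"staging\"}"
    }
  }]
}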

How Pipeline Starts

Automatic Triggers (Most Common)

Source code change detection:

Developer pushes code to GitHub
    ↓
GitHub webhook notifies CodePipeline
    ↓
CodePipeline automatically starts
    ↓
Source stage downloads latest code

Supported triggers:

  • GitHub/CodeCommit - New commit pushed
  • S3 - New file uploaded
  • ECR - New Docker image pushed

Manual Start

Via Console: Click “Release change” button

Via CLI:

aws codepipeline start-pipeline-execution --name MyPipeline

Scheduled Trigger

EventBridge rule triggers pipeline on schedule:

# Every day at 2 AM
aws events put-rule \
  --name DailyPipelineTrigger \
  --schedule-expression "cron(0 2 * * ? *)"
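
The rule alone does not start anything; the pipeline must also be attached as a target, using an IAM role allowed to call codepipeline:StartPipelineExecution. A sketch with placeholder region, account ID, and role name:

# Attach the pipeline as the rule's target
aws events put-targets \
  --rule DailyPipelineTrigger \
  --targets "Id"="1","Arn"="arn:aws:codepipeline:us-east-1:111111111111:MyPipeline","RoleArn"="arn:aws:iam::111111111111:role/EventBridgeStartPipelineRole"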

Common Pipeline Patterns

Pattern 1: Simple Web Application

Source → Build → Deploy

Pattern 2: Multi-Environment with Testing

Source → Build → Test → Deploy-Dev → Approval → Deploy-Prod

Pattern 3: Infrastructure as Code

Source → Validate → Deploy-Staging → Test → Approval → Deploy-Production

Pattern 4: Microservices with Parallel Deployment

Source → Build → Test
                   ↓
         ┌─────────┼─────────┐
         ↓         ↓         ↓
    Deploy-API  Deploy-Web  Deploy-Worker
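
Parallelism inside a stage is controlled by runOrder: actions that share the same runOrder run in parallel. A sketch of the fan-out stage from Pattern 4, assuming three CodeDeploy applications (names are placeholders):

{
  "name": "Deploy",
  "actions": [
    {
      "name": "Deploy-API",
      "runOrder": 1,
      "actionTypeId": {"category": "Deploy", "owner": "AWS", "provider": "CodeDeploy", "version": "1"},
      "inputArtifacts": [{"name": "BuildOutput"}],
      "configuration": {"ApplicationName": "MyApp-API", "DeploymentGroupName": "Production"}
    },
    {
      "name": "Deploy-Web",
      "runOrder": 1,
      "actionTypeId": {"category": "Deploy", "owner": "AWS", "provider": "CodeDeploy", "version": "1"},
      "inputArtifacts": [{"name": "BuildOutput"}],
      "configuration": {"ApplicationName": "MyApp-Web", "DeploymentGroupName": "Production"}
    },
    {
      "name": "Deploy-Worker",
      "runOrder": 1,
      "actionTypeId": {"category": "Deploy", "owner": "AWS", "provider": "CodeDeploy", "version": "1"},
      "inputArtifacts": [{"name": "BuildOutput"}],
      "configuration": {"ApplicationName": "MyApp-Worker", "DeploymentGroupName": "Production"}
    }
  ]
}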

Cross-Account Deployment

Deploy CloudFormation stack from Account 1 (Pipeline) to Account 2 (Target).

Setup Requirements

Account 1 (Pipeline Account):

  1. Create KMS key for artifact encryption
  2. Create S3 bucket for artifacts
  3. Grant Account 2 access to both

Account 2 (Target Account):

  1. Create cross-account IAM role (assumed by CodePipeline; trust policy sketch below)
  2. Create CloudFormation service role (used by CloudFormation)
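
The cross-account role's trust policy must allow the pipeline account to assume it; a minimal sketch, using 111111111111 as the Account 1 ID:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
    "Action": "sts:AssumeRole"
  }]
}

In practice, scope the principal down to the pipeline's role rather than the whole account.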

Runtime Flow

1. CodePipeline (Account 1) starts
2. Source stage: Get CloudFormation template
3. Upload template to S3 bucket (Account 1, encrypted with KMS)
4. Deploy stage: CodePipeline assumes CrossAccountRole (Account 2)
5. Download template from Account 1 S3 (decrypt with KMS)
6. Call CloudFormation API in Account 2
7. CloudFormation assumes CloudFormationServiceRole
8. CloudFormation creates resources in Account 2

Two Roles in Account 2

CrossAccountCloudFormationRole:

  • Assumed by: CodePipeline in Account 1
  • Purpose: Allow CodePipeline to call CloudFormation APIs
  • Permissions: CloudFormation actions, S3 read, KMS decrypt

CloudFormationServiceRole:

  • Assumed by: CloudFormation service
  • Purpose: Create actual AWS resources
  • Permissions: EC2, RDS, S3, etc. (whatever stack needs)
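
In the pipeline definition, the two roles appear in different places: the action-level roleArn is the role CodePipeline assumes in Account 2, while the configuration RoleArn is the service role CloudFormation itself uses. A sketch of the deploy action, with 222222222222 as the target account and placeholder names:

{
  "name": "Deploy",
  "actions": [{
    "name": "CrossAccountDeploy",
    "actionTypeId": {
      "category": "Deploy",
      "owner": "AWS",
      "provider": "CloudFormation",
      "version": "1"
    },
    "roleArn": "arn:aws:iam::222222222222:role/CrossAccountCloudFormationRole",
    "inputArtifacts": [{"name": "SourceOutput"}],
    "configuration": {
      "ActionMode": "CREATE_UPDATE",
      "StackName": "my-app-stack",
      "TemplatePath": "SourceOutput::template.yaml",
      "Capabilities": "CAPABILITY_NAMED_IAM",
      "RoleArn": "arn:aws:iam::222222222222:role/CloudFormationServiceRole"
    }
  }]
}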

Example: Complete Pipeline

{
  "pipeline": {
    "name": "MyWebAppPipeline",
    "roleArn": "arn:aws:iam::111111111111:role/CodePipelineRole",
    "artifactStore": {
      "type": "S3",
      "location": "my-pipeline-artifacts"
    },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "SourceAction",
          "actionTypeId": {
            "category": "Source",
            "owner": "ThirdParty",
            "provider": "GitHub",
            "version": "1"
          },
          "configuration": {
            "Owner": "myusername",
            "Repo": "my-app",
            "Branch": "main"
          },
          "outputArtifacts": [{"name": "SourceOutput"}]
        }]
      },
      {
        "name": "Build",
        "actions": [{
          "name": "BuildAction",
          "actionTypeId": {
            "category": "Build",
            "owner": "AWS",
            "provider": "CodeBuild",
            "version": "1"
          },
          "configuration": {"ProjectName": "MyAppBuild"},
          "inputArtifacts": [{"name": "SourceOutput"}],
          "outputArtifacts": [{"name": "BuildOutput"}]
        }]
      },
      {
        "name": "Deploy",
        "actions": [{
          "name": "DeployAction",
          "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "CodeDeploy",
            "version": "1"
          },
          "inputArtifacts": [{"name": "BuildOutput"}],
          "configuration": {
            "ApplicationName": "MyApp",
            "DeploymentGroupName": "Production"
          }
        }]
      }
    ]
  }
}

Create Pipeline via CLI

aws codepipeline create-pipeline --cli-input-json file://pipeline.json

Common Commands

# Start pipeline execution
aws codepipeline start-pipeline-execution --name MyPipeline

# Get pipeline status
aws codepipeline get-pipeline-state --name MyPipeline

# List pipelines
aws codepipeline list-pipelines

# Update pipeline
aws codepipeline update-pipeline --cli-input-json file://updated-pipeline.json

# Delete pipeline
aws codepipeline delete-pipeline --name MyPipeline

Best Practices

  1. Use separate stages for different environments (Dev, Staging, Prod)
  2. Add manual approval before production deployment
  3. Enable EventBridge (formerly CloudWatch Events) notifications for pipeline state changes
  4. Use Parameter Store or Secrets Manager for sensitive data
  5. Implement automated testing in Test stage
  6. Use cross-account deployment for security isolation
  7. Enable artifact encryption with KMS
  8. Tag pipelines for cost tracking and organization
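
For example, tagging a pipeline (point 8) from the CLI, with placeholder ARN and tag values:

aws codepipeline tag-resource \
  --resource-arn arn:aws:codepipeline:us-east-1:111111111111:MyPipeline \
  --tags key=team,value=platform key=environment,value=production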

Pricing

  • Pipeline: $1 per active pipeline per month
  • Active pipeline: a pipeline that has existed for more than 30 days and has had at least one code change run through it during the month (new pipelines are free for their first 30 days)
  • Free tier: 1 free active pipeline per month
  • Additional costs: S3 storage for artifacts, KMS encryption, CodeBuild minutes

Notes

  • Minimum 2 stages required (Source plus at least one other)
  • Stages execute sequentially
  • Actions within the same stage can run in parallel or sequentially, depending on their runOrder
  • Artifacts are automatically encrypted and stored in S3
  • Failed stages can be retried manually, or automatically if retry on failure is configured (see the command sketch below)
  • Can integrate with third-party tools (Jenkins, GitHub, etc.)
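
A failed stage can be retried from the CLI; the execution ID comes from get-pipeline-state (values below are placeholders):

aws codepipeline retry-stage-execution \
  --pipeline-name MyPipeline \
  --stage-name Deploy \
  --pipeline-execution-id 12345678-1234-1234-1234-123456789012 \
  --retry-mode FAILED_ACTIONS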