
Support injecting default environment variables into steps

Open · joerg opened this issue 1 year ago · 2 comments

Atlantis supports injecting certain default environment variables into run steps; see the Notes block at https://www.runatlantis.io/docs/custom-workflows.html#custom-run-command. These variables are partially generated from information in the Digger/Atlantis workflow, such as the project name, so they can't simply be injected from outside. A nice use case would be setting the state backend depending on the workflow:

projects:
- name: production
  dir: .
  workflow: default
- name: sandbox
  dir: .
  workflow: default

workflows:
  default:
    plan:
      steps:
      - init:
        extra_args: ["-backend-config="key=${BASE_REPO_NAME}/${PROJECT_NAME}.tfstate"]
      - plan:
        extra_args: ["-var-file=configs/${PROJECT_NAME}.tfvars"]

This single workflow allows running multiple environments or deployments from a single directory of Terraform code: the states are separated automatically based on the Digger project name (for example, the sandbox project above would end up with the state key ${BASE_REPO_NAME}/sandbox.tfstate), and different Terraform variables can be set per project.

As discussed with @motatoes on Slack, Digger should support at least the same set of environment variables Atlantis does, to be on par with its configuration here. As a side note, in Atlantis these injected default variables are only available in run steps, not in init/plan/apply steps. This is sufficient because a run step can set an environment variable like TF_CLI_ARGS_plan="-backend-config=\"key=${BASE_REPO_NAME}/${PROJECT_NAME}.tfstate\"", but of course supporting them in init/plan/apply steps directly would be even more convenient.
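For what it's worth, on the Atlantis side the same effect can also be achieved with an env custom step, whose command has the same default variables available as run steps do. A minimal sketch based on the Atlantis docs linked above:

workflows:
  default:
    plan:
      steps:
      - env:
          name: TF_CLI_ARGS_plan
          command: 'echo "-backend-config=\"key=${BASE_REPO_NAME}/${PROJECT_NAME}.tfstate\""'
      - init
      - plan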

The Atlantis code for these variables can be found here: https://github.com/runatlantis/atlantis/blob/main/server/core/runtime/run_step_runner.go#L39.

joerg avatar Oct 25 '23 14:10 joerg

Thanks for filing @joerg! Super helpful

ZIJ avatar Oct 26 '23 10:10 ZIJ

Not the neatest solution, but I solved this with some yq magic before Digger runs.

It supports env_vars defined per workflow as well as expanding environment variables available in the runner's context. It might not be perfect; it hasn't had thorough testing by any means. But hopefully this helps somebody.

It would need some tweaks to support ${PROJECT_NAME} though.

yq -i '
  explode(.) |
  ( .workflows[] |= (. as $workflow | .. | select(tag == "!!str") |= ($workflow.env_vars.state[] as $env ireduce (.; sub("\${?" + $env.name + "}?", $env.value)))) ) |
  (del(."x-*")) |
  (.. | select(tag == "!!str")) |= envsubst(nu)
' digger.yaml
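
For illustration, given a trimmed digger.yaml like the one below (names assumed for the example), the command first substitutes each workflow's own env_vars into that workflow's strings, and envsubst then expands whatever remains from the runner's environment:

workflows:
  dev:
    env_vars:
      state:
        - name: TF_STATE_KEY
          value: dev
    plan:
      steps:
        - init:
          extra_args:
            - -backend-config=key=${TF_STATE_KEY}/terraform.tfstate

After the rewrite, the extra_args entry reads -backend-config=key=dev/terraform.tfstate.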

A full workflow might look something like the following:

name: Digger Job

on:
  workflow_call:
    inputs:
      configFile:
        required: false
        type: string
        default: digger.yaml
      mode:
        required: false
        type: string
    secrets:
      SLACK_WEBHOOK_URL:
        required: true

concurrency:
  group: ${{ github.workflow }}
  cancel-in-progress: false

jobs:
  digger-job:
    runs-on: ubuntu-latest
    timeout-minutes: 60
    permissions:
      actions: write       # required for plan persistence
      contents: write      # required to merge PRs
      id-token: write      # required for workload-identity-federation
      pull-requests: write # required to post PR comments
      statuses: write      # required to validate combined PR status
    steps:
      - uses: actions/checkout@v4
        if: github.event_name == 'issue_comment'
        with:
          clean: false
          ref: refs/pull/${{ github.event.issue.number }}/merge

      - uses: actions/checkout@v4
        if: github.event_name != 'issue_comment'
        with:
          clean: false

      - run: |
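          # stage the selected config as ./digger.yaml, dropping any other digger config from the checkout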
          mv ${{ inputs.configFile }} "${RUNNER_TEMP}/digger.yaml"
          rm -f digger.yml digger.yaml
          mv "${RUNNER_TEMP}/digger.yaml" .

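      # export all repository variables and secrets into the job environment so yq's envsubst can expand them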
      - uses: oNaiPs/secrets-to-env-action@v1
        with:
          secrets: ${{ toJSON(vars) }}

      - uses: oNaiPs/secrets-to-env-action@v1
        with:
          secrets: ${{ toJSON(secrets) }}

      - uses: mikefarah/yq@v4.x.x
        with:
          cmd: >-
            yq -i '
              explode(.) |
              ( .workflows[] |= (. as $workflow | .. | select(tag == "!!str") |= ($workflow.env_vars.state[] as $env ireduce (.; sub("\${?" + $env.name + "}?", $env.value)))) ) |
              (del(."x-*")) |
              (.. | select(tag == "!!str")) |= envsubst(nu)
            ' digger.yaml

      - uses: diggerhq/digger@vLatest
        env:
          GITHUB_CONTEXT: ${{ toJson(github) }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TF_VAR_ci: true
        with:
          configure-checkout: false
          disable-locking: true
          drift-detection-slack-notification-url: ${{ secrets.SLACK_WEBHOOK_URL }}
          mode: ${{ inputs.mode }}
          no-backend: true
          setup-tfenv: true
          upload-plan-destination: ${{ inputs.mode != 'drift-detection' && 'github' }}
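
Since the job is exposed via workflow_call, a caller might look something like this (file path and trigger set are assumptions for illustration):

name: Digger

on:
  pull_request:
  issue_comment:
    types: [created]

jobs:
  digger:
    uses: ./.github/workflows/digger-job.yml
    secrets:
      SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}

Depending on repository defaults, the caller may also need to grant the permissions listed in the job above, since a called workflow's token can't exceed the caller's.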

Found this extremely useful in achieving a DRY configuration. The full digger.yaml:

collect_usage_data: false
auto_merge: true

x-common-variables:
  project: &project
    dir: .
    aws_role_to_assume:
      state: &aws_role arn:aws:iam::xxxxxxxxxxxx:role/terraform-digger-poc
      command: *aws_role
  workflow: &workflow
    plan:
      steps:
        - &init
          init:
          extra_args:
            - -backend-config=region=${AWS_REGION}
            - -backend-config=bucket=terraform-xxxxxxxxxxxx-${AWS_REGION}-tfstate
            - -backend-config=key=terraform-digger-poc/${TF_STATE_KEY}/terraform.tfstate
            - -backend-config=assume_role=null
            - -reconfigure
        - plan
    apply:
      steps:
        - *init
        - apply

projects:
  - <<: *project
    name: &name dev-us-east-1
    workflow: *name
  - <<: *project
    name: &name dev-eu-west-2
    workflow: *name
  - <<: *project
    name: &name prod-us-east-1
    workflow: *name
  - <<: *project
    name: &name prod-eu-west-2
    workflow: *name

workflows:
  dev-us-east-1:
    <<: *workflow
    env_vars:
      state: &env_vars
        - name: AWS_REGION
          value: us-east-1
        - name: TF_CLI_ARGS_plan
          value: -var-file terraform.dev-us-east-1.tfvars
        - name: TF_STATE_KEY
          value: dev
        - name: TF_VAR_super_sensitive_password
          value_from: DEV_US_EAST_1_SUPER_SENSITIVE_PASSWORD
      commands: *env_vars

  dev-eu-west-2:
    <<: *workflow
    env_vars:
      state: &env_vars
        - name: AWS_REGION
          value: eu-west-2
        - name: TF_CLI_ARGS_plan
          value: -var-file terraform.dev-eu-west-2.tfvars
        - name: TF_STATE_KEY
          value: dev
        - name: TF_VAR_super_sensitive_password
          value_from: DEV_EU_WEST_2_SUPER_SENSITIVE_PASSWORD
      commands: *env_vars

  prod-us-east-1:
    <<: *workflow
    env_vars:
      state: &env_vars
        - name: AWS_REGION
          value: us-east-1
        - name: TF_CLI_ARGS_plan
          value: -var-file terraform.prod-us-east-1.tfvars
        - name: TF_STATE_KEY
          value: prod
        - name: TF_VAR_super_sensitive_password
          value_from: PROD_US_EAST_1_SUPER_SENSITIVE_PASSWORD
      commands: *env_vars

  prod-eu-west-2:
    <<: *workflow
    env_vars:
      state: &env_vars
        - name: AWS_REGION
          value: eu-west-2
        - name: TF_CLI_ARGS_plan
          value: -var-file terraform.prod-eu-west-2.tfvars
        - name: TF_STATE_KEY
          value: prod
        - name: TF_VAR_super_sensitive_password
          value_from: PROD_EU_WEST_2_SUPER_SENSITIVE_PASSWORD
      commands: *env_vars
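
A note on the repeated &env_vars anchors: YAML resolves each alias against the most recently defined anchor of that name, so every workflow's commands: *env_vars simply mirrors its own state list. After yq's explode(.), dev-us-east-1 resolves to roughly this (trimmed):

workflows:
  dev-us-east-1:
    plan:
      steps:
        - init:
          extra_args:
            - -backend-config=region=${AWS_REGION}
        - plan
    env_vars:
      state:
        - name: AWS_REGION
          value: us-east-1
      commands:
        - name: AWS_REGION
          value: us-east-1

The subsequent substitution stage then rewrites ${AWS_REGION} inside that workflow to us-east-1.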

kieranbrown avatar Feb 02 '24 12:02 kieranbrown