
Terraform Build command

Open DevOpsBoondoggles opened this issue 2 years ago • 9 comments

Terraform Version

Terraform v1.5.1
on windows_386

Your version of Terraform is out of date! The latest version
is 1.5.7. You can update by downloading from https://www.terraform.io/downloads.html

Use Cases

For most types of non-infrastructure code, the idea in a CI pipeline is Build Once, Deploy Many: in the first step you download all the files you need from package repositories, bundle them up, and then promote that bundle through the environments. For Terraform (and other IaC tools) this doesn't seem to be standard.
In every environment, each of which has a different VM, we download a fresh copy of the providers and modules from their sources. This means we Build Many, Deploy Many, which isn't the best.
I think we can encourage and support a better pattern.

Attempted Solutions

Existing solutions just run zip in a pipeline step.

Proposal

I am suggesting a terraform build command that pulls down all the providers and modules and then zips them up.

Run "terraform build" and it will init without a backend to pull down all the providers, modules, etc.
Then it zips them into one file so you can upload it in your CI system as a pipeline artefact to be pulled down through the environments.

This would be especially useful in older enterprises that may have days of lag between deploying dev and prod, as it would help reduce the risk of supply-chain corruption and incorrect versions between the environment stages.
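The proposed behaviour can be approximated today in a build stage. A minimal sketch, with the terraform call commented out so the packaging step is self-contained, and with `tf-deps.tar.gz` and the mocked directory layout purely illustrative:

```shell
set -eu

# Step 1 (commented out here): fetch providers and modules without a backend.
# terraform init -backend=false -input=false

# The init step would populate .terraform/; mock that layout so the
# packaging step below is demonstrable on its own:
mkdir -p .terraform/providers .terraform/modules
touch .terraform.lock.hcl

# Step 2: bundle the dependencies plus the lock file into one artefact
# that the CI system can promote through the environments.
tar -czf tf-deps.tar.gz .terraform .terraform.lock.hcl
```

A `terraform build` command would essentially wrap these two steps into one.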

References

I couldn't see any

DevOpsBoondoggles avatar Sep 11 '23 10:09 DevOpsBoondoggles

Hi @gabrielmccoll,

Thanks for filing the request. If I understand correctly, you are looking for a method to initialize Terraform without interacting with the backend configuration. That would normally be done by using terraform init -backend=false -- does that not do what you are looking for in this case?

Thanks!

jbardin avatar Sep 11 '23 13:09 jbardin

Hi there! I wasn't. I was trying to offer an idea: make Terraform produce a distributable package of dependencies so that it can be deployed many times from one build.

DevOpsBoondoggles avatar Sep 11 '23 13:09 DevOpsBoondoggles

This sounds like the old terraform-bundle tool.

I can see why it would be desirable to have an easy way to obtain a zip of all the providers and modules in a CI system that does not let you persist artefacts between stages. I'm not sure whether that's the situation @gabrielmccoll is describing - if you're able to persist a zip artefact between stages, why not persist the whole .terraform directory?
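For what it's worth, persisting the whole .terraform directory is mostly just archiving. A sketch of both sides of that hand-off (paths and the archive name are illustrative; in a real pipeline the archive travels through the CI artefact store, and the terraform calls are commented out so the sketch runs on its own):

```shell
set -eu

# Build stage (would normally follow `terraform init -backend=false`):
mkdir -p .terraform/providers .terraform/modules   # mock of an init'd dir
touch .terraform.lock.hcl
tar -czf artefact.tar.gz .terraform .terraform.lock.hcl

# ...artefact.tar.gz is uploaded, then downloaded by a later stage...

# Deploy stage: restore the dependencies instead of re-downloading them.
rm -rf .terraform .terraform.lock.hcl
tar -xzf artefact.tar.gz
# terraform init   # reuses the already-installed providers per the lock file
```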

kmoe avatar Sep 11 '23 13:09 kmoe

So in my case each environment is on a segregated private network, and therefore each build machine would be different. The zip would be uploaded to the pipeline system and each environment would pull down the whole thing. Just trying to encourage a better practice, you know? Hopefully I'm making sense.

DevOpsBoondoggles avatar Sep 11 '23 13:09 DevOpsBoondoggles

Could you use Packer/Vagrant to create some base OS images of your VMs that include the provider/packages you want, then roll out that?

OneCricketeer avatar Sep 12 '23 14:09 OneCricketeer

Could you use Packer/Vagrant to create some base OS images of your VMs that include the provider/packages you want, then roll out that?

Heya, thanks for the ideas! Yeah, I already have a method of doing it via the Azure DevOps pipeline and extra steps. I guess I just thought it seemed like a way of making IaC closer to just C if you had inbuilt dependency bundling.

If people aren't into it that's okay though.

DevOpsBoondoggles avatar Sep 12 '23 15:09 DevOpsBoondoggles

The idea seems fine; it just overlaps with existing HashiCorp tools, in my opinion, based on the description given.

Similarly, you could build Docker images with all the packages you need, and deploy those using Terraform modules such as the AWS Fargate / Azure Containers / Kubernetes + Helm / Nomad providers.

OneCricketeer avatar Sep 12 '23 16:09 OneCricketeer

We use BitBucket Pipelines to run the plan and apply processes. To help with this, we use BitBucket Pipeline caches, with cache keys to ensure that the content we cache is tied to the files that control which versions are used.

This is by no means a perfect setup, but it is working well for us.

definitions:
  caches:
    terraform-plugins:
      key:
        files:
          - .terraform.lock.hcl
          - versions.tf
      path: /root/.terraform.d/plugin-cache
    terraform-local-plugins: .terraform/plugins
    terraform-local-providers:
      key:
        files:
          - .terraform.lock.hcl
          - versions.tf
      path: .terraform/providers
    terraform-modules: .terraform/modules
    tflint-plugins:
      key:
        files:
          - .tflint.hcl
          - .pre-commit-config.yaml
      path: /root/.tflint.d/plugins
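One detail worth noting: the /root/.terraform.d/plugin-cache path in the cache definitions above is only used by Terraform if the plugin cache is actually enabled, e.g. via the TF_PLUGIN_CACHE_DIR environment variable (assumption: the poster's pipeline sets this somewhere before running init). A minimal sketch:

```shell
# Point Terraform at a shared provider cache; init will then link/copy
# providers from here instead of downloading them each time.
export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"
mkdir -p "$TF_PLUGIN_CACHE_DIR"

# terraform init   # would now populate and reuse the cache directory
```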

So, every time we run the pipeline, we see ...

Terraform .tfvars files
-----------------------
 - terraform.tfvars
 
Initializing the backend...
Successfully configured the backend "s3"! Terraform will automatically
use this backend unless the backend configuration changes.
Initializing modules...
Initializing provider plugins...
- Reusing previous version of hashicorp/aws from the dependency lock file
- Reusing previous version of cloudflare/cloudflare from the dependency lock file
- Reusing previous version of integrations/github from the dependency lock file
- Reusing previous version of hashicorp/random from the dependency lock file
- Reusing previous version of hashicorp/archive from the dependency lock file
- Using previously-installed hashicorp/aws v4.64.0
- Using previously-installed cloudflare/cloudflare v4.4.0
- Using previously-installed integrations/github v5.23.0
- Using previously-installed hashicorp/random v3.5.1
- Using previously-installed hashicorp/archive v2.3.0
Terraform has been successfully initialized!

From what I understand, there is no downloading, and all modules are retained at the versions we asked for.

Admittedly, BitBucket clears the cache weekly (and you can do it manually).

It's a little better, I think, than nothing.

rquadling avatar Sep 29 '23 09:09 rquadling

I can't get BitBucket Pipelines to support caching for multiple stages (something like path: **/.terraform/providers). Is this a known limitation? Has anyone had the same challenge and found a solution?

nika-pr avatar Nov 29 '24 15:11 nika-pr