pretf
Terraform modules in Python
Pretf can be used to create Terraform modules, but only as a code generator where you'd commit the JSON it generates. It won't work at run-time with module variables, because Terraform uses interpolation and provider API calls to pass dynamic values into a module, and Pretf does not operate at that level.
But what might work is reusable Python functions that take some arguments and then yield a bunch of resources.
from some_module import security_group, waf_load_balancer


def terraform(var):
    # Yield a single block and keep a reference to it.
    sg = yield security_group(name="test")

    # Yield every block produced by the reusable function.
    yield from waf_load_balancer(
        name=var.name,
        public=var.public,
        security_group_ids=[sg.id],
        whitelist=var.whitelist,
    )
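For illustration, here's a rough sketch of what those reusable functions might look like using pretf's `tf` helper. The resource types, attribute names, and the idea of putting the name into each resource address are assumptions for the sake of the example, not part of Pretf.

```python
from pretf.api import tf


def security_group(name):
    # Return a single block; the caller yields it and keeps a reference to it.
    return tf(f"resource.aws_security_group.{name}", {"name": name})


def waf_load_balancer(name, public, security_group_ids, whitelist):
    # Using the name in each resource address keeps repeated calls from clashing.
    yield tf(f"resource.aws_lb.{name}", {
        "name": name,
        "internal": not public,
        "security_groups": security_group_ids,
    })
    yield tf(f"resource.aws_security_group_rule.{name}", {
        "type": "ingress",
        "cidr_blocks": whitelist,
    })
```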
- Resource name conflicts are not a problem in real modules because Terraform keeps everything namespaced inside the module. Perhaps all resources created this way should use the name as a prefix, as in the sketch above.
- Variables and outputs yielded by the function would end up in the root stack. This is probably bad.
- How to get useful objects back from the function? It needs to yield all resources, but some in particular are more useful to assign to a variable: e.g. make the load balancer resource easy to access, but not the security group rules, while still yielding all of them so they end up in the JSON (see the sketch after this list).
- Such modules could be published to PyPI, but I don't think I like that idea. I also don't like the idea of having a bespoke registry just for this.
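One possible answer to the "useful objects" question, sketched here under the same assumptions as the sketch above: the function yields every block so it all lands in the JSON, but returns the one block callers usually want, and `yield from` hands that return value back.

```python
from pretf.api import tf


def waf_load_balancer(name, security_group_ids):
    lb = yield tf(f"resource.aws_lb.{name}", {"name": name, "security_groups": security_group_ids})
    yield tf(f"resource.aws_security_group_rule.{name}", {"type": "ingress"})
    # Every block has been yielded, but only the load balancer is returned.
    return lb


def terraform(var):
    # "yield from" yields all of the blocks and also captures the return value.
    lb = yield from waf_load_balancer(name=var.name, security_group_ids=[])
    yield tf("output.lb_id", {"value": lb.id})
```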
This needs a lot of consideration before implementing anything. It could be explored in projects using Pretf, and then whatever works well could be brought into the Pretf codebase.
Current idea:
The Collections API uses a decorator to create a Collection object. Defining a collection looks the same as any other Pretf terraform function; you just apply a decorator to turn it into a collection.
from pretf.api import tf
from pretf.collections import collect


@collect
def waf_load_balancer(var):
    # Define variables that must be passed in when calling the collection.
    yield tf("variable.name", {"type": "string"})

    # Use var to access those variables.
    lb = yield tf("load balancer stuff", {"name": var.name})
    yield tf("security group stuff")
    yield tf("waf stuff")

    # Define outputs, which become available as attributes on the collection.
    yield tf("output.id", {"value": lb.id})
Using a collection:
from some_module import waf_load_balancer


def terraform(var):
    # Call the function to get back a Collection object. Keyword arguments
    # populate the var object that gets passed into the decorated function.
    # Yielding the collection (or using yield from) includes its blocks in
    # the JSON, excluding variables and outputs.
    lb = yield waf_load_balancer(name="test", ...)

    # Access lb.id and any other outputs using attribute or dict access.
    lb_id = lb.id
To avoid complicating the implementation, nothing will be done to alter resource names to ensure uniqueness. Instead, creators of collections will need to include a variable in each resource name, and the user of the collection is then responsible for, and able to, make them unique (sketched below).
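As a rough sketch of that naming convention (the resource type and the blue/green names are just for illustration): the collection puts var.name into the resource address, so two instances only collide if the caller passes the same name.

```python
from pretf.api import tf
from pretf.collections import collect


@collect
def load_balancer(var):
    yield tf("variable.name", {"type": "string"})
    # var.name is part of the resource address, so uniqueness is up to the caller.
    lb = yield tf(f"resource.aws_lb.{var.name}", {"name": var.name})
    yield tf("output.id", {"value": lb.id})


def terraform(var):
    # Two instances with different names produce distinct resources.
    blue = yield load_balancer(name="blue")
    green = yield load_balancer(name="green")
    yield tf("output.lb_ids", {"value": [blue.id, green.id]})
```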
I've started implementing that idea.
- [x] basic collections support
- [x] nested collections
- [x] ability to yield and yield from collections
- [x] documentation and examples
- [ ] tutorial using recursive S3 objects
Has this been implemented? I started using pretf and hit a wall when it came to creating modules with dynamic definitions. Awesome work on this, by the way!
Thanks! The checklist above is accurate, there's just no tutorial. It's all released and usable.
https://pretf.readthedocs.io/en/latest/api/collections/
https://github.com/raymondbutcher/pretf/blob/master/examples/aws/s3.tf.py
https://github.com/raymondbutcher/pretf/blob/master/tests/test_collections.py