AWS DOCKER SWARM SETUP
AWS setup for launching a Docker Swarm cluster on 3 nodes (1 manager, 2 workers) and a controller, bootstrapped with Terraform.
Infrastructure

The setup stands up a Docker Swarm of 1 manager and 2 worker nodes inside a VPC, plus a separate controller node running Portainer as a UI over the Docker Engine.
Dependencies
- Terraform
- AWS CLI (configured with valid credentials)
Configuration
- Create an IAM user and get its `ACCESS KEY` & `SECRET ACCESS KEY` from the AWS Console
- Run `aws configure` and add the `ACCESS KEY` & `SECRET ACCESS KEY`
- Change the region and availability zone in the `variables.tf` file if you wish to launch the setup in another region; it currently defaults to us-east-1 (a sketch of these variables follows this list)
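A minimal sketch of what the region settings in `variables.tf` might look like; the variable names here are assumptions, so check the repo's actual file:

```hcl
# Hypothetical variables.tf excerpt -- names are illustrative.
variable "region" {
  description = "AWS region to launch the setup in"
  default     = "us-east-1"
}

variable "availability_zone" {
  description = "Availability zone for the nodes"
  default     = "us-east-1a"
}
```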
Usage
- Run the init script, which creates an S3 bucket for storing Terraform remote state. Change the bucket name in the script first.

```bash
./init_aws.sh
```
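The repo's script isn't reproduced here, but a minimal sketch of what such an init script does, assuming a hypothetical bucket name:

```bash
#!/usr/bin/env bash
# Hypothetical init script -- creates the S3 bucket that will hold
# Terraform remote state. The bucket name is illustrative; use the
# one configured in the repo.
set -euo pipefail

BUCKET="my-terraform-state-bucket"
REGION="us-east-1"

aws s3api create-bucket --bucket "$BUCKET" --region "$REGION"

# Versioning guards against accidental loss of state files.
aws s3api put-bucket-versioning \
  --bucket "$BUCKET" \
  --versioning-configuration Status=Enabled
```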
- Launch the global resources, which contain the SSH key. Change the key path in `ssh_key.tf` (a sketch of that file follows the commands).

```bash
cd global
terraform apply
```
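A sketch of the kind of resource `ssh_key.tf` declares; the resource and key names are assumptions:

```hcl
# Hypothetical ssh_key.tf -- names are illustrative. Point
# public_key at your own key before applying.
resource "aws_key_pair" "swarm" {
  key_name   = "swarm-key"
  public_key = file("~/.ssh/id_rsa.pub") # change this path
}
```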
- Launch the VPC. Change values in `variables.tf` as needed (a rough sketch follows the commands).

```bash
cd vpc
terraform apply
```
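For orientation, a rough sketch of the kind of VPC this step defines; the CIDR blocks and names are assumptions, not the repo's actual values:

```hcl
# Hypothetical VPC excerpt -- CIDR blocks are illustrative defaults.
resource "aws_vpc" "main" {
  cidr_block           = "10.0.0.0/16"
  enable_dns_hostnames = true
}

resource "aws_subnet" "public" {
  vpc_id            = aws_vpc.main.id
  cidr_block        = "10.0.1.0/24"
  availability_zone = "us-east-1a" # should match the zone in variables.tf
}
```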
- Launch the nodes. Change values in `variables.tf` as needed. The nodes bootstrap themselves into a swarm on boot (see the sketch after the commands).

```bash
cd nodes
terraform apply
```
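A hedged sketch of the kind of user-data the nodes might run on boot; the commands are standard Docker Swarm CLI, but the repo's actual bootstrap scripts may differ:

```bash
# Hypothetical manager bootstrap: initialize the swarm. The setup also
# exposes the Docker Engine API on port 2375 for Portainer.
docker swarm init --advertise-addr "$MANAGER_PRIVATE_IP"

# Print the worker join token; how it reaches the workers (S3,
# remote state, etc.) is repo-specific.
docker swarm join-token worker -q

# Hypothetical worker bootstrap: join using the manager's token.
docker swarm join --token "$WORKER_TOKEN" "$MANAGER_PRIVATE_IP:2377"
```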
Outputs to Note
- `manager_ip` - The IP of the manager node, which belongs to a swarm launched on boot-up of the nodes.
  - Services launched via the controller UI can be accessed at `manager_ip:port_specified`.
- `controller_ip` - The controller runs Portainer on port 9000, a UI over the Docker Engine.
  - Hit `controller_ip:9000` and log in.
  - Enter `manager_ip:2375` when asked for the Docker endpoint on login.
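After applying the `nodes` workspace, the outputs can be read back with the standard Terraform CLI:

```bash
# Read the noted outputs from the nodes workspace.
cd nodes
terraform output manager_ip
terraform output controller_ip

# Then open http://<controller_ip>:9000 in a browser, log in to
# Portainer, and point it at <manager_ip>:2375 as the Docker endpoint.
```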