terraform-provider-kubernetes
New data source: kubernetes_all_nodes
Description
Add a new data source, kubernetes_all_nodes, that returns the names of all nodes in the cluster.
Potential Terraform Configuration
main.tf:
data "kubernetes_all_nodes" "test" {
}
output "test" {
value = data.kubernetes_all_nodes.test
}
Output:
test = {
"id" = "d1c9507ab3e33ed071fb9d16472665fab4c99639bec5534a45d28bc1f9d74105"
"nodes" = [
"k8s-master",
"k8s-worker-1",
"k8s-worker-2",
"k8s-worker-3",
]
}
References
Community Note
- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
- If you are interested in working on this issue or have submitted a pull request, please leave a comment
Hi John. Thanks for the issue. Would you mind describing your use case for the proposed kubernetes_all_nodes data source?
https://registry.terraform.io/providers/hashicorp/kubernetes/latest/docs/resources/persistent_volume_claim
A persistent_volume (referenced by a persistent_volume_claim) can declare an affinity for a particular node, for example:
resource "kubernetes_persistent_volume" "consul-server-2" {
metadata {
labels = {
"type" = "local"
}
name = "consul-server-2"
}
spec {
access_modes = [
"ReadWriteOnce",
]
capacity = {
"storage" = "100Gi"
}
mount_options = []
persistent_volume_reclaim_policy = "Delete"
storage_class_name = "manual"
volume_mode = "Filesystem"
node_affinity {
required {
node_selector_term {
match_expressions {
key = "kubernetes.io/hostname"
operator = "In"
values = [
*"k8s-worker-3", <<--------- HERE
]
}
}
}
}
persistent_volume_source {
host_path {
path = "/mnt/data"
}
}
}
}
Absent the data source, these must be hard coded.
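Purely as an illustration of what that could look like (the nodes attribute name, its ordering, and the index below are assumptions based on the example output above, not an implemented API):
data "kubernetes_all_nodes" "all" {
}

# Inside the node_affinity > required > node_selector_term > match_expressions
# block of the resource above, the literal could then become a reference:
values = [
  data.kubernetes_all_nodes.all.nodes[3], # "k8s-worker-3" in the example output
]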
@redeux I have a use case for kubernetes_all_nodes as well. I want to create TLS certificates that cover all of the nodes: with this data source we could read the hostnames of every node and use them to generate self-signed certificates.
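A minimal sketch of that use case, assuming the proposed data source exposes a nodes list as in the example output and a recent hashicorp/tls provider; the data source and its nodes attribute are assumptions, not an implemented API:
data "kubernetes_all_nodes" "all" {
}

resource "tls_private_key" "nodes" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

resource "tls_self_signed_cert" "nodes" {
  private_key_pem = tls_private_key.nodes.private_key_pem

  subject {
    common_name = "cluster-nodes"
  }

  # One self-signed certificate covering every node hostname.
  dns_names = data.kubernetes_all_nodes.all.nodes

  validity_period_hours = 8760
  allowed_uses = [
    "key_encipherment",
    "digital_signature",
    "server_auth",
  ]
}
Whether a certificate per node would be preferable is a separate design choice; the point is only that the hostnames would no longer need to be hard coded.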
Marking this issue as stale due to inactivity. If this issue receives no comments in the next 30 days it will automatically be closed. If this issue was automatically closed and you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. This helps our maintainers find and focus on the active issues. Maintainers may also remove the stale label at their discretion. Thank you!
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.