operator-sdk
ansible runner: ERROR! the role 'hello' was not found ....
Type of question
General operator-related help
Question
What did you do?
I'm trying to run the helloworld Ansible operator. When I run make run, it fails with "ERROR! the role 'hello' was not found in ..."
What did you expect to see?
The operator should execute roles/hello/tasks/main.yml
What did you see instead? Under which circumstances?
$ pwd
/home/kubeuser/operator-helloworld
$ make run
ANSIBLE_ROLES_PATH=":/home/kubeuser/operator-helloworld/roles" /home/kubeuser/operator-helloworld/bin/ansible-operator run
{"level":"info","ts":1660219614.717471,"logger":"cmd","msg":"Version","Go Version":"go1.18.4","GOOS":"linux","GOARCH":"amd64","ansible-operator":"v1.22.2","commit":"da3346113a8a75e11225f586482934000504a60f"}
{"level":"info","ts":1660219614.718415,"logger":"cmd","msg":"Watch namespaces not configured by environment variable WATCH_NAMESPACE or file. Watching all namespaces.","Namespace":""}
{"level":"info","ts":1660219614.921894,"logger":"controller-runtime.metrics","msg":"Metrics server is starting to listen","addr":":8080"}
{"level":"info","ts":1660219614.9222627,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_HELLO_CACHE_HELLO_EXAMPLE_COM","default":2}
{"level":"info","ts":1660219614.9223125,"logger":"cmd","msg":"Environment variable not set; using default value","Namespace":"","envVar":"ANSIBLE_DEBUG_LOGS","ANSIBLE_DEBUG_LOGS":false}
{"level":"info","ts":1660219614.922322,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"cache.hello.example.com","Options.Version":"v1","Options.Kind":"Hello"}
{"level":"info","ts":1660219614.9230807,"logger":"proxy","msg":"Starting to serve","Address":"127.0.0.1:8888"}
{"level":"info","ts":1660219614.9231188,"logger":"apiserver","msg":"Starting to serve metrics listener","Address":"localhost:5050"}
{"level":"info","ts":1660219614.9233184,"msg":"Starting server","path":"/metrics","kind":"metrics","addr":"[::]:8080"}
{"level":"info","ts":1660219614.923336,"msg":"Starting server","kind":"health probe","addr":"[::]:6789"}
{"level":"info","ts":1660219614.9233851,"msg":"Starting EventSource","controller":"hello-controller","source":"kind source: *unstructured.Unstructured"}
{"level":"info","ts":1660219614.9234116,"msg":"Starting Controller","controller":"hello-controller"}
{"level":"info","ts":1660219615.0246503,"msg":"Starting workers","controller":"hello-controller","worker count":8}
{"level":"error","ts":1660219616.1579287,"logger":"runner","msg":"\u001b[0;31mERROR! the role 'hello' was not found in /tmp/ansible-operator/runner/cache.hello.example.com/v1/Hello/operator-testing/hello-sample/project/roles:/tmp/ansible-operator/runner/cache.hello.example.com/v1/Hello/operator-testing/hello-sample/project/roles:/tmp/ansible-operator/runner/cache.hello.example.com/v1/Hello/operator-testing/hello-sample/project\u001b[0m\r\n","job":"8476150515463567770","name":"hello-sample","namespace":"operator-testing","error":"exit status 1"}
{"level":"error","ts":1660219616.1729877,"logger":"reconciler","msg":"Failed to get ansible-runner stdout","job":"8476150515463567770","name":"hello-sample","namespace":"operator-testing","error":"open /tmp/ansible-operator/runner/cache.hello.example.com/v1/Hello/operator-testing/hello-sample/artifacts/8476150515463567770/stdout: no such file or directory","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Reconcile\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:121\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:320\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:273\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:234"}
Environment
Operator type: /language ansible
Kubernetes cluster type:
$ minikube version
minikube version: v1.26.0
commit: f4b412861bb746be73053c9f6d2895f12cf78565
$ operator-sdk version
operator-sdk version: "v1.22.2", commit: "da3346113a8a75e11225f586482934000504a60f", kubernetes version: "1.24.1", go version: "go1.18.4", GOOS: "linux", GOARCH: "amd64"
$ kubectl version --short
Client Version: v1.24.3
Kustomize Version: v4.5.4
Server Version: v1.24.0
Additional context
$ python3.9 -m pip freeze | grep ans
ansible==6.2.0
ansible-core==2.13.2
ansible-runner==2.2.1
ansible-runner-http==1.0.0
$ mkdir operator-helloworld
$ cd operator-helloworld/
$ operator-sdk init --plugins=ansible --domain=hello.example.com
$ operator-sdk create api --group cache --version v1 --kind Hello --generate-role
$ more roles/hello/tasks/main.yml
---
# tasks file for Hello
- name: Hello World Task
  debug:
    msg: "Hello World! I live in a namespace called {{ ansible_operator_meta.namespace }}"
  when: toggle_message
$ more config/samples/cache_v1_hello.yaml
apiVersion: cache.hello.example.com/v1
kind: Hello
metadata:
  name: hello-sample
spec:
  # TODO(user): Add fields here
  toggle_message: true
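For local debugging outside the operator, the role can be exercised with a throwaway playbook placed in the project root (so Ansible finds ./roles next to it). This is a hypothetical harness, not part of the scaffold: the operator normally injects the spec fields and ansible_operator_meta as variables, so both are stubbed here.

```yaml
# test-hello.yml -- hypothetical local harness, not generated by operator-sdk.
# The operator passes CR spec fields (toggle_message) and metadata
# (ansible_operator_meta) to the role as variables; we stub them manually.
- hosts: localhost
  gather_facts: false
  vars:
    toggle_message: true
    ansible_operator_meta:
      namespace: operator-testing
  roles:
    - hello
```

Run it with `ansible-playbook test-hello.yml`; a playbook resolves roles from a roles/ directory adjacent to itself, so no ANSIBLE_ROLES_PATH is needed for this check.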
$ make install
$ k create namespace operator-testing
$ k config set-context --current --namespace operator-testing
$ make run
k create -f config/samples/cache_v1_hello.yaml -n operator-testing
----------------------------------- Post edit -------------------------------------
I think I found the error in the Makefile:

.PHONY: run
run: ansible-operator ## Run against the configured Kubernetes cluster in ~/.kube/config
	ANSIBLE_ROLES_PATH="$(ANSIBLE_ROLES_PATH):$(shell pwd)/roles" $(ANSIBLE_OPERATOR) run

Installing Ansible via pip does not set ANSIBLE_ROLES_PATH at all, so the Makefile expands it to an empty string. Look at the log: there is nothing before the ":".
$ make run
ANSIBLE_ROLES_PATH=":/home/kubeuser/operator-helloworld/roles" /home/kubeuser/operator-helloworld/bin/ansible-operator run
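The faulty expansion can be reproduced in plain shell (a minimal sketch of what the Makefile does when the variable is unset):

```shell
# With ANSIBLE_ROLES_PATH unset, the Makefile-style concatenation
# produces a path list that starts with an empty segment before ":".
unset ANSIBLE_ROLES_PATH
roles_path="${ANSIBLE_ROLES_PATH}:$(pwd)/roles"
echo "$roles_path"
```

The leading empty segment appears to be what trips up role resolution under newer Ansible releases.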
If I export the variable first, for example ANSIBLE_ROLES_PATH=/home/kubeuser/.ansible/roles/, then make run works:
ANSIBLE_ROLES_PATH="/home/kubeuser/.ansible/roles/:/home/kubeuser/operator-helloworld/roles" /home/kubeuser/operator-helloworld/bin/ansible-operator run
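A guard that avoids the empty leading segment is to add the ":" only when the variable is non-empty. This is a sketch of one possible fix, not the actual upstream change; it uses POSIX ${var:+word} expansion:

```shell
# Prepend "ANSIBLE_ROLES_PATH:" only when the variable is non-empty,
# so the resulting path list never starts with an empty segment.
unset ANSIBLE_ROLES_PATH
without="${ANSIBLE_ROLES_PATH:+${ANSIBLE_ROLES_PATH}:}$(pwd)/roles"

ANSIBLE_ROLES_PATH="$HOME/.ansible/roles"
with="${ANSIBLE_ROLES_PATH:+${ANSIBLE_ROLES_PATH}:}$(pwd)/roles"

echo "$without"   # just the project roles dir, no leading ":"
echo "$with"      # both dirs joined with ":"
```

The equivalent guard inside the Makefile itself would presumably use GNU make's $(if ...) function, e.g. $(if $(ANSIBLE_ROLES_PATH),$(ANSIBLE_ROLES_PATH):)$(shell pwd)/roles, but that is an assumption about how the maintainers might choose to fix it.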
{"level":"info","ts":1660223757.0628014,"logger":"cmd","msg":"Version","Go Version":"go1.18.4","GOOS":"linux","GOARCH":"amd64","ansible-operator":"v1.22.2","commit":"da3346113a8a75e11225f586482934000504a60f"}
...
{"level":"info","ts":1660223758.20413,"logger":"logging_event_handler","msg":"[playbook task start]","name":"hello-sample","namespace":"operator-testing","gvk":"cache.hello.example.com/v1, Kind=Hello","event_type":"playbook_on_task_start","job":"3337959214220656527","EventData.Name":"Gathering Facts"}
--------------------------- Ansible Task StdOut -------------------------------
TASK [Gathering Facts] *********************************************************
-------------------------------------------------------------------------------
{"level":"info","ts":1660223759.1950765,"logger":"logging_event_handler","msg":"[playbook debug]","name":"hello-sample","namespace":"operator-testing","gvk":"cache.hello.example.com/v1, Kind=Hello","event_type":"runner_on_ok","job":"3337959214220656527","EventData.TaskArgs":""}
--------------------------- Ansible Task StdOut -------------------------------
TASK [Hello World Task] ********************************
ok: [localhost] => {
"msg": "Hello World! I live in a namespace called operator-testing"
}
-------------------------------------------------------------------------------
{"level":"info","ts":1660223759.4042516,"logger":"runner","msg":"Ansible-runner exited successfully","job":"3337959214220656527","name":"hello-sample","namespace":"operator-testing"}
----- Ansible Task Status Event StdOut (cache.hello.example.com/v1, Kind=Hello, hello-sample/operator-testing) -----
Is this a bug?
I'll try to reproduce this; it might be a change in Ansible 2.13.
Issues go stale after 90d of inactivity.
Mark the issue as fresh by commenting /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.
Exclude this issue from closing by commenting /lifecycle frozen.
If this issue is safe to close now please do so with /close.
/lifecycle stale
Stale issues rot after 30d of inactivity.
Mark the issue as fresh by commenting /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.
Exclude this issue from closing by commenting /lifecycle frozen.
If this issue is safe to close now please do so with /close.
/lifecycle rotten /remove-lifecycle stale
Rotten issues close after 30d of inactivity.
Reopen the issue by commenting /reopen.
Mark the issue as fresh by commenting /remove-lifecycle rotten.
Exclude this issue from closing again by commenting /lifecycle frozen.
/close
@openshift-bot: Closing this issue.
In response to this:

Rotten issues close after 30d of inactivity.
Reopen the issue by commenting /reopen.
Mark the issue as fresh by commenting /remove-lifecycle rotten.
Exclude this issue from closing again by commenting /lifecycle frozen.
/close
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
Was this resolved?