Inconsistency in the behaviour of the $PATH environment variable within Devfile.
Describe the bug
I've observed a potential inconsistency in the behaviour of the $PATH environment variable within a Devfile. When commands are executed via the command definitions in the Devfile, they appear to have a different $PATH than commands launched interactively in the containers defined in the components section.
I'd like to confirm if this is the expected behaviour or if there might be a configuration issue. Perhaps someone can clarify how the $PATH is managed in these different contexts.
$PATH env inside the task:
/home/tooling/.local/share/coursier/bin:/home/tooling/.nvm/versions/node/v18.20.3/bin:/home/tooling/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
$PATH env inside the component container image:
/checode/checode-linux-libc/ubi9/bin/remote-cli:/home/user/.local/bin:/home/user/bin:/home/tooling/.sdkman/candidates/maven/current/bin:/home/tooling/.sdkman/candidates/java/current/bin:/home/tooling/.sdkman/candidates/gradle/current/bin:/home/user/.nvm/versions/node/v18.20.3/bin:/home/tooling/.local/share/coursier/bin:/home/tooling/.nvm/versions/node/v18.20.3/bin:/home/tooling/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
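For reference, one way to reproduce the comparison (the command id and label below are illustrative, not part of the devfiles shown under Steps to reproduce): run an exec command like the following as a task, then compare its output with echo $PATH in a workspace terminal of the same component.
commands:
- id: print-path
  exec:
    label: Print PATH
    component: ssf-developer-image
    commandLine: 'echo "task PATH: $PATH"'
    workingDir: /home/user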
Che version
7.87@latest
Steps to reproduce
Parent devfile:
commands:
- exec:
    commandLine: 'cp /home/user/.docker/.dockerconfigjson /home/user/.config/containers/auth.json 2>/dev/null || :'
    component: ssf-developer-image
    workingDir: /home/user
  id: podman-auth
- exec:
    commandLine: '[ -d .m2 ] && mkdir -p .m2/ && cp -u ${MAVEN_HOME}/conf/settings.xml .m2/'
    component: ssf-developer-image
    workingDir: /home/user
  id: m2-settings
components:
- attributes:
    pod-overrides:
      metadata:
        labels:
          ssf.bit.admin.ch/name: ssf-java
          ssf.bit.admin.ch/type: ssf-developer-image
  container:
    cpuLimit: "6"
    cpuRequest: "1"
    endpoints:
    - exposure: none
      name: kubedock
      protocol: tcp
      targetPort: 2475
    env:
    - name: KUBEDOCK_ENABLED
      value: "true"
    - name: KUBECONFIG
      value: /home/user/.kube/config
    - name: DOCKER_HOST
      value: tcp://127.0.0.1:2475
    - name: SSL_CERT_DIR
      value: /var/run/secrets/kubernetes.io/serviceaccount
    - name: HISTFILE
      value: ~/.history/.bash_history
    - name: MAVEN_CONFIG
      value: -Xmx4G -Xss128M -XX:MetaspaceSize=1G -XX:MaxMetaspaceSize=2G
    - name: TESTCONTAINERS_RYUK_DISABLED
      value: "true"
    - name: TESTCONTAINERS_CHECKS_DISABLE
      value: "true"
    image: ssf-java-image:1.1.1
    memoryLimit: 4G
    memoryRequest: 4G
    mountSources: true
    sourceMapping: /projects
    volumeMounts:
    - name: bin
      path: /home/user/.local/bin
    - name: history
      path: /home/user/.history
    - name: m2
      path: /home/user/.m2
  name: ssf-developer-image
- name: bin
  volume:
    size: 1Gi
- name: history
  volume:
    size: 512Mi
- name: m2
  volume:
    size: 5Gi
events:
  postStart:
  - podman-auth
  - m2-settings
metadata:
  name: ssf-java
  version: 1.0.0
projects:
- git:
    checkoutFrom:
      revision: master
    remotes:
      origin: https://bitbucket/scm/bitssf/ssf-java-starter.git
  name: sample
schemaVersion: 2.2.0
Child devfile:
parent:
  uri: http://devfile-registry.openshift-devspaces-operator.svc.cluster.local:8080/devfiles/ssf-java/devfile.yaml
metadata:
  name: ssf-jeap-sample
  version: 1.0.0
schemaVersion: 2.2.0
commands:
- id: 01-run-tests
  exec:
    label: 01. Run Tests
    component: ssf-developer-image
    commandLine: 'mvn verify'
    workingDir: '${PROJECT_SOURCE}'
- id: 02-live-coding
  exec:
    label: 02. Start Live Coding
    component: ssf-developer-image
    commandLine: 'mvn spring-boot:run -Dspring-boot.run.profiles=local'
    workingDir: '${PROJECT_SOURCE}'
- id: 03-package-app
  exec:
    label: 03. Package App
    component: ssf-developer-image
    commandLine: mvn clean package
    workingDir: '${PROJECT_SOURCE}'
- id: 04-postgresql-run
  exec:
    label: 04. Run Postgres with Podman
    commandLine: 'podman run -p 5432:5432 -e POSTGRESQL_USER=user -e POSTGRESQL_PASSWORD=pass -e POSTGRESQL_DATABASE=db -e POSTGRESQL_ROOT_PASSWORD=root rhel8/postgresql-12:fdef8b12'
    component: ssf-developer-image
    workingDir: '${PROJECT_SOURCE}'
- id: 05-postgresql-install
  exec:
    label: 05. Install Postgres with OpenShift
    commandLine: 'oc apply -f manifest/postgresql.yaml'
    component: ssf-developer-image
    workingDir: '${PROJECT_SOURCE}'
- id: 06-postgresql-3rdparty
  exec:
    label: 06. Install Postgres with Helm Chart
    commandLine: |
      helm upgrade --install db bitnami-helm-proxy/postgresql \
        --set auth.postgresPassword=root \
        --set auth.username=user \
        --set auth.password=pass \
        --set auth.database=db \
        --set primary.persistence.mountPath=/var/lib/pgsql \
        --set primary.persistence.size=1Gi \
        --set postgresqlDataDir=/var/lib/pgsql/data
    component: ssf-developer-image
    workingDir: '${PROJECT_SOURCE}'
Expected behavior
The $PATH seen by a task must have the same value as the $PATH defined inside the image used by the component.
Runtime
OpenShift
Screenshots
No response
Installation method
OperatorHub
Environment
other (please specify in additional context)
Eclipse Che Logs
No response
Additional context
OpenShift on-premise
Not sure if this is related, but it's worth mentioning that tasks/commands executed in Che Code are run with /bin/sh instead of /bin/bash. IIRC, .bashrc and its additions to $PATH will not be loaded by sh, so it's possible this is causing the difference in $PATH.
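One quick way to check which shell a task actually runs under (a sketch; the command id is illustrative) is a task that prints the executing shell:
commands:
- id: print-shell
  exec:
    component: ssf-developer-image
    commandLine: 'echo "invoked as: $0"; readlink /proc/$$/exe'
    workingDir: /home/user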
@eye0fra we could verify whether this is causing the bug by adding source /home/user/.bashrc && to the start of commands. For example:
- exec:
    commandLine: 'source /home/user/.bashrc && [ -d .m2 ] && mkdir -p .m2/ && cp -u ${MAVEN_HOME}/conf/settings.xml .m2/'
    component: ssf-developer-image
    workingDir: /home/user
  id: m2-settings
Adding source /home/user/.bashrc would probably solve it if .bashrc is where the PATH export is defined; however, if the developer image itself already provides the PATH, the discrepancy remains.
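For illustration, the kind of .bashrc line this workaround depends on (the paths are examples, not taken from the actual image):
export PATH="$HOME/.local/bin:$HOME/.sdkman/candidates/maven/current/bin:$PATH"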
I noticed that a command defined in the devfile, triggered during the postStart event, does not share the same environment variables as it would if launched normally inside the workspace.
@eye0fra This has a similar but distinct cause: postStart events are launched with /bin/sh inside the container, not bash. The workaround is to add a postStart event that calls source ~/.bashrc. IIRC, postStart events are called in the order they are defined, and they all share the same shell "context", so adding a postStart event that initializes the bash environment variables as the first event should fix things.
For example:
events:
  postStart:
  - load-environment
  - my-command-that-needs-bash-env
  - my-other-command
commands:
- id: load-environment
  exec:
    component: tools
    commandLine: source ~/.bashrc
- id: my-command-that-needs-bash-env
  exec:
    component: tools
    commandLine: <some-command>
- id: my-other-command
  exec:
    component: tools
    commandLine: <some-other-command>
You could also define the load-environment in a parent devfile, which would ensure the command is always provided with minimal user effort.
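A minimal sketch of that parent-devfile approach (the tools component name is carried over from the example above; a child devfile would then list load-environment first in its own postStart events):
commands:
- id: load-environment
  exec:
    component: tools
    commandLine: source ~/.bashrc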
We could potentially take an approach similar to Che Code's to resolve this in the DevWorkspace Operator. However, I'm a bit hesitant to do so because it relies on the assumption that the tooling container uses bash, which is not always the case; some container images only provide sh or zsh.
However, if this workaround is not suitable and you would like to see this feature in DWO, I think this issue should be closed and a new Che issue opened (something like "postStart events are missing $PATH environment variables").
Yes, I’m aware of the workaround, but as you mentioned, the challenge arises when dealing with multiple shell environments, which is my case. It becomes necessary to restore the SHELL environment variable to avoid unexpected terminal issues.
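For example, the SHELL restore could be an env entry on the container component (the value here is an assumption for illustration):
env:
- name: SHELL
  value: /bin/bash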
I agree that opening a dedicated issue to address this would be a good idea.
@eye0fra In this case, this merits opening a new Che issue. I'm happy to do so for you, however, I'm not sure of the specific details of your problem when using multiple shell environments. Do you want me to open an issue and then you can provide those details in a comment or would you like to open the problem yourself?
I will open a new one and put the details
@eye0fra Sounds great, thank you :)