OCPBUGS-18331: Include sshd config directories in remediation template
The generic sshd configuration file we were using to generate sshd remediations for OpenShift didn't include the sshd_config.d/ directory. This is a problem for clusters that spread their sshd configuration across files in that directory instead of keeping everything in a single sshd configuration file.
This could lead to cases where applying sshd hardening remediations breaks ssh in unexpected ways (e.g., not being able to ssh into the cluster because the ssh keys are no longer accessible if they're configured under sshd_config.d/).
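For illustration, this is a minimal sketch of the kind of directive the remediation template needs to preserve (the surrounding option and drop-in filename are hypothetical examples, not the actual template content):

```
# /etc/ssh/sshd_config (sketch, not the real remediation template)
# Pull in drop-in fragments so that files such as
# /etc/ssh/sshd_config.d/40-ssh-key-dir.conf keep taking effect
# after the hardened config is applied.
Include /etc/ssh/sshd_config.d/*.conf
PermitRootLogin no
```

Without the Include line, any settings a cluster keeps in sshd_config.d/ (for example the location of authorized keys) silently stop applying once the remediation replaces sshd_config.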
Adding @BhargaviGudi for review.
Start a new ephemeral environment with changes proposed in this pull request:
/test
@rhmdnd: The /test command needs one or more targets.
The following commands are available to trigger required jobs:
/test 4.13-images
/test 4.14-images
/test 4.15-images
/test 4.16-images
/test e2e-aws-ocp4-cis
/test e2e-aws-ocp4-cis-node
/test e2e-aws-ocp4-e8
/test e2e-aws-ocp4-high
/test e2e-aws-ocp4-high-node
/test e2e-aws-ocp4-moderate
/test e2e-aws-ocp4-moderate-node
/test e2e-aws-ocp4-pci-dss
/test e2e-aws-ocp4-pci-dss-node
/test e2e-aws-ocp4-stig
/test e2e-aws-ocp4-stig-node
/test e2e-aws-rhcos4-e8
/test e2e-aws-rhcos4-high
/test e2e-aws-rhcos4-moderate
/test e2e-aws-rhcos4-stig
/test images
Use /test all to run the following jobs that were automatically triggered:
pull-ci-ComplianceAsCode-content-master-4.13-images
pull-ci-ComplianceAsCode-content-master-4.14-images
pull-ci-ComplianceAsCode-content-master-4.15-images
pull-ci-ComplianceAsCode-content-master-4.16-images
pull-ci-ComplianceAsCode-content-master-images
In response to this:
/test
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/test e2e-aws-rhcos4-e8
Failures are unrelated to the change being made here and will be fixed separately.
/test
@rhmdnd: The /test command needs one or more targets.
The following commands are available to trigger required jobs:
/test 4.13-e2e-aws-ocp4-cis
/test 4.13-e2e-aws-ocp4-cis-node
/test 4.13-e2e-aws-ocp4-e8
/test 4.13-e2e-aws-ocp4-high
/test 4.13-e2e-aws-ocp4-high-node
/test 4.13-e2e-aws-ocp4-moderate
/test 4.13-e2e-aws-ocp4-moderate-node
/test 4.13-e2e-aws-ocp4-pci-dss
/test 4.13-e2e-aws-ocp4-pci-dss-node
/test 4.13-e2e-aws-ocp4-stig
/test 4.13-e2e-aws-ocp4-stig-node
/test 4.13-e2e-aws-rhcos4-e8
/test 4.13-e2e-aws-rhcos4-high
/test 4.13-e2e-aws-rhcos4-moderate
/test 4.13-e2e-aws-rhcos4-stig
/test 4.13-images
/test 4.14-images
/test 4.15-e2e-aws-ocp4-cis
/test 4.15-e2e-aws-ocp4-cis-node
/test 4.15-e2e-aws-ocp4-e8
/test 4.15-e2e-aws-ocp4-high
/test 4.15-e2e-aws-ocp4-high-node
/test 4.15-e2e-aws-ocp4-moderate
/test 4.15-e2e-aws-ocp4-moderate-node
/test 4.15-e2e-aws-ocp4-pci-dss
/test 4.15-e2e-aws-ocp4-pci-dss-node
/test 4.15-e2e-aws-ocp4-stig
/test 4.15-e2e-aws-ocp4-stig-node
/test 4.15-e2e-aws-rhcos4-e8
/test 4.15-e2e-aws-rhcos4-high
/test 4.15-e2e-aws-rhcos4-moderate
/test 4.15-e2e-aws-rhcos4-stig
/test 4.15-images
/test 4.16-e2e-aws-ocp4-cis
/test 4.16-e2e-aws-ocp4-cis-node
/test 4.16-e2e-aws-ocp4-e8
/test 4.16-e2e-aws-ocp4-high
/test 4.16-e2e-aws-ocp4-high-node
/test 4.16-e2e-aws-ocp4-moderate
/test 4.16-e2e-aws-ocp4-moderate-node
/test 4.16-e2e-aws-ocp4-pci-dss
/test 4.16-e2e-aws-ocp4-pci-dss-node
/test 4.16-e2e-aws-ocp4-stig
/test 4.16-e2e-aws-ocp4-stig-node
/test 4.16-e2e-aws-rhcos4-e8
/test 4.16-e2e-aws-rhcos4-high
/test 4.16-e2e-aws-rhcos4-moderate
/test 4.16-e2e-aws-rhcos4-stig
/test 4.16-images
/test e2e-aws-ocp4-cis
/test e2e-aws-ocp4-cis-node
/test e2e-aws-ocp4-e8
/test e2e-aws-ocp4-high
/test e2e-aws-ocp4-high-node
/test e2e-aws-ocp4-moderate
/test e2e-aws-ocp4-moderate-node
/test e2e-aws-ocp4-pci-dss
/test e2e-aws-ocp4-pci-dss-node
/test e2e-aws-ocp4-stig
/test e2e-aws-ocp4-stig-node
/test e2e-aws-rhcos4-e8
/test e2e-aws-rhcos4-high
/test e2e-aws-rhcos4-moderate
/test e2e-aws-rhcos4-stig
/test images
Use /test all to run the following jobs that were automatically triggered:
pull-ci-ComplianceAsCode-content-master-4.13-images
pull-ci-ComplianceAsCode-content-master-4.14-images
pull-ci-ComplianceAsCode-content-master-4.15-images
pull-ci-ComplianceAsCode-content-master-4.16-images
pull-ci-ComplianceAsCode-content-master-images
In response to this:
/test
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/test 4.13-e2e-aws-rhcos4-e8
/test 4.15-e2e-aws-rhcos4-e8
/test 4.16-e2e-aws-rhcos4-e8
/test e2e-aws-rhcos4-e8
Code Climate has analyzed commit 7b8f481b and detected 0 issues on this pull request.
The test coverage on the diff in this pull request is 100.0% (50% is the threshold).
This pull request will bring the total coverage in the repository to 58.3% (0.0% change).
View more on Code Climate.
/hold for test
Verification passed with 4.12.49 + compliance operator + code from PR #11551
- Install CO
- Create ssb with upstream-rhcos4-e8
$ oc compliance bind -S default-auto-apply -N test profile/upstream-rhcos4-e8
Creating ScanSettingBinding test
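The oc compliance bind command above generates a ScanSettingBinding roughly like the following (a sketch; the field values are inferred from the command line, and the namespace is assumed to be the operator's default, openshift-compliance):

```
apiVersion: compliance.openshift.io/v1alpha1
kind: ScanSettingBinding
metadata:
  name: test
  namespace: openshift-compliance
profiles:
  # -S default-auto-apply selects a ScanSetting that auto-applies remediations
  - apiGroup: compliance.openshift.io/v1alpha1
    kind: Profile
    name: upstream-rhcos4-e8
settingsRef:
  apiGroup: compliance.openshift.io/v1alpha1
  kind: ScanSetting
  name: default-auto-apply
```

Binding against default-auto-apply is what makes the sshd remediations land on the nodes automatically, which is why this verification can then inspect the rendered MachineConfig resources.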
$ oc get suite -w
NAME   PHASE         RESULT
test   LAUNCHING     NOT-AVAILABLE
test   LAUNCHING     NOT-AVAILABLE
test   RUNNING       NOT-AVAILABLE
test   RUNNING       NOT-AVAILABLE
test   AGGREGATING   NOT-AVAILABLE
test   AGGREGATING   NOT-AVAILABLE
test   DONE          NON-COMPLIANT
test   DONE          NON-COMPLIANT
$ oc get suite
NAME   PHASE   RESULT
test   DONE    NON-COMPLIANT
$ oc get scan
NAME                        PHASE   RESULT
upstream-rhcos4-e8-master   DONE    NON-COMPLIANT
upstream-rhcos4-e8-worker   DONE    NON-COMPLIANT
- Verify that the MachineConfig resources contain the line
Include /etc/ssh/sshd_config.d/*.conf
# $OpenBSD: sshd_config,v 1.103 2018/04/09 20:41:22 tj Exp $
# This is the sshd server system-wide configuration file. See
# sshd_config(5) for more information.
# This sshd was compiled with PATH=/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
Include /etc/ssh/sshd_config.d/*.conf
...
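One way to check that line in the remediation payload: MachineConfig file contents are stored URL-encoded in an ignition data: URI, so after pulling the rendered MachineConfig (e.g. with oc get mc -o yaml), the embedded sshd_config can be decoded and checked for the Include directive. A minimal sketch of the decoding step, using a shortened hypothetical payload rather than the real remediation content:

```python
from urllib.parse import unquote

# A "data:" URI as found in a MachineConfig's storage.files[].contents.source
# (shortened, hypothetical payload; real remediations embed the full sshd_config).
source = "data:,Include%20%2Fetc%2Fssh%2Fsshd_config.d%2F%2A.conf%0APermitRootLogin%20no%0A"

# Strip the "data:," prefix and URL-decode the remainder.
content = unquote(source[len("data:,"):])
print(content)

# Verify the drop-in directory is still included after hardening.
assert "Include /etc/ssh/sshd_config.d/*.conf" in content
```

The same decode-and-grep check works against the actual rendered MachineConfig once the remediation has been applied.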
/unhold
@yuumasato @Vincent056 should be ready for another review.
@BhargaviGudi @xiaojiey Hi, just wanted to double check whether you were able to ssh into the cluster. I was not able to do that, XD.
Yes, I can. I'll share the detailed steps I used to ssh into the node.
/lgtm