
Fix idempotence issue for RHEL 7 forwarder install

Open • TraGicCode opened this issue 8 years ago • 8 comments

I could really use a second pair of 👀 on this. I was sifting through the issues on GitHub for this module and noticed someone having a problem on RHEL 7. While I was not able to reproduce their issue specifically (I assume it has been fixed), I did notice that the idempotence test for the forwarder fails when run under Beaker. I know pretty much nothing about SELinux on RHEL 7, but what I do know is that adding the attribute shown below to the file resource makes the forwarder idempotence test pass. Could anyone verify that this fix is actually valid?
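
For reference, here is a minimal sketch of the kind of change I mean. Treat the seluser value as an assumption inferred from the second-run drift shown below, not as the verified fix:

  file { '/opt/splunkforwarder/etc/system/local/server.conf':
    ensure  => file,
    mode    => '0600',
    # Assumption: pin the SELinux user up front so the first run already
    # matches what the policy expects, instead of drifting on run two.
    seluser => 'system_u',
  }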

FYI, here is the failing Beaker test output without this change.

First run

  Info: Loading facts
  Info: Loading facts
  Warning: /etc/puppetlabs/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
     (in /etc/puppetlabs/puppet/hiera.yaml)
  Notice: Compiled catalog for rhel-74-x64.attlocal.net in environment production in 0.46 seconds
  Info: Applying configuration version '1507523793'
  Notice: /Stage[main]/Archive::Staging/File[/opt/staging]/ensure: created
  Notice: /Stage[main]/Splunk::Forwarder/Archive[/opt/staging/splunk/splunkforwarder-7.0.0-c8a78efdd40f-linux-2.6-x86_64.rpm]/ensure: download archive from https://download.splunk.com/products/universalforwarder/releases/7.0.0/linux/splunkforwarder-7.0.0-c8a78efdd40f-linux-2.6-x86_64.rpm to /opt/staging/splunk/splunkforwarder-7.0.0-c8a78efdd40f-linux-2.6-x86_64.rpm  with cleanup
  Notice: /Stage[main]/Splunk::Forwarder/Package[splunkforwarder]/ensure: created
  Notice: /Stage[main]/Splunk::Forwarder/Splunkforwarder_input[default_host]/ensure: created
  Info: /Stage[main]/Splunk::Forwarder/Splunkforwarder_input[default_host]: Scheduling refresh of Service[splunk]
  Notice: /Stage[main]/Splunk::Forwarder/Splunkforwarder_output[tcpout_defaultgroup]/ensure: created
  Info: /Stage[main]/Splunk::Forwarder/Splunkforwarder_output[tcpout_defaultgroup]: Scheduling refresh of Service[splunk]
  Notice: /Stage[main]/Splunk::Forwarder/Splunkforwarder_output[defaultgroup_server]/ensure: created
  Info: /Stage[main]/Splunk::Forwarder/Splunkforwarder_output[defaultgroup_server]: Scheduling refresh of Service[splunk]
  Notice: /Stage[main]/Splunk::Forwarder/Splunkforwarder_web[forwarder_splunkd_port]/ensure: created
  Info: /Stage[main]/Splunk::Forwarder/Splunkforwarder_web[forwarder_splunkd_port]: Scheduling refresh of Service[splunk]
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/deploymentclient.conf]/ensure: created
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/inputs.conf]/mode: mode changed '0644' to '0600'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/inputs.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/outputs.conf]/mode: mode changed '0644' to '0600'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/outputs.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/web.conf]/mode: mode changed '0644' to '0600'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/web.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/limits.conf]/ensure: created
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/server.conf]/ensure: created
  Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: executed successfully
  Notice: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]/returns: executed successfully
  Info: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]: Scheduling refresh of Service[splunk]
  Notice: /Stage[main]/Splunk::Virtual/Service[splunk]/ensure: ensure changed 'stopped' to 'running'
  Info: /Stage[main]/Splunk::Virtual/Service[splunk]: Unscheduling refresh on Service[splunk]
  Info: Creating state file /opt/puppetlabs/puppet/cache/state/state.yaml
  Notice: Applied catalog in 10.79 seconds

Second run

  Info: Loading facts
  Info: Loading facts
  Warning: /etc/puppetlabs/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
     (in /etc/puppetlabs/puppet/hiera.yaml)
  Notice: Compiled catalog for rhel-74-x64.attlocal.net in environment production in 0.44 seconds
  Info: Applying configuration version '1507523806'
  Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/server.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'
  Notice: Applied catalog in 0.17 seconds

TraGicCode avatar Oct 09 '17 04:10 TraGicCode

Hmm, I've no idea. @vinzent @oranenj, can you help us out here?

bastelfreak avatar Oct 09 '17 17:10 bastelfreak

Hard to tell what's going on without actually testing the module myself, but my initial guess is that something changes the SELinux policy while Puppet is running (e.g. one of the packages it installs), so that on the next run the file contexts for that particular path are different from those on the first run, and Puppet corrects them.

To see which is "correct", run restorecon on that path. If the context changes from unconfined_u to system_u, then that's what the policy declares, and the Puppet code should probably default to that too.
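
For example, using the path from the second run above. Note that plain restorecon resets only the type field, so -F is needed to also reset the SELinux user:

  # Ask the policy which context it declares for this path
  matchpathcon /opt/splunkforwarder/etc/system/local/server.conf
  # Relabel to the policy default; -F also resets the user/role fields
  restorecon -Fv /opt/splunkforwarder/etc/system/local/server.conf
  # Show the resulting context (user:role:type:level)
  ls -Z /opt/splunkforwarder/etc/system/local/server.conf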

oranenj avatar Oct 09 '17 20:10 oranenj

I'm also not sure, but what @oranenj says makes sense (if the contexts are coming from the package rather than from Puppet, you could also try looking at the RPM's scriptlets).
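
For example, to dump the scriptlets of the installed package (using the package name from the log above):

  # Print the pre/post (un)install scriptlets shipped with the package
  rpm -q --scripts splunkforwarder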

wyardley avatar Oct 09 '17 22:10 wyardley

I'll give this a shot.

TraGicCode avatar Oct 10 '17 01:10 TraGicCode

Dear @TraGicCode, thanks for the PR!

This is pccibot, your friendly Vox Pupuli GitHub Bot. I noticed that your pull request contains a merge conflict. Can you please rebase?

You can find my source code at voxpupuli/vox-pupuli-tasks

vox-pupuli-tasks[bot] avatar Dec 13 '19 12:12 vox-pupuli-tasks[bot]

Dear @TraGicCode, thanks for the PR!

This is pccibot, your friendly Vox Pupuli GitHub Bot. I noticed that your pull request contains a merge conflict. Can you please rebase?

You can find my source code at voxpupuli/vox-pupuli-tasks

vox-pupuli-tasks[bot] avatar Jan 05 '20 13:01 vox-pupuli-tasks[bot]
