robottelo
Fix number of imported templates
The total number of templates and the number of templates containing "robottelo" should match this repo and branch: https://github.com/SatelliteQE/foreman_templates/tree/automation
I have no idea why it was once changed to a wrong number without further explanation: https://github.com/SatelliteQE/robottelo/commit/95434f5e3b184732fe963874d19b747a02241edb
Yes, at first glance the template counts do not match; however, these tests have been passing for quite some time, at least for the Satellite stream. The question is whether this change breaks them, or whether the current template numbers are incorrect and the tests have been producing false positives.
@dosas What does the PRT result say?
@pnovotny This is exactly what I would like to discuss. When I count the templates here https://github.com/SatelliteQE/foreman_templates/tree/automation/import I count 18, right? Even if these changes 'break' the tests, we should make sure this is not masking a bug, or at least understand why one template is not imported. The changes in this commit https://github.com/SatelliteQE/robottelo/commit/95434f5e3b184732fe963874d19b747a02241edb were made two years ago, and the last changes to the templates were made three years ago. I stumbled across this while testing against Foreman 3.9.
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
PRT Result
Build Number: 6076
Build Status: UNSTABLE
PRT Comment: pytest tests/foreman/api/test_templatesync.py --external-logging
Test Result : ====== 2 failed, 25 passed, 319 warnings, 4 errors in 2721.32s (0:45:21) =======
You really should use this opportunity to investigate why it has passed until now; you might be overlooking a bug.
@lhellebr who do you mean by "you"?
@dosas Since you are the one proposing this change, I would say it's up to you to show us that this is actually an error in the test. So far, automation has expected these values and has passed. If it has passed until now but your new version is correct, does that mean there was a bug both in automation and in Satellite all along?
FYI, with this PR, PRT is failing with errors such as
not_imported_count = [
template['imported'] for template in filtered_imported_templates['message']['templates']
].count(False)
> assert not_imported_count == 8
E assert 9 == 8
tests/foreman/api/test_templatesync.py:171: AssertionError
Does it mean your PR is wrong or there is a bug in Satellite?
Maybe @ogajduse or @rplevka will know something, since they reviewed this PR: https://github.com/SatelliteQE/robottelo/pull/9326.
Also, note that the PR states:
Updated numbers of templates: because of missing puppet module in sat7, some templates cannot be imported.
@lhellebr I am not claiming my PR is right and the current code is wrong. I made this PR as a basis for discussing this behavior, which is not obvious and, from an outside point of view, looks weird or wrong. It is not easy to start a discussion or ask questions without a PR, since issues tend to be ignored and there is no other forum or channel of communication that I know of.
Reading your explanations, I am starting to think this might be an upstream vs. downstream specific issue. If so, a solution that works for both would probably require some additional settings. I am thankful for any explanation or hint that points me in the right direction.
If some templates cannot be imported, the less confusing fix would have been to adapt the template repository and the tests.
This pull request has not been updated in the past 45 days.
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
first things first: I can't find words to express how sorry I am that it took me so awkwardly long to reply to this (I simply missed the notification). I can't recall properly, but I think the count mismatch might be caused by this special template, which is "empty": https://github.com/SatelliteQE/foreman_templates/blob/automation/import/ptable/empty.erb
I re-triggered the automation and will review the results. My guess is there will be a warning logged in production.log notifying about the failure to import such a template.
PRT Result
Build Number: 6923
Build Status: UNSTABLE
PRT Comment: pytest tests/foreman/api/test_templatesync.py --external-logging
Test Result : =========== 2 failed, 29 passed, 292 warnings in 1794.48s (0:29:54) ============
@dosas so, mystery solved.
I ran the tests again and checked the logs. There is 1 template that was skipped, due to an invalid puppet parameter (we use Satellites with puppet disabled).
{
"name": "LFrNRdWJLpjenkins Start OpenSCAP scans",
"id": null,
"changed": false,
"imported": false,
"additional_errors": null,
"additional_info": null,
"exception": "unknown input type 'puppet_parameter'",
"validation_errors": {},
"file": "jenkins_job.erb",
"type": "job_template"
}
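For illustration, a skipped template like the one above can be picked out of the import report. This is a minimal sketch, assuming the response shape shown in the JSON; `skipped_templates` is a hypothetical helper, not a real robottelo function:

```python
def skipped_templates(import_response):
    """Return (file, exception) pairs for every template reported as not imported.

    `import_response` is assumed to be the parsed JSON body of the
    template-import API call: {'message': {'templates': [...]}}.
    """
    return [
        (tmpl['file'], tmpl['exception'])
        for tmpl in import_response['message']['templates']
        if not tmpl['imported']
    ]
```

Running it over a report containing the entry above would yield `('jenkins_job.erb', "unknown input type 'puppet_parameter'")`, which is exactly the mismatch the failing assertion counted.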
I'd suggest defining a cached property on the Capsule class that holds the enabled "features", and choosing the appropriate count based on a `puppet in target_sat.features` conditional.
Also, I think the counts should not just be hardcoded in the tests; since they are tightly connected to the repo we work with, they should be defined in the same place as the repo (constants, in this case).
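A minimal sketch of that suggestion, assuming a hypothetical `Capsule` wrapper and illustrative counts (the real robottelo class queries the API, and the exact numbers belong with the repo constants):

```python
from functools import cached_property

# Hypothetical constants, co-located with the template repo URL in robottelo's
# constants module. The numbers here are illustrative, not authoritative.
FOREMAN_TEMPLATE_TOTAL_COUNT = 19
FOREMAN_TEMPLATE_TOTAL_COUNT_NO_PUPPET = 18  # jenkins_job.erb skipped without puppet


class Capsule:
    """Sketch only; the real robottelo Capsule class has far more behavior."""

    def __init__(self, enabled_features):
        self._enabled_features = enabled_features

    @cached_property
    def features(self):
        # In robottelo this would query the Satellite/Capsule API once
        # and cache the result for the rest of the test session.
        return set(self._enabled_features)

    @property
    def expected_template_count(self):
        # Choose the count based on whether puppet is enabled.
        if 'puppet' in self.features:
            return FOREMAN_TEMPLATE_TOTAL_COUNT
        return FOREMAN_TEMPLATE_TOTAL_COUNT_NO_PUPPET
```

A test could then assert against `target_sat.expected_template_count` instead of a hardcoded number, so the same test passes on both puppet-enabled and puppet-disabled deployments.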
Thank you for investigating and finding the problem. I will try to implement the proposed changes.
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
PRT Result
Build Number: 6941
Build Status: SUCCESS
PRT Comment: pytest tests/foreman/api/test_templatesync.py --external-logging
Test Result : ================ 31 passed, 288 warnings in 1663.71s (0:27:43) =================
trigger: test-robottelo pytest: tests/foreman/api/test_templatesync.py
PRT Result
Build Number: 6969
Build Status: SUCCESS
PRT Comment: pytest tests/foreman/api/test_templatesync.py --external-logging
Test Result : ================ 31 passed, 291 warnings in 1856.09s (0:30:56) =================
@pnovotny would you mind revisiting this?
LGTM