Duplicate package NEVRA
Problem description
Uyuni reposync seems to import duplicate packages if the upstream repository changes a package without changing the NEVRA details (so only the checksum and contents differ).
This also causes issues with custom package install/upgrade scheduling when managing package upgrades via API.
It also wastes disk space, since the older package data shouldn't be downloaded by clients anyway (at least DNF seems to download only the later package, but this might be undefined behaviour).
Pulp has had a similar issue in the past with duplicate NEVRA: https://github.com/pulp/pulp_rpm/issues/2691
EDIT: This behaviour is not exclusive to RPM based repositories. I've also encountered DEB package duplicates.
Handling duplicate NEVRA?
My personal opinion is that there should not be packages with duplicate NEVRA and Uyuni reposync should replace existing but differing packages with the ones available in the upstream repository (because that's probably what the repository maintainer intended).
I know it is wrong (and probably also against the spec) to change a repository's packages without updating the NEVRA details, but lots of repository maintainers have proven to be naughty (CentOS, AlmaLinux, EPEL, ELRepo, Nagios, Microsoft, MariaDB and Docker CE, to name a few that I've seen doing this).
What are your views on this from Uyuni's perspective? Are there other ways to combat the problems caused by duplicate NEVRA? And are there potential issues if NEVRA were treated as unique by Uyuni?
Workaround
I've currently just deleted duplicate packages/NEVRA manually to avoid the mentioned issues.
This way I also keep the Uyuni channel closer to what is actually available in the upstream repository (as intended by the repository maintainer).
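Roughly, such a cleanup could also be scripted via the XML-RPC API. This is only a sketch: the server URL, credentials and channel label are placeholders, I'm assuming channel.software.listAllPackages and packages.removePackage (and the arch_label field name) behave as documented, and the removal call is left commented out on purpose so the selection can be reviewed first. It simply keeps the package with the highest ID per NVRA.

# Sketch: find duplicate NVRA in a channel and keep only the newest package (highest ID).
# MANAGER_URL, the credentials and CHANNEL_LABEL are placeholders.
import xmlrpc.client

MANAGER_URL = "https://uyuni.example.com/rpc/api"
CHANNEL_LABEL = "my-custom-channel"

client = xmlrpc.client.ServerProxy(MANAGER_URL)
key = client.auth.login("admin", "password")

# Group all packages in the channel by their NVRA
packages_by_nvra = {}
for pkg in client.channel.software.listAllPackages(key, CHANNEL_LABEL):
    nvra = f"{pkg['name']}-{pkg['version']}-{pkg['release']}.{pkg['arch_label']}"
    packages_by_nvra.setdefault(nvra, []).append(pkg)

# For every duplicated NVRA, keep the entry with the highest package ID
for nvra, pkgs in packages_by_nvra.items():
    if len(pkgs) < 2:
        continue
    pkgs.sort(key=lambda p: p['id'])
    for old in pkgs[:-1]:
        print(f"Duplicate NVRA {nvra}: would remove package ID {old['id']}, keeping {pkgs[-1]['id']}")
        # packages.removePackage deletes the package from the server entirely,
        # so only uncomment this once the output above looks right:
        # client.packages.removePackage(key, old['id'])

client.auth.logout(key)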
Steps to reproduce
Run an Uyuni Server with lots of RPM repositories (with daily reposyncs) for a while and you are going to see duplicate NEVRA at some point. Alternatively you could set up your own RPM repository and change packages without changing the NEVRA details to simulate the same thing.
Uyuni version
This affects older versions of Uyuni such as 2023.09 and 2023.10 too.
---------------------------------------------
Repository : Uyuni Server Stable
Name : Uyuni-Server-release
Version : 2023.12-230900.210.2.uyuni3
Arch : x86_64
Vendor : obs://build.opensuse.org/systemsmanagement:Uyuni
Support Level : Level 3
Installed Size : 1.4 KiB
Installed : Yes
Status : up-to-date
Source package : Uyuni-Server-release-2023.12-230900.210.2.uyuni3.src
Summary : Uyuni Server
Example of duplicate NEVRA/packages and what could happen with them
This is a simplified representation of the process I use to automatically schedule package updates via the API.
Fetching the list of available updates with the system.listLatestUpgradablePackages API method returns a list of packages that contains the duplicates, and therefore it is not safe (or even possible) to schedule updates just by passing a list of to_package_id values to the system.schedulePackageInstall API method.
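For context, the call sequence below is issued over the XML-RPC API. A minimal sketch with Python's xmlrpc.client (the server URL, credentials and the example system ID 1000010123 are placeholders) would look roughly like this:

# Sketch of the call sequence shown below; URL, credentials and system ID are placeholders.
import xmlrpc.client
from datetime import datetime

client = xmlrpc.client.ServerProxy("https://uyuni.example.com/rpc/api")
key = client.auth.login("admin", "password")

upgradable = client.system.listLatestUpgradablePackages(key, 1000010123)
package_ids = [pkg["to_package_id"] for pkg in upgradable]

# With duplicate NEVRA in the channel, package_ids can contain two IDs that
# resolve to the same name-version-release.arch, which later makes the Salt
# state fail (see the event details below).
now = xmlrpc.client.DateTime(datetime.now())
action_id = client.system.schedulePackageInstall(key, 1000010123, package_ids, now)
client.auth.logout(key)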
system.listLatestUpgradablePackages(key, 1000010123):
[
{
"from_epoch": " ",
"to_release": "latest",
"name": "ncpa",
"from_release": "1.el9",
"to_epoch": " ",
"arch": "x86_64",
"to_package_id": 824895,
"from_version": "2.4.1",
"to_version": "3.0.1",
"from_arch": "x86_64",
"to_arch": "x86_64"
},
{
"from_epoch": " ",
"to_release": "latest",
"name": "ncpa",
"from_release": "1.el9",
"to_epoch": " ",
"arch": "x86_64",
"to_package_id": 828916,
"from_version": "2.4.1",
"to_version": "3.0.1",
"from_arch": "x86_64",
"to_arch": "x86_64"
}
]
system.schedulePackageInstall(key, 1000010123, [824895, 828916], now):
< Valid Action ID is returned >
Uyuni Event Details
This action's status is: Failed.
Packages Scheduled:
ncpa-3.0.1-latest.x86_64
ncpa-3.0.1-latest.x86_64
Client execution returned:
...
ID: pkg_installed
Function: pkg.installed
Name: pkg_installed
Result: false
Comment: An exception occurred in this state: Traceback (most recent call last):
...
raise SaltInvocationError(
salt.exceptions.SaltInvocationError: You are passing a list of packages that contains duplicated packages names: [OrderedDict([('ncpa.x86_64', '3.0.1-latest')]), OrderedDict([('ncpa.x86_64', '3.0.1-latest')])]. This cannot be processed. In case you are targeting different versions of the same package, please target them individually
...
More details about the example duplicate packages
Looks like the repository maintainer accidentally released this package with a SHA1 signature around Dec 14 and then replaced it with a SHA256-signed package around Dec 19 without changing the NEVRA details.
I downloaded both packages and inspected them with RPM. Below is a comparison of the two packages' signatures; all other details were identical (even the build date).
Dec 14 package checksum: 9437972b8540216c331e9aea6d556d600674cfbf7b2e5a0c65939c2a0e422bff
Dec 14 package signature: RSA/SHA1, Thu Dec 14 18:23:02 2023, Key ID 471d5f4645ecf0ad
Dec 19 package checksum: 9954aca0ca540664992ffed50ea795d39aadb345b52bc0b71c11556085aa33ac
Dec 19 package signature: RSA/SHA256, Tue Dec 19 17:16:19 2023, Key ID 471d5f4645ecf0ad
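For reference, this is roughly how the two downloaded files can be compared (sketch only; the file names are placeholders for the two copies, and the rpm query tag for the signature may vary between rpm versions and signature types):

# Sketch: compare checksums and header signatures of the two downloaded copies.
import hashlib
import subprocess

for rpm_file in ("ncpa-3.0.1-latest.x86_64.dec14.rpm", "ncpa-3.0.1-latest.x86_64.dec19.rpm"):
    with open(rpm_file, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # Query the PGP signature header (the same information "rpm -qpi" reports)
    sig = subprocess.run(
        ["rpm", "-qp", "--qf", "%{SIGPGP:pgpsig}", rpm_file],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(rpm_file, digest, sig, sep="\n  ")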
Search results per NEVRA/NVREA (both are in the same channel)
packages.findByNvrea(key, "ncpa", "3.0.1", "latest", "", "x86_64"):
[
{
"path": "packages/1/943/ncpa/3.0.1-latest/x86_64/9437972b8540216c331e9aea6d556d600674cfbf7b2e5a0c65939c2a0e422bff/ncpa-3.0.1-latest.x86_64.rpm",
"provider": "Unknown",
"release": "latest",
"name": "ncpa",
"epoch": "",
"part_of_retracted_patch": false,
"id": 824895,
"version": "3.0.1",
"last_modified": "2023-12-15 12:32:08",
"arch_label": "x86_64"
},
{
"path": "packages/1/995/ncpa/3.0.1-latest/x86_64/9954aca0ca540664992ffed50ea795d39aadb345b52bc0b71c11556085aa33ac/ncpa-3.0.1-latest.x86_64.rpm",
"provider": "Unknown",
"release": "latest",
"name": "ncpa",
"epoch": "",
"part_of_retracted_patch": false,
"id": 828916,
"version": "3.0.1",
"last_modified": "2023-12-20 12:32:54",
"arch_label": "x86_64"
}
]
@rjmateus can we consider it solved then?
I don't think it's solved. It would only make sense to avoid a collision if the package is in the same channel; otherwise we would need to duplicate it, since different products can have the same package with different content or signatures. @mcalmer do you know if it was implemented?
I think inside of the same channel this can happen. I think we have nothing to prevent this.
Hello, we are facing the same issue when patching/updating packages on Ubuntu 22.04 through Salt/SUSE Manager; we are calling the schedulePackageInstall API method.
Do you already have a solution for this case? When we run a manual package update in SUSE Manager it works, but it fails when we control the updating of our systems through the scheduler:
ID: pkg_installed
Function: pkg.installed
Name: pkg_installed
Result: false
Comment: An exception occurred in this state: Traceback (most recent call last):
File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/state.py", line 2401, in call ret = self.states[cdata["full"]]( File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/loader/lazy.py", line 149, in call return self.loader.run(run_func, *args, **kwargs) File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/loader/lazy.py", line 1234, in run return self._last_context.run(self._run_as, _func_or_method, *args, **kwargs) File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/loader/lazy.py", line 1249, in _run_as return _func_or_method(*args, **kwargs) File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/loader/lazy.py", line 1282, in wrapper return f(*args, **kwargs) File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/states/pkg.py", line 1703, in installed result = _find_install_targets( File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/states/pkg.py", line 584, in _find_install_targets desired = _repack_pkgs(pkgs, normalize=normalize) File "/usr/lib/venv-salt-minion/lib/python3.10/site-packages/salt/modules/pkg_resource.py", line 38, in _repack_pkgs raise SaltInvocationError( salt.exceptions.SaltInvocationError: You are passing a list of packages that contains duplicated packages names: [OrderedDict([('openssh-client:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('vim-common:all', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('vim-tiny:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('openssh-sftp-server:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('cryptsetup-bin:amd64', '2:2.4.3-1ubuntu1.2')]), OrderedDict([('xxd:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('cryptsetup:amd64', '2:2.4.3-1ubuntu1.2')]), OrderedDict([('vim:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('vim-tiny:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('libctf-nobfd0:amd64', '2.38-4ubuntu2.4')]), OrderedDict([('venv-salt-minion:amd64', '3006.0-2.37.3')]), OrderedDict([('openssh-client:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('openssh-sftp-server:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('software-properties-common:all', '0.99.22.9')]), OrderedDict([('openssh-server:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('vim-common:all', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('distro-info:amd64', '1.1ubuntu0.2')]), OrderedDict([('systemd-hwe-hwdb:all', '249.11.4')]), OrderedDict([('cryptsetup-initramfs:all', '2:2.4.3-1ubuntu1.2')]), OrderedDict([('vim-runtime:all', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('vim-runtime:all', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('libcryptsetup12:amd64', '2:2.4.3-1ubuntu1.2')]), OrderedDict([('openssh-server:amd64', '1:8.9p1-3ubuntu0.6')]), OrderedDict([('distro-info-data:all', '0.52ubuntu0.6')]), OrderedDict([('libctf-nobfd0:amd64', '2.38-4ubuntu2.4')]), OrderedDict([('python3-distro-info:all', '1.1ubuntu0.2')]), OrderedDict([('xxd:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('vim:amd64', '2:8.2.3995-1ubuntu2.15')]), OrderedDict([('python3-software-properties:all', '0.99.22.9')])]. This cannot be processed. In case you are targeting different versions of the same package, please target them individually
Here is also the part of the script we are using:
if packageids:  ### Checks if there are Elements on the Package List
    verbose_print(f"Found {packageids} Packages. Schedule Patching via SUSEMANAGER")
    action = client_global.system.schedulePackageInstall(
        key_global, systemid, packageids, earliest_occurrence
    )  ### Schedules Patching via SuseManager - Returns ActionId
    verbose_print(f"ActionID: {action} ")
    if (
        wait_for_action(action) == 0
    ):  ### wait for Patching ends - see waitForActionEnd - Function
        if schedule_package_ref(name) == 0:
            verbose_print("Patching Completed successfully")
            return 0  ### Returns 0 if patching completed successfully
    else:
        if schedule_package_ref(name) == 0:
            verbose_print("Patching Failed")
            return 1  ### Returns 1 if patching failed
verbose_print("No Patches found for this Server")
return 0  ### Return 0 if there are no Patches
@kalokja, I worked around that issue by deduplicating the list of available packages retrieved with the system.listLatestUpgradablePackages API method. This way the Salt minion won't end up trying to install duplicate packages and eventually raising the SaltInvocationError exception.
Basically, I create a list of available updates and ensure the list does not contain duplicates per NVRA. If a duplicate is found, the newer one (per package ID) is kept in the list. Finally, that list is converted into a list of package IDs which is passed to the system.schedulePackageInstall API method.
Here is a slightly modified and commented part of the script I'm using:
# Loop through each available package and create a list of deduplicated packages
# NVRA = Name-Version-Release.Arch
# ('available_packages' is assumed to hold the result of system.listLatestUpgradablePackages)
deduplicated_packages = []
for available_package in available_packages:
    nvra = f"{available_package['name']}-{available_package['to_version']}-{available_package['to_release']}.{available_package['arch']}"
    available_package['nvra'] = nvra
    # Deduplicate packages
    if available_package['nvra'] not in [pkg['nvra'] for pkg in deduplicated_packages]:
        # First occurrence of this package, no duplicates found
        deduplicated_packages.append(available_package)
    else:
        # There is already a package with the same NVRA, so we will only keep the newer package ID
        for deduplicated_package in deduplicated_packages:
            if deduplicated_package['nvra'] != available_package['nvra']:
                continue
            # Change the package ID to the latest one
            if int(available_package['to_package_id']) > int(deduplicated_package['to_package_id']):
                # NOTE: Technically this shouldn't matter, as the operating system's package manager
                # will choose which specific package file to install based only on the
                # name-version-release.arch given by the Salt Minion.
                # This is undefined behaviour with YUM/DNF and probably other package managers too.
                deduplicated_package['to_package_id'] = available_package['to_package_id']
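The final step mentioned above, turning the deduplicated list into package IDs for system.schedulePackageInstall, then boils down to something like this (sketch; client, key, systemid and earliest_occurrence are assumed to already exist in the surrounding script):

# Convert the deduplicated list into package IDs and schedule the install
package_ids = [pkg['to_package_id'] for pkg in deduplicated_packages]
if package_ids:
    action_id = client.system.schedulePackageInstall(key, systemid, package_ids, earliest_occurrence)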
There are probably better and more optimized ways to do this but this works for me.
I also got the error message salt.exceptions.SaltInvocationError: You are passing a list of packages that contains duplicated packages names while batch-upgrading with uyuni-tools/group_system_update.py (see https://github.com/uyuni-project/uyuni-tools).
I assume this is related? Or should I investigate further?
I can add that I am now seeing this same issue on 2024.05 with Ubuntu 22.04 repos, while selecting available patches and/or packages and scheduling the install via the Web UI:
ID: pkg_installed
Function: pkg.installed
Name: pkg_installed
Result: false
Comment: An exception occurred in this state: Traceback (most recent call last):
File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/state.py", line 2401, in call ret = self.states[cdata["full"]]( ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/loader/lazy.py", line 149, in call return self.loader.run(run_func, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/loader/lazy.py", line 1234, in run return self._last_context.run(self._run_as, _func_or_method, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/loader/lazy.py", line 1249, in _run_as return _func_or_method(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/loader/lazy.py", line 1282, in wrapper return f(*args, **kwargs) ^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/states/pkg.py", line 1703, in installed result = _find_install_targets( ^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/states/pkg.py", line 584, in _find_install_targets desired = _repack_pkgs(pkgs, normalize=normalize) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/venv-salt-minion/lib/python3.11/site-packages/salt/modules/pkg_resource.py", line 38, in _repack_pkgs raise SaltInvocationError( salt.exceptions.SaltInvocationError: You are passing a list of packages that contains duplicated packages names: [OrderedDict([('vim-runtime:all', '2:8.2.3995-1ubuntu2.17')]), OrderedDict([('libpam-modules-bin:amd64', '1.4.0-11ubuntu2.4')]), OrderedDict([('bluez-cups:amd64', '5.64-0ubuntu1.3')]),
Something else to consider: in the web UI, on the details section for a client, the available package counts differ between views when this issue occurs. In this example, the Details / Overview page for a particular client says 133 packages are available, but when you drill down into the package detail view, the count is higher (137 in this example). Subsequently, when the client is updated manually via apt upgrade, the number of packages matches the Details / Overview page (133) and the update completes normally.
Client overview:
Package detail view:
See also #8998