haikuporter
"waiting for package to be activated" when changing recipe dependencies and rebuilding without cleaning work directory
- Create or modify a recipe
- Try to build it, build fails
- Change dependencies of the recipe (update or add one)
- Try to build again
- Now you get stuck on "waiting for package to be activated"
- Delete work directory for the recipe
- Now it works fine.
Works indeed, and for small projects it's perfectly OK, but for larger projects it's a pain to restart the full build because of an error during the build that could have been fixed with a patch.
I face this situation almost all the time. I noticed that it happens more often when I change the build prerequisites, adding or removing a cmd prerequisite.
Adding a -c to the build flags is enough to clear the error (so you can try again). Definitely happening on failed builds.
$ haikuporter/haikuporter pytouhou --get-dependencies
(failed build)
(build fixed)
$ haikuporter/haikuporter pytouhou --get-dependencies
.
.
----------------------------------
waiting for build package pytouhou-20201105_hg-1 to be activated
waiting for build package pytouhou-20201105_hg-1 to be activated
waiting for build package pytouhou-20201105_hg-1 to be activated
waiting for build package pytouhou-20201105_hg-1 to be activated
waiting for build package pytouhou-20201105_hg-1 to be activated
(endless loop) <Ctrl+C>
$ haikuporter/haikuporter pytouhou --get-dependencies -c
$ haikuporter/haikuporter pytouhou --get-dependencies
(works as expected)
$ haikuporter/haikuporter pytouhou --get-dependencies -c
Doesn't that clean the build before a rebuild (e.g. same as hp pytouhou -c)?
I once managed to get past this error by moving the build package out and back.
cd ports/media-video/vlc/work-3.0.11/boot/system/packages/ ;
mv vlc-3.0.11-1-build.hpkg ~
mv ~/vlc-3.0.11-1-build.hpkg .
Also a few times, after several minutes of "waiting for build package", it started the build without me doing anything.
What I usually do is just Ctrl-C-ing haikuporter and trying again. Sometimes it works, but most of the time not. To me, this looks like a race condition somewhere or a similar issue.
Pretty annoying indeed.
I seem to recall that this is somehow a package_daemon problem, but I do not remember more than that. At any rate, HaikuPorter should have a timeout and not just retry futilely forever.
Here is where it loops forever: https://github.com/haikuports/haikuporter/blob/eee899ad694b4fd65f5ce60365fe685f34ae652a/HaikuPorter/BuildPlatform.py#L205-L207
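Just to illustrate what a timeout there could look like (untested sketch, not the actual haikuporter code; the function and parameter names are made up, and the real activation check is left as a callable placeholder):

import time

# Rough sketch only: keep whatever activation check the real loop uses
# (passed in here as a callable), but give up after a deadline instead of
# retrying forever.
def waitForBuildPackageActivation(isActivated, packageName, timeout=120):
	deadline = time.time() + timeout
	while not isActivated():
		if time.time() >= deadline:
			raise Exception('timed out waiting for build package '
				+ packageName + ' to be activated')
		print('waiting for build package ' + packageName
			+ ' to be activated')
		time.sleep(1)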
Apparently removing boot/system/packages/administrative/activated-packages from the work directory before restarting haikuporter will fix this.
hm.. should haikuporter be doing this on startup? (if activated-packages exists, and i'm just starting up... delete it)
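Roughly what I have in mind, as an untested sketch (the function name is made up; the path layout is taken from the workaround above):

import os

# Sketch only: drop a stale activated-packages file left behind in a port's
# work directory by a previous (failed) build, so the next run starts with
# a clean activation state.
def removeStaleActivatedPackages(workDir):
	staleFile = os.path.join(workDir, 'boot', 'system', 'packages',
		'administrative', 'activated-packages')
	if os.path.exists(staleFile):
		os.remove(staleFile)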
It does not help all the time, unfortunately. Yesterday I always had the issue and could not get a package to build despite having deleted that file. There was nothing suspicious in the syslog.
Yep, deleting that file or even the whole work folder doesn't help sometimes.
hm.. should haikuporter be doing this on startup? (if activated-packages exists, and i'm just starting up... delete it)
I think we should fix the root cause in Haiku instead (this file should not prevent the packages from reactivating)
Can you try this workaround and see if it "prevents" this from happening:
diff --git a/HaikuPorter/BuildPlatform.py b/HaikuPorter/BuildPlatform.py
index 9611545..bf4e5be 100644
--- a/HaikuPorter/BuildPlatform.py
+++ b/HaikuPorter/BuildPlatform.py
@@ -150,6 +150,7 @@ class BuildPlatformHaiku(BuildPlatform):
 		# activate the build package
 		packagesDir = buildPlatform.findDirectory('B_SYSTEM_PACKAGES_DIRECTORY')
 		activeBuildPackage = packagesDir + '/' + os.path.basename(packagePath)
+		time.sleep(3)
 		self.deactivateBuildPackage(workDir, activeBuildPackage,
 			revisionedName)
Some time ago I traced this down to a race condition between the package_daemon and packagefs. I haven't spent too much time on it, but I could not think of a way to properly solve the issue at the time. The details are a bit hazy, but I think what it boiled down to was that Volume::_CheckActivePackagesMatchLatestState in src/servers/package/Volume.cpp would return something unexpected because of the timing of the build package file arriving. AFAIR the issue broadly is that there is state about activated packages held in two places, the packagefs and the package_daemon, and both can race each other (the state cannot be fully controlled by the package_daemon as packagefs needs to work standalone at boot time).
I am unfortunately very busy currently and cannot dig into it at this time. For someone looking into it, I'd start by thinking through Volume::_GetActivePackages and how the data from PACKAGE_FS_OPERATION_GET_PACKAGE_INFOS might race with the local enumeration of packages inside the package_daemon in later processing. As a first step, enabling and possibly extending debug output in the package_daemon should show the problem.
Can you try this workaround and see if it "prevents" this from happening:
Didn't do the trick here after inserting the line; so far the tip from @pulkomandy does the job for me.
I've tried the workaround by @mmlr and it seems to fix the issue for me at least so far.
Need to re-check ... will report back later. EDIT: still doesn't work here (32bit R1B3)
Although it doesn't work for me when I change some requirements, it did work when I was trying to build strawberry. Before the change, removing the activated directory didn't work, cleaning the build didn't work, and neither did a reboot. So I made the change after a new failed attempt and it did launch the build for strawberry. But it seems it mounts the system volume now, isn't that dangerous? (I haven't fiddled with it, but maybe one could make changes in there that are saved after the build is finished?)
Before the change, removing the activated directory didn't work, cleaning the build didn't work, and neither did a reboot.
Indeed, it seems to solve the same case for me. @threedeyes noticed that if a recipe name starts with a letter far down the alphabet, the likelihood of this case is significantly higher. For example, I could almost never build the yate recipe until I applied this workaround. Renaming the recipe to start with _ (e.g. _yate) makes it work. So it looks like there are at least two cases which lead to "waiting for package to be activated".
I have found that deleting the boot directory from the work-... directory reliably "fixes" this problem.
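In script form that's basically the following (just an illustration; the vlc path is borrowed from the commands earlier in this thread):

import os
import shutil

# Example only: remove the leftover chroot 'boot' tree from a port's work
# directory so the next run starts from a clean package state.
workDir = 'ports/media-video/vlc/work-3.0.11'
bootDir = os.path.join(workDir, 'boot')
if os.path.isdir(bootDir):
	shutil.rmtree(bootDir)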
Yes, nothing new here. See my comment from December 2020:
Apparently removing boot/system/packages/administrative/activated-packages from the work directory before restarting haikuporter will fix this.
Can we just patch haikuporter to remove the activated-packages file and do a clean activation of all packages every time? Or should we fix the underlying problem? The latter is probably a better idea, as mmlr already suggested where the issue likely is (see the early January 2021 post).
Still not fixed, getting the same annoying waiting for ... (randomly). I don't think I saw ... deactivate earlier, but that showed up a while ago here :/
Yesterday the x86_64 buildmaster got stuck for hours waiting for .... A 3 MB log file worth of that message repeated :-/:
https://build.haiku-os.org/buildmaster/master/x86_64/?buildrunDir=7500&viewMode=expanded
This issue is for a specific case where the dependencies for a recipe are changed, and the old work directory cannot be reused.
I think that is not possible on the builder, which does a clean build every time? If I'm right, I would suggest opening a separate issue, even if the error message is the same.
Oh, ok. I thought it was worth mentioning it, just in case. I'll leave the opening of a separate issue to people with actual knowledge of the builder (I'm just a nosy passer-by, and wouldn't have much to add).
Yes, you're right. I have changed the title of this issue to clarify that it is more specific now. I'll wait for your other issue so we can investigate it separately.