
Tracking Issue: Full R Support

Open roaldarbol opened this issue 1 year ago • 28 comments

Problem description

This is a list of issues which need to be solved to have complete R support - not all are directly issues within pixi, but rather obstacles to a smooth pixi-based workflow.

Packaging

  • [x] Ability to generate recipes from R-Universe JSONs
  • [ ] Mass packaging of R packages from R-Universe to conda-forge or r-forge
    • Most R packages are already available on conda-forge with the r- prefix (e.g. r-dplyr), so much of the ecosystem is ready to use. However, not all packages are there, and the recipes are not super easy to create or maintain. rattler-build is the new alternative to conda-build, based on a new recipe format, and quite a bit of work has gone into covering edge cases. Packages uploaded with rattler-build are currently placed in r-forge. R-Universe offers a really nice API that's easy to parse, so the hope here is to have automated packaging of all the packages that are available on R-Universe. Once/if this can be done, I think the idea is to create it as an actual conda channel.
    • Once automated recipe generation and updating is implemented, maybe publish a coverage metric (the percentage of packages on R-Universe that build successfully in r-forge).
  • [ ] #2187 Ability to generate recipe from Github repository (needs a DESCRIPTION parser)

GUI

  • [x] https://github.com/conda-forge/rstudio-feedstock/issues/29#issuecomment-2220142756 Updated RStudio version on conda-forge
  • [ ] https://github.com/posit-dev/positron/issues/2659 Get Positron to discover the Pixi R installation
  • [ ] https://github.com/posit-dev/positron/issues/3724 Similar to the above, but essentially like the Python discovery

Workflow

  • [ ] https://github.com/roaldarbol/rpix/issues/2. Upload the rpix package to r-forge. Afterwards I'll implement features as needed.
  • [x] https://github.com/prefix-dev/pixi/issues/786
  • [x] Once templating is implemented, create a template (or once it's decided whether it will use copier) - done, here's the template

Docs

  • [ ] https://github.com/prefix-dev/pixi/issues/1601

roaldarbol avatar Jun 26 '24 09:06 roaldarbol

This looks good! Thanks for the issue :)

tdejager avatar Jun 28 '24 07:06 tdejager

Would be awesome! Any chance it could also support installing R packages from (public) GitHub repos (i.e. not on CRAN or Bioconductor)?

andreiprodan avatar Jul 04 '24 13:07 andreiprodan

At least initially, no. But it would include everything on R-Universe, where you can find the vast majority of packages - and creating a universe for your packages is quite simple.

I guess @wolfv would know how much more complicated it is to install directly from a GitHub source, and whether it's feasible. Currently it uses the JSON from the r-universe API (e.g. https://stan-dev.r-universe.dev/api/packages/). I think that to support GitHub packages, a new parser would need to be written for DESCRIPTION files. 😊
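For a concrete sense of what that JSON parsing involves, here's a minimal Python sketch against a hand-made payload shaped like the r-universe packages API (the `_dependencies` field structure is based on the public API, but the sample data and the `runtime_deps` helper are illustrative, not part of any real tool):

```python
import json

# Hypothetical excerpt shaped like an r-universe /api/packages entry;
# real responses carry many more fields.
sample = json.loads("""
[
  {
    "Package": "cmdstanr",
    "Version": "0.8.0",
    "_dependencies": [
      {"package": "R", "version": ">= 3.5.0", "role": "Depends"},
      {"package": "jsonlite", "role": "Imports"}
    ]
  }
]
""")

def runtime_deps(pkg):
    """Collect Depends/Imports entries, skipping the R version constraint."""
    return [
        d["package"]
        for d in pkg.get("_dependencies", [])
        if d.get("role") in ("Depends", "Imports") and d["package"] != "R"
    ]

for pkg in sample:
    print(pkg["Package"], pkg["Version"], runtime_deps(pkg))
```

A recipe generator would then translate each dependency name to its conda-forge equivalent (usually just adding the r- prefix).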

I hope to make a write-up of all the R goodies quite soon!

roaldarbol avatar Jul 04 '24 14:07 roaldarbol

Yeah, there is no technical limitation why it wouldn't work directly from a GitHub package. Right now we just use the easy-to-parse JSON, but you could also write a little parser for the R-native file - or write the recipe yourself! :)

wolfv avatar Jul 05 '24 11:07 wolfv

Just to offer some insight: the DESCRIPTION file is just a Debian control file, except it's encoded in ASCII and does not support comments. In theory it should be as easy as using an existing parser for these, such as debian-control. All info about the file (and R packages) can be found in the manual.
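To illustrate how simple the format is, here's a toy Python parser that handles only the two essentials (`Field: value` lines and whitespace-indented continuation lines) and none of the corner cases a real Debian-control parser covers:

```python
def parse_description(text):
    """Parse a DESCRIPTION file (Debian-control style: 'Field: value',
    continuation lines begin with whitespace)."""
    fields = {}
    key = None
    for line in text.splitlines():
        if line[:1] in (" ", "\t"):  # continuation of the previous field
            if key is not None:
                fields[key] += " " + line.strip()
        elif ":" in line:
            key, _, value = line.partition(":")
            key = key.strip()
            fields[key] = value.strip()
    return fields

sample = """\
Package: dplyr
Version: 1.1.4
Depends: R (>= 3.5.0)
Imports: cli (>= 3.4.0),
    generics,
    glue (>= 1.3.2)
"""
parsed = parse_description(sample)
print(parsed["Package"], "->", parsed["Imports"])
```

Splitting the comma-separated `Imports` value into names and version constraints is the next (slightly fiddlier) step.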

notPlancha avatar Jul 10 '24 12:07 notPlancha

This is awesome. Thanks for all the hard work to support R users.

A few questions:

Mass packaging of R packages from R-Universe to conda-forge or r-forge

What is r-forge? Are you planning to create a new channel on anaconda.org? I confirmed that there is currently no channel with that name: https://anaconda.org/r-forge

Also, heads up that the name could cause some potential confusion in the R community. R-Forge has already existed for years (it's an R-specific source control system). It's not as popular now that GitHub exists, but saying "download the package from r-forge" could be ambiguous.

For the mass packaging, have you coordinated with the conda-forge R maintainers (@conda-forge/r)? They do a lot of work to maintain thousands of R packages. It's a lot of work to keep up with all the conda-forge migrations.

jdblischak avatar Jul 15 '24 15:07 jdblischak

@jdblischak Thanks! Really good question, thanks for asking! Most of this was implicit knowledge, but seeing as this issue is getting a bit of traction, I've updated that section now. I haven't been in touch with the conda-forge R maintainers except for trying to create recipes for packages - and though all my attempts failed, they were super helpful. Overall I can agree: it's hard work. It also seems that the packages are not updated very regularly (https://anaconda.org/r/repo).

For reference see the edit of the initial post. With rattler-build and the R-universe API, it's been super easy to create package recipes, and I think it'll be feasible to have automated packaging without too many edge cases (but @wolfv will know much better than me whether that's the case - I'm a dreamer 😉). We also talked about automatic cron jobs to check whether a package has been updated. R-universe does this once every hour - don't know if it'd need to be as often, but possibly once every day might be reasonable.

I didn't know about the other R-Forge - good to know. I think @wolfv just chose the name as a way to have language-specific forges (e.g. see also rust-forge).

roaldarbol avatar Jul 15 '24 17:07 roaldarbol

Hey @jdblischak, indeed, thanks! I was working on more automatic recipe generation for R recipes from inside rattler-build. The r-forge stuff is currently here: https://github.com/wolfv/r-forge and follows a common pattern (I have also tried a julia-forge and a rust-forge, but DISCLAIMER: this is all just prototypey).

I think for certain ecosystems (such as potentially R) it makes sense to maintain them in a more centralized manner than conda-forge. Since most recipes can be auto-generated and updated I think a mono-repo approach for most R packages could be potentially useful. But of course that's something to discuss with conda-forge and even more importantly the current maintainers of R packages in conda-forge.

The "forges" use the rattler-build --recipe-dir . --skip-existing=all functionality of rattler-build to find all packages, build them in the correct order and skip everything that has already been built.

As far as recipe-generation goes I would like to come up with a generalized "patching" functionality so that the bulk of the recipe is generated, and then enhanced with patches (e.g. to add system-dependencies).
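One way such patching could work (purely a sketch of the idea, not rattler-build's actual mechanism) is a recursive overlay of a hand-written patch onto the auto-generated recipe:

```python
def merge_patch(base, patch):
    """Recursively overlay `patch` onto `base`: dicts are merged
    key-by-key, lists concatenated, scalars replaced."""
    if isinstance(base, dict) and isinstance(patch, dict):
        out = dict(base)
        for k, v in patch.items():
            out[k] = merge_patch(base[k], v) if k in base else v
        return out
    if isinstance(base, list) and isinstance(patch, list):
        return base + patch
    return patch

# Hypothetical auto-generated recipe fragment plus a manual patch
# adding a system dependency (names are illustrative).
generated = {
    "package": {"name": "r-xml2", "version": "1.3.6"},
    "requirements": {"run": ["r-base"]},
}
patch = {"requirements": {"host": ["libxml2"], "run": ["libxml2"]}}

recipe = merge_patch(generated, patch)
print(recipe["requirements"])
```

The appeal is that regenerating the bulk of the recipe on every upstream release leaves the small patch file untouched.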

wolfv avatar Jul 15 '24 17:07 wolfv

@wolfv I also think it might be worth getting in touch with the conda-forge R maintainers at some point, but I reckon it's probably better you than me - I simply don't know enough about the packaging process.

roaldarbol avatar Jul 15 '24 17:07 roaldarbol

Most of this was implicit knowledge, but seeing as this is issue is getting a bit of traction, I've updated that section now.

@roaldarbol Thanks! It is much clearer now.

It also seems that the packages are not updated very regularly (https://anaconda.org/r/repo).

That is the "r" channel, part of the "defaults" channel provided by the Anaconda developers. It has nothing to do with the community channel conda-forge.

I think for certain ecosystems (such as potentially R) it makes sense to maintain them in a more centralized manner than conda-forge. Since most recipes can be auto-generated and updated I think a mono-repo approach for most R packages could be potentially useful. But of course that's something to discuss with conda-forge and even more importantly the current maintainers of R packages in conda-forge.

@wolfv I agree there would be advantages to a centralized mono-repo approach. This has been discussed before, e.g. in https://github.com/bgruening/conda_r_skeleton_helper/issues/48. But this would be a huge change, both technically and socially. On the technical side, we'd have to figure out how to apply the conda-forge migrations to this monorepo. On the social side, we'd have to document that R users no longer submit new recipes to staged-recipes or open an issue on an individual feedstock, but instead must direct all their activities to the new mono-repo.

The "forges" use the rattler-build --recipe-dir . --skip-existing=all functionality of rattler-build to find all packages, build them in the correct order and skip everything that has already been built.

I also worry about duplication of effort. In addition to the existing CRAN skeleton for conda-build, grayskull now also supports R recipes. With the addition of rattler-build, there are now at least 3 different ways to generate an R recipe.

Here's an old PR to the CRAN skeleton that attempted to directly parse the SystemRequirements field (https://github.com/conda/conda-build/pull/3826). A better approach today would be to use an existing database, such as https://github.com/rstudio/r-system-requirements, but even this would require mapping the Linux package names to the correct corresponding conda package names.
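That mapping step could start as a simple lookup table; the entries below are illustrative guesses at Debian-to-conda-forge name pairs, not an authoritative mapping:

```python
# Hypothetical mapping from Debian/Ubuntu dev-package names (as they
# appear in system-requirements databases) to conda-forge package names.
DEB_TO_CONDA = {
    "libcurl4-openssl-dev": "libcurl",
    "libxml2-dev": "libxml2",
    "libssl-dev": "openssl",
    "pandoc": "pandoc",
}

def to_conda(deb_packages):
    """Translate known names; keep unknown names aside for manual review."""
    translated, unknown = [], []
    for name in deb_packages:
        if name in DEB_TO_CONDA:
            translated.append(DEB_TO_CONDA[name])
        else:
            unknown.append(name)
    return translated, unknown

print(to_conda(["libxml2-dev", "libfoo-dev"]))
```

Anything landing in the `unknown` bucket is exactly where manual patching of the generated recipe would kick in.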

As far as recipe-generation goes I would like to come up with a generalized "patching" functionality so that the bulk of the recipe is generated, and then enhanced with patches (e.g. to add system-dependencies).

That's how NixOS builds its R packages. The recipes are auto-generated from a script, and then system requirements and other patches are added afterwards:

https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/r-modules/default.nix

jdblischak avatar Jul 15 '24 18:07 jdblischak

That is the "r" channel, part of the "defaults" channel provided by the Anaconda developers. It has nothing to do with the community channel conda-forge.

Ah, my bad! No problem on that front then. 😊

I also worry about duplication of effort. In addition to the existing CRAN skeleton for conda-build, grayskull now also supports R recipes. With the addition of rattler-build, there are now at least 3 different ways to generate an R recipe.

grayskull has some shortcomings when it comes to packaging R packages, and conda_r_skeleton_helper is still the preferred way. I saw that you've been following these developments for quite some time (https://github.com/bgruening/conda_r_skeleton_helper/issues/58), so you'll know more about the history and shortcomings of the current methods than me.

@wolfv Is the plan that rattler-build will replace the need for grayskull for creating recipes? I noticed there's an issue there about adding the new recipe format to grayskull.

roaldarbol avatar Jul 15 '24 19:07 roaldarbol

I think for certain ecosystems (such as potentially R) it makes sense to maintain them in a more centralized manner than conda-forge. Since most recipes can be auto-generated and updated I think a mono-repo approach for most R packages could be potentially useful. But of course that's something to discuss with conda-forge and even more importantly the current maintainers of R packages in conda-forge.

@wolfv I've been thinking about this more, and I think the key is your use of most. Some R packages require much more maintenance. I'm thinking of packages like r-arrow and r-tiledb that require careful pinning to the correct version of their corresponding C++ library. Maintainers will want to be notified when their package has an update PR, and they will not want to give up their write-access (full disclosure: I am a maintainer of r-tiledb). But with the mono-repo approach, there is no way (that I am aware of) to only receive notifications when certain files are touched by a PR, or only grant write-access to specific files within a PR.

As a first pass, I would recommend starting the mono-repo with all the conda-forge R feedstocks that have conda-forge/r as the sole maintainer. After that is working, you could open issues on the remaining feedstocks offering to take over maintenance by adding the recipe to the mono-repo and archiving the feedstock.

jdblischak avatar Jul 16 '24 19:07 jdblischak

I'll just try to parse out which separate issues I see in this conversation so we can create separate issues for them in the appropriate location:

  1. Application of patches. Seems like a rattler-build issue.
    • Lots of good info here though, thanks for the links @jdblischak!
    • @jdblischak I'm not familiar with the intricacies of packaging complex packages, so I hope it's okay if I ask some stupid questions. I also tried reading up a bit in the Writing R Extensions manual and learned a bit, but it's a mammoth document... Do I understand correctly that SystemRequirements is a quite unstructured field compared to Depends, License, etc.? And that https://github.com/rstudio/r-system-requirements then is a set of rules for attempting to parse the relevant system requirements and their versions? Is it mostly build-time requirements that are needed? Do you have an estimate of the proportion of packages that need manual patching?
  2. General packaging strategy. Could possibly be an issue on r-forge as that's currently the prototype/playground for the automated packaging.
    • For me, the intriguing aspect is the automation. Currently, users are encouraged to contribute a recipe to conda-forge if it's not present and to accept responsibility for maintaining it - a big ask for your average R user, and unlike Python devs, R devs don't think of contributing to conda. (I guess it's possible, although not mentioned, to request a package - but that has happened only a single time, not counting mfansler.) Automated pulls from the r-universe API would bridge that gap; most packages don't have tricky packaging needs and don't require manual maintenance. In the short/intermediate term, both conda-forge and r-forge could be used as channels with great coverage - and because pixi has strict channel priority, conda-forge would always take priority if the user wants it to.

Can I just say what I'm realising: Packaging is hard. Y'all are doing an amazing job.

roaldarbol avatar Jul 16 '24 22:07 roaldarbol

Instead of figuring out all these packaging problems, I wonder if it would be a feasible alternative to support renv instead, which already implements lock files, git remotes, and prebuilt packages via p3m. Something similar to how PyPI dependencies are deferred to uv, if that makes sense?

phue avatar Jul 17 '24 08:07 phue

my 2 cents as a daily renv user: it's IMHO the best current solution for managing R packages. The downsides of renv (it cannot provide R itself or system libraries, like conda/mamba can) would be nicely addressed by pixi

andreiprodan avatar Jul 17 '24 08:07 andreiprodan

Do I understand correctly that SystemRequirements is a quite unstructured field compared to Depends, License, etc.?

Correct. It is purely to inform end users. It is completely optional/voluntary, and R itself never parses it.

And that https://github.com/rstudio/r-system-requirements then is a set of rules for attempting to parse the relevant system requirements and their version?

Correct. They used to maintain an explicit database with the system requirements mappings, sysreqsdb, but from the README of r-system-requirements, apparently that manual approach was too cumbersome.

Is it mostly build-time requirements that are needed? Do you have an estimate of the proportion of packages that need manual patching?

Hard to say, especially build-time versus run-time. Looking at the manual for the R package {pak}, which uses r-system-requirements, it explicitly states that it doesn't attempt to distinguish between build-time and run-time. I attempted to do a quick analysis with their function sysreqs_db_list(), but I couldn't figure it out. I did a spot check of packages with obvious build-time ({curl} requires libcurl4-openssl-dev, {xml2} requires libxml2-dev) and run-time ({rmarkdown} requires pandoc) dependencies, but these were all empty. Presumably I am doing something wrong, since there is clearly a pandoc rule in r-system-requirements.

sysreqs <- pak::sysreqs_db_list(sysreqs_platform = "ubuntu-22.04")
str(subset(sysreqs, name == "curl"))
## Classes ‘tbl’ and 'data.frame':	0 obs. of  5 variables:
##  $ name        : chr
##  $ patterns    : list()
##  $ packages    : list()
##  $ pre_install : list()
##  $ post_install: list()
str(subset(sysreqs, name == "xml2"))
## Classes ‘tbl’ and 'data.frame':	0 obs. of  5 variables:
##  $ name        : chr
##  $ patterns    : list()
##  $ packages    : list()
##  $ pre_install : list()
##  $ post_install: list()
str(subset(sysreqs, name == "rmarkdown"))
## Classes ‘tbl’ and 'data.frame':	0 obs. of  5 variables:
##  $ name        : chr
##  $ patterns    : list()
##  $ packages    : list()
##  $ pre_install : list()
##  $ post_install: list()
str(subset(sysreqs, name == "chrome"))
## Classes ‘tbl’ and 'data.frame':	1 obs. of  5 variables:
##  $ name        : chr "chrome"
##  $ patterns    :List of 1
##   ..$ : chr "\\bchrome\\b"
##  $ packages    :List of 1
##   ..$ : NULL
##  $ pre_install :List of 1
##   ..$ : chr  "[ $(which google-chrome) ] || apt-get install -y gnupg curl" "[ $(which google-chrome) ] || curl -fsSL -o /tmp/google-chrome.deb https://dl.google.com/linux/direct/google-ch"| __truncated__ "[ $(which google-chrome) ] || DEBIAN_FRONTEND='noninteractive' apt-get install -y /tmp/google-chrome.deb"
##  $ post_install:List of 1
## ..$ : chr "rm -f /tmp/google-chrome.deb"

Anyways, one useful metric is how many packages require compilation. This will give you a sense of how many are trivial to build binaries for. You'll also want to investigate any packages with restrictive licenses.

x <- as.data.frame(available.packages())
table(x$NeedsCompilation)
## 
##    no   yes 
## 16267  4760 
table(x$License_restricts_use == "yes")
## 
## FALSE  TRUE 
##     9     3 

You can also look at how many R packages NixOS patches for a rough estimate of the number of packages that have system requirements, as well as those they have marked as broken:

jdblischak avatar Jul 18 '24 18:07 jdblischak

Just wanna say that pixi was the best thing that happened to python in a very long time.

It is solving all python package dependency issues that were a nightmare for developers.

I really hope pixi keeps improving and expanding to other languages (especially R and Julia).

I'd like to thank all the contributors of this project, pixi is really incredible.

GitHunter0 avatar Oct 30 '24 15:10 GitHunter0

Not sure it'll be useful, but hopefully it can at least serve as a reference: I wrote a DESCRIPTION file parser at andystopia/cran-work. A crate in the repo tests the parser against the most recent version of every package's DESCRIPTION file (21,821 files). Result: 0 errors.

andystopia avatar Dec 09 '24 19:12 andystopia

I've worked on this a little more, and my repo, andystopia/cran-work, can now generate rattler-build files from CRAN & Bioconductor DESCRIPTION files directly, including historical versions of packages!

The following command will generate an r-matrix directory with a build YAML contained within, and should be sufficient to build the latest Matrix package from CRAN.

cargo run --release -p description-to-rattler -- cran recipe Matrix --export

You can leave off --export if you just want the definition printed to stdout.

andystopia avatar Dec 11 '24 04:12 andystopia

Small update: I've created a corpus of R DESCRIPTION files here: https://github.com/andystopia/cran_description_files. It contains the 177k+ CRAN descriptions to date. My goal is to convert these to build recipes, or at least a more universal file format. I imagine that with a little work it'd be possible to make an index of R packages that could make version solving faster; CRAN is pretty slow, sometimes prohibitively so, to query for versions. There is a README, but right now it's not showing on GitHub (I'm not sure it will for a repo of this size).

The index creator will either entirely download, or update an existing corpus with new descriptions, when run. It takes several hours (~9-11 hours) to create this index from scratch, so I recommend trying to update the existing index instead (< 10 minutes).

andystopia avatar Mar 07 '25 04:03 andystopia

This is great @andystopia, thanks for putting that together!

@wolfv what is the status on building R packages with rattler-build at the moment? (and the automatic package builds). I'll have a bit more time in the coming months, and would really love to begin getting the pixi/conda-forge/R combination off the ground. If we can parse the DESCRIPTION files, are we then getting significantly closer to being able to generate package builds on r-forge initially?

I'll give {rpix} some love in the coming weeks/months too - to those of you who use R and want to rely on pixi/conda-forge: could you maybe jump over to the discussions and let me know what you would consider essential functionality?

roaldarbol avatar Mar 10 '25 12:03 roaldarbol

@roaldarbol I wanted to share my personal experience as an R-user trying to achieve reproducibility:

Pixi is great: it's super fast, easy to use, and has nice docs. However, it's not really usable with R because conda-forge packages are outdated. E.g. this very popular R package cannot be used with the most recent R version and only works with r-base >=4.4,<4.5: https://prefix.dev/channels/conda-forge/packages/r-seurat

Okay let's use R 4.4.3 then.

If we then try to install https://prefix.dev/channels/conda-forge/packages/r-rpresto (which is an optional, but very useful, dependency), this fails because it requires r-base >=4.3,<4.4.0a0.

To create reproducible R projects, I use https://github.com/ropensci/rix, which is based on nix. It covers nearly all CRAN and Bioconductor packages (updated daily, checked weekly), plus you can include any github package (sometimes requires manual patches, which can be easily done in the default.nix file).

EDIT: @roaldarbol, I've also tried {rpix}, which worked nicely for me despite being in alpha (at least the add function). However, I think it's fine to run pixi add outside of the shell. And the elephant in the room is the missing coverage of up-to-date packages (at least most CRAN + Bioconductor packages should work out of the box; some support for packages only on GitHub would also be nice).

EDIT2: I think a good solution would be similar to https://github.com/prefix-dev/pixi/issues/1543#issuecomment-3193917616. Optimally, rebuilding all CRAN + Bioconductor packages regularly (weekly, maybe, like rix does) + giving users the possibility to add r-universe (or better, GitHub) recipes.

mihem avatar Aug 16 '25 21:08 mihem

not really usable with R because conda-forge packages are outdated.

@mihem thanks for sharing your experience. Friendly reminder that the conda-forge R packages are maintained by a (very small) group of volunteers. Any assistance from other R users is welcome and appreciated.

E.g. this very popular R package cannot be used with the most recent R version and only works with r-base >=4.4, <4.5

No packages have been available for R 4.5 because the transition was delayed due to problems building on Windows. The R 4.5 migration recently started, and thus soon all conda-forge R packages should be available for R 4.5.

If we then try to install https://prefix.dev/channels/conda-forge/packages/r-rpresto (which is an optional, but very useful dependency), this fails because it requires r-base >=4.3,<4.4.0a0 .

I am having trouble reproducing this experience. The most recent version of RPresto, 1.4.7, was released to CRAN on 2025-01-08. It was built for R 4.3 and 4.4 (https://github.com/conda-forge/r-rpresto-feedstock/commit/244ae5effecefcb511ad2c2cb1118602265f4541) and uploaded to the conda-forge channel on the very same day.

Using pixi, I am able to install the latest r-rpresto and r-seurat with R 4.4:

date
## Wed Sep 10 13:03:40 EDT 2025
pixi --version
## pixi 0.54.2
mkdir rconda
cd rconda/
pixi init
pixi add r-base r-rpresto r-seurat
## ✔ Added r-base >=4.4.3,<4.5
## ✔ Added r-rpresto >=1.4.7,<2
## ✔ Added r-seurat >=5.3.0,<6
pixi run Rscript --version
## Rscript (R) version 4.4.3 (2025-02-28)
pixi run Rscript -e 'packageVersion("RPresto")'
## [1] ‘1.4.7’
pixi run Rscript -e 'packageVersion("Seurat")'
## [1] ‘5.3.0’

jdblischak avatar Sep 10 '25 17:09 jdblischak

@jdblischak thanks for your response.

I know that this is done by a small group of volunteers and I am very grateful for that. I just wanted to say that, in my (very limited) experience, the claim that pixi works well with many Python and R packages unfortunately does not hold for R packages. I would be willing to try to contribute there, but the experience with Nix/rix for R packages has been so smooth. And the proposal made here to build all/most CRAN packages would of course be preferable.

Thanks, will try this again, maybe also make this reproducible using Docker.

EDIT: Agreed, I can install those two packages with R 4.4.3. Sorry for that. But I think the main point remains that many conda-forge packages are missing or outdated, and R 4.5.0 was published 5 months ago. I completely understand that this is not easy for a few volunteers, so the approaches named above sound quite promising... or we all just switch to nix and support Python packages there (then we'd have one ecosystem providing reproducible solutions for R and Python) :)

mihem avatar Sep 10 '25 17:09 mihem

Quick note: I'm new on this thread but have been using R for 18+ years and have a peer-reviewed package on rOpenSci/CRAN. I'd suggest looping in Jeroen Ooms (who leads R-universe and helps run rOpenSci). CRAN is the main R repo, but Jeroen is very experienced in open-source package administration and repositories, and would be the best person to advise here.

On packaging: conda-forge works great when a package exists, but several important ones (e.g. r-igraph) lag, and there’s a long tail that’s missing entirely. One practical idea:

System requirements mapping: for a reasonable mapping from SystemRequirements in DESCRIPTION → build deps, consider Posit's rstudio/r-system-requirements (MIT-licensed): https://github.com/rstudio/r-system-requirements, with the JSON rules db at https://github.com/rstudio/r-system-requirements/tree/main/rules
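Each rule in that database pairs regex patterns with per-distro package lists; matching a free-text SystemRequirements field against them might look like this (the rules below are abbreviated paraphrases in the spirit of the database, not copied from it):

```python
import re

# Abbreviated, paraphrased rules: each pairs regex patterns with the
# Debian packages to install when a pattern matches.
RULES = [
    {"patterns": [r"\blibcurl\b", r"\bcurl\b"], "debian": ["libcurl4-openssl-dev"]},
    {"patterns": [r"\blibxml-?2\b"], "debian": ["libxml2-dev"]},
    {"patterns": [r"\bpandoc\b"], "debian": ["pandoc"]},
]

def match_sysreqs(text):
    """Return the Debian packages whose rule patterns match the field."""
    found = []
    for rule in RULES:
        if any(re.search(p, text, flags=re.IGNORECASE) for p in rule["patterns"]):
            found.extend(rule["debian"])
    return found

print(match_sysreqs("libcurl: libcurl-devel (rpm) or libcurl4-openssl-dev (deb)"))
```

A conda-based pipeline would add one more hop, translating the matched distro names to conda-forge package names.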

Why this might interest Jeroen: R integration in pixi is a path to bringing R code (the long tail across bio, geospatial, stats, etc.) into a modern, reproducible package manager with a very fast Rust dependency solver and an overall mission similar to R-universe's: broader coverage and better tooling.

I’ll try to flag @Jeroen Ooms to chime in here!

soshsquatch avatar Nov 09 '25 19:11 soshsquatch

If @jeroen (https://github.com/jeroen ) doesn't reply I wonder if @juliasilg of Posit could maybe chime in. The SystemRequirement nix and rix rules could maybe be integrated into the Posit r-system-requirements /Pak rules into a single database of rules, until something more formal comes along in R. Last I heard, @jerone used r-lib/Pak on r-universe to standardize SystemRequirements.

soshsquatch avatar Nov 11 '25 00:11 soshsquatch

If @jerone doesn't reply I wonder if @juliasilg of Posit could maybe chime in. The SystemRequirement nix and rix rules could maybe be integrated into the Posit r-system-requirements /Pak rules into a single database of rules, until something more formal comes along in R. Last I heard, @jerone used r-lib/Pak on r-universe to standardize SystemRequirements.

I think you tagged the wrong person, I know nothing about this topic.

jerone avatar Nov 11 '25 21:11 jerone

If @jerone doesn't reply I wonder if @juliasilg of Posit could maybe chime in. The SystemRequirement nix and rix rules could maybe be integrated into the Posit r-system-requirements /Pak rules into a single database of rules, until something more formal comes along in R. Last I heard, @jerone used r-lib/Pak on r-universe to standardize SystemRequirements.

I think you tagged the wrong person, I know nothing about this topic.

Sorry, I meant to tag @jeroen of r-universe: https://github.com/jeroen

soshsquatch avatar Nov 11 '25 21:11 soshsquatch