Daniel Bast
Used conda-pack a while ago on a daily basis with pyspark, and even called it multiple times a day to distribute my current data-science conda environment to all the Spark workers...
While you can distribute on an HPC via constructor by running a command/script on all machines, the pyspark case is different, as you need a YARN-compatible .tar.gz or zip...
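As a sketch of that pyspark workflow (the environment name `myenv` and script name `your_job.py` are placeholders, not from the original comment):

```shell
# Pack the current conda environment into a relocatable archive
# that YARN can distribute (requires the conda-pack package).
conda pack -n myenv -o myenv.tar.gz

# Ship the archive to all Spark workers; the '#environment' suffix
# is the directory name the archive is unpacked into on each worker.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --archives myenv.tar.gz#environment \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=./environment/bin/python \
  your_job.py
```

This follows the pattern documented by conda-pack for Spark on YARN; in client mode, setting `PYSPARK_PYTHON=./environment/bin/python` in the submitting shell achieves the same effect.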
For inspiration (in case this PR gets improved and polished): * there is now also the [gorilla-cli](https://github.com/gorilla-llm/gorilla-cli), which speaks to a hosted Gorilla model... and as a pretty interesting user...
@jaimergp Thanks for looking into this... the PR is not really WIP anymore, but hard to test: * the conda-build feature used (with tests) to burn in that meta...
conda-build 3.26.0 is available and makes testing this much easier by logging the extra metadata.
Yay, thanks for the help here!
This is all green now. @jaimergp @goanpeca any reason not to merge?
> I don't know why npm is adding all that cruft to dist/delete/index.js :( AFAIK the reason is: every new function used from a dependency is copied into the index.js...
This requires activating Dependabot for the repo... and it will propose updates for all actions in and outside of test.yaml.
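For reference, a minimal `.github/dependabot.yml` covering GitHub Actions updates could look like this (a sketch; the weekly interval is an illustrative choice, not something stated in the comment):

```yaml
# Keep all workflow actions (test.yaml and any other workflow files)
# up to date via automated Dependabot pull requests.
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
```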
@conda-bot check