chris-aeviator
I remember Timo Schick mentioning somewhere in this repository a while ago that this is not possible yet. I used plain BERT to train my multi-class classification task.
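For context, a minimal sketch of what I mean by training plain BERT for multi-class classification with the Transformers library. The model name, label count, and dataset wiring are placeholders, not the actual task from this thread:

```python
# Minimal sketch: fine-tuning plain BERT for multi-class classification.
# Model name, num_labels, and the dataset columns are placeholders.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"
num_labels = 5  # placeholder: number of classes in the task

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=num_labels
)

def tokenize(batch):
    # Expects a "text" column; labels go in a "label" column.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

training_args = TrainingArguments(
    output_dir="bert-multiclass",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

# train_dataset / eval_dataset would be datasets.Dataset objects
# tokenized via .map(tokenize, batched=True) before passing them here:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```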
In the past I have also copied it into the `tasks.py` file, and yesterday I tried to improve on that by importing the class from the examples dir rather than adding...
@remilapeyre wouldn't it be possible to combine the two and abstract that complexity away from users? Healthchecks ftw!!
I fixed this in webpack by using `resolve.alias`, as described here: https://sanchit3b.medium.com/how-to-polyfill-node-core-modules-in-webpack-5-905c1f5504a0
It might be more useful to add a script that adds/removes containers via an orchestrator than to further pre-define them inside the compose file.
I'd like to point out that your timing (1.6 s/token) matches what I'm getting on a CPU-only server with the Hugging Face Transformers library (which uses the 24 GB version...
My measurements are on a Xeon Gold 6126 @ 2.6 GHz, and my context is roughly similar.
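For reference, a rough sketch of how one might measure per-token generation latency with Transformers on CPU. The model name, prompt, and token count are placeholders, not the exact setup discussed above:

```python
# Rough sketch of a per-token latency measurement on CPU with the
# Hugging Face Transformers library. Model, prompt, and token count
# are placeholders, not the setup from this thread.
import time

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")

new_tokens = 32
start = time.perf_counter()
with torch.no_grad():
    model.generate(**inputs, max_new_tokens=new_tokens, do_sample=False)
elapsed = time.perf_counter() - start

print(f"{elapsed / new_tokens:.2f} s/token on CPU")
```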
I'm doing this all the time; check the ENV variables for this.
I'm facing the same issue when spark-submit'ing a job from a recent Python 3.10 environment (developer's laptop) to Spark running inside this container.
> Could you provide an environment for us to consistently reproducing the issue?

`conda create -n spark310 python=3.10 && conda install pyspark && spark-submit …`

EDIT: I'm using spark-submit from...
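For completeness, a minimal PySpark job that could serve as the payload for the spark-submit reproduction above. The file name and app name are arbitrary placeholders:

```python
# repro_job.py - minimal PySpark job to submit from the Python 3.10
# client environment against the Spark instance running in the container.
# File name and app name are placeholders.
from pyspark.sql import SparkSession

if __name__ == "__main__":
    spark = SparkSession.builder.appName("spark310-repro").getOrCreate()
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    df.show()
    spark.stop()
```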