promptsource
The hosted version is down 😰
When I visit https://bigscience.huggingface.co/promptsource I get redirected to http://34.88.112.77:8501/ and get a connection timeout error.
Any way I can help?
Hi @Fraser-Greenlee, I killed it yesterday since it had close to no CPU usage... can you run it locally? I am happy to revive the instance, although I need a little more than one user from time to time to justify the cost :)
Otherwise, I can also move it to HF Spaces.
Shoot… I can run it locally but I keep running out of memory. How big is the instance you're running?
To add, I am also a user that visits from time to time. I have no issues running locally, but it's helpful if there's a hosted version so I can work with others on just figuring out what tasks to fine-tune :)
I will say that adding more visibility in the README might increase traffic. I only learned about the hosted version AFTER I had already gone through the process of installing locally (yes, it's in the README, but not as prominent as the setup instructions).
Edit: I'm just illiterate and can't read, it's pretty prominent.
thanks for the feedback @Fraser-Greenlee @Eliotdoesprogramming
@Fraser-Greenlee what do you mean by OOM? Do you have a traceback of the error? Is it in the preprocessing of the data? The server was a tiny machine with 4 vCPUs and 16 GB of RAM.
I can re-spawn the machine in the short term, although in the longer term I'll probably just put it on HF Spaces, which doesn't have a persistent disk. That means every time there is a new commit on master, the preprocessed datasets cache will be lost and you'll have to wait for dataset re-processing. Would that be a problem? At the current pace, new commits are not that frequent anyway.
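(For context: the preprocessed cache lives in the `datasets` cache directory, `~/.cache/huggingface/datasets` by default, which is exactly what disappears on a Space without a persistent disk. A minimal sketch of how that location can be redirected, assuming only the standard `datasets` API; the path below is a hypothetical example:)

```python
import os

# Redirect the `datasets` cache; this environment variable must be set
# before the library is imported. The path is just a hypothetical example.
os.environ["HF_DATASETS_CACHE"] = "/data/hf_datasets_cache"

from datasets import load_dataset

# The downloaded and preprocessed data is written under the cache directory
# above and reused on later calls, as long as that directory survives restarts.
dataset = load_dataset("ag_news")
```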
I don't think it's a problem having to wait for reprocessing. Hosting on a more affordable option definitely makes sense.
Does the reprocessing happen automatically or on selection of the dataset?
gotcha!
Yes, at selection: whenever you query a dataset, HF datasets will check whether there is a preprocessed cache; if not, it will download and preprocess it.
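In code terms, that behavior looks roughly like this (a minimal sketch with the `datasets` library; the dataset name is just an example):

```python
from datasets import load_dataset

# First call: no preprocessed cache exists yet, so the raw data is downloaded
# and preprocessed, then saved to the local cache.
squad = load_dataset("squad")

# Any later selection of the same dataset: the cached, preprocessed version is
# found and loaded directly, with no download and no reprocessing.
squad_again = load_dataset("squad")
```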