KukumavMozolo
`pip-sync requirements_lock.txt` reports missing torch for any of these packages:
```
torch-cluster==1.6.0 \
    --hash=sha256:362076bd713268ba52ed31ac9517686e206084a50bc4ea50543bdeb992d25bfe
    # via -r requirements.in
torch-geometric==2.0.4 \
    --hash=sha256:d64e4c7486fcf0c7fa82f0acbf5dd52035855469708bf89f8bc7fc607671c8b7
    # via -r requirements.in
torch-scatter==2.0.9 \
    --hash=sha256:a3e90ca7d97b0269ba5c86995f93dedd9dba80c42a65aab51d0250581ef024b4...
```
Hi there, is there any chance that you will add torch as a requirement to setup.py or adopt [PEP 517](https://peps.python.org/pep-0517/)? I am running into problems installing this package while trying...
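For context, PEP 518 (the companion to PEP 517) lets a package declare its build-time dependencies in `pyproject.toml`, so pip installs them into an isolated environment before running the build. A minimal sketch of such a declaration, assuming torch really is needed at build time (the exact requirements for this package are an assumption):

```toml
# Hypothetical pyproject.toml fragment (PEP 518 build-system table);
# the actual build requirements of this package are an assumption.
[build-system]
requires = ["setuptools", "wheel", "torch"]
build-backend = "setuptools.build_meta"
```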
For some reason, data_point["input"] in the following code in finetune.py is always empty.
```python
def generate_and_tokenize_prompt(data_point):
    full_prompt = prompter.generate_prompt(
        data_point["instruction"],
        data_point["input"],
        data_point["output"],
    )
```
Is that intended behavior?
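An empty "input" is often expected data rather than a bug: Alpaca-style datasets leave the field blank for tasks that need no extra context, and the prompter typically switches to a shorter template in that case. A minimal sketch of that convention, using a hypothetical `generate_prompt` stand-in rather than the project's actual class:

```python
# Sketch of Alpaca-style prompt generation. The templates and the
# function below are illustrative stand-ins, not the project's code.
TEMPLATE_WITH_INPUT = (
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)
TEMPLATE_NO_INPUT = "### Instruction:\n{instruction}\n\n### Response:\n"

def generate_prompt(instruction, input_text, output):
    # Many Alpaca-style records have input_text == "", meaning the
    # instruction alone is the whole task; pick the shorter template.
    if input_text:
        prompt = TEMPLATE_WITH_INPUT.format(instruction=instruction, input=input_text)
    else:
        prompt = TEMPLATE_NO_INPUT.format(instruction=instruction)
    return prompt + output

print(generate_prompt("Say hi", "", "Hello!"))
```

So before treating the empty field as a bug, it is worth checking whether the dataset itself simply has no "input" for those records.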
Hi there, I stumbled over the problem that the algorithm tries to allocate more memory than I currently have. This happens both in this implementation and in the sklearn implementation of hdbscan...
### Describe the bug
When resuming training with train_gpt_xtts.py, the GPTArgs in the saved config.json is deserialized incorrectly as its parent XttsArgs, and therefore does not contain all required fields.
### To...
Hi, I noticed that there seems to be a duplicated file in owlready2==0.38 called driver (copie 1).py. It seems to be a duplication of driver.py. Would you mind removing or...
As far as I know, there is no implemented method that supports estimating the autocorrelated noise (noise_variance) from points_sampled via maximum_likelihood or leave_one_out. I wonder why that is and...
Hi there, I am trying to run a unit test that I wrote to test a certain functionality. It works fine if I let it execute through "sudo docker build -t...
When executing the following code:
```python
x = torch.zeros((2, 4))
y = torch.ones((1, 4))
print(x + y)
# ---> tensor([[1., 1., 1., 1.],
#              [1., 1., 1., 1.]])
x = SparseTensor.from_dense(x)
y = SparseTensor.from_dense(y)
res ...
```
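For reference, the dense result above follows NumPy/PyTorch broadcasting: the size-1 leading dimension of `y` is repeated across the 2 rows of `x`. A pure-Python sketch of that rule (no torch, just the semantics the dense addition follows):

```python
# Pure-Python illustration of row broadcasting: adding a single row
# to a (2, 4) matrix repeats the row for each of the 2 rows, which
# mirrors what the dense tensor addition above produces.
def broadcast_add(x, y_row):
    # x: list of rows, y_row: a single row of the same width as x's rows
    return [[a + b for a, b in zip(row, y_row)] for row in x]

x = [[0.0] * 4 for _ in range(2)]
y = [1.0] * 4
print(broadcast_add(x, y))  # two rows of ones, matching the dense result
```

Whether SparseTensor is expected to support the same broadcasting is the open question here.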
Hi!! I am trying to serialize a trained umap model with pickle.dumps. Unfortunately, something is going wrong: memory explodes from 5 GB to more than 252 GB, and for some reason...