
Training and Dataset

Open onlyscc opened this issue 2 years ago • 14 comments

Could the authors kindly share the training code and dataset?

onlyscc avatar Feb 06 '23 21:02 onlyscc

We'll consider sharing the training code after paper acceptance. Open-sourcing the training pipeline now could cause issues if the paper is later rejected. Thanks for your patience and understanding.

haoheliu avatar Feb 06 '23 22:02 haoheliu

Thanks for your reply. Good luck with your excellent work.

onlyscc avatar Feb 07 '23 20:02 onlyscc

Any updates? Really interested in this.

chavinlo avatar Feb 17 '23 07:02 chavinlo

What about the dataset? Could the authors kindly share it? Thank you!!!

zjzser avatar Mar 08 '23 00:03 zjzser

@zjzser The datasets we used are all open-source. Please find the details in our paper. Thanks.

haoheliu avatar Mar 08 '23 10:03 haoheliu

any updates on the training code? would be amazing to get this in the open ecosystem!

samim23 avatar Apr 21 '23 20:04 samim23

any updates on the training code?

Wang-Charles avatar May 16 '23 09:05 Wang-Charles

It seems that the paper has been accepted. Could you update the training code?

arsity avatar Jun 05 '23 12:06 arsity

I'm interested in this as well, particularly in seeing how performance shifts across epochs during training.

hykilpikonna avatar Jun 16 '23 22:06 hykilpikonna

Congratulations on your paper's acceptance! Would you be so generous as to share the training code?

AlgernonMXF avatar Jun 28 '23 09:06 AlgernonMXF

any update? looking forward to it!

dongzhuoyao avatar Aug 16 '23 12:08 dongzhuoyao

Hi, thanks for the great work. Looking forward to the release of the training pipeline, as everyone is 😊

BenoitWang avatar Oct 13 '23 11:10 BenoitWang

I would also be highly interested in the training pipeline.

limchr avatar Nov 08 '23 07:11 limchr

Will the license change to something like MIT or Apache 2.0? And would that cover the source code, the training code, or the fine-tuning code?

Tortoise17 avatar Nov 16 '23 16:11 Tortoise17