Sean Owen


No, you need GPU RAM. Fine-tuning is a pretty different question, but broadly it will take more resources than inference.

@matthayes is working on it

I would do the latter; there's no real point in fine-tuning twice.

It will be a mix of the two. I don't think you actually want the model to unlearn everything, even if you want certain facts to take precedence. As far...

Right now you have to modify the code to point it at a different training data path, but yes.
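
For example, a rough sketch of that kind of change (the path here is just a placeholder, and this uses the Hugging Face datasets library rather than the repo's exact loading code):

```python
# Sketch: load training data from a local JSON Lines file instead of the
# default dataset on the Hugging Face Hub. The path is a placeholder.
from datasets import load_dataset

# Default behavior: pull the dataset from the Hub.
# dataset = load_dataset("databricks/databricks-dolly-15k")

# Modified behavior: read records from a local file instead.
dataset = load_dataset(
    "json",
    data_files={"train": "/path/to/my_training_data.jsonl"},
)
```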

@matthayes I see the dataset is even there, just not turned loose yet - is it meant to be private?

I asked, and it looks like the plan is to just host it on GitHub here, for now.

Scratch that - it's up now: https://huggingface.co/datasets/databricks/databricks-dolly-15k
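
If it helps, a minimal sketch of pulling it with the Hugging Face datasets library (assuming it's installed):

```python
# Sketch: load the databricks-dolly-15k dataset from the Hugging Face Hub.
from datasets import load_dataset

ds = load_dataset("databricks/databricks-dolly-15k", split="train")
print(ds[0])  # records have fields like instruction, context, response, category
```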

OK, thank you, though I have some concerns about what you suggest. For example, you don't need to download the model by hand; just let HF download it. You also...
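
For instance, rather than fetching weights manually, something like this is enough (the checkpoint shown is one of the public Dolly v2 models; swap in whichever size you want):

```python
# Sketch: let Hugging Face transformers download and cache the weights
# automatically instead of downloading them by hand.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "databricks/dolly-v2-3b"  # assumption: any Dolly v2 checkpoint loads the same way
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
```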

Thank you, though there are many videos and resources about Dolly now, so I don't think we're going to list them on the project sites.