transformerlab-app
Open Source Application for Advanced LLM Engineering: interact, train, fine-tune, and evaluate large language models on your own computer.
After the first load, the import screen shows stale info while the API call is still in flight. This is problematic if anything has changed!
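A minimal sketch of one way to avoid the stale display: clear (or flag) the previous response whenever a new fetch starts, so the screen renders a loading state instead of the old results. The hook name, the endpoint path, and the ImportableModel shape here are assumptions for illustration, not the app's actual code.

```typescript
import { useEffect, useState } from 'react';

type ImportableModel = { id: string; name: string };

function useImportableModels(apiUrl: string) {
  const [models, setModels] = useState<ImportableModel[] | null>(null);

  useEffect(() => {
    // Drop the previous results so the UI shows "loading" rather than stale data.
    setModels(null);
    fetch(`${apiUrl}/model/importable`) // hypothetical endpoint
      .then((res) => res.json())
      .then((data: ImportableModel[]) => setModels(data));
  }, [apiUrl]);

  return models; // null means "loading", not "reuse the last answer"
}
```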
I am hosting my server on AWS on a machine that reports in UTC, while working on my local client in Eastern Time. If I look at a training...
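A minimal sketch of how the client could render a server-side UTC timestamp in the viewer's local timezone. The field format (ISO-8601 string) and the idea that the raw value lacks an explicit offset are assumptions about the API payload.

```typescript
function formatJobTime(utcTimestamp: string): string {
  // Treat the string as UTC if it carries no explicit offset marker.
  const iso = utcTimestamp.endsWith('Z') ? utcTimestamp : `${utcTimestamp}Z`;
  const date = new Date(iso);
  // toLocaleString converts to the client's timezone (e.g. America/New_York).
  return date.toLocaleString(undefined, { timeZoneName: 'short' });
}

// Example: a job started at 14:30 UTC displays as 10:30 EDT for an Eastern client.
console.log(formatJobTime('2024-05-01T14:30:00'));
```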
Specifically, MLX only supports some weight file formats (safetensors and npz, I think?). We currently only check the architecture, which means you sometimes get a "No safetensors for..." error when trying...
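A minimal sketch of a weight-format check that could run alongside the architecture check, so the MLX option is only offered when a loadable weight file actually exists. The extension list and the gating logic are assumptions about how a fix could look, not the app's current behavior.

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Weight formats MLX can load, per the issue (assumed list).
const MLX_WEIGHT_EXTENSIONS = ['.safetensors', '.npz'];

function hasMlxCompatibleWeights(modelDir: string): boolean {
  // True if any file in the model directory has a format MLX can read.
  return fs
    .readdirSync(modelDir)
    .some((file) => MLX_WEIGHT_EXTENSIONS.includes(path.extname(file)));
}

// Gate the MLX runner on both checks instead of failing later at load time
// with "No safetensors for...".
```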
It looks like it is looking for local_model = true, which gets added to info.json on training but not on export or import. If we use stored_in_filesystem instead it...
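A minimal sketch of the suggested change: decide whether a model is local from stored_in_filesystem rather than local_model. The field names come from the issue; the surrounding function, the info.json location, and how callers use the result are assumptions.

```typescript
import * as fs from 'fs';
import * as path from 'path';

function isLocalModel(modelDir: string): boolean {
  const infoPath = path.join(modelDir, 'info.json');
  if (!fs.existsSync(infoPath)) return false;

  const info = JSON.parse(fs.readFileSync(infoPath, 'utf-8'));
  // local_model is only written on training; the issue suggests checking
  // stored_in_filesystem instead so exported and imported models are covered too.
  return info.stored_in_filesystem === true;
}
```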