Rajat Sen
I just checked in a fresh install. Model loading and inference still work despite that error message. Please verify whether you can run inference. If so, for now please...
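If it helps, here is a minimal sketch of such an inference check, assuming the timesfm 1.0 Python API and the 200m checkpoint settings from the repo README; adjust `backend` and the checkpoint id for your setup.

```python
import numpy as np
import timesfm

# Hyperparameter values below are the published 200m checkpoint settings (assumed from the README).
tfm = timesfm.TimesFm(
    context_len=512,
    horizon_len=128,
    input_patch_len=32,
    output_patch_len=128,
    num_layers=20,
    model_dims=1280,
    backend="gpu",  # or "cpu"
)
tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")

# A toy batch of series, just to confirm the forward pass runs.
forecast_input = [np.sin(np.linspace(0, 20, 100))]
point_forecast, quantile_forecast = tfm.forecast(forecast_input, freq=[0])
print(point_forecast.shape)  # (1, horizon_len)
```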
Thanks for the question. You can use the `model.save_weights()` function instead. After that you can initialize the model and then call `load_weights()`.
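Since the surrounding context of `model` isn't shown here, the following is a purely illustrative sketch of the save / re-initialize / load pattern, with a toy Keras model standing in for the real one.

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for the real one; only the pattern matters here.
def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(1),
    ])

model = build_model()
model.save_weights("/tmp/demo.weights.h5")     # save the current weights

restored = build_model()                       # re-initialize the same architecture
restored.load_weights("/tmp/demo.weights.h5")  # then load the saved weights

# The restored model now produces the same outputs as the original.
x = np.zeros((1, 4), dtype=np.float32)
np.testing.assert_allclose(model(x).numpy(), restored(x).numpy())
```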
See an example here. It is mainly meant to familiarize users with paxml finetuning. You can of course bring your own data loader as well.
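Purely as an illustration of the kind of data loader you might bring, here is a hypothetical generator that yields (context, horizon) windows from a 1-D numpy series; the exact batch format the finetuning notebook expects is not reproduced here.

```python
import numpy as np

def window_batches(series, context_len=512, horizon_len=128, batch_size=16):
    """Hypothetical loader: yields (context, future) windows from a 1-D series."""
    max_start = len(series) - context_len - horizon_len
    starts = np.random.permutation(max_start)
    for i in range(0, len(starts) - batch_size + 1, batch_size):
        idx = starts[i:i + batch_size]
        context = np.stack([series[s:s + context_len] for s in idx])
        future = np.stack(
            [series[s + context_len:s + context_len + horizon_len] for s in idx]
        )
        yield context, future  # shapes: (batch, context_len), (batch, horizon_len)

# Example usage on a synthetic series.
series = np.sin(np.linspace(0, 100, 10_000))
context, future = next(window_batches(series))
print(context.shape, future.shape)  # (16, 512) (16, 128)
```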
Sorry for the confusion. During pretraining we only train for `quantiles=[0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]`. Therefore, the current checkpoint can only infer those...
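For example, to pull one of those trained quantile tracks out of a forecast array (the array shape here is a dummy stand-in, assumed only for illustration):

```python
import numpy as np

quantiles = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]

# Dummy stand-in for the model's quantile output: (batch, horizon, n_quantiles).
quantile_forecast = np.random.rand(2, 128, len(quantiles))

# Select the 0.9 quantile track by its position in the trained quantile list.
q90 = quantile_forecast[..., quantiles.index(0.9)]
print(q90.shape)  # (2, 128)
```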
Can you try setting the environment variable `XLA_PYTHON_CLIENT_PREALLOCATE=false`?
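For instance, at the very top of your script, before JAX initializes the GPU backend:

```python
import os

# Must be set before JAX allocates the GPU backend, i.e. before importing/using jax.
os.environ["XLA_PYTHON_CLIENT_PREALLOCATE"] = "false"

import timesfm  # imported after setting the flag so it takes effect
```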
While this is not the whole pretraining code, we have provided code to finetune the model here: https://github.com/google-research/timesfm/tree/master/notebooks