grok-1
Original bfloat16 weights
Can we download the original (supposedly bfloat16) weights for fine-tuning? The released checkpoint is int8-quantized, so fine-tuning from it would require dequantizing it first (see the sketch below).
Besides, it would be greatly appreciated if the pre-training details, including hyperparameters, were also open-sourced (see https://github.com/xai-org/grok-1/issues/23), as they are extremely important for conducting a full fine-tune.
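For context, a minimal JAX sketch of the dequantization step one would otherwise have to do. The names (`int8_weight`, `scales`) are illustrative and do not reflect the actual grok-1 checkpoint layout; the point is that rescaling int8 values only approximates the original bfloat16 weights, since the quantization error is not recoverable.

```python
import jax.numpy as jnp

def dequantize_to_bf16(int8_weight: jnp.ndarray, scales: jnp.ndarray) -> jnp.ndarray:
    """Rescale int8 values by their (e.g. per-channel) scales and cast to bfloat16."""
    return (int8_weight.astype(jnp.float32) * scales).astype(jnp.bfloat16)

# Toy example: a per-output-channel scale broadcast over a 2x2 weight matrix.
w_q = jnp.array([[12, -7], [3, 127]], dtype=jnp.int8)
s = jnp.array([[0.01], [0.02]], dtype=jnp.float32)
w_bf16 = dequantize_to_bf16(w_q, s)
print(w_bf16.dtype)  # bfloat16
```

This recovers something usable for inference, but not the true pre-quantization weights, which is why the original bfloat16 release matters for fine-tuning.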
I second the request for the most precise version of Grok that xAI has. Indeed, I think the unquantized version is what most clearly qualifies as "source code" (as it is the form preferred for modification), making Grok truly open source.