hfppl
Probabilistic programming with HuggingFace language models
There are cases (e.g., testing / debugging) where it's useful to sample with temperature=0. However, LMContext currently requires 0 < temperature < infinity: https://github.com/probcomp/hfppl/blob/a191fca4f2055094ede389d10f63e6250ab4601e/hfppl/distributions/lmcontext.py#L112 Setting it to 0 results in...
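A minimal sketch (not hfppl code, and not necessarily how LMContext samples internally) of why temperature=0 needs a special case: dividing logits by a zero temperature is undefined, but the zero-temperature limit of softmax sampling is simply greedy argmax selection, so a dedicated branch can support it.

```python
import torch

def sample_token(logits: torch.Tensor, temperature: float) -> int:
    """Illustrative temperature sampling with an explicit temperature=0 path."""
    if temperature == 0.0:
        # Zero-temperature limit: deterministic argmax (greedy decoding).
        return int(torch.argmax(logits).item())
    # Standard temperature sampling for temperature > 0.
    probs = torch.softmax(logits / temperature, dim=-1)
    return int(torch.multinomial(probs, num_samples=1).item())

logits = torch.tensor([1.0, 3.0, 0.5])
print(sample_token(logits, temperature=0.0))  # always index 1
print(sample_token(logits, temperature=1.0))  # stochastic
```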
When running `examples/hard_constraints.py` on two GPUs, the following error occurs:

```bash
hfppl/hfppl/llms.py", line 244, in past_padded
    return torch.cat(
           ^^^^^^^^^^
RuntimeError: Expected all tensors to be on the same device, but found...
```
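A minimal sketch (not the library's fix) of the underlying PyTorch behavior: `torch.cat` raises this RuntimeError whenever its inputs live on different devices, which can happen when cached tensors end up split across two GPUs. Moving everything to a common device before concatenating avoids the error.

```python
import torch

def cat_on_common_device(tensors, dim=0):
    """Concatenate tensors after moving them all to the first tensor's device."""
    target = tensors[0].device
    return torch.cat([t.to(target) for t in tensors], dim=dim)

if torch.cuda.device_count() >= 2:
    a = torch.randn(2, 4, device="cuda:0")
    b = torch.randn(2, 4, device="cuda:1")
    # torch.cat([a, b]) would raise the device-mismatch RuntimeError;
    # aligning devices first succeeds.
    print(cat_on_common_device([a, b], dim=0).shape)
```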