Johannes Buchner

Results: 373 comments of Johannes Buchner

Please post the entire error message and the `debug.log`. Please also post your entire arguments: `storage_backend`, `fitter_kwargs`. Perhaps there is a rounding issue, if the sampled points receive strange weights....

That's odd: `logweights` is a one-dimensional array instead of a two-dimensional one.

Could you check if changing the line https://github.com/JohannesBuchner/UltraNest/blob/v3.3.3/ultranest/integrator.py#L1543 from

> logweights = np.array(main_iterator.logweights[:itmax])

to

> logweights = np.array(main_iterator.logweights[:itmax,:])

fixes this issue?
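For context on why the explicit 2-D slice helps: applied to a one-dimensional array it fails immediately instead of silently returning the wrong shape. A minimal sketch (the array contents are made up for illustration):

```python
import numpy as np

itmax = 2
logweights_2d = np.array([[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])
print(logweights_2d[:itmax, :].shape)  # (2, 2) -- the expected case

logweights_1d = np.array([0.1, 0.3, 0.5])
print(logweights_1d[:itmax].shape)     # (2,)  -- silently stays 1-D
logweights_1d[:itmax, :]               # raises IndexError: too many indices
```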

Isn't that your likelihood though? You can test with something like:

```python
import numpy as np

# ndim, prior_transform and log_likelihood come from your own setup
ndraw_max = 500
for i in range(100):
    us = np.random.uniform(size=(ndraw_max, ndim))  # batch of unit-cube samples
    ps = prior_transform(us)                        # map to parameter space
    Ls = log_likelihood(ps)                         # vectorized evaluation
```
...

I guess, however, that running a vectorized Gaussian likelihood with ultranest would not show this extreme memory usage. So maybe there is a memory leak somewhere. You may need to...
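For reference, a minimal sketch of such a baseline check (the parameter names, bounds and the standard Gaussian here are placeholders, not from the original issue):

```python
import numpy as np
from ultranest import ReactiveNestedSampler

ndim = 3
param_names = ['p%d' % i for i in range(ndim)]

def prior_transform(cube):
    # map the unit cube to the box [-10, 10]^ndim
    return cube * 20 - 10

def log_likelihood(params):
    # vectorized standard Gaussian; params has shape (nbatch, ndim)
    return -0.5 * np.sum(params**2, axis=1)

sampler = ReactiveNestedSampler(param_names, log_likelihood, prior_transform,
                                vectorized=True)
result = sampler.run()
```

If memory stays flat with this but grows with the jax likelihood, the leak is on the likelihood side rather than in ultranest.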

Please reopen if you can reproduce this issue with a non-jax toy likelihood function. This [page](https://stackoverflow.com/questions/41105733/limit-ram-usage-to-python-program) suggests you can use ulimit or prlimit to limit the memory allowance of a...
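From within Python, the standard-library resource module provides the same kind of cap as ulimit (Unix only; the 4 GiB figure below is an arbitrary choice for illustration):

```python
import resource

# cap this process's address space at 4 GiB (soft and hard limits, in bytes)
limit = 4 * 1024**3
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

# allocations beyond the cap now raise MemoryError
# instead of dragging the machine into swap
```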

This could probably be addressed together with https://github.com/JohannesBuchner/UltraNest/issues/99

Looks like you are probably hitting the limitations of floating-point numbers. Between +0.1999999999 and +0.2000000001 there aren't a lot of representable values, so the proposal procedure may have difficulty finding new, unique...
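A quick back-of-the-envelope check with numpy shows how sparse that interval is:

```python
import numpy as np

lo, hi = 0.1999999999, 0.2000000001
ulp = np.spacing(0.2)     # gap to the next representable double, ~2.8e-17
print((hi - lo) / ulp)    # ~7e6 distinct doubles in the whole interval
```

Only on the order of ten million distinct values exist there, so a sampler proposing many points soon starts reproposing the same ones.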

Following https://conda-forge.org/blog/2020/10/29/macos-arm64/, I created https://github.com/conda-forge/conda-forge-pinning-feedstock/pull/6126.

Yes please. Some of it is documented at https://johannesbuchner.github.io/UltraNest/performance.html#output-files but it would be best to have it in the API docs.