Max Berrendorf

Results: 166 comments of Max Berrendorf

> @mberr can you elaborate on how the token representation might be useful for the chemical fingerprints (cf [#1509 (comment)](https://github.com/pykeen/pykeen/pull/1509#issuecomment-2668432972))?
>
> We don't seem to have any examples and...

@cthoyt , if you have time, you could check #15 with the example from https://github.com/mberr/torch-max-mem#-getting-started
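For context, the getting-started example wraps a batched kNN search in the library's decorator, which retries with a smaller batch size when the call runs out of memory. The core retry-and-halve pattern can be sketched in plain Python; the names below are illustrative, and `MemoryError` stands in for a CUDA/MPS out-of-memory error:

```python
from functools import wraps


def maximize_memory_utilization(parameter_name: str = "batch_size"):
    """Sketch of a decorator that halves the batch size until the call fits in memory."""

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            batch_size = kwargs.pop(parameter_name)
            while batch_size >= 1:
                try:
                    return func(*args, **kwargs, **{parameter_name: batch_size})
                except MemoryError:  # stand-in for a real OOM error
                    batch_size //= 2  # halve and retry
            raise MemoryError("even batch_size=1 does not fit")

        return wrapper

    return decorator


@maximize_memory_utilization()
def process(data, batch_size):
    """Toy workload that 'OOMs' whenever the batch is larger than 4."""
    if batch_size > 4:
        raise MemoryError
    return [data[i : i + batch_size] for i in range(0, len(data), batch_size)]
```

Calling `process(list(range(10)), batch_size=64)` retries at 32, 16, 8, and finally succeeds at 4, yielding `[[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]`. The real decorator additionally inspects the raised exception to distinguish OOM-like failures from genuine bugs.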

Hang on, the current state explicitly does *not* warn for `mps` by default 😅

No, only if you set a default batch size in `def knn(x, y, batch_size, k: int = 3):`

The result looks as expected:

- a warning about `mps` not being considered a safe device to run AMO on
- some result, since the tensors were of sufficiently small...

Interesting; this seems to be a `cdist`-specific OOM-like error? 🤔

EDIT: also described here: https://discuss.pytorch.org/t/runtime-error-invalid-buffer-size-when-calculating-cosine-similarity/152088
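Backend-specific failures like this are usually treated as OOM-like by substring matching on the error message, since PyTorch raises them as generic `RuntimeError`s. A hypothetical sketch of that classification (the actual fragment list in torch-max-mem differs):

```python
# Hypothetical: message fragments that indicate an out-of-memory condition.
# Matching is case-insensitive; the MPS cdist failure discussed above
# surfaces as "Invalid buffer size: ... GB".
OOM_MESSAGE_FRAGMENTS = (
    "cuda out of memory",   # CUDA allocator
    "invalid buffer size",  # MPS, e.g. from cdist
)


def is_oom_error(error: BaseException) -> bool:
    """Return True if the error message looks like an out-of-memory condition."""
    message = str(error).lower()
    return any(fragment in message for fragment in OOM_MESSAGE_FRAGMENTS)
```

With this, a retry loop can catch `RuntimeError`, re-raise anything for which `is_oom_error` is false, and only shrink the batch size for genuine OOM-like failures.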

Added that error in [a0edfbe](https://github.com/mberr/torch-max-mem/pull/15/commits/a0edfbec4b3d715fb9580c6aecf81796f7af71c3)

Re

> Should it be possible to run `knn(x, y)` with no explicit batch size?

It is relatively easy to add an automatic maximum batch size inference on top; you'll...
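One simple way to support calling without an explicit batch size, as suggested above, is to default it to the number of rows: that is a trivially correct upper bound, and the OOM-driven halving then converges on the largest value that actually fits. A hypothetical helper sketching this:

```python
from typing import Optional


def resolve_batch_size(num_rows: int, batch_size: Optional[int] = None) -> int:
    """Hypothetical helper: pick a starting batch size when none is given.

    The full input length is a safe upper bound; an OOM-driven halving
    loop (as in torch-max-mem) will shrink it to the largest size that fits.
    """
    if batch_size is None:
        return num_rows
    if batch_size < 1:
        raise ValueError(f"batch_size must be positive, got {batch_size}")
    return batch_size
```

The trade-off is the "totally epic slowdown" observed below: starting at the full size can drive the machine deep into swap before the first OOM is raised, so a cheaper memory-based initial estimate may be preferable in practice.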

> Great, just pulled and re-ran with the big number. After a totally epic slowdown of my computer and funny audio noises, here's the next output: [ca24138](https://github.com/mberr/torch-max-mem/pull/15/commits/ca241389bc61afa87d345e2f9ede2f2f219e9002)

I already merged https://github.com/mberr/torch-max-mem/pull/15 to bring the warnings and some of the unrelated improvements to `main`; the actual optimization of batch sizes does not yet seem to work (according to...