Julian Risch
Hi @sjrl, we are planning a release in the next two weeks. Could this PR maybe make it into the new release? Have you had the chance to test multi-GPU...
FYI: we might upgrade to torch 1.13.1 once it's released.
So what's your opinion on the best way forward? I'd say we merge the changes we have so far and support just `torch.nn.DataParallel` but not `torch.nn.DistributedDataParallel`. Investigating the...
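For context, a minimal sketch of the `torch.nn.DataParallel` path discussed here; the `nn.Linear` stand-in and the tensor shapes are placeholders, not actual Haystack code:

```python
import torch
from torch import nn

# Stand-in module for the actual reader/encoder model.
model = nn.Linear(768, 768)

# DataParallel replicates the module on each visible GPU and splits the
# batch across them on every forward pass; no process-group setup needed.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

batch = torch.randn(32, 768, device=device)
with torch.no_grad():
    output = model(batch)  # inputs are scattered, outputs gathered on device 0
```

Unlike `DistributedDataParallel`, this stays single-process, which keeps the integration simple at the cost of some scaling efficiency.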
Related issue about multimodal embedding https://github.com/deepset-ai/haystack/issues/5943
I can share some intuition about why the idea of pooling won't work well for long documents. First, pooling ignores the order of the chunks/words. This is not a big...
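To make the first point concrete, here is a toy illustration (random placeholder embeddings, not real model outputs) showing that mean pooling is permutation-invariant and therefore discards chunk order:

```python
import torch

chunk_embeddings = torch.randn(5, 384)          # 5 chunks, 384-dim embeddings
shuffled = chunk_embeddings[torch.randperm(5)]  # same chunks, different order

pooled_original = chunk_embeddings.mean(dim=0)
pooled_shuffled = shuffled.mean(dim=0)

print(torch.allclose(pooled_original, pooled_shuffled))  # True: order is ignored
```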
@tradicio Thanks for the feedback! I agree we should add this advanced metadata (`_split_id` and `_split_overlap`) in a next iteration of `DocumentSplitter`, yes. 👍
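Just to illustrate the idea, a rough sketch of what split documents carrying such metadata could look like; the field names and values below follow this discussion and are hypothetical, not the current `DocumentSplitter` output:

```python
from haystack.dataclasses import Document

source = Document(content="A long document that gets split into two chunks.")

# Hypothetical shape of split documents carrying the proposed metadata.
chunks = [
    Document(
        content="A long document that gets",
        meta={"source_id": source.id, "_split_id": 0, "_split_overlap": []},
    ),
    Document(
        content="gets split into two chunks.",
        meta={
            "source_id": source.id,
            "_split_id": 1,
            "_split_overlap": [{"doc_id": source.id, "range": (21, 25)}],
        },
    ),
]
```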
@Guest400123064 Thank you for opening this PR! We really appreciate it. Our team will need a little bit more time to review your PR. Having had a first quick look,...
@vblagoje Sounds fair 👍 It will be good to have this use case in mind once we redesign the REST API.
@wochinge What's the current situation here? 🙂 Ready to merge?
@bogdankostic @bglearning Could you share an update on Document VQA here? I know you briefly worked on it and did some research recently. 🙂