M. Yusuf Sarıgöz

Results 34 comments of M. Yusuf Sarıgöz

Suggested implementation is in the issue. WDYT? @generall and @joein

It's a zero-size tensor (a scalar), so the regular equality and inequality operators should work. We use `torch.allclose` and similar functions to compare vectors and embeddings because elementwise comparison of multi-element tensors has an ambiguous truth value.
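For context, a minimal sketch (with illustrative values, not code from the project) of why scalar tensors can use `==` directly while vectors need `torch.allclose`:

```python
import torch

# A 0-dimensional (scalar) tensor compares unambiguously: `==` yields a
# single-element boolean tensor, so a plain equality check just works.
a = torch.tensor(3.0)
b = torch.tensor(3.0)
assert bool(a == b)

# For vectors, `==` is elementwise, so `if x == y:` raises a RuntimeError
# ("Boolean value of Tensor with more than one element is ambiguous").
# `torch.allclose` reduces to one bool and tolerates float rounding error.
x = torch.tensor([1.0, 2.0, 3.0])
y = x + 1e-9
assert torch.allclose(x, y)
```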

Instead of just saving the names of the module and the class to import them, what about using [dill](https://pypi.org/project/dill/) to pickle the class directly? WDYT? @generall and @joein
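A hedged sketch of the idea; `MyEncoder` is a hypothetical class standing in for a notebook-defined encoder. Unlike plain `pickle`, which stores only a module/class reference to re-import, `dill` can serialize the class object itself by value:

```python
import dill

class MyEncoder:
    """Hypothetical encoder, e.g. defined in a notebook's __main__."""
    def encode(self, text: str) -> str:
        return text.lower()

# dill serializes the class definition by value, so restoring it does not
# require the original module to be importable.
payload = dill.dumps(MyEncoder)
RestoredEncoder = dill.loads(payload)
assert RestoredEncoder().encode("HeLLo") == "hello"
```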

Not from the same notebook, actually. My initial consideration was defining encoders in notebooks (e.g., on Colab), training models, saving a servable, and then using it elsewhere outside the notebook. But...

Another idea might be giving users a simple utility to create a boilerplate; e.g., `quaterion new project-name` could generate a basic template with dependencies defined, plus `encoders.py`, `training.py`, `inference.py`, `notebook.ipynb`, etc.
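A sketch of what such a generator might do. The `new_project` function and the file list are illustrative assumptions, not an existing Quaterion API:

```python
from pathlib import Path

# Illustrative file set for a hypothetical `quaterion new <name>` command.
TEMPLATE_FILES = ("encoders.py", "training.py", "inference.py", "notebook.ipynb")

def new_project(name: str, root: str = ".") -> Path:
    """Create a bare project skeleton with empty placeholder files."""
    project = Path(root) / name
    project.mkdir(parents=True, exist_ok=True)
    for filename in TEMPLATE_FILES:
        (project / filename).touch()
    # A real generator would also write a pyproject.toml with dependencies
    # and fill the placeholders with template code.
    return project
```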

It would also make a good competitive advantage over similar projects, and easily reproducible projects may help accelerate adoption. Raising a separate issue for this.

I guess we can do something with [cell magics](https://ipython.readthedocs.io/en/stable/config/custommagics.html) for this issue. Other alternatives, such as class serialization, are neither reliable nor safe.
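One possible shape for such a magic, sketched here as a plain function so it runs outside IPython; the `%%save_module` name is hypothetical, and the actual registration via `IPython.core.magic.register_cell_magic` is omitted:

```python
def save_and_exec(cell_source: str, path: str) -> dict:
    """Core of a hypothetical %%save_module cell magic: persist the cell's
    source to a .py file before executing it, so any class it defines stays
    importable outside the notebook."""
    with open(path, "w") as f:
        f.write(cell_source)
    namespace: dict = {}
    # Compile against the saved path so tracebacks point at the file.
    exec(compile(cell_source, path, "exec"), namespace)
    return namespace
```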

Our tutorials have some omitted code parts, and notebooks are currently not suitable for restoring a `SimilarityModel` after saving until #38. So I doubt if this will be...

> it is about tutorials which have complete examples, like nlp and cv

Yes, it might be relevant for these two tutorials after some modifications.