Pretrained Models

First of all, great paper, thanks for making the source code available!

In the paper, it says that 'all trained models are available at http://github.com/sbos/AdaGram.jl'. However, I could not find any pretrained model on GitHub, and the project homepage (http://bayesgroup.ru/adagram) gives a 404.

I was wondering if the pretrained models are indeed available somewhere already, or if you have any plans of releasing them any time soon.

Thanks!

-- AnanS

AnanS avatar Nov 18 '15 15:11 AnanS

Thank you for your interest in this project. We are planning to share the models soon; they are currently stored on AWS S3, and if you contact me by email I can send you links to them. Any suggestion of a suitable free-traffic file host is welcome.

sbos avatar Nov 19 '15 07:11 sbos

Here are a couple of models available:

http://panchenko.me/data/joint/adagram/

These were trained on a 60 GB text corpus or on Wikipedia.
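
For anyone downloading these, a rough sketch of loading such a model and querying word senses from Julia is given below. It follows the usage shown in the AdaGram.jl README; the exact function and field names (load_model, expected_pi, disambiguate, dict.word2id) are written from memory and should be checked against the README, and the model path is a placeholder for a downloaded file.

```julia
# Minimal, unverified sketch: load a pretrained AdaGram model and query senses.
# "path/to/model" is a placeholder for one of the files downloaded above.
using AdaGram

vm, dict = load_model("path/to/model")

# Prior probabilities of the learned senses of "apple"
println(expected_pi(vm, dict.word2id["apple"]))

# Posterior over the senses of "apple" given a disambiguating context
println(disambiguate(vm, dict, "apple", split("fresh tasty juice")))
```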

alexanderpanchenko avatar Jan 22 '16 20:01 alexanderpanchenko

Is it possible to get an idea of the parameters that were used to train the pre-trained models (only those that differ from the default settings)?

For example, DIM = 300, ALPHA = 0.2, MIN-FREQ = 20, SENSE-THRESHOLD = 1e-17

ClaudeCoulombe avatar Jul 03 '17 21:07 ClaudeCoulombe

All parameters are defaults except alpha.

alexanderpanchenko avatar Jul 04 '17 13:07 alexanderpanchenko

Nice, thank you Alexander!

ClaudeCoulombe avatar Jul 04 '17 19:07 ClaudeCoulombe

You are welcome. You can cite this publication, where the pretrained models are described: http://www.lrec-conf.org/proceedings/lrec2016/pdf/625_Paper.pdf

alexanderpanchenko avatar Jul 04 '17 19:07 alexanderpanchenko

Hi Alexander,

Thank you for providing the pre-trained models. I was able to work with them a few weeks ago, but today the link seems to be down.

rahuldeve avatar Feb 20 '19 10:02 rahuldeve

Some server maintenance...

alexanderpanchenko avatar Feb 20 '19 11:02 alexanderpanchenko

It should be available again soon. If you use these pretrained models, please also cite this work:

http://www.lrec-conf.org/proceedings/lrec2016/summaries/625.html

alexanderpanchenko avatar Feb 20 '19 11:02 alexanderpanchenko