Dan Ofer

Results 56 comments of Dan Ofer

Can this be fixed in the requirements? (Or, ideally, have it import the compatible version: `tf.compat.v1.get_default_graph`?)

I have the exact same error and data setup (TF-IDF transformed data, multiple labels). `Pre-procesing for non-uniform negative sampling! epoch:0 sum of loss:4.424449801445007 epoch:1 sum of loss:4.447336554527283 epoch:2...

I have the same error: trying to run the LM example (language model training from scratch) on a new dataset, I hit the same failure.

Ditto. I assumed it was meant to be a list of the texts (loaded from the files into memory), but it fails with `num_samples should be a positive integeral value,...

I'll try it, thanks! On Wed, Jun 15, 2016 at 9:45 PM, Dmitry Ulyanov [email protected] wrote: > Hey, it's very simple, just replace all :cuda() to :cl() with find and...

+ It's also not clear how to get predictions from the trained model on new data / a new pair of drugs. Do I put in SIDER codes? STITCH? Other codes?...

The Hugging Face library in general would benefit massively from keeping things in code rather than an unholy, messy blend of CLI. (A bit like how fast-bert does it: https://github.com/kaushaltrivedi/fast-bert)...

An option to "break it down" a bit would be for the baseline "replaced token" generator to do random or uniform sampling. (And have a model/EM as the generator function...

Is it maybe relevant to add "load a pretrained (BERT) model" to this task?

They saved the model without the correct output layer; here's the fix:
```
base_model = densenet.DenseNet121(weights=None, include_top=False, input_shape=(224, 224, 3), pooling="avg")
predictions = tf.keras.layers.Dense(14, activation='sigmoid', name='predictions')(base_model.output)
base_model = tf.keras.Model(inputs=base_model.input, outputs=predictions)
base_model.load_weights("./temp/CheXNet_Keras_0.3.0_weights.h5")
base_model.layers.pop()
```
...
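As a minimal self-contained sketch of this rebuild (assuming TensorFlow 2.x; the weights path from the comment above is only illustrative, so the actual `load_weights` call is left commented out):

```python
import tensorflow as tf

# Build the DenseNet121 backbone without its ImageNet classification head;
# global average pooling reduces each image to a 1024-dim feature vector.
backbone = tf.keras.applications.DenseNet121(
    weights=None, include_top=False, input_shape=(224, 224, 3), pooling="avg"
)

# Attach the 14-way sigmoid head (one unit per pathology label).
outputs = tf.keras.layers.Dense(
    14, activation="sigmoid", name="predictions"
)(backbone.output)
model = tf.keras.Model(inputs=backbone.input, outputs=outputs)

# With the architecture now matching the checkpoint, the weights load cleanly.
# (Illustrative path from the comment above; substitute your local copy.)
# model.load_weights("./temp/CheXNet_Keras_0.3.0_weights.h5")

print(model.output_shape)  # (None, 14)
```

The key point is that the saved weights only load without error when the rebuilt graph has the same layers (backbone plus the named `predictions` head) as the model that produced the checkpoint.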