There are more installation issues ...

## Summary
Succeeded: 28
Failed: 12
Total: 40

```
vagrant up
Bringing machine 'default' up with 'virtualbox' provider...
==> default: Importing base box 'precise64'...
==>...
```
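For context, a minimal Vagrantfile that targets the 'precise64' box named in the log above might look like the sketch below. It assumes the box has already been added locally (e.g. with `vagrant box add`); the exact provisioning for this project may differ.

```ruby
# Minimal Vagrantfile sketch for the 'precise64' base box referenced above.
# Assumes the box is already available locally; no provisioning is configured.
Vagrant.configure("2") do |config|
  config.vm.box = "precise64"
  config.vm.provider "virtualbox"
end
```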
PyTorch has just added support for 'mps' devices (Apple-silicon GPUs). Will this eventually be integrated into AI Feynman?
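For anyone experimenting, selecting the 'mps' backend in PyTorch (>= 1.12) looks like the sketch below; it falls back to CPU on machines without Apple-silicon support:

```python
import torch

# Use the Apple-silicon 'mps' backend when available, otherwise fall back to CPU.
# torch.backends.mps.is_available() is PyTorch's documented availability check.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
x = torch.ones(3, device=device)
print(x.device)
```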
I am on macOS 13.1 with an Anaconda virtual environment: decision-transformer-gym

Where is `FROZEN_SYM_BD` defined? E.g. `$ vi +5 conf/conf_cfgs.py`:
```python
MetaCfg = FROZEN_SYM_BD['fairdiplomacy.MetaCfg']
```
Mac OS X 10.10.5

```
David-Laxers-MacBook-Pro:python-goose davidlaxer$ python -V
Python 2.7.10 :: Anaconda 2.3.0 (x86_64)
David-Laxers-MacBook-Pro:python-goose davidlaxer$ conda -V
conda 3.16.0
David-Laxers-MacBook-Pro:~ davidlaxer$ git clone https://github.com/grangier/python-goose.git
Cloning into 'python-goose'...
remote:...
```
I have experimented with adjustments to the 'lda_loss' function, e.g. in Lda2vec.py:
```python
normalized = tf.nn.l2_normalize(self.mixture.topic_embedding, axis=1)
loss_lda = self.lmbda * fraction * self.prior() + (
    self.learning_rate * tf.reduce_sum(
        tf.matmul(normalized, normalized, adjoint_b=True, name="topic_matrix")))
```
...
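For intuition, the added term is the sum of all pairwise cosine similarities between the topic vectors; a NumPy sketch of the same quantity (toy random topics, outside TensorFlow):

```python
import numpy as np

# Sketch of the extra loss term above: after L2-normalizing each topic vector,
# matmul(normalized, normalized^T) is the matrix of pairwise cosine
# similarities, and summing it penalizes topics that point in similar directions.
topics = np.random.rand(5, 8)                       # 5 toy topics, 8-d embeddings
normalized = topics / np.linalg.norm(topics, axis=1, keepdims=True)
similarity = normalized @ normalized.T              # pairwise cosine similarities
penalty = similarity.sum()
print(penalty)
```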
"UNK" is added to the tokenizer word lists in nlppip.py because Keras's `Tokenizer` (`from keras.preprocessing.text import Tokenizer`) indexes words starting at 1:
```python
self.tokenizer.word_index[""] = 0
self.tokenizer.word_docs[""] = 0
self.tokenizer.word_counts[""] = 0
```
...
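The one-based convention can be seen without Keras; a minimal sketch (the word list is illustrative):

```python
# Keras's Tokenizer numbers words from 1, leaving index 0 unused, which is why
# the snippet above reserves index 0 for a padding/"UNK" entry.
# The same convention with a plain dict:
words = ["the", "cat", "sat"]
word_index = {w: i for i, w in enumerate(words, start=1)}  # one-based, like Keras
word_index["UNK"] = 0  # reserve index 0, mirroring the issue's snippet
print(sorted(word_index.values()))
```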
Analogy is one of the properties of dense word embedding vectors. I evaluated 'analogy' using the cosine-similarity distance metric on word embedding vectors learned from the twenty newsgroups dataset. I...
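A minimal sketch of this kind of analogy test, with toy hand-made vectors (illustrative values, not embeddings trained on twenty newsgroups):

```python
import numpy as np

# Analogy via cosine similarity: king - man + woman ~ queen.
# The vectors below are toy 3-d examples, not trained embeddings.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.8, 0.1, 0.1]),
    "woman": np.array([0.1, 0.2, 0.8]),
    "queen": np.array([0.2, 0.9, 0.8]),
    "apple": np.array([0.9, 0.05, 0.05]),
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Exclude the three query words, then pick the nearest remaining vector.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in ("king", "man", "woman")),
           key=lambda w: cosine(target, emb[w]))
print(best)
```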
Comments? https://medium.com/deep-learning-turkey/google-colab-free-gpu-tutorial-e113627b9f5d
1. vocab_size does not get set when load_embed = False. E.g.:
```python
vocab_size = embed_matrix.shape[0]
```
2. utils.load_preprocessed_data() returns 6 values (not 7) when load_embed = False:
```python
(idx_to_word, word_to_idx,...
```
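One possible guard (a sketch only; the helper name is hypothetical, though `embed_matrix` and `idx_to_word` follow the issue text) is to derive `vocab_size` from the embedding matrix when one is loaded and otherwise fall back to the vocabulary mapping:

```python
import numpy as np

def get_vocab_size(load_embed, embed_matrix=None, idx_to_word=None):
    # Hypothetical helper: use the embedding matrix's first dimension when
    # embeddings are loaded; otherwise fall back to the vocabulary mapping.
    if load_embed and embed_matrix is not None:
        return embed_matrix.shape[0]
    return len(idx_to_word)

print(get_vocab_size(False, idx_to_word={0: "UNK", 1: "the"}))
```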
I get this error when pretrained_embeddings=None:
```python
m = model(num_docs, vocab_size,
          num_topics=num_topics,
          #embedding_size=embed_size,
          restore=False,
          #logdir="/data/",
          pretrained_embeddings=None,
          freqs=freqs)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
 in ()
     25     #logdir="/data/",
     26     pretrained_embeddings=None,...
```
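A common workaround for this class of error (a sketch under stated assumptions, not the library's actual fix; the function name is hypothetical, while `vocab_size`/`embed_size` follow the issue text) is to fall back to a randomly initialized embedding matrix when none is supplied:

```python
import numpy as np

def init_embeddings(pretrained_embeddings, vocab_size, embed_size):
    # Hypothetical fallback: build a small random matrix when no pretrained
    # embeddings are passed, so downstream code always sees a valid array.
    if pretrained_embeddings is None:
        return np.random.uniform(
            -1.0, 1.0, (vocab_size, embed_size)).astype(np.float32)
    return pretrained_embeddings

E = init_embeddings(None, vocab_size=100, embed_size=16)
print(E.shape)
```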