rhdong

72 comments by rhdong

> That's a very interesting proposal.
>
> From a high-level view (and I'm probably wrong), it looks like it proposes a new type of variable and a new...

> The change to the existing SparseApply* kernels, which removes Ref(T) from the signature, is backwards incompatible and can't be done. Adding new kernels for the hash apply is fine,...

@yuefengz @byronyi @alextp @smilingday @facaiy @seanpmorgan @omalleyt12 Hi all, I just committed an important update to the optimizer-reuse scheme based on ResourceVariable and came up with a detailed API design. And I...
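To make the optimizer-reuse idea above concrete, here is a minimal, purely illustrative sketch (plain Python, not the RFC's actual API; all names such as `TrainableWrapper` methods and `sgd_step` are assumptions for illustration). The gist: gather the sparse table's values for the current batch into a dense buffer that a stock optimizer can treat like an ordinary ResourceVariable, apply the update there, then scatter the result back into the table.

```python
class TrainableWrapper:
    # Illustrative sketch only -- not the RFC's real implementation.
    # A dict stands in for the hash-table variable; the dense `values`
    # buffer is what a reused optimizer would actually update.
    def __init__(self, table, dim):
        self.table = table   # key -> embedding vector
        self.dim = dim
        self.keys = []       # keys gathered for the current step
        self.values = []     # dense buffer the optimizer updates

    def gather(self, keys):
        # Pull the batch's rows into a dense buffer (missing keys -> zeros).
        self.keys = list(keys)
        self.values = [self.table.get(k, [0.0] * self.dim) for k in self.keys]
        return self.values

    def scatter_back(self):
        # Write the updated dense rows back into the sparse table.
        for k, v in zip(self.keys, self.values):
            self.table[k] = v


def sgd_step(wrapper, grads, lr=0.1):
    # A plain SGD update on the dense buffer, as any reused optimizer
    # would perform against a ResourceVariable, followed by write-back.
    wrapper.values = [
        [w - lr * g for w, g in zip(vec, gvec)]
        for vec, gvec in zip(wrapper.values, grads)
    ]
    wrapper.scatter_back()


table = {}
w = TrainableWrapper(table, dim=2)
w.gather([7, 42])
sgd_step(w, grads=[[1.0, 1.0], [0.5, 0.5]])
```

Because the optimizer only ever sees the dense buffer, existing optimizer kernels need no changes; only the gather/scatter boundary is new.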

I think this version of the scheme is simple and natural enough for core.

I will provide the source code with unit test cases soon.

> Is this RFC related to the recently proposed paper "DynamicEmbedding: Extending TensorFlow for Colossal-Scale Applications" by Google? https://arxiv.org/pdf/2004.08366.pdf

No, this is a different scheme proposed in an earlier paper...

@yuefengz @tanzhenyu @byronyi @alextp Hi, I just updated this RFC. The update contains some key features, including a scheme that is compatible with all `tf.initializer` without hacking too much on...
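The `tf.initializer` compatibility mentioned above can be sketched as follows (a conceptual plain-Python sketch, not the RFC's actual API; the `DynamicEmbedding` class and `zeros_initializer` here are illustrative assumptions). The key point: because a dynamic embedding creates rows lazily, any initializer can be plugged in as a callable that produces the value for a key seen for the first time.

```python
class DynamicEmbedding:
    # Illustrative sketch: a hash-table-backed embedding whose rows are
    # created on first lookup by a user-supplied initializer callable,
    # mimicking how any tf.initializer could plug in unchanged.
    def __init__(self, dim, initializer):
        self.dim = dim
        self.initializer = initializer  # callable: shape -> vector
        self.table = {}                 # key -> embedding vector

    def lookup(self, keys):
        out = []
        for k in keys:
            if k not in self.table:
                # Unseen key: initialize its row lazily.
                self.table[k] = self.initializer((self.dim,))
            out.append(self.table[k])
        return out


def zeros_initializer(shape):
    # Stand-in for e.g. tf.zeros_initializer; any callable with the
    # same shape -> value contract would work here.
    return [0.0] * shape[0]


emb = DynamicEmbedding(dim=4, initializer=zeros_initializer)
vecs = emb.lookup([10, 20, 10])  # key 10 is initialized once, then reused
```

Since the initializer is only invoked at the lookup boundary, swapping in a different `tf.initializer` requires no change to the table or training code.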

> is it compatible with tensorflow serving ?

@rhdong Yes

> Hi @rhdong , I fixed some bugs (shape of TrainableWrapper) and built TF 2.4.0 based on your code. It seems the dynamic_embedding didn't update during training.
>
> Code...

> > Hi @rhdong , I fixed some bugs (shape of TrainableWrapper) and built TF 2.4.0 based on your code. It seems the dynamic_embedding didn't update during training.
> >...