Results: 114 comments of Lukasz Stafiniak

Maybe this comes back in some form, but I now plan to move away from `~config` to arbitrarily labeled configurations.
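As a minimal sketch of what "arbitrarily labeled configurations" could look like in plain OCaml (this is not OCANNL code; all names here are hypothetical): instead of funneling every setting through one `~config` label, each setting becomes its own optional label, so call sites mention only what they override.

```ocaml
(* Hypothetical sketch, not OCANNL's API. *)

(* Before: a single ~config label carries everything. *)
let scale_v1 ~config x = x * config

(* After: arbitrarily named optional labels with defaults. *)
let scale_v2 ?(batch = 2) ?(epochs = 1) x = x * batch * epochs

let () =
  assert (scale_v1 ~config:3 5 = 15);
  assert (scale_v2 ~batch:4 5 = 20);  (* epochs defaults to 1 *)
  assert (scale_v2 5 = 10)            (* all defaults apply *)
```

The upside is that unrelated settings no longer need to share one record type; the downside is that defaults get baked into each function signature.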

I removed dynamic indexing because it was complicating shape inference, and tinygrad did not support it either (at the time; I'm guessing it still doesn't). In the meantime, shape inference got...

There is still a glaring bug around; I'm triaging and sprinting to fix the worst issues for a 0.6.0 release very soon. It looks to me like tinygrad is using...

It looks pretty trivial to detect and optimize the one-hot embedding pattern into a dynamic index at the low-level internal representation simplification stage. This requires extending `Low_level.t` but no changes...

I responded too quickly; in fact it is not trivial, because it also requires adding dynamic indexing to `indexing.ml`.

> How would you detect it though?

Detect a for-loop containing a sum-accumulating assignment with a reduction (a matrix multiplication), where one side of the product is an equality between `Embed_index` and `Get`. (Edit:...
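The equivalence behind this rewrite can be shown in plain OCaml over arrays (a minimal sketch, not OCANNL's internal representation; all names are hypothetical): multiplying a one-hot row against an embedding table inside a sum-reduction selects exactly the row that a dynamic index (gather) would fetch directly.

```ocaml
(* Hypothetical sketch of the pattern, using plain OCaml arrays. *)
let vocab = 4 and dim = 3

(* Embedding table: [vocab] rows of [dim] floats. *)
let table =
  Array.init vocab (fun v ->
    Array.init dim (fun d -> float_of_int ((v * 10) + d)))

(* One-hot row for a token: the Embed_index = Get equality pattern. *)
let one_hot idx = Array.init vocab (fun v -> if v = idx then 1.0 else 0.0)

(* Matmul form: a for-loop with a sum-accumulating assignment. *)
let embed_matmul idx =
  let oh = one_hot idx in
  Array.init dim (fun d ->
    let acc = ref 0.0 in
    for v = 0 to vocab - 1 do
      acc := !acc +. (oh.(v) *. table.(v).(d))
    done;
    !acc)

(* Dynamic-index form: the gather the pattern would be rewritten to. *)
let embed_gather idx = Array.copy table.(idx)

let () = assert (embed_matmul 2 = embed_gather 2)
```

The detection then amounts to pattern-matching this loop shape in the low-level IR and replacing the whole reduction with the gather.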

I think an embedding operation can already be implemented using the `range` operation with the vocabulary size as a non-tensor input. But with the shape inference machinery, I'd like to have...
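A minimal plain-OCaml sketch of the idea, assuming `range` behaves like an iota over the vocabulary size (OCANNL's actual `range` op may differ; names here are hypothetical): comparing the range against the token index yields the one-hot mask that feeds the embedding reduction.

```ocaml
(* Hypothetical sketch: range over the vocabulary, compared to the token. *)
let range n = Array.init n (fun i -> i)

(* one_hot_of_range v idx builds the mask [| ...; 1.0 at idx; ... |]. *)
let one_hot_of_range v idx =
  Array.map (fun i -> if i = idx then 1.0 else 0.0) (range v)

let () = assert (one_hot_of_range 4 2 = [| 0.0; 0.0; 1.0; 0.0 |])
```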

The other part ("Consider moving away from incremental backprop code construction, to unrestricted backward mode autodiff") got duplicated by #307 .

Marking this as "not planned", since it would be a major departure from the original design of OCANNL -- although if that design turns out to be simply stupid, as it might...

Postponing this: I need to discuss the options; this does not look like an indisputable win.