
[blocked on #20] Try sparser and not ad-hoc data structures for the scaling factors

Open · Mikolaj opened this issue 2 years ago · 1 comment

Currently we use plain vectors for the scaling factor in eval1, which is too dense and so not optimal; MatrixOuter for eval2, which is ad hoc; and tensors for evalX and evalS, which are monstrous. Perhaps the scaling factor in eval_(n+1) could just be a Delta_n or a close analogue?
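
To make the contrast concrete, here is a minimal sketch of the dense status quo versus a sparser, symbolic factor. All names below (Factor1, forceFactor1, the Delta1 stub) are hypothetical illustrations of the idea, not horde-ad's actual types:

import qualified Data.Vector.Unboxed as V

-- Rank-1 delta expressions, roughly in the shape eval1 consumes
-- (hypothetical stub, not the real Delta1).
data Delta1 r
  = Zero1
  | Scale1 (V.Vector r) (Delta1 r)
  | Add1 (Delta1 r) (Delta1 r)

-- Dense status quo: the scaling factor is a fully materialized vector.
type DenseFactor r = V.Vector r

-- Sparser alternative: keep the factor symbolic, mirroring Delta one rank
-- below, and force it to a dense vector only when a leaf demands it.
data Factor1 r
  = FZero1                 -- known-zero factor: evaluation can skip the subtree
  | FScale1 r (Factor1 r)  -- deferred scalar scaling, no vector allocated yet
  | FDense1 (V.Vector r)   -- explicit fallback to a dense vector

forceFactor1 :: (V.Unbox r, Num r) => Int -> Factor1 r -> V.Vector r
forceFactor1 n FZero1        = V.replicate n 0
forceFactor1 n (FScale1 c f) = V.map (* c) (forceFactor1 n f)
forceFactor1 _ (FDense1 v)   = v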

[Edit: a fix for most of #13 has since unblocked this; then it turned out #20 still blocks it.] This ticket is currently blocked on #13, which would let us use compiler versions other than GHC 9.2, and on #14, which would fix the degraded and unpredictable performance of the 9.2 series (and HEAD), perhaps as soon as GHC 9.2.3. It's too easy to break performance in this code base, and too hard to detect regressions and bisect the culprit, when measurements are unreliable for huge portions of the repo history. That matters especially when implementing optimizations.

Related scribbles copied from the google doc:

-- Alternative 1: reuse Delta1 itself as the scaling factor.
eval1 :: Delta1 r -> Delta1 r -> ST s ()

-- Alternative 2: a dedicated recipe GADT for the scaling factor.
eval1 :: VectorRecipe r -> Delta1 r -> ST s ()
data VectorRecipe r where
  VRScale :: Float -> r -> VectorRecipe r
  -- ...
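
For concreteness, a self-contained sketch of how such a recipe could be interpreted. The constructor set and the forceRecipe helper are assumptions made for this illustration, not horde-ad's actual API; the point is that deferred scalings fuse into one traversal instead of allocating a fresh vector per node:

{-# LANGUAGE BangPatterns, GADTs #-}
import qualified Data.Vector.Storable as V

-- Hypothetical recipe variant: either an already dense vector, or a
-- deferred scalar scaling of another recipe.
data VectorRecipe r where
  VRDense :: V.Vector r -> VectorRecipe r
  VRScale :: r -> VectorRecipe r -> VectorRecipe r

-- Collapse a recipe to a dense vector, accumulating all deferred
-- scalings into a single multiplier applied in one pass at the leaf.
forceRecipe :: (V.Storable r, Num r) => VectorRecipe r -> V.Vector r
forceRecipe = go 1
 where
  go :: (V.Storable r, Num r) => r -> VectorRecipe r -> V.Vector r
  go !c (VRDense v)   = V.map (* c) v
  go !c (VRScale k d) = go (c * k) d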

Mikolaj · Mar 19 '22 14:03