elm-anima
Caching
Pure functions can use caching to reduce computational load. In our case, we have an expensive automaton that we generally expect to be a pure function - the viewer. We probably don't need aggressive memoization, which has memory costs, but a simple single-entry cache can cut out unnecessary recomputation, especially during stable app states.
I expect a cache of such a pure function to be of the form -
```elm
caching : (input -> key) -> (input -> output) -> Automaton input output
caching keyFn fn =
    let
        automaton =
            Automaton.hiddenState Nothing updater

        updater input store =
            cachingFn (keyFn input) fn input store

        -- On a key hit, reuse the cached output; on a miss,
        -- recompute and store the new (key, output) pair.
        cachingFn key fn input store =
            case store of
                Just (key', output') ->
                    if key == key' then
                        (output', store)
                    else
                        cachingFn key fn input Nothing

                Nothing ->
                    let
                        output = fn input
                    in
                        (output, Just (key, output))
    in
        automaton
```
What we're doing here is replacing the cost of computing fn with a lower-cost function keyFn. An alternative would be to use a comparator and store the actual input value instead of the computed key.
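As a sketch of how this might be wired up, assuming a hypothetical expensive viewer render and a model with a version field that changes whenever the model does (both names are illustrative, not part of the library):

```elm
-- Hypothetical usage: `.version` is a cheap key function, so
-- `render` only reruns when the model's version actually changes.
cachedViewer : Automaton Model Html
cachedViewer =
    caching .version render
```

The key function should be cheap relative to fn, otherwise the cache costs more than it saves.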
Even though the viewer is generally expected to be a pure function, it is worth thinking of it as a process (i.e. an automaton) so that we can cache some of its computations.
Related - Lazy.