ComplexityMeasures.jl
Feature: "attention entropy"
The "attention entropy" does essentially the following:
- Given an input time series `x`, it identifies the local minima and maxima in `x`.
- Counts the number of steps between local extrema, in some way (max-min, min-max, max-max, min-min; we'd model this by having a parameter that can take on these four values).
- Constructs a new time series `y` which consists of the numbers of steps between extrema (so `x` is drastically shortened in most cases).
- Uses `probabilities(::UniqueElements, y)` to get probabilities.
- Plugs these probabilities into the `Shannon` entropy formula (a sketch of the full pipeline follows the list).
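To make the steps concrete, here is a minimal sketch in plain Julia. The helper names (`local_extrema`, `spacings`, `spacing_entropy`) are placeholders invented for illustration, not existing functions in ComplexityMeasures.jl, and the strict-inequality definition of a local extremum is just one simple choice. In the package, the last two steps would instead go through `probabilities(UniqueElements(), y)` and the `Shannon` definition, as described above.

```julia
# Placeholder helpers illustrating the "attention entropy" steps.
# None of these names exist in ComplexityMeasures.jl.

# Indices and kinds (:max or :min) of the local extrema of `x`.
function local_extrema(x::AbstractVector)
    idxs, kinds = Int[], Symbol[]
    for i in 2:length(x)-1
        if x[i] > x[i-1] && x[i] > x[i+1]
            push!(idxs, i); push!(kinds, :max)
        elseif x[i] < x[i-1] && x[i] < x[i+1]
            push!(idxs, i); push!(kinds, :min)
        end
    end
    return idxs, kinds
end

# Numbers of steps from each `from`-type extremum to the next `to`-type
# extremum, e.g. `from = :max, to = :min` for max-min spacing.
function spacings(x::AbstractVector; from = :max, to = :min)
    idxs, kinds = local_extrema(x)
    y = Int[]
    for i in eachindex(idxs)
        kinds[i] == from || continue
        j = findnext(==(to), kinds, i + 1)
        j === nothing && break
        push!(y, idxs[j] - idxs[i])
    end
    return y  # the (drastically shortened) spacing series
end

# Shannon entropy of the relative frequencies of the unique spacings.
function spacing_entropy(y::AbstractVector{Int}; base = 2)
    isempty(y) && return 0.0
    counts = Dict{Int,Int}()
    for s in y
        counts[s] = get(counts, s, 0) + 1
    end
    return -sum(c/length(y) * log(base, c/length(y)) for c in values(counts))
end

x = sin.(0.05 .* (1:2000)) .+ 0.1 .* randn(2000)
y = spacings(x; from = :max, to = :min)
h = spacing_entropy(y)
```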
This can be implemented as an `OutcomeSpace`. Maybe `MotifSpacing` is a good name? This method is generalizable to any sort of pattern spacing; it is just a matter of encoding differently. An easy way to do so is just to dispatch on `MotifSpacing(::Pattern)`, where `Pattern` could be `MinMaxSpacing`, `MaxMinSpacing`, `MaxMaxSpacing`, `MeanMeanSpacing`, `MedianMedianSpacing`, `MedianQuantileSpacing`, etc.
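A rough sketch of how that dispatch could be laid out, assuming the package exports an abstract `OutcomeSpace` type to subtype and reusing the placeholder `spacings` helper from the sketch above. Only the four extremum-based patterns are shown; the mean/median-based variants would just need their own encoding helpers. All concrete names are placeholders, not a final design.

```julia
using ComplexityMeasures  # assumed to export the abstract type `OutcomeSpace`

# Placeholder pattern types, following the naming suggestions above.
abstract type Pattern end
struct MaxMinSpacing <: Pattern end
struct MinMaxSpacing <: Pattern end
struct MaxMaxSpacing <: Pattern end
struct MinMinSpacing <: Pattern end

# The proposed outcome space, parameterized by the spacing pattern.
struct MotifSpacing{P <: Pattern} <: OutcomeSpace
    pattern::P
end
MotifSpacing() = MotifSpacing(MaxMinSpacing())  # max-min spacing as a default

# Patterns differ only in which extrema are paired; dispatch on the pattern.
# `spacings` is the placeholder helper from the sketch above.
encode_spacings(::MaxMinSpacing, x) = spacings(x; from = :max, to = :min)
encode_spacings(::MinMaxSpacing, x) = spacings(x; from = :min, to = :max)
encode_spacings(::MaxMaxSpacing, x) = spacings(x; from = :max, to = :max)
encode_spacings(::MinMinSpacing, x) = spacings(x; from = :min, to = :min)
```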
It will not be straightforward to implement `decode`/`encode`. However, `codify` can be implemented: it simply returns the encoded time series `y`.
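For example, a `codify` method for the hypothetical `MotifSpacing` above could be as simple as the following, assuming it is added by overloading `ComplexityMeasures.codify` and reusing the placeholder `encode_spacings` from the previous sketch:

```julia
# Sketch: `codify` just returns the integer-valued spacing series `y`.
function ComplexityMeasures.codify(o::MotifSpacing, x::AbstractVector{<:Real})
    return encode_spacings(o.pattern, x)
end

# Usage: the spacings can then be counted with `UniqueElements`, as suggested
# above, and the resulting probabilities plugged into the Shannon formula.
o = MotifSpacing(MaxMinSpacing())
y = codify(o, sin.(0.05 .* (1:2000)))
probs = probabilities(UniqueElements(), y)
```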