
Feature: "attention entropy"

Open kahaaga opened this issue 1 year ago • 0 comments

The "attention entropy" does essentially the following:

  • Given an input time series x, identify the local minima and maxima of x.
  • Count the number of steps between successive local extrema, in one of several ways (max-min, min-max, max-max, min-min; we'd model this by having a parameter that can take on these four values).
  • Construct a new time series y consisting of the numbers of steps between extrema (so y is drastically shorter than x in most cases).
  • Use probabilities(::UniqueElements, y) to get probabilities.
  • Plug these probabilities into the Shannon entropy formula.
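The steps above can be sketched roughly as follows. This is a language-agnostic illustration, not the ComplexityMeasures.jl API; the names `local_extrema`, `spacing_series`, and `attention_entropy` are made up for this sketch, and extrema are taken to be strict local minima/maxima.

```python
from collections import Counter
from math import log

def local_extrema(x):
    """Return (index, kind) pairs for strict local minima/maxima of x."""
    ext = []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            ext.append((i, "max"))
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            ext.append((i, "min"))
    return ext

def spacing_series(x, pattern=("max", "max")):
    """The encoded series y: steps between extrema matching `pattern`.

    `pattern` is a (from_kind, to_kind) pair, e.g. ("max", "min") for
    max-min spacings or ("max", "max") for max-max spacings.
    """
    frm, to = pattern
    y, last = [], None
    for i, kind in local_extrema(x):
        # Close an open spacing first, then (possibly) start a new one,
        # so that for frm == to each extremum both ends and begins a spacing.
        if kind == to and last is not None:
            y.append(i - last)
            last = None
        if kind == frm:
            last = i
    return y

def attention_entropy(x, pattern=("max", "max")):
    """Shannon entropy of the relative frequencies of the unique spacings
    (the UniqueElements-style counting described above)."""
    y = spacing_series(x, pattern)
    counts = Counter(y)
    n = sum(counts.values())
    return -sum((c / n) * log(c / n) for c in counts.values())
```

For example, `spacing_series([0, 2, 0, 1, 2, 0, 3, 0], ("max", "max"))` gives `[3, 2]` (maxima at indices 1, 4, and 6), and since the two spacings are equally likely, the entropy is log 2.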

This can be implemented as an OutcomeSpace. Maybe MotifSpacing is a good name? The method generalizes to any sort of pattern spacing; it is just a matter of encoding differently. An easy way to do so is to dispatch on MotifSpacing(::Pattern), where Pattern could be MinMaxSpacing, MaxMinSpacing, MaxMaxSpacing, MeanMeanSpacing, MedianMedianSpacing, MedianQuantileSpacing, etc.
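To illustrate why the generalization is "just a matter of encoding differently": a MeanMeanSpacing-style pattern could, for instance, count steps between successive crossings of the mean instead of between extrema. This is a hypothetical reading of that name, sketched here outside the Julia API:

```python
import statistics

def crossing_indices(x, level):
    """Indices i where x crosses `level` between samples i and i + 1."""
    return [i for i in range(len(x) - 1)
            if (x[i] - level) * (x[i + 1] - level) < 0]

def mean_crossing_spacings(x):
    """A MeanMeanSpacing-style encoding (hypothetical): spacings between
    successive crossings of the mean of x. Swapping statistics.fmean for
    statistics.median would give a MedianMedianSpacing-style variant."""
    idx = crossing_indices(x, statistics.fmean(x))
    return [j - i for i, j in zip(idx, idx[1:])]
```

The downstream pipeline (count unique spacings, normalize, apply the Shannon formula) is identical for every pattern; only the encoding step changes.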

It will not be straightforward to implement decode/encode. However, codify can be implemented: it simply returns the encoded time series y.

kahaaga · Jan 15 '24 12:01