# Fix + sometimesBy performance issues
**Describe the bug**

I'm seeing CPU spikes that seem to come from the combination of `sometimesBy` and `fix`. I can chew up more CPU (up to 100% in this case) by gradually uncommenting the `fix` lines below. Usage drops by about 50% when the `sometimesBy` is removed, and is negligible without either.
**To Reproduce**

```haskell
do -- fix-perf-bug -----------------------------------------------------------
   p "fix-perf-bug"
     $ spin 8
     $ sometimesBy 0.15 (# n 0)
     -- $ fix (# gain 0.9) (s "foo")
     -- $ fix (# gain 0.4) (s "bar")
     -- $ fix (# gain 0.6) (s "baz")
     -- $ fix (# djf 0.02) (s "bing")
     -- $ fix (|+ sustain 0.5) (s "qux")
     -- $ fix (# speed 0.65) (s "gabba")
     $ sound "[bd]*4"
```
**Expected behavior**

Can apply `fix` without big CPU overhead.
**Environment (please complete the following information):**

- OS X 10.13.6 running on a MacBook Pro (Retina, Mid 2012), 2.6 GHz Intel Core i7, 16 GB 1600 MHz DDR3
- `ghc --version`: The Glorious Glasgow Haskell Compilation System, version 8.0.2
- Tidal version: master at 6c25387d6ba088258c6ae2fe0079936395ff8f8d
- SuperCollider version: 3.9.3; SuperDirt: develop at ccbeb4420f9fe953793301077c682be448857961
I made this https://github.com/jwaldmann/tidal-fuzz-test/blob/master/Fix.hs for profiling. Results:

- runtime as a function of the depth of the stack: seems exponential (with base 2, in fact)
- for one execution, the distribution of runtime by function:
```
COST CENTRE             MODULE               SRC                                            %time %alloc
sam                     Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:28:1-43              13.6   15.9
_fast                   Sound.Tidal.Core     src/Sound/Tidal/Core.hs:(284,1)-(286,68)         7.8    3.3
withResultTime.\        Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:706:49-63             6.5    1.4
cyclesInArc             Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:(108,1)-(111,40)      6.1    6.8
cat.f                   Sound.Tidal.Core     src/Sound/Tidal/Core.hs:(203,9)-(207,75)         6.1    3.8
sect                    Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:70:1-54               4.7    1.1
nextSam                 Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:36:1-20               4.3    0.9
matchManyToOne.q.match  Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:(857,13)-(859,55)     3.3    6.2
fmap                    Sound.Tidal.Pattern  src/Sound/Tidal/Pattern.hs:359:3-48              3.3    6.0
...
```
Also, in the absolute execution counts of individual functions, the numbers 32768, 65536, 524288 and 1048576 appear frequently. Those are exactly powers of two (2^15, 2^16, 2^19, 2^20)!
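If each layer in the stack queries its input pattern twice, a stack of depth d issues 2^d queries of the innermost pattern, which would explain exactly these counts. A minimal sketch of that arithmetic (my reading of the numbers, not confirmed against the source):

```haskell
-- Hypothetical model: every fix/sometimesBy layer that uses its input twice
-- doubles the number of queries reaching the innermost pattern.
leafQueries :: Int -> Int
leafQueries 0 = 1
leafQueries d = 2 * leafQueries (d - 1)

-- map leafQueries [15, 16, 19, 20] == [32768, 65536, 524288, 1048576],
-- exactly the counts showing up in the profile.
```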
In the source, we have

```haskell
fix f = contrast f id

contrast = contrastBy (==)

contrastBy comp f f' p p' = overlay (f matched) (f' unmatched)
  where matches = matchManyToOne (flip $ Map.isSubmapOfBy comp) p p'
        matched :: ControlPattern
        matched = filterJust $ (\(t, a) -> if t then Just a else Nothing) <$> matches
        unmatched :: ControlPattern
        unmatched = filterJust $ (\(t, a) -> if not t then Just a else Nothing) <$> matches
```
and this uses `matches` twice. See my remark on (non)linearity in #636.
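A Tidal `Pattern` is a query function, not a materialised list of events, so binding `matches` to a name shares no work: `matched` and `unmatched` each re-run the matching query. A toy model (hypothetical types, not Tidal's) that makes the doubling, and its exponential stacking, visible:

```haskell
import Data.IORef

-- A "pattern" here is just a query function; sharing the binding does not
-- share the work of running it.
newtype Pat a = Pat { queryPat :: () -> IO [a] }

-- Innermost pattern, instrumented to count how often it is queried.
counting :: IORef Int -> Pat Int
counting ref = Pat $ \() -> modifyIORef' ref (+ 1) >> pure [0]

overlayPat :: Pat a -> Pat a -> Pat a
overlayPat a b = Pat $ \q -> (++) <$> queryPat a q <*> queryPat b q

-- Shaped like contrastBy: build two halves that each re-query the input.
splitTwice :: Pat a -> Pat a
splitTwice p = overlayPat p p

main :: IO ()
main = do
  ref <- newIORef 0
  let stacked = iterate splitTwice (counting ref) !! 6  -- six fix-like layers
  _ <- queryPat stacked ()
  readIORef ref >>= print  -- prints 64, i.e. 2^6 queries of the inner pattern
```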
@jwaldmann Aha, so using `partition` rather than `filter` should help
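On a plain event list the suggestion would look something like the sketch below (`splitMatches` is hypothetical; in `contrastBy` the match results live in a `Pattern` of `(Bool, value)` pairs, not a list):

```haskell
import Data.List (partition)

-- One traversal of the match results, split into matched and unmatched,
-- instead of two separate filter passes over the same input.
splitMatches :: [(Bool, a)] -> ([a], [a])
splitMatches evs = (map snd yes, map snd no)
  where (yes, no) = partition fst evs
```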
I don't see how to write a linear implementation of

```haskell
partitionValues :: (a -> Bool) -> Pattern a -> (Pattern a, Pattern a)
partitionValues f p = ( something of p, someotherthing of p ) -- copies p
```

With a (hypothetical) manifest representation of patterns as a set of events, each element of that set could be treated linearly (it goes either to the left or to the right), but we don't have this?
Is the issue basically that GHC isn't able to compose something like `f $ g $ h $ pat` into `(f . g . h) pat`, where the latter has a single pattern "query"?
Would a (not great) workaround be a function like `manySometimesBy [0.5, 0.8, 0.6] [f, g, h] pat`, where the explicit implementation would avoid the exponential lookup problem?
I don't think so for `sometimesBy` (I haven't looked closely at `fix`)... In the case of `sometimesBy`, the function works by rolling the dice for each event in two patterns (the original pattern and the transformed pattern). Right now the implementation rolls the dice twice (to get the same answer) when both patterns have an event that starts at the same time. I think there will always be worst cases where the original pattern and the transformed pattern don't line up in time, but right now every case has performance like the worst case. Perhaps it is also forcing more of the events than necessary when they are "cancelled" (i.e. ultimately not occurring because of the random choice) - not sure...
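A toy event-list model of that description (hypothetical types and a made-up `roll`, not Tidal's implementation): the current behaviour corresponds to rolling per-event randomness in both branches, where one roll per event would give the same set of events:

```haskell
-- (start time, value) stands in for a Tidal event.
type Ev a = (Rational, a)

-- Deterministic stand-in for Tidal's time-seeded randomness, in [0, 1).
roll :: Rational -> Double
roll t = fromIntegral ((floor (t * 1024) * 97 :: Integer) `mod` 100) / 100

-- Two passes, each rolling the dice, like overlaying two filtered patterns:
sometimesByTwice :: Double -> (a -> a) -> [Ev a] -> [Ev a]
sometimesByTwice x f evs =
     [ (t, v)   | (t, v) <- evs, roll t >= x ]   -- first roll
  ++ [ (t, f v) | (t, v) <- evs, roll t <  x ]   -- second roll, same answer

-- One pass, rolling once per event:
sometimesByOnce :: Double -> (a -> a) -> [Ev a] -> [Ev a]
sometimesByOnce x f evs =
  [ if roll t < x then (t, f v) else (t, v) | (t, v) <- evs ]
```

Both produce the same set of events; the first does twice the rolls and twice the traversals of its input.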