gord chung
you know how you'd implement this? seems like we'd need a new storage format. if we capture nanosecond raw data, we can't create 3600-point blocks anymore or else a block only spans 3.6µs...
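for reference, a rough back-of-the-envelope on the block math (3600 points per block as mentioned above; the granularities here are purely illustrative, not from any patch):

```python
# how much wall-clock time a 3600-point block covers at different capture granularities
POINTS_PER_BLOCK = 3600

for label, granularity_s in [("1s", 1.0), ("1ms", 1e-3), ("1ns", 1e-9)]:
    span_s = POINTS_PER_BLOCK * granularity_s
    print(f"{label:>3} granularity -> block spans {span_s:g}s")

# 1s  granularity -> block spans 3600s   (one hour per block)
# 1ms granularity -> block spans 3.6s
# 1ns granularity -> block spans 3.6e-06s (a few microseconds per block)
```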
oh. so like using a policy that is not necessarily formally created in gnocchi. i thought this meant we had no aggregation and were just storing raw data (which isn't really...
i'm curious: in your initial implementation, are there any significant changes to the workflow or data model required to make Gnocchi work with GPUs? it'd be interesting to see if...
yeah, well that's my point. you can have more than one rule that matches the '*' pattern... but that doesn't make any sense because multiple rules can't be applied to a...
i guess we can close?
i'm not sure how sorting by name alone solves the issue? it will ultimately defer to the pattern regex anyway. we should really be ordering by regex length i think; my...
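a rough sketch of that ordering idea (illustrative only: the rule fields and the fnmatch-style matching here are stand-ins, not the actual gnocchi model):

```python
import fnmatch

# hypothetical rule set; 'metric_pattern' / 'archive_policy' field names are illustrative
rules = [
    {"name": "default", "metric_pattern": "*", "archive_policy": "low"},
    {"name": "cpu", "metric_pattern": "cpu*", "archive_policy": "high"},
]

def pick_rule(metric_name, rules):
    matching = [r for r in rules if fnmatch.fnmatch(metric_name, r["metric_pattern"])]
    if not matching:
        return None
    # longest (most specific) pattern wins; name is only a tie-breaker
    return min(matching, key=lambda r: (-len(r["metric_pattern"]), r["name"]))

print(pick_rule("cpu_util", rules)["archive_policy"])  # high ('cpu*' beats '*')
print(pick_rule("disk.io", rules)["archive_policy"])   # low  (only '*' matches)
```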
so if i have a policy of 5s and 1min (let's assume timespan is infinite). if i set 'back_window_timespan' as:
- 6s, what happens?
- 62s, what happens?
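to make the ambiguity concrete, here's the arithmetic if 'back_window_timespan' were simply divided by each granularity (an assumed conversion, not necessarily what the change actually does):

```python
import math

granularities = [5, 60]  # the 5s and 1min policy from above
for timespan in (6, 62):
    points = {g: math.ceil(timespan / g) for g in granularities}
    print(f"back_window_timespan={timespan}s -> back-window points per granularity: {points}")

# back_window_timespan=6s  -> {5: 2, 60: 1}
# back_window_timespan=62s -> {5: 13, 60: 2}
```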
@mergify rebase
edited to include `AggregationDoesNotExist`. it does seem like the API validation already ensures this isn't possible. so maybe the discussion is about where we want this validation.
copied from PR, for reference. ok, so i ran:

```
In [25]: def extra_work(tss):
    ...:     combine = numpy.concatenate(tss)
    ...:     times, indices = numpy.unique(combine['timestamps'], return_inverse=True)
    ...:     filler = numpy.NaN
    ...:     val_grid...
```
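for context, a self-contained version of that snippet; everything after `numpy.unique` is a guess at what the truncated `val_grid` code does (align every series on the union of timestamps, padding gaps with NaN):

```python
import numpy

def extra_work(tss):
    combine = numpy.concatenate(tss)
    times, indices = numpy.unique(combine['timestamps'], return_inverse=True)
    filler = numpy.nan
    # one row per input series, one column per unique timestamp
    val_grid = numpy.full((len(tss), times.size), filler)
    start = 0
    for i, ts in enumerate(tss):
        # indices[start:start+size] maps this series' samples onto the unified time axis
        val_grid[i, indices[start:start + ts.size]] = ts['values']
        start += ts.size
    return times, val_grid

# tiny usage example with two structured-array series
dtype = [('timestamps', 'datetime64[ns]'), ('values', 'float64')]
ts1 = numpy.array([('2024-01-01T00:00:00', 1.0),
                   ('2024-01-01T00:00:05', 2.0)], dtype=dtype)
ts2 = numpy.array([('2024-01-01T00:00:05', 3.0)], dtype=dtype)
times, grid = extra_work([ts1, ts2])
# grid == [[1., 2.], [nan, 3.]]
```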