Jakob Kasbauer

Results 13 comments of Jakob Kasbauer

The Apple M1 architecture can't execute the scientific Python binaries. https://stackoverflow.com/a/65095679 However, this problem is not related to AutoEq...

#### I suggest including the Q values in the loss function for the TensorFlow optimizer. https://github.com/jaakkopasanen/AutoEq/blob/aad110b2ae899a08b08e6b9385e81e2d640d0015/frequency_response.py#L523 In machine learning this would be called regularization. However, Q values behave differently than multiplication weights for...
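A hedged sketch of the idea, written in plain NumPy for brevity (in AutoEq the same term would be added to the TensorFlow loss; the names `fr_loss`, `q_values`, and `weight` are illustrative, not AutoEq's API):

```python
import numpy as np

def regularized_loss(fr_loss, q_values, weight=0.01):
    # L2 penalty on the Q values discourages extreme (overly narrow or wide)
    # filters, in addition to the existing frequency-response error term.
    q_penalty = np.sum(np.square(q_values))
    return fr_loss + weight * q_penalty
```

The penalty target and weighting are design choices; since Q values behave differently than gain weights, a different penalty shape may be more appropriate.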

##### About replacing `np.arange` with `range`: I assume that the Numba compiler can infer more information from `range`. From the compiler's point of view, the output of `np.arange` could be...
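One part of the difference is easy to demonstrate even outside of Numba: `range` is a lazy sequence of known integer type, while `np.arange` materializes a full array whose dtype and contents must be tracked. A small sketch:

```python
import sys
import numpy as np

n = 1_000_000
lazy = range(n)       # constant-size object, yields plain integers
eager = np.arange(n)  # allocates n * 8 bytes up front

assert sys.getsizeof(lazy) < eager.nbytes
assert list(lazy[:3]) == list(eager[:3]) == [0, 1, 2]
```

Inside a `@njit` loop, `range` additionally lets Numba emit a plain counted loop instead of iterating over an array.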

Never compare floating-point numbers for exact equality. Use `numpy.allclose()` instead. https://docs.scipy.org/doc/numpy/reference/generated/numpy.allclose.html https://floating-point-gui.de/errors/comparison/
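A minimal demonstration:

```python
import numpy as np

a = 0.1 + 0.2
assert a != 0.3             # exact equality fails due to rounding error
assert np.allclose(a, 0.3)  # tolerance-based comparison succeeds
```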

Vectorized NumPy implementation:

```python
import numpy as np
from skimage.util import view_as_windows

sliding_windows = view_as_windows(data, subsequenceLength)
assert sliding_windows.shape[0] == len(data) - subsequenceLength + 1

# Complexity of each window: square root of the sum of squared
# differences between consecutive points
complexity = np.sqrt(np.sum(np.square(np.diff(sliding_windows, axis=1)), axis=1))
assert sliding_windows.shape[0] == complexity.size
```

The following paper shows how to create a feature matrix for classification by using pairwise distances. https://www.researchgate.net/profile/Rohit-Kate/publication/276422351_Using_dynamic_time_warping_distances_as_features_for_improved_time_series_classification/links/5c0ed52892851c39ebe437b5/Using-dynamic-time-warping-distances-as-features-for-improved-time-series-classification.pdf
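A minimal sketch of the paper's idea, using a textbook dynamic-programming DTW in plain NumPy (a tuned library would be faster in practice; the function names are illustrative):

```python
import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a) * len(b)) dynamic-programming DTW with
    # absolute-difference point cost
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def dtw_feature_matrix(series_list):
    # Feature matrix as in the paper: entry (i, j) is DTW(series_i, series_j)
    k = len(series_list)
    F = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            F[i, j] = F[j, i] = dtw_distance(series_list[i], series_list[j])
    return F
```

Each row of `F` can then be fed to a standard classifier as the feature vector of the corresponding series.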

The following notebook reproduces the PDF which is contained in "ContrastProfile_GettingStarted.zip" [contrast_profile_notebook.pdf](https://github.com/TDAmeritrade/stumpy/files/8825896/contrast_profile_notebook.pdf)

After re-reading the paper, I have come to the conclusion that I can't imagine a specific topic that would need an extra tutorial. The paper already explains online updates, k-Platos, and comparisons...

k-NN for each distance-matrix row? k-NN for each distance-matrix column? k-NN for each motif? The most useful method is k-NN for each motif. It is also the...
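A hedged sketch of the per-motif variant, assuming a precomputed distance profile for the motif and a simple exclusion zone to suppress trivial (self-overlapping) matches; all names are illustrative:

```python
import numpy as np

def knn_for_motif(distance_profile, motif_idx, k, excl):
    # Greedily pick the k smallest entries of the motif's distance profile,
    # masking an exclusion zone around the motif and around each found
    # neighbor so that overlapping subsequences are not returned twice.
    dp = distance_profile.astype(float).copy()
    dp[max(0, motif_idx - excl):motif_idx + excl + 1] = np.inf
    neighbors = []
    for _ in range(k):
        j = int(np.argmin(dp))
        if not np.isfinite(dp[j]):
            break  # fewer than k non-trivial neighbors exist
        neighbors.append(j)
        dp[max(0, j - excl):j + excl + 1] = np.inf
    return neighbors
```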

I also experimented with the noise compensation method. Many aspects in the realm of matrix-profile algorithms suffer from instability in edge cases: #### How to choose sigma_n?...