
Any idea about this error?

Open Math9999 opened this issue 6 years ago • 8 comments

Hello.

It would be great if you could help.

```
C:\XYZ\XYZ\RobustSTL.py:54: RuntimeWarning: invalid value encountered in double_scalars
  season_value = np.sum(weight_sample * weights)/np.sum(weights)
[!] 2 iteration will strat
```

```
Intel MKL ERROR: Parameter 7 was incorrect on entry to DGELS.
Traceback (most recent call last):
  File "", line 2, in
  File "", line 16, in main
  File "C:\XYZ\XYZ\RobustSTL.py", line 121, in RobustSTL
    return _RobustSTL(input, season_len, reg1, reg2, K, H, dn1, dn2, ds1, ds2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 97, in _RobustSTL
    trend_extraction(denoise_sample, season_len, reg1, reg2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 36, in trend_extraction
    delta_trends = l1(P,q)
  File "C:\XYZ\XYZ\l1.py", line 41, in l1
    lapack.gels(+P, uls)
ValueError: -7
```

All the best

A.B.

Math9999 avatar Nov 24 '19 13:11 Math9999

Did you check whether the weights contain np.inf or NaN? Another possible cause of the error is np.sum(weights) == 0.
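
These checks can be run in isolation. Below is a hypothetical standalone sketch, where `weights` stands in for the array computed inside RobustSTL.py (in the failing cases reported in this thread it underflows to all zeros):

```python
import numpy as np

# Stand-in for the bilateral-filter weights computed inside RobustSTL.py;
# in the failing cases reported here they underflow to all zeros.
weights = np.array([0.0, 0.0, 0.0])

print("any NaN: ", np.isnan(weights).any())
print("any inf: ", np.isinf(weights).any())
print("sum == 0:", np.isclose(np.sum(weights), 0.0))  # True -> season_value becomes NaN
```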

LeeDoYup avatar Feb 06 '20 02:02 LeeDoYup

@LeeDoYup I also get the same error, and you are right: the weights are close to 0.

If there is a huge level shift from one time period to the next, i.e. |y_j - y_t| is large, then denoising through the bilateral filter involves computing a product of terms that are each close to 0 because of the exponential factors. This drives the weights to zero. The problem is also amplified when the season length is large. Do you have a workaround for dealing with this issue?
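
The underflow is easy to reproduce in isolation. Here is a minimal sketch of a Gaussian bilateral weight (the kernel form is assumed from the RobustSTL paper; `delta1`/`delta2` are the time and value bandwidths and are not necessarily the repo's exact parameter names):

```python
import numpy as np

def bilateral_weight(y_j, y_t, j, t, delta1=1.0, delta2=1.0):
    # Product of two Gaussian kernels: one over time distance, one over
    # value distance. Either factor can underflow to exactly 0.0 in float64.
    return np.exp(-((j - t) ** 2) / (2 * delta1 ** 2)) * \
           np.exp(-((y_j - y_t) ** 2) / (2 * delta2 ** 2))

# A level shift of 100 with delta2=1 gives exp(-5000), which underflows
# to 0.0, so the normalising sum of weights is 0 and
# season_value = np.sum(weight_sample * weights) / np.sum(weights) is NaN.
weights = np.array([bilateral_weight(100.0, 0.0, j, 0) for j in range(5)])
print(np.sum(weights))  # 0.0
```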

anirudhasundaresan avatar Mar 03 '20 23:03 anirudhasundaresan

@anirudhasundaresan I think the problem comes from the algorithm itself... (I am not the author, but I implemented it.) I am currently busy with another submission (until next week).

I will look into the issue after I finish my work. If you solve the problem before I get to it, please open a pull request!

LeeDoYup avatar Mar 04 '20 07:03 LeeDoYup

I've encountered a similar error, probably for the same reason:

```
main:54: RuntimeWarning: invalid value encountered in double_scalars
Traceback (most recent call last):
  File "", line 1, in
    RobustSTL(y,12)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 121, in RobustSTL
    return _RobustSTL(input, season_len, reg1, reg2, K, H, dn1, dn2, ds1, ds2)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 97, in _RobustSTL
    trend_extraction(denoise_sample, season_len, reg1, reg2)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 36, in trend_extraction
    delta_trends = l1(P,q)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/l1.py", line 56, in l1
    primalstart={'x': x0, 's': s0}, dualstart={'z': z0})
  File "/Users/Shared/anaconda3/lib/python3.7/site-packages/cvxopt/coneprog.py", line 1033, in conelp
    W = misc.compute_scaling(s, z, lmbda, dims, mnl = 0)
  File "/Users/Shared/anaconda3/lib/python3.7/site-packages/cvxopt/misc.py", line 285, in compute_scaling
    W['d'] = base.sqrt( base.div( s[mnl:mnl+m], z[mnl:mnl+m] ))
ValueError: domain error
```

I printed the value of s in coneprog.py just before line 1033 and found that it was all NaNs.

chuckcoleman avatar May 04 '20 14:05 chuckcoleman

I am getting the same error for most of my time series. Has anyone solved this, or have any ideas?

salman087 avatar Nov 12 '20 15:11 salman087

I think the error comes from the l1 optimizer, which is part of the cvxopt library. I didn't implement l1.py manually; it uses part of cvxopt. Can you debug which values lead to the -7?

LeeDoYup avatar Nov 12 '20 15:11 LeeDoYup

I'm not sure, but I think it happens when sol['x'] is None at `return sol['y'][:n]` in l1.py, line 57.

SeungHyunAhn avatar Nov 17 '20 08:11 SeungHyunAhn

I'm seeing this as well, and (for me at least) the root cause is that the weights returned by bilateral_filter all go to zero, causing NaNs on the following line due to division by zero:

```
season_value = np.sum(weight_sample * weights)/np.sum(weights)
```

Edit: The issue in my case was fixed by setting the bilateral_filter hyper-parameters, in particular ds2. Since the value difference in the exponent is squared, large values of |y_j - y_t| cause the weights to underflow and produce NaNs, so ds2 needs to be increased to scale this back.
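
Besides tuning ds2, a defensive workaround is to guard the division itself. This is a hypothetical helper (not part of the repo) that falls back to a plain mean when the weights underflow:

```python
import numpy as np

def safe_weighted_mean(sample, weights, eps=1e-12):
    """Guarded version of the bare division in RobustSTL.py:
    if the bilateral weights underflow to ~0, fall back to an
    unweighted mean instead of producing NaN."""
    total = np.sum(weights)
    if total < eps:
        return np.mean(sample)
    return np.sum(sample * weights) / total

print(safe_weighted_mean(np.array([1.0, 2.0, 3.0]), np.zeros(3)))  # 2.0
```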

I have energy consumption data with daily and weekly seasonality. If I set T to daily (48 samples), this issue occurs because there is a large difference between the weekday and weekend levels at the same time of day. I think there are other issues too: if I train on a largish dataset in Jupyter, the kernel crashes.

david-waterworth avatar Jun 28 '22 00:06 david-waterworth