Here's my error message:
ERROR (theano.gof.opt): Optimization failure due to: local_abstractconv_check
ERROR (theano.gof.opt): node: AbstractConv2d{border_mode='half', subsample=(1, 1), filter_flip=True, imshp=(None, 1, None, None), kshp=(64, 1, 3, 3)}(DimShuffle{0,3,1,2}.0, DimShuffle{3,2,0,1}.0)
ERROR (theano.gof.opt): TRACEBACK:
ERROR (theano.gof.opt): Traceback (most recent call last):
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1772, in process_node
replacements = lopt.transform(node)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\tensor\nnet\opt.py", line 402, in local_abstractconv_check
node.op.__class__.__name__)
AssertionError: AbstractConv2d Theano optimization failed: there is no implementation available supporting the requested options. Did you exclude both "conv_dnn" and "conv_gemm" from the optimizer? If on GPU, is cuDNN available and does the GPU support it? If on CPU, do you have a BLAS library installed Theano can link against?
Traceback (most recent call last):
File "C:/Users/ywjys/Desktop/AcapeTest/acapellabot.py", line 147, in <module>
acapellabot.isolateVocals(f, args.fft, args.phase)
File "C:/Users/ywjys/Desktop/AcapeTest/acapellabot.py", line 98, in isolateVocals
predictedSpectrogramWithBatchAndChannels = self.model.predict(expandedSpectrogramWithBatchAndChannels)
File "C:\Users\ywjys\Desktop\AcapeTest\venv1\lib\site-packages\keras\engine\training.py", line 1164, in predict
self._make_predict_function()
File "C:\Users\ywjys\Desktop\AcapeTest\venv1\lib\site-packages\keras\engine\training.py", line 554, in _make_predict_function
**kwargs)
File "C:\Users\ywjys\Desktop\AcapeTest\venv1\lib\site-packages\keras\backend\theano_backend.py", line 1397, in function
return Function(inputs, outputs, updates=updates, **kwargs)
File "C:\Users\ywjys\Desktop\AcapeTest\venv1\lib\site-packages\keras\backend\theano_backend.py", line 1383, in __init__
**kwargs)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\compile\function.py", line 320, in function
output_keys=output_keys)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\compile\pfunc.py", line 479, in pfunc
output_keys=output_keys)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\compile\function_module.py", line 1776, in orig_function
output_keys=output_keys).create(
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\compile\function_module.py", line 1456, in __init__
optimizer_profile = optimizer(fgraph)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 101, in __call__
return self.optimize(fgraph)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 89, in optimize
ret = self.apply(fgraph, *args, **kwargs)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 230, in apply
sub_prof = optimizer.optimize(fgraph)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 89, in optimize
ret = self.apply(fgraph, *args, **kwargs)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1879, in apply
nb += self.process_node(fgraph, node)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1777, in process_node
lopt, node)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1673, in warn_inplace
return NavigatorOptimizer.warn(exc, nav, repl_pairs, local_opt, node)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1659, in warn
raise exc
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\gof\opt.py", line 1772, in process_node
replacements = lopt.transform(node)
File "C:\Users\ywjys\AppData\Local\Programs\Python\Python36\lib\site-packages\theano\tensor\nnet\opt.py", line 402, in local_abstractconv_check
node.op.__class__.__name__)
AssertionError: AbstractConv2d Theano optimization failed: there is no implementation available supporting the requested options. Did you exclude both "conv_dnn" and "conv_gemm" from the optimizer? If on GPU, is cuDNN available and does the GPU support it? If on CPU, do you have a BLAS library installed Theano can link against?
Process finished with exit code 1
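Since I'm not on a GPU, I suspect the last part of the assertion applies: Theano can't find a BLAS library to link the CPU convolution against. Here is the `.theanorc` I'm considering to point it at OpenBLAS — note the install path below is just a placeholder I made up, not a path that actually exists on my machine:

```ini
; .theanorc, placed in the user home directory (C:\Users\ywjys\.theanorc)
; Point Theano at a BLAS library so the CPU conv2d implementation can link.
; The OpenBLAS location is a placeholder -- substitute the real install path.
[blas]
ldflags = -LC:\openblas\bin -lopenblas

[global]
device = cpu
floatX = float32
```

Is this the right direction, or do I need something else entirely?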
I'm running Keras with the Theano backend on Windows with Python 3.6. Does anyone know how to fix this?