
Transcribe Audio to Note dependencies error

Nexys4t opened this issue 1 year ago • 3 comments

Acknowledgement

  • [X] I have read Getting-Started and FAQ

🐛 Describe the bug

I followed the instructions; a window opens while it's working, then an error pops up. The dependencies were also installed as instructed on the Getting Started wiki page of OpenUtau. EDIT: I have tried several audio files of clean vocals; the result is the same.

How to reproduce the bug

  1. Import a wav file.
  2. Right-click it and use "Transcribe audio" to create a note part.
  3. A window pops up showing how much time is left until the process finishes.
  4. An error window pops up when it finishes.
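
If this turns out to be an out-of-memory problem (the "bad allocation" messages in the logs below suggest that, but it's only my guess), a workaround I plan to try is splitting the recording into shorter chunks and importing them one at a time. Below is a rough sketch of the splitting step using NAudio (which I believe OpenUtau already bundles); the paths and the 30-second chunk length are placeholders, not anything from OpenUtau itself.

// Sketch: split a long WAV into ~30-second chunks before importing,
// in case the transcription failure is memory-related (unconfirmed guess).
// Paths and chunk length are placeholders.
using NAudio.Wave;

class WavSplitter {
    static void Main() {
        const string input = @"C:\temp\vocals.wav";  // placeholder input file
        const int chunkSeconds = 30;                 // arbitrary chunk length
        using var reader = new AudioFileReader(input);
        int samplesPerChunk = reader.WaveFormat.SampleRate * chunkSeconds * reader.WaveFormat.Channels;
        var buffer = new float[samplesPerChunk];
        int chunkIndex = 0;
        int read;
        while ((read = reader.Read(buffer, 0, buffer.Length)) > 0) {
            string output = $@"C:\temp\vocals_part{chunkIndex++}.wav";  // placeholder output
            using var writer = new WaveFileWriter(output, reader.WaveFormat);
            writer.WriteSamples(buffer, 0, read);    // write this chunk as its own file
        }
    }
}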

OS & Version

Windows 11

Logs

[ErrorCode:RuntimeException] Non-zero status code returned while running Softmax node. Name:'/model/model/cf_lay.0/att2/att/Softmax' Status Message: bad allocation

System.AggregateException: One or more errors occurred. ([ErrorCode:RuntimeException] Non-zero status code returned while running Softmax node. Name:'/model/model/cf_lay.0/att2/att/Softmax' Status Message: bad allocation)
 ---> Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:RuntimeException] Non-zero status code returned while running Softmax node. Name:'/model/model/cf_lay.0/att2/att/Softmax' Status Message: bad allocation
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
   at Microsoft.ML.OnnxRuntime.InferenceSession.RunImpl(RunOptions options, IntPtr[] inputNames, IntPtr[] inputValues, IntPtr[] outputNames, DisposableList`1 cleanupList)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames, RunOptions options)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs)
   at OpenUtau.Core.Analysis.Some.Some.Analyze(Single[] samples) in C:\projects\openutau\OpenUtau.Core\Analysis\Some.cs:line 203
   at OpenUtau.Core.Analysis.Some.Some.Transcribe(UProject project, UWavePart wavePart, Action`1 progress) in C:\projects\openutau\OpenUtau.Core\Analysis\Some.cs:line 254
   at OpenUtau.App.Views.MainWindow.<>c__DisplayClass92_0.<Transcribe>b__0() in C:\projects\openutau\OpenUtau\Views\MainWindow.axaml.cs:line 1122
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__272_0(Object obj)
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
   --- End of inner exception stack trace ---

0.1.338.0
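
Note: "bad allocation" from ONNX Runtime usually means a native kernel (here the attention Softmax) failed to allocate memory, so this may be an out-of-memory condition rather than a missing dependency. If that is the cause, one possible mitigation on the OpenUtau side would be to open the inference session with memory-saving options. A minimal sketch of what I mean is below; this is not OpenUtau's actual code in Some.cs, and the model path is a placeholder.

// Sketch only: opening an ONNX Runtime session with memory-saving options.
// Not OpenUtau's actual implementation; the model path is a placeholder.
using Microsoft.ML.OnnxRuntime;

class TranscriberSessionSketch {
    static InferenceSession Create() {
        var options = new SessionOptions {
            // Don't pre-plan allocations for the whole graph up front.
            EnableMemoryPattern = false,
            // Don't pool buffers in the CPU memory arena; release them instead.
            EnableCpuMemArena = false,
            // Fewer threads also means fewer per-thread scratch buffers.
            IntraOpNumThreads = 1,
        };
        return new InferenceSession(@"C:\path\to\some_model.onnx", options);
    }
}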

Nexys4t · Jan 12 '24 16:01

I have this problem too. Is it because of the CPU? How do I solve it?

dfdu233 · Mar 07 '24 15:03

I'm having errors with transcribing as well. Here's the error message on my end:

[ErrorCode:RuntimeException] Non-zero status code returned while running MatMul node. Name:'/model/model/cf_lay.0/att2/att/MatMul' Status Message: bad allocation

System.AggregateException: One or more errors occurred. ([ErrorCode:RuntimeException] Non-zero status code returned while running MatMul node. Name:'/model/model/cf_lay.0/att2/att/MatMul' Status Message: bad allocation)
 ---> Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:RuntimeException] Non-zero status code returned while running MatMul node. Name:'/model/model/cf_lay.0/att2/att/MatMul' Status Message: bad allocation
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
   at Microsoft.ML.OnnxRuntime.InferenceSession.RunImpl(RunOptions options, IntPtr[] inputNames, IntPtr[] inputValues, IntPtr[] outputNames, DisposableList`1 cleanupList)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames, RunOptions options)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs)
   at OpenUtau.Core.Analysis.Some.Some.Analyze(Single[] samples) in C:\projects\openutau\OpenUtau.Core\Analysis\Some.cs:line 203
   at OpenUtau.Core.Analysis.Some.Some.Transcribe(UProject project, UWavePart wavePart, Action`1 progress) in C:\projects\openutau\OpenUtau.Core\Analysis\Some.cs:line 254
   at OpenUtau.App.Views.MainWindow.<>c__DisplayClass91_0.<Transcribe>b__0() in C:\projects\openutau\OpenUtau\Views\MainWindow.axaml.cs:line 1110
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__272_0(Object obj)
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
   --- End of inner exception stack trace ---

0.1.327.0

TibetanSandPig · Mar 22 '24 09:03

This issue is stale because it has been open for 60 days with no activity. It will be closed if no further activity occurs. Thank you.

github-actions[bot] · May 22 '24 02:05