Ahm
It would be good to have actions after import:
1. Just copy, with a notification
2. Show in file explorer
3. Embed in the active note
4. Embed in a custom note
5. ...
Please add support for multiple models! Not a priority, but please implement it later :)
I checked, and there is support for Mac:

# Native Apple Silicon (M1/M2) Support for TensorFlow, PyTorch, and Coqui-TTS

| Library | Native Support | Installation Commands | GPU Support | ...
**For tensorflow-macos** TensorFlow on Apple Silicon uses the CPU's multiple cores (via the tensorflow-macos package) without needing NUMA-like handling, since the system's unified memory allows more efficient data sharing...
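Not from the thread, just a minimal sketch for anyone checking whether tensorflow-macos is picked up: it lists the device types TensorFlow sees, and returns an empty list if TensorFlow isn't installed at all (the `list_tf_devices` helper name is mine, not from any library).

```python
def list_tf_devices():
    """Return the device types TensorFlow can see, or [] if TF is not installed."""
    try:
        import tensorflow as tf  # tensorflow-macos on Apple Silicon
    except ImportError:
        return []  # TensorFlow not installed in this environment
    # Each entry is e.g. "CPU" or "GPU" (GPU appears with tensorflow-metal)
    return [d.device_type for d in tf.config.list_physical_devices()]

print(list_tf_devices())
```

On an Apple Silicon machine with tensorflow-metal installed you would expect both "CPU" and "GPU" in the list; with plain tensorflow-macos, only "CPU".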
**For torch** I tried:
```python
import torch

# Check MPS availability
if torch.backends.mps.is_available():
    device = torch.device("mps")  # Use GPU
    print("Using MPS (GPU):", device)
    # Example tensor on GPU
    x = ...
```
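For anyone else on Apple Silicon: a small device-selection helper in the same spirit as the snippet above. This is just a sketch (the `pick_device` name is mine); it also guards the import so it degrades to CPU when torch isn't installed.

```python
def pick_device():
    """Prefer Apple's MPS backend when available; otherwise fall back to CPU."""
    try:
        import torch
    except ImportError:
        return "cpu"  # torch not installed; plain string fallback
    if torch.backends.mps.is_available():
        return torch.device("mps")  # Apple Silicon GPU via Metal
    return torch.device("cpu")

print(pick_device())
```

Passing the result as `device=` when creating tensors or moving a model keeps the same code working on both MPS and CPU-only machines.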
Yes, I read it. I know Apple Silicon is for some reason not compatible with several AI libraries. I struggle to find a TTS that will run on...
@scalar27 Thank you for the feedback. I always have the same problem on Apple Silicon: it produces garbage... Are you going to try the standard-size model?
Can you share the code for generation? How is the GPU utilization?
Please implement this!