use-whisper
useWhisper runs on the client; how can I protect OPENAI_API_TOKEN?
There's an example of how to do this in the README: use the onTranscribe callback and call your own server, which is where your API token is used.
const App = () => {
/**
* you have more control like this
* do whatever you want with the recorded speech
* send it to your own custom server
* and return the response back to useWhisper
*/
const onTranscribe = (blob: Blob) => {
// ... (the rest of the README example is omitted here; a full sketch follows below)
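For anyone who needs the rest of that callback, here is a minimal client-side sketch. The /api/whisper route name, the JSON payload with a base64 file field, and the button markup are assumptions for illustration, not part of the library; the point is just that the recording goes to your own server and the result comes back as { blob, text }.

import { useWhisper } from '@chengsokdara/use-whisper'

const App = () => {
  const onTranscribe = async (blob: Blob) => {
    // read the recording as a base64 data URL so it can travel as JSON
    const base64 = await new Promise<string>((resolve, reject) => {
      const reader = new FileReader()
      reader.onloadend = () => resolve(reader.result as string)
      reader.onerror = () => reject(reader.error)
      reader.readAsDataURL(blob)
    })

    // call your own route (name assumed); the OpenAI key stays on the server
    const response = await fetch('/api/whisper', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ file: base64 }),
    })
    const { text } = await response.json()

    // hand the result back to useWhisper as { blob, text }
    return { blob, text }
  }

  const { recording, transcript, startRecording, stopRecording } = useWhisper({
    onTranscribe,
  })

  return (
    <div>
      <button onClick={() => (recording ? stopRecording() : startRecording())}>
        {recording ? 'Stop' : 'Record'}
      </button>
      <p>{transcript.text}</p>
    </div>
  )
}

export default App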
Maybe I'm super stupid, but I'm also having this same issue. The onTranscribe example turns the blob into a base64 string, which isn't one of the file types the Whisper API accepts. What am I missing here? I'm trying to connect onTranscribe to my API endpoint in my Next.js 14 app so that the key doesn't get exposed to the client.
The base64 encoding is only there to transport the data. In your backend you need to decode the base64 data and turn it back into a file, which you then send to the Whisper API endpoint.
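A minimal sketch of that backend step as a Next.js 14 route handler, assuming the client sends { file: <base64 data URL> } as in the snippet above and the key lives in OPENAI_API_TOKEN; the route path, field names, and audio mime type are placeholders to adjust to your setup.

// app/api/whisper/route.ts (route path and payload shape are assumptions)
import { NextResponse } from 'next/server'

export async function POST(req: Request) {
  const { file } = await req.json()

  // strip the "data:audio/...;base64," prefix and decode back into raw bytes
  const base64 = String(file).split(',').pop() ?? ''
  const bytes = Buffer.from(base64, 'base64')

  // rebuild a named file and forward it to Whisper; the key never leaves the server
  const form = new FormData()
  form.append('file', new Blob([bytes], { type: 'audio/webm' }), 'speech.webm')
  form.append('model', 'whisper-1')

  const upstream = await fetch('https://api.openai.com/v1/audio/transcriptions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_TOKEN}` },
    body: form,
  })
  const { text } = await upstream.json()

  return NextResponse.json({ text })
}

The decode-to-file step is needed because the transcription endpoint expects multipart form data with an actual audio file, not a base64 string.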
@PrimeObjects Hey, is this working for you? Mine is returning {blob: undefined, text: undefined} as the response. Do you have any idea why?
Doesn't work with streaming though? With streaming enabled it's still calling OpenAI's /transcriptions directly, and it only calls my API when I press stop...