R0NAM1

Results 38 comments of R0NAM1

(Apparently you can't open issues in forks, so this has to be here)

I've been looking everywhere as well for any code examples of people using AioRTC with Flask, but haven't been able to find any.

I was able to solve this by setting up threading as such:

```
t = loop.run_until_complete(webRtcStart())
Thread(target=loop.run_forever).start()
```

```
@app.route('/rtcoffer', methods=['GET', 'POST'])
@login_required
def webRTCOFFER():
    # Get Event Loop If It Exists, Create It If Not.
    try:
        loop = asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    # Run...
```
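A minimal, self-contained sketch of this loop-in-a-thread pattern (names like `webRtcStart` and `do_work` are illustrative placeholders, not the actual handlers): the setup coroutine runs to completion first, then the loop is kept alive in a background thread so Flask views on other threads can schedule coroutines onto it.

```python
import asyncio
import threading

async def webRtcStart():
    # Placeholder for the aiortc setup coroutine (illustrative).
    return "started"

loop = asyncio.new_event_loop()

# Run the setup coroutine to completion on this thread first...
result = loop.run_until_complete(webRtcStart())

# ...then keep the loop running in a background thread so request
# handlers can submit coroutines to it later.
t = threading.Thread(target=loop.run_forever, daemon=True)
t.start()

# From a Flask view (another thread), submit work thread-safely:
async def do_work():
    return 42

future = asyncio.run_coroutine_threadsafe(do_work(), loop)
value = future.result(timeout=5)

# Shut the loop down cleanly when the app exits.
loop.call_soon_threadsafe(loop.stop)
t.join()
```

The key point is `run_coroutine_threadsafe`: it is the only safe way to hand a coroutine from a Flask worker thread to the loop running in the other thread.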

Turns out I am also still having an issue. @namoshizun, I don't know how to fix that off the bat, sorry. I might see if I can restructure my rtc...

@namoshizun I don't know what I was talking about before with trying some code from yours, I think I misread how you do your asyncio stuff. My guess is either...

> Hi @R0NAM1 , Sorry for asking in your thread. I've managed to stream video using Flask backend with aiortc. I'm using loop.run_forever to keep the player running in the...

> @R0NAM1 Thanks for clarifying. I switched from using Gunicorn to Hypercorn, and now it works. I suppose Gunicorn does not work well with async tasks. Regarding your question, I...

> pc is my RTCPeerConnection, when it closes on the frontend, the backend site will automatically stop
>
> ```
> disconnect() {
>   if (this.pc) {
>     this.pc.close();
> ...
> ```

One thing an LLM is good for is querying large datasets in a human way, basically talking with the data. This is especially useful for coding because the LLM can...