OnnxStream
Feature Proposal: Integration of OnnxStream with the Automatic1111 WebUI, Using RAM Instead of VRAM
Dear OnnxStream Developers,
I am reaching out to propose integrating OnnxStream as an extension for the Automatic1111 WebUI, utilizing RAM rather than VRAM. The rationale behind this suggestion follows from previous discussions about VRAM usage. With this integration, users with AMD Radeon cards, as well as those whose GPUs have less VRAM than the currently supported minimum, would be able to generate AI images.
Moreover, this integration could potentially offer a performance boost over plain CPU-and-RAM execution. Further, if a model-conversion feature were incorporated, it could make OnnxStream easier to use on a wider range of devices, including smartphones.
I believe this enhancement could significantly broaden the accessibility and usability of OnnxStream, especially for individuals with hardware constraints. I look forward to any consideration or discussion of this proposal.
Thank you for your time and consideration.
Hi,
I waited a little while before replying. I myself am trying to figure out the best direction for the project :-)
I don't think integration with AUTO1111 would benefit the end user at the moment, given the slowness of OnnxStream. A user of AUTO1111 expects to generate an image quickly, adjust the settings, and try again. Furthermore, OnnxStream lacks the features necessary for integration with AUTO1111 (such as img2img, inpainting, etc.).
From this point of view, I think @ThomAce's project is more interesting: https://github.com/ThomAce/OnnxStreamGui
Perhaps in the future, when OnnxStream's performance allows it, integration with AUTO1111 could make more sense :-)
Thanks, Vito
I understand. Thanks for the reply.