Yoko Li
A few ideas: set backup models to query from, and gracefully fail by generating an alternative from a local model.
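The backup-model idea above can be sketched as a small helper that tries each model in order and only surfaces an error if every fallback fails. This is a minimal illustration, not code from the project; the simulated "hosted model" and "local model" callbacks are assumptions standing in for real API calls.

```typescript
// Try each attempt in order; if one throws, fall through to the next
// (e.g. a hosted model first, then a local model as a graceful backup).
async function withFallback<T>(
  attempts: Array<() => Promise<T>>
): Promise<T> {
  let lastError: unknown;
  for (const attempt of attempts) {
    try {
      return await attempt();
    } catch (err) {
      lastError = err; // remember the failure and try the next backup
    }
  }
  throw lastError; // every model failed
}

// Demo with stubbed model calls (stand-ins for real API requests):
(async () => {
  const reply = await withFallback([
    async () => {
      throw new Error("hosted model down"); // simulated outage
    },
    async () => "reply from local model", // local fallback succeeds
  ]);
  console.log(reply);
})();
```

The same shape works whether the fallbacks are different hosted providers or a locally running model; only the callbacks change.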
The idea is that a companion can have a Twitter account (say, @-Rosie), and as a Twitter user, I can tweet to @-Rosie and have a conversation with her in a...
i.e., a companion should be able to generate a picture using a LoRA model and send it back to the human, or receive an image from the human and react to that...
Thanks for the support! :) 谢谢支持 (thank you for the support)
instead of waiting for the whole result to come back, as it may take a long time
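Streaming the response chunk-by-chunk, rather than awaiting the full result, can be sketched with a standard `ReadableStream` reader loop. The stream below is simulated so the example is self-contained, but the same loop works on `response.body` from `fetch()`; the function name `readStream` is an assumption for illustration.

```typescript
// Consume a byte stream incrementally, decoding each chunk as it arrives.
// In a real UI, each chunk would be rendered immediately instead of
// making the user wait for the complete response.
async function readStream(
  stream: ReadableStream<Uint8Array>
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters intact across chunk edges
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Demo with a simulated stream (stand-in for a streamed LLM response):
(async () => {
  const enc = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(enc.encode("Hel"));
      controller.enqueue(enc.encode("lo"));
      controller.close();
    },
  });
  console.log(await readStream(stream));
})();
```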
I have a function to convert an image URL (S3) to Base64, like below:

```typescript
export default async function toBase64ImageUrl(
  imgUrl: string
): Promise<string> {
  // Fetch the raw image bytes from the (S3) URL
  const fetchImageUrl = await fetch(imgUrl);
  const responseArrBuffer = await fetchImageUrl.arrayBuffer();
  // Encode the bytes as a base64 data URL, keeping the original MIME type
  const toBase64 = `data:${
    fetchImageUrl.headers.get("content-type") ?? "image/png"
  };base64,${Buffer.from(responseArrBuffer).toString("base64")}`;
  return toBase64;
}
```