Add feature to select and ask like with Gemini Canvas
Is your feature request related to a problem? Please describe.
It is very hard to ask the LLM what is wrong with a certain part of the code. Add a feature to select and ask, like in Gemini Canvas: you select a region and ask about it directly, with no need to upload images to the LLM and waste tokens.
Describe the solution you'd like
A feature to select and ask, like in Gemini Canvas.
Hmm, would supporting pasting a screenshot do the trick? You can save the screenshot and then add it to the conversation.
Judging by how it works with Google, I suppose it does not burn as many tokens as attaching the screenshot directly. The feature sends only the outlined part to the LLM.
You can just make a screenshot of the size you want, no? Also, the number of tokens an image costs an LLM doesn't directly relate to the size of the image, due to how LLMs work.
What do you mean by "size you want"? I pointed to the Gemini Canvas implementation; it is free, so you can check out for yourself how it works.
On a Mac, Cmd-Shift-4 lets you select a block of the screen, and then I can add that to Goose.
I mean, it would still be cool to add this to Goose of course, I just wanted to provide a workaround.
I use Win 11.
Hi, I would like to try working on this feature; I'm just trying to fully understand what it is.
Is it just that Goose gives me code, I highlight/select some part of it, and it appears in the chat box?
As in the image shown by akierum, does Goose have a preview feature like that?
What do you mean by "a preview feature like that"? We don't have any of that UI, so that would have to be built separately. Once you have the screenshot and the prompt, we'd have to figure out how to pipe that into Goose and start a new session with those elements.
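A minimal sketch of that piping step, assuming hypothetical types and a hypothetical helper (nothing here is goose's real API; the real wiring into a new session would still need to be designed):

```ts
// Hypothetical sketch only: SelectionAttachment, NewSessionRequest, and
// buildSelectionRequest are made-up names, not part of goose today.
import { readFile } from "node:fs/promises";
import { basename } from "node:path";

// Assumed shape of an image attachment captured from a selection
// (e.g. via Cmd-Shift-4 on macOS or Win-Shift-S on Windows).
interface SelectionAttachment {
  kind: "image";
  mimeType: string;
  base64Data: string;
  label: string;
}

// Assumed shape of the payload a new session would be started with.
interface NewSessionRequest {
  prompt: string;
  attachments: SelectionAttachment[];
}

// Bundle the user's captured screenshot and their question into one request.
async function buildSelectionRequest(
  screenshotPath: string,
  prompt: string
): Promise<NewSessionRequest> {
  const bytes = await readFile(screenshotPath);
  return {
    prompt,
    attachments: [
      {
        kind: "image",
        mimeType: "image/png",
        base64Data: bytes.toString("base64"),
        label: basename(screenshotPath),
      },
    ],
  };
}

// Usage sketch: the UI would hand this request to whatever starts a new session.
async function main() {
  const request = await buildSelectionRequest(
    "./selection.png",
    "What is wrong with the highlighted part of this code?"
  );
  console.log(`Prompt: ${request.prompt}`);
  console.log(
    `Attachment: ${request.attachments[0].label} (${request.attachments[0].base64Data.length} base64 chars)`
  );
}

main().catch(console.error);
```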
Thanks for clarifying. I initially thought the request was about selecting code that Goose itself outputs, which would get auto-included in the next query, but this is broader: we would first need to build the whole canvas UI, my bad. I don't think I'll be able to work on that part yet, but hopefully soon in the future :)