
Proposal for a "deep mode" to get better result synopses

Open thorwhalen opened this issue 2 years ago • 4 comments

Clever idea, but it doesn't work for my particular purposes. The text extracted from the Google results alone is not quite informative enough.

Here's another way one could (possibly) get better results: include a "deep mode" toggle. When deep mode is on, the {web_result} string would be defined as follows:

results = google_search(query)  # assume this returns a list of dicts ("jsons") with parsed results
for result in results:
    html = get_content(result["url"])         # fetch the page behind the result URL
    text = html2text(html)                    # extract the visible text
    result["synopsis"] = make_synopsis(text)  # summarize and attach to the result
web_result = make_result_string(results)
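
For illustration, here is a minimal sketch of what get_content and html2text could look like in Python, assuming the requests and beautifulsoup4 packages (neither is necessarily what the extension uses today; the function names simply mirror the pseudocode above):

import requests
from bs4 import BeautifulSoup

def get_content(url, timeout=10.0):
    """Fetch the raw HTML of a result page (no retries or error handling in this sketch)."""
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    return response.text

def html2text(html):
    """Extract the visible text from HTML, dropping scripts and styles."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())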

The only function that needs explanation is make_synopsis. Here we have several choices. We already have such a function (logically), since the current version of chatgpt-advanced must already do something like this (perhaps by taking the "snippet" of a result as is?). All this enhancement proposes is a more advanced mode for that step (perhaps several modes, perhaps parametrizable).

One way we could do this is to use ChatGPT itself, with a standard kind of prompt template:

Write a synopsis of the following text, using no more than {max_words} words:

{text}

Note: we'd need to truncate the text to fit within the prompt size limit, and cap the synopsis at a maximum number of words so it can later be injected into the prompt template without exceeding that limit.
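
To make that concrete, here is a rough sketch of make_synopsis along those lines. The ask_chatgpt helper is hypothetical (a stand-in for whatever call ends up sending the prompt to the model), and the character budget is a crude stand-in for a proper token count:

SYNOPSIS_PROMPT = (
    "Write a synopsis of the following text, "
    "using no more than {max_words} words:\n\n{text}"
)

def make_synopsis(text, max_words=80, max_chars=6000):
    """Truncate the page text and ask the model for a bounded-length synopsis."""
    truncated = text[:max_chars]  # crude cut; counting tokens would be more accurate
    prompt = SYNOPSIS_PROMPT.format(max_words=max_words, text=truncated)
    return ask_chatgpt(prompt)  # hypothetical helper: send the prompt, return the completion text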

@qunash and @ingumsky: I would do a pull request for this, but I'm not sure where to start (I'm not too familiar with Chrome extensions or JS; I'm a Python guy). If you want to point me in the right direction, I can try to figure it out, though. It shouldn't be too hard.

thorwhalen avatar Mar 01 '23 09:03 thorwhalen

Ta, @thorwhalen. I agree, your suggestion looks good, but I'm not a decision-maker here, and neither am I a developer of the extension. I'm just a person who contributed a handful of localised strings. I really hope @qunash will look into this, though.

ingumsky avatar Mar 01 '23 10:03 ingumsky

Again, I can probably figure it out myself and contribute, with a little bit of assistance on where to look.

thorwhalen avatar Mar 01 '23 21:03 thorwhalen

Hi, thanks for the suggestion. Currently it's not practical due to the limited context length, but it might make more sense when (if) GPT-4 with 8K and 32K input lengths becomes widely available.

This is on my list of possible improvements.

qunash avatar Mar 22 '23 11:03 qunash

> Hi, thanks for the suggestion. Currently it's not practical due to the limited context length, but it might make more sense when (if) GPT-4 with 8K and 32K input lengths becomes widely available.
>
> This is on my list of possible improvements.

This could work today by limiting to one search result article.
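
For instance (a sketch reusing the hypothetical helpers from the pseudocode above), deep mode could simply cap the number of fetched pages at one:

results = google_search(query)[:1]  # only the top result gets the fetch-and-summarize treatment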

wala0003 avatar Mar 22 '23 19:03 wala0003