
Issue on generating README.md

Open chevybowtie opened this issue 1 year ago • 3 comments

Found your project on Reddit so I thought I'd give it a try.

I tried this on my project, and it started generating (what I assume is) the README to my screen, but it ended prematurely with the following traceback:

Traceback (most recent call last):
  File "../codex-readme/codex_readme.py", line 196, in <module>
    generate_until_accepted(input_prompt, args.tokens)
  File "../codex-readme/codex_readme.py", line 169, in generate_until_accepted
    generated_readme = clear_screen_and_display_generated_readme(response)
  File "../codex-readme/codex_readme.py", line 116, in clear_screen_and_display_generated_readme
    next_response = next(response)
StopIteration

I'm not a Python guy (I just dabble), so naturally I turned to ChatGPT. It said the following, which may be relevant:

The generate_until_accepted function uses args without declaring it as a global variable or passing it as an argument. This could lead to unexpected behavior if args is defined elsewhere in the script. It would be better to pass args as an argument to generate_until_accepted.
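
For illustration, here is a minimal sketch of what that suggestion looks like; the extra parameter and the call site shown below are assumptions, since the full script isn't reproduced here:

def generate_until_accepted(input_prompt, tokens, args):
    '''
    Same function, but with args passed in explicitly instead of being read
    from module scope, so its dependencies are visible at the call site.
    '''
    ...

# Assumed call site (the original call at codex_readme.py line 196 passes
# only input_prompt and args.tokens):
# generate_until_accepted(input_prompt, args.tokens, args)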

The StopIteration error occurs when the next() function is called on an iterator that has no more elements. In this case, the error is raised because next(response) is called after the completion request has been completed.
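
As a quick, project-independent illustration of that behaviour:

chunks = iter(['# My Project\n', 'Generated README text.\n'])
print(next(chunks))   # prints "# My Project"
print(next(chunks))   # prints "Generated README text."
print(next(chunks))   # raises StopIteration: the iterator is exhausted
# next(chunks, None) would return None here instead of raising.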

The issue may be caused by the stop parameter passed to the openai.Completion.create method. The stop parameter specifies a sequence of tokens at which to stop generating text. In this case, the stop parameter is set to '===================\n', which is the sequence of characters that marks the end of the input prompt.

If the completion request generates text that does not include the stop sequence, the next(response) call in the clear_screen_and_display_generated_readme function will raise a StopIteration error because there are no more elements to iterate over.
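
For context, here is a minimal sketch of the streaming pattern involved, written against the legacy (pre-1.0) openai Completion API; the engine name, prompt, and token limit below are placeholders, not values taken from codex_readme.py. Consuming the stream with next(response, None) is one way to avoid StopIteration entirely:

import openai

# openai.api_key is assumed to be configured elsewhere.
response = openai.Completion.create(
    engine='code-davinci-002',                   # placeholder engine name
    prompt='Write a README for this project.',   # placeholder prompt
    max_tokens=512,
    stream=True,                                 # returns a generator of chunks
    stop='===================\n',                # the stop sequence discussed above
)

generated = ''
while True:
    chunk = next(response, None)                 # sentinel instead of StopIteration
    if chunk is None:
        break                                    # stream ended without a final chunk
    generated += chunk['choices'][0]['text']
    if chunk['choices'][0]['finish_reason'] is not None:
        break                                    # model hit the stop sequence or token limit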

To fix the issue, you can check whether next_response['choices'][0]['finish_reason'] is equal to 'stop' before calling next(response) again. Here's an updated version of the clear_screen_and_display_generated_readme function that handles this case:

def clear_screen_and_display_generated_readme(response):
    '''
    Display the generated README and return its contents as a string.
    '''
    os.system('cls' if os.name == 'nt' else 'clear')
    generated_readme = README_START
    print(generated_readme)

    for next_response in response:
        completion = next_response['choices'][0]['text']
        print(completion, end='')
        generated_readme += completion
        if next_response['choices'][0]['finish_reason'] == 'stop':
            break

    return generated_readme

This updated function should exit the loop when it encounters the 'stop' finish reason, and return the generated README as a string.

chevybowtie avatar Mar 13 '23 01:03 chevybowtie

If I add your bugfix, then I get a blank README.md file.

stoffl6781 avatar Mar 15 '23 21:03 stoffl6781

I added a try/except around the next(response) call, which otherwise dropped out after the last chunk:

def clear_screen_and_display_generated_readme(response):
    # Clear screen.
    os.system('cls' if os.name == 'nt' else 'clear')
    generated_readme = README_START
    print(generated_readme)
    while True:
        try:
            next_response = next(response)
        except StopIteration:
            print("No more items in the response")
            break
        except Exception as e:
            print(f"An error occurred: {e}")
            break
        completion = next_response['choices'][0]['text']
        generated_readme += completion
        # A non-None finish_reason marks the final chunk of the stream.
        if next_response['choices'][0]['finish_reason'] is not None:
            break
    return generated_readme

janrosendahl avatar Mar 17 '23 22:03 janrosendahl

Just merged this PR: https://github.com/tom-doerr/codex-readme/pull/3. Maybe it works for you now.

tom-doerr avatar Mar 22 '23 00:03 tom-doerr