
Requested tokens (2288) exceed context window of 2048

Open · tipe opened this issue 11 months ago · 3 comments

Processing Non-Immigrant Visa Classifications Chart-1.pdf ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:00
Traceback (most recent call last):
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/main.py", line 337, in <module>
    main()
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/main.py", line 252, in main
    data_texts = process_text_files(text_tuples, text_inference, silent=silent_mode, log_file=log_file)
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/text_data_processing.py", line 60, in process_text_files
    data = process_single_text_file(args, text_inference, silent=silent, log_file=log_file)
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/text_data_processing.py", line 37, in process_single_text_file
    foldername, filename, description = generate_text_metadata(text, file_path, progress, task_id, text_inference)
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/text_data_processing.py", line 71, in generate_text_metadata
    description = summarize_text_content(input_text, text_inference)
  File "/Users/tipe/Aktuell/4MAC/BIN & APP/_Scripts/Local-File-Organizer/text_data_processing.py", line 21, in summarize_text_content
    response = text_inference.create_completion(prompt)
  File "/opt/anaconda3/envs/local_file_organizer/lib/python3.12/site-packages/nexa/gguf/nexa_inference_text.py", line 277, in create_completion
    return self.model.create_completion(prompt=prompt, **params)
  File "/opt/anaconda3/envs/local_file_organizer/lib/python3.12/site-packages/nexa/gguf/llama/llama.py", line 1785, in create_completion
    completion: Completion = next(completion_or_chunks)  # type: ignore
  File "/opt/anaconda3/envs/local_file_organizer/lib/python3.12/site-packages/nexa/gguf/llama/llama.py", line 1201, in _create_completion
    raise ValueError(
ValueError: Requested tokens (2288) exceed context window of 2048

tipe · Jan 09 '25 15:01

You can increase it by doing this: https://github.com/QiuYannnn/Local-File-Organizer/issues/30#issuecomment-2443639358
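
For reference, the linked comment amounts to raising the model's context window where the text model is created in main.py. A minimal sketch of that change is below; the keyword `nctx` and the surrounding arguments are assumptions based on the Nexa SDK version shown in the traceback, so check them against the `NexaTextInference` constructor in your installed nexa/gguf/nexa_inference_text.py before copying.

```python
# main.py (sketch) -- create the text model with a larger context window.
# NOTE: `nctx=4096` is an assumption; some Nexa SDK builds may expect a
# differently named keyword. Verify against your installed SDK.
from nexa.gguf import NexaTextInference

text_inference = NexaTextInference(
    model_path="Llama3.2-3B-Instruct:q3_K_M",  # model name is illustrative
    stop_words=[],
    temperature=0.5,
    max_new_tokens=3000,
    top_k=3,
    top_p=0.3,
    profiling=False,
    nctx=4096,  # raise from the default 2048 so longer prompts fit
)
```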

SBMatthew · Jan 12 '25 14:01

> You can increase it by doing this: #30 (comment)

No matter what increased value you use, it still generates an error. I tried 10000 and it doesn't work. It looks like the developer is no longer active here. I'm an amateur, but it looks like someone is working on the token management:

https://github.com/ddh0/easy-llama/commit/283ad6271f6bfe6e0b575468afac407fd61ab6a8

I hope these developers can discuss this together and find a solution.
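
Until proper token management lands, a workaround is to cap how much text is sent to the summarizer so the prompt plus the requested output stays under the context window. A rough sketch follows, assuming `summarize_text_content` in text_data_processing.py looks roughly like the traceback suggests; the character cap, prompt wording, and response parsing below are illustrative, not the project's actual values.

```python
# text_data_processing.py (sketch) -- clamp the input before building the prompt.
# The 3000-character cap is a rough heuristic (on the order of 750 tokens),
# not a tuned value; lower it if you still hit the context-window error.
MAX_INPUT_CHARS = 3000

def summarize_text_content(text, text_inference):
    """Summarize a text snippet, truncating it so the prompt fits the context window."""
    snippet = text[:MAX_INPUT_CHARS]
    prompt = (
        "Provide a concise description of the following document:\n\n"
        f"{snippet}\n\n"
        "Description:"
    )
    response = text_inference.create_completion(prompt)
    # Response parsing assumes the llama.cpp-style completion dict.
    return response["choices"][0]["text"].strip()
```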

grankona · Jan 27 '25 09:01

Hi, I am a beginner in programming trying to use this tool. I still get the same error. How do I fix it? It is also very slow, taking at least a minute per file.

Vampy2489 · May 08 '25 10:05