
Number of topics

Open · AbhayGoyal opened this issue 11 months ago · 2 comments

Hi, can you tell me how to change the number of topics?

Also, why does lvl2 keep running indefinitely for me?

AbhayGoyal (Feb 12 '25)

Hey, I probably had the same issue. When running lvl2, it seemed to get stuck while collecting all documents assigned to each top-level topic. My solution was to process the documents in fixed-size batches rather than one by one: I introduced a new parameter (batch_size) and modified the prompt construction so that each prompt is built from a batch of documents (5 in my case). This prevents the prompts from growing uncontrollably and improves performance.
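For anyone hitting the same problem, here is a minimal sketch of the batching idea. The function and parameter names (`batch_documents`, `build_prompt`, `batch_size`) are illustrative, not topicGPT's actual API; only the general approach (one prompt per fixed-size batch instead of one prompt per document) reflects the change described above.

```python
# Sketch: build one lvl2 prompt per fixed-size batch of documents,
# instead of one prompt per document (names are illustrative).

def batch_documents(docs, batch_size=5):
    """Yield successive fixed-size batches from a list of documents."""
    for i in range(0, len(docs), batch_size):
        yield docs[i:i + batch_size]

def build_prompt(topic, doc_batch):
    """Build a single subtopic-generation prompt from a batch (illustrative)."""
    joined = "\n\n".join(
        f"Document {j + 1}: {d}" for j, d in enumerate(doc_batch)
    )
    return f"Top-level topic: {topic}\n\n{joined}\n\nPropose subtopics."

docs = [f"doc {n}" for n in range(12)]
prompts = [
    build_prompt("Trade", batch)
    for batch in batch_documents(docs, batch_size=5)
]
print(len(prompts))  # 12 docs with batch_size=5 -> 3 prompts
```

Each prompt now contains at most `batch_size` documents, so prompt length stays bounded regardless of how many documents were assigned to a topic.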

anne-kreuter (Mar 04 '25)

Ok, yeah, that makes sense.


AbhayGoyal (Mar 04 '25)