Memory usage needs to be optimized
I used gpt-crawler to crawl a site with roughly 100k pages. By the time about 4k pages had been crawled, the process was using around 4 GB of memory and hit the memory limit when running through npm.
The crawl also can't be resumed by restarting it, so the run is lost.
Extrapolating, 100k pages would need on the order of 100 GB of memory.
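As a stopgap until memory use is reduced, the workaround I'm trying is to split the crawl into several smaller runs, each scoped by `match` to one subtree of the site and written to its own output file, so no single run gets near the point where memory blew up. Below is a sketch of one such run against the `config.ts` layout shown in the README; the URL, match pattern, and file name are placeholders, and the field names may differ between versions:

```ts
// config.ts — sketch of one batched run (values are placeholders)
import { Config } from "./src/config";

export const defaultConfig: Config = {
  // Crawl only one section of the site per run to stay under the memory ceiling
  url: "https://example.com/docs/section-1",
  match: "https://example.com/docs/section-1/**",
  // Keep each run well below the ~4k pages where memory hit 4 GB
  maxPagesToCrawl: 3000,
  outputFileName: "output-section-1.json",
};
```

Raising Node's heap with `NODE_OPTIONS="--max-old-space-size=8192"` delays the crash, but at the observed growth rate it still won't get anywhere near 100k pages, so the underlying memory usage still needs to be fixed.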