gospider
Gospider - Fast web spider written in Go
Hi 👋🏻 I tried to crawl ``http://localhost:3000``, but it looks like I can't: ``[0000] ERROR Failed to parse domain``. How could I fix that, please? Using the latest commit.
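Not gospider's actual code, but a minimal sketch of the likely failure mode: crawlers that extract a registrable domain (eTLD+1) from the target reject bare hosts like ``localhost`` even though the URL itself parses fine. The ``hasRegistrableDomain`` helper below is hypothetical and only approximates that check with a "contains a dot" test.

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// hasRegistrableDomain is a hypothetical check mimicking crawlers that
// require an eTLD+1 such as "example.com"; a bare host like "localhost"
// has no dot and therefore fails, even though url.Parse accepts it.
func hasRegistrableDomain(raw string) bool {
	u, err := url.Parse(raw)
	if err != nil {
		return false
	}
	return strings.Contains(u.Hostname(), ".")
}

func main() {
	fmt.Println(hasRegistrableDomain("http://localhost:3000")) // false
	fmt.Println(hasRegistrableDomain("http://example.com"))    // true
}
```

If that is indeed the cause, a workaround is to crawl the target through a hostname that has a dot in it (e.g. an entry you add to your hosts file) rather than ``localhost`` directly.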
ref: https://github.com/m4ll0k/Aron https://github.com/s0md3v/Arjun
help
What does ``--include-other-source`` mean?
@j3ssie Any way to avoid duplicate URLs? On some domains the crawl never ends and keeps continuing with duplicate URLs, which adds hours of crawl time on the same...
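One common way to handle this (a sketch, not gospider's actual implementation) is a seen-set consulted before queueing each URL, so a duplicate is dropped instead of re-crawled. The ``seenFilter`` type and its names here are hypothetical:

```go
package main

import "fmt"

// seenFilter is a hypothetical dedupe layer: a crawler would call Add
// before queueing a URL and skip the URL when Add returns false.
type seenFilter struct {
	seen map[string]struct{}
}

func newSeenFilter() *seenFilter {
	return &seenFilter{seen: make(map[string]struct{})}
}

// Add returns true only the first time a given URL is observed.
func (f *seenFilter) Add(u string) bool {
	if _, ok := f.seen[u]; ok {
		return false
	}
	f.seen[u] = struct{}{}
	return true
}

func main() {
	f := newSeenFilter()
	fmt.Println(f.Add("https://example.com/a")) // true: first sighting
	fmt.Println(f.Add("https://example.com/a")) // false: duplicate, skip
}
```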
Hi guys. I want to thank you for the great tool, and I have some suggestions. As the pic above shows, there are many similar URLs on one site,...
Issue: Many requests are made to domains that were not supplied as input. Solution: Restrict requests to the domains supplied as input, or provide an option to do so when desired.
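The restriction being requested could look something like this sketch: keep a discovered URL only if its hostname equals, or is a subdomain of, one of the input domains. The ``inScope`` function and its names are assumptions for illustration, not gospider's API:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// inScope (hypothetical) reports whether raw's hostname equals, or is a
// subdomain of, one of the allowed input domains.
func inScope(raw string, allowed []string) bool {
	u, err := url.Parse(raw)
	if err != nil {
		return false
	}
	host := u.Hostname()
	for _, d := range allowed {
		if host == d || strings.HasSuffix(host, "."+d) {
			return true
		}
	}
	return false
}

func main() {
	allowed := []string{"example.com"}
	fmt.Println(inScope("https://sub.example.com/x", allowed)) // true: subdomain of input
	fmt.Println(inScope("https://tracker.net/pixel", allowed)) // false: out of scope
}
```

The ``"."+d`` suffix check matters: a plain ``HasSuffix(host, d)`` would wrongly accept ``notexample.com`` for the input ``example.com``.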
Not able to see anything while running the tool. I tried all possibilities, like a single URL and using threads, but it doesn't get executed and there are no results.
It would be cool to have an option to get just the URLs alone, without the [words] and everything else, like a flag to do that.
Hello there. In the latest version there are several bugs, one of the most annoying of which is that the `-q` flag is not quite working as intended. Here is...
Hey! Can you please make linkfinder default to false?