nuclei
[Store scan result to database]
Please describe your feature request:
Is there any plan for nuclei to support storing results to a database, such as SQLite?
Describe the use case of this feature:
For automated scanning, users need to monitor the result files for changes, which is quite painful. For example, with a command like the one below:
nuclei -l urls.txt -t example.com -sresp -db test.db
the requests, responses, and other results would be inserted into test.db.
I can work on this one
@kchason there is elastic / jsonl export support which can be used for the same purpose, or jsonl can be imported into a local db as well. Is there any specific value in having another store for result information?
@ehsandeep Hi, my scenario is that the target list I scan is huge and I want to get nuclei results in real time. That means I need to monitor the JSON files for changes; once they have changed, I read them. If the results can be stored in a SQLite db, I just need to read the db and no longer need to monitor JSON files.
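For reference, the "jsonl can be imported into a local db" route can be sketched in a few lines. This is only an illustration: it assumes nuclei's line-delimited JSON output with one finding per line, and the `template-id` / `host` field names as they appear in that output.

```python
import json
import sqlite3


def import_jsonl(jsonl_path: str, db_path: str) -> int:
    """Load nuclei JSONL output (one finding per line) into a SQLite table.

    The field names ("template-id", "host") follow nuclei's JSON output;
    adjust them if your nuclei version names the fields differently.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS results ("
        "  id INTEGER PRIMARY KEY,"
        "  template_id TEXT,"
        "  host TEXT,"
        "  raw_json TEXT"  # keep the full event for later inspection
        ")"
    )
    count = 0
    with open(jsonl_path) as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            event = json.loads(line)
            conn.execute(
                "INSERT INTO results (template_id, host, raw_json) VALUES (?, ?, ?)",
                (event.get("template-id"), event.get("host"), line),
            )
            count += 1
    conn.commit()
    conn.close()
    return count
```

This keeps a couple of commonly queried fields as real columns while preserving the raw event, so nothing is lost even if the column selection turns out to be too narrow.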
I agree with the idea of cross-correlation and aggregation; I'd personally benefit from this as well. Elastic is great, but it requires more infrastructure, and my goal would be to have the fields normalized to allow better searching/querying for local aggregation.
Elastic is too heavy for me to run on a small vps.
So looking at options to export this, we can generate fields for each of the properties in ResultEvent, but some of them are objects not defined by the Nuclei project itself; the Interaction struct, for example, comes from the interactsh project. I assume your use case is to have each field normalized into many-to-one relationships and not have the entire JSON object stored in a BLOB as a single property, correct?
I have it working where it exports the top-level properties to a SQLite database/file, but the nested properties and arrays become more complicated if we're normalizing them into many-to-one relationships, and using an ORM to keep such changes maintainable would be more disruptive to the struct definitions themselves.
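To make the many-to-one idea concrete, here is one hypothetical two-table layout: top-level ResultEvent fields in a parent table and a nested array (e.g. extracted results) in a child table keyed back to the event. All table and column names here are invented for illustration; nuclei defines no such schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE result_event (
    id          INTEGER PRIMARY KEY,
    template_id TEXT,
    host        TEXT,
    matched_at  TEXT
);
-- many-to-one: several extracted values can belong to one event
CREATE TABLE extracted_result (
    id       INTEGER PRIMARY KEY,
    event_id INTEGER NOT NULL REFERENCES result_event(id),
    value    TEXT
);
""")

# One event with two extracted values (sample data, not real scan output).
cur = conn.execute(
    "INSERT INTO result_event (template_id, host, matched_at) VALUES (?, ?, ?)",
    ("tech-detect", "https://a.example", "https://a.example/login"),
)
event_id = cur.lastrowid
conn.executemany(
    "INSERT INTO extracted_result (event_id, value) VALUES (?, ?)",
    [(event_id, "nginx"), (event_id, "php")],
)
```

The trade-off discussed in the thread is visible here: queries over `value` become plain SQL joins, but every nested ResultEvent type needs its own child table and migration story.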
@kchason Hi, why not store the entire JSON object in a BLOB as a single property? This would reduce a lot of work, even future maintenance, and let users extract whatever data they want from the JSON object.
That would be easy, but then queryability is limited, no? At that point, it's not really different from running jq against a collection of files. I thought your goal (and mine) was to have it normalized for SQL queries?
We could use something like https://www.sqlite.org/json1.html but it's less efficient than the normalized version would be.
Maybe it's still different from plain JSON files? Using jq directly may read the whole JSON file, which costs more RAM compared with reading from a SQLite db?
I think using https://www.sqlite.org/json1.html is ok.
We could, but if we're storing the entire JSON object in a string or BLOB field, wouldn't that be the same thing, since you'd have to either string-match or parse the entire object as JSON? I thought the goal was to enable easier normalization. We'd need to either use something like MongoDB or normalize the fields for a relational database.
Yes, normalizing the fields would be better, but there seem to be too many of them; if we select only some to store, we may miss fields that are important to others. Actually, I just need a small database to store the results; then I'll write a script to monitor that database and push out the results.
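A minimal polling sketch of that "script that monitors the database" idea, assuming a hypothetical `results` table with an autoincrementing `id` and a `raw_json` column (names invented for illustration):

```python
import sqlite3
import time


def poll_new_results(db_path: str, last_id: int):
    """Return rows inserted after last_id.

    Assumes a 'results' table with monotonically increasing integer ids,
    so last_id acts as a cursor into the stream of findings.
    """
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT id, raw_json FROM results WHERE id > ? ORDER BY id",
        (last_id,),
    ).fetchall()
    conn.close()
    return rows


def watch(db_path: str, interval: float = 5.0):
    """Loop forever, pushing each new finding exactly once."""
    last_id = 0
    while True:
        for row_id, raw in poll_new_results(db_path, last_id):
            print(raw)  # replace with whatever push/notify logic you need
            last_id = row_id
        time.sleep(interval)
```

Because SQLite handles concurrent readers, this watcher can run while the scan process keeps inserting, which is the advantage over tailing JSON files.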
I think if you want to monitor your results in real time, you need nuclei streaming results at runtime, not nuclei writing to a database (btw, why sqlite and not mysql? the output format should always be abstract and agnostic); it's up to you to process that data and store it for your own purposes.
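A sketch of that streaming approach: consume nuclei's output pipe line by line instead of tailing files or polling a database. The exact flags for line-delimited JSON output vary by nuclei version (check `nuclei -h`), so the command shown in the docstring is only illustrative.

```python
import json
import subprocess


def stream_findings(cmd):
    """Yield findings as the scanner emits them, one JSON object per line.

    cmd is the full invocation with JSONL output going to stdout, e.g.
    something like ["nuclei", "-l", "urls.txt", "-jsonl"] -- verify the
    flag names against your installed version, this is an assumption.
    """
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        line = line.strip()
        if not line:
            continue
        yield json.loads(line)  # one finding, available in real time
    proc.wait()


# Usage sketch: each event arrives as soon as nuclei prints it, so you can
# forward it to a queue, webhook, or database of your choice.
# for event in stream_findings(["nuclei", "-l", "urls.txt", "-jsonl"]):
#     handle(event)
```

This keeps nuclei's output format agnostic, as suggested above: the storage decision (SQLite, MySQL, anything else) lives entirely in your consumer.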