parsedmarc
Index management empty, no data in Elastic database
Hello, I run Parsedmarc on Windows with ES and Kibana 8.3.2 and Python 3.10.6. The configuration of ES and Kibana is OK, and the export.ndjson file is imported correctly. Parsedmarc works with msgraph and a Client Secret: finding mails, parsing and moving are all OK,
but no data in elastic, and the Index management page is empty.
No error in elastic, kibana or parsedmarc console.
Any idea?
Thanks
David
I can also see that parsedmarc finds, parses and moves messages without error even if Elasticsearch is not running!!!
Hello @davidande ,
You could enable debug mode (see the docs): [general] => debug = True. You will get more details on the error in the logs.
For your second point, this issue has already been reported in #367. Fixing it will require a refactoring of the code, as currently the mails are processed first and the data is stored in a second step.
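As an illustration only (this is not current parsedmarc code), such a refactoring could start with a pre-flight check so that mails are not moved when the storage backend is unreachable; the host below is just an example value:

from elasticsearch import Elasticsearch

def elasticsearch_reachable(hosts):
    # Ping the cluster before touching the mailbox, so messages stay in place if storage is down.
    return Elasticsearch(hosts).ping()

if not elasticsearch_reachable(["http://127.0.0.1:9200"]):
    raise SystemExit("Elasticsearch is unreachable; skipping mail processing.")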
Best regards, Anael
Thanks @AnaelMobilia, in my case both Elasticsearch and Kibana run well: localhost:9200 as well as :5601 respond fine. Using parsedmarc --debug I see that the mails are selected, parsed and moved correctly, but no data is stored. By the way, the Elasticsearch logs show no entries about storing data.
I have no problem on an Ubuntu machine; I only get this on a Windows machine. FYI, everything is launched with admin rights on the Windows machine. I just did tests with Elasticsearch and Kibana 8.1.1 and Python 3.10.6, the same versions used on the Ubuntu machine: same results.
Thanks for your help
Are you using the latest parsedmarc version? Could you provide the Windows version? Could you share the parsedmarc log too?
You should have something like:
2022-11-18 15:16:19,615 - DEBUG - [__init__.py:1095] - Processing 1 messages
2022-11-18 15:16:19,615 - DEBUG - [__init__.py:1099] - Processing message 1 of 1: UID 1234
2022-11-18 15:16:19,633 - INFO - [__init__.py:805] - Parsing mail from Example DMARC Report Generator <[email protected]>
2022-11-18 15:16:19,799 - DEBUG - [__init__.py:1152] - Moving aggregate report messages from INBOX to FOLDER/Aggregate
2022-11-18 15:16:19,799 - DEBUG - [__init__.py:1159] - Moving message 1 of 1: UID 1234
2022-11-18 15:16:20,475 - INFO - [elastic.py:295] - Saving aggregate report to Elasticsearch
Best regards, Anael
Yes, I use the latest 8.3.2 version of Parsedmarc. I'm on Windows 11 Pro, but I get the same result on Windows Server 2019 or 2022.
Here is my parsedmarc log
DEBUG:__init__.py:1087:Found 10 messages in Inbox
DEBUG:__init__.py:1095:Processing 10 messages
DEBUG:__init__.py:1099:Processing message 1 of 10: UID XXXX-YYYY-ZZZZ
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 2 of 10: UID XXXX-YYYY-ZZZZ
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 3 of 10: UID XXXX-YYYY-ZZZZRg678AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 4 of 10: UID XXXX-YYYY-ZZZZRg677AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 5 of 10: UID XXXX-YYYY-ZZZZRg676AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 6 of 10: UID XXXX-YYYY-ZZZZRgZZZZAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 7 of 10: UID XXXX-YYYY-ZZZZRg674AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 8 of 10: UID XXXX-YYYY-ZZZZRg673AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 9 of 10: UID XXXX-YYYY-ZZZZRgZZZZ
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1099:Processing message 10 of 10: UID XXXX-YYYY-ZZZZRg671AAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1152:Moving aggregate report messages from Inbox to Archive/Aggregate
DEBUG:__init__.py:1159:Moving message 1 of 10: UID XXXX-YYYY-ZZZZ
DEBUG:__init__.py:1159:Moving message 2 of 10: UID XXXX-YYYY-ZZZZ
DEBUG:__init__.py:1159:Moving message 3 of 10: UID XXXX-YYYY-ZZZZRg678AAA=
DEBUG:__init__.py:1159:Moving message 4 of 10: UID XXXX-YYYY-ZZZZRg677AAA=
DEBUG:__init__.py:1159:Moving message 5 of 10: UID XXXX-YYYY-ZZZZRg676AAA=
DEBUG:__init__.py:1159:Moving message 6 of 10: UID XXXX-YYYY-ZZZZRgZZZZAA=
DEBUG:__init__.py:1159:Moving message 7 of 10: UID XXXX-YYYY-ZZZZRg674AAA=
DEBUG:__init__.py:1159:Moving message 8 of 10: UID XXXX-YYYY-ZZZZRg673AAA=
DEBUG:__init__.py:1159:Moving message 9 of 10: UID XXXX-YYYY-ZZZZRgZZZZ
DEBUG:__init__.py:1159:Moving message 10 of 10: UID XXXX-YYYY-ZZZZRg671AAA=
As I can see, there is a difference between our logs.
On mine, this line never appears:
INFO - [elastic.py:295] - Saving aggregate report to Elasticsearch
To go further, here is my Parsedmarc log with silent = false:
DEBUG:__init__.py:1087:Found 1 messages in Inbox
DEBUG:__init__.py:1095:Processing 1 messages
DEBUG:__init__.py:1099:Processing message 1 of 1: UID AAMkADFhNTZhMDA5LWVhNTUtNDI0Zi1iYmM3LTVhNjgyNDUzNzA4ZABGAAAAAACJjIix4c3eSr8KMFU6Z5FYBwB0-VV1baKWTId8H5WQY0lkAAAAAAEMAAB0-VV1baKWTId8H5WQY0lkAADvAD_tAAA=
INFO:__init__.py:805:Parsing mail from [email protected]
DEBUG:__init__.py:1152:Moving aggregate report messages from Inbox to Archive/Aggregate
DEBUG:__init__.py:1159:Moving message 1 of 1: UID AAMkADFhNTZhMDA5LWVhNTUtNDI0Zi1iYmM3LTVhNjgyNDUzNzA4ZABGAAAAAACJjIix4c3eSr8KMFU6Z5FYBwB0-VV1baKWTId8H5WQY0lkAAAAAAEMAAB0-VV1baKWTId8H5WQY0lkAADvAD_tAAA=
{
  "aggregate_reports": [
    {
      "xml_schema": "draft",
      "report_metadata": { "org_name": "google.com", "org_email": "[email protected]", "org_extra_contact_info": "https://support.google.com/a/answer/2466580", "report_id": "9750481xxxxxx6280907", "begin_date": "2022-11-19 01:00:00", "end_date": "2022-11-20 00:59:59", "errors": [] },
      "policy_published": { "domain": "xxxxxxxxx.com", "adkim": "r", "aspf": "r", "p": "none", "sp": "none", "pct": "100", "fo": "0" },
      "records": [
        {
          "source": { "ip_address": "xxx.xxx.xxx.xxx", "country": "FR", "reverse_dns": "xxx.ovh.net", "base_domain": "ovh.net" },
          "count": 1,
          "alignment": { "spf": false, "dkim": false, "dmarc": false },
          "policy_evaluated": { "disposition": "none", "dkim": "fail", "spf": "fail", "policy_override_reasons": [] },
          "identifiers": { "header_from": "xxxxx.com", "envelope_from": "xxxxx.ovh.net", "envelope_to": null },
          "auth_results": { "dkim": [], "spf": [ { "domain": "xxxxx.ovh.net", "scope": "mfrom", "result": "pass" } ] }
        }
      ]
    }
  ],
  "forensic_reports": []
}
DEBUG:__init__.py:1087:Found 0 messages in Inbox
DEBUG:__init__.py:1095:Processing 0 messages
{ "aggregate_reports": [], "forensic_reports": [] }
and there is no output from elastic.py at all.
I have the same issue. I migrated/updated parsedmarc to the latest version and configured it with the Graph API and Elastic as output. But now it isn't working: it crashes after processing the message, and nothing appears in Elasticsearch.
parsedmarc: 8.3.2 (running in docker with dockerfile from repo) elasticsearch: 8.0.1
I've got the same output as @davidande
my config:
[general]
debug = True
silent = False

[msgraph]
auth_method=ClientSecret
client_id=
client_secret=
tenant_id=
mailbox=

[mailbox]
reports_folder=Testpostvak

[elasticsearch]
hosts = username:password@<IP>:9200
ssl = False
monthly_indexes = True
Hi,
I have the same problem.
ElasticSearch 7.17.6
Parsedmarc 8.4.1
[general]
save_aggregate = True
save_forensic = True
chunk_size = 5
silent = False
debug = True
[imap]
host = (PROTECTED)
user = (PROTECTED)
password = (PROTECTED)
[mailbox]
watch = True
delete = False
batch_size = 10
check_timeout = 15
[elasticsearch]
hosts = 127.0.0.1:9200
ssl = False
monthly_indexes = True
Here are 3 of these messages where it's supposed to be saving DMARC reports, but nothing ever gets saved to ElasticSearch and I don't see any graphs in Grafana:
Jan 18 11:09:37 dmarc.example.com parsedmarc[363045]: INFO:elastic.py:295:Saving aggregate report to Elasticsearch
Jan 18 11:10:01 dmarc.example.com parsedmarc[363045]: INFO:elastic.py:295:Saving aggregate report to Elasticsearch
Jan 18 11:10:01 dmarc.example.com parsedmarc[363045]: INFO:elastic.py:295:Saving aggregate report to Elasticsearch
Is it a bug with elastic.py?
pip list shows these elasticsearch modules installed:
elasticsearch 7.12.0
elasticsearch-curator 5.8.4
elasticsearch-dsl 7.4.0
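If it helps narrow things down, here is a minimal sketch (not part of parsedmarc) that uses that same elasticsearch 7.x client to check whether the cluster is reachable and whether any parsedmarc indices exist. It assumes the 127.0.0.1:9200 host from the config above and parsedmarc's dmarc_* index naming:

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://127.0.0.1:9200"])
print(es.info())                                        # cluster name and version; raises if the node is unreachable
print(es.cat.indices(index="dmarc_*", format="json"))   # any indices parsedmarc has created so far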
Hello, regarding this issue, it looks like emails are correctly processed in __init__.py -> get_dmarc_reports_from_mailbox(). Then cli.py -> _main() -> process_reports() is called.
Thanks to @davidande it is possible to verify that the reports are present and should be processed (the JSON output comes from https://github.com/domainaware/parsedmarc/blob/master/parsedmarc/cli.py#L79).
But at https://github.com/domainaware/parsedmarc/blob/master/parsedmarc/cli.py#L125, nothing about Elasticsearch gets logged.
I have two ideas:
- The whole process looks correct and ends with the aggregate report being saved at https://github.com/domainaware/parsedmarc/blob/master/parsedmarc/elastic.py#L401. Maybe there is some issue behind this save which is not caught properly? (See the sketch after this list.)
- Due to #315 / https://github.com/domainaware/parsedmarc/commit/af2afddf96fd80c04c1e3ff3d17cfc64c27d75b2, the elasticsearch library used by parsedmarc is pinned to <7.14.0. Maybe there is some race condition which generates an invisible error? Ping @seanthegeek as he is the commit's author.
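To test the first idea, a wrapper like the following could be put around the document save (this is a hypothetical sketch, not the existing elastic.py code), so that any exception raised by the Elasticsearch client is logged instead of disappearing:

import logging

logger = logging.getLogger("parsedmarc")

def save_with_logging(document):
    # "document" is assumed to be an elasticsearch-dsl Document built from a parsed report.
    try:
        document.save()
    except Exception as error:  # deliberately broad: we want to see any hidden failure
        logger.error("Saving report to Elasticsearch failed: %s", error)
        raise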
Questions:
- @davidande do you have anything in the Elasticsearch logs?
- @tdm4 and @coolriku are you running parsedmarc on Windows too? (OS / version)
Best regards, Anael
Hi @AnaelMobilia -
I fixed my problem:
- Running curl http://127.0.0.1:9200/_cat/shards?v=true showed a lot of indices NOT ASSIGNED. When I first set up ElasticSearch I hadn't set this in elasticsearch.yml: discovery.type: single-node. There was something like 5.21 GB worth of indices, but the cluster was stuck in 'Yellow' state (it would not go to 'Green' state), so I ended up deleting ALL the indices in ES with curl -X DELETE 'http://127.0.0.1:9200/*'
- I had some reports still in INBOX/Archive/Aggregate so I moved them all back into INBOX
- parsedmarc picked up on these and this time it wrote to ElasticSearch properly and the graphs now show up in Grafana
So... all along it was an ElasticSearch problem. It might be a good idea if the parsedmarc output said there was a problem when trying to send data to ElasticSearch, which would help us find the solution quicker.
@davidande @coolriku - Check your ElasticSearch server: make sure the cluster status isn't red or yellow, and check the ES guides on managing indices. My actions were a little drastic, but I really had no need for last year's DMARC data; I'm only interested in recent data.
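For reference, the same check can be done from Python instead of curl; a minimal sketch, assuming the local 127.0.0.1:9200 node used in the configs above:

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://127.0.0.1:9200"])
health = es.cluster.health()
# A red or yellow status, or a non-zero number of unassigned shards, points to the same problem I had.
print(health["status"], "-", health["unassigned_shards"], "unassigned shards")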
@davidande @AnaelMobilia I found the problem (at least for me): these settings are mandatory (see the docs):
[general]
save_aggregate = True
save_forensic = True
When I cleaned up the config I removed those settings, silly me.
I'm running it in docker with provided dockerfile in repo.
Hello,
@AnaelMobilia: no log from Elasticsearch, it stays quiet!
@tdm4: the status of Elasticsearch is green; the curl command says "Empty reply from server".
@coolriku: yes, I also have save_aggregate and save_forensic set to true.
Just a reminder: for me it works fine on Ubuntu but not on Windows (11, Server 2019 or Server 2022).
Hello everybody,
So at last I have found a solution for Parsedmarc to communicate with my Elasticsearch. In parsedmarc.ini:
[elasticsearch]
hosts = http://elastic:MyPassword@localhost:9200
ssl = False
I did a fresh new install on Windows Server. After adding the Elasticsearch info in parsedmarc.ini, it works fine.
@coolriku: do you still have the problem?
No, I don't have any problems after fixing the config.