logdeep
log anomaly detection toolkit including DeepLog
Sorry for the late reply. These are the three code snippets I wrote earlier; run them in order. I hope they are useful to you! @huhui, @arunbaruah, @nagsubhadeep, @Magical66 1....
Hi @[donglee-afar](https://github.com/donglee-afar): I read the answers in the issue threads, but I still don't understand the data-processing steps. I don't know how to convert "sequece_hdfs.csv" into "hdfs_train" in logdeep...
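A minimal sketch of the kind of conversion being asked about: turning a CSV whose rows hold per-block event-ID sequences into DeepLog-style lines of space-separated IDs. The column name `EventSequence` and the comma-separated format are assumptions, not the project's confirmed schema; adjust them to match your actual sequence CSV.

```python
import csv
import io

def csv_to_sequences(csv_text, seq_column="EventSequence"):
    """Convert a CSV whose `seq_column` holds comma-separated event IDs
    (e.g. "5,5,22,11") into one space-separated line per sequence,
    the line format hdfs_train-style files typically use.
    NOTE: the column name is a guess -- adapt it to your CSV."""
    lines = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Strip any stray brackets, then split on commas.
        events = row[seq_column].replace("[", "").replace("]", "").split(",")
        lines.append(" ".join(e.strip() for e in events))
    return lines

# Hypothetical two-row input in the assumed schema.
sample = 'BlockId,EventSequence\nblk_1,"5,5,22,11"\nblk_2,"5,22,9"\n'
print(csv_to_sequences(sample))  # ['5 5 22 11', '5 22 9']
```

Writing each returned line to a text file would then give a train file in the space-separated format the LSTM data loader expects.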
Hi, thanks for making such an amazing project. I have been trying to use it for my log files. I could parse the log files into equivalent CSV files using Logparser...
How would I go about using DeepLog for Apache logs? 192.168.0.14 - - [15/Sep/2021:07:28:39 -0400] "GET /media/plg_system_popup/js/jquery.js HTTP/1.1" 200 293755 "https://192.168.0.52/" "Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Firefox/78.0"
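One possible first step for a question like this: parse each Apache access-log line with a Combined Log Format regex and collapse it to a coarse event key (here method + status code) that could serve as a DeepLog event type. The regex and the `method_status` keying are a sketch of one reasonable approach, not logdeep's own preprocessing.

```python
import re

# Standard Apache Combined Log Format: %h %l %u [%t] "%r" %>s %b "referer" "agent"
APACHE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<proto>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

def to_event(line):
    """Reduce one access-log line to a coarse event key (method + status).
    Returns None for lines that do not match the expected format."""
    m = APACHE_RE.match(line)
    if not m:
        return None
    return f'{m.group("method")}_{m.group("status")}'

line = ('192.168.0.14 - - [15/Sep/2021:07:28:39 -0400] '
        '"GET /media/plg_system_popup/js/jquery.js HTTP/1.1" 200 293755 '
        '"https://192.168.0.52/" "Mozilla/5.0 (X11; Linux x86_64; rv:78.0)"')
print(to_event(line))  # GET_200
```

Mapping each distinct key to an integer event ID and grouping events per client IP or session would then yield the event-ID sequences DeepLog trains on.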
(python38) houjingwen@MacBook-Pro-2 demo % python3 loganomaly.py train
Traceback (most recent call last):
  File "loganomaly.py", line 11, in <module>
    from logdeep.models.lstm import loganomaly, deeplog, robustlog
  File "/Users/houjingwen/Desktop/logdeep-master/logdeep/models/lstm.py", line 1, in <module>
    import torch
ModuleNotFoundError: No...
Hi, can you kindly let me know how you got 4855 sequences in hdfs_train? When I used your 'sample_hdfs.py' script to generate a sequence file from a 100k structured file...
Thanks for your excellent project, but I am a little confused. I use Drain as the log parser, but the template count is 47, so I want to know what log parsing...
Traceback (most recent call last):
  File "structure_bgl.py", line 66, in <module>
    eventmap = match(BGL)
  File "structure_bgl.py", line 41, in match
    if re.match(r''+item,log_event) and re.match(r''+item,log_event).span()[1] == len(log_event):
  File "/home/lepton00/opt/miniconda/lib/python3.7/re.py", line 173, in...
Thanks for your awesome work! @donglee-afar I have two questions about hdfs_train, hdfs_test_normal, and hdfs_test_abnormal: 1) How do you get them from the whole dataset? I mean, how do you divide the...
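A hedged sketch of how such a three-way split is commonly done for HDFS: label each block via the dataset's anomaly labels, take a fraction of the normal blocks for training, and keep the rest of the normals plus all anomalies for testing. The 30% training ratio and the label strings ("Normal"/"Anomaly", as in the public HDFS anomaly_label.csv) are assumptions, not the authors' confirmed procedure.

```python
def split_hdfs(sequences, labels, train_ratio=0.3):
    """Split block-level event sequences into train / test_normal /
    test_abnormal. `sequences` maps block id -> event-ID list;
    `labels` maps block id -> "Normal" or "Anomaly".
    The train_ratio is illustrative, not the project's exact value."""
    normal = [b for b in sequences if labels[b] == "Normal"]
    abnormal = [b for b in sequences if labels[b] == "Anomaly"]
    n_train = int(len(normal) * train_ratio)
    train = [sequences[b] for b in normal[:n_train]]        # -> hdfs_train
    test_normal = [sequences[b] for b in normal[n_train:]]  # -> hdfs_test_normal
    test_abnormal = [sequences[b] for b in abnormal]        # -> hdfs_test_abnormal
    return train, test_normal, test_abnormal

# Hypothetical four-block example.
seqs = {"blk_1": [5, 22], "blk_2": [5, 9], "blk_3": [11, 9], "blk_4": [5, 5]}
labs = {"blk_1": "Normal", "blk_2": "Anomaly", "blk_3": "Normal", "blk_4": "Normal"}
train, tn, ta = split_hdfs(seqs, labs, train_ratio=0.34)
print(len(train), len(tn), len(ta))  # 1 2 1
```

Only normal sequences go into training because DeepLog-style models learn the normal workflow and flag deviations at test time.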