Update network_cnc_generic.py & pdf_annot_urls.py
Excluded IP addresses belonging to MICROSOFT-CORP-MSN-AS-BLOCK from verification, to avoid triggering the signature unnecessarily when the analysis machine is connected to the internet.
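A rough sketch of how such an exclusion might work, assuming the Microsoft public IP ranges have been saved locally as one CIDR per line; the file name and the helpers `load_ms_networks`/`is_microsoft_ip` are illustrative, not the signature's actual code:

```python
import ipaddress

def load_ms_networks(path="microsoft_public_ip_space.txt"):
    """Load one CIDR range per line into ip_network objects."""
    with open(path) as fh:
        return [ipaddress.ip_network(line.strip(), strict=False)
                for line in fh if line.strip()]

MS_NETWORKS = load_ms_networks()

def is_microsoft_ip(ip_str):
    """True if the address falls inside any Microsoft-owned range."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in MS_NETWORKS)
```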
With the ability to extract domains from URLs, we can now counter the bypass techniques employed by malware. Malicious actors often abuse legitimate domains or use redirection to evade detection; analyzing URLs and extracting their domains lets us catch these evasion strategies. After confirming that the URL's TLD is not on the malicious-TLD list, we extract any hidden domains embedded within the URL and then check the extracted domains against DNSBL blacklists.
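A minimal sketch of that flow, assuming a static bad-TLD list and Spamhaus DBL as the DNSBL; `BAD_TLDS`, `check_url`, and the other names here are illustrative, not the signature's actual identifiers:

```python
import socket
from urllib.parse import urlparse

BAD_TLDS = {".xyz", ".top", ".click"}  # example entries only

def extract_domain(url):
    """Pull the hostname out of a URL, ignoring scheme, path and port."""
    return (urlparse(url).hostname or "").lower()

def is_blacklisted(domain, dnsbl="dbl.spamhaus.org"):
    """A DNSBL listing resolves to an address; NXDOMAIN means not listed."""
    try:
        socket.gethostbyname(f"{domain}.{dnsbl}")
        return True
    except socket.gaierror:
        return False

def check_url(url):
    """Flag on a bad TLD first, then fall back to the DNSBL lookup."""
    domain = extract_domain(url)
    if any(domain.endswith(tld) for tld in BAD_TLDS):
        return True
    return is_blacklisted(domain)
```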
You're putting different signatures in the same PR; please keep them separated. As for MS, I did update it to be more efficient. Also, where can I review that all the IP ranges are MS-owned?
Also, while this is a nice feature, I guess it's time to add some signatures config, as I don't want any signature reaching out to external services on my sandbox, for two reasons: leaking the IP and delaying analysis.
I included it in the description; you can find it at https://www.microsoft.com/en-us/download/details.aspx?id=53602. I'm actually unsure about the best location for storing new CSV/TXT files, whether that's the 'extra' directory or the 'data' directory, since the two folders are in different projects. I've also given you the ability to edit the PR.
I've run over 23 tests and tried to improve the malicious URL checking. Relying solely on bad top-level domains (TLDs) and suspicious extensions isn't always effective, especially when legitimate TLDs are used, which is why I experimented with checking domains/IPs against external services (DNSBL). I understand your concern about analysis delays, which is why I used threading; the signature's current timing is 0.158|0.05 pdf_annot_urls_checker. It would also be useful to add a signature config option that controls whether external services may be contacted, for confidentiality. What if we introduce a new capability in abstracts.py, such as a check_ip function, and add the new feature as check_ip_dnsbl? That way we can avoid redundancy.
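A short sketch of the threaded lookups described above, using `concurrent.futures`; `check_domains` and the worker count are illustrative choices, and `is_blacklisted` is the hypothetical DNSBL helper sketched earlier:

```python
from concurrent.futures import ThreadPoolExecutor

def check_domains(domains, checker, max_workers=8):
    """Run the network-bound DNSBL checks in parallel to limit added delay."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # dict() forces evaluation of the lazy map before the pool shuts down
        return dict(zip(domains, pool.map(checker, domains)))

# e.g. check_domains(extracted_domains, is_blacklisted)
```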
Yes, it can be moved to abstracts.py, and yes, something like allow_external_services_check or similar can be added to the config somewhere. I'll have to think about where, and what the proper name to use for that is.
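A minimal sketch of how such a gate could look, assuming a hypothetical allow_external_services_check option exposed through the signature's options; the option's actual name and location are still to be decided, as noted above:

```python
import socket

def check_ip_dnsbl(self, ip, dnsbl="zen.spamhaus.org"):
    """Run a DNSBL lookup only when external checks are enabled in config."""
    if not self.options.get("allow_external_services_check", False):
        return None  # external checks disabled: report nothing
    # DNSBLs expect the IPv4 octets reversed, e.g. 4.3.2.1.zen.spamhaus.org
    reversed_ip = ".".join(reversed(ip.split(".")))
    try:
        socket.gethostbyname(f"{reversed_ip}.{dnsbl}")
        return True
    except socket.gaierror:
        return False
```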