Cortex
Can't run any analyzer
Describe the bug Running any analyzer in Cortex gives the following error:
Traceback (most recent call last):
  File "Cyberprotect/CyberprotectAnalyzer.py", line 46, in <module>
    CyberprotectAnalyzer().run()
  File "Cyberprotect/CyberprotectAnalyzer.py", line 12, in __init__
    Analyzer.__init__(self)
  File "/usr/local/lib/python3.8/site-packages/cortexutils/analyzer.py", line 17, in __init__
    Worker.__init__(self, job_directory)
  File "/usr/local/lib/python3.8/site-packages/cortexutils/worker.py", line 31, in __init__
    self._input = json.load(sys.stdin)
  File "/usr/local/lib/python3.8/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
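This particular error means json.load received empty input: the analyzer process starts, but no JSON job object reaches it. The failure is easy to reproduce in isolation:

```python
import io
import json

# json.load on empty input raises exactly the error in the traceback above,
# which is what happens when the analyzer's stdin (or input file) is empty.
try:
    json.load(io.StringIO(""))
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```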
To Reproduce Steps to reproduce the behavior:
- Start Cortex version 3.1.1-1
- Add any analyzer
- Run the analyzer
Expected behavior The analyzer returns a report for the observables.
Environment:
- Docker version 20.10.5, build 55c4c88
- Ubuntu Server 20.04
Additional Context docker-compose.yml:
version: "2"
services:
  elasticsearch:
    image: elasticsearch:7.8.1
    environment:
      - http.host=0.0.0.0
      - discovery.type=single-node
      - script.allowed_types=inline
      - thread_pool.search.queue_size=100000
      - thread_pool.write.queue_size=10000
  cortex:
    image: thehiveproject/cortex:latest
    environment:
      - job_directory=./cortex-jobs
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      #- ./tmp:/tmp
      #- ${job_directory}:${job_directory}
    depends_on:
      - elasticsearch
    ports:
      - "0.0.0.0:9001:9001"
I have the same error
I'm seeing the same issue with the Shodan integration.
I have the same issue. What seems to be happening is that Cortex doesn't pass the JSON job object to the analyzers.
I logged in to the container and ran an analyzer manually from the command line, and it works. So the issue doesn't come from the analyzers but from how Cortex calls them.
Here is how I successfully ran an analyzer from inside the container (type the JSON on stdin, then press CTRL+D to close it):
./myAnalyzer.py
{ "data": "8.8.8.8", "dataType": "ip" }
CTRL+D
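The same stdin contract can be exercised from pure Python. This is a sketch: the inline stub below is a hypothetical stand-in for an analyzer, not the real myAnalyzer.py (whose parsing lives in cortexutils).

```python
import json
import subprocess
import sys
import textwrap

# Minimal stand-in for an analyzer that reads its job object from stdin,
# mirroring the manual invocation above (the script body is hypothetical).
stub = textwrap.dedent("""
    import json, sys
    job = json.load(sys.stdin)
    print(job["dataType"], job["data"])
""")

result = subprocess.run(
    [sys.executable, "-c", stub],
    input='{"data": "8.8.8.8", "dataType": "ip"}',
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # -> ip 8.8.8.8
```

If the stub received an empty stdin instead, it would die with the same JSONDecodeError shown in the traceback above.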
My issue was caused by our custom Docker configuration in /etc/docker/daemon.json. Specifically:
- log-driver set to syslog
- userns-remap set to some other user
Together, these two settings broke how Cortex launches analyzer jobs via Docker.
I solved this issue by upgrading the cortexutils Python library.
Cortex 3 passes in this data via a file now. See: #176
You may need to update cortexutils as it supports reading the input from the input.json file in addition to stdin.
Cortex 3 now calls your analyzer as ./myAnalyzer.py /{job-directory}/cortex-<randomstring>/input/input.json. The cortexutils library reads the input.json file in its __init__ method.
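The dual behavior described above (job-directory argument vs. stdin) can be sketched roughly like this. This is an illustration of the logic, not the actual cortexutils source, and load_job_input is a hypothetical name:

```python
import json
import os
import sys

def load_job_input(argv=None):
    # If a job directory is passed as the first argument, read the job object
    # from <job-dir>/input/input.json (Cortex 3 style); otherwise fall back to
    # reading the JSON job object from stdin (older, stdin-based style).
    argv = sys.argv if argv is None else argv
    if len(argv) > 1 and os.path.isdir(argv[1]):
        with open(os.path.join(argv[1], "input", "input.json")) as f:
            return json.load(f)
    return json.load(sys.stdin)
```

A cortexutils that only implements the stdin branch, paired with a Cortex 3 that only writes input.json, gets an empty stdin and fails with exactly the JSONDecodeError from the original report.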
Otherwise, I believe omitting the job_directory environment variable and related settings reverts Cortex to passing the input data in via stdin. I haven't confirmed this though.
Another note: if you're mounting a host directory as the job directory within the container, make sure the permissions allow the account running Cortex inside the container to access it.
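A quick way to reason about that permissions check is to compare the directory's mode bits against the uid/gid Cortex runs as inside the container (find those with `docker exec <container> id`). The helper below is a rough sketch under that assumption; it ignores ACLs, capabilities, and secondary groups:

```python
import os

def can_access(path, uid, gid, want=os.R_OK | os.W_OK | os.X_OK):
    # Would a process with this uid/gid have the requested permissions on
    # path? os.R_OK/W_OK/X_OK map to the octal bits 4/2/1, so we can compare
    # the relevant permission triplet against `want` directly.
    st = os.stat(path)
    if uid == st.st_uid:
        bits = (st.st_mode >> 6) & 7   # owner bits
    elif gid == st.st_gid:
        bits = (st.st_mode >> 3) & 7   # group bits
    else:
        bits = st.st_mode & 7          # "other" bits
    return bits & want == want
```

For example, a ./cortex-jobs directory created by root with mode 0700 would fail this check for a non-root uid inside the container, which matches the symptom of analyzers getting no input.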