FACT_core
Submitting binaries always fails
The FACT version you are using
1b67926aa2f9b06b2592b9f685e47f115dff3341
Your question
My FACT installation is running, but submitting files always fails. No error is reported in the background, and the file size is 1.4 GB. I don't know where the problem is; please help.
It does not seem to be related to the size of the file; 800 KB files fail as well.
Hi, it is hard to tell what went wrong. When uploading large files, it may take a while until they appear in the web interface, because the file has to pass unpacking first. Since a "POST /upload" shows up in the log, the upload itself seems to have worked.
Could you try uploading a firmware using the REST endpoint to see whether the problem lies in the web frontend? You can view a description of the REST endpoints at the /doc endpoint. You can also use this script for testing the REST upload (better to test it with a small file first):
#!/usr/bin/env python3
from __future__ import annotations

from argparse import ArgumentParser
from base64 import b64encode
from pathlib import Path

import requests


def parse_args():
    parser = ArgumentParser(description="FACT REST firmware upload script")
    parser.add_argument("file", type=Path, help="the file to upload")
    parser.add_argument("--device_class", "-c", type=str, help="the device class", default="test_class")
    parser.add_argument("--device_name", "-n", type=str, help="the device name", default="test_device")
    parser.add_argument("--vendor", "-v", type=str, help="the vendor", default="test_vendor")
    parser.add_argument("--version", "-V", type=str, help="the firmware version", default="1")
    return parser.parse_args()


def main():
    args = parse_args()
    # the file contents must be base64 encoded and embedded in the JSON payload
    data: dict[str, str | list[str]] = {
        "binary": b64encode(args.file.read_bytes()).decode(),
        "device_class": args.device_class,
        "device_name": args.device_name,
        "device_part": "complete",
        "file_name": args.file.name,
        "release_date": "1970-01-01",
        "requested_analysis_systems": [],
        "vendor": args.vendor,
        "version": args.version,
    }
    response = requests.put("http://localhost:5000/rest/firmware", json=data)
    print(f"{response.status_code=}")
    print(f"{response.content=}")


if __name__ == '__main__':
    main()
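For example, assuming you save the script as rest_upload.py, you can call it like this:

python3 rest_upload.py firmware.bin -c router -n my_device -v my_vendor -V 1.0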
My service is running normally. Are large files prone to failing on upload? If I upload using your script, the service gets killed outright. If I upload through the web interface, the upload aborts by itself after a few minutes.
root@ubuntu-virtual-machine:/home/ubuntu/FACT-master# python3 update.py /home/ubuntu/payload.bin --device_class=test_class --device_name=test_device --vendor=test_vendor --version=1
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 445, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 440, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.10/http/client.py", line 1374, in getresponse
    response.begin()
  File "/usr/lib/python3.10/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 489, in send
    resp = conn.urlopen(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 755, in urlopen
    retries = retries.increment(
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 532, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/usr/lib/python3/dist-packages/six.py", line 718, in reraise
    raise value.with_traceback(tb)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 445, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 440, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.10/http/client.py", line 1374, in getresponse
    response.begin()
  File "/usr/lib/python3.10/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/ubuntu/FACT-master/update.py", line 40, in <module>
    main()
  File "/home/ubuntu/FACT-master/update.py", line 34, in main
    response = requests.put("http://localhost:5000/rest/firmware", json=data)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 130, in put
    return request("put", url, data=data, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 547, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
Does this happen just for the large file or also for small files? I uploaded a 1 GiB file successfully using the script. The trace that you posted looks like a timeout, though.
It looks like you are working inside a VM. Please be advised that unpacking and analyzing such large files takes a lot of RAM (probably more than 20 GB with the default worker settings).
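Note that the upload script itself also needs several times the file size in client-side RAM, since it reads the whole file into memory and base64 encodes it (which inflates the data by roughly 4/3) before sending it as JSON. Below is a minimal sketch for a rough client-side check before uploading; it assumes psutil is installed (pip install psutil), and the check_upload_memory helper and its overhead estimate are illustrative only, not part of FACT:

#!/usr/bin/env python3
# Rough estimate of whether the client has enough free RAM for the upload.
# This only covers the upload script itself, not FACT's unpacking/analysis.
from pathlib import Path
import sys

import psutil  # third-party; provides system memory statistics


def check_upload_memory(file_path: Path) -> None:
    file_size = file_path.stat().st_size
    # raw file bytes plus two base64-sized copies (~4/3 of the input each):
    # the encoded string and the serialized JSON request body
    estimated_need = file_size + (file_size * 4 // 3) * 2
    available = psutil.virtual_memory().available
    print(f"file size:      {file_size / 2**30:.2f} GiB")
    print(f"estimated need: {estimated_need / 2**30:.2f} GiB (client side only)")
    print(f"available RAM:  {available / 2**30:.2f} GiB")
    if available < estimated_need:
        sys.exit("not enough free RAM for the upload alone")


if __name__ == "__main__":
    check_upload_memory(Path(sys.argv[1]))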
Thank you, the upload was successful. It was indeed related to the RAM.
Sorry, now a new question: is my file so large that analyzing it causes the machine to restart, even with 64 GB of RAM?
When the memory filled up during analysis, the machine restarted. I had already increased the memory from 20 GB to 67 GB.
Much of the RAM usage comes from different worker processes (and external tools) accessing the file's contents in parallel. You can try to reduce the required RAM by lowering the number of workers for unpacking and for the analysis plugins. You can do this by editing the file src/config/main.cfg:
[unpack]
threads = 1
[plugin-defaults]
threads = 1
[cpu_architecture]
threads = 1
[cve_lookup]
threads = 1
...
What is more, you should select the "minimal" analysis plugin preset during upload. You can still run further analyses after the upload with the "update" button on the analysis page.
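If you upload with the REST script above, you can get a similar effect by requesting only the analyses you actually need in the payload. A sketch, assuming data is the dict from the upload script; the plugin names here are examples, and the exact list available on your installation can be looked up via the /doc endpoint:

# request only a minimal set of analyses instead of a full preset
# (plugin names are examples; check your installation for the exact names)
data["requested_analysis_systems"] = ["file_type", "file_hashes"]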