Execution client settings (e.g., MaxLogsPerResponse) not applied when using env file
Describe the bug
When attempting to apply execution client settings via the .env file (located at /sedge-data/.env), the settings do not take effect. For example, I tried increasing the MaxLogsPerResponse limit (default is 20,000) by adding the following to the env file:
NETHERMIND_JSONRPCCONFIG_MAXLOGSPERRESPONSE=300000
However, when running a node that requires more logs per response, I encountered the following error:
ERROR ThreadId(09) hopr_chain_rpc::client: request failed method="eth_getLogs" elapsed_in_ms=159 error=(code: -32005, message: Too many logs requested. Max logs per response is 20000., data: None)
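For what it's worth, a `.env` file next to `docker-compose.yaml` is normally only used by Compose for variable interpolation inside the compose file; for the variable to reach the Nethermind process itself, the compose file would typically have to forward it. A minimal sketch of what that forwarding could look like (the service name `execution` is an assumption, not taken from the generated file):

```yaml
services:
  execution:
    environment:
      # Forward the value from /sedge-data/.env into the container;
      # Nethermind reads NETHERMIND_*-prefixed variables as config overrides.
      - NETHERMIND_JSONRPCCONFIG_MAXLOGSPERRESPONSE=${NETHERMIND_JSONRPCCONFIG_MAXLOGSPERRESPONSE}
```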
What actually worked was modifying the docker-compose.yaml file (located at /sedge-data/docker-compose.yaml) and adding the following under the execution client section:
- --JsonRpc.MaxLogsPerResponse=300000
This resolved the issue.
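For context, this is roughly where the flag ended up in my compose file (a trimmed sketch; the service name and any surrounding flags are assumptions, not copied from the generated file):

```yaml
services:
  execution:
    command:
      # Raise Nethermind's eth_getLogs response cap (default 20,000):
      - --JsonRpc.MaxLogsPerResponse=300000
```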
To Reproduce
Steps to reproduce the behavior:
- Set up a Gnosis node without a validator using the Sedge tool.
- Edit the .env file and add: NETHERMIND_JSONRPCCONFIG_MAXLOGSPERRESPONSE=300000.
- Start the Gnosis node via Sedge and try to fetch logs with a response size greater than 20000 (see the sketch below).
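To check whether the new limit is active, an eth_getLogs request over a wide block range can be sent directly to the execution client's JSON-RPC endpoint, for example (the endpoint http://localhost:8545 and the block range are assumptions):

```sh
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"eth_getLogs","params":[{"fromBlock":"0x1","toBlock":"latest"}]}'
# With the default cap still in place, this returns error code -32005:
# "Too many logs requested. Max logs per response is 20000."
```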
Expected behavior
Execution and consensus client settings should be configurable via the .env file.
Server (please complete the following information):
- OS: Ubuntu 22.04
- Version: Sedge v1.7.2, downloaded with: curl -L https://github.com/NethermindEth/sedge/releases/download/v1.7.2/sedge-v1.7.2-linux-amd64 --output sedge
Can I jump on this task?
Hey @Jaguaras, thx for opening this issue. Did you try restarting the Nethermind client or the docker compose stack (docker compose restart) after updating the .env file?
Was the node stopped or running when you modified the .env?
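For reference, assuming the paths from the report, that would be something like:

```sh
docker compose -f /sedge-data/docker-compose.yaml restart
# or, to recreate the containers so changed environment values are re-read
# (plain `restart` reuses the existing containers as-is):
docker compose -f /sedge-data/docker-compose.yaml up -d
```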
Hey, sure, I did that, but it's still the same issue.