
API Returns 500 Error in Docker Environment but Works Locally

Open Hacklason opened this issue 1 year ago • 10 comments

Description:

I've encountered an issue where the API returns {"code":500,"message":"Request Timeout"} when running within a Docker container. However, the API functions correctly when run directly on my local machine using Node.js.

Steps to Reproduce:

  1. Run the Docker container with:
docker run -d -p 3000:3000 \
-e PORT=3000 \
-e browserLimit=20 \
-e timeOut=30000 \
zfcsoftware/cf-clearance-scraper:latest
  2. Make a request to the API endpoint (e.g., http://localhost:3000/cf-clearance-scraper).
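Step 2 can be sketched as a small Node script. Note this is only a sketch: the JSON body shape ({ url }) is an assumption about the scraper's request format, so check the project README for the real one.

```javascript
// Sketch of step 2: POST the target URL to the scraper endpoint.
// NOTE: the body shape ({ url }) is an assumption; see the project README.
const ENDPOINT = 'http://localhost:3000/cf-clearance-scraper';

// Build the fetch options separately so they are easy to inspect.
function buildRequest(targetUrl) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: targetUrl }),
  };
}

async function reproduce() {
  // Requires Node 18+ for the built-in fetch.
  const res = await fetch(ENDPOINT, buildRequest('https://example.com'));
  const data = await res.json();
  // In the failing Docker setup, the body is { code: 500, message: 'Request Timeout' }.
  console.log(res.status, data);
}
```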

Expected Behavior:

The API should return a successful response as it does when running locally.

Actual Behavior:

The API returns a 500 error with the message: {"code":500,"message":"Request Timeout"}.

Additional Information:

  • Local Environment:
    • Node.js version: v20.16.0
    • Operating System: EndeavourOS (Kernel: 6.6.40-1-lts)
  • Docker Environment:
    • Docker version: 27.0.3

Questions:

  1. Are there any known issues with running this API in Docker?
  2. Could there be specific Docker configurations affecting the API behavior?

Any insights or suggestions would be greatly appreciated!

Hacklason avatar Aug 01 '24 13:08 Hacklason

Quick update: I built the Docker image locally and it works fine. The issue was with the zfcsoftware/cf-clearance-scraper:latest Docker image I was using before.

Hacklason avatar Aug 01 '24 21:08 Hacklason

Yup, I see the same thing.

  1. git clone https://github.com/zfcsoftware/cf-clearance-scraper.git
  2. docker build -t cf-clearance-scraper .
  3. docker run -d -p 3000:3000 -e PORT=3000 -e browserLimit=20 -e timeOut=30000 cf-clearance-scraper

Fixes the issue.

Edit: See my comments below. All of this was caused by my own errors; the dockerhub version is working just fine for me again.

krkeegan avatar Aug 02 '24 15:08 krkeegan

Note The dockerhub image did work yesterday, but stopped working after an automatic reboot this morning. Not sure if there is some bug related to the reboot.

krkeegan avatar Aug 02 '24 15:08 krkeegan

Note The dockerhub image did work yesterday, but stopped working after an automatic reboot this morning. Not sure if there is some bug related to the reboot.

That part seems weird because that image never worked properly on my end to begin with. Did you have the chance to see if that behavior happens with the manually built image?

Hacklason avatar Aug 02 '24 17:08 Hacklason

Note The dockerhub image did work yesterday, but stopped working after an automatic reboot this morning. Not sure if there is some bug related to the reboot.

It looks like your container does not start after the reboot.

cod888 avatar Aug 02 '24 17:08 cod888

it looks like your container does not start after reboot.

Edit: See next comment; I really have no idea what happened.

No, it restarts; it returned HTTP code 500.

{ code: 500, message: 'Request Timeout' }

krkeegan avatar Aug 02 '24 17:08 krkeegan

Ugh, I don't know what I did now. Everything seems to be working again for me.

Here are the things that I can verify:

  1. I run a script every ten minutes that will log an error if cf-clearance does not work.
  2. The dockerhub version worked yesterday with the final run working fine at 2:55 am.
  3. My machine rebooted at 3 am this morning.
  4. cf-clearance did not work at 3:05 am this morning, returning { code: 500, message: 'Request Timeout' }.
  5. It never worked after that.
  6. The locally built version worked.
  7. The locally built version continues to work through a restart.
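The ten-minute check described in point 1 can be sketched roughly like this. The endpoint and request body shape are assumptions; the in-band { code: 500 } failure signature is the one reported in this thread.

```javascript
// Hedged sketch of a periodic health check like the one described above.
// The endpoint and body shape are assumptions about the scraper's API;
// the { code: 500 } failure signature is the one seen in this thread.
const ENDPOINT = 'http://localhost:3000/cf-clearance-scraper';

// The scraper reports failure in-band: a JSON body with code 500.
function isHealthy(body) {
  return body != null && body.code !== 500;
}

async function check() {
  try {
    const res = await fetch(ENDPOINT, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ url: 'https://example.com' }), // assumed body shape
    });
    const body = await res.json();
    if (!isHealthy(body)) console.error('cf-clearance check failed:', body);
  } catch (err) {
    // A rejected fetch means the service itself is unreachable.
    console.error('cf-clearance unreachable:', err.message);
  }
}

// Run every ten minutes, e.g. from cron or:
// setInterval(check, 10 * 60 * 1000);
```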

I went back to try and recreate the error with the dockerhub version and did the following:

  1. Stopped the container
  2. Deleted the container
  3. Ran the container again with --restart always
  4. It worked
  5. Restarted
  6. It still works

I honestly don't know what to think. If the container isn't running, I get a fetch timeout, not a response of { code: 500, message: 'Request Timeout' }.
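The distinction being drawn here, that a dead container makes the client's fetch fail outright while a running scraper answers with an in-band { code: 500 } body, can be made explicit with a small classifier. The names and result shape are purely illustrative, not part of the project.

```javascript
// Classify an attempt against the scraper into the two failure modes
// discussed above. Names and result shape are illustrative only.
const DOWN = 'container-down';     // the fetch itself failed or timed out
const TIMEOUT = 'scraper-timeout'; // service answered with { code: 500 }
const OK = 'ok';

// `result` is a hypothetical record: { fetchError } if fetch rejected,
// or { body } holding the parsed JSON response otherwise.
function classify(result) {
  if (result.fetchError) return DOWN;           // connection refused / timeout
  if (result.body && result.body.code === 500) return TIMEOUT;
  return OK;
}
```

With this framing, a { code: 500 } body is positive evidence that some container was up and answering, which is consistent with the older-version explanation below.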

Possibly I did something else different, or maybe it will break again after some time.

I dunno, sorry for the confusion.

krkeegan avatar Aug 02 '24 18:08 krkeegan

Funny enough, there are some websites on which cf-clearance-scraper just shits itself and returns a Request Timeout, especially if the website happens to NOT be protected by Cloudflare (like https://httpbin.org).

@krkeegan Could you try and fetch "https://httpbin.org/headers" using cf-clearance-scraper and check if you get this same problem?

Note: I'm using an HTTP/HTTPS proxy for all my requests.

Hacklason avatar Aug 02 '24 19:08 Hacklason

OK, I figured out how I was dumb.

I am pretty sure that when my machine rebooted, it restarted an older version of cf-clearance. That is why it was still running but returning a 500 error.

So, nothing to see here, just my own mistakes.

@krkeegan Could you try and fetch "https://httpbin.org/headers" using cf-clearance-scraper and check if you get this same problem?

Yup, agreed: that returns { code: 500, message: 'Request Timeout' } for me as well.

krkeegan avatar Aug 02 '24 19:08 krkeegan

@krkeegan Could you try and fetch "https://httpbin.org/headers" using cf-clearance-scraper and check if you get this same problem?

Yup, agreed: that returns { code: 500, message: 'Request Timeout' } for me as well.

Thanks a bunch. I might try to debug it another day

Hacklason avatar Aug 02 '24 20:08 Hacklason

The library has been updated and should now run smoothly. I am closing this issue; please let me know if you have any problems by writing a message in this issue. Thank you.

mdervisaygan avatar Aug 28 '24 16:08 mdervisaygan