
Page.goto: Timeout 60000ms exceeded.

Open jmontoyavallejo opened this issue 1 year ago • 1 comments

Could you please add the possibility to change the timeout? In some places and containers, a page can take more than 60 seconds to load. See crawl4ai/crawl4ai/async_crawler_strategy.py, line 251:

response = await page.goto(url, wait_until="domcontentloaded", timeout=60000)
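The problem with the hardcoded value above can be shown with a minimal, self-contained sketch (the names `slow_goto` and `goto_with_timeout` are hypothetical stand-ins, not crawl4ai internals): a fixed ceiling fails on slow pages, while a caller-supplied `timeout_ms` parameter lets the same navigation succeed.

```python
import asyncio

async def slow_goto(url: str) -> str:
    # Stand-in for page.goto on a slow page: takes 0.2 s to "load".
    await asyncio.sleep(0.2)
    return f"loaded {url}"

async def goto_with_timeout(url: str, timeout_ms: float = 60_000) -> str:
    # asyncio.wait_for takes seconds; Playwright-style APIs take milliseconds.
    return await asyncio.wait_for(slow_goto(url), timeout=timeout_ms / 1000)

async def main():
    # A ceiling below the page's load time raises a timeout...
    try:
        await goto_with_timeout("https://example.com", timeout_ms=100)
    except asyncio.TimeoutError:
        print("timed out")
    # ...while a larger, caller-chosen ceiling succeeds.
    print(await goto_with_timeout("https://example.com", timeout_ms=1_000))

asyncio.run(main())
```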

jmontoyavallejo avatar Oct 11 '24 19:10 jmontoyavallejo

[ERROR] 🚫 Failed to crawl "link", error: Failed to crawl "link": Page.goto: Timeout 60000ms exceeded. Call log: navigating to "link", waiting until "domcontentloaded"

jmontoyavallejo avatar Oct 11 '24 19:10 jmontoyavallejo

Hi @jmontoyavallejo, we've already added this feature. Thank you for the suggestion; it's a really good idea. We haven't released the new version yet, but you can pull from the "0.3.6" branch for now. By Monday or Tuesday, we will also update the library on PyPI.

unclecode avatar Oct 12 '24 05:10 unclecode


How do I set the timeout after updating to the new version?

dhifafaz avatar Oct 16 '24 03:10 dhifafaz

Passing it in the invocation, like crawler.arun(..., page_timeout=120000), worked for me.
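A minimal sketch of how such a `page_timeout` keyword could be forwarded down to the navigation call. The `StubPage`/`StubCrawler` classes below are illustrative stand-ins, not crawl4ai's actual internals; they only demonstrate the forwarding pattern.

```python
import asyncio

class StubPage:
    async def goto(self, url, wait_until="domcontentloaded", timeout=60_000):
        # Echo the timeout back so we can see it was forwarded.
        return {"url": url, "timeout": timeout}

class StubCrawler:
    def __init__(self):
        self.page = StubPage()

    async def arun(self, url, page_timeout=60_000, **kwargs):
        # Forward the caller's page_timeout instead of hardcoding 60000.
        return await self.page.goto(url, timeout=page_timeout)

result = asyncio.run(StubCrawler().arun("https://example.com", page_timeout=120_000))
print(result["timeout"])  # 120000
```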

jmontoyavallejo avatar Oct 16 '24 14:10 jmontoyavallejo

@jmontoyavallejo We will update our website docs very soon to include details on the timeout.

unclecode avatar Oct 18 '24 11:10 unclecode