
The All in One Framework to build Awesome Scrapers.

Results: 115 botasaurus issues

---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Cell In[1], line 3
      1 from botasaurus import *
----> 3 @browser
      4 def scrape_heading_task(driver: AntiDetectDriver, data):
      5     # Navigate to the Omkar...
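
If the installed version is botasaurus 4.x, the `browser` decorator is no longer exported by the top-level package, which would explain the `NameError`. A minimal sketch assuming the 4.x layout, where the decorator and driver live in `botasaurus.browser` (close to the project's own README example):

```python
# Sketch assuming botasaurus 4.x, where the decorator moved to a submodule.
from botasaurus.browser import browser, Driver

@browser
def scrape_heading_task(driver: Driver, data):
    # Navigate to the target page (URL is illustrative) and grab the heading.
    driver.get("https://www.omkar.cloud/")
    heading = driver.get_text("h1")
    return {"heading": heading}

scrape_heading_task()
```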

When using botasaurus, it always prints a "Running" message (`decorators_common.py -> print_running()`) to the console. There is no config option to turn it off, making it impossible to use botasaurus...
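
Until the library exposes a flag for this, one workaround is to silence stdout around the call with the standard library. This is a generic Python sketch, not a documented botasaurus option, and it swallows your own prints too; `scrape_heading_task` is the decorated task from the snippet above:

```python
import contextlib
import io

# Redirect stdout while the task runs so the hard-coded
# "Running" message never reaches the console.
with contextlib.redirect_stdout(io.StringIO()):
    result = scrape_heading_task()
```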

Cloudflare bypass failed on https://search.lionairthai.com/SL/UserProfile/ManageAddons.aspx

I am having an issue while testing the latest version of botasaurus. I set up a venv on my PC, installed botasaurus with `pip install botasaurus`, and ran the following code, but keep...

In Selenium, the equivalent of run_cdp_command is `driver.execute_cdp_cmd("cmd", {"key": values})`. I tried `driver.run_cdp_command("cmd", {"key": values})`, `driver.run_cdp_command({"cmd": {"key": values}})`, `driver.run_cdp_command(["cmd", {"key": values}])`, and `driver.run_cdp_command('"cmd" {"key": values}')`, but all...
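
A sketch that sidesteps the question, assuming a botasaurus 3.x install where `AntiDetectDriver` subclasses Selenium's Chrome WebDriver (in that case the inherited `execute_cdp_cmd` should be callable directly); the command name and params here are just the standard CDP `Network.setUserAgentOverride` call, chosen for illustration:

```python
from botasaurus import browser, AntiDetectDriver

@browser
def run_cdp_example(driver: AntiDetectDriver, data):
    # Assumption: AntiDetectDriver extends Selenium's Chrome WebDriver,
    # so Selenium's own execute_cdp_cmd is available on it.
    # The command name and params follow the Chrome DevTools Protocol spec.
    return driver.execute_cdp_cmd(
        "Network.setUserAgentOverride",
        {"userAgent": "Mozilla/5.0 (example)"},
    )

run_cdp_example()
```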

Cloudflare seems to be able to detect Botasaurus. There is no IP ban and no OS-related problem; I see the same behavior on Windows 11 and on Ubuntu...

By default, when saving to JSON, non-ASCII characters are escaped in the form "\u0438". Can an option be added to save in UTF-32 or UTF-8 encoding without escaping?
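
As a workaround until such an option exists, the data a task returns can be written out manually with the standard library; `ensure_ascii=False` keeps characters like "и" literal instead of escaping them. A generic Python sketch (the file name and payload are illustrative):

```python
import json

data = {"city": "Новосибирск"}  # sample non-ASCII payload

# ensure_ascii=False writes characters literally instead of \uXXXX escapes;
# the file itself is opened with an explicit UTF-8 encoding.
with open("output.json", "w", encoding="utf-8") as f:
    json.dump(data, f, ensure_ascii=False, indent=2)
```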

Hello there. It would be cool if we could pass an XPath as well as a CSS selector to most of the methods, like `wait_for_element`.
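
In the meantime, assuming the driver exposes Selenium's standard API, XPath waits are available through `WebDriverWait` directly. A sketch (the helper name and timeout are illustrative, not part of botasaurus):

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.remote.webdriver import WebDriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def wait_for_xpath(driver: WebDriver, xpath: str, timeout: float = 10):
    # Wait up to `timeout` seconds for an element located by XPath.
    return WebDriverWait(driver, timeout).until(
        EC.presence_of_element_located((By.XPATH, xpath))
    )
```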

Is there any way to set a depth limit for the links to be crawled? A similar feature exists in Scrapy (DEPTH_LIMIT) that controls how many levels deep the spider will go...
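
One generic way to emulate Scrapy's DEPTH_LIMIT, independent of any botasaurus setting, is to carry the depth alongside each URL and stop following links past the limit. A minimal sketch; `fetch_links` is a hypothetical helper wrapping whatever task actually loads a page and extracts its links:

```python
from collections import deque

def crawl(seed_url, fetch_links, max_depth=2):
    # Breadth-first crawl that follows links at most max_depth levels
    # below the seed, similar in spirit to Scrapy's DEPTH_LIMIT.
    seen = {seed_url}
    queue = deque([(seed_url, 0)])
    visited = []
    while queue:
        url, depth = queue.popleft()
        links = fetch_links(url)  # the page is scraped here
        visited.append(url)
        if depth == max_depth:
            continue  # links found on a max-depth page are not followed
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return visited
```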

How can I write something like this: `driver.wait_for_element(selector, EC.element_to_be_clickable).click()`?
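
Assuming the driver exposes Selenium's standard API, the same effect can be had with `WebDriverWait` and a clickable condition. A sketch (the helper name, selector type, and timeout are illustrative):

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.remote.webdriver import WebDriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def wait_and_click(driver: WebDriver, css_selector: str, timeout: float = 10):
    # Wait until the element is clickable, then click it in one chain.
    WebDriverWait(driver, timeout).until(
        EC.element_to_be_clickable((By.CSS_SELECTOR, css_selector))
    ).click()
```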