
Python package to scrape product review data from Amazon.

8 amazon-product-review-scraper issues

```
from amazon_product_review_scraper import amazon_product_review_scraper
review_scraper = amazon_product_review_scraper(amazon_site="amazon.ca", product_asin="B09NPS8ZXC")
```
```
IDE and Envs\Anaconda3\lib\site-packages\amazon_product_review_scraper\amazon_product_review_scraper.py in __init__(self, amazon_site, product_asin, sleep_time, start_page, end_page)
     27         self.reviews_dict = {"date_info":[], "name":[], "title":[], "content":[], "rating":[]}
     28 ...
```

The proxy generator function is not working: they have removed the id attribute from the tables in the HTML, so when we try soup.find(id='proxylisttable') it returns None, so we...
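Since the page no longer carries the old `id="proxylisttable"` attribute, one workaround is to stop matching on the id and grab the table by tag instead. The sketch below assumes the proxy table is still the first `<table>` on the page (the sample HTML here is a hypothetical stand-in for the real page):

```python
from bs4 import BeautifulSoup

# Hypothetical stand-in for the fetched proxy-list page; the real page
# has more columns, but the structure (thead + tbody rows) is the same.
html = """
<table class="table">
  <thead><tr><th>IP Address</th><th>Port</th></tr></thead>
  <tbody>
    <tr><td>203.0.113.5</td><td>8080</td></tr>
    <tr><td>198.51.100.7</td><td>3128</td></tr>
  </tbody>
</table>
"""

soup = BeautifulSoup(html, "html.parser")
# soup.find(id="proxylisttable") now returns None, so fall back to the
# first <table> element instead of relying on the removed id.
table = soup.find("table")
proxies = [
    ":".join(td.get_text() for td in row.find_all("td"))
    for row in table.tbody.find_all("tr")
]
print(proxies)  # ['203.0.113.5:8080', '198.51.100.7:3128']
```

Matching on the tag (or on a stable class like `table`) is more brittle than an id, so it is worth asserting that the header row looks like a proxy table before trusting the rows.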

```
Error occurred during loading data. Trying to use cache server https://fake-useragent.herokuapp.com/browsers/0.1.11
Traceback (most recent call last):
  File "C:\Users\Administrator\anaconda3\lib\site-packages\fake_useragent\utils.py", line 154, in load
    for item in get_browsers(verify_ssl=verify_ssl):
  File "C:\Users\Administrator\anaconda3\lib\site-packages\fake_useragent\utils.py", line 99, ...
```
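This traceback comes from fake_useragent failing to reach its (now defunct) herokuapp cache server at import time. A minimal defensive sketch, assuming any static fallback string is acceptable, is to wrap the lookup and fall back to a hardcoded User-Agent when the library cannot load its data (`safe_user_agent` is a hypothetical helper, not part of the package):

```python
def safe_user_agent(fallback="Mozilla/5.0 (Windows NT 10.0; Win64; x64)"):
    """Return a random User-Agent if fake_useragent works, else a static one.

    Catches both ImportError (library missing) and the loading errors seen
    above when the cache server is unreachable.
    """
    try:
        from fake_useragent import UserAgent
        return UserAgent().random
    except Exception:
        return fallback

print(safe_user_agent())
```

A rotating User-Agent mainly matters for avoiding blocks; for debugging the scraper itself, the static fallback is usually good enough.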

```python
from amazon_product_review_scraper import amazon_product_review_scraper
review_scraper = amazon_product_review_scraper(amazon_site="amazon.in", product_asin="1475096062")
reviews_df = review_scraper.scrape()
```
```python
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in
      1 from amazon_product_review_scraper import amazon_product_review_scraper
      2 review_scraper = ...
```

Hi, I'm just running the demo code as-is:
```
from amazon_product_review_scraper import amazon_product_review_scraper
review_scraper = amazon_product_review_scraper(amazon_site="amazon.in", product_asin="B07X6V2FR3")
reviews_df = review_scraper.scrape()
reviews_df.head(5)
```
But I am getting back a "Service Unavailable for...
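A "Service Unavailable" (HTTP 503) response usually means Amazon is throttling or blocking the requests, and a single retry often succeeds. A generic retry-with-backoff sketch that could wrap the `scrape()` call (the helper below is hypothetical, not part of the package; whether retrying actually clears the block depends on Amazon's rate limiting):

```python
import time

def retry_with_backoff(fn, retries=4, base_delay=2.0):
    """Call fn(); on failure sleep base_delay, 2x, 4x, ... then retry.

    Re-raises the last exception once all retries are exhausted.
    """
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage against the scraper (not run here):
# reviews_df = retry_with_backoff(review_scraper.scrape)
```

Raising the package's own `sleep_time` constructor parameter (visible in the `__init__` signature in the traceback above) is another lever for slowing requests down before resorting to retries.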