Instagram-Comments-Scraper

Instagram comment scraper using Python and Selenium. Saves the comments to Excel.

Results: 10 Instagram-Comments-Scraper issues

Getting a TypeError while running scraper.py

Would it be possible to enter a username and have it collect comments from all of that account's posts? How? Thanks

I always get this: `DevTools listening on ws://127.0.0.1:1067/devtools/browser/c5dd8a90-5b53-43b0-bdc6-b9233dc94dca Found Message: no such element: Unable to locate element: {"method":"css selector","selector":".MGdpg > button:nth-child(1)"} (Session info: chrome=94.0.4606.61)` My parameter is 5 and time...

When I run the code after a full setup, it consistently states "Message: no such element: Unable to locate element: {"method":"css selector","selector":".MGdpg > button:nth-child(1)"}". Has this issue come up...
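The two "no such element" reports above point at the same cause: Instagram's generated class names (such as `MGdpg`) change across UI updates, so the hard-coded selector goes stale. A minimal sketch of a more defensive pattern, assuming Selenium 4 (`WebDriverWait` plus `expected_conditions`) and keeping the repo's selector as the default:

```python
# The selector reported in the issues above; it may no longer match
# Instagram's current markup.
LOAD_MORE_SELECTOR = ".MGdpg > button:nth-child(1)"

def click_load_more(driver, selector=LOAD_MORE_SELECTOR, timeout=10):
    """Click the 'load more comments' button if it becomes clickable
    within `timeout` seconds; return False instead of crashing when
    the element never appears (all comments loaded, or stale selector)."""
    # Imports are local so this module can be loaded without Selenium;
    # Selenium is only required when the helper is actually called.
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    try:
        button = WebDriverWait(driver, timeout).until(
            EC.element_to_be_clickable((By.CSS_SELECTOR, selector))
        )
        button.click()
        return True
    except Exception:
        return False
```

Wrapping the click this way turns the hard crash into a clean stop condition for the scraping loop; when the selector itself is stale, the fix is to inspect the post page and update `LOAD_MORE_SELECTOR`.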

Running it always errors out: > Traceback (most recent call last): File "scraper.py", line 5, in driver = webdriver.Chrome() File "/root/Instagram-Comments-Scraper/.venv/lib/python2.7/site-packages/selenium/webdriver/chrome/webdriver.py", line 81, in __init__ desired_capabilities=desired_capabilities) File "/root/Instagram-Comments-Scraper/.venv/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py", line 157,...
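The traceback above shows a Python 2.7 virtualenv and an old Selenium, where `webdriver.Chrome()` fails when no chromedriver binary is found on the PATH. A sketch of a more forgiving startup, assuming Python 3 and Selenium 4 (where Selenium 4.6+ ships Selenium Manager and can fetch a matching driver automatically); the fallback path handling is illustrative, not part of the repo:

```python
import shutil

def find_chromedriver():
    """Return the chromedriver binary found on PATH, or None if absent."""
    return shutil.which("chromedriver")

if __name__ == "__main__":
    # Selenium import kept under the main guard so the PATH check above
    # runs even where Selenium is not installed.
    from selenium import webdriver
    from selenium.webdriver.chrome.service import Service

    path = find_chromedriver()
    if path:
        # Explicit Service for setups with a manually installed driver.
        driver = webdriver.Chrome(service=Service(path))
    else:
        # Selenium 4.6+ resolves a matching driver via Selenium Manager.
        driver = webdriver.Chrome()
```

On the Python 2.7 setup shown in the traceback, the first step is recreating the virtualenv with Python 3, since current Selenium releases no longer support Python 2.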

Agi, does the scraper have a limit? I scraped a photo with 500 comments, but it stopped at comment 10.

I am getting stuck at the login screen, and the documentation doesn't really help at this point.

Hi, I have this error and need help: `python scraper.py https://www.instagram.com/p/CqT0ukiMbPf/ 5` DevTools listening on ws://127.0.0.1:55766/devtools/browser/466cc3f1-af4c-456b-95e8-8542b7cfb023 [10536:5680:0329/115639.286:ERROR:device_event_log_impl.cc(222)] [11:56:39.286] USB: usb_device_handle_win.cc:1046 Failed to read descriptor from node connection: Uno de los dispositivos...

I get the following error message when I try to run scraper.py. I have the most recent driver installed and Google Chrome updated. I tried manually putting in the driver...