
selenium.common.exceptions.ElementClickInterceptedException

gyunggyung opened this issue 4 years ago · 4 comments

I ran python crawler.py hashtag -t 연습 -o ./output -n 15 and got the following error:

DevTools listening on ws://127.0.0.1:8393/devtools/browser/a7d1144b-bb0e-45c4-8249-0aea43743ed7
Traceback (most recent call last):
  File "crawler.py", line 96, in <module>
    get_posts_by_hashtag(args.tag, args.number or 100, args.debug), args.output
  File "crawler.py", line 42, in get_posts_by_hashtag
    ins_crawler = InsCrawler(has_screen=debug)
  File "C:\Users\hwnau\Desktop\test\instagram-crawler-master\inscrawler\crawler.py", line 70, in __init__
    self.login()
  File "C:\Users\hwnau\Desktop\test\instagram-crawler-master\inscrawler\crawler.py", line 87, in login
    login_btn.click()
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webelement.py", line 80, in click
    self._execute(Command.CLICK_ELEMENT)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webelement.py", line 628, in _execute
    return self._parent.execute(command, params)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 312, in execute
    self.error_handler.check_response(response)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.ElementClickInterceptedException: Message: element click intercepted: Element <button class="sqdOP  L3NKy   y3zKF     " disabled="" type="submit">...</button> is not clickable at point (391, 243). Other element would receive the click: <div class="                    Igw0E     IwRSH      eGOV_         _4EzTm    bkEs3                          CovQj                  jKUp7          DhRcB                                                    ">...</div>
  (Session info: headless chrome=81.0.4044.138)

gyunggyung · May 19 '20

same here

gabrofig · May 20 '20

same question

Feywell · May 26 '20

same error

hyemmie · Jun 07 '20

I ran into the same problem, but after reading some of the code I figured out it was because I hadn't provided my auth information: the login button stays disabled until credentials are filled in, which is why the click is intercepted. There are several ways to provide your account info. I filled mine in to 'inscrawler/secret.py', but you can also export them as environment variables.
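To make the environment-variable route concrete, here is a minimal sketch of reading credentials before starting the crawler and failing early with a clear message when they are missing. The variable names INSTAGRAM_USERNAME and INSTAGRAM_PASSWORD are hypothetical; check inscrawler/secret.py in your checkout for the exact names this crawler actually reads.

```python
import os


def load_credentials():
    """Read Instagram credentials from the environment.

    NOTE: the variable names below are assumptions for illustration;
    inscrawler/secret.py defines the names the crawler really uses.
    """
    username = os.environ.get("INSTAGRAM_USERNAME")
    password = os.environ.get("INSTAGRAM_PASSWORD")
    if not (username and password):
        # Fail before Selenium starts, instead of clicking a disabled
        # login button and raising ElementClickInterceptedException.
        raise RuntimeError(
            "Missing credentials: set INSTAGRAM_USERNAME and "
            "INSTAGRAM_PASSWORD, or fill in inscrawler/secret.py."
        )
    return username, password
```

Checking for credentials up front turns the confusing "element click intercepted" traceback into an obvious configuration error.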

uniglot · Jun 21 '20