pyppeteer
pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed.
I ran the demo on Windows 7 with Python 3.7, and I downloaded Chromium with the 'pyppeteer-install' command. The script is:

import asyncio
from pyppeteer import launch

async def main():
    browser = await launch(headless=False)
    page = await browser.newPage()
    await page.goto('https://www.google.com')
    await page.screenshot({'path': 'example.png'})
    await browser.close()

asyncio.get_event_loop().run_until_complete(main())
but it raises the error below:
Traceback (most recent call last):
File "F:/run.py", line 28, in
Process finished with exit code 1
Same problem here, and I followed this to fix my code like this:
async def spider(self, url):
    self.log.detail_info(f"[*] Start crawl {url}")
    self.log.detail_info(f"[*] {url} started at {time.strftime('%X')}")
    # Handle Error: pyppeteer.errors.NetworkError: Protocol error Runtime.callFunctionOn: Target closed.
    browser = await launch({
        'args': ['--no-sandbox']
    })
    page = await browser.newPage()
    await page.setViewport(self.set_view_port_option)
    await page.goto(url, self.goto_option)
    title = await page.title()
    filename = await self.translate_word(title)
    await page.evaluate(scroll_page_js)
    pdf = await page.pdf(self.pdf_option)
    await browser.close()
    self.log.detail_info(f"[*] {url} finished at {time.strftime('%X')}")
    return filename, pdf
But it doesn't work for me. This answer said the error would happen if you call browser.close() while browser.newPage() has yet to resolve.
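To make that concrete, here is a minimal sketch of the race being described (the launch flags and names are only placeholders): closing the browser while newPage() is still pending tears the new target down, and pyppeteer rejects the pending call.

import asyncio
from pyppeteer import launch

async def broken():
    browser = await launch(args=['--no-sandbox'])
    page_task = asyncio.ensure_future(browser.newPage())  # not awaited yet
    await browser.close()  # closes the target out from under newPage()
    await page_task        # typically raises NetworkError: Target closed

asyncio.get_event_loop().run_until_complete(broken())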
Has your problem been solved?
I have solved my problem by rewriting the JS code as Python code like this:
async def scroll_page(page):
    cur_dist = 0
    height = await page.evaluate("() => document.body.scrollHeight")
    while True:
        if cur_dist < height:
            await page.evaluate("window.scrollBy(0, 500);")
            await asyncio.sleep(0.1)
            cur_dist += 500
        else:
            break
I guess the problem is that pyppeteer has a default running timeout when executing JS code, taking a screenshot, or saving a page to PDF. According to my tests, this default timeout is about 20 seconds. So when your work has been running for more than 20 seconds, that is, when it times out, the running work is automatically closed, and this is independent of the timeout settings you passed to headless Chrome earlier.
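If you want to check this on your own setup, here is a self-contained sketch of a reproduction (the URL and the 30-second busy loop are only illustrative, and the ~20-second figure is just what I measured, not a documented constant):

import asyncio
from pyppeteer import launch

async def reproduce():
    browser = await launch(headless=True, args=['--no-sandbox'])
    page = await browser.newPage()
    await page.goto('https://example.com')
    # A single evaluate() that blocks for about 30 seconds; on affected setups
    # it fails partway through with NetworkError ("Target closed" /
    # "Session closed") instead of completing.
    await page.evaluate("() => { const end = Date.now() + 30000; while (Date.now() < end) {} }")
    await browser.close()

asyncio.get_event_loop().run_until_complete(reproduce())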
In my case the cause was the version of the websockets package: replacing websockets 7.0 with websockets 6.0 fixed it.
In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.
Run:
pip3 install websockets==6.0 --force-reinstall
and everything should be okay. =)
EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)
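If you are not sure which version actually ends up installed, a quick check from the same interpreter that runs pyppeteer (the printed values are just examples):

import websockets
print(websockets.__version__)  # e.g. '6.0' or '8.1' after reinstalling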
@boramalper I just ran into this issue. Similar problem with scraping a webpage for longer than 20 seconds.
My initial traceback was "pyppeteer.errors.NetworkError: Protocol Error (Runtime.callFunctionOn): Session closed. Most likely the page has been closed."
Which led me to this https://github.com/miyakogi/pyppeteer/issues/178 offering this solution. https://github.com/miyakogi/pyppeteer/pull/160/files
I really wasn't ready to modify the codebase of pyppeteer according to the above "solution", which to me is really more of a hack. So I tried some other things and got the traceback that led me here.
Downgrading from websockets==7.0 to 6.0 instantly fixed this issue and should be the solution for anyone running into issue #171 or #178
Thanks!!
Same here as described by @GrilledChickenThighs
Should this be fixed somehow in this repo?
It's a shame that this severe issue, that has a very simple fix, is yet to be resolved... It's been more than 6 months @miyakogi, what are you waiting for exactly?
See #170
In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.
Run:
pip3 install websockets==6.0 --force-reinstall
and everything should be okay. =)
This suggestion saved me. After hours trying with other approaches. Thanks!
I checked my version of websockets, 8.0 didn't work either.
@miyakogi would you consider transferring ownership of this repo to someone who's willing to maintain it? pyppeteer is a valuable tool but it's effectively broken while this issue is there.
In the meantime you could use my fork, which has the websockets fix (without downgrading) and supports the latest Chrome revisions on Windows: https://github.com/Francesco149/pyppeteer
pip install pyppeteer_fork
websockets==8.1 worked for me !
import pyppdf.patch_pyppeteer
The fix above worked for me when parsing a website and collecting links in rapid fashion.
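For anyone trying that workaround, a rough sketch of how it is typically wired in (pyppdf is a separate package, so pip install pyppdf is assumed): the import alone applies the monkey-patch, and it has to run before pyppeteer opens its websocket connection.

import pyppdf.patch_pyppeteer  # importing is enough: it patches pyppeteer's websocket handling
from pyppeteer import launch   # then use pyppeteer as usual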
duplicate of #62
@marksteward
No idea what's wrong; I think it has something to do with the websockets dependency. With 6.0 there are no issues, with 8.1 issues left and right.
Same here!!! The waitForSelector function raises pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed. websockets 6.0 is OK.
I had the same problem, which was resolved by:
In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.
Run:
pip3 install websockets==6.0 --force-reinstall
and everything should be okay. =)
EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)
I resolved the same issue with your solution, downgrading to 6.0. However, for your information, 8.1 still has the same issue, because that is the version I originally had when I hit this problem. So the bug is yet to be resolved by the author.
@mostafamirzaee It's actually not an issue in the websockets library per se, but rather in Chrome and pyppeteer. Chrome doesn't respond to pings over websockets, making websockets (rightly) believe Chrome is dead, so it closes the connection. Older versions of websockets simply didn't support sending pings to the server, hence why the problem didn't show itself.
The fix is to tell websockets not to send any pings at all, and this fix has been applied to pyppeteer2, the ongoing update to this library.
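If you are stuck on this version of pyppeteer in the meantime, a rough sketch of that idea as a user-side monkey-patch (it assumes pyppeteer's Connection still creates its socket via websockets.client.connect, and it must run before launch() is called):

import pyppeteer.connection

_original_connect = pyppeteer.connection.websockets.client.connect

def _connect_without_pings(*args, **kwargs):
    # Disable the keepalive pings that Chrome never answers, so the
    # websockets library does not close the connection as "dead".
    kwargs['ping_interval'] = None
    kwargs['ping_timeout'] = None
    return _original_connect(*args, **kwargs)

pyppeteer.connection.websockets.client.connect = _connect_without_pings

Note that the ping_interval and ping_timeout arguments only exist in websockets 7.0 and newer, which is exactly the range where the problem shows up.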
@boramalper since your comment is so far up would you mind including this info?