
pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed.

Open GigiisJiji opened this issue 7 years ago • 17 comments

I ran the demo on Win7 (Python 3.7), having downloaded Chromium with the pyppeteer-install command. The script is:

    import asyncio
    from pyppeteer import launch

    async def main():
        browser = await launch(headless=False)
        page = await browser.newPage()
        await page.goto('https://www.google.com')
        await page.screenshot({'path': 'example.png'})
        await browser.close()

    asyncio.get_event_loop().run_until_complete(main())

but it fails with the traceback below:

    Traceback (most recent call last):
      File "F:/run.py", line 28, in <module>
        asyncio.get_event_loop().run_until_complete(main())
      File "C:\Python37\Lib\asyncio\base_events.py", line 568, in run_until_complete
        return future.result()
      File "F:/run.py", line 25, in main
        await page.screenshot({'path': 'example.png'})
      File "C:\Users\admiunfd\Envs\gigipy3\lib\site-packages\pyppeteer\page.py", line 1227, in screenshot
        return await self._screenshotTask(screenshotType, options)
      File "C:\Users\admiunfd\Envs\gigipy3\lib\site-packages\pyppeteer\page.py", line 1232, in _screenshotTask
        'targetId': self._target._targetId,
    pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed.

Process finished with exit code 1

GigiisJiji avatar Dec 03 '18 07:12 GigiisJiji

Same problem, and following this I changed my code to:

    async def spider(self, url):
        self.log.detail_info(f"[*] Start crawl {url}")
        self.log.detail_info(f"[*] {url} started at {time.strftime('%X')}")
        # Handle Error: pyppeteer.errors.NetworkError: Protocol error Runtime.callFunctionOn: Target closed.
        browser = await launch({
            'args': ['--no-sandbox']
        })
        page = await browser.newPage()
        await page.setViewport(self.set_view_port_option)
        await page.goto(url, self.goto_option)
        title = await page.title()
        filename = await self.translate_word(title)
        await page.evaluate(scroll_page_js)
        pdf = await page.pdf(self.pdf_option)
        await browser.close()
        self.log.detail_info(f"[*] {url} finished at {time.strftime('%X')}")
        return filename, pdf

But it doesn't work for me. This answer says the error can happen if you call browser.close() while browser.newPage() has yet to resolve.
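
For illustration, a minimal, hypothetical sketch of that failure mode and the obvious fix; nothing here is from the answer itself, just my reading of it:

    import asyncio
    from pyppeteer import launch

    async def broken():
        browser = await launch()
        task = asyncio.ensure_future(browser.newPage())  # scheduled, not awaited
        await browser.close()  # closes the browser while newPage is still pending
        await task  # may raise NetworkError: ... Target closed.

    async def fixed():
        browser = await launch()
        try:
            page = await browser.newPage()  # fully resolved before any close
            await page.goto('https://example.com')
        finally:
            await browser.close()

    asyncio.get_event_loop().run_until_complete(fixed())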

Lucifaer avatar Dec 07 '18 07:12 Lucifaer

Has your problem been solved?

dp24678 avatar Dec 13 '18 05:12 dp24678

I solved my problem by rewriting the JS code in Python, like this:

    import asyncio

    async def scroll_page(page):
        # Scroll in short steps so that no single evaluate() call
        # runs long enough to hit the connection timeout.
        cur_dist = 0
        height = await page.evaluate("() => document.body.scrollHeight")
        while cur_dist < height:
            await page.evaluate("window.scrollBy(0, 500);")
            await asyncio.sleep(0.1)
            cur_dist += 500

I guess the problem is that pyppeteer has a default timeout when executing JS code, taking a screenshot, or saving a page to PDF. According to the results of my tests, this default timeout is about 20 seconds. So when your work runs for more than 20 seconds, i.e. when it times out, the running work is closed automatically, independent of the timeout settings you passed to the headless browser earlier.
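
A quick, hypothetical way to probe that cutoff, assuming an already-open page object and that page.evaluate awaits a returned promise:

    import time

    async def probe_timeout(page):
        # Deliberately block evaluate() for 60 s and see when the
        # connection drops; prints roughly 20 s if the theory holds.
        start = time.monotonic()
        try:
            await page.evaluate("new Promise(r => setTimeout(r, 60000))")
        except Exception as e:
            print(f"dropped after {time.monotonic() - start:.1f}s: {e}")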

Lucifaer avatar Dec 13 '18 06:12 Lucifaer

In my case the cause was an incorrect version of the websockets package: I replaced websockets 7.0 with websockets 6.0.

dp24678 avatar Jan 22 '19 02:01 dp24678

In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.

Run:

pip3 install websockets==6.0 --force-reinstall

and everything should be okay. =)

EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)

boramalper avatar Mar 31 '19 12:03 boramalper

@boramalper I just ran into this issue, with a similar problem when scraping a webpage for longer than 20 seconds.

My initial traceback was "pyppeteer.errors.NetworkError: Protocol Error (Runtime.callFunctionOn): Session closed. Most likely the page has been closed."

That led me to https://github.com/miyakogi/pyppeteer/issues/178, which offers this solution: https://github.com/miyakogi/pyppeteer/pull/160/files

I really wasn't ready to modify the codebase of pyppeteer according to the above "solution", which to me is really more of a hack. So I tried some other things and got the traceback that led me here.

Downgrading from websockets==7.0 to 6.0 instantly fixed this issue and should be the solution for anyone running into issue #171 or #178

Thanks!!

pAulseperformance avatar Apr 10 '19 19:04 pAulseperformance

Same here as described by @GrilledChickenThighs

Should this be fixed somehow in this repo?

andreroggeri avatar May 25 '19 03:05 andreroggeri

It's a shame that this severe issue, which has a very simple fix, is yet to be resolved... It's been more than 6 months @miyakogi, what are you waiting for exactly?

See #170

boramalper avatar May 25 '19 21:05 boramalper

> In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.
>
> Run:
>
>     pip3 install websockets==6.0 --force-reinstall
>
> and everything should be okay. =)

This suggestion saved me after hours of trying other approaches. Thanks! I checked my version of websockets; 8.0 didn't work either.

pengisgood avatar Aug 06 '19 08:08 pengisgood

@miyakogi would you consider transferring ownership of this repo to someone who's willing to maintain it? pyppeteer is a valuable tool but it's effectively broken while this issue is there.

ColdHeat avatar Sep 07 '19 01:09 ColdHeat

In the meantime you could use my fork, which has the websockets fix (without downgrading) and supports the latest Chrome revisions on Windows: https://github.com/Francesco149/pyppeteer

pip install pyppeteer_fork

Francesco149 avatar Nov 21 '19 14:11 Francesco149

websockets==8.1 worked for me!

anaselmhamdi avatar Nov 26 '19 21:11 anaselmhamdi

import pyppdf.patch_pyppeteer

The fix above worked for me while parsing a website and collecting links in rapid fashion.
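
As I understand it, the patch is applied as a side effect of the import, so it only needs to run before pyppeteer is used; a minimal sketch, assuming pyppdf is installed:

    import pyppdf.patch_pyppeteer  # noqa: F401 - patches pyppeteer on import
    import asyncio
    from pyppeteer import launch

    async def main():
        browser = await launch()
        page = await browser.newPage()
        await page.goto('https://example.com')
        await browser.close()

    asyncio.get_event_loop().run_until_complete(main())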

vkdvamshi avatar Dec 25 '19 23:12 vkdvamshi

duplicate of #62

Mattwmaster58 avatar Feb 23 '20 19:02 Mattwmaster58

> @marksteward
>
> No idea what's wrong, I think it has something to do with the websockets dependency. With 6.0, there are no issues; with 8.1, issues left and right.

Same here!!! The waitForSelector function raises pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed. websockets 6.0 is OK.

LennyLip avatar Mar 10 '20 14:03 LennyLip

I had the same problem, which was resolved by:

> In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.
>
> Run:
>
>     pip3 install websockets==6.0 --force-reinstall
>
> and everything should be okay. =)
>
> EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)

I resolved the same issue with your solution, downgrading to 6.0. However, for your information, 8.1 still has the same issue, because that's the version I had when I first ran into this problem. So the bug is yet to be resolved by the author.

mostafamirzaee avatar Apr 05 '20 14:04 mostafamirzaee

@mostafamirzaee It's actually not an issue in the websockets library per se, but rather in Chrome and pyppeteer. Chrome doesn't respond to pings over websockets, making websockets (rightly) believe Chrome is dead, so it closes the connection. Older versions of websockets simply didn't support sending pings to the server, which is why the problem doesn't show itself there.

The fix is to tell websockets not to send any pings at all, and this fix has been applied to pyppeteer2, the ongoing update to this library.
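
For reference, the commonly shared workaround does roughly the same thing by monkey-patching the websockets connect call that pyppeteer uses; a sketch, assuming pyppeteer's internals haven't moved, to be run before launch():

    import pyppeteer.connection

    _original_connect = pyppeteer.connection.websockets.client.connect

    def _patched_connect(*args, **kwargs):
        # Disable the keepalive ping entirely so websockets never
        # concludes that the (silent) Chrome endpoint is dead.
        kwargs['ping_interval'] = None
        kwargs['ping_timeout'] = None
        return _original_connect(*args, **kwargs)

    pyppeteer.connection.websockets.client.connect = _patched_connect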

@boramalper since your comment is so far up would you mind including this info?

Mattwmaster58 avatar Apr 05 '20 14:04 Mattwmaster58