
Unhandled promise: Protocol Error #45

Open
Venoox opened this issue Aug 26, 2020 · 4 comments
Labels
enhancement New feature or request

Comments


Venoox commented Aug 26, 2020

When loading multiple pages at once, I get this unhandled promise rejection:

You have triggered an unhandledRejection, you may have forgotten to catch a Promise rejection:
Error: Protocol error (Network.getCookies): Target closed.
at /root/scraper/node_modules/puppeteer/lib/cjs/puppeteer/common/Connection.js:208:63
at new Promise (<anonymous>)
at CDPSession.send (/root/scraper/node_modules/puppeteer/lib/cjs/puppeteer/common/Connection.js:207:16)
at Object.getCookies (/root/scraper/node_modules/puppeteer-page-proxy/src/lib/cdp.js:6:38)
at CookieHandler.getCookies (/root/scraper/node_modules/puppeteer-page-proxy/src/lib/cookies.js:84:51)
at requestHandler (/root/scraper/node_modules/puppeteer-page-proxy/src/core/proxy.js:15:40)
at proxyPerRequest (/root/scraper/node_modules/puppeteer-page-proxy/src/core/proxy.js:72:23)
at useProxy (/root/scraper/node_modules/puppeteer-page-proxy/src/core/proxy.js:92:15)
at /root/scraper/index.js:90:15
at /root/scraper/node_modules/puppeteer/lib/cjs/vendor/mitt/src/index.js:47:62

[email protected]
[email protected]
[email protected]
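One way to keep this rejection from killing the process (a sketch, not code from this report; `useProxy` is mocked here so the snippet runs without a browser) is to catch the known "Target closed" race that occurs when a page is closed while puppeteer-page-proxy is still fetching its cookies:

```javascript
// Mock standing in for puppeteer-page-proxy's useProxy: simulates the CDP
// call failing because the target (page) has already closed.
async function useProxy(page, proxy) {
  if (page.closed) {
    throw new Error('Protocol error (Network.getCookies): Target closed.');
  }
  return `routed via ${proxy}`;
}

// Wrapper that tolerates the page-closed race but rethrows anything else.
async function safeUseProxy(page, proxy) {
  try {
    return await useProxy(page, proxy);
  } catch (err) {
    if (/Target closed/.test(err.message)) {
      console.warn('Page closed before proxy could attach; skipping.');
      return null;
    }
    throw err;
  }
}

(async () => {
  console.log(await safeUseProxy({ closed: false }, 'http://127.0.0.1:8080')); // → routed via http://127.0.0.1:8080
  console.log(await safeUseProxy({ closed: true }, 'http://127.0.0.1:8080'));  // → null
})();
```

The same guard can wrap the real `useProxy(page, proxyUrl)` call inside a task, so a page torn down mid-scrape no longer surfaces as an unhandled rejection.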

@birdkiwi

Do you use await page.close(); in your tasks?


chapov commented Sep 16, 2020

I use a custom setRequestInterception handler in my code, and I have this problem.

await cluster.task(async ({ page, data: url }) => {
        const proxyAddress = utils.getProxy( );
        await useProxy( page, proxyAddress );
        // await page.setRequestInterception( true );
        page.on('request', async request => {
            let requestBlocked = false;

            const reqDomain = new URL( request.url() ).host;

            if ( domainsBL ) {
                if ( domainsBL.includes( reqDomain ) ) {
                    console.log('Block and crash :(');
                    request.abort();
                    requestBlocked = true;
                }
            }
        });
});

output:

chapov@WIN-KF7HQ5BAFTM:/opt/my/zen$ npm run scrapper 

> [email protected] scrapper /opt/my/zen
> nodemon scrapper.js --exec babel-node

[nodemon] 2.0.4
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: js,mjs,json
[nodemon] starting `babel-node scrapper.js`
queued task
Block and crash :(
(node:23058) UnhandledPromiseRejectionWarning: Error: Request is already handled!
    at Object.exports.assert (/opt/my/zen/node_modules/puppeteer/lib/cjs/puppeteer/common/assert.js:26:15)
    at HTTPRequest.abort (/opt/my/zen/node_modules/puppeteer/lib/cjs/puppeteer/common/HTTPRequest.js:317:21)
    at requestHandler (/opt/my/zen/node_modules/puppeteer-page-proxy/src/core/proxy.js:41:23)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
(node:23058) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 3)
(node:23058) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
^C

How can I get around this? I need to apply the proxy per page, not per request.

@birdkiwi

@chapov I think you forgot request.continue() after the if statement.
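A sketch of the pattern being suggested here, with puppeteer's HTTPRequest mocked so it runs without a browser (MockRequest, domainsBL, and the URLs are illustrative, not from the thread): every path through an intercepting handler must settle the request exactly once, by calling either abort() or continue().

```javascript
// Mock of puppeteer's HTTPRequest for illustration: like the real class,
// a request may be settled (aborted or continued) exactly once.
class MockRequest {
  constructor(url) { this._url = url; this.handled = false; this.outcome = null; }
  url() { return this._url; }
  _settle(outcome) {
    if (this.handled) throw new Error('Request is already handled!');
    this.handled = true;
    this.outcome = outcome;
  }
  abort() { this._settle('aborted'); }
  continue() { this._settle('continued'); }
}

const domainsBL = ['ads.example.com']; // hypothetical blacklist

// Handler that settles each request exactly once.
function requestHandler(request) {
  const reqDomain = new URL(request.url()).host;
  if (domainsBL.includes(reqDomain)) {
    request.abort();
    return; // do NOT fall through and settle the request a second time
  }
  request.continue();
}

const blocked = new MockRequest('https://ads.example.com/banner.js');
const allowed = new MockRequest('https://example.com/index.html');
requestHandler(blocked); // blocked.outcome → 'aborted'
requestHandler(allowed); // allowed.outcome → 'continued'
```

Note this only fixes the handler's own bookkeeping; it does not help if a second listener (such as puppeteer-page-proxy's internal one) also settles the same request.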


chapov commented Sep 16, 2020

Oh, I posted that in the wrong place; it belongs in another topic...

@birdkiwi No, everything fails at request.abort(); the stack trace points to:
/opt/my/zen/node_modules/puppeteer-page-proxy/src/core/proxy.js:41:23
where the call is:

request.abort()
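The collision described here can be reproduced without a browser: both the user's page.on('request') listener and puppeteer-page-proxy's internal handler receive the same request, and each calls abort(). The mock below (an illustration, not the real HTTPRequest class) shows why the second call throws:

```javascript
// Minimal mock reproducing puppeteer's "already handled" assertion: a
// request may be settled only once, but two handlers each try to settle it.
class OnceRequest {
  constructor() { this.handled = false; }
  abort() {
    if (this.handled) throw new Error('Request is already handled!');
    this.handled = true;
  }
}

const request = new OnceRequest();

// Listener order in the thread: the user's handler aborts first...
request.abort();

// ...then puppeteer-page-proxy's requestHandler (proxy.js:41) aborts again.
try {
  request.abort();
} catch (err) {
  console.log(err.message); // → Request is already handled!
}
```

This is why removing request.continue() from the user handler is not enough: as long as two independent listeners both settle the same request, one of them will always hit the assertion.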

@Cuadrix Cuadrix added the enhancement New feature or request label Oct 10, 2022