
Q: Server Sent Event - maximum string length? #86

Open
moehlone opened this issue Jul 1, 2016 · 6 comments

moehlone commented Jul 1, 2016

Hi, I was just experimenting and sent content with a length of 95785 characters:

res.write("data: "+ content + "\n\n");
res.flush();

I do not receive any message in the browser until I remove gzipping.
Is there a maximum length?

dougwilson self-assigned this Jul 2, 2016
@dougwilson (Contributor)

Hi @moehlone, there is intentionally no inherent limit in this module. It's certainly not out of the question that there is some kind of bug going on. We will investigate as we have the time. If you can also spend time investigating and provide your findings on the cause, that would be very helpful :)

In addition, can you also provide us with the following details in order for us to start our investigation?

  1. The version of Node.js you are using
  2. The version of this module you are using.
  3. A complete, self-contained server that uses this module with as little extra code as possible, which we can copy-and-paste into a file and run to reproduce the issue.
  4. Detailed instructions on how to reproduce the issue (including noting what browser and version if that's part of the reproduction), and what you are expecting to see if the issue was fixed.

Thank you!


moehlone commented Jul 4, 2016

Hi @dougwilson,

I am running Node.js v6.2.2 via nvm. You can take the package versions from the package.json example below:

{
  "name": "compress_test",
  "version": "0.0.1",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "compression": "^1.6.2",
    "express": "~4.13.1"
  }
}

To reproduce, you can use this app.js server snippet:

var express = require("express");
var compress = require("compression");
var app = express();

app.use(compress());

app.get('/', function(req, res) {
    res.send(' \
<!DOCTYPE html> <html> <body> \
    <script type="text/javascript"> \
            var source = new EventSource("test"); \
            source.onmessage = function(e) { \
                console.log(e); \
            }; \
</script> </body> </html>');
});

app.get('/test', function(req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });
    res.write("data: "+ Array(16376).join('x') + "\n\n");
    res.flush();
});

app.listen(process.env.PORT || 8080); 

A length of 16376 is currently the maximum I can send/receive; anything above that won't be received by the page. I can reproduce this in the current versions of Chrome and Firefox; other browsers were not tested.

  1. download and unzip compress_sse_test.zip into a new folder
  2. run npm install
  3. run node app.js
  4. open http://localhost:8080 in the browser and inspect the console -> the SSE event is logged to the console
  5. change Array(16376).join('x') to Array(16377).join('x')
  6. rerun node app.js
  7. reload the browser -> nothing is logged to the console

Maybe this helps - I will look into the source next weekend.

Best regards
Philipp

PS: remove the compression middleware and everything works fine :/

@moehlone (Author)

Sorry, I have not had any time to look into this yet, but it's on my list!


cdnbacon commented Sep 7, 2016

@moehlone @dougwilson we've had a related issue. We use chunked transfer encoding in our setup with explicit flushes and gzip, and we've ended up on a chunk boundary.

Removing the compression middleware worked, but the compression benefits are too good to pass up. We used curl and a handy module called gunzip-maybe to see the last bytes transferred (it seems the usual gunzip would not provide that granularity).
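
For anyone wanting to repeat that check without curl, here is a rough Node equivalent; this is a hypothetical sketch that assumes the repro server above is running on localhost:8080 and that gunzip-maybe is installed, not the exact script we used:

var http = require('http');
var gunzip = require('gunzip-maybe');

// Request the SSE endpoint with gzip enabled and log whatever
// decompressed bytes actually arrive, so the cut-off point is visible.
http.get({
  port: 8080,
  path: '/test',
  headers: { 'Accept-Encoding': 'gzip' }
}, function (res) {
  res.pipe(gunzip()).on('data', function (chunk) {
    console.log('decompressed %d bytes, tail: %j',
      chunk.length, chunk.slice(-20).toString());
  });
});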

Our testing showed that the response aligned with the Node.js zlib defaults of a 16K chunk size (see https://nodejs.org/api/zlib.html#zlib_class_options), which also include a default of flush = Z_NO_FLUSH; that wasn't great for our use case :)

Fortunately, you can pass the zlib factory method options over to compression and it will happily forward them along to the appropriate factory method; see https://github.com/expressjs/compression/blob/master/index.js#L192-L194

In our case, passing {flush: zlib.Z_SYNC_FLUSH} worked out and got us the last few bytes we were hoping to see at the end of our first chunk instead of at the beginning of the next one.
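
For reference, a minimal sketch of that configuration, built on the same express/compression setup as the repro earlier in this thread (the route name and payload size are just for illustration):

var express = require('express');
var compression = require('compression');
var zlib = require('zlib');

var app = express();

// Forward the zlib flush mode to compression(): Z_SYNC_FLUSH pushes out
// everything buffered so far on each flush, instead of the default
// Z_NO_FLUSH, which waits until a full internal chunk has accumulated.
app.use(compression({ flush: zlib.Z_SYNC_FLUSH }));

app.get('/test', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  // An event larger than the 16K boundary described above.
  res.write('data: ' + Array(20000).join('x') + '\n\n');
  res.flush(); // should now deliver the whole event right away
});

app.listen(process.env.PORT || 8080);

Z_SYNC_FLUSH trades a little compression ratio for predictable delivery, which is usually the right trade-off for event streams.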

The constants are documented at https://nodejs.org/api/zlib.html#zlib_constants, which links off to the zlib manual with much valuable detail: http://zlib.net/manual.html#Basic. There is also some more color on the use of the flush constants at http://www.bolet.org/~pornin/deflate-flush.html.

Hopefully setting those options will solve your issue as it did for us.

@dougwilson if you think this warrants an update to the README, I'm happy to write a PR for it.

Cheers!


aaliddell commented Jul 27, 2017

I just hit this issue as well, and it appears to be related to the Node zlib module not flushing properly when you've written more than a single chunk's worth of data (~16384 bytes).
I've written this up as an issue on the Node project: nodejs/node#14523.
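
A stripped-down sketch of the behaviour being described, using only the zlib module (my own reduction under those assumptions, not the exact reproduction from the linked issue):

var zlib = require('zlib');

var gzip = zlib.createGzip();
var received = 0;

gzip.on('data', function (chunk) {
  received += chunk.length;
});

// Write more than one internal chunk's worth of input (the default chunkSize
// is 16384 bytes), then ask zlib to flush while that write is still pending.
gzip.write(Buffer.alloc(20000, 'x'));
gzip.flush(function () {
  // On affected Node versions the flush could leave part of the compressed
  // output stuck inside zlib; the fix mentioned in the next comment changes that.
  console.log('flush complete, %d compressed bytes seen so far', received);
});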

@aaliddell

The fix for this was made in nodejs/node#14527 (and nodejs/node#14571), which made it into today's 8.3.0 release.
