Fails to parse long outputs #34

Open
ssfrr opened this issue Sep 21, 2017 · 5 comments

ssfrr commented Sep 21, 2017

I'm working on better autocomplete integration for the Atom editor package, which requires getting data from SuperCollider as JSON (rather than just as strings to display to the user). I've been running into problems that I traced down to supercolliderjs apparently mishandling commands that generate a lot of output.

Here's an example that reproduces the problem that can be run from node:

var sc = require('supercolliderjs');

sc.lang.boot({'sclang': '/usr/bin/sclang'})
.then(function(sclang) {
  sclang.interpret('SCDoc.documents')
    .then(function(result) {
      console.log('Got Docs');
    }, function(error) {
      console.error(error);
    });
}, function(error) {
  console.error(error);
});

Sometimes this succeeds, but more often than not I get a parse error:

SyntaxError: Unexpected token s in JSON at position 96738
    at JSON.parse (<anonymous>)
    at Object.fn (/home/sfr/local/src/supercolliderjs_test/node_modules/supercolliderjs/lib/lang/internals/sclang-io.js:270:28)
    at /home/sfr/local/src/supercolliderjs_test/node_modules/supercolliderjs/lib/lang/internals/sclang-io.js:98:21
    at Array.forEach (<anonymous>)
    at SclangIO.parse (/home/sfr/local/src/supercolliderjs_test/node_modules/supercolliderjs/lib/lang/internals/sclang-io.js:92:31)
    at Socket.<anonymous> (/home/sfr/local/src/supercolliderjs_test/node_modules/supercolliderjs/lib/lang/sclang.js:406:29)
    at emitOne (events.js:115:13)
    at Socket.emit (events.js:210:7)
    at addChunk (_stream_readable.js:266:12)
    at readableAddChunk (_stream_readable.js:253:11)
    at Socket.Readable.push (_stream_readable.js:211:10)
    at Pipe.onread (net.js:585:20)

I dumped the actual stdout JS variable right before the attempted parsing and the output is here: https://gist.github.com/ssfrr/92c9f725185b68db76d960a5110caaa6

Pasting into a JSON validator like jsonlint.com shows where the parsing breaks down, and it looks like a chunk of text got lost in the middle.

Is there a fixed-size buffer somewhere that is overflowing?
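
For what it's worth, the symptom is easy to reproduce in isolation: if the stdout stream is reassembled with one chunk missing, JSON.parse fails partway through with exactly this class of error. A minimal sketch (illustrative chunk contents, not the real gist data):

```javascript
// Three "stdout chunks" whose boundaries fall mid-token, as real
// stream chunks do. Together they form valid JSON:
var chunkA = '{"docs": [{"path": "do';
var chunkB = 'c1"}, {"path": "doc2"}, {"pa';
var chunkC = 'th": "doc3"}]}';

var good = JSON.parse(chunkA + chunkB + chunkC);
console.log(good.docs.length); // 3

// Drop the middle chunk, as seems to be happening here, and the
// reassembled text is no longer valid JSON:
var err = null;
try {
  JSON.parse(chunkA + chunkC);
} catch (e) {
  err = e; // SyntaxError, like the one in the trace above
}
console.log(err instanceof SyntaxError); // true
```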


ssfrr commented Sep 21, 2017

More (possibly) useful debugging:

echo 'SuperColliderJS.interpret(1234, "SCDoc.documents", nil, false)' | sclang > sclang_output

This dumps everything SuperCollider generates to a file. The file is about 250KB, so if any intermediate step drops data once the output exceeds some smaller limit, we're in trouble.


ssfrr commented Sep 21, 2017

OK, I think I know what's going on. This gist shows the output from the above node script, with the text passed to each stdout callback displayed as:

------STDOUT-------
content
-------------------

At line 181 you can see that the text the callback receives doesn't start with SUPERCOLLIDERJS, because it's actually the tail end of the previous chunk. The callback handler for the ready state (called here) assumes there's no leftover text from the previous chunk and that the SUPERCOLLIDERJS block starts at the beginning of the current one.

One solution might be to have the regex capture anything before the first SUPERCOLLIDERJS and append it to the previous chunk.
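
That fix could be sketched roughly like this (hypothetical names and a simplified delimiter format, not the actual sclang-io.js code): keep whatever trails the last complete block and prepend it to the next chunk before matching.

```javascript
var results = [];
var residual = '';

function handleStdout(chunk) {
  var text = residual + chunk;
  // Simplified stand-in for the real SUPERCOLLIDERJS delimiters:
  var blockRe = /BLOCK:START([\s\S]*?)BLOCK:END/g;
  var lastEnd = 0;
  var m;
  while ((m = blockRe.exec(text)) !== null) {
    lastEnd = blockRe.lastIndex;
    results.push(JSON.parse(m[1]));
  }
  // Anything after the last complete block may be the start of the
  // next one, split across chunks; save it instead of discarding it.
  residual = text.slice(lastEnd);
}

// A payload split across two stdout callbacks still parses:
handleStdout('BLOCK:START{"a": 1}BLOCK:ENDBLOCK:START{"b"');
handleStdout(': 2}BLOCK:END');
console.log(results.length); // 2
```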


crucialfelix commented Sep 21, 2017 via email

@crucialfelix

I was looking for some kind of .flush method to call. There is one for IOStream but none for the main thread; internally I know there is a buffer that can be flushed. sclang is a bit messy, as you can see. This is why reading the compile startup output is tricky: it comes out in blurps.

It looks like it had more to post so it just took a break to handle some event and then went back to posting.

A dedicated socket would certainly solve that.

Compressing the data before posting would help (but only if you had proper unicode support). You can of course fetch in smaller batches or dump to a file.
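
The batching idea could look something like this on the Node side (a sketch only; it assumes SCDoc.documents answers .size and .copyRange, and that each batch serializes cleanly over the bridge, neither of which is verified here):

```javascript
// Hypothetical helper: pull SCDoc.documents in fixed-size slices so
// no single interpret() response is large enough to trigger the bug.
function fetchDocsInBatches(sclang, batchSize) {
  return sclang.interpret('SCDoc.documents.size').then(function(total) {
    var docs = [];
    var start = 0;
    function next() {
      if (start >= total) return docs;
      var end = Math.min(start + batchSize, total) - 1;
      var expr = 'SCDoc.documents.copyRange(' + start + ', ' + end + ')';
      start += batchSize;
      return sclang.interpret(expr).then(function(batch) {
        docs = docs.concat(batch);
        return next();
      });
    }
    return next();
  });
}
```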

@crucialfelix

Btw, I had assumed it would be best to dump the API to a JSON file rather than query it live over the bridge. If you cannot boot sc (compile errors), you can still get access to the autocomplete/API dump if the file is still around. That was my thinking. I don't have the slightest bit of time to work on it anyway (and SuperCollider is a silly language, eh?)
