
Check that the tests compile, directly in the language server #662

Open
mpizenberg opened this issue Dec 3, 2021 · 6 comments


mpizenberg commented Dec 3, 2021

If I'm correct, elmls currently uses `elm-test make` to figure out whether the tests compile. We've had some discussions about whether it would be useful to use `elm-test-rs make` instead, since it does not spin up a Node process and is therefore faster. But what if this could be done easily, directly in the language server? Let me explain.

The steps to do this, which are roughly the steps done by elm-test-rs, are the following:

  1. Generate the list of test modules and their file paths.
  2. Generate an elm.json with the correct dependencies for the to-be-generated Runner.elm.
  3. Generate and compile Runner.elm with imports of all the tests files.

I don't know the code base of the language server, but if I had to guess I'd mark each of those tasks as follows:

  1. [done] Generate the list of test modules and their file paths.
  2. [to do] Generate an elm.json with the correct dependencies for the to-be-generated Runner.elm.
  3. [easy to do] Generate and compile Runner.elm with imports of all the tests files.

Generating the correct dependencies can be a bit tricky; it resulted in many edge-case bugs in elm-test before it switched to elm-json for this, and in elm-test-rs we use pubgrub. The thing is, I'm fairly confident that I could extract some code from elm-test-rs and make a WebAssembly module, wrapped in an npm JavaScript package exposing the following function:

// Returns the `elm.json` for the `Runner.elm` file concatenating all tests.
//
// Inputs:
//   projectElmJson: the JSON string containing the `elm.json` file of the project.
//   elmJsonFetcher: a function able to retrieve the `elm.json` of a given package version.
//       elmJsonFetcher(author, package, version) -> string
function solveTestDeps(projectElmJson, elmJsonFetcher)

Generating the Runner.elm file just consists of filling the following template by replacing {{ imports }}, so I believe it would be easy to do with what you already have in the language server.

module Runner exposing (main)

{{ imports }}

main : Program () () ()
main = Debug.todo "main"
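As a minimal sketch of that substitution (function name assumed), filling {{ imports }} is just one import line per discovered test module:

```javascript
// Fill the Runner.elm template's {{ imports }} slot with one
// "import <Module>" line per test module found in step 1.
function generateRunner(testModules) {
  const imports = testModules.map((name) => `import ${name}`).join("\n");
  return [
    "module Runner exposing (main)",
    "",
    imports,
    "",
    "main : Program () () ()",
    'main = Debug.todo "main"',
  ].join("\n");
}
```

Compiling the resulting file with the solved `elm.json` is then enough to know whether all the test modules compile.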

Let me know what you think of this.


razzeee commented Dec 4, 2021

While this is nice, I'm not sure it would be worth it. We don't just want to know whether the tests compile; we also want to run them. So we couldn't drop the dependency on elm-test after this anyway?


mpizenberg commented Dec 4, 2021

Well, if this seems rather easy to you, going the extra mile to also run the tests should be a breeze. The full steps to get the tests to run are as follows:

  1. [done] Generate the list of test modules and their file paths.
  2. [to do] Generate an elm.json with the correct dependencies for the to-be-generated Runner.elm.
  3. [done] Find all exposed tests.
  4. [easy to do] Generate Runner.elm with a main test concatenating all found exposed tests.
  5. [easy to do] Compile it into a JS file wrapped into a Node worker module.
  6. [done] Compile Reporter.elm into a Node module.
  7. [easy to do] Generate and start the Node supervisor program.

So steps 4 and 5 consist of compiling Runner.elm with the two variables {{ imports }} and {{ potential_tests }} filled in; the result is then loaded by a Node file, node_runner.js, via const { Elm } = require("./Runner.elm.js");. I've linked the node_runner.js file, which is ready to be used as is. Since you already know how to find the tests, there is no need for the check logic inside the Runner.elm file, and potential_tests is actually actual_tests for you.
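As a rough sketch of steps 4 and 5 (function name and the `tests` record shape are assumptions; the real elm-test-rs template has more wiring), the runner generation amounts to substituting {{ imports }} and the list of found tests:

```javascript
// Generate a Runner.elm whose top-level test concatenates all exposed
// tests already discovered by the language server.
function generateTestRunner(tests) {
  // tests: e.g. [{ module: "Tests.Foo", name: "suite" }]
  const imports = [...new Set(tests.map((t) => `import ${t.module}`))].join("\n");
  const concatenated = tests.map((t) => `${t.module}.${t.name}`).join(", ");
  return `module Runner exposing (allTests)

import Test

${imports}

allTests : Test.Test
allTests =
    Test.concat [ ${concatenated} ]
`;
}
```

The generated file would then be compiled to JS and loaded by node_runner.js as described above.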

Step 6 consists of compiling Reporter.elm, which is completely independent of the user code, so it can be done in advance and the resulting JS file embedded directly in the code base of the language server.

Finally, step 7 consists of filling the node_supervisor.js template, which is ready to be used as is and just needs the following template variables filled:

  • {{ workersCount }}: the number of node workers to use to run the tests
  • {{ verbosity }}: the verbosity level, you could get rid of this in the template
  • {{ initialSeed }}: the initial seed for the tests
  • {{ fuzzRuns }}: the number of fuzz runs for fuzz tests
  • {{ reporter }}: the report format, one of json | junit | exercism | consoleColor | consoleNoColor
  • {{ globs }}: the globs used as arguments of the elm-test command
  • {{ paths }}: the paths of the tests files
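The substitution itself is generic; here is a hedged sketch (function name assumed, the template file itself ships with elm-test-rs) that fills the {{ ... }} variables listed above and leaves any unknown variable untouched:

```javascript
// Replace every "{{ name }}" occurrence in the template with the
// corresponding value from vars; unknown names are left as-is.
function fillTemplate(template, vars) {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in vars ? String(vars[name]) : match
  );
}
```

For example, fillTemplate("workers: {{ workersCount }}", { workersCount: 4 }) yields "workers: 4".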


razzeee commented Dec 4, 2021

Can't we just use elm-test-rs as WebAssembly?
I really don't want another implementation of test running that's likely going to differ from what you're going to run on your CI.
Yes, we want elmLS to be standalone and potentially run on your CI too, but I don't think we'll get there anytime soon.

@mpizenberg

Nope, it's impossible, or at least too complicated, because WebAssembly doesn't have the capability to interact with the outside world: reading and writing files, talking to the network, etc. There is a standard called WASI that enables this, but it's not broadly supported yet in WebAssembly runtimes. And even then, some dependencies in elm-test-rs, for SSL, cannot be compiled to wasm with WASI. So no, it's not possible.

@mpizenberg

In my opinion, if I manage to build the dependency solver as a wasm module, integrating the rest to get a fully working test runner within the language server is a matter of 2 hours of pair programming for the two of us.


razzeee commented Dec 5, 2021

Sounds about right. I think we could spend more time on some edge cases after that, and on exposing the server to the client for the test integration.
