
Create a system for downstream documentation build testing #104

Open

ChrisRackauckas opened this issue Dec 10, 2022 · 3 comments

@ChrisRackauckas
Member
We have a ton of information in our doc builds. It's where all of our tutorials are! So the biggest stress test is actually the documentation builds. The system of downstream testing by @giordano has been a massive success for our testing, so we would like to get something similar set up for downstream testing of the documentation. One way I had in mind for doing this was:

  1. Set up a new test group in runtests.jl for GROUP == "DownstreamDocs"
  2. Instantiate the docs Project.toml
  3. Call include("make.jl")
  4. In the make.jl files, put an `if GROUP != "DownstreamDocs"` guard around the deployment
  5. Add DownstreamDocs to the Downstream.yml files that need the respective doc tests.
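A minimal sketch of steps 1–4, assuming the standard Documenter.jl layout (the package name and paths here are hypothetical, not the actual SciML wiring):

```julia
# runtests.jl -- hypothetical "DownstreamDocs" test group
const GROUP = get(ENV, "GROUP", "All")

if GROUP == "DownstreamDocs"
    using Pkg
    # Instantiate the docs environment so its Project.toml deps are available
    Pkg.activate(joinpath(@__DIR__, "..", "docs"))
    Pkg.instantiate()
    # Building the docs executes every tutorial; any error fails the test
    include(joinpath(@__DIR__, "..", "docs", "make.jl"))
end

# docs/make.jl -- guard deployment so downstream doc-test runs skip it
using Documenter

makedocs(sitename = "ExamplePackage")  # tutorials run here

if get(ENV, "GROUP", "") != "DownstreamDocs"
    deploydocs(repo = "github.com/SciML/ExamplePackage.jl.git")
end
```

Since the deployment is skipped in the DownstreamDocs group, no documentation keys need to be shared with the downstream CI jobs.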

That seems innocent enough and would work for a lot of cases. However, there are a few cases, like SciMLSensitivity.jl, SciMLDocs.jl, and DiffEqGPU.jl, which require GPUs as part of the documentation build and thus run on Buildkite instead of GitHub Actions. Some of them are also just really long builds (SciMLSensitivity). We would like to put the high-level SciMLDocs (with its new showcase!) onto many packages (since those are the first tutorials everyone will see), so if that requires a GPU, then the system really needs to be able to use GPUs, at least for some of the tests.

For that, I might need the help of @thazhemadam @staticfloat to maybe get some thing I can paste into some Buildkite scripts?
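For reference, a hedged sketch of what such a Buildkite step might look like (the queue name, agent tags, and paths are assumptions, not the actual SciML Buildkite configuration):

```yaml
# .buildkite/pipeline.yml -- hypothetical downstream-docs step on a GPU queue
steps:
  - label: "Downstream docs build (GPU)"
    agents:
      queue: "juliagpu"   # assumed GPU queue name
      cuda: "*"
    env:
      GROUP: "DownstreamDocs"
    commands:
      - julia --project=docs -e 'using Pkg; Pkg.instantiate()'
      - julia --project=docs docs/make.jl
```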

(Note: since it won't need deployment, it won't need the help with the cryptographic signature stuff, thank god.)

ChrisRackauckas added a commit to SciML/SciMLTutorials.jl that referenced this issue Dec 10, 2022
Now that all of the packages have docs with tests, and we have a whole lot more testing hardware than we ever had before, we are starting to slim down the tutorials (which only get rebuilt and fixed periodically) and moving them to the individual package docs. We are doing this and turning all of the docs into downstream tests (SciML/SciMLDocs#104) so that they are perpetually working. It will be a massive amount of compute, but we have that now, so let's make everything super robust.

Related PRs:

- SciML/JumpProcesses.jl#277
- SciML/DiffEqParamEstim.jl#188
- SciML/DiffEqBayes.jl#274
@thazhemadam
Member

@ChrisRackauckas to be clear, the expectation here is that the documentation for every single downstream package (i.e., all packages whose docs get aggregated on docs.sciml.ai) needs to be built, to ensure that the documentation for each of these packages builds, right?
If yes, won't this happen when we're running MultiDocumenter.make itself? I'm not sure I understand the benefit of running the same thing twice.

@ChrisRackauckas
Member Author

> the expectation here is that the documentation for every single downstream package (i.e., all packages whose docs get aggregated on docs.sciml.ai) needs to be run, to ensure that documentation for these respective packages build, right?

No. OrdinaryDiffEq tests (SciML/OrdinaryDiffEq.jl#1813) run downstream tests on things like SciMLSensitivity.jl. However, those do not include the tutorials in the SciMLSensitivity.jl documentation. OrdinaryDiffEq.jl should run the downstream SciMLSensitivity.jl test suite and its tutorial runs.

@thazhemadam
Copy link
Member

Ah okay, now I understand, thanks. That makes perfect sense.
