Unfortunately, FMUs (fmi-standard.org) are not differentiable by design. To unlock their full potential inside Julia, FMISensitivity.jl makes FMUs fully differentiable with respect to:
- states and derivatives
- inputs, outputs and other observable variables
- parameters
- event indicators
- explicit time
- state change sensitivity by events $\partial x^{+} / \partial x^{-}$ (if paired with FMIFlux.jl)
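In practice, this means standard Julia AD tooling can compute FMU Jacobians. The sketch below is illustrative only: `fmu_x_derivative` is a hypothetical stand-in for an FMU's state-derivative evaluation, not an actual FMISensitivity.jl API; the package's job is to make the real FMU call behave like this plain Julia function under AD.

```julia
using ForwardDiff

# Hypothetical stand-in for an FMU's right-hand side ẋ = f(x)
# (here: a damped falling mass with states x = [position, velocity]).
fmu_x_derivative(x) = [x[2], -9.81 - 0.1 * x[2]]

x = [1.0, 0.0]

# System matrix A = ∂ẋ/∂x, e.g. for linearization or controller design.
A = ForwardDiff.jacobian(fmu_x_derivative, x)
```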
This opens up many applications, such as:
- FMUs in Scientific Machine Learning, for example as part of Neural(O)DEs or PINNs with FMIFlux.jl
- gradient-based optimization of FMUs (typically parameters) with FMI.jl (also dynamic optimization)
- linearization, linear analysis and controller design
- adding directional derivatives to existing FMUs with the power of Julia AD and FMIExport.jl [tutorial is WIP]
- ...
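For the gradient-based optimization use case, a differentiable FMU slots directly into a plain gradient-descent loop. The sketch below is generic and hedged: `loss` is a hypothetical placeholder for a real objective that would simulate the FMU with parameters `p` and compare against reference data.

```julia
using ForwardDiff

# Hypothetical loss over FMU parameters p; in a real setup this would
# simulate the FMU and measure the deviation from reference data.
loss(p) = sum(abs2, p .- [0.5, 2.0])

p = [1.0, 1.0]
for _ in 1:100
    g = ForwardDiff.gradient(loss, p)   # gradient through the (differentiable) FMU
    p .-= 0.1 .* g                      # plain gradient-descent step
end
```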
Supported AD-Frameworks are:
- ForwardDiff
- FiniteDiff
- ReverseDiff
- Zygote [WIP]
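The supported frameworks share the usual Julia AD calling conventions, so switching between them is a one-line change. A minimal sketch on a plain Julia function (a differentiable FMU evaluation would be used in the same way):

```julia
using ForwardDiff, ReverseDiff, FiniteDiff

f(x) = sum(sin, x)
x = [0.1, 0.2, 0.3]

g1 = ForwardDiff.gradient(f, x)                  # forward-mode AD
g2 = ReverseDiff.gradient(f, x)                  # reverse-mode AD
g3 = FiniteDiff.finite_difference_gradient(f, x) # finite differences

# All three agree up to finite-difference accuracy.
```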
Here, FMISensitivity.jl uses everything the FMI-standard and Julia currently offers:
- FMI built-in directional derivatives [DONE] and adjoint derivatives [WIP]
- finite differences (via FiniteDiff.jl) for FMUs that don't offer sensitivity information, as well as for special derivatives that are not part of the FMI standard (such as event indicators or explicit time)
- coloring based on sparsity information shipped with the FMU [WIP]
- coloring based on sparsity detection for FMUs without sparsity information [WIP]
- implicit differentiation
- ...
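The finite-differences fallback treats the FMU as a black box and recovers Jacobians purely from function evaluations. A sketch with FiniteDiff.jl, where `blackbox` is a hypothetical stand-in for an FMU that provides no directional derivatives:

```julia
using FiniteDiff

# Hypothetical black-box function standing in for an FMU without
# built-in sensitivity information.
blackbox(x) = [x[1]^2 + x[2], x[1] * x[2]]

# Jacobian approximated from function evaluations alone.
J = FiniteDiff.finite_difference_jacobian(blackbox, [1.0, 2.0])
```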
FMISensitivity.jl is part of FMIFlux.jl. If you only need FMU sensitivities without anything around and want to keep the dependencies as small as possible, FMISensitivity.jl might be the right way to go. You can install it via:
1. Open a Julia REPL, switch to package mode using `]` and activate your preferred environment.
2. Install FMISensitivity.jl:
```julia
(@v1) pkg> add FMISensitivity
```
3. If you want to check that everything works correctly, you can run the tests bundled with FMISensitivity.jl:
```julia
(@v1) pkg> test FMISensitivity
```
4. Have a look inside the examples folder in the examples branch, or at the examples section of the FMI.jl documentation. All examples are available as Julia scripts (.jl), Jupyter notebooks (.ipynb) and Markdown (.md).
To keep dependencies nice and clean, the original package FMI.jl was split into multiple packages:
- FMI.jl: High level loading, manipulating, saving or building entire FMUs from scratch
- FMIImport.jl: Importing FMUs into Julia
- FMIExport.jl: Exporting stand-alone FMUs from Julia Code
- FMIBase.jl: Common concepts for import and export of FMUs
- FMICore.jl: C-code wrapper for the FMI-standard
- FMISensitivity.jl: Static and dynamic sensitivities over FMUs
- FMIBuild.jl: Compiler/Compilation dependencies for FMIExport.jl
- FMIFlux.jl: Machine Learning with FMUs
- FMIZoo.jl: A collection of testing and example FMUs
FMISensitivity.jl is continuously tested under Julia 1.6 (LTS) and the latest Julia release, on the latest versions of Windows and Ubuntu, using x64 architectures. Mac and x86 architectures might work, but are not tested.
Coming soon ...
Tobias Thummerer, Lars Mikelsons and Josef Kircher. 2021. NeuralFMU: towards structural integration of FMUs into neural networks. Martin Sjölund, Lena Buffoni, Adrian Pop and Lennart Ochel (Ed.). Proceedings of 14th Modelica Conference 2021, Linköping, Sweden, September 20-24, 2021. Linköping University Electronic Press, Linköping (Linköping Electronic Conference Proceedings ; 181), 297-306. DOI: 10.3384/ecp21181297