Support for Kolmogorov-Arnold Networks (KANs) #125389
Hey @anshika-anand, thanks for the request! To keep our maintenance burden manageable, PyTorch Core has fairly strict rules regarding adding a new Optimizer, Module, function, or feature: https://github.com/pytorch/pytorch/wiki/Developer-FAQ#i-have-a-new-function-or-feature-id-like-to-add-to-pytorch-should-it-be-in-pytorch-core-or-a-library-like-torchvision. Currently, KANs seem best suited to be used from an external library, such as pykan. In the future, we'll happily reconsider if they increase in popularity to the point where they're expected within core.
Proposal for adding support for Kolmogorov-Arnold Networks (KANs) in PyTorch. KANs are a promising alternative to traditional Multi-Layer Perceptrons (MLPs) and offer potential advantages in terms of accuracy, scaling, and interpretability.
KANs are inspired by the Kolmogorov-Arnold representation theorem and have a fundamentally different structure compared to MLPs. Instead of having fixed activation functions on nodes (neurons) and linear weights, KANs have learnable activation functions on edges (weights), and every weight parameter is replaced by a univariate function parameterized as a spline.
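To make the structural difference concrete, here is a minimal NumPy sketch of a single KAN-style layer. It is not pykan's API or a proposed PyTorch interface; it only illustrates the idea of a learnable univariate function on every edge. For simplicity it uses a fixed Gaussian radial-basis expansion as a stand-in for the spline parameterization described in the paper, and all names (`KANLayerSketch`, `rbf_basis`) are hypothetical.

```python
import numpy as np

def rbf_basis(t, centers, width):
    # Evaluate K fixed Gaussian basis functions at t; output shape (..., K).
    return np.exp(-((t[..., None] - centers) ** 2) / (2 * width ** 2))

class KANLayerSketch:
    """Toy KAN layer: one learnable univariate function per edge.

    Each edge function phi_{ij} is a linear combination of K fixed
    Gaussian bases, a simplified stand-in for the B-spline
    parameterization used in the paper. The output unit j sums the
    edge functions applied to each input: y_j = sum_i phi_{ij}(x_i).
    """

    def __init__(self, n_in, n_out, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-1.0, 1.0, n_basis)
        self.width = 2.0 / n_basis
        # coefs[i, j, k]: learnable weight of basis k on edge i -> j.
        # These coefficients are the trainable parameters, replacing
        # the scalar weights of an MLP.
        self.coefs = rng.normal(scale=0.1, size=(n_in, n_out, n_basis))

    def forward(self, x):
        # x: (batch, n_in) -> basis values B: (batch, n_in, K)
        B = rbf_basis(x, self.centers, self.width)
        # y[b, j] = sum_i sum_k coefs[i, j, k] * B[b, i, k]
        return np.einsum("bik,ijk->bj", B, self.coefs)

layer = KANLayerSketch(n_in=3, n_out=2)
y = layer.forward(np.zeros((4, 3)))
print(y.shape)  # (4, 2)
```

In a real implementation the basis coefficients would be registered as trainable parameters (e.g., `nn.Parameter` in PyTorch) and the Gaussian bases replaced with B-splines on an adaptive grid, but the edge-wise structure shown here is the key departure from an MLP.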
Additional context
KAN: Kolmogorov–Arnold Networks. The paper was submitted on 30 April 2024 by researchers from the Massachusetts Institute of Technology, the California Institute of Technology, Northeastern University, and the NSF Institute for Artificial Intelligence and Fundamental Interactions.
cc @albanD @mruberry @jbschlosser @walterddr @mikaylagawarecki