Apache License fails linter #41
Comments
Yes, I think this is a good idea, especially since conda-forge will migrate to SPDX-only soon.
Related to conda-forge/r-base-feedstock#110
Almost all CRAN-supported licenses (except Apache 2.0) are, as far as I know, shipped in the R upstream tarball and are therefore also part of the conda r-base package. The individual R package feedstocks refer to those r-base-included licenses in their meta.yaml. From that perspective, a first step could be a patch submitted upstream to also include the Apache 2.0 license text file. After that, updating the license dict in the conda-build CRAN skeleton (conda_build/skeletons/cran.py) could be the next step, and while doing so the dict could also be updated to SPDX identifiers. Maybe that is a rather long timeline, though, and it makes sense to include the Apache-2.0 license file in r-base already, before it arrives via the upstream tarball. Any other thoughts?
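For illustration, here is a minimal sketch of what such a CRAN-to-SPDX mapping could look like. The dict name `CRAN_TO_SPDX`, the helper `to_spdx`, and the entries shown are assumptions for this example, not the actual contents of conda_build/skeletons/cran.py:

```python
# Illustrative mapping from CRAN DESCRIPTION license strings to SPDX
# identifiers. Names and entries are assumptions for this sketch, not
# the current conda-build code.
CRAN_TO_SPDX = {
    "Apache License 2.0": "Apache-2.0",
    "Apache License (== 2.0)": "Apache-2.0",
    "GPL-2": "GPL-2.0-only",
    "GPL-3": "GPL-3.0-only",
    "GPL (>= 2)": "GPL-2.0-or-later",
    "MIT + file LICENSE": "MIT",
    "BSD_3_clause + file LICENSE": "BSD-3-Clause",
}


def to_spdx(cran_license):
    """Return the SPDX identifier for a CRAN license string, if known.

    Falls back to the original string so unknown licenses are left
    untouched rather than silently mangled.
    """
    return CRAN_TO_SPDX.get(cran_license.strip(), cran_license)
```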
I'd like to second the idea of generating the appropriate SPDX identifier. I just had to change this by hand.
I have an R project with an Apache 2.0 license.
When using this helper, the license automatically pulled in from CRAN is "Apache License 2.0", which ends up in the license: field of the recipe.
conda-forge-linter complains:
a) "The recipe license should not include the word 'License'."
b) "License is not an SPDX identifier (or a custom LicenseRef) nor an SPDX license expression."
This is fixed by changing the value to "Apache-2.0" (SPDX identifiers are listed at https://spdx.org/licenses/).
Maybe conda_r_skeleton_helper could have a find/replace to translate the package license reported by CRAN to an SPDX identifier? This will probably be necessary for other licenses too; a rough sketch of such a replacement is below.
Thanks!
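A rough sketch of the suggested find/replace step, assuming the helper ends up with a generated meta.yaml that it can post-process as plain text. The function name, the mapping, and the file-rewriting approach are illustrative only and are not part of conda_r_skeleton_helper today:

```python
import re
from pathlib import Path

# Assumed mapping for this sketch; extend as needed for other licenses.
LICENSE_REPLACEMENTS = {
    "Apache License 2.0": "Apache-2.0",
    "GPL-2": "GPL-2.0-only",
    "GPL-3": "GPL-3.0-only",
    "MIT + file LICENSE": "MIT",
}


def fix_license(meta_yaml_path):
    """Rewrite the license: field of a recipe to use an SPDX identifier,
    e.g. 'license: Apache License 2.0' becomes 'license: Apache-2.0'."""
    path = Path(meta_yaml_path)
    text = path.read_text()
    for cran_name, spdx_id in LICENSE_REPLACEMENTS.items():
        # Only touch lines that are exactly a license: entry with this value.
        text = re.sub(
            r"^(\s*license:\s*){}\s*$".format(re.escape(cran_name)),
            r"\g<1>" + spdx_id,
            text,
            flags=re.MULTILINE,
        )
    path.write_text(text)
```

Whether this mapping lives in the helper or upstream in conda-build's CRAN skeleton, the translation table itself would be the same, so maintaining it in one place seems preferable.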