
feature/merge from upstream 2024 12 11 #477

Merged · 70 commits · Dec 11, 2024

Commits
ac82f34
trilinos@develop: update kokkos dependency (#47838)
balay Dec 4, 2024
6aafefd
package version: Neovim 0.10.2 (#47925)
taliaferro Dec 4, 2024
aa81d59
directives: don't include `Optional` in `PatchesType`
tgamblin Nov 30, 2024
175a4bf
directives: use `Type[PackageBase]` instead of `PackageBase`
tgamblin Nov 30, 2024
f545269
directives: add type annotations to `DirectiveMeta` class
tgamblin Nov 30, 2024
8b1009a
`resource`: clean up arguments and typing
tgamblin Nov 25, 2024
22d104d
ci: add bootstrap stack for [email protected]:3.13 (#47719)
haampie Dec 5, 2024
3fcc38e
pandoramonitoring,pandorasdk: change docstrings that are wrong (#47937)
jmcarcell Dec 5, 2024
1f2a68f
tar: conditionally link iconv (#47933)
kftsehk Dec 5, 2024
4693b32
spack.mirror: split into submodules (#47936)
haampie Dec 5, 2024
c1b2ac5
solver: partition classes related to requirement parsing into their o…
alalazo Dec 5, 2024
901cea7
Add conflict for pixman with Intel Classic (#47922)
climbfuji Dec 5, 2024
112e47c
Don't inject import statements in package recipes
alalazo Dec 5, 2024
b808338
py-uxarray: new package plus dependencies (#47573)
climbfuji Dec 6, 2024
a8da799
Bump up the version for rocm-6.2.4 release (#47707)
srekolam Dec 6, 2024
f181ac1
Upgraded version specs for ECMWF packages: eckit, atlas, ectrans, fck…
srherbener Dec 6, 2024
94bd7b9
build_environment: drop off by one fix (#47960)
haampie Dec 6, 2024
5c88e03
directives.py: remove redundant import (#47965)
haampie Dec 6, 2024
77e2187
coverage.yml: fail_ci_if_error = true (#47731)
wdconinc Dec 6, 2024
05acd29
extensions.py: remove import of spack.cmd (#47963)
haampie Dec 7, 2024
f54c101
py-jedi: add v0.19.2 (#47569)
alecbcs Dec 7, 2024
422f829
mirrors: add missing init file (#47977)
haampie Dec 8, 2024
79d7996
celeritas: patch 0.5.0 for [email protected]: (#47976)
wdconinc Dec 8, 2024
0b7fc36
e4s ci: add lammps +rocm (#47929)
eugeneswalker Dec 8, 2024
8a9e16d
aws-pcluster stacks: static spack.yaml (#47918)
stephenmsachs Dec 8, 2024
fc105a1
build(deps): bump types-six in /.github/workflows/requirements/style …
dependabot[bot] Dec 9, 2024
f15e5f7
mold: Add 2.35.0 (#47984)
msimberg Dec 9, 2024
a72490f
coverage.yml: set fail_ci_if_error = false again (#47986)
wdconinc Dec 9, 2024
9cb2070
gh: add v2.59.0 -> v2.63.2 (#47958)
albestro Dec 9, 2024
da83ab3
add soci 4.0.3 (#47983)
kftsehk Dec 9, 2024
b2a86fc
py-plac: add v1.4.3; restrict to python@:3.11 for older (#47982)
wdconinc Dec 9, 2024
4d6347c
node-js: patch for %gcc@12.[1-2] when @22.2:22.5 (#47979)
wdconinc Dec 9, 2024
728f13d
mapl: add v2.51.0 (#47968)
mathomp4 Dec 9, 2024
7c74247
py-greenlet: remove preference for v2.0.2 (#47962)
wdconinc Dec 9, 2024
ab4a645
various pkgs: use https homepage when http redirects (github.io) (#47…
wdconinc Dec 9, 2024
49efa71
acts dependencies: new versions as of 2024/12/08 (#47981)
stephenswat Dec 9, 2024
c3e92a3
py-httpx: add v0.28, v0.28.1 (#47970)
dmagdavector Dec 9, 2024
728c5e0
add main branch (#47952)
gardner48 Dec 9, 2024
12dd120
geant4: add v11.3.0 (#47961)
wdconinc Dec 9, 2024
8d83baa
gromacs: conflict %apple-clang and +openmp (#47935)
mabraham Dec 10, 2024
0189e92
dlb: add v3.5.0 (#47916)
vlopezh Dec 10, 2024
d68462a
justbuild: add version 1.4.1 (#47902)
asartori86 Dec 10, 2024
fe0f4c1
gromacs: support version 2024.4 (#47900)
al42and Dec 10, 2024
24fc720
py-twine: add v6.0.1 (#47899)
adamjstewart Dec 10, 2024
36f3566
highfive: update maintainers. (#47896)
1uc Dec 10, 2024
42333ad
extrae: relax requirements on binutils (#47893)
jgraciahlrs Dec 10, 2024
f3c6f00
eztrace: new version for building from the dev branch of the git repo…
trahay Dec 10, 2024
449a462
gurobi: add versions 11 and 12 (#47889)
upsj Dec 10, 2024
855943f
py-mgmetis: remove constrains 3.X for mpi4py & 1.X for numpy depandan…
tech-91 Dec 10, 2024
5232ee1
tecplot: updated hash for 2024r1m1 (#47886)
LRWeber Dec 10, 2024
15f3851
py-scikit-learn: add v1.6.0 (#47998)
adamjstewart Dec 10, 2024
478647f
py-numpy: add v2.2.0 (#47999)
adamjstewart Dec 10, 2024
466c3ab
Remove remaining use of deprecated test callback (#47995)
tldahlgren Dec 10, 2024
30c0035
make level_zero variant consistent, add missing instances (#47985)
haampie Dec 10, 2024
b50dbb8
pipelines: simplify and lint aws-pcluster-* (#47989)
alalazo Dec 10, 2024
3a1c0f5
llvm: add v19.1.5 (#47897)
prckent Dec 10, 2024
7e5b5f8
veccore: add v0.8.2 (#47855)
wdconinc Dec 10, 2024
7bb6c9b
py-disbatch: add new package at version 3.0 (#47988)
lgarrison Dec 10, 2024
d47629a
py-jupyterlab-server: add v2.23 to 2.27 (#47969)
dmagdavector Dec 10, 2024
accd3ca
highfive: add v2.10.1 (#47914)
1uc Dec 10, 2024
c23ffbb
geant4: patch typo in wroot (#47955)
DingXuefeng Dec 10, 2024
b6e4ff0
py-nbclassic: add v1.1.0 (#47946)
dmagdavector Dec 10, 2024
84ea7db
hp2p: new package (#47950)
rfbgo Dec 10, 2024
316dcc1
Set the "build_jobs" on concretization/generate for CI (#47660)
kwryankrattiger Dec 10, 2024
cb8880b
Update compadre and py-pycompadre to v1.6.0 (#47948)
kuberry Dec 10, 2024
ae28528
sycl runtime needs umf (#48011)
rscohn2 Dec 10, 2024
a3985e7
Revert "Set the "build_jobs" on concretization/generate for CI (#4766…
scottwittenburg Dec 11, 2024
0352552
llnl.path: make system_path_filter a noop on non-win32 (#48032)
haampie Dec 11, 2024
e9d2732
log.py: improve utf-8 handling, and non-utf-8 output (#48005)
haampie Dec 11, 2024
d35bcd0
Merge remote-tracking branch 'upstream/develop' into fnal-develop
greenc-FNAL Dec 11, 2024
1 change: 1 addition & 0 deletions .github/workflows/coverage.yml
@@ -32,3 +32,4 @@ jobs:
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: false
2 changes: 1 addition & 1 deletion .github/workflows/requirements/style/requirements.txt
@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20241105
types-six==1.17.0.20241205
vermin==1.6.0
4 changes: 2 additions & 2 deletions lib/spack/docs/packaging_guide.rst
@@ -5137,7 +5137,7 @@ other checks.
- Not applicable
* - :ref:`PythonPackage <pythonpackage>`
- Not applicable
- ``test`` (module imports)
- ``test_imports`` (module imports)
* - :ref:`QMakePackage <qmakepackage>`
- ``check`` (``make check``)
- Not applicable
@@ -5146,7 +5146,7 @@
- Not applicable
* - :ref:`SIPPackage <sippackage>`
- Not applicable
- ``test`` (module imports)
- ``test_imports`` (module imports)
* - :ref:`WafPackage <wafpackage>`
- ``build_test`` (must be overridden)
- ``install_test`` (must be overridden)
12 changes: 11 additions & 1 deletion lib/spack/llnl/path.py
@@ -66,7 +66,7 @@ def _is_url(path_or_url: str) -> bool:
return result


def system_path_filter(_func=None, arg_slice: Optional[slice] = None):
def _system_path_filter(_func=None, arg_slice: Optional[slice] = None):
"""Filters function arguments to account for platform path separators.
Optional slicing range can be specified to select specific arguments

@@ -100,6 +100,16 @@ def path_filter_caller(*args, **kwargs):
return holder_func


def _noop_decorator(_func=None, arg_slice: Optional[slice] = None):
return _func if _func else lambda x: x


if sys.platform == "win32":
system_path_filter = _system_path_filter
else:
system_path_filter = _noop_decorator


def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.
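The change above swaps `system_path_filter` for a no-op decorator on non-Windows platforms. The conditional-decorator pattern can be sketched in isolation; the `_system_path_filter` body below is a simplified stand-in for Spack's real argument-filtering logic, and `normalized` is a made-up example function:

```python
import sys
from typing import Optional


def _system_path_filter(_func=None, arg_slice: Optional[slice] = None):
    """Normalize backslash separators in string arguments (simplified stand-in)."""

    def decorator(func):
        def wrapper(*args, **kwargs):
            filtered = tuple(
                a.replace("\\", "/") if isinstance(a, str) else a for a in args
            )
            return func(*filtered, **kwargs)

        return wrapper

    # Support both @system_path_filter and @system_path_filter(arg_slice=...)
    return decorator(_func) if _func else decorator


def _noop_decorator(_func=None, arg_slice: Optional[slice] = None):
    # On POSIX the separators are already correct, so skip the wrapper entirely.
    return _func if _func else lambda x: x


system_path_filter = _system_path_filter if sys.platform == "win32" else _noop_decorator


@system_path_filter
def normalized(path: str) -> str:
    return path
```

On POSIX, `normalized` is the original undecorated function, so the change removes per-call wrapper overhead rather than just short-circuiting inside the wrapper.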
22 changes: 14 additions & 8 deletions lib/spack/llnl/util/tty/log.py
@@ -879,10 +879,13 @@ def _writer_daemon(
write_fd.close()

# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O.
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
# that prevents unbuffered text I/O. [needs citation]
# 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
# The downside is that the log file will not contain the exact output of the build process.
# 3. closefd=False because Connection has "ownership"
read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
read_file = os.fdopen(
read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
)

if stdin_fd:
stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@@ -928,11 +931,7 @@ def _writer_daemon(
try:
while line_count < 100:
# Handle output from the calling process.
try:
line = _retry(read_file.readline)()
except UnicodeDecodeError:
# installs like --test=root gpgme produce non-UTF8 logs
line = "<line lost: output was not encoded as UTF-8>\n"
line = _retry(read_file.readline)()

if not line:
return
@@ -946,6 +945,13 @@
output_line = clean_line
if filter_fn:
output_line = filter_fn(clean_line)
enc = sys.stdout.encoding
if enc != "utf-8":
# On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
# may not be able to handle utf-8 output. We do an inefficient
# dance of re-encoding with errors replaced, so stdout.write
# does not raise.
output_line = output_line.encode(enc, "replace").decode(enc)
sys.stdout.write(output_line)

# Stripped output to log file.
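The two fixes in this diff — decoding build output with `errors="replace"` instead of catching `UnicodeDecodeError` per line, and re-encoding before writing to a narrower stdout — can be illustrated with a plain pipe (the variable names here are illustrative, not Spack's):

```python
import os

# A build process may emit bytes that are not valid UTF-8.
read_fd, write_fd = os.pipe()
os.write(write_fd, b"ok \xff\xfe line\n")
os.close(write_fd)

# errors="replace" substitutes U+FFFD for each undecodable byte instead of
# raising, so a single bad line no longer loses output or aborts the daemon.
read_file = os.fdopen(read_fd, "r", 1, encoding="utf-8", errors="replace")
line = read_file.readline()
read_file.close()
# line is now "ok \ufffd\ufffd line\n"

# If stdout's encoding cannot represent a character, re-encode with
# errors="replace" so a later sys.stdout.write does not raise.
safe = line.encode("ascii", "replace").decode("ascii")
# safe is "ok ?? line\n"
```

The trade-off stated in the new comment holds here too: the log preserves line structure but not the exact bytes the build produced.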
6 changes: 3 additions & 3 deletions lib/spack/spack/audit.py
@@ -693,19 +693,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False

error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
details.append(f"{pkg_name}@{v} uses {digest}")

for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
details.append(f"Resource in '{pkg_name}' uses {digest}")
if details:
errors.append(error_cls(error_msg, details))

34 changes: 17 additions & 17 deletions lib/spack/spack/binary_distribution.py
@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.mirrors.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -369,7 +369,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -1176,7 +1176,7 @@ def _url_upload_tarball_and_specfile(


class Uploader:
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1224,7 +1224,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1273,7 +1273,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1297,7 +1297,7 @@ def push(


def make_uploader(
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1953,9 +1953,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")

@@ -1980,7 +1980,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
return spack.mirrors.mirror.Mirror(url)

mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]

@@ -2650,7 +2650,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []

binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()

for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2711,7 +2711,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []

if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}

@@ -2750,7 +2750,7 @@

def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)

if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2805,7 +2805,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):


def _url_push_keys(
*mirrors: Union[spack.mirror.Mirror, str],
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2872,7 +2872,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):

"""
rebuilds = {}
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))

rebuild_list = []
@@ -2916,7 +2916,7 @@ def _download_buildcache_entry(mirror_root, descriptions):


def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2925,7 +2925,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)

for mirror in spack.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)

if _download_buildcache_entry(mirror_root, file_descriptions):
4 changes: 2 additions & 2 deletions lib/spack/spack/bootstrap/core.py
@@ -37,7 +37,7 @@
import spack.binary_distribution
import spack.config
import spack.detection
import spack.mirror
import spack.mirrors.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +91,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])

# Promote (relative) paths to file urls
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url

@property
def mirror_scope(self) -> spack.config.InternalConfigScope:
13 changes: 3 additions & 10 deletions lib/spack/spack/build_environment.py
@@ -1426,27 +1426,20 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1

lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]

# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)

# Calculate lineno of the error relative to the start of the function.
fun_lineno = lineno - start
fun_lineno = frame.f_lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]

for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)
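The context-window logic this diff simplifies — slicing the output of `inspect.getsourcelines` around the failing line and marking it — can be sketched standalone. The `context_lines` helper and `fail_here` function below are hypothetical, not Spack code:

```python
import inspect


def fail_here():
    x = 1
    y = 2
    raise RuntimeError("boom")  # the error line, at offset 3 from the def line


def context_lines(func, lineno, context=1):
    """Return numbered source lines around ``lineno``, marking the error line."""
    sourcelines, start = inspect.getsourcelines(func)
    fun_lineno = lineno - start  # offset of the error within the function
    start_ctx = max(0, fun_lineno - context)
    window = sourcelines[start_ctx : fun_lineno + context + 1]
    marked = []
    for i, line in enumerate(window):
        is_error = start_ctx + i == fun_lineno
        mark = ">> " if is_error else "   "
        # Add start so numbers are relative to the file, not the function.
        marked.append(f" {mark}{start + start_ctx + i:-6d}{line.rstrip()}")
    return marked


# The raise statement sits three lines below the function's first line.
lines = context_lines(fail_here, inspect.getsourcelines(fail_here)[1] + 3)
```

The diff's change is that `frame.f_lineno` is now used directly, without the off-by-one adjustment that compensated for the import statement previously injected at the top of `package.py` files.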
11 changes: 6 additions & 5 deletions lib/spack/spack/ci.py
@@ -37,7 +37,8 @@
import spack.config as cfg
import spack.error
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.paths
import spack.repo
import spack.spec
@@ -204,7 +205,7 @@ def _print_staging_summary(spec_labels, stages, rebuild_decisions):
if not stages:
return

mirrors = spack.mirror.MirrorCollection(binary=True)
mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
tty.msg("Checked the following mirrors for binaries:")
for m in mirrors.values():
tty.msg(f" {m.fetch_url}")
@@ -797,7 +798,7 @@ def ensure_expected_target_path(path):
path = path.replace("\\", "/")
return path

pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")
@@ -1323,7 +1324,7 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
mirror = spack.mirror.Mirror.from_url(mirror_url)
mirror = spack.mirrors.mirror.Mirror.from_url(mirror_url)
try:
with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
@@ -1343,7 +1344,7 @@ def remove_other_mirrors(mirrors_to_keep, scope=None):
mirrors_to_remove.append(name)

for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)
spack.mirrors.utils.remove(mirror_name, scope)


def copy_files_to_artifacts(src, artifacts_dir):
24 changes: 24 additions & 0 deletions lib/spack/spack/cmd/__init__.py
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import argparse
import difflib
import importlib
import os
import re
@@ -125,6 +126,8 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)

attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -691,3 +694,24 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]


class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""

def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None

similar = difflib.get_close_matches(cmd_name, all_commands())

if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)

super().__init__(msg, long_msg)
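The "did you mean" hint added to `CommandNotFoundError` relies on `difflib.get_close_matches`, which ranks candidates by similarity ratio with a 0.6 cutoff. A standalone sketch, using a made-up command list in place of Spack's `all_commands()`:

```python
import difflib

# Hypothetical command registry; Spack's all_commands() returns the real list.
KNOWN_COMMANDS = ["install", "uninstall", "info", "find", "mirror", "spec", "commands"]


def suggestions(cmd_name, commands=KNOWN_COMMANDS):
    """Return up to 5 close matches to show in a 'did you mean' hint."""
    similar = difflib.get_close_matches(cmd_name, commands, n=5)
    return similar if 1 <= len(similar) <= 5 else []
```

`suggestions("instal")` ranks "install" first; an empty list means no hint is appended to the error message.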