
Start uncythonization #6104

Merged · 21 commits · Apr 14, 2022
Changes from 6 commits
20 changes: 14 additions & 6 deletions .github/workflows/conda.yml
@@ -4,10 +4,6 @@ on:
branches:
- main
pull_request:
paths:
- setup.py
- continuous_integration/recipes/**
- .github/workflows/conda.yml

# When this workflow is queued, automatically cancel any previous running
# or pending jobs from the same branch
@@ -56,13 +52,13 @@ jobs:
conda mambabuild continuous_integration/recipes/distributed \
--channel dask/label/dev \
--no-anaconda-upload \
--output-folder .
--output-folder build

# dask pre-release build
conda mambabuild continuous_integration/recipes/dask \
--channel dask/label/dev \
--no-anaconda-upload \
--output-folder .
--output-folder build
- name: Upload conda packages
if: |
github.event_name == 'push'
@@ -71,8 +67,20 @@
env:
ANACONDA_API_TOKEN: ${{ secrets.DASK_CONDA_TOKEN }}
run: |
# convert distributed to other architectures
cd build && conda convert linux-64/*.tar.bz2 -p osx-64 \
-p osx-arm64 \
-p linux-ppc64le \
-p linux-aarch64 \
-p win-64

# install anaconda for upload
mamba install anaconda-client

anaconda upload --label dev noarch/*.tar.bz2
anaconda upload --label dev linux-64/*.tar.bz2
anaconda upload --label dev linux-aarch64/*.tar.bz2
anaconda upload --label dev linux-ppc64le/*.tar.bz2
anaconda upload --label dev osx-64/*.tar.bz2
anaconda upload --label dev osx-arm64/*.tar.bz2
anaconda upload --label dev win-64/*.tar.bz2
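The convert-and-upload step above fans one linux-64 build out to several platforms before pushing everything to the dev label. As a sketch, the command lines it runs can be composed in plain Python (the platform list and `build` folder come from the workflow; actual execution via `subprocess` is omitted since it needs conda and anaconda-client installed):

```python
# Platforms that the workflow converts the linux-64 packages to.
platforms = ["osx-64", "osx-arm64", "linux-ppc64le", "linux-aarch64", "win-64"]

# conda convert linux-64/*.tar.bz2 -p osx-64 -p osx-arm64 ...
convert_cmd = ["conda", "convert", "linux-64/*.tar.bz2"] + [
    arg for p in platforms for arg in ("-p", p)
]

# One "anaconda upload --label dev <subdir>/*.tar.bz2" per package subdir.
upload_cmds = [
    ["anaconda", "upload", "--label", "dev", f"{subdir}/*.tar.bz2"]
    for subdir in ["noarch", "linux-64"] + platforms
]

# subprocess.run(convert_cmd, cwd="build", check=True)  # requires conda on PATH
```

This mirrors the seven upload lines in the workflow: noarch, the native linux-64 build, and the five converted platforms.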
2 changes: 1 addition & 1 deletion continuous_integration/recipes/dask/meta.yaml
@@ -23,7 +23,7 @@ requirements:
run:
- python >=3.8
- dask-core >={{ dask_version }}
- distributed {{ version }}=*{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}*
- distributed {{ version }}=*_{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}
- cytoolz >=0.8.2
- numpy >=1.18
- pandas >=1.0
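The tightened run requirement above pins `dask` to a `distributed` package with a matching build string. Conda matches build strings as glob patterns, which can be sketched with the standard library's `fnmatch` (the hash and build-number values below are made up for illustration):

```python
from fnmatch import fnmatch

# Hypothetical rendered values for GIT_DESCRIBE_HASH / GIT_DESCRIBE_NUMBER.
build_string = "py38_gabc1234_0"   # py{python}_{hash}_{number}
pin = "*_gabc1234_0"               # rendered *_{hash}_{number} pattern

matches = fnmatch(build_string, pin)
```

A pin anchored on the hash and number this way ties the `dask` metapackage to the exact `distributed` build produced in the same CI run.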
@@ -1,6 +1,3 @@
python:
- 3.8
- 3.9
cython_enabled:
- True # [linux]
- False
21 changes: 9 additions & 12 deletions continuous_integration/recipes/distributed/meta.yaml
@@ -2,19 +2,23 @@
{% set new_patch = major_minor_patch[2] | int + 1 %}
{% set version = (major_minor_patch[:2] + [new_patch]) | join('.') + environ.get('VERSION_SUFFIX', '') %}
{% set dask_version = environ.get('DASK_CORE_VERSION', '0.0.0.dev') %}
{% set build_ext = "cython" %} # [cython_enabled]
{% set build_ext = "python" %} # [not cython_enabled]


package:
name: distributed-split
name: distributed
version: {{ version }}

source:
git_url: ../../..

build:
number: {{ GIT_DESCRIBE_NUMBER }}
string: py{{ python | replace(".", "") }}_{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}
script: {{ PYTHON }} -m pip install . -vv
entry_points:
- dask-scheduler = distributed.cli.dask_scheduler:go
- dask-ssh = distributed.cli.dask_ssh:go
- dask-worker = distributed.cli.dask_worker:go

outputs:
- name: distributed-impl
@@ -37,27 +41,20 @@ outputs:
version: {{ version }}
build:
number: {{ GIT_DESCRIBE_NUMBER }}
string: py_{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}_{{ build_ext }} # [not cython_enabled]
string: py{{ python | replace(".", "") }}_{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}_{{ build_ext }} # [cython_enabled]
noarch: python # [not cython_enabled]
skip: True # [cython_enabled and py<38]
string: py_{{ GIT_DESCRIBE_HASH }}_{{ GIT_DESCRIBE_NUMBER }}_{{ build_ext }}
noarch: python
script: >
python -m pip install . -vv --no-deps
--install-option="--with-cython=profile" # [cython_enabled]
track_features: # [cython_enabled]
- cythonized-scheduler # [cython_enabled]
entry_points:
- dask-scheduler = distributed.cli.dask_scheduler:main
- dask-ssh = distributed.cli.dask_ssh:main
- dask-worker = distributed.cli.dask_worker:main
requirements:
build:
- {{ compiler('c') }} # [cython_enabled]
host:
- python >=3.8
- pip
- setuptools
- cython # [cython_enabled]
run:
- python >=3.8
- click >=6.6
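The jinja header of the recipe above derives the package version by bumping the patch component of the last tag and appending an optional `VERSION_SUFFIX`. Assuming `major_minor_patch` comes from splitting a `major.minor.patch` tag, the same logic can be replayed in plain Python:

```python
import os

def next_version(git_tag: str) -> str:
    # Mirror of the recipe's jinja: increment the patch component,
    # then append VERSION_SUFFIX from the environment if it is set.
    major, minor, patch = git_tag.split(".")
    suffix = os.environ.get("VERSION_SUFFIX", "")
    return f"{major}.{minor}.{int(patch) + 1}{suffix}"
```

For example, with `VERSION_SUFFIX` unset, a tag of `2022.4.0` yields a pre-release version of `2022.4.1`.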
4 changes: 0 additions & 4 deletions distributed/cli/tests/test_dask_spec.py
@@ -1,14 +1,11 @@
import sys

import pytest
import yaml

from distributed import Client
from distributed.scheduler import COMPILED
from distributed.utils_test import gen_cluster, gen_test, popen


@pytest.mark.skipif(COMPILED, reason="Fails with cythonized scheduler")
@gen_test(timeout=120)
async def test_text():
with popen(
@@ -38,7 +35,6 @@ async def test_text():
assert w["nthreads"] == 3


@pytest.mark.skipif(COMPILED, reason="Fails with cythonized scheduler")
@gen_cluster(client=True, nthreads=[])
async def test_file(c, s, tmp_path):
fn = str(tmp_path / "foo.yaml")
2 changes: 0 additions & 2 deletions distributed/deploy/tests/test_local.py
@@ -18,7 +18,6 @@
from distributed.deploy.local import LocalCluster
from distributed.deploy.utils_test import ClusterTest
from distributed.metrics import time
from distributed.scheduler import COMPILED
from distributed.system import MEMORY_LIMIT
from distributed.utils import TimeoutError, sync
from distributed.utils_test import (
@@ -759,7 +758,6 @@ def scale_down(self, *args, **kwargs):
await cluster.close()


@pytest.mark.skipif(COMPILED, reason="Fails with cythonized scheduler")
def test_local_tls_restart(loop):
from distributed.utils_test import tls_only_security

4 changes: 1 addition & 3 deletions distributed/diagnostics/tests/test_progress.py
@@ -12,7 +12,6 @@
Progress,
SchedulerPlugin,
)
from distributed.scheduler import COMPILED
from distributed.utils_test import dec, div, gen_cluster, inc, nodebug, slowdec, slowinc


@@ -93,8 +92,7 @@ def check_bar_completed(capsys, width=40):
assert percent == "100% Completed"


@pytest.mark.flaky(condition=not COMPILED and LINUX, reruns=10, reruns_delay=5)
@pytest.mark.skipif(COMPILED, reason="Fails with cythonized scheduler")
@pytest.mark.flaky(condition=LINUX, reruns=10, reruns_delay=5)
@gen_cluster(client=True, Worker=Nanny)
async def test_AllProgress(c, s, a, b):
x, y, z = c.map(inc, [1, 2, 3])