
[BUG] --watch does not reload code changes, despite console output saying so "files changed, reloading arq worker..." #429

Open
epicwhale opened this issue Feb 14, 2024 · 4 comments

Comments


epicwhale commented Feb 14, 2024

I am not able to get --watch to work, even though the console output makes it look like arq has restarted. Steps to reproduce are below; what am I missing?

How to reproduce?

I created an isolated Python project in a fresh conda environment (Python 3.11.7) and was able to reproduce it:

demo.py file (similar to the example in the arq docs)

import asyncio
from arq import create_pool
from arq.connections import RedisSettings

async def my_task(ctx):
    return "HELLO-1"   #  <--- try changing this string output when worker is running

async def main():
    redis = await create_pool(RedisSettings())
    await redis.enqueue_job('my_task')

class WorkerSettings:
    functions = [my_task]

if __name__ == '__main__':
    asyncio.run(main())

Worker running (from the arq_try project directory) using: arq demo.WorkerSettings --watch ./

(.conda) (base) ➜  arq_try arq demo.WorkerSettings --watch ./
09:08:06: Starting worker for 1 functions: my_task
09:08:06: redis_version=7.2.4 mem_usage=1.57M clients_connected=4 db_keys=35
09:08:11:   0.45s → 0c923d4152a0456693f3b11b0d08879b:my_task()
09:08:11:   0.00s ← 0c923d4152a0456693f3b11b0d08879b:my_task ● 'HELLO-1'  # <--- CORRECT
        ### now I edit demo.py and change the output string to HELLO-2  ####
files changed, reloading arq worker...
09:08:17: shutdown on SIGUSR1 ◆ 1 jobs complete ◆ 0 failed ◆ 0 retries ◆ 0 ongoing to cancel
09:08:17: Starting worker for 1 functions: my_task
09:08:17: redis_version=7.2.4 mem_usage=1.57M clients_connected=4 db_keys=36
09:08:20:   0.31s → 725b3dc60cd942feb5023aa2c90a5b27:my_task()
09:08:20:   0.00s ← 725b3dc60cd942feb5023aa2c90a5b27:my_task ● 'HELLO-1'  # <--- WRONG: THIS SHOULD BE HELLO-2 as the demo.py file was changed / modified
(.conda) (base) ➜  arq_try pip list
Package           Version
----------------- --------
anyio             4.2.0
arq               0.25.0
certifi           2024.2.2
click             8.1.7
h11               0.14.0
hiredis           2.3.2
httpcore          1.0.2
idna              3.6
pip               24.0
redis             5.0.1
setuptools        69.0.3
sniffio           1.3.0
typing_extensions 4.9.0
watchfiles        0.21.0
wheel             0.42.0

Project directory tree:

[screenshot of the project directory tree]

Any help or pointers appreciated!

@epicwhale (Author) commented

Anyone else facing this issue?

For now, as a temporary workaround, I have configured my development setup with a VS Code launch.json config that sets "autoReload": {"enable": true}.
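
For reference, a minimal sketch of what that workaround can look like: the launch configuration with "autoReload": {"enable": true} just needs a plain Python program to run, so a hypothetical run_worker.py entry script (assuming arq.worker.run_worker and the demo module from the reproduction above) can stand in for the arq CLI:

# run_worker.py -- hypothetical entry script for a VS Code launch configuration,
# so the debugger's autoReload workaround has a plain Python program to launch
from arq.worker import run_worker

from demo import WorkerSettings

if __name__ == '__main__':
    # roughly equivalent to `arq demo.WorkerSettings` on the command line
    run_worker(WorkerSettings)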

@vidjuheffex commented

Lucky you, I don't even get:

files changed, reloading arq worker...
09:08:17: shutdown on SIGUSR1 ◆ 1 jobs complete ◆ 0 failed ◆ 0 retries ◆ 0 ongoing to cancel

@owalerys commented

I also don't get the changes reflected in the worker even though it states it's been reloaded.

@owalerys commented

The problem is that the worker's source is never re-imported when a change is detected in the watched files: the worker is shut down and restarted inside the same process, so the already-imported modules (and their old function objects) keep being used.

See this line in arq/cli.py (line 77 at commit ec1532b):

await worker.close()

My solution is to use watchfiles directly:

watchfiles "arq worker.WorkerSettings" ./
