FIX since Airflow 1.10.8 AIRFLOW__ variables are eligible to _CMD behaviour #503
base: master
Conversation
Force-pushed from ed9da04 to 2d7196f.
would this allow for a secret to be set like so?

```yaml
services:
  airflow:
    image: puckel/airflow
    secrets:
      - fernet-key
    environment:
      - AIRFLOW__CORE__FERNET_KEY=$(cat /run/secrets/fernet-key)
```
@dinigo yes, but you have to replace `AIRFLOW__CORE__FERNET_KEY` with `AIRFLOW__CORE__FERNET_KEY_CMD`, and use the correct volume mount to make sure the `/run/secrets/fernet-key` file contains the secret's value.
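For reference, the snippet above adjusted as suggested might look like this (a sketch; the top-level `secrets:` declaration is assumed, since only the service-level part appeared in the question):

```yaml
services:
  airflow:
    image: puckel/airflow
    secrets:
      - fernet-key
    environment:
      # the _CMD variable contains the command to run, not its output
      - AIRFLOW__CORE__FERNET_KEY_CMD=cat /run/secrets/fernet-key
secrets:
  fernet-key:
    external: true
```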
I'm rebasing your changes with my fork, thanks :)
I have problems using `AIRFLOW__CELERY__BROKER_URL_CMD` (no more entries in Google, just this topic). Is it correct to point this value to the whole string, like `pyamqp://airflow:airflow@rabbitmq:5672/airflow`, in the secret? `AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)`? It never gets the value. I'm also in this topic: #545
@aalemanq from the documentation of Airflow v1.10.10:
It could be that this repo is still on v1.10.9, I don't really know. But you can use the official image. Anyway, why do you want to store the broker URL as a Docker secret? Do you use a third-party managed Redis service which needs passwords? I don't see the point otherwise.
Hello, I have to secure passwords and not upload any password to git. I use RabbitMQ. I tried with 1.10.10 and it's the same. I don't know what happens with these secrets applied to Airflow. It can't be this complicated to pass secrets in the environment like other software, omg... :( And I don't know why the official doc just puts this info: https://airflow.readthedocs.io/en/stable/howto/set-config.html and it doesn't work. Can anybody pass secrets via environment variables in docker-compose to deploy in swarm????
If you want to get visibility I suggest you file an issue at apache/airflow repo or at StackOverflow (or why not, use both) |
@aalemanq
Note that not all variables are eligible for the `_CMD` behaviour (see the config documentation). In your case, the broker endpoint config is eligible. This should work with any recent Airflow version (I use 1.10.9), but be aware that the official apache/airflow Docker image, which is a backport from Airflow 2.0, has issues with its entrypoint that I fixed in puckel/docker-airflow. I need to take a look at the official image and propose a fix.
Thanks for your reply NBardelot! Your work here is awesome, many thanks. I understand, it's simple, no? I create my secret in docker-compose, it gets saved in swarm and mounted in the container on deploy: `AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)` (in Docker swarm this fails because you need to escape the `$` with another `$`). This does not work: the broker URL is empty, it falls back to the default (Redis), and I can't connect to my RabbitMQ using secrets. If you want, I can show you my whole workflow: applying secrets in docker-swarm, deploying, and the logs with the `_CMD` environment variables applied in docker-compose.yml. I tried with the last puckel version, 1.10.9, too, and the `_CMD` variable never gets the value :(. If I use normal environment variables without secrets, it works.
@aalemanq You do not need the `$(...)` shell substitution. You can just set `AIRFLOW__CELERY__BROKER_URL_CMD=cat /run/secrets/broker_url`: the `_CMD` variable contains the command itself, and it is evaluated to produce the value. In order to debug such a configuration you can exec into the container and run that command manually to check what it outputs.
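The distinction can be checked locally (a sketch; the path and value are illustrative, with a temp file standing in for the Docker secret):

```shell
# Simulate a Docker secret with a local file (illustrative path).
mkdir -p /tmp/fake-secrets
printf '%s' 'pyamqp://airflow:airflow@rabbitmq:5672/airflow' > /tmp/fake-secrets/broker_url

# The _CMD variable stores the *command*, not its output:
export AIRFLOW__CELERY__BROKER_URL_CMD='cat /tmp/fake-secrets/broker_url'

# Airflow (or a patched entrypoint) evaluates the command to get the value:
broker_url=$(eval "$AIRFLOW__CELERY__BROKER_URL_CMD")
echo "$broker_url"
```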
Well, I have found several errors that confuse me. Thank you very much for your interest and for the support, I will never tire of repeating it. As you say, I applied the environment variable without the `$(...)` substitution. Same error: the variable does not take effect at the start of Airflow.

Debug:

First of all, before executing /entrypoint.sh manually, I checked that the environment variables inside the container are correct. Then I restarted docker-compose with the plain command as the value of the broker_url `_CMD` env.

So far everything perfect, but... The next step was to run entrypoint.sh manually to simulate the start of puckel/docker-airflow:1.10.9. When I run this entrypoint, it never takes my `_CMD` variable for broker_url.

On the other hand, if I evaluate the command directly instead of going through entrypoint.sh, I get this: `amqp://airflow:**@rabbitmq:5672/airflow%0A` <---- what is this `%0A` at the end? And is it normal that inside the entrypoint of the 1.10.9 image there is no reference to the `_CMD` environment variables?

edit: I don't understand why the current entrypoint in 1.10.9 doesn't handle the `_CMD` environment variables. I get the same error as in my last post: Airflow gets the value but appends "%0A" at the end of the string O_O. But at least `eval` "works" and my environment variable exists, pointing to the secret. Apologies for my English and skill, I'm trying to be clear and not disturb you! Regards!
@aalemanq Pleasure to help, I've worked on this so if I can share... The `%0A` is a URL-encoded newline character; you can check this with any URL decoder. Be careful: when you create the secret you probably insert a newline at the end of the string without wanting one. You get the newline with `echo`, and avoid it with `echo -n`.
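A quick local check of the newline behaviour (file names are illustrative):

```shell
# echo appends a trailing newline; echo -n does not.
echo    'secret-value' > /tmp/with_newline     # 13 bytes (12 chars + '\n')
echo -n 'secret-value' > /tmp/without_newline  # 12 bytes

wc -c < /tmp/with_newline
wc -c < /tmp/without_newline
# A stray trailing newline in the secret ends up URL-encoded as %0A in the broker URL.
```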
It depends on the Airflow image. Are you using the image from this repo (puckel/docker-airflow)? The entrypoint patched by this PR handles the `_CMD` variables. But the official image's entrypoint does not. See the following issue and PR I'm currently proposing to Airflow:
Hello, I'm using puckel/docker-airflow:1.10.9, and if you grep for CMD in the entrypoint you can't see anything related to the `_CMD` environment variables. Is that normal? O_o I had to override this entrypoint with 0f5b8b1 in my compose file to stop Airflow replacing my broker_url with Redis. Do you think I'm on the right track? It works... About secrets...
It's ok!! O_O I was creating the secret with docker-compose (version 3.7), not by hand. Then I created the secret manually with `echo -n`, used it as external, and IT WORKS!!!! It freaking works! I can't believe it, one month wasted O_O. To resume:
In the `environment:` section I set the `_CMD` variable to `cat /run/secrets/broker_url`.
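Put together, the working setup described above would look roughly like this (image tag and secret name taken from the thread; the rest is a sketch):

```yaml
# First create the secret by hand, without a trailing newline:
#   echo -n 'pyamqp://airflow:airflow@rabbitmq:5672/airflow' | docker secret create broker_url -
version: "3.7"
services:
  airflow:
    image: puckel/docker-airflow:1.10.9
    secrets:
      - broker_url
    environment:
      # no $(...) here: the _CMD variable holds the command itself
      - AIRFLOW__CELERY__BROKER_URL_CMD=cat /run/secrets/broker_url
secrets:
  broker_url:
    external: true
```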
Really, I'm astonished. When you go to the official doc you read this: https://airflow.readthedocs.io/en/stable/howto/set-config.html and you cut your veins trying O_O. And if I use the official image... bye bye, it's worse: same entrypoint with no `_CMD` handling O_O. These guys put it in their official documentation but the entrypoint can't manage it?¿? I don't understand anything. Sorry for the last words talking about my life X) Regards NBardelot, you saved my day, my month, my mind and more things.
You'll need to rebuild the Docker image for puckel/docker-airflow with the commit I propose in this PR in order to make it work. That's the goal of the PR :) |
Thanks a lot! Do you think it is the same if I keep copying the entrypoint like I do now, or do I lose some features? Thanks! PS: thanks, I had never read about PRs :D
The previous commit concerns the same entrypoint script. So if you use 1.10.9 you can safely replace the whole script using the commit of this PR.
Thanks for staying here ;) Does anybody know if `AIRFLOW__CELERY__FLOWER_BASIC_AUTH_CMD` is implemented? I can't see any trace of this environment variable in the entrypoint. Regards!!
It is not yet implemented. That's why I propose this PR with a commit to implement it. |
Diff excerpt under review:

```shell
          AIRFLOW__CORE__LOAD_EXAMPLES \

# Setup the Fernet Key only if the user didn't provide it explicitly as an Airflow configuration
if [[ -z "$AIRFLOW__CORE__FERNET_KEY" && -z "$AIRFLOW__CORE__FERNET_KEY_CMD" ]]; then
```
This can cause the user to lose data if they forget to keep the Fernet key. For Airflow 1.10, the fernet key is optional. For Airflow 2.0, it is required.
See pull request on the official Airflow repository: apache/airflow#6801
It was integrated in 1.10.8. The following variables are concerned by this change in the image entrypoint:

- the `fernet_key` configuration can now be managed with `AIRFLOW__CORE__FERNET_KEY_CMD` as well as the usual `AIRFLOW__CORE__FERNET_KEY`
- the `sql_alchemy_conn` configuration can now be managed with `AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD` as well as the usual `AIRFLOW__CORE__SQL_ALCHEMY_CONN`
- the `broker_url` configuration can now be managed with `AIRFLOW__CELERY__BROKER_URL_CMD` as well as the usual `AIRFLOW__CELERY__BROKER_URL`
This PR takes this change into account and fixes errors where the `REDIS_*`, `POSTGRES_*`, and Fernet key variables used in the entrypoint were not computed correctly.
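A minimal sketch of the kind of logic involved (not the PR's exact code; `resolve_airflow_var` is a hypothetical helper): the entrypoint must honor the direct variable when present, and otherwise evaluate its `_CMD` counterpart to obtain the value.

```shell
# Hypothetical helper: return a config value set either directly or via _CMD.
resolve_airflow_var() {
  # $1 is the variable name, e.g. AIRFLOW__CELERY__BROKER_URL
  local direct cmd
  eval "direct=\${$1:-}"
  eval "cmd=\${${1}_CMD:-}"
  if [ -n "$direct" ]; then
    printf '%s' "$direct"      # direct value wins when both are set
  elif [ -n "$cmd" ]; then
    eval "$cmd"                # the _CMD variable holds a command; run it
  fi
}

# Example: only the _CMD form is set.
AIRFLOW__CELERY__BROKER_URL_CMD='echo -n pyamqp://user:pass@rabbitmq:5672/vhost'
broker_url=$(resolve_airflow_var AIRFLOW__CELERY__BROKER_URL)
```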