1.) [dags] Alert_Mail.py - send an alert email using Airflow
- It can be used when we want to monitor a table value and notify people by email if it drops below or rises above an expected threshold.
- 3 Airflow components used in the script:
1.) MySqlHook: Link
2.) PythonOperator: Link
3.) send_email_smtp: Link
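The pattern above can be sketched as a small DAG: query the monitored value with MySqlHook, compare it against a threshold inside a PythonOperator callable, and mail out via send_email_smtp. This is a minimal sketch, not the script itself; the connection id, table, threshold, and recipients are illustrative placeholders.

```python
def breaches_threshold(value, threshold, direction="above"):
    """Return True when `value` crosses `threshold` in the given direction."""
    if direction == "above":
        return value > threshold
    return value < threshold

try:
    # Airflow wiring; requires an Airflow installation plus a configured
    # MySQL connection. "mysql_default", the table, the threshold, and the
    # recipient list below are assumptions, not values from the script.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.mysql.hooks.mysql import MySqlHook
    from airflow.utils.email import send_email_smtp

    def check_and_alert(**_):
        hook = MySqlHook(mysql_conn_id="mysql_default")
        value = hook.get_first("SELECT COUNT(*) FROM my_table")[0]
        if breaches_threshold(value, threshold=1000, direction="above"):
            send_email_smtp(
                to=["team@example.com"],
                subject="Alert: threshold breached",
                html_content=f"Monitored value {value} crossed the threshold.",
            )

    with DAG(
        dag_id="alert_mail",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ):
        PythonOperator(task_id="check_and_alert",
                       python_callable=check_and_alert)
except ImportError:
    pass  # Airflow not installed; the threshold helper above still works.
```

Keeping the comparison in a plain helper function makes the alert rule easy to unit-test without spinning up Airflow.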
2.) [operators] Athenna_To_Mysql.py - create a pipeline from Athena to MySQL
- This can be used as a customized Airflow operator to transfer data from Athena to MySQL
- External Libraries:
1.) upsert: Link
2.) create_engine: Link
3.) logging: Link
- AthennaToMySqlOperator class variables:
1.) sql: Query to fetch data from S3 using the Athena query engine (source data)
2.) mysql_table: Destination MySQL table name
3.) aws_conn_id: Airflow connection ID for AWS
4.) client_type: 's3'
5.) region_name: AWS region for the connection
6.) database: Destination database name
7.) if_row_exists: 'update' or 'replace'
8.) index_col_name: Index column name
- Sample DAG file using this operator: Link
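The class variables above can be sketched as a custom operator. This is an assumption-laden sketch: it takes "upsert" to be pangres's upsert helper and create_engine to be SQLAlchemy's, focuses on the load/upsert half (downloading the result CSV that Athena writes to S3, matching the 's3' client_type), and uses placeholder bucket/key and MySQL credentials.

```python
import io
import logging

def build_mysql_uri(user, password, host, port, database):
    """Assemble a SQLAlchemy connection URI for MySQL (pymysql driver assumed)."""
    return f"mysql+pymysql://{user}:{password}@{host}:{port}/{database}"

try:
    import pandas as pd
    from pangres import upsert                # assumed "upsert" library
    from sqlalchemy import create_engine
    from airflow.models import BaseOperator
    from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

    class AthennaToMySqlOperator(BaseOperator):
        """Fetch Athena query results from S3 and upsert them into MySQL."""

        def __init__(self, sql, mysql_table, aws_conn_id, client_type,
                     region_name, database, if_row_exists, index_col_name,
                     **kwargs):
            super().__init__(**kwargs)
            self.sql = sql
            self.mysql_table = mysql_table
            self.aws_conn_id = aws_conn_id
            self.client_type = client_type        # 's3'
            self.region_name = region_name
            self.database = database
            self.if_row_exists = if_row_exists    # 'update' or 'replace'
            self.index_col_name = index_col_name

        def execute(self, context):
            # The real operator runs self.sql through Athena first; here we
            # assume the result CSV already sits at a known S3 key
            # (bucket/key below are placeholders).
            s3 = AwsBaseHook(aws_conn_id=self.aws_conn_id,
                             client_type=self.client_type,
                             region_name=self.region_name).get_conn()
            obj = s3.get_object(Bucket="my-bucket", Key="athena/results.csv")
            df = pd.read_csv(io.BytesIO(obj["Body"].read()))
            df = df.set_index(self.index_col_name)
            engine = create_engine(
                build_mysql_uri("user", "password", "mysql-host", 3306,
                                self.database))
            logging.info("Upserting %d rows into %s", len(df), self.mysql_table)
            upsert(con=engine, df=df, table_name=self.mysql_table,
                   if_row_exists=self.if_row_exists)
except ImportError:
    pass  # Airflow/pangres not installed; the URI helper above still works.
```

In a real deployment the MySQL credentials would come from an Airflow connection rather than being hard-coded into the URI.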
3.) [operators] Mysql_query_to_Email.py - create a pipeline to extract data from MySQL and send it by email as an attachment
- This can be used as a customized Airflow operator to extract data from MySQL and email it as an attachment
- External Libraries:
1.) yagmail: Link
2.) logging: Link
- QueryToMail class variables:
1.) mysql_conn_id: Airflow connection ID for MySQL (Link)
2.) sql_query: Path to the query file used to fetch data from the MySQL DB
3.) archive_name: File name stored inside the zip archive
4.) file_name: Name of the zip file attached to the mail
5.) body: Email body to be sent
6.) subject: Email subject
7.) receiver: List of receivers' email IDs
- Sample DAG file using this operator: Link
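The flow implied by these class variables (query MySQL, zip the result, mail it with yagmail) can be sketched as follows. This is a sketch under assumptions: the hook method, sender credentials, and CSV formatting are illustrative, not taken from the repo.

```python
import io
import zipfile

def zip_payload(archive_name, text):
    """Return zip-file bytes containing `text` stored under `archive_name`."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(archive_name, text)
    return buf.getvalue()

try:
    import yagmail
    from airflow.models import BaseOperator
    from airflow.providers.mysql.hooks.mysql import MySqlHook

    class QueryToMail(BaseOperator):
        """Run a MySQL query and email the result as a zipped attachment."""

        def __init__(self, mysql_conn_id, sql_query, archive_name, file_name,
                     body, subject, receiver, **kwargs):
            super().__init__(**kwargs)
            self.mysql_conn_id = mysql_conn_id
            self.sql_query = sql_query          # path to a .sql file
            self.archive_name = archive_name    # name inside the zip
            self.file_name = file_name          # zip file attached to mail
            self.body = body
            self.subject = subject
            self.receiver = receiver            # list of email IDs

        def execute(self, context):
            hook = MySqlHook(mysql_conn_id=self.mysql_conn_id)
            with open(self.sql_query) as f:
                df = hook.get_pandas_df(f.read())
            with open(self.file_name, "wb") as f:
                f.write(zip_payload(self.archive_name,
                                    df.to_csv(index=False)))
            # Sender address and password are placeholders; yagmail also
            # supports keyring-stored credentials.
            yag = yagmail.SMTP("sender@example.com", "app-password")
            yag.send(to=self.receiver, subject=self.subject,
                     contents=self.body, attachments=[self.file_name])
except ImportError:
    pass  # Airflow/yagmail not installed; the zip helper above still works.
```

Building the zip in memory keeps the helper testable without touching the filesystem.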