
Airflow distributed deployment (5): auto-start on boot

Airflow runs as a set of long-lived daemons, and we want them to restart automatically after a failure. systemd can do this for us. First, create the following files:

/etc/sysconfig/airflow

AIRFLOW_CONFIG=/root/airflow/airflow.cfg
AIRFLOW_HOME=/root/airflow
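systemd's EnvironmentFile directive expects plain KEY=VALUE lines, so the same content should also source cleanly in a POSIX shell. A quick sanity check (using a temporary copy of the file for illustration):

```shell
# Write a temporary copy of the EnvironmentFile content and source it;
# if the file parses here, systemd will parse it too.
envfile=$(mktemp)
cat > "$envfile" <<'EOF'
AIRFLOW_CONFIG=/root/airflow/airflow.cfg
AIRFLOW_HOME=/root/airflow
EOF
set -a               # auto-export every variable assigned while sourcing
. "$envfile"
set +a
echo "$AIRFLOW_HOME"   # prints /root/airflow
rm -f "$envfile"
```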

/usr/lib/systemd/system/airflow-webserver.service

[Unit]
Description=Airflow webserver daemon

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=root
Group=root
Type=simple
ExecStart=/root/miniconda3/envs/py36/bin/airflow webserver
Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target

/usr/lib/systemd/system/airflow-flower.service

[Unit]
Description=Airflow flower daemon
[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=root
Group=root
Type=simple
ExecStart=/root/miniconda3/envs/py36/bin/airflow flower
Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target

/usr/lib/systemd/system/airflow-scheduler.service

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=redis.service

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=root
Group=root
Type=simple
ExecStart=/root/miniconda3/envs/py36/bin/airflow scheduler
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target

/usr/lib/systemd/system/airflow-worker.service

[Unit]
Description=Airflow celery worker daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=mysql.service redis.service

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=root
Group=root
Type=simple
ExecStart=/root/miniconda3/envs/py36/bin/airflow worker
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target
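With all four unit files in place, one possible way to enable and start everything in one pass is a small loop (unit names match the files created above; `enable --now` combines enable and start):

```shell
# Sketch: enable each Airflow service to start at boot and start it now.
# Requires root on a real host.
for svc in airflow-webserver airflow-flower airflow-scheduler airflow-worker; do
  sudo systemctl enable --now "${svc}.service"
done
```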

Once the files are in place, run the following commands:

# Reload all modified unit files
sudo systemctl daemon-reload
# Verify a unit file for errors; useful for diagnosing why a service fails to start
sudo systemd-analyze verify airflow-webserver.service
# View the logs of a service that failed to start
sudo journalctl -u airflow-webserver.service
# Check the service's status
systemctl status airflow-scheduler.service
# Check whether the service is enabled to start at boot, and enable it if not
sudo systemctl is-enabled airflow-worker.service
sudo systemctl enable airflow-worker.service
# Restart a service
systemctl stop airflow-scheduler.service
systemctl start airflow-scheduler.service
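The stop/start pair above can also be collapsed into a single command:

```shell
# Equivalent single-step restart of the scheduler
sudo systemctl restart airflow-scheduler.service
# List any units currently in a failed state
systemctl --failed
```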

Troubleshooting

FileNotFoundError: [Errno 2] No such file or directory: 'gunicorn'
This happens because the Python 3 environment's bin directory (which contains gunicorn) is not on the PATH that systemd gives the service. The fix is to find the Python 3 installation path and add its bin directory to PATH.
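One way to apply this fix is to add a PATH line to the EnvironmentFile, so every Airflow unit inherits it (the conda env path here is assumed from the ExecStart lines above; adjust it to your installation):

```shell
# Make the conda env's bin directory, which contains gunicorn, visible
# to systemd-launched services, then restart the affected unit.
echo 'PATH=/root/miniconda3/envs/py36/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin' \
  | sudo tee -a /etc/sysconfig/airflow
sudo systemctl daemon-reload
sudo systemctl restart airflow-webserver.service
```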