Keeping a Spark Streaming job alive: a scheduled task that monitors the process and restarts it if it dies
阿新 · Published 2018-11-14
Cron job: run a check script once a minute to see whether the process is still running. Add the entry with crontab -e:
*/1 * * * * /data/spark/test.sh
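Note that cron runs jobs with a minimal environment, so the script should rely on absolute paths (as it does below) and must be executable. Before trusting the schedule, it is worth making the script executable and running it once by hand, using the path from the crontab entry above:

chmod +x /data/spark/test.sh
/data/spark/test.sh    # run once manually and confirm it either detects the job or starts it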
The script checks for the process and, if it has died, resubmits the Spark job:
#!/bin/sh
# Count running instances of the streaming job (exclude the grep process itself)
is_Engine_exist=$(ps aux | grep LbsStreamingEngineTJ | grep -v grep | wc -l)
if [ "$is_Engine_exist" -eq 0 ]; then
    echo 'Process Engine is down'
    echo 'Bring Engine up'
    strDate=`date +%Y%m%d%H%M%S`
    strStart="start Engine ${strDate}"
    echo "${strStart}" >> /data1/log.txt
    # Resubmit the Spark Streaming job in the background
    nohup /data1/spark-1.6.0/bin/spark-submit --master spark://localhost:7077 \
        --name LbsStreamingEngineTJ \
        --class com.datafactory.streaming.LbsStreamingEngineTJ \
        --executor-memory 512m --total-executor-cores 2 \
        /data1/work/datafactory-0.1.0-SNAPSHOT1023.jar &
    echo 'Bring Engine finished'
else
    # Job is still alive; just record the check
    strDate=`date +%Y%m%d%H%M%S`
    strRun="running ${strDate}"
    echo "${strRun}" >> /data1/log.txt
fi
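With the one-minute schedule, /data1/log.txt grows by one line per check, which also serves as a simple heartbeat log. Based on the echo statements above, its contents look roughly like this (timestamps are illustrative):

running 20181114093000
running 20181114093100
start Engine 20181114093200
running 20181114093300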