Simulating nginx log generation + Flume data collection + Kafka cluster
阿新 • Published 2019-02-13
Run a shell script to simulate nginx generating logs: nginx.sh
#!/bin/bash
# Append 500,001 simulated log lines to 1.log.
# Note: shell has no "+" string concatenation; the value of $i is
# simply expanded inside the line.
for ((i=0; i<=500000; i++)); do
  echo "i am lilei $i" >> 1.log
done
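Before running the full 500,000-line job, the generator logic can be sanity-checked at a smaller scale. This is a hypothetical dry run (the temp file and the count of 10 are assumptions for illustration, matching the intended `i am lilei <i>` line format):

```shell
# Small-scale dry run of the log generator: write 10 lines to a
# temporary file, then verify the line count and the first line.
logfile=$(mktemp)
for ((i=0; i<10; i++)); do
  echo "i am lilei $i" >> "$logfile"
done
wc -l < "$logfile"     # prints 10
head -n 1 "$logfile"   # prints: i am lilei 0
rm -f "$logfile"
```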
Run the Flume agent to collect the data.
Flume job file: exec.conf
a1.sources = r1
a1.channels = c1
a1.sinks = k1
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /root/1.log
a1.sources.r1.channels = c1
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity=100
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic=all1
a1.sinks.k1.brokerList=mini1:9092
a1.sinks.k1.requiredAcks=1
a1.sinks.k1.batchSize=20
a1.sinks.k1.channel=c1
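One caveat on the config above: the memory channel is fast but volatile, so events still buffered in the channel are lost if the agent process dies. If durability matters, Flume's file channel is the usual alternative; a sketch (the checkpoint and data paths here are hypothetical):

```
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /root/flume/checkpoint
a1.channels.c1.dataDirs = /root/flume/data
```

Either way, `transactionCapacity` must not exceed `capacity`, which the config above (100 vs. 10000) satisfies.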
Start the Flume agent:
bin/flume-ng agent -c conf -f conf/exec.conf -n a1 -Dflume.root.logger=INFO,console
The Kafka cluster receives the data.
Start Kafka:
bin/kafka-server-start.sh config/server.properties
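The topic `all1` referenced by the Kafka sink must exist before data can be consumed from it (unless the broker has automatic topic creation enabled). A sketch of creating it with the ZooKeeper-era CLI that matches the `--zookeeper` style used below (replication factor and partition count are assumptions):

```shell
# Create the "all1" topic on the cluster registered in ZooKeeper at mini1:2181.
bin/kafka-topics.sh --create --zookeeper mini1:2181 \
  --replication-factor 1 --partitions 1 --topic all1
```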
Consume from the topic to verify the data:
sh bin/kafka-console-consumer.sh --zookeeper mini1:2181 --from-beginning --topic all1