
Flume Case Study: Multi-Source Aggregation


1) Case requirements:

flume-1 on hadoop103 monitors the file /opt/module/group.log.

flume-2 on hadoop104 monitors the data stream on a port.

flume-1 and flume-2 both send their data to flume-3 on hadoop102, and flume-3 prints the merged data to the console.

2) Requirement analysis:

flume-1 and flume-2 each use an avro sink pointing at hadoop102:4141; flume-3 exposes an avro source on that port, so the two upstream streams fan in to a single agent whose logger sink writes to the console.
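The fan-in pattern behind this analysis can be sketched with plain Python sockets. This is an illustration of the topology only, not Flume's avro protocol; the `aggregator`/`push` helpers and local addresses are hypothetical stand-ins:

```python
import socket
import threading

def aggregator(expected, host="127.0.0.1", port=0):
    """Stand-in for flume-3: accept `expected` upstream connections
    and collect every line they push. Returns (port, lines, thread)."""
    server = socket.create_server((host, port))
    actual_port = server.getsockname()[1]
    lines = []

    def serve():
        for _ in range(expected):
            conn, _addr = server.accept()
            with conn:
                # Read until the upstream sender closes its connection.
                lines.extend(conn.makefile().read().splitlines())
        server.close()

    worker = threading.Thread(target=serve)
    worker.start()
    return actual_port, lines, worker

def push(port, message, host="127.0.0.1"):
    """Stand-in for the avro sink on flume-1 / flume-2: open a
    connection and send one line downstream."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall((message + "\n").encode())

if __name__ == "__main__":
    port, lines, worker = aggregator(expected=2)
    push(port, "hello from flume-1")  # the file-tailing agent on hadoop103
    push(port, "hello from flume-2")  # the netcat agent on hadoop104
    worker.join()
    print(lines)  # both upstream events land in one place
```

The point of the sketch is only that many senders converge on one listener; Flume adds channels, transactions, and the avro wire format on top of this idea.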

3) Implementation steps:

0. Preparation

Distribute flume to the other hosts:

[jason@hadoop102 module]$ xsync flume

Create a group2 directory under /opt/module/flume/job on hadoop102, hadoop103, and hadoop104:

[jason@hadoop102 job]$ mkdir group2
[jason@hadoop103 job]$ mkdir group2
[jason@hadoop104 job]$ mkdir group2

1. Create flume1.conf

Configure a source to monitor the group.log file and a sink to forward the data to the next-hop Flume agent.

Create and open the configuration file on hadoop103:

[jason@hadoop103 group2]$ touch flume1.conf


[jason@hadoop103 group2]$ vim flume1.conf

Add the following content:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /opt/module/group.log
a1.sources.r1.shell = /bin/bash -c

# Describe the sink
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = hadoop102
a1.sinks.k1.port = 4141

# Describe the channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
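For intuition, the exec source above simply runs the configured command through `/bin/bash -c` and consumes its stdout line by line. A rough Python equivalent of that behaviour (a sketch for illustration with a hypothetical `follow` helper, not Flume's actual implementation):

```python
import subprocess

def follow(command, shell="/bin/bash"):
    """Run a command the way the exec source does (<shell> -c <command>)
    and yield each line it writes to stdout."""
    proc = subprocess.Popen(
        [shell, "-c", command],
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        for line in proc.stdout:
            yield line.rstrip("\n")
    finally:
        # Stop the child if the consumer abandons the stream.
        proc.terminate()

# e.g. for event in follow("tail -F /opt/module/group.log"): ...
```

Because `tail -F` never exits, the source keeps producing events as long as the agent runs; note that an exec source offers no delivery guarantee if the agent dies.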

2. Create flume2.conf

Configure a source to monitor the data stream on port 44444 and a sink to forward the data to the next-hop Flume agent.

Create and open the configuration file on hadoop104:

[jason@hadoop104 group2]$ touch flume2.conf


[jason@hadoop104 group2]$ vim flume2.conf

Add the following content:

# Name the components on this agent
a2.sources = r1
a2.sinks = k1
a2.channels = c1

# Describe/configure the source
a2.sources.r1.type = netcat
a2.sources.r1.bind = hadoop104
a2.sources.r1.port = 44444

# Describe the sink
a2.sinks.k1.type = avro
a2.sinks.k1.hostname = hadoop102
a2.sinks.k1.port = 4141

# Use a channel which buffers events in memory
a2.channels.c1.type = memory
a2.channels.c1.capacity = 1000
a2.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a2.sources.r1.channels = c1
a2.sinks.k1.channel = c1

3. Create flume3.conf

Configure a source to receive the data streams sent by flume1 and flume2, and a logger sink that prints the merged data to the console.

Create and open the configuration file on hadoop102:

[jason@hadoop102 group2]$ touch flume3.conf


[jason@hadoop102 group2]$ vim flume3.conf

Add the following content:

# Name the components on this agent
a3.sources = r1
a3.sinks = k1
a3.channels = c1

# Describe/configure the source
a3.sources.r1.type = avro
a3.sources.r1.bind = hadoop102
a3.sources.r1.port = 4141

# Describe the sink
a3.sinks.k1.type = logger

# Describe the channel
a3.channels.c1.type = memory
a3.channels.c1.capacity = 1000
a3.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a3.sources.r1.channels = c1
a3.sinks.k1.channel = c1

4. Run the configuration files

Start the agents in this order: flume3.conf first (its avro source must be listening before the upstream avro sinks connect), then flume2.conf and flume1.conf.

[jason@hadoop102 flume]$ bin/flume-ng agent --conf conf/ --name a3 --conf-file job/group2/flume3.conf -Dflume.root.logger=INFO,console
[jason@hadoop104 flume]$ bin/flume-ng agent --conf conf/ --name a2 --conf-file job/group2/flume2.conf
[jason@hadoop103 flume]$ bin/flume-ng agent --conf conf/ --name a1 --conf-file job/group2/flume1.conf

5. On hadoop103, append content to group.log in the /opt/module directory

[jason@hadoop103 module]$ echo 'hello' >> group.log

6. On hadoop104, send data to port 44444

[jason@hadoop104 flume]$ telnet hadoop104 44444
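As a scripted alternative to the interactive telnet session, a few lines of Python can push an event to the netcat source. The netcat source normally acknowledges each event with an "OK" line; the `send_event` helper here is illustrative, not part of Flume:

```python
import socket

def send_event(message, host="hadoop104", port=44444):
    """Push one newline-terminated event to a netcat source and
    return its one-line acknowledgement (normally "OK")."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall((message + "\n").encode())
        return conn.makefile().readline().strip()

# e.g. send_event("hello from a script")
```

This is convenient for generating test traffic in a loop, where telnet's interactive prompt gets in the way.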

7. Check the data on hadoop102: both the appended file content and the telnet input should appear in flume-3's console output.