Installation problems with the document "Quickly Get to Know Flume: Installing and Using Flume 1.5 to Transfer Data (Logs) to Hadoop 2.2"
1. log4j warnings on startup, then nothing happens
log4j:WARN No appenders could be found for logger (org.apache.flume.lifecycle.LifecycleSupervisor).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Because I had set the environment variables, flume-ng could be launched from any directory. I eventually found http://stackoverflow.com/questions/12280403/flume-running-failed-in-linux, where the symptom is exactly the same as mine. I did pass -c conf, though, so my guess is that the files under conf/ simply weren't found relative to where I launched the command; running it from inside the Flume install directory works fine.
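For reference, this is roughly how I launch the agent now, from inside the Flume install directory so that the relative -c conf path resolves (a minimal sketch: FLUME_HOME, the agent name agent1, and the config file name conf/flume-hdfs.conf are placeholders of mine, not from the original tutorial):

# run from the Flume install directory so "-c conf" points at the real conf/ folder
cd $FLUME_HOME
bin/flume-ng agent -n agent1 -c conf -f conf/flume-hdfs.conf -Dflume.root.logger=INFO,console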
2. [ERROR - org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:207)] Failed to start agent because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:214)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:165)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:267)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 15 more
Searching online I found this article: http://cache.baiducontent.com/c?m=9f65cb4a8c8507ed4fece76310528c3e4a1fc2307c8c894f68d4e419ce3b4655023ba3ed2876435f8d922a7001de0f01fdf04733715060f18cc8f91988ecce6e38885664274dd50653840eafba10728377cc0cbef348bcedb12593df97849f0344ca225927c6e78b2b54498d33a6033192fdc55f152e41e4be7124bd0a3173882230a1478e&p=9772c54ad0c913e70be296385a00&newp=8f3d8416d9c159b10cbd9b78074492695803ed603cd6d50d6180&user=baidu&fm=sc&query=Failed+to+start+agent+because+dependencies+were+not+found+in+classpath%2E&qid=&p1=2
It says some jar files are missing, but not how to fix it. Looking further up the same page, there are instructions:
cd ~
wget http://mirror.symnds.com/software/Apache/hadoop/common/hadoop-1.0.4/hadoop-1.0.4-bin.tar.gz
tar xvzf hadoop-1.0.4-bin.tar.gz
rm hadoop-1.0.4-bin.tar.gz
cp ~/hadoop-1.0.4/hadoop-core-1.0.4.jar ~/apache-flume-1.3.1-bin/lib/
cp ~/hadoop-1.0.4/lib/commons-configuration-1.6.jar ~/apache-flume-1.3.1-bin/lib/
cp ~/hadoop-1.0.4/lib/commons-httpclient-3.0.1.jar ~/apache-flume-1.3.1-bin/lib/
cp ~/hadoop-1.0.4/lib/jets3t-0.6.1.jar ~/apache-flume-1.3.1-bin/lib/
cp ~/hadoop-1.0.4/lib/commons-codec-1.4.jar ~/apache-flume-1.3.1-bin/lib/
That shows the general idea. My Hadoop is version 2.3.0, so instead I went to %HADOOP_HOME%/share/hadoop/ (under the Hadoop install directory), took the jars from each of the subdirectories there, and added the jars under common/lib as well; for the other subdirectories you don't need to go down a second level. See the sketch below. This part may not be described very clearly, so feel free to leave a comment. I didn't delete the jars one by one to work out exactly which ones are required; if you know, please leave me a comment, thanks.
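A minimal sketch of what I did, assuming HADOOP_HOME points at the Hadoop 2.x install directory and FLUME_HOME at the Flume install directory (the exact list of subdirectories below is my assumption, and as noted above I haven't narrowed down the minimal set of jars):

# jars from the first level of each share/hadoop subdirectory
for d in common hdfs mapreduce yarn tools; do
  cp $HADOOP_HOME/share/hadoop/$d/*.jar $FLUME_HOME/lib/ 2>/dev/null
done
# plus the third-party jars under common/lib (the only lib/ you need to descend into)
cp $HADOOP_HOME/share/hadoop/common/lib/*.jar $FLUME_HOME/lib/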
3. 2014-06-05 15:29:26,005 INFO [lifecycleSupervisor-1-1] instrumentation.MonitoredCounterGroup (MonitoredCounterGroup.java:start(95)) - Component type: SINK, name: sink1 started
2014-06-05 15:29:26,006 ERROR [SinkRunner-PollingRunner-DefaultSinkProcessor] flume.SinkRunner (SinkRunner.java:run(160)) - Unable to deliver event. Exception follows.
java.lang.IllegalStateException: Channel closed [channel=channel1]. Due to java.io.FileNotFoundException: /home/hadoop/aboutyun_tmp123/in_use.lock (Permission denied)
This is a permissions problem; the exact fix depends on your setup. In my case I had created those directories as root, so changing the owner and group fixed it.
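Something along these lines is what I mean (a minimal sketch: the path comes from the exception above, and I'm assuming the agent runs as the hadoop user; substitute whichever user actually runs your agent):

# give the Flume channel directory back to the user that runs the agent
chown -R hadoop:hadoop /home/hadoop/aboutyun_tmp123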
To wrap up: a lot of people online follow the document "Quickly Get to Know Flume: Installing and Using Flume 1.5 to Transfer Data (Logs) to Hadoop 2.2". I never got it to work as written, I don't know about the rest of you, and it seems to leave out many of the problems above. I hope this is helpful.