
Compiling Hadoop 2.7.5 on CentOS 6.8

Test environment: a VMware Workstation Pro virtual machine running 64-bit CentOS 6.8.
Both hadoop 2.7.5 and hadoop-2.6.5 were compiled successfully in this environment; for other environments or versions, adapt the steps accordingly.
This article walks through the process of compiling hadoop 2.7.5.


1. Pre-build preparation
Download hadoop-2.7.5-src.tar.gz (http://mirrors.hust.edu.cn/apache/hadoop/common/hadoop-2.7.5/hadoop-2.7.5-src.tar.gz)
Extract the Hadoop source and read the build guide carefully:
[[email protected] soft]# tar -zxvf hadoop-2.7.5-src.tar.gz
[[email protected] hadoop-2.7.5-src]# less BUILDING.txt
Download and prepare the following packages:
jdk-7u45-linux-x64.tar.gz
apache-ant-1.9.9-bin.tar.gz
apache-maven-3.0.5-bin.tar.gz
findbugs-3.0.1.tar.gz
hadoop-2.7.5-src.tar.gz
protobuf-2.5.0.tar.gz


2. Install the JDK
Build requirement: JDK 1.7+
Package installed: jdk-7u80-linux-x64.tar.gz
Set the Maven heap size
To avoid a java.lang.OutOfMemoryError: Java heap space during compilation, run the following command on the CentOS system before building:
$ export MAVEN_OPTS="-Xms256m -Xmx512m"


tar -zxvf jdk-7u80-linux-x64.tar.gz -C /usr/local
Edit the environment configuration file (/etc/profile) and add:
export JAVA_HOME=/usr/local/jdk1.7.0_80
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar  
export PATH=$PATH:$JAVA_HOME/bin
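After saving the profile, the new variables can be loaded and checked. A minimal verification sketch, assuming the JDK was extracted to the path above:
source /etc/profile
java -version      # should report java version "1.7.0_80"
echo $JAVA_HOME    # should print /usr/local/jdk1.7.0_80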


3. Install Maven
Build requirement: Maven 3.0 or later
Package installed: apache-maven-3.0.5-bin.tar.gz
[[email protected] soft]# tar -zxvf apache-maven-3.0.5-bin.tar.gz -C /opt
Edit the environment configuration file (/etc/profile) and add:
export M2_HOME=/opt/apache-maven-3.0.5
export PATH=$PATH:$M2_HOME/bin
Verify:
[[email protected] ~]# mvn -v
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 21:51:28+0800)
Maven home: /opt/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/local/jdk1.7.0_45/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-642.el6.x86_64", arch: "amd64", family: "unix"


4. Install FindBugs
Build requirement: FindBugs 1.3.9
Package installed: findbugs-3.0.1.tar.gz
[[email protected] soft]# tar -zxvf findbugs-3.0.1.tar.gz -C /opt
Edit the configuration file:
[[email protected] ~]# vim /etc/profile
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export PATH=$PATH:$FINDBUGS_HOME/bin
Verify:
[[email protected] ~]# findbugs -version
3.0.1
5. Install dependency packages
Install the dependency packages listed in the build guide BUILDING.txt:
[[email protected] protobuf-2.5.0]# yum -y install gcc-c++ build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-devua
[[email protected] protobuf-2.5.0]# rpm -q gcc-c++  build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-devua
gcc-c++-4.4.7-17.el6.x86_64
package build-essential is not installed
autoconf-2.63-5.1.el6.noarch
automake-1.11.1-4.el6.noarch
libtool-2.2.6-15.5.el6.x86_64
cmake-2.8.12.2-4.el6.x86_64
package zlib1g-dev is not installed
package pkg-config is not installed
package libssl-devua is not installed
Several of those names (build-essential, zlib1g-dev, pkg-config, libssl-dev) are Debian/Ubuntu package names and do not exist in the CentOS repositories, as the rpm query above shows, so install the CentOS equivalents instead:
[[email protected] protobuf-2.5.0]# yum -y install svn zlib-devel  pkgconfig openssl-devel 


6. Install Protocol Buffers
Build requirement: ProtocolBuffer 2.5.0
Package installed: protobuf-2.5.0.tar.gz; other versions are not recommended, since the Hadoop build expects exactly 2.5.0.


[[email protected] soft]# tar -zxvf protobuf-2.5.0.tar.gz 
[[email protected] protobuf-2.5.0]# ./configure
[[email protected] protobuf-2.5.0]# make
[[email protected] protobuf-2.5.0]# make check
[[email protected] protobuf-2.5.0]# make install
Verify:
[[email protected] protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
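If protoc instead fails with an error loading shared libraries (typically libprotoc.so.8), the dynamic linker cache has not picked up /usr/local/lib yet. A minimal fix sketch, assuming the default install prefix; the file name protobuf.conf is chosen here only for illustration:
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
ldconfig
protoc --version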


7. Edit Maven's configuration file and add a download mirror
[[email protected] hadoop-2.7.5-src]# cd /opt/apache-maven-3.0.5/conf/
[[email protected] conf]# vim settings.xml 
Add the alimaven mirror inside the <mirrors> element. Make sure it is placed outside the commented-out example block, otherwise it has no effect:
<mirrors>
    <!-- mirror
     | Specifies a repository mirror site to use instead of a given repository. The repository that
     | this mirror serves has an ID that matches the mirrorOf element of this mirror. IDs are used
     | for inheritance and direct lookup purposes, and must be unique across the set of mirrors.
     |
        <mirror>
        <id>mirrorId</id>
        <mirrorOf>repositoryId</mirrorOf>
        <name>Human Readable Name for this Mirror.</name>
        <url>http://my.repository.com/repo/path</url>
        </mirror>
     -->
    <mirror>
        <id>alimaven</id>
        <mirrorOf>central</mirrorOf>
        <name>aliyun maven</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    </mirror>
</mirrors>


8. Build Hadoop
First resolve and install all modules without running tests, then build the binary distribution: -Pdist,native produces the dist packaging with the native libraries, -DskipTests skips the test suites, and -Dtar produces a tar.gz archive.
[[email protected] hadoop-2.7.5-src]$ mvn clean install -DskipTests
[[email protected] hadoop-2.7.5-src]$ mvn package -Pdist,native -DskipTests -Dtar
main:
     [exec] $ tar cf hadoop-2.7.5.tar hadoop-2.7.5
     [exec] $ gzip -f hadoop-2.7.5.tar
     [exec] 
     [exec] Hadoop dist tar available at: /yumserver/soft/hadoop-2.7.5-src/hadoop-dist/target/hadoop-2.7.5.tar.gz
     [exec] 
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /yumserver/soft/hadoop-2.7.5-src/hadoop-dist/target/hadoop-dist-2.7.5-javadoc.jar
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [16.893s]
[INFO] Apache Hadoop Build Tools ......................... SUCCESS [3.408s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [20.671s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.603s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.473s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [12.758s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [9.410s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [9.622s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [14.434s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [9.450s]
[INFO] Apache Hadoop Common .............................. SUCCESS [3:27.845s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [14.742s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [31.051s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.127s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:07.461s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [50.218s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:07.357s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [8.005s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.165s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.150s]
[INFO] hadoop-yarn-api ................................... SUCCESS [3:18.694s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:01.248s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.108s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [28.937s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [47.973s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [11.405s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [30.311s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [43.604s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [17.566s]
[INFO] hadoop-yarn-client ................................ SUCCESS [25.410s]
[INFO] hadoop-yarn-server-sharedcachemanager ............. SUCCESS [10.557s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.135s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.765s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.621s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.062s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [11.675s]
[INFO] hadoop-yarn-project ............................... SUCCESS [10.093s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.433s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [52.280s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [47.925s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [8.781s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [27.220s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [13.362s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [14.073s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [6.330s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.081s]
[INFO] hadoop-mapreduce .................................. SUCCESS [5.841s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [9.484s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [25.651s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [6.627s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [27.802s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [15.242s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.808s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [6.653s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [6.814s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [12.114s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [12.788s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [10.906s]
[INFO] Apache Hadoop Azure support ....................... SUCCESS [9.376s]
[INFO] Apache Hadoop Client .............................. SUCCESS [15.800s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [1.680s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [17.601s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [17.073s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.088s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:28.085s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 29:47.577s
[INFO] Finished at: Tue Jan 02 10:51:32 CST 2018
[INFO] Final Memory: 95M/415M
[INFO] ------------------------------------------------------------------------


This output indicates a successful build; the distribution is at hadoop-2.7.5-src/hadoop-dist/target/hadoop-2.7.5.tar.gz
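To confirm that the native libraries really were compiled in, the freshly built tarball can be unpacked and checked with hadoop checknative. A quick sketch, assuming extraction to /opt:
tar -zxvf hadoop-dist/target/hadoop-2.7.5.tar.gz -C /opt
/opt/hadoop-2.7.5/bin/hadoop checknative -a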




9. Error messages
If the following error appears, it is most likely caused by insufficient memory; increasing the virtual machine's memory resolves the problem.


The system is out of resources.
Consult the following stack trace for details.
java.lang.OutOfMemoryError: Java heap space
at com.sun.tools.javac.util.Position$LineMapImpl.build(Position.java:153)
at com.sun.tools.javac.util.Position.makeLineMap(Position.java:77)
at com.sun.tools.javac.parser.Scanner.getLineMap(Scanner.java:1147)
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-hdfs: Compilation failure
[ERROR] An unknown compilation problem occurred
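Besides enlarging the virtual machine's memory, giving Maven a larger heap before re-running the build usually helps as well. A sketch; the 1 GB figure is an assumption, adjust it to the memory actually available:
export MAVEN_OPTS="-Xms512m -Xmx1024m"
mvn package -Pdist,native -DskipTests -Dtar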
