Building a Hadoop Cluster on CentOS 7 with Docker
Overall workflow:
Pull the CentOS 7 image
Add SSH to the CentOS 7 image
Assign container IPs with pipework
Add Java and Hadoop to the image
Configure Hadoop
1. Pull the CentOS 7 image
$ docker pull centos:7
//list the Docker images already downloaded
$ docker image ls -a
2. Write the Dockerfile
$ vim Dockerfile
Add the following content:
FROM centos:7
MAINTAINER dys

# install sshd and sudo, and disable PAM so sshd can run inside the container
RUN yum install -y openssh-server sudo
RUN sed -i 's/UsePAM yes/UsePAM no/g' /etc/ssh/sshd_config
RUN yum install -y openssh-clients

# set the root password and allow root to use sudo
RUN echo "root:0225" | chpasswd
RUN echo "root ALL=(ALL) ALL" >> /etc/sudoers

# generate host keys non-interactively (-N '' avoids a passphrase prompt that would hang the build)
RUN ssh-keygen -t dsa -f /etc/ssh/ssh_host_dsa_key -N ''
RUN ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key -N ''

RUN mkdir /var/run/sshd
EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
Build the image, naming it centos-ssh:
$ docker build -t centos-ssh .
$ docker images
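Before moving on to networking, it may be worth a quick smoke test that a container built from this image actually runs sshd (a throwaway container; the name ssh-test is arbitrary):
$ docker run -d --name ssh-test centos-ssh
//docker top should show /usr/sbin/sshd -D as the container's main process
$ docker top ssh-test
$ docker rm -f ssh-test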
3. Assign container IPs with pipework
- First, install the git client
$ yum -y install git
$ git --version
- git clone pipework
$ git clone https://github.com/jpetazzo/pipework
$ cd pipework
$ cp pipework /usr/local/bin
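A small sanity check that the script landed on the PATH and is executable (the chmod is usually redundant after cp, but harmless):
$ chmod +x /usr/local/bin/pipework
$ which pipework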
- Install bridge-utils
$ yum -y install bridge-utils
//create the bridge
$ brctl addbr br1
$ ip link set dev br1 up
$ ip addr add 192.168.3.1/24 dev br1
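Note that a bridge created this way does not survive a host reboot. A minimal sketch of a persistent definition, assuming the stock CentOS 7 network-scripts are in use (the ifcfg-br1 file name follows that convention):
//assumes the default CentOS 7 network-scripts; adjust if NetworkManager manages this host
$ cat > /etc/sysconfig/network-scripts/ifcfg-br1 <<'EOF'
DEVICE=br1
TYPE=Bridge
ONBOOT=yes
BOOTPROTO=static
IPADDR=192.168.3.1
NETMASK=255.255.255.0
EOF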
//start a container from the centos-ssh image
$ docker run -d --name=centos7.ssh1 centos-ssh
//assign an IP to it
$ pipework br1 centos7.ssh1 192.168.3.20/24
$ ping 192.168.3.20
//log in as root, with the password (0225) set in the Dockerfile
$ ssh root@192.168.3.20
//create two more containers the same way; that gives us our three servers
$ docker run -d --name=centos7.ssh2 centos-ssh
$ docker run -d --name=centos7.ssh3 centos-ssh
$ pipework br1 centos7.ssh2 192.168.3.22/24
$ pipework br1 centos7.ssh3 192.168.3.23/24
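A quick loop to confirm all three containers answer on the bridge (the addresses are the ones assigned above):
$ for ip in 192.168.3.20 192.168.3.22 192.168.3.23; do ping -c 1 $ip; done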
4. Add Java and Hadoop to the image
Prerequisite: put jdk-8u101-linux-x64.tar.gz and hadoop-2.7.3.tar.gz in the same directory as the Dockerfile.
$ vim Dockerfile
Add the following content:
FROM centos-ssh

# unpack the JDK (ADD auto-extracts tar.gz archives) and put it on the PATH
ADD jdk-8u101-linux-x64.tar.gz /usr/local/
RUN mv /usr/local/jdk1.8.0_101 /usr/local/jdk1.8
ENV JAVA_HOME /usr/local/jdk1.8
ENV PATH $JAVA_HOME/bin:$PATH

# unpack Hadoop and put it on the PATH
ADD hadoop-2.7.3.tar.gz /usr/local
RUN mv /usr/local/hadoop-2.7.3 /usr/local/hadoop
ENV HADOOP_HOME /usr/local/hadoop
ENV PATH $HADOOP_HOME/bin:$PATH

# Hadoop's scripts call which, which the base image lacks
RUN yum install -y which sudo
$ docker build -t hadoop .
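A quick check that Java and Hadoop are really on the PATH in the new image, using throwaway containers:
$ docker run --rm hadoop java -version
$ docker run --rm hadoop hadoop version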
//when hadoop0 starts, ports 50070 (HDFS NameNode web UI) and 8088 (YARN ResourceManager web UI) are published so the Hadoop web interfaces can be opened in a browser on the host
$ docker run --name hadoop0 --hostname hadoop0 -d -P -p 50070:50070 -p 8088:8088 hadoop
$ docker run --name hadoop1 --hostname hadoop1 -d -P hadoop
$ docker run --name hadoop2 --hostname hadoop2 -d -P hadoop
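To see which host ports were actually published for hadoop0 (including the ones -P chose at random):
$ docker port hadoop0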
//assign the cluster IPs
$ pipework br1 hadoop0 192.168.3.30/24
$ pipework br1 hadoop1 192.168.3.31/24
$ pipework br1 hadoop2 192.168.3.32/24
//attach a shell to each container
$ docker exec -it hadoop0 /bin/bash
$ docker exec -it hadoop1 /bin/bash
$ docker exec -it hadoop2 /bin/bash
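Inside each container you can confirm the hostname and the pipework-assigned address (pipework adds the address on eth1 by default; that interface name, and iproute being present in the image, are assumptions here):
//eth1 is pipework's default guest interface name
$ hostname
$ ip addr show eth1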
//edit /etc/hosts inside each container, adding:
192.168.3.30 master
192.168.3.31 slave1
192.168.3.32 slave2
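One step the hosts file alone does not cover: Hadoop's start scripts need passwordless SSH from the master to every node. A minimal sketch, run inside hadoop0 and assuming the root password (0225) set in the ssh Dockerfile:
//generate a key without a passphrase, then push it to all three nodes
$ ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
$ ssh-copy-id root@master
$ ssh-copy-id root@slave1
$ ssh-copy-id root@slave2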