How to install a Hadoop 2.7.3 cluster on CentOS 7.3
#############################
#ENV
#spark01 192.168.51.6
#spark02 192.168.51.18
#spark03 192.168.51.19
#spark04 192.168.51.21
#spark05 192.168.51.24
############################

##Raise the open-file and process limits on every node
echo "ulimit -SHn 204800" >> /etc/rc.local
echo "ulimit -SHu 204800" >> /etc/rc.local
cat >> /etc/security/limits.conf << EOF
* soft nofile 204800
* hard nofile 204800
* soft nproc 204800
* hard nproc 204800
EOF

##Disable IPv6, turn off swapping, and disable transparent hugepage defrag on every node
echo 'net.ipv6.conf.all.disable_ipv6 = 1' >> /etc/sysctl.conf
echo 'net.ipv6.conf.default.disable_ipv6 = 1' >> /etc/sysctl.conf
echo 'vm.swappiness = 0' >> /etc/sysctl.conf
sysctl -p
echo 'echo never > /sys/kernel/mm/transparent_hugepage/defrag' >> /etc/rc.local
chmod +x /etc/rc.d/rc.local

#1)Edit the /etc/hosts file on every node
cat > /etc/hosts << EOF
127.0.0.1 localhost
192.168.51.6 spark01
192.168.51.18 spark02
192.168.51.19 spark03
192.168.51.21 spark04
192.168.51.24 spark05
EOF

#2)Install the JDK on every node
wget http://god.nongdingbang.net/downloads/auto_jdk.sh
sh auto_jdk.sh

#3)Create the hadoop user on every node
groupadd hadoop -g 700
useradd hadoop -g hadoop -u 700
echo "hadoop123" | passwd --stdin hadoop
echo 'hadoop ALL=(ALL) NOPASSWD: ALL' >> /etc/sudoers

#4)Give the hadoop user ownership of /opt on every node
chown -R hadoop:hadoop /opt/

#5)Set up key-based (passwordless) login
#Run these commands only on spark01
su - hadoop
ssh-keygen
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@192.168.51.18
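
The ssh-copy-id above reaches only one node. For the Hadoop start scripts to work, the hadoop user on spark01 must be able to log in to every node, including spark01 itself, without a password. A minimal sketch using the hostnames from the /etc/hosts file above (each ssh-copy-id run prompts once for the hadoop password on the target node):

#Still as the hadoop user on spark01
for node in spark01 spark02 spark03 spark04 spark05; do
  ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@$node
done
#Verify: BatchMode makes ssh fail instead of prompting, so every node
#should print its hostname without asking for a password
for node in spark01 spark02 spark03 spark04 spark05; do
  ssh -o BatchMode=yes hadoop@$node hostname
done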
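
Before moving on to the Hadoop installation itself, it is worth confirming that the prerequisites took effect everywhere. A quick sanity-check sketch, assuming the passwordless login from the previous step is in place and that auto_jdk.sh puts java on the default PATH (pam_limits applies limits.conf to ssh sessions, so ulimit should already report the new value):

for node in spark01 spark02 spark03 spark04 spark05; do
  echo "== $node =="
  #JDK installed by auto_jdk.sh?
  ssh hadoop@$node 'java -version' 2>&1 | head -n 1
  #Open-file limit raised? (should print 204800)
  ssh hadoop@$node 'ulimit -n'
  #IPv6 disabled? (should print 1 after sysctl -p)
  ssh hadoop@$node 'cat /proc/sys/net/ipv6/conf/all/disable_ipv6'
done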