Hadoop Installation (Part 1)

tech · 2024-07-06

1. Virtual machine configuration (not the focus)

Edit the network script:

    vi /etc/sysconfig/network-scripts/ifcfg-ens33

Change BOOTPROTO to static, set ONBOOT to yes, and set IPADDR=xxx. Press Esc to leave edit mode, then type :wq to save. Then restart the network and stop the firewall:

    systemctl restart network
    systemctl stop firewalld

2. JDK installation:
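For reference, a minimal static-IP ifcfg-ens33 might end up looking like the sketch below. The IP address reuses the 192.168.56.137 host that appears later in core-site.xml; the netmask, gateway, and DNS values are illustrative placeholders, not taken from the original notes:

```ini
TYPE=Ethernet
NAME=ens33
DEVICE=ens33
BOOTPROTO=static        # was dhcp; change to static
ONBOOT=yes              # bring the interface up at boot
IPADDR=192.168.56.137   # the host IP used later in core-site.xml
NETMASK=255.255.255.0   # placeholder, adjust to your network
GATEWAY=192.168.56.1    # placeholder
DNS1=192.168.56.1       # placeholder
```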

    systemctl status firewalld    # check firewall status
    systemctl disable firewalld   # disable the firewall at boot

Ctrl+L clears the screen.

    cd /opt

Upload the installation package, then:

    tar -zxvf jdk-8u221-linux-x64.tar.gz    # extract the JDK package
    cd jdk1.8.0_221
    pwd

    vi /etc/profile

Type 53 and press Enter to jump down to the export PATH… line, then add the following below it:

    export JAVA_HOME=/opt/jdk1.8.0_221
    export CLASSPATH=.:$JAVA_HOME/lib/rt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar
    export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH

Press Esc to leave edit mode, then type :wq to save.

    source /etc/profile

Finally, verify with java -version.

3. Import the Hadoop package

4. Begin configuration

① hadoop-env.sh
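The profile edit above can be rehearsed safely as a small shell experiment: it appends the three export lines to a temporary file instead of /etc/profile, sources it the same way `source /etc/profile` would, and checks that the variables took effect. The JDK path is the /opt/jdk1.8.0_221 used above; everything else is a stand-in:

```shell
# Write the JDK variables to a temporary profile fragment (stand-in for /etc/profile).
profile=$(mktemp)
cat >> "$profile" <<'EOF'
export JAVA_HOME=/opt/jdk1.8.0_221
export CLASSPATH=.:$JAVA_HOME/lib/rt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
EOF

# Source it, as `source /etc/profile` does, and confirm the variables are set.
. "$profile"
echo "JAVA_HOME=$JAVA_HOME"
rm -f "$profile"
```

On the real machine, `java -version` after sourcing is the final check that the PATH change actually resolves to the new JDK.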

    # The java implementation to use.
    export JAVA_HOME=/opt/greeinstall/jdk18111

② core-site.xml

    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.56.137:9000</value>
      </property>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/greeinstall/hadoop260/hadoop2</value>
      </property>
      <property>
        <name>hadoop.proxyuser.root.hosts</name>
        <value>*</value>
      </property>
      <property>
        <name>hadoop.proxyuser.root.groups</name>
        <value>*</value>
      </property>
    </configuration>
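Each of these *-site.xml files is just a list of name/value pairs, so one way to avoid hand-editing mistakes is to generate the property stanzas with a tiny helper. The `hprop` function below is my own sketch, not part of Hadoop; it emits one `<property>` stanza per call, here composing a core-site.xml body from the values used above:

```shell
# Emit one Hadoop <property> stanza for a given name/value pair.
hprop() {
  printf '  <property>\n    <name>%s</name>\n    <value>%s</value>\n  </property>\n' "$1" "$2"
}

# Compose a core-site.xml body from the values used in this tutorial.
{
  echo '<configuration>'
  hprop fs.defaultFS hdfs://192.168.56.137:9000
  hprop hadoop.tmp.dir /opt/greeinstall/hadoop260/hadoop2
  echo '</configuration>'
} > core-site.xml.generated

grep -c '<property>' core-site.xml.generated   # 2
```

The same helper works for the hdfs-site.xml, mapred-site.xml, and yarn-site.xml pairs that follow.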

③ hdfs-site.xml

    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
      <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>lijia1:50090</value>
      </property>
    </configuration>

mapred-site.xml

    <configuration>
      <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
      </property>
      <property>
        <name>mapreduce.jobhistory.address</name>
        <value>HostName:10020</value>
      </property>
      <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>HostName:19888</value>
      </property>
    </configuration>

④ yarn-site.xml

    <configuration>
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
      </property>
      <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
      </property>
      <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>HostName</value>
      </property>
      <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
      </property>
      <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>604800</value>
      </property>
    </configuration>

Edit the slaves file:

    vi ./slaves

and make sure it contains:

    localhost

⑤ Hadoop environment variable configuration

    vi /etc/profile

    export HADOOP_HOME=/opt/greeinstall/hadoop260
    export HADOOP_MAPRED_HOME=$HADOOP_HOME
    export HADOOP_COMMON_HOME=$HADOOP_HOME
    export HADOOP_HDFS_HOME=$HADOOP_HOME
    export YARN_HOME=$HADOOP_HOME
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

source /etc/profile
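The notes stop at sourcing the profile. On a single-node setup like this one, the usual next steps (not shown in the original) are to format the NameNode once and start the daemons; these are standard Hadoop 2.x commands, sketched here for completeness:

```shell
hdfs namenode -format   # run ONCE before first start; reformatting wipes HDFS metadata
start-dfs.sh            # starts NameNode, DataNode, SecondaryNameNode
start-yarn.sh           # starts ResourceManager, NodeManager
jps                     # verify the daemon processes are running
```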
