Hadoop Installation On Linux
Find the location of the Java compiler:
which javac
Resolve the symlink to get the full path:
readlink -f /usr/bin/javac
Copy the portion of the path before /bin/javac; this is the value you will use for JAVA_HOME.
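The two commands above can be combined into a short shell sketch that derives JAVA_HOME automatically (the fallback path is an assumption for illustration, not part of the original guide):

```shell
# Resolve the real javac location behind the /usr/bin symlink.
JAVAC_PATH=$(readlink -f /usr/bin/javac 2>/dev/null)
# Fallback for illustration only -- adjust for your distribution.
: "${JAVAC_PATH:=/usr/lib/jvm/java-11-openjdk-amd64/bin/javac}"
# Strip the trailing /bin/javac to get the JDK root directory.
JAVA_HOME=${JAVAC_PATH%/bin/javac}
echo "JAVA_HOME=$JAVA_HOME"
```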
4. Use the cat command to append the public key to authorized_keys in the .ssh directory:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
5. Set the permissions for the file with the chmod command:
chmod 0600 ~/.ssh/authorized_keys
6. Verify that you can SSH to localhost without a password:
ssh localhost
Next, edit the following configuration files:
.bashrc
hadoop-env.sh
core-site.xml
hdfs-site.xml
mapred-site.xml
yarn-site.xml
##.bashrc
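The entries to append to ~/.bashrc typically look like the following sketch; the JAVA_HOME and HADOOP_HOME paths are assumptions and must match your own Java and Hadoop installation locations:

```shell
# Hadoop environment variables appended to ~/.bashrc
# (paths are assumptions -- adjust JAVA_HOME and HADOOP_HOME to your system)
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export HADOOP_HOME=/home/amit/hadoop-3.3.0
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
```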
Save the file.
Run the following command in the terminal to make the new environment variables visible:
source ~/.bashrc
##hadoop-env.sh
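The main change needed in hadoop-env.sh is setting JAVA_HOME explicitly for the Hadoop daemons; the path below is an assumption (substitute the value found with readlink earlier):

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh, set JAVA_HOME explicitly
# (assumed OpenJDK path -- use the path found via readlink -f above).
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
```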
##core-site.xml
Note: fs.default.name is deprecated in recent Hadoop releases; fs.defaultFS is the current property name.
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
Create directories for the namenode and datanode, and add their locations to hdfs-site.xml:
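The directories can be created in one step; the paths below match those used in hdfs-site.xml (adjust them if your Hadoop home differs):

```shell
# Create the storage directories referenced by dfs.namenode.name.dir
# and dfs.datanode.data.dir in hdfs-site.xml
mkdir -p /home/amit/hadoop-3.3.0/data/namenode
mkdir -p /home/amit/hadoop-3.3.0/data/datanode
```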
##hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/home/amit/hadoop-3.3.0/data/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/amit/hadoop-3.3.0/data/datanode</value>
  </property>
</configuration>
##mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
##yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>127.0.0.1</value>
  </property>
  <property>
    <name>yarn.acl.enable</name>
    <value>0</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>