Part One: Single-machine setup
1. Run sudo gedit ~/.bashrc and add the JDK and Hadoop paths:

#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
export HADOOP_INSTALL=/home/sendi/hadoop-2.6.0
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
On Ubuntu, the JDK path can be obtained with the following command:
update-alternatives --config java
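update-alternatives --config java is interactive. As a non-interactive sketch (assuming java is a symlink managed through /etc/alternatives, as on a stock Ubuntu install), the JDK home can also be derived by resolving the java binary and stripping the trailing /bin/java:

```shell
# Resolve the real location of the java binary; strip /bin/java
# to get a JAVA_HOME candidate. Prints a fallback message if no
# JDK is installed.
JAVA_BIN="$(command -v java || true)"
if [ -n "$JAVA_BIN" ]; then
  readlink -f "$JAVA_BIN" | sed 's:/bin/java$::'
else
  echo "java not found on PATH"
fi
```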
2. Run the following command to make the changes take effect:
source ~/.bashrc
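A quick way to sanity-check that the exports took effect in the current shell (this assumes the PATH entries added to ~/.bashrc above) is to see whether the hadoop launcher now resolves:

```shell
# After sourcing ~/.bashrc, hadoop should be findable via $PATH.
if command -v hadoop >/dev/null 2>&1; then
  echo "hadoop found at: $(command -v hadoop)"
else
  echo "hadoop not on PATH - re-check the exports in ~/.bashrc"
fi
```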
3. Edit the hadoop-env.sh configuration:
sudo gedit /usr/local/hadoop/etc/hadoop/hadoop-env.sh
Find JAVA_HOME and change it to:
/usr/lib/jvm/java-1.7.0-openjdk-amd64
Then add the following line below it:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib:$HADOOP_PREFIX/lib/native"
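The JAVA_HOME edit can also be scripted with sed instead of gedit. The sketch below runs against a scratch copy so it is safe to try; on a real install you would point F at /usr/local/hadoop/etc/hadoop/hadoop-env.sh and run the sed command with sudo:

```shell
# Demonstrate the in-place edit on a throwaway file that mimics
# the stock hadoop-env.sh line.
F="$(mktemp)"
printf 'export JAVA_HOME=${JAVA_HOME}\n' > "$F"

# Replace the whole JAVA_HOME line with the concrete JDK path.
sed -i 's|^export JAVA_HOME=.*|export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64|' "$F"

RESULT="$(grep '^export JAVA_HOME' "$F")"
echo "$RESULT"
rm -f "$F"
```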
1. sudo gedit /usr/local/hadoop/etc/hadoop/core-site.xml

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/sendi/hadoop-2.6.0/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

2. sudo gedit /usr/local/hadoop/etc/hadoop/mapred-site.xml

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

3. sudo gedit /usr/local/hadoop/etc/hadoop/yarn-site.xml

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>

4. sudo gedit /usr/local/hadoop/etc/hadoop/hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/home/sendi/hadoop-2.6.0/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/sendi/hadoop-2.6.0/dfs/data</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
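The configs above point hadoop.tmp.dir, dfs.namenode.name.dir, and dfs.datanode.data.dir at directories under the install tree. Hadoop will generally create these on demand, but creating them up front (owned by the user who will run the daemons) avoids permission surprises. A sketch, assuming HADOOP_INSTALL matches the value exported in ~/.bashrc:

```shell
# Create the tmp and dfs directories referenced in core-site.xml
# and hdfs-site.xml. HADOOP_INSTALL falls back to the tutorial's
# path layout if the variable is not set.
HADOOP_INSTALL="${HADOOP_INSTALL:-$HOME/hadoop-2.6.0}"
mkdir -p "$HADOOP_INSTALL/tmp" \
         "$HADOOP_INSTALL/dfs/name" \
         "$HADOOP_INSTALL/dfs/data"
ls "$HADOOP_INSTALL/dfs"
```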
5. sudo gedit /usr/local/hadoop/etc/hadoop/masters
Add: localhost
6. sudo gedit /usr/local/hadoop/etc/hadoop/slaves
Add: localhost
7. Format the HDFS filesystem:
bin/hdfs namenode -format
8. Start the daemons; afterwards, check the result in the web UI (in Hadoop 2.x the defaults are http://localhost:50070 for the NameNode and http://localhost:8088 for the ResourceManager).

sbin/start-dfs.sh
sbin/start-yarn.sh

-------------------------------------------------------------------------------------------------------------------------------------------
sendi@sendijia:~/hadoop-2.6.0$ sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-namenode-sendijia.out
localhost: starting datanode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-datanode-sendijia.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-secondarynamenode-sendijia.out
sendi@sendijia:~/hadoop-2.6.0$ sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/sendi/hadoop-2.6.0/logs/yarn-sendi-resourcemanager-sendijia.out
localhost: starting nodemanager, logging to /home/sendi/hadoop-2.6.0/logs/yarn-sendi-nodemanager-sendijia.out
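Once both scripts have finished, a quick check with jps (which ships with the JDK) should show five daemons: NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager. A small sketch that reports each one:

```shell
# Report whether each expected daemon appears in the jps listing.
# Prints "NOT running" for any that are missing (or if jps itself
# is unavailable).
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  if jps 2>/dev/null | grep -qw "$d"; then
    echo "$d: running"
  else
    echo "$d: NOT running"
  fi
done
```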