
Compiling Hadoop from Source

I. Motivation

Every hadoop command run from the prebuilt binaries downloaded from the website prints the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

A quick search turns up the usual explanation: the native library bundled with the Apache hadoop release is 32-bit, so it fails to load on a 64-bit server, and the fix is to compile a 64-bit version yourself.
A second reason: with open-source software, building locally is inevitable sooner or later, so it pays to get the compilation path working now; after that the source code can be modified with confidence.
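The 32-bit claim is easy to verify on your own machine: the fifth byte of an ELF header records the word size. A minimal sketch in shell (the library path in the comment is an assumption; point it at wherever you unpacked the tarball):

```shell
# Print 32 or 64 depending on the ELF class byte (offset 4) of a binary.
elf_bits() {
  case "$(od -An -tx1 -j4 -N1 "$1" | tr -d ' ')" in
    01) echo 32 ;;
    02) echo 64 ;;
    *)  echo unknown ;;
  esac
}

# Hypothetical path -- substitute the native library shipped with your tarball:
# elf_bits "$HADOOP_HOME/lib/native/libhadoop.so.1.0.0"
```

If this prints 32 on a 64-bit server, the warning above is expected and recompiling will help.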

II. Software to Prepare

1. JDK and GCC

[hadoop@hdm01 ~]$ java -version
java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)

[hadoop@hdm01 ~]$ gcc -v
Using built-in specs.
Target: x86_64-redhat-linux
Thread model: posix
gcc version 4.4.7 20120313 (Red Hat 4.4.7-11) (GCC)
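Before going any further it is worth confirming that every build tool is actually on PATH. A small preflight sketch (the tool list simply mirrors the installation steps below):

```shell
# Report any build prerequisite missing from PATH; prints nothing when all are found.
check_tools() {
  missing=""
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || missing="$missing $t"
  done
  [ -z "$missing" ] || echo "missing:$missing"
}

# check_tools java javac gcc mvn protoc cmake
```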

2. Install Maven

Download links and install instructions are easy to find online. Add the following two lines to your environment (e.g. ~/.bashrc):

export MAVEN_HOME=/usr/local/apache-maven-3.2.5
export PATH=$PATH:$MAVEN_HOME/bin

Check that mvn is installed correctly:

[hadoop@hdm01 ~]$ mvn --version
Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-15T01:29:23+08:00)
Maven home: /usr/local/apache-maven-3.2.5
Java version: 1.6.0_31, vendor: Sun Microsystems Inc.
Java home: /usr/local/jdk1.6.0_31/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.8.13-44.1.1.el6uek.x86_64", arch: "amd64", family: "unix"

3. Install protobuf

Download: wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
Unpack: tar -zxvf protobuf-2.5.0.tar.gz
Enter the protobuf-2.5.0 directory and run the following (make install usually needs root):

./configure
make
make check
make install

Check that it installed correctly:

[hadoop@hdm01 ~]$ protoc --version
libprotoc 2.5.0
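The protobuf version matters more than it looks: the Hadoop 2.5.x build expects protoc 2.5.0 exactly and aborts on a mismatch. A small guard, assuming `protoc --version` prints `libprotoc <version>` as shown above:

```shell
# Succeeds only when the given `protoc --version` output reports exactly 2.5.0.
protoc_ok() {
  test "${1#libprotoc }" = "2.5.0"
}

# if protoc_ok "$(protoc --version)"; then echo "protoc version OK"; fi
```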

4. Install CMake

Download: wget http://www.cmake.org/files/v2.8/cmake-2.8.12.2.tar.gz
Unpack: tar -zxvf cmake-2.8.12.2.tar.gz
Enter the cmake-2.8.12.2 directory and run:

./bootstrap
make
make install

Check that it installed correctly:

[hadoop@hdm01 ~]$ cmake --version
cmake version 2.8.12.2

5. Install the autotools
yum install autoconf automake libtool
6. Install openssl-devel
yum install openssl-devel
7. FindBugs may also be needed
Some reports say the build can require FindBugs; if so, download it, unpack it, and set the environment variables accordingly.
http://sourceforge.jp/projects/sfnet_findbugs/downloads/findbugs/3.0.0/findbugs-3.0.0-dev-20131204-e3cbbd5.tar.gz/
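If you do set up FindBugs, the usual configuration mirrors the Maven one above (the install path is an assumption; use wherever you unpacked the tarball):

```shell
# Hypothetical install location -- adjust to your unpack directory.
export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
export PATH=$PATH:$FINDBUGS_HOME/bin
```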

III. Compiling Hadoop

1. Download the source tarball hadoop-2.5.2-src.tar.gz and unpack it.
2. Enter the hadoop-2.5.2-src directory and run:

mvn clean package -Pdist,native -DskipTests -Dtar

Then comes a long wait. When you see a series of SUCCESS lines like the following, the build has succeeded:

[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /tmp/hadoop-2.5.2-src/hadoop-dist/target/hadoop-dist-2.5.2-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main …………………………… SUCCESS [ 5.674 s]
[INFO] Apache Hadoop Project POM …………………….. SUCCESS [ 2.778 s]
[INFO] Apache Hadoop Annotations …………………….. SUCCESS [ 8.117 s]
[INFO] Apache Hadoop Assemblies ……………………… SUCCESS [ 0.530 s]
[INFO] Apache Hadoop Project Dist POM ………………… SUCCESS [ 3.700 s]
[INFO] Apache Hadoop Maven Plugins …………………… SUCCESS [ 9.369 s]
[INFO] Apache Hadoop MiniKDC ………………………… SUCCESS [ 8.499 s]
[INFO] Apache Hadoop Auth …………………………… SUCCESS [ 10.335 s]
[INFO] Apache Hadoop Auth Examples …………………… SUCCESS [ 5.028 s]
[INFO] Apache Hadoop Common …………………………. SUCCESS [03:19 min]
[INFO] Apache Hadoop NFS ……………………………. SUCCESS [ 17.550 s]
[INFO] Apache Hadoop Common Project ………………….. SUCCESS [ 0.172 s]
[INFO] Apache Hadoop HDFS …………………………… SUCCESS [05:07 min]
[INFO] Apache Hadoop HttpFS …………………………. SUCCESS [ 35.578 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ………….. SUCCESS [01:37 min]
[INFO] Apache Hadoop HDFS-NFS ……………………….. SUCCESS [ 9.484 s]
[INFO] Apache Hadoop HDFS Project ……………………. SUCCESS [ 0.087 s]
[INFO] hadoop-yarn …………………………………. SUCCESS [ 0.058 s]
[INFO] hadoop-yarn-api ……………………………… SUCCESS [02:14 min]
[INFO] hadoop-yarn-common …………………………… SUCCESS [01:26 min]
[INFO] hadoop-yarn-server …………………………… SUCCESS [ 0.060 s]
[INFO] hadoop-yarn-server-common …………………….. SUCCESS [ 34.984 s]
[INFO] hadoop-yarn-server-nodemanager ………………… SUCCESS [01:36 min]
[INFO] hadoop-yarn-server-web-proxy ………………….. SUCCESS [ 5.434 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ……. SUCCESS [ 11.998 s]
[INFO] hadoop-yarn-server-resourcemanager …………….. SUCCESS [ 33.747 s]
[INFO] hadoop-yarn-server-tests ……………………… SUCCESS [ 1.623 s]
[INFO] hadoop-yarn-client …………………………… SUCCESS [ 11.093 s]
[INFO] hadoop-yarn-applications ……………………… SUCCESS [ 0.076 s]
[INFO] hadoop-yarn-applications-distributedshell ………. SUCCESS [ 5.470 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ….. SUCCESS [ 4.313 s]
[INFO] hadoop-yarn-site …………………………….. SUCCESS [ 0.172 s]
[INFO] hadoop-yarn-project ………………………….. SUCCESS [ 13.395 s]
[INFO] hadoop-mapreduce-client ………………………. SUCCESS [ 0.190 s]
[INFO] hadoop-mapreduce-client-core ………………….. SUCCESS [ 45.441 s]
[INFO] hadoop-mapreduce-client-common ………………… SUCCESS [ 36.735 s]
[INFO] hadoop-mapreduce-client-shuffle ……………….. SUCCESS [ 9.616 s]
[INFO] hadoop-mapreduce-client-app …………………… SUCCESS [ 20.500 s]
[INFO] hadoop-mapreduce-client-hs ……………………. SUCCESS [ 22.697 s]
[INFO] hadoop-mapreduce-client-jobclient ……………… SUCCESS [ 22.552 s]
[INFO] hadoop-mapreduce-client-hs-plugins …………….. SUCCESS [ 4.556 s]
[INFO] Apache Hadoop MapReduce Examples ………………. SUCCESS [ 12.547 s]
[INFO] hadoop-mapreduce …………………………….. SUCCESS [ 8.447 s]
[INFO] Apache Hadoop MapReduce Streaming ……………… SUCCESS [ 18.952 s]
[INFO] Apache Hadoop Distributed Copy ………………… SUCCESS [ 31.502 s]
[INFO] Apache Hadoop Archives ……………………….. SUCCESS [ 3.998 s]
[INFO] Apache Hadoop Rumen ………………………….. SUCCESS [ 12.100 s]
[INFO] Apache Hadoop Gridmix ………………………… SUCCESS [ 9.217 s]
[INFO] Apache Hadoop Data Join ………………………. SUCCESS [ 5.091 s]
[INFO] Apache Hadoop Extras …………………………. SUCCESS [ 6.771 s]
[INFO] Apache Hadoop Pipes ………………………….. SUCCESS [ 14.090 s]
[INFO] Apache Hadoop OpenStack support ……………….. SUCCESS [ 10.116 s]
[INFO] Apache Hadoop Client …………………………. SUCCESS [ 15.771 s]
[INFO] Apache Hadoop Mini-Cluster ……………………. SUCCESS [ 0.451 s]
[INFO] Apache Hadoop Scheduler Load Simulator …………. SUCCESS [ 16.809 s]
[INFO] Apache Hadoop Tools Dist ……………………… SUCCESS [ 11.497 s]
[INFO] Apache Hadoop Tools ………………………….. SUCCESS [ 0.071 s]
[INFO] Apache Hadoop Distribution ……………………. SUCCESS [ 58.960 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 26:12 min
[INFO] Finished at: 2015-03-17T15:28:21+08:00
[INFO] Final Memory: 108M/241M
[INFO] ------------------------------------------------------------------------
[root@hdm01 hadoop-2.5.2-src]#

3. The compiled hadoop-2.5.2.tar.gz ends up in the hadoop-dist/target/ directory under hadoop-2.5.2-src; it is now ready to install.
4. If the build fails partway, the error message usually names a missing package or tool, which you can track down and install yourself.

IV. Additional Notes

1. To get more detailed errors from any hadoop command, enable debug console logging:

export HADOOP_ROOT_LOGGER=DEBUG,console

2. Even after compiling, the JAVA_LIBRARY_PATH environment variable still needs to be set;
otherwise the same warning appears. With HADOOP_ROOT_LOGGER set to DEBUG as above, the underlying error becomes visible:


15/03/17 16:32:13 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library…
15/03/17 16:32:13 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
15/03/17 16:32:13 DEBUG util.NativeCodeLoader: java.library.path=/hadoop-2.5.2/lib
15/03/17 16:32:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
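Given that log, the library path simply does not contain the freshly built native libraries. One common fix is to point the JVM at them in $HADOOP_HOME/etc/hadoop/hadoop-env.sh (HADOOP_HOME standing in for your actual install root, which is an assumption here):

```shell
# Add to hadoop-env.sh or your login shell profile.
# HADOOP_HOME is an assumed install root; substitute your own path.
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

After this, rerunning a hadoop command with DEBUG logging should show the native library loading successfully instead of the warning.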
