
Compiling Hadoop on Windows

janlle · Published 2019/05/08 17:54

Compiling Hadoop 2.9.2 on Windows

Environment

OS: Windows 10 10.0_x64
maven: Apache Maven 3.6.0
jdk: jdk_1.8.0_201
ProtocolBuffer: protoc-2.5.0
zlib: 1.2.3-lib
OpenSSL: 1_0_2r
cmake: 3.14.3-win64-x64
Cygwin: 2.897_x86_64
Visual Studio: Visual Studio 2010 Professional
hadoop: hadoop-2.9.2

Build environment requirements from the Hadoop source package (BUILDING.txt excerpt)

Building on Windows

----------------------------------------------------------------------------------
Requirements:

* Windows System
* JDK 1.7 or 1.8
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer
* Windows SDK 7.1 or Visual Studio 2010 Professional
* Windows SDK 8.1 (if building CPU rate control for the container executor)
* zlib headers (if building native code bindings for zlib)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.
* Python ( for generation of docs using 'mvn site')

Unix command-line tools are also included with the Windows Git package which
can be downloaded from http://git-scm.com/downloads

If using Visual Studio, it must be Visual Studio 2010 Professional (not 2012).
Do not use Visual Studio Express.  It does not support compiling for 64-bit,
which is problematic if running a 64-bit system.  The Windows SDK 7.1 is free to
download here:

http://www.microsoft.com/en-us/download/details.aspx?id=8279

The Windows SDK 8.1 is available to download at:

http://msdn.microsoft.com/en-us/windows/bg162891.aspx

Cygwin is neither required nor supported.

Environment variables that must be set for the build

  • 1. JAVA_HOME: must be set and point to the JDK installation directory.

  • 2. Platform: must also be set; on a 32-bit system use Platform=Win32, on a 64-bit system use Platform=x64 (note the capital P).

  • 3. ZLIB_HOME: the directory where zlib is unpacked, e.g. ZLIB_HOME=C:\zlib-1.2.7 (see the sketch after this list).
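A minimal sketch of setting these variables in a cmd session before invoking Maven; the install paths below are assumptions, adjust them to your own machine:

rem JAVA_HOME points at the JDK install directory (example path)
set "JAVA_HOME=C:\Java\jdk1.8.0_201"
rem Platform selects the native build target: x64 on 64-bit, Win32 on 32-bit
set "Platform=x64"
rem ZLIB_HOME points at the unpacked zlib sources/headers (example path)
set "ZLIB_HOME=C:\zlib-1.2.7"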

Points to keep in mind before building

  • 1. Keep the path to the Hadoop source tree as short as possible, e.g. put it in the root of a drive.

  • 2. Keep the path to your Maven local repository as short as possible too, e.g. the root of a drive (configure it in Maven's conf/settings.xml; see the sketch after this list).
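A sketch of the relevant settings.xml entry, assuming D:\m2repo as a (hypothetical) short repository path:

<!-- %MAVEN_HOME%\conf\settings.xml: point the local repository at a short path -->
<settings>
  <localRepository>D:\m2repo</localRepository>
</settings>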

Problems that may come up during the build

  • 1.Command execution failed. Cannot run program "msbuild" (in directory "C:\hadoop-2.9.2-src\hadoop-common-project\hadoop-common")

Fix: add C:\Windows\Microsoft.NET\Framework\v4.0.30319 (the directory that contains msbuild.exe) to the PATH environment variable.
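For example, for the current cmd session only (the .NET Framework path may differ on your machine):

rem Put the directory that contains msbuild.exe on PATH, then verify it resolves
set "PATH=%PATH%;C:\Windows\Microsoft.NET\Framework\v4.0.30319"
where msbuild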

  • 2.(compile-ms-winutils) on project hadoop-common: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]

Fix: this happens when Visual Studio is not installed or the wrong version is installed; switching to the matching version (Visual Studio 2010 Professional) resolves it.

  • 3.(compile-ms-native-dll) on project hadoop-common: Command execution failed. Process exited with an error: 1(Exit value: 1) -> [Help 1]

Fix: this is a zlib environment problem. Either ZLIB_HOME is not set, the zlib package you downloaded is the wrong one, the directory ZLIB_HOME points to does not contain zlib.h, or the files unistd.h and getopt.h are missing; those two files can be downloaded from GitHub. (A quick check follows below.)
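A quick sanity check from cmd, assuming ZLIB_HOME was set as shown earlier:

rem The native build expects zlib.h under %ZLIB_HOME%; this fails if the header is missing
dir "%ZLIB_HOME%\zlib.h"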

  • 4.(dist) on project hadoop-kms: An Ant BuildException has occured: exec returned: 2

Fix: while the hadoop-kms module is being built, the build downloads Tomcat; if the network drops during that download, the same error keeps reappearing even when you rebuild. Deleting the source tree, re-extracting a fresh copy of the source, and building again works (crude but effective).

Building

Download the Hadoop source for the version you want, unpack it into the root of a drive, change into the source root, and run mvn package -Pdist,native-win -DskipTests -Dtar. Maven then downloads all required dependencies from remote repositories, which can take quite a while depending on your bandwidth and on network restrictions. Various problems may show up along the way, but all of the ones I hit could be solved as described above. Among the Hadoop 2.x releases I have successfully built hadoop-2.7.7, hadoop-2.8.8, and hadoop-2.9.2 this way; building Hadoop 3.x fails, presumably because its environment requirements differ considerably from 2.x, which I will look into later.
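Putting the pieces together, a minimal build session might look like the sketch below (the source path is an example; run it from a shell where the environment variables above and the Visual Studio 2010 / Windows SDK toolchain are available):

rem Change into the source root (example path) and run the Windows native dist build
cd /d C:\hadoop-2.9.2-src
mvn package -Pdist,native-win -DskipTests -Dtar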

After a long wait, the long-awaited summary finally appears:

[INFO] Reactor Summary for Apache Hadoop Main 2.9.2:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  0.928 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.579 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.780 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.413 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.278 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.491 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.365 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  3.223 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.977 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  2.304 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:08 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  3.788 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  9.379 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.052 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 23.895 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 48.088 s]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 25.830 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 25.398 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.032 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  2.641 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 13.092 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.053 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.054 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 19.308 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 53.206 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  3.731 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.056 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 16.246 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 20.362 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  2.566 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [  6.348 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [  3.742 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 25.768 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  1.535 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  6.465 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  3.396 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  3.146 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [  3.784 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  4.994 s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [  1.584 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.336 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.052 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.535 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.052 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.050 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  7.322 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.128 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 32.088 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 19.070 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  2.889 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  8.415 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  5.543 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  4.486 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  1.283 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.521 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  2.765 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  3.277 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  3.957 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.462 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  1.628 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 14.629 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  2.915 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.689 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  1.238 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  1.851 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.052 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  3.016 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  6.852 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  4.349 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [  2.039 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  6.595 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.961 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  7.219 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [  3.102 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  5.460 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 14.486 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.075 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 51.638 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  0.581 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.055 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  11:01 min
[INFO] Finished at: 2019-05-08T11:42:52+08:00
[INFO] ------------------------------------------------------------------------

C:\Users\Andy\Downloads\hadoop-2.9.2-src>

All done. The compiled Hadoop distribution ends up in the hadoop-2.9.2-src\hadoop-dist\target directory.
