Compiling Hadoop 2.9.2 on Windows


System environment

System: Windows 10 10.0_x64
Maven: Apache Maven 3.6.0
JDK: jdk1.8.0_201
ProtocolBuffer: protoc-2.5.0
zlib: 1.2.3-lib
OpenSSL: 1.0.2r
CMake: 3.14.3-win64-x64
Cygwin: 2.897_x86_64
Visual Studio: Visual Studio 2010 Professional
Hadoop: hadoop-2.9.2

Build environment requirements from the Hadoop source package

Building on Windows

----------------------------------------------------------------------------------
Requirements:

* Windows System
* JDK 1.7 or 1.8
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer
* Windows SDK 7.1 or Visual Studio 2010 Professional
* Windows SDK 8.1 (if building CPU rate control for the container executor)
* zlib headers (if building native code bindings for zlib)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.
* Python ( for generation of docs using 'mvn site')

Unix command-line tools are also included with the Windows Git package which
can be downloaded from http://git-scm.com/downloads

If using Visual Studio, it must be Visual Studio 2010 Professional (not 2012).
Do not use Visual Studio Express.  It does not support compiling for 64-bit,
which is problematic if running a 64-bit system.  The Windows SDK 7.1 is free to
download here:

http://www.microsoft.com/en-us/download/details.aspx?id=8279

The Windows SDK 8.1 is available to download at:

http://msdn.microsoft.com/en-us/windows/bg162891.aspx

Cygwin is neither required nor supported.

Environment variables that must be set before building

  • 1. JAVA_HOME must be set and point to the JDK installation directory.
  • 2. Platform must also be set: Platform=Win32 on a 32-bit system, Platform=x64 on a 64-bit system.
  • 3. ZLIB_HOME is your zlib installation directory, e.g. ZLIB_HOME=C:\zlib-1.2.7 (see the sketch below).
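
As a minimal sketch, assuming the JDK and zlib locations shown below (adjust them to your own install paths), the three variables can be set for the current cmd session before starting the build:

rem Build environment variables for the current cmd session only
rem (the JDK and zlib paths are examples; point them at your own installs)
set "JAVA_HOME=C:\Java\jdk1.8.0_201"
rem Use Platform=Win32 on a 32-bit system
set "Platform=x64"
rem ZLIB_HOME must point at the directory that contains zlib.h
set "ZLIB_HOME=C:\zlib-1.2.7"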

Points to note before building

  • 1. Keep the path to the Hadoop source tree as short as possible, e.g. put it in the root of a drive.
  • 2. Keep the path to your local Maven repository as short as possible, e.g. the root of a drive (configure this in Maven's conf/settings.xml; see the sketch below).
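
As a sketch of one way to keep the repository path short (C:\m2 below is just a hypothetical location), the local repository can be overridden per build on the command line, or set permanently in the conf\settings.xml under your Maven installation:

rem One-off override of the Maven local repository for this build
mvn package -Pdist,native-win -DskipTests -Dtar -Dmaven.repo.local=C:\m2
rem Permanent alternative: in Maven's conf\settings.xml set
rem <localRepository>C:\m2</localRepository>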

Problems that may come up during the build

  • 1. Command execution failed. Cannot run program "msbuild" (in directory "C:\hadoop-2.9.2-src\hadoop-common-project\hadoop-common")

Solution: add C:\Windows\Microsoft.NET\Framework\v4.0.30319 to the PATH environment variable.
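
For the current cmd session this can be done as follows, assuming the standard .NET Framework 4 install location; where msbuild then confirms that msbuild.exe is resolvable:

rem Make msbuild.exe visible to the build in this cmd session
set "PATH=%PATH%;C:\Windows\Microsoft.NET\Framework\v4.0.30319"
rem Verify that msbuild can now be found on PATH
where msbuild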

  • 2.(compile-ms-winutils) on project hadoop-common: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]

Solution: this happens when Visual Studio is not installed or the wrong version is installed; installing the matching version fixes it.

  • 3.(compile-ms-native-dll) on project hadoop-common: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]

Solution: this is a zlib environment problem. Either ZLIB_HOME is not set, the zlib version you downloaded is wrong, the directory ZLIB_HOME points to does not contain zlib.h, or the files unistd.h and getopt.h are missing; the latter two can be downloaded from GitHub.
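
As a quick sanity check (the zlib path below is just an example), make sure ZLIB_HOME really points at a directory containing zlib.h:

rem ZLIB_HOME must contain zlib.h; "File Not Found" here means the
rem variable or the downloaded zlib package is wrong
set "ZLIB_HOME=C:\zlib-1.2.7"
dir "%ZLIB_HOME%\zlib.h"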

  • 4.(dist) on project hadoop-kms: An Ant BuildException has occured: exec returned: 2

Solution: this occurs when the Tomcat download performed while building the hadoop-kms module is interrupted by a network problem; once that happens, rebuilding hits the same error again. Deleting the source tree, downloading the source afresh and building again works (crude but effective).

Building

Download the Hadoop source for the version you need, extract it and place it in the root of a drive, then change into the source root and run mvn package -Pdist,native-win -DskipTests -Dtar. Maven then starts downloading the required dependencies from remote repositories; this can take a long time, depending heavily on your bandwidth and on whether you can reach the repositories directly. Various problems may appear along the way, but all of them were solvable for me. Among the Hadoop 2.x releases I built hadoop-2.7.7, hadoop-2.8.8 and hadoop-2.9.2 successfully, but building Hadoop 3.x failed; 3.x apparently has noticeably different environment requirements from 2.x, which I will look into later.
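
Concretely, assuming the source was extracted to C:\hadoop-2.9.2-src (the path is an example), the build is started like this:

rem Change to the source root and kick off the native Windows build
cd /d C:\hadoop-2.9.2-src
mvn package -Pdist,native-win -DskipTests -Dtar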

After a long wait, the long-awaited output finally appeared:

[INFO] Reactor Summary for Apache Hadoop Main 2.9.2:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  0.928 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.579 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.780 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.413 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.278 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.491 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.365 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  3.223 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.977 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  2.304 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:08 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  3.788 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  9.379 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.052 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 23.895 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 48.088 s]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 25.830 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 25.398 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.032 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  2.641 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 13.092 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.053 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.054 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 19.308 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 53.206 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  3.731 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.056 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 16.246 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 20.362 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  2.566 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [  6.348 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [  3.742 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 25.768 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  1.535 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  6.465 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  3.396 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  3.146 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [  3.784 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  4.994 s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [  1.584 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.336 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.052 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.535 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.052 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.050 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  7.322 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.128 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 32.088 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 19.070 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  2.889 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  8.415 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  5.543 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  4.486 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  1.283 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.521 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  2.765 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  3.277 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  3.957 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.462 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  1.628 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 14.629 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  2.915 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.689 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  1.238 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  1.851 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.052 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  3.016 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  6.852 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  4.349 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [  2.039 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  6.595 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.961 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  7.219 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [  3.102 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  5.460 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 14.486 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.075 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 51.638 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  0.581 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.055 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  11:01 min
[INFO] Finished at: 2019-05-08T11:42:52+08:00
[INFO] ------------------------------------------------------------------------

C:\Users\Andy\Downloads\hadoop-2.9.2-src>

All done. The compiled Hadoop distribution ends up under hadoop-2.9.2-src\hadoop-dist\target.

