Building Spark 1.2.0 with Hadoop 2.6.0 support
66号公路 · posted 3 years ago


To build Spark 1.2.0 against Hadoop 2.6.0, with YARN and Hive support, run:

mvn package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -DskipTests
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [  5.308 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 10.649 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  7.824 s]
[INFO] Spark Project Core ................................. SUCCESS [04:49 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 28.008 s]
[INFO] Spark Project GraphX ............................... SUCCESS [01:20 min]
[INFO] Spark Project Streaming ............................ SUCCESS [01:49 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [01:54 min]
[INFO] Spark Project SQL .................................. SUCCESS [02:07 min]
[INFO] Spark Project ML Library ........................... SUCCESS [02:30 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 17.481 s]
[INFO] Spark Project Hive ................................. SUCCESS [01:50 min]
[INFO] Spark Project REPL ................................. SUCCESS [01:15 min]
[INFO] Spark Project YARN Parent POM ...................... SUCCESS [ 17.009 s]
[INFO] Spark Project YARN Stable API ...................... SUCCESS [02:02 min]
[INFO] Spark Project Assembly ............................. SUCCESS [01:33 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 34.236 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 55.964 s]
[INFO] Spark Project External Flume ....................... SUCCESS [ 34.743 s]
[INFO] Spark Project External MQTT ........................ SUCCESS [ 32.961 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [ 32.377 s]
[INFO] Spark Project External Kafka ....................... SUCCESS [ 52.589 s]
[INFO] Spark Project Examples ............................. SUCCESS [03:13 min]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 16.538 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30:14 min
[INFO] Finished at: 2015-02-09T03:17:36+08:00
[INFO] Final Memory: 79M/738M
[INFO] ------------------------------------------------------------------------
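A note on the flags: in the Spark 1.2.x build, -Phadoop-2.4 is the profile used for Hadoop 2.4 and later, with -Dhadoop.version pinning the exact version (2.6.0 here). The Spark build documentation also recommends raising Maven's memory limits before building, and ships a make-distribution.sh script that produces a deployable tarball instead of bare jars. A sketch of both, mirroring the flags used above ("custom-spark" is just an illustrative name):

```shell
# Recommended in the Spark 1.2 build docs to avoid PermGen/heap failures
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Build a distributable .tgz; profile/flag set mirrors the mvn command above
./make-distribution.sh --name custom-spark --tgz \
  -Pyarn -Dyarn.version=2.6.0 \
  -Phadoop-2.4 -Dhadoop.version=2.6.0 \
  -Phive -DskipTests
```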
[hadoop@olinfa spark-1.2.0]$ sbin/start-master.sh 
starting org.apache.spark.deploy.master.Master, logging to /opt/bigdata/spark/spark-1.2.0/sbin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-olinfa.out
[hadoop@olinfa spark-1.2.0]$ cat /opt/bigdata/spark/spark-1.2.0/sbin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-olinfa.out
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/bin/java -cp ::/opt/bigdata/spark/spark-1.2.0/sbin/../conf:/opt/bigdata/spark/spark-1.2.0/assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.6.0.jar:/opt/bigdata/spark/spark-1.2.0/lib_managed/jars/datanucleus-core-3.2.10.jar:/opt/bigdata/spark/spark-1.2.0/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/opt/bigdata/spark/spark-1.2.0/lib_managed/jars/datanucleus-rdbms-3.2.9.jar -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip olinfa --port 7077 --webui-port 8080
========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/02/09 08:15:51 INFO Master: Registered signal handlers for [TERM, HUP, INT]
15/02/09 08:15:51 INFO SecurityManager: Changing view acls to: hadoop
15/02/09 08:15:51 INFO SecurityManager: Changing modify acls to: hadoop
15/02/09 08:15:51 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/02/09 08:15:52 INFO Slf4jLogger: Slf4jLogger started
15/02/09 08:15:52 INFO Remoting: Starting remoting
15/02/09 08:15:53 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@olinfa:7077]
15/02/09 08:15:53 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkMaster@olinfa:7077]
15/02/09 08:15:53 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
15/02/09 08:15:53 INFO Master: Starting Spark master at spark://olinfa:7077
15/02/09 08:15:53 INFO Utils: Successfully started service 'MasterUI' on port 8080.
15/02/09 08:15:53 INFO MasterWebUI: Started MasterWebUI at http://olinfa:8080
15/02/09 08:15:53 INFO Master: I have been elected leader! New state: ALIVE
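With the master alive at spark://olinfa:7077 (hostname and port taken from the log above), a quick way to verify the build is to attach workers and open a shell against it. A sketch, assuming the worker hosts are listed in conf/slaves:

```shell
# Start a worker on every host listed in conf/slaves, pointing at this master
sbin/start-slaves.sh

# Verify interactively: open a Spark shell connected to the new master;
# the worker and application should then appear in the web UI on port 8080
bin/spark-shell --master spark://olinfa:7077
```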



Tags: hadoop spark