/usr/spark/sbin/start-all.sh fails to start Spark: how do I fix it?

Posted by 打杂uu on 2015/09/24 14:49

@eagleonline Hi, I'd like to ask you about a problem:

/usr/spark/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/spark/sbin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-master.out
failed to launch org.apache.spark.deploy.master.Master:
  Failed to find Spark assembly in /usr/spark/assembly/target/scala-2.10.
  You need to build Spark before running this program.
full log in /usr/spark/sbin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-master.out
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /usr/spark/sbin/../logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave1: failed to launch org.apache.spark.deploy.worker.Worker:
slave1:   JAVA_HOME is not set
slave1: full log in /usr/spark/sbin/../logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-slave1.out
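
For what it's worth, the log points at two separate things to check: whether an assembly jar actually exists on the master, and whether JAVA_HOME is visible to a non-interactive shell on slave1. A minimal check, with the hostname slave1 and the paths taken straight from the log above:

# On the master: the launcher expects an assembly jar here (only present after a build, or in a prebuilt package)
ls /usr/spark/assembly/target/scala-2.10/

# On slave1: start-all.sh launches workers over ssh, so JAVA_HOME must be set for non-interactive shells
ssh slave1 'echo $JAVA_HOME; java -version'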


打杂uu:
Hadoop 2.2 is not compatible with Spark 1.5. According to the official site, the Spark version compatible with Hadoop 2.2 is 1.0.2.
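
If you go the matching-version route, the easiest path is usually a prebuilt Spark package for your Hadoop line rather than a source tarball. A rough sketch; the archive name and URL below are assumptions, so verify them against the Spark downloads/archive page before using them:

# Download a prebuilt Spark release (example name; check the archive page for the exact file)
wget https://archive.apache.org/dist/spark/spark-1.0.2/spark-1.0.2-bin-hadoop2.tgz
tar -xzf spark-1.0.2-bin-hadoop2.tgz
# Point SPARK_HOME (or your /usr/spark path) at the extracted directory
sudo mv spark-1.0.2-bin-hadoop2 /usr/spark-1.0.2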
何其苦也:
It looks like there is more than one problem here. First, your Spark doesn't seem to have been built; second, JAVA_HOME isn't set on slave1.
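
On the first point: "Failed to find Spark assembly in /usr/spark/assembly/target/scala-2.10" means the scripts are being run from a source tree that was never built. A sketch of building it in place, assuming Maven is installed and /usr/spark is a Spark source checkout; the profile and memory settings follow the Spark 1.x build docs and should be adjusted to your cluster:

cd /usr/spark
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M"
# Produces assembly/target/scala-2.10/spark-assembly-*.jar, which start-all.sh looks for
mvn -Pyarn -Phadoop-2.2 -DskipTests clean package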
打杂uu:
Booklearn, 1 minute ago: Hadoop was installed successfully, so JAVA_HOME must be fine. Hadoop 2.2 is not compatible with Spark 1.5; according to the official site, the Spark version compatible with Hadoop 2.2 is 1.0.2.
叶鸿影:

Here is how I solved this when I ran into the same problem:

1. Delete the logs directory under this path (/usr/spark/sbin/../logs), i.e. remove the whole logs directory.

2. JAVA_HOME is not configured correctly in your Hadoop: find your /hadoop/etc/hadoop/hadoop-env.sh file and change JAVA_HOME=${JAVA_HOME} to the following (see the sketch after this list):
JAVA_HOME=<your Java installation path>
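
A sketch of both settings; the JDK path below is only a placeholder, so substitute your actual Java install. Note that the "JAVA_HOME is not set" error comes from the Spark launch scripts on slave1, which read conf/spark-env.sh, so it is worth setting it there as well, on every node:

# /hadoop/etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_79    # replace ${JAVA_HOME} with an explicit path

# /usr/spark/conf/spark-env.sh  (create from spark-env.sh.template if it does not exist)
export JAVA_HOME=/usr/java/jdk1.7.0_79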
 

 
