Using the following command on the Spark cluster:

sbin/start-all.sh

to start the cluster produces this error:

starting org.apache.spark.deploy.master.Master, logging to /home/yxk/cluster/spark/logs/spark-yxk-org.apache.spark.deploy.master.Master-1-linux.out
yxk@linux's password:
linux: starting org.apache.spark.deploy.worker.Worker, logging to /home/yxk/cluster/spark/logs/spark-yxk-org.apache.spark.deploy.worker.Worker-1-linux.out
linux: failed to launch: nice -n 0 /home/yxk/cluster/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://linux:7077
linux:   JAVA_HOME is not set
linux: full log in /home/yxk/cluster/spark/logs/spark-yxk-org.apache.spark.deploy.worker.Worker-1-linux.out

The log shows that JAVA_HOME cannot be found when the Worker is launched.

However, the Java environment of the current user is fine when checked directly. After some digging, the explanation turned up in this Stack Overflow question: https://stackoverflow.com/questions/33955635/why-does-start-all-sh-from-root-cause-failed-to-launch-org-apache-spark-deploy . In short, start-all.sh launches each Worker over ssh, and that non-interactive shell does not necessarily source the files where JAVA_HOME was exported, so the variable is empty there even though it is set for an interactive login.
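A quick way to confirm this on the node from the log above (hostname linux; the check is only illustrative, adjust to your setup):

$ echo $JAVA_HOME
$ ssh linux 'echo $JAVA_HOME'

If the first command prints the JDK path but the second prints an empty line, a Worker launched over ssh indeed cannot see JAVA_HOME.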

 

Solution

Add the Java environment variables to the current user's .bashrc file:

$ vim ~/.bashrc

Add the Java environment variables, for example as sketched below.
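A minimal sketch of the lines to append; the JDK path here is only a placeholder, substitute the path of the actual installation:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH

After saving, run source ~/.bashrc (or open a new shell) so the change takes effect; if there are multiple worker nodes, do the same on each of them.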

Restart the Spark cluster.
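The restart can be done with the standard standalone scripts, run from the Spark installation directory as before:

sbin/stop-all.sh
sbin/start-all.sh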

Check the running processes with the jps command; they now show up normally.
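For reference, on a single node running both the master and a worker, the jps output should look roughly like this (process IDs will differ):

$ jps
2345 Master
2468 Worker
2580 Jps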
