Running Spark with Docker
Pull the Docker image
sudo docker pull sequenceiq/spark:1.6.0
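The pull can take a while, since the image bundles a full Hadoop installation alongside Spark. As an optional check (plain Docker CLI, nothing image-specific), you can confirm the image is available locally:
sudo docker images sequenceiq/spark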
Run the Docker container
sudo docker run -it --name spark --rm sequenceiq/spark:1.6.0 /bin/bash
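Here -it allocates an interactive terminal, --name spark names the container, and --rm removes it when the shell exits. The image bootstraps a single-node Hadoop/YARN cluster before handing you the shell; one way to verify the daemons came up (jps ships with the image's JDK, and the exact list may vary) is:
jps
# should typically list NameNode, DataNode, ResourceManager and NodeManager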
Run the job
$ cd /usr/local/spark
$ bin/spark-submit --master yarn-client --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.6.0-hadoop2.6.0.jar file:/usr/local/hadoop/input/
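A few notes on the flags: --master yarn-client runs the driver in this shell with executors scheduled by the container's YARN, --class names the example's main class inside the bundled examples jar, and the final argument is the input directory, read as a local path via the file: scheme. To smoke-test Spark without going through YARN, a local master works too (a sketch using the same jar and input):
$ bin/spark-submit --master local[2] --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.6.0-hadoop2.6.0.jar file:/usr/local/hadoop/input/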
We can also combine starting the container and running the job in a single command, for example:
sudo docker run -it --name spark --rm sequenceiq/spark:1.6.0 sh -c "spark-submit --master yarn-client --class org.apache.spark.examples.JavaWordCount /usr/local/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar file:/usr/local/hadoop/input/"
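The same pattern works for your own application: mount a host directory into the container and point spark-submit at your jar. In this sketch, /path/to/app, myapp.jar and com.example.MyApp are placeholders for your own build, not anything shipped with the image:
sudo docker run -it --rm -v /path/to/app:/app sequenceiq/spark:1.6.0 sh -c "spark-submit --master yarn-client --class com.example.MyApp /app/myapp.jar"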