

hadoop yarn - How to exit spark-submit after the submission

When submitting a Spark Streaming program using spark-submit (YARN mode), it keeps polling the application status and never exits.

Is there any option in spark-submit to make it exit after the submission?

=== Why this troubles me ===

The streaming program will run forever, and I don't need the status updates.

I can Ctrl+C to stop it if I start it manually, but I have many streaming contexts to start, and I need to start them with a script.

I can put the spark-submit processes in the background, but once many background Java processes have been created, the corresponding user can no longer run any other Java process, because the JVM cannot create GC threads.



1 Reply


I know this is an old question, but there is a way to do this now: set --conf spark.yarn.submit.waitAppCompletion=false when you run spark-submit. With this, the client exits after successfully submitting the application.

In YARN cluster mode, controls whether the client waits to exit until the application completes. If set to true, the client process will stay alive reporting the application's status. Otherwise, the client process will exit after submission.

Also, you may need to set --deploy-mode to cluster

In cluster mode, the Spark driver runs inside an application master process which is managed by YARN on the cluster, and the client can go away after initiating the application.
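Putting both together, a fire-and-forget submission might look like the following sketch (the class name and jar are placeholders, not from the original post):

    # Placeholder class and jar names; substitute your own.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.submit.waitAppCompletion=false \
      --class com.example.MyStreamingJob \
      my-streaming-job.jar

Here spark-submit returns as soon as YARN accepts the application, while the driver keeps running inside the YARN application master.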

More at https://spark.apache.org/docs/latest/running-on-yarn.html
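Since the question mentions starting many streaming contexts from a script, a minimal sketch of such a launcher (the jar names are hypothetical) could simply loop over the jobs; with waitAppCompletion=false each submission returns immediately, so no lingering client JVMs pile up:

    # Hypothetical list of streaming job jars; adjust to your setup.
    # Assumes each jar declares its Main-Class in the manifest.
    for jar in job-a.jar job-b.jar job-c.jar; do
      spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --conf spark.yarn.submit.waitAppCompletion=false \
        "$jar"
    done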

