Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share

apache spark - How to troubleshoot a DSX scheduled notebook?

I have a DSX notebook that I can run manually using the DSX user interface, and it populates some data in a Cloudant database.

I have scheduled the notebook to run hourly. Overnight I would have expected the job to have run many times, but the Cloudant database has not been updated.

How can I debug the scheduled job? Are there any logs that I can check to verify that the notebook has actually been executed? Is the output from my notebook saved to log files? Where can I find these files?

1 Reply


One possibility is to look into the kernel logs of your notebook. For that you need to use a Python notebook.

Check the following location on the gpfs in your Python notebook:

!ls /gpfs/fs01/user/USERID/logs/notebook/

To get the USERID execute the following code:

!whoami

You should find a log file for each kernel, e.g. kernel-python3-20170105_102510.
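The two shell commands above can also be combined into a single Python cell that finds the newest kernel log and prints its tail. This is a sketch, not an official DSX API: it assumes the log directory layout shown above (/gpfs/fs01/user/&lt;USERID&gt;/logs/notebook/) and the kernel-* file naming, and it resolves the user ID with the same whoami command:

```python
# Sketch: locate the newest kernel log and print its last lines.
# Assumes the /gpfs log layout from this answer; run inside the notebook.
import glob
import os
import subprocess

# Equivalent of `!whoami` in the notebook
userid = subprocess.check_output(["whoami"]).decode().strip()
log_dir = "/gpfs/fs01/user/{}/logs/notebook/".format(userid)

# Kernel logs are named like kernel-python3-20170105_102510
logs = sorted(glob.glob(os.path.join(log_dir, "kernel-*")),
              key=os.path.getmtime)
if logs:
    latest = logs[-1]
    print("Newest kernel log:", latest)
    with open(latest) as f:
        print("".join(f.readlines()[-50:]))  # tail of the log
else:
    print("No kernel logs found in", log_dir)
```

Sorting by modification time means the last entry corresponds to the most recent kernel run, which is usually the scheduled execution you want to inspect.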

