Is there any way to get the current number of partitions of a DataFrame?
I checked the DataFrame javadoc (Spark 1.6) and couldn't find a method for that, or did I just miss it?
(In case of JavaRDD there's a getNumPartitions() method.)
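For reference, a minimal sketch of the usual workaround: a DataFrame does not expose a partition count directly in Spark 1.6, but its underlying RDD does, so you can go through `df.rdd`. This assumes an already-constructed DataFrame named `df`:

```scala
import org.apache.spark.sql.DataFrame

// Hypothetical helper: read the partition count off the DataFrame's underlying RDD.
// RDD.getNumPartitions exists as of Spark 1.6; on older versions use
// df.rdd.partitions.length instead.
def numPartitions(df: DataFrame): Int =
  df.rdd.getNumPartitions
```

Note that calling `.rdd` only builds the RDD lineage; it does not trigger a job, so asking for the partition count this way is cheap.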