
scala - How to save a Spark DataFrame as CSV on disk?

For example, the result of this:

df.filter("project = 'en'").select("title","count").groupBy("title").sum()

would return a DataFrame.

How can I save a Spark DataFrame as a CSV file on disk?



1 Reply


Apache Spark does not support native CSV output on disk before version 2.x.

You have four options, though:

  1. You can convert your DataFrame into an RDD:

    def convertToReadableString(r: Row): String = ???
    df.rdd.map(convertToReadableString).saveAsTextFile(filepath)

    This will create a folder at filepath. Under that path you will find one part file per partition (e.g. part-000*). A possible implementation of convertToReadableString is sketched after this list.

    What I usually do, if I want to concatenate all the partition files into one big CSV, is:

    cat filepath/part* > mycsvfile.csv
    

    Some will use coalesce(1,false) to create one partition from the RDD. It's usually bad practice, since it funnels all the data into a single partition on one executor, which may overwhelm it.

    Note that df.rdd will return an RDD[Row].

  2. With Spark < 2, you can use the Databricks spark-csv library:

    • Spark 1.4+:

      df.write.format("com.databricks.spark.csv").save(filepath)
      
    • Spark 1.3:

      df.save(filepath, "com.databricks.spark.csv")
      
  3. With Spark 2.x, the spark-csv package is not needed because CSV support is built in (see the end-to-end sketch after the note below).

    df.write.format("csv").save(filepath)
    
  4. You can convert to a local pandas DataFrame and use its to_csv method (PySpark only).
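
For option 1, here is a minimal sketch of what convertToReadableString could look like. It is an assumption for illustration only: it joins each Row's fields with commas and does not handle quoting, escaping, or null values, and the output path is a placeholder.

    import org.apache.spark.sql.Row

    // Naive row-to-CSV conversion (assumed for illustration):
    // joins fields with commas, no quoting or escaping of embedded commas.
    def convertToReadableString(r: Row): String = r.mkString(",")

    // Produces a directory with one text file per partition (part-*).
    df.rdd.map(convertToReadableString).saveAsTextFile("/tmp/output_rdd_csv")  // placeholder path

A real CSV writer (options 2 and 3) is preferable whenever fields may contain delimiters or newlines.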

Note: Solutions 1, 2 and 3 will result in CSV format files (part-*) generated by the underlying Hadoop API that Spark calls when you invoke save. You will have one part- file per partition.
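
Putting option 3 together with the DataFrame from the question, an end-to-end sketch could look like the following. The output path, the header option, and the commented-out coalesce(1) variant are illustrative assumptions, not requirements:

    // Spark 2.x: the CSV writer is built in, no external package needed.
    val result = df.filter("project = 'en'")
      .select("title", "count")
      .groupBy("title")
      .sum()

    result.write
      .option("header", "true")   // write column names as the first line
      .csv("/tmp/output_csv")     // placeholder path; creates a directory of part-* files

    // Only if the result is small: coalesce(1) yields a single part file,
    // at the cost of funneling all the data through a single task.
    // result.coalesce(1).write.option("header", "true").csv("/tmp/output_single_csv")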

