
scala - remove NULL columns in Spark SQL

How do I remove columns that contain only null values from a table? Suppose I have a table:

SnapshotDate  CreationDate  Country  Region  CloseDate  Probability  BookingAmount  RevenueAmount  SnapshotDate1  CreationDate1  CloseDate1
null          null          null     null    null       25           882000         0              null           null           null
null          null          null     null    null       25           882000         0              null           null           null
null          null          null     null    null       0            882000         0              null           null           null
...           (the remaining 17 rows are identical to the row above)

I would like to keep only the Probability, BookingAmount, and RevenueAmount columns and ignore the rest.

Is there a way to dynamically select the columns?

I am using Spark 1.6.1.



1 Reply


I solved this with a global groupBy. This works for numeric and non-numeric columns:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, when}

case class Entry(id: Long, name: String, value: java.lang.Float)

val results = Seq(
  Entry(10, null, null),
  Entry(10, null, null),
  Entry(20, null, null)
)

// on Spark 1.6 use sqlContext.createDataFrame instead of spark
val df: DataFrame = spark.createDataFrame(results)

// replace every value with a 0/1 null indicator, then take the global max:
// a column's max stays 0 only if it is null in every row
val row = df
  .select(df.columns.map(c => when(col(c).isNull, 0).otherwise(1).as(c)): _*)
  .groupBy().max(df.columns: _*)
  .first

// keep the fields whose max indicator is 1, i.e. at least one non-null value
val colKeep = row.getValuesMap[Int](row.schema.fieldNames)
  .collect { case (name, 1) => name }
  .toArray

// the aggregated names come back as "max(id)" etc., so strip the
// "max(" prefix and the ")" suffix before selecting from the original frame
df.select(row.schema.fieldNames.intersect(colKeep)
  .map(c => col(c.drop(4).dropRight(1))): _*).show(false)

+---+
|id |
+---+
|10 |
|10 |
|20 |
+---+
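Only id survives here: name and value are null in every row, so their max indicator is 0 and they are filtered out, while the duplicate id values pass through unchanged.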

Edit: I removed the shuffling of the columns; the new approach preserves the original column order.
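For reference, a shorter variant of the same idea (a sketch reusing the df from above, not the code of the answer itself) relies on the fact that count over a column ignores nulls, so a column whose non-null count is zero is entirely null. It also avoids having to strip the "max(...)" wrapper from the aggregated names:

import org.apache.spark.sql.functions.{col, count}

// count(col) skips nulls, so a count of 0 means the column is entirely null
val counts = df.select(df.columns.map(c => count(col(c)).as(c)): _*).first

// keep columns with at least one non-null value, in their original order
val nonNullCols = df.columns.filter(c => counts.getAs[Long](c) > 0)

df.select(nonNullCols.map(col): _*).show(false)

Like the groupBy version, this computes everything in a single full pass over the data.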

