I have a DataFrame df with the columns ("id", "current_date", "days"), and I am trying to add the "days" to "current_date" to create a new DataFrame with a new column called "new_date", using the Spark Scala function date_add():
val newDF = df.withColumn("new_Date", date_add(df("current_date"), df("days").cast("Int")))
But it looks like the function date_add only accepts Int values, not columns. How can I get the desired output in this case? Are there any alternative functions I can use to get the desired output?
Spark version: 1.6.0
Scala version: 2.10.6
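
For reference, below is a minimal, self-contained sketch of one possible workaround (not taken from the original post): routing the call through expr(), since the SQL form of date_add can take a column for the day count, unlike the Scala date_add(Column, Int) overload. The object name, sample data, app name, and local master are assumptions added purely for illustration; the backticks around the column name are there to avoid any clash with the SQL current_date() function.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.expr

object DateAddWorkaround {
  def main(args: Array[String]): Unit = {
    // Spark 1.6-style setup (assumption: local mode, for illustration only)
    val sc = new SparkContext(new SparkConf().setAppName("date-add-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Hypothetical sample data matching the question's schema: (id, current_date, days)
    val df = Seq(
      (1, "2016-01-01", 5),
      (2, "2016-02-15", 10)
    ).toDF("id", "current_date", "days")

    // expr() hands the expression to the SQL parser, where date_add can take a
    // column for the number of days; backticks force "current_date" to resolve
    // as the DataFrame column rather than the current_date() SQL function.
    val newDF = df.withColumn("new_date", expr("date_add(`current_date`, days)"))
    newDF.show()

    sc.stop()
  }
}

If the expr-based route does not work in a given setup, wrapping java.sql.Date arithmetic in a UDF would be another possible fallback.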