
python - Sum operation on PySpark DataFrame giving TypeError when type is fine

I have the following DataFrame in PySpark (this is the result of a take(3); the DataFrame itself is very big):

from pyspark import SparkContext
from pyspark.sql import Row

sc = SparkContext()
df = [Row(owner=u'u1', a_d=0.1), Row(owner=u'u2', a_d=0.0), Row(owner=u'u1', a_d=0.3)]

The same owner will have more rows. What I need to do is sum the values of the field a_d per owner, after grouping:

b = df.groupBy('owner').agg(sum('a_d').alias('a_d_sum'))

but this throws an error:

TypeError: unsupported operand type(s) for +: 'int' and 'str'

However, the schema contains double values, not strings (this is the output of printSchema()):

root
|-- owner: string (nullable = true)
|-- a_d: double (nullable = true)

So what is happening here?


1 Reply


You are not using the PySpark sum function; by default, the name sum resolves to Python's built-in sum.

The built-in function won't work here because it expects an iterable of numbers, whereas what gets passed is the column name as a string: sum starts from 0, iterates over the characters of 'a_d', and ends up evaluating 0 + 'a', which raises the TypeError shown above. See the Python official documentation for the built-in sum.
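You can reproduce the same error in plain Python, with no Spark involved at all:

# Built-in sum starts from 0 and iterates over its argument,
# so sum('a_d') effectively tries 0 + 'a':
sum('a_d')
# TypeError: unsupported operand type(s) for +: 'int' and 'str'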

You'll need to import the proper function from pyspark.sql.functions:

from pyspark.sql import Row, SQLContext
from pyspark.sql.functions import sum as _sum

sqlContext = SQLContext(sc)  # reuse the SparkContext from the question

df = sqlContext.createDataFrame(
    [Row(owner=u'u1', a_d=0.1), Row(owner=u'u2', a_d=0.0), Row(owner=u'u1', a_d=0.3)]
)

# _sum is pyspark.sql.functions.sum, not the Python built-in
df2 = df.groupBy('owner').agg(_sum('a_d').alias('a_d_sum'))
df2.show()

# +-----+-------+
# |owner|a_d_sum|
# +-----+-------+
# |   u1|    0.4|
# |   u2|    0.0|
# +-----+-------+
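As a side note, a common way to avoid shadowing the built-in at all is to import the functions module under an alias and call the Spark version explicitly; this produces the same result as above:

import pyspark.sql.functions as F

df2 = df.groupBy('owner').agg(F.sum('a_d').alias('a_d_sum'))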
