
python pandas: apply a function with arguments to a series. Update

I would like to apply a function with arguments to a pandas Series. I have found two different solutions on SO:

python pandas: apply a function with arguments to a series

and

Passing multiple arguments to apply (Python)

Both of them rely on functools.partial, and they work absolutely fine. However, newer versions of pandas support passing extra arguments directly, and I do not understand how that works. Example:

import pandas as pd

a = pd.DataFrame({'x': [1, 2], 'y': [10, 20]})
a['x'].apply(lambda x, y: x + y, args=(100))

It fails with:

TypeError: <lambda>() argument after * must be a sequence, not int
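
For reference, the functools.partial approach from the two linked answers looks roughly like this (a minimal sketch on the same frame; the name add_100 is just illustrative):

import functools

import pandas as pd

a = pd.DataFrame({'x': [1, 2], 'y': [10, 20]})

# Bind the extra argument up front, then apply the resulting one-argument function.
add_100 = functools.partial(lambda x, y: x + y, y=100)
a['x'].apply(add_100)
# 0    101
# 1    102
# Name: x, dtype: int64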


1 Reply


The TypeError is saying that you passed the wrong type to the lambda function x + y. It expects args to be a sequence, but it got an int. You may have thought that (100) was a tuple (a sequence), but in Python it's the comma that makes a tuple:

In [10]: type((100))
Out[10]: int

In [11]: type((100,))
Out[11]: tuple

So change your last line to

In [12]: a['x'].apply(lambda x, y: x + y, args=(100,))
Out[12]: 
0    101
1    102
Name: x, dtype: int64
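
As a side note, Series.apply also forwards any extra keyword arguments to the applied function, so you can avoid the one-element tuple entirely (a small sketch continuing the session above):

In [13]: a['x'].apply(lambda x, y: x + y, y=100)
Out[13]:
0    101
1    102
Name: x, dtype: int64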
