
python - Split a dataframe into new dataframes by group, i.e. create a new dataframe for each subset/group of another dataframe

I have a pandas dataframe that looks like the following; it holds groups of data identified by an id column:

import numpy as np
import pandas as pd


df = pd.DataFrame(np.random.randn(10, 4), columns=list('ABCD'))
df['id'] = ['w', 'w', 'w', 'x', 'x', 'y', 'y', 'y', 'z', 'z']

print(df)

          A         B         C         D id
0  0.347501 -1.152416  1.441144 -0.144545  w
1  0.775828 -1.176764  0.203049 -0.305332  w
2  1.036246 -0.467927  0.088138 -0.438207  w
3 -0.737092 -0.231706  0.268403  0.464026  x
4 -1.857346 -1.420284 -0.515517 -0.231774  x
5 -0.970731  0.217890  0.193814 -0.078838  y
6 -0.318314 -0.244348  0.162103  1.204386  y
7  0.340199  1.074977  1.201068 -0.431473  y
8  0.202050  0.790434  0.643458 -0.068620  z
9 -0.882865  0.687325 -0.008771 -0.066912  z

Now I want to create new dataframes (named df_w, df_x, df_y, df_z) that each hold only the rows of their group from the original dataframe, ideally collected in some iterable, e.g. a list:

df_w

          A         B         C         D id
0  0.347501 -1.152416  1.441144 -0.144545  w
1  0.775828 -1.176764  0.203049 -0.305332  w
2  1.036246 -0.467927  0.088138 -0.438207  w

df_x

          A         B         C         D id
0 -0.737092 -0.231706  0.268403  0.464026  x
1 -1.857346 -1.420284 -0.515517 -0.231774  x

df_y

          A         B         C         D id
0 -0.970731  0.217890  0.193814 -0.078838  y
1 -0.318314 -0.244348  0.162103  1.204386  y
2  0.340199  1.074977  1.201068 -0.431473  y

df_z

          A         B         C         D id
0  0.202050  0.790434  0.643458 -0.068620  z
1 -0.882865  0.687325 -0.008771 -0.066912  z

Is there a smart (vectorized pandas) way to achieve this, e.g. using groupby, apply and/or applymap together with a function?

I was thinking about iterating over the dataframe, but that doesn't seem very elegant.

Thanks in advance for any hints!


1 Reply


We can create a dict of DataFrames:

In [166]: dfs = {k: v for k, v in df.groupby('id')}

In [168]: dfs.keys()
Out[168]: dict_keys(['w', 'x', 'y', 'z'])

In [169]: dfs['w']
Out[169]:
          A         B         C         D id
0 -0.373021 -0.555218  0.022980 -0.512323  w
1 -1.599466  0.637292  0.045059 -0.334030  w
2  0.100659  0.557068  0.142226 -0.186214  w

In [170]: dfs['x']
Out[170]:
          A         B         C         D id
3 -0.329087  0.842431  0.839319 -0.597823  x
4 -0.594375 -0.950486  1.125584  0.116599  x

In [171]: dfs['y']
Out[171]:
          A         B         C         D id
5  0.540107 -0.739077  0.992408  2.010203  y
6 -0.201376 -0.913222 -0.173284  1.837442  y
7 -1.367659  0.915360  0.072720 -0.886071  y

In [172]: dfs['z']
Out[172]:
          A         B         C         D id
8  0.366667 -0.978279 -1.449893  0.192451  z
9 -0.007439 -0.084612  0.010192 -0.417602  z
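
Since iterating over a GroupBy object yields (key, sub-DataFrame) pairs, the same dict can also be built directly from those pairs; a minimal equivalent sketch:

In [173]: dfs = dict(tuple(df.groupby('id')))  # same result as the comprehension above

In [174]: dfs.keys()
Out[174]: dict_keys(['w', 'x', 'y', 'z'])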

UPDATE: with the index reset per group:

In [177]: {k: v.reset_index(drop=True) for k, v in df.groupby('id')}
Out[177]:
{'w':           A         B         C         D id
 0 -0.373021 -0.555218  0.022980 -0.512323  w
 1 -1.599466  0.637292  0.045059 -0.334030  w
 2  0.100659  0.557068  0.142226 -0.186214  w,
 'x':           A         B         C         D id
 0 -0.329087  0.842431  0.839319 -0.597823  x
 1 -0.594375 -0.950486  1.125584  0.116599  x,
 'y':           A         B         C         D id
 0  0.540107 -0.739077  0.992408  2.010203  y
 1 -0.201376 -0.913222 -0.173284  1.837442  y
 2 -1.367659  0.915360  0.072720 -0.886071  y,
 'z':           A         B         C         D id
 0  0.366667 -0.978279 -1.449893  0.192451  z
 1 -0.007439 -0.084612  0.010192 -0.417602  z}
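
And if the groups are wanted in a plain list rather than a dict, as the question suggests, the same iteration works; a minimal sketch (groupby sorts the group keys by default, so the list comes out in key order w, x, y, z):

In [178]: dfs_list = [g.reset_index(drop=True) for _, g in df.groupby('id')]

In [179]: len(dfs_list)
Out[179]: 4

Here dfs_list[0] holds the 'w' group, dfs_list[1] the 'x' group, and so on.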
