Given a PySpark SQL DataFrame like:
name  age  city
abc   20   A
def   30   B
How do I get the last row? (Similarly to how df.limit(1) gives me the first row of the DataFrame as a new DataFrame.)
And how can I access DataFrame rows by index, like row no. 12 or 200?
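For reference, here is roughly what the first-row case looks like in a minimal, self-contained form (the local SparkSession setup and the hand-built sample data are only my own illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Same sample data as the table above
df = spark.createDataFrame(
    [("abc", 20, "A"), ("def", 30, "B")],
    ["name", "age", "city"],
)

first_row_df = df.limit(1)   # first row as a new DataFrame
first_row_df.show()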
In pandas I can do:
df.tail(1)                        # last row
df.ix[rowno]                      # row by index (df.ix is deprecated in newer pandas)
df.loc[index] or df.iloc[rowno]   # by label or by integer position
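For comparison, the pandas version I have in mind looks roughly like this (same sample data, built by hand just for illustration):

import pandas as pd

df = pd.DataFrame({"name": ["abc", "def"], "age": [20, 30], "city": ["A", "B"]})

df.tail(1)      # last row
df.iloc[1]      # row by integer position
df.loc[1]       # row by index label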
I am just curious how to access a PySpark DataFrame in these ways, or by some alternative approach.
Thanks