Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share


python - steps_per_epoch and validation_steps for infinite Dataset in Keras Model

I have a huge dataset of CSV files, around 200 GB in total, and I don't know the total number of records in it. I'm using make_csv_dataset to create a PrefetchDataset generator.

I'm running into a problem: TensorFlow complains that steps_per_epoch and validation_steps must be specified for an infinite dataset.

  1. How can I specify steps_per_epoch and validation_steps?

  2. Can I pass these parameters as a percentage of the total dataset size?

  3. Can I somehow avoid these parameters, since I want my whole dataset to be iterated in each epoch?

I think this SO thread answers the case where the total number of data records is known in advance.

Here is a screenshot from the documentation, but I don't fully understand it. [screenshot]

What does the last line mean?
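For context on why the dataset counts as infinite: make_csv_dataset defaults to num_epochs=None, which cycles through the file forever, so TensorFlow reports an infinite cardinality. A minimal sketch with a tiny in-memory dataset (the range data here is a stand-in for the real CSV pipeline, which isn't available):

```python
import tensorflow as tf

# Tiny in-memory stand-in for the CSV pipeline (hypothetical data).
ds = tf.data.Dataset.range(10).batch(2)

# repeat() with no argument mirrors make_csv_dataset's default
# num_epochs=None: the stream never ends.
infinite_ds = ds.repeat()

print(tf.data.experimental.cardinality(ds).numpy())           # 5 batches
print(tf.data.experimental.cardinality(infinite_ds).numpy())  # -1, i.e. INFINITE_CARDINALITY
```

Keras cannot infer where an epoch ends on such a stream, which is why it demands explicit step counts.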

Question from: https://stackoverflow.com/questions/65949227/steps-per-epoch-and-validation-steps-for-infinite-dataset-in-keras-model


1 Reply


I see no other option than iterating through your entire dataset once to count the batches.

import tensorflow as tf

# num_epochs=1 makes the dataset finite: exactly one pass over the file.
ds = tf.data.experimental.make_csv_dataset('myfile.csv', batch_size=16, num_epochs=1)

ix = 0
for ix, _ in enumerate(ds, 1):
    pass

print('The total number of steps is', ix)

Don't forget the num_epochs=1 argument: make_csv_dataset defaults to num_epochs=None, which repeats the data indefinitely and makes the dataset infinite.
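Once the one-pass count is known, questions 1 and 2 reduce to arithmetic: pick a validation fraction and derive both step counts from the total. A sketch assuming a hypothetical count of 1000 batches and a 20% validation split (the dataset and model names in the comments are placeholders, not from the original post):

```python
# Hypothetical: suppose the counting loop above reported 1000 batches.
total_steps = 1000
val_fraction = 0.2  # hold out 20% of the batches for validation

validation_steps = int(total_steps * val_fraction)
steps_per_epoch = total_steps - validation_steps

print(steps_per_epoch, validation_steps)  # 800 200

# With the infinite (num_epochs=None) dataset, one way to split the stream
# is take()/skip(), handing both step counts to fit(), e.g.:
#
# train_ds = full_ds.take(steps_per_epoch).repeat()
# val_ds = full_ds.skip(steps_per_epoch).take(validation_steps).repeat()
# model.fit(train_ds,
#           epochs=10,
#           steps_per_epoch=steps_per_epoch,
#           validation_data=val_ds,
#           validation_steps=validation_steps)
```

With both counts set, Keras knows exactly when to cut each epoch and each validation run, even though the underlying stream never ends.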


