
python - Memory usage while running a deep learning CNN model in Colab

I am conducting research that requires me to know the memory used at run time by a deep learning (CNN) model when I run it in Google Colab. Is there any code I can use to find this out? Basically, I want to know how much memory has been used in total over the model run (i.e., after all epochs have completed). I am coding in Python.

Regards, Avik



1 Reply


As explained in this post and confirmed by my own observations, Tensorflow always tries to allocate all available memory, no matter how small or big your model is. This is unlike, for example, MXNet, which only allocates enough memory to run the model.

Diving a little deeper, I learned that this is indeed the default behaviour in Tensorflow: use all available RAM to speed things up. Fair enough :)

You might think more memory allocation means faster training, but that's not the case most of the time. You can restrict your TF memory usage, as shown in the following code:

import tensorflow as tf
from keras.backend.tensorflow_backend import set_session

# TF 1.x-style session configuration
config = tf.ConfigProto()
# Use at most 90% of the GPU's memory for this process
config.gpu_options.per_process_gpu_memory_fraction = 0.9
# Only expose the first GPU to TensorFlow
config.gpu_options.visible_device_list = "0"

# Register the configured session with Keras
set_session(tf.Session(config=config))

See the Tensorflow documentation if you need more details on how to restrict TF memory usage.
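
Note that the snippet above uses the TF 1.x session API, which no longer exists in TensorFlow 2.x. If your Colab runtime has TensorFlow 2.4 or newer, a rough equivalent is the following sketch (the 14000 MB cap is just an example value; pick roughly 90% of your GPU's memory):

import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
if gpus:
    try:
        # Cap the first GPU; memory_limit is in MB (example value, adjust to your card)
        tf.config.set_logical_device_configuration(
            gpus[0],
            [tf.config.LogicalDeviceConfiguration(memory_limit=14000)])
    except RuntimeError as e:
        # GPUs must be configured before they are initialized
        print(e)

And to see how much GPU memory is actually in use while the model trains, you can run !nvidia-smi in a Colab cell at any point.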

