Here you are specifying the model input: each MNIST image is a flattened 28×28 = 784 pixel vector, and each label is a one-hot vector over the 10 digit classes. You want to leave the batch size as None, which means you can run the model with a variable number of inputs (one or more). Batching is important for using your computing resources efficiently.
x = tf.placeholder("float", shape=[None, 784])
y_ = tf.placeholder("float", shape=[None, 10])
The next important line is:
batch = mnist.train.next_batch(50)
Here you are sending 50 examples as input, but you can also change that to just one
batch = mnist.train.next_batch(1)
without modifying the graph. If you had specified a fixed batch size (some number instead of None in the first snippet), you would have to change the graph every time the number of inputs changes, which is not ideal, especially in production.
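To make this concrete, here is a minimal, self-contained sketch (TensorFlow 1.x) that feeds both batch sizes through the same graph. The softmax-regression model, the accuracy op, and the session setup are illustrative assumptions added for this sketch, not part of the snippets above.

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Placeholders with a variable (None) batch dimension, as above.
x = tf.placeholder("float", shape=[None, 784])
y_ = tf.placeholder("float", shape=[None, 10])

# Illustrative softmax-regression model on top of the placeholders.
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)

correct = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct, "float"))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Same graph, two different batch sizes: nothing needs to be rebuilt.
    batch = mnist.train.next_batch(50)
    sess.run(accuracy, feed_dict={x: batch[0], y_: batch[1]})

    batch = mnist.train.next_batch(1)
    sess.run(accuracy, feed_dict={x: batch[0], y_: batch[1]})

Both sess.run calls reuse the exact same placeholders; only the number of rows in the fed arrays changes.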