I'm trying to get the TensorFlow example running with my own data, but somehow the classifier always picks the same class for every test example. The input data is always shuffled beforehand. I have about 4,000 images as a training set and 500 images as a test set.
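(The shuffling is done before the train/test split, along the lines of this simplified sketch; the actual code lives in my input_data module, so the details here are only illustrative.)

import numpy as np

# Simplified, illustrative sketch of the shuffle done before the split:
# images and labels are the parallel lists returned by read_images_from_csv,
# and both are permuted with the same index order so each image stays paired
# with its label.
perm = np.random.permutation(len(images))
images = [images[i] for i in perm]
labels = [labels[i] for i in perm]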
The result I get looks like:
Result: [[ 1. 0.]] Actually: [ 1. 0.]
Result: [[ 1. 0.]] Actually: [ 0. 1.]
Result: [[ 1. 0.]] Actually: [ 1. 0.]
Result: [[ 1. 0.]] Actually: [ 1. 0.]
Result: [[ 1. 0.]] Actually: [ 0. 1.]
Result: [[ 1. 0.]] Actually: [ 0. 1.]
...
The prediction on the left stays [[ 1.  0.]] for all 500 test images, while the actual labels on the right vary. The classification is binary, so I just have two labels.
Here is my source code:
import tensorflow as tf
import input_data as id
test_images, test_labels = id.read_images_from_csv(
    "/home/johnny/Desktop/tensorflow-examples/46-model.csv")
train_images = test_images[:4000]
train_labels = test_labels[:4000]
test_images = test_images[4000:]
test_labels = test_labels[4000:]
print(len(train_images))
print(len(test_images))
pixels = 200 * 200
labels = 2
sess = tf.InteractiveSession()
# Create the model
x = tf.placeholder(tf.float32, [None, pixels])
W = tf.Variable(tf.zeros([pixels, labels]))
b = tf.Variable(tf.zeros([labels]))
y_prime = tf.matmul(x, W) + b
y = tf.nn.softmax(y_prime)
# Define loss and optimizer
y_ = tf.placeholder(tf.float32, [None, labels])
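# Note: softmax_cross_entropy_with_logits applies the softmax internally, so it
# is given the raw logits y_prime; the softmax output y above is only used for
# the accuracy check and the printed predictions further down.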
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(y_prime, y_)
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
# Train
tf.initialize_all_variables().run()
for i in range(10):
    res = train_step.run({x: train_images, y_: train_labels})
# Test trained model
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print(accuracy.eval({x: test_images, y_: test_labels}))
for i in range(0, len(test_images)):
    res = sess.run(y, {x: [test_images[i]]})
    print("Result: " + str(res) + " Actually: " + str(test_labels[i]))
Am I missing something here?
See Question&Answers more detail:
os 与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…