
My input data is 36 floats. How can I do simple logistic regression in TensorFlow?

[-0.712982    1.14461327 -0.46141151 -0.39443004 -0.44848472 -0.65676075
  0.56058383 -0.61031222  0.43211082 -0.74852234  1.28183317  0.79719085
 -0.28156522  0.16901374 -0.73715878  0.69877005 -0.40633941  0.01085454
 -0.33675554 -0.37056464 -0.43088505  0.3327457  -0.15905562  0.72995877
  0.56962079  0.10286932  0.25698286  0.89823145 -0.12923111  0.3219386
  0.10118762  1.29127014 -0.22283298  0.75640506  0.79971719  0.60000002]

This array is one step of my data, a NumPy array of shape (36,) in this case. Here is part of my code:

X = tf.placeholder(tf.float32, (36))
Y = tf.placeholder(tf.float32)

# Create Model

# Set model weights
W = tf.Variable(tf.zeros([36]), name="weight")
b = tf.Variable(tf.zeros([1]), name="bias")

# Construct model
activation = tf.add(tf.matmul(X, W), b)

tf.matmul does not work here (it raises a ValueError saying the arguments must be rank 2). What do I have to change to get the activation as a single float number?

Answer


Just use:

activation = tf.add(tf.mul(X, W), b) 
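Note that tf.mul is an element-wise product, so for the shape-(36,) input in the question the result is still a length-36 vector. A minimal sketch of collapsing it to a single float, assuming the X, W and b defined in the question:

activation = tf.reduce_sum(tf.mul(X, W)) + b  # sum the 36 products, then add the bias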

See the simple linear regression example (and others) at https://github.com/nlintz/TensorFlow-Tutorials/blob/master/1_linear_regression.py:

import tensorflow as tf 
import numpy as np 

trX = np.linspace(-1, 1, 101) 
trY = 2 * trX + np.random.randn(*trX.shape) * 0.33 # create a y value which is approximately linear but with some random noise 

X = tf.placeholder("float") # create symbolic variables 
Y = tf.placeholder("float") 

w = tf.Variable(0.0, name="weights") # create a shared variable (like theano.shared) for the weight matrix 
y_model = tf.mul(X, w) 
cost = tf.square(Y - y_model) # use square error for cost function 
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(cost) # construct an optimizer to minimize cost and fit line to my data 

# Launch the graph in a session 
with tf.Session() as sess: 
    # you need to initialize variables (in this case just variable W) 
    tf.initialize_all_variables().run() 

    for i in range(100):
        for (x, y) in zip(trX, trY):
            sess.run(train_op, feed_dict={X: x, Y: y})

    print(sess.run(w))  # It should be something around 2
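
For the original 36-float question, the same pattern could be adapted to logistic regression. The sketch below is my own adaptation rather than part of the tutorial: it keeps the older tf.mul / tf.initialize_all_variables API used above, and the arrays train_X and train_Y are hypothetical random data standing in for real samples and binary labels.

import tensorflow as tf
import numpy as np

# hypothetical training data: 100 samples of 36 floats with binary labels
train_X = np.random.randn(100, 36).astype(np.float32)
train_Y = (np.random.rand(100) > 0.5).astype(np.float32)

X = tf.placeholder(tf.float32, [36])  # one 36-float sample per step
Y = tf.placeholder(tf.float32)        # its label (0.0 or 1.0)

W = tf.Variable(tf.zeros([36]), name="weight")
b = tf.Variable(tf.zeros([1]), name="bias")

# reduce_sum turns the element-wise product into a single float (the logit)
logit = tf.reduce_sum(tf.mul(X, W)) + b
pred = tf.sigmoid(logit)  # probability of class 1

# cross-entropy loss for logistic regression (the epsilon avoids log(0))
cost = -(Y * tf.log(pred + 1e-10) + (1.0 - Y) * tf.log(1.0 - pred + 1e-10))
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

with tf.Session() as sess:
    tf.initialize_all_variables().run()
    for i in range(25):
        for x, y in zip(train_X, train_Y):
            sess.run(train_op, feed_dict={X: x, Y: y})
    print(sess.run(W), sess.run(b))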