If I understand correctly, I can go from a 3-layer NN to a deep-learning NN with ReLU by adding a ReLU after the hidden layer and then repeating hidden layer + ReLU. In other words, I turn my 3-layered NN into a DL NN. What I find hard to picture is how the dimensions work out. I have a small library I put together, so conceptually I can write:
import numpy as np

M = 784  # 28 x 28 pixels
N = 512  # hidden neurons
P = 10   # number of possible classes

w1 = np.random.normal(0.0, pow(10, -0.5), (M, N))  # input -> hidden weights
w2 = np.random.normal(0.0, pow(10, -0.5), (N, P))  # hidden -> output weights
b1 = np.random.normal(0.0, pow(10, -0.5), (N,))    # hidden biases
b2 = np.random.normal(0.0, pow(10, -0.5), (P,))    # output biases

x = Input(w1, b1)
h = Hidden(x, w2, b2)
g = Softmax(h)
cost = CrossEntropy(g)  # numpy.mean(CrossEntropy) over BATCH SIZE
train_data()
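For reference, this is what I understand the 3-layer version to be doing in plain NumPy, reusing the arrays defined above (Input/Hidden/Softmax/CrossEntropy are my own library; the shape comments are the part I am reasoning about, and the batch size is just an assumption for the sketch):

import numpy as np

batch = 32
X = np.random.rand(batch, M)             # (batch, M) batch of flattened images

h = X @ w1 + b1                          # (batch, M) @ (M, N) -> (batch, N)
logits = h @ w2 + b2                     # (batch, N) @ (N, P) -> (batch, P)

# softmax over the class axis, shifted for numerical stability
e = np.exp(logits - logits.max(axis=1, keepdims=True))
g = e / e.sum(axis=1, keepdims=True)     # (batch, P) class probabilities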
That part sinks in, but what I want to get to is:
x = Input(w1, b1)
h = Hidden(x, w2, b2)
r = ReLU(h)
h2 = Hidden(r, ??, ??)  # 1 -- what shape do these weights and biases need?
r2 = ReLU(h2)           # 2
.. <repeat 1 and 2>
g = Softmax(h_last)     # output of the *last* Hidden layer, not the first h
cost = CrossEntropy(g)  # numpy.mean(CrossEntropy) over BATCH SIZE
train_data()
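If I had to guess at the `??`, every hidden-to-hidden step just needs weights of shape (N_prev, N_next), so with a constant width N the repeated block would look like this in plain NumPy (reusing X, w1, b1 from above; the names w2b, b2b, w3, b3 and the constant-width choice are my assumptions, not something I have verified):

w2b = np.random.normal(0.0, pow(10, -0.5), (N, N))  # hidden -> hidden: (N, N)
b2b = np.random.normal(0.0, pow(10, -0.5), (N,))
w3 = np.random.normal(0.0, pow(10, -0.5), (N, P))   # last hidden -> classes: (N, P)
b3 = np.random.normal(0.0, pow(10, -0.5), (P,))

h = X @ w1 + b1                       # (batch, M) -> (batch, N)
r = np.maximum(0.0, h)                # ReLU keeps the shape: (batch, N)
h2 = r @ w2b + b2b                    # (batch, N) -> (batch, N), repeatable
r2 = np.maximum(0.0, h2)              # any number of (N, N) blocks can be stacked
logits = r2 @ w3 + b3                 # (batch, N) -> (batch, P)

So only the first layer depends on M and only the final projection depends on P; everything repeated in the middle stays (N, N). Is that the right way to think about it?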
Related article I am writing about this