a b loss (<-- header added just for readability)
0.2708615 0.9408614 0.00010203663
0.27086106 0.940861 0.000100577614
0.2708606 0.94086057 9.9053155e-05
0.2708602 0.94086015 9.76157e-05
The loss had almost fully decreased, but then the error suddenly goes back up. Why does this happen?
import tensorflow as tf

키 = 162.7
몸무게 = 45

a = tf.Variable(0.33)
b = tf.Variable(1.0)
opt = tf.keras.optimizers.Adam(learning_rate=0.0001)

for i in range(2008):
    with tf.GradientTape() as tape:
        # Linear model: predict weight from height.
        예측몸무게 = 키 * a + b
        loss = (예측몸무게 - 몸무게) ** 2
        loss = tf.reduce_mean(loss)
    # Gradient of the loss w.r.t. both trainable variables.
    gradient = tape.gradient(loss, [a, b])
    opt.apply_gradients(zip(gradient, [a, b]))
    print(a.numpy(), b.numpy(), loss.numpy())
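
For what it's worth, this kind of late uptick is characteristic of Adam: its momentum terms (the running averages of past gradients) can carry a and b slightly past the minimum even once the gradient is nearly zero, so the loss can tick back up before settling. One common mitigation is to shrink the learning rate as training progresses. Below is a minimal sketch using Keras's ExponentialDecay schedule; the decay_steps and decay_rate values are illustrative assumptions, not tuned for this problem:

import tensorflow as tf

# Assumed, untuned schedule: halve the step size every 500 updates.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.0001,
    decay_steps=500,
    decay_rate=0.5,
)
opt = tf.keras.optimizers.Adam(learning_rate=schedule)
# The training loop above works unchanged with this optimizer.

With a decaying step size, the updates near the minimum get progressively smaller, so any overshoot damps out instead of showing up as a rising loss.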