
#71757


thatOldITGuy

Keras Adam minimize function: no gradients provided

I need to optimize a function with the Adam optimizer (no neural network involved). I made a dummy example to understand how it works using the minimize function, but it seems like I'm not getting it. The function simply returns the dot product between two arrays (as tf variables). Code below:

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.optimizers import Adam

    np.random.seed(1)
    phi = tf.Variable(initial_value=np.random.rand(32))
    theta = tf.Variable(initial_value=np.random.rand(32))

    # loss: dot product of the two variables, computed with NumPy
    # and wrapped in a new tf.Variable
    loss = lambda: tf.Variable(np.dot(phi, theta))

    optimizer = Adam(learning_rate=0.1)
    niter = 5
    for _ in range(niter):
        optimizer.minimize(loss, [phi, theta])
        print(phi[:5].numpy(), theta[:5].numpy())

I'm getting the following error:

ValueError: No gradients provided for any variable: (['Variable:0', 'Variable:0'],).

Can anyone tell me what I'm doing wrong?
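I suspect the loss has to be built from TensorFlow ops so the optimizer can trace gradients back to the variables, but I'm not sure. Here is a sketch of what I think that would look like, using tf.tensordot instead of np.dot (this is my assumption, not a confirmed fix):

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.optimizers import Adam

    np.random.seed(1)
    phi = tf.Variable(initial_value=np.random.rand(32))
    theta = tf.Variable(initial_value=np.random.rand(32))

    # Assumption: express the dot product with a TensorFlow op so that
    # minimize() can differentiate the loss with respect to phi and theta.
    loss = lambda: tf.tensordot(phi, theta, axes=1)

    optimizer = Adam(learning_rate=0.1)
    for _ in range(5):
        optimizer.minimize(loss, [phi, theta])
        print(phi[:5].numpy(), theta[:5].numpy())

Is that the right way to define the loss for minimize, or is something else expected?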

Tags: python, tensorflow, keras, tf.keras, minimization

0 Answers
