Python – tensorflow.GradientTape.batch_jacobian()
TensorFlow is an open-source Python library designed by Google for developing machine learning models and deep learning neural networks.
batch_jacobian() is used to compute and stack the per-example jacobian.
Syntax: batch_jacobian( target, source, unconnected_gradients, parallel_iterations, experimental_use_pfor )
Parameters:
- target: A tensor with minimum rank 2.
- source: A tensor with minimum rank 2.
- unconnected_gradients (optional): Its value can be 'none' or 'zero'. The default is 'none'.
- parallel_iterations (optional): Controls the number of iterations dispatched in parallel, trading memory usage for speed.
- experimental_use_pfor (optional): A boolean, True by default. When True, pfor is used to compute the jacobian; otherwise tf.while_loop is used.
Returns: It returns a tensor.
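As a small sketch of the unconnected_gradients parameter (our addition, not part of the original examples): when the target does not depend on the source at all, passing tf.UnconnectedGradients.ZERO yields a zero-filled jacobian instead of None.

```python
# Sketch: unconnected_gradients behavior, assuming a target that
# ignores the watched source entirely.
import tensorflow as tf

x = tf.constant([[1.0, 2.0]])
z = tf.constant([[5.0, 6.0]])

with tf.GradientTape() as tape:
    tape.watch(x)
    y = z * z  # y does not depend on x

# With ZERO, the unconnected jacobian is filled with zeros
j = tape.batch_jacobian(y, x, unconnected_gradients=tf.UnconnectedGradients.ZERO)
print(j)  # zeros with shape (1, 2, 2)
```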
Example 1:
# Importing the library
import tensorflow as tf

x = tf.constant([[4, 2], [1, 3]], dtype=tf.dtypes.float32)

# Using GradientTape
with tf.GradientTape() as gfg:
    gfg.watch(x)
    y = x * x * x

# Computing jacobian
res = gfg.batch_jacobian(y, x)

# Printing result
print("res: ", res)
Output:
res: tf.Tensor(
[[[48. 0.]
[ 0. 12.]]
[[ 3. 0.]
[ 0. 27.]]], shape=(2, 2, 2), dtype=float32)
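The result above can be checked analytically: for y = x**3, each per-example jacobian is a diagonal matrix with entries 3*x**2. A short sanity check (our addition) using tf.linalg.diag_part:

```python
# Sanity check: the diagonal of each per-example jacobian of y = x**3
# should equal 3 * x**2.
import tensorflow as tf

x = tf.constant([[4.0, 2.0], [1.0, 3.0]])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = x * x * x
jac = tape.batch_jacobian(y, x)  # shape (2, 2, 2)

# Extract the diagonal of each per-example jacobian and compare
diag = tf.linalg.diag_part(jac)            # shape (2, 2)
print(tf.reduce_all(diag == 3.0 * x * x))  # tf.Tensor(True, shape=(), dtype=bool)
```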
Example 2:
# Importing the library
import tensorflow as tf

x = tf.constant([[4, 2], [1, 3]], dtype=tf.dtypes.float32)

# Using GradientTape
with tf.GradientTape() as gfg:
    gfg.watch(x)
    # Using nested GradientTape for calculating higher order jacobian
    with tf.GradientTape() as gg:
        gg.watch(x)
        y = x * x * x
    # Computing first order jacobian (inside gfg's context so it is recorded)
    first_order = gg.batch_jacobian(y, x)

# Computing second order jacobian
second_order = gfg.batch_jacobian(first_order, x)

# Printing result
print("first_order: ", first_order)
print("second_order: ", second_order)
Output:
first_order: tf.Tensor(
[[[48. 0.]
[ 0. 12.]]
[[ 3. 0.]
[ 0. 27.]]], shape=(2, 2, 2), dtype=float32)
second_order: tf.Tensor(
[[[[24. 0.]
[ 0. 0.]]
[[ 0. 0.]
[ 0. 12.]]]
[[[ 6. 0.]
[ 0. 0.]]
[[ 0. 0.]
[ 0. 18.]]]], shape=(2, 2, 2, 2), dtype=float32)
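Note the difference from GradientTape.jacobian(): batch_jacobian() treats the first dimension as a batch and returns shape target.shape + source.shape[1:], while jacobian() computes all cross-example derivatives with shape target.shape + source.shape. A short comparison (our addition, using a persistent tape so both methods can be called):

```python
# Comparing batch_jacobian vs jacobian output shapes on the same tape.
import tensorflow as tf

x = tf.constant([[4.0, 2.0], [1.0, 3.0]])
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    y = x * x * x

bj = tape.batch_jacobian(y, x)  # per-example: shape (2, 2, 2)
fj = tape.jacobian(y, x)        # full: shape (2, 2, 2, 2)
print(bj.shape, fj.shape)
del tape  # release resources held by the persistent tape
```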