Defining the weight regularization loss, and specifying which variables to compute gradients for

2019/05/08 15:14

# Weight regularization loss

When you use tf.get_variable() and tf.variable_scope(), you will notice that both accept a regularizer argument. If you pass one in, the regularization loss of every weight created inside the variable_scope (or of that individual variable) is added to the GraphKeys.REGULARIZATION_LOSSES collection. Example:

import tensorflow as tf
from tensorflow.contrib import layers

regularizer = layers.l1_regularizer(0.1)
with tf.variable_scope('var', initializer=tf.random_normal_initializer(), regularizer=regularizer):
    weight = tf.get_variable('weight', shape=[8], initializer=tf.ones_initializer())
with tf.variable_scope('var2', initializer=tf.random_normal_initializer(), regularizer=regularizer):
    weight2 = tf.get_variable('weight', shape=[8], initializer=tf.ones_initializer())

# sum every term the regularizer added to the collection; the result is a scalar tensor
regularization_loss = tf.reduce_sum(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))

# `loss` is the task loss defined elsewhere; regularization_loss is already a scalar,
# so it is added directly rather than wrapped in sum()
optimize_loss = tf.train.AdamOptimizer().minimize(loss + regularization_loss)
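As a sanity check on what the regularizer contributes: an l1_regularizer with scale 0.1 computes scale * sum(|w|) per variable, so for the two ones-initialized shape-[8] weights above that is 0.1 * 8 = 0.8 each, 1.6 in total. A minimal plain-Python sketch of that arithmetic (no TensorFlow; the helper name l1_penalty is hypothetical, not part of the API):

```python
def l1_penalty(weights, scale=0.1):
    # scale * sum(|w_i|): the quantity l1_regularizer would add to
    # GraphKeys.REGULARIZATION_LOSSES for one variable
    return scale * sum(abs(w) for w in weights)

# mirror the two ones-initialized shape-[8] variables from the example
weight = [1.0] * 8
weight2 = [1.0] * 8

regularization_loss = l1_penalty(weight) + l1_penalty(weight2)
print(regularization_loss)  # 0.8 + 0.8 = 1.6
```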

# Specifying which variables to compute gradients for

t_vars = tf.trainable_variables()   # get all trainable variables
ae_vars = [var for var in t_vars if 'autoencoder' in var.name]  # select the variables whose gradients should be computed
optimize_loss = tf.train.AdamOptimizer().minimize(loss + regularization_loss, var_list=ae_vars)
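The var_list selection above is just a substring match on each variable's scope path. A minimal stand-in sketch of that rule, using a hypothetical Var namedtuple in place of real tf.Variable objects:

```python
from collections import namedtuple

# stand-in for tf.Variable: only the .name attribute matters here
Var = namedtuple('Var', ['name'])

t_vars = [
    Var('autoencoder/enc/weight:0'),
    Var('autoencoder/dec/weight:0'),
    Var('classifier/weight:0'),
]

# same selection rule as above: keep variables whose name contains 'autoencoder'
ae_vars = [var for var in t_vars if 'autoencoder' in var.name]
print([v.name for v in ae_vars])  # the classifier variable is excluded
```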