刘小米_思聪

## 1. Two Unpooling Methods from the Literature

• Method 1: "we perform unpooling by simply replacing each entry of a feature map by an s×s block with the entry value in the top left corner and zeros elsewhere. This increases the width and the height of the feature map s times. We used s=2 in our networks." In other words, if max-pooling extracts the maximum of each 2×2 block, unpooling can write that value back into the top-left element of a 2×2 block and set the other three entries to 0.

https://arxiv.org/pdf/1506.02351.pdf
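The rule above can be sketched in a few lines of NumPy (my own illustration, not code from the paper): every pooled value goes into the top-left corner of its s×s block, and the remaining entries stay zero.

```python
import numpy as np

def unpool_topleft(pool, s=2):
    """Unpool an [N, C, H, W] array by writing each value into the
    top-left corner of an s*s block, with zeros elsewhere."""
    n, c, h, w = pool.shape
    out = np.zeros((n, c, h * s, w * s), dtype=pool.dtype)
    out[:, :, ::s, ::s] = pool  # strided assignment hits the top-left corners
    return out

x = np.array([[[[1., 2.],
                [3., 4.]]]])   # shape (1, 1, 2, 2)
y = unpool_topleft(x)          # shape (1, 1, 4, 4)
# y[0, 0]:
# [[1. 0. 2. 0.]
#  [0. 0. 0. 0.]
#  [3. 0. 4. 0.]
#  [0. 0. 0. 0.]]
```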

## 2. Code Implementation

2.1 How to access a tensor by index (read / modify a single element of a Tensor)
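No snippet appears under this sub-heading in the original post, so here is a hedged sketch of the idea: a TF1 Tensor is immutable, so a single element is read with plain indexing and "modified" by adding a delta tensor that is zero everywhere except at the target index, which is exactly the `tf.SparseTensor` trick the unpool code in this post relies on. The NumPy analogue of that pattern:

```python
import numpy as np

t = np.arange(6.0).reshape(2, 3)

# Reading one element by index:
v = t[1, 2]  # 5.0

# A graph Tensor cannot be assigned in place, so a single entry is
# "modified" by adding a one-hot delta at the target index:
delta = np.zeros_like(t)
delta[0, 1] = 10.0 - t[0, 1]   # change t[0, 1] to 10.0
t2 = t + delta
```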

2.2 Finally, code implementing the Method-1 unpooling described above:

``````
def unpool2(pool, ksize, stride, padding='VALID'):
    """
    Simple unpool (method 1): each pooled value is written to the
    top-left corner of its block; all other entries are zero.

    :param pool: the NHWC tensor to run the unpool operation on
    :param ksize: integer, pooling kernel size
    :param stride: integer, pooling stride
    :param padding: 'VALID' or 'SAME'
    :return: the unpooled tensor (in NCHW layout)
    """
    pool = tf.transpose(pool, perm=[0, 3, 1, 2])  # NHWC -> NCHW
    pool_shape = pool.shape.as_list()
    if padding == 'VALID':
        size = (pool_shape[2] - 1) * stride + ksize
    else:
        size = pool_shape[2] * stride
    unpool_shape = [pool_shape[0], pool_shape[1], size, size]
    unpool = tf.Variable(np.zeros(unpool_shape), dtype=tf.float32)
    for batch in range(pool_shape[0]):
        for channel in range(pool_shape[1]):
            for w in range(pool_shape[2]):
                for h in range(pool_shape[3]):
                    # One-hot "delta" tensor carrying a single pooled value
                    # at its top-left target position; accumulate into unpool.
                    diff_matrix = tf.sparse_tensor_to_dense(tf.SparseTensor(
                        indices=[[batch, channel, w * stride, h * stride]],
                        values=tf.expand_dims(pool[batch][channel][w][h], axis=0),
                        dense_shape=[pool_shape[0], pool_shape[1], size, size]))
                    unpool = unpool + diff_matrix
    return unpool
``````

``````
# PI is the 4-D tensor output by the layer above;
# resize_width is the target spatial size.
unpool1 = tf.image.resize_images(PI,
                                 size=[resize_width, resize_width],
                                 method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
``````
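Note that nearest-neighbor resizing behaves differently from method 1: instead of placing a single value in the top-left corner, it copies every value into all four positions of its 2×2 block. A NumPy sketch of this behavior (my illustration, for a single-channel 2-D map):

```python
import numpy as np

x = np.array([[1., 2.],
              [3., 4.]])

# Nearest-neighbor upsampling: each value fills its entire 2x2 block.
nn = np.kron(x, np.ones((2, 2)))
# nn:
# [[1. 1. 2. 2.]
#  [1. 1. 2. 2.]
#  [3. 3. 4. 4.]
#  [3. 3. 4. 4.]]
```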

``````
def max_unpool_2x2(x, output_shape):
    # Zero-pad along channels, then along width; the reshape then drops
    # each value into the top-left corner of its 2x2 block.
    out = tf.concat([x, tf.zeros_like(x)], 3)     # tf.concat_v2 in TF < 1.0
    out = tf.concat([out, tf.zeros_like(out)], 2)
    return tf.reshape(out, output_shape)

# The max-unpool layer doubles the spatial shape of its input:
# output_shape_d_pool2 = tf.stack([tf.shape(x)[0], 28, 28, 1])  # tf.pack in TF < 1.0
# h_d_pool2 = max_unpool_2x2(h_d_conv2, output_shape_d_pool2)
``````
``````
def max_unpool_2x2(x, shape):
    # Nearest-neighbor upsampling: each value fills its whole 2x2 block.
    return tf.image.resize_nearest_neighbor(
        x, tf.stack([shape[1] * 2, shape[2] * 2]))  # tf.pack in TF < 1.0
``````

https://ithelp.ithome.com.tw/articles/10188326
