1. [Blog] How to Build a Recurrent Neural Network in TensorFlow
In this tutorial I’ll explain how to build a simple working Recurrent Neural Network in TensorFlow. This is the first of seven parts covering various aspects and techniques of building Recurrent Neural Networks in TensorFlow. A short introduction to TensorFlow is available here. For now, let’s get started with the RNN!
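The recurrence such a tutorial builds can be sketched in plain NumPy before touching TensorFlow. This is a minimal illustration, not the tutorial's code; the function names, shapes, and initialization below are all assumptions:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: h_t = tanh(x_t @ W_xh + h_prev @ W_hh + b_h)."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def run_rnn(xs, W_xh, W_hh, b_h):
    """Unroll the RNN over a sequence, collecting every hidden state."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in xs:
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
        states.append(h)
    return np.stack(states)

# Illustrative sizes: a length-5 sequence of 3-dim inputs, 4 hidden units.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 3, 4, 5
xs = rng.normal(size=(seq_len, input_size))
W_xh = rng.normal(size=(input_size, hidden_size)) * 0.1
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

states = run_rnn(xs, W_xh, W_hh, b_h)
print(states.shape)  # one hidden state per time step: (5, 4)
```

TensorFlow's RNN layers wrap exactly this kind of step function and unroll it over the time dimension for you.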
2. [Blog] Overview of Artificial Intelligence and Role of Natural Language Processing in Big Data
Natural Language Processing (NLP) is the “ability of machines to understand and interpret human language the way it is written or spoken.”
The objective of NLP is to make computers and machines as intelligent as human beings at understanding language.
3. [Code] Deep Image Analogy
Deep Image Analogy is a technique for finding semantically meaningful dense correspondences between two input images. It adapts the notion of image analogy using features extracted from a deep convolutional neural network.
Deep Image Analogy was initially described in a SIGGRAPH 2017 paper.
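The core idea — matching image locations by similarity of their deep features rather than raw pixels — can be sketched as a nearest-neighbor search in feature space. This is a deliberate simplification for illustration (the actual method is a coarse-to-fine, bidirectional search), and the array shapes are assumptions:

```python
import numpy as np

def dense_correspondence(feat_a, feat_b):
    """For each spatial location in image A, find the most similar
    location in image B by cosine similarity of deep feature vectors.

    feat_a, feat_b: (num_locations, channels) arrays of CNN features.
    Returns an index into feat_b for each row of feat_a.
    """
    a = feat_a / np.linalg.norm(feat_a, axis=1, keepdims=True)
    b = feat_b / np.linalg.norm(feat_b, axis=1, keepdims=True)
    sim = a @ b.T              # pairwise cosine similarities
    return sim.argmax(axis=1)  # best match in B for each location in A

# Sanity check: an image matched against itself maps every
# location to itself (toy features: 6 locations, 8 channels).
rng = np.random.default_rng(1)
feats = rng.normal(size=(6, 8))
match = dense_correspondence(feats, feats)
print(match)
```

Because matching happens in feature space, two images with very different appearance but similar semantic content can still be put into correspondence — which is what makes the "analogy" work.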
4. [Blog] Convolutional Methods for Text
- RNNs work great for text, but convolutions can do it faster
- Any part of a sentence can influence the semantics of a word. For that reason we want our network to see the entire input at once
- Getting that big a receptive field can make gradients vanish and our networks fail
- We can solve the vanishing gradient problem with DenseNets or Dilated Convolutions
- Sometimes we need to generate text. We can use “deconvolutions” to generate arbitrarily long outputs.
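The receptive-field arithmetic behind the dilated-convolution point above can be checked in a few lines. The kernel size and the doubling dilation schedule here are illustrative choices, not taken from the post:

```python
def receptive_field(kernel_size, dilations):
    """Receptive field of a stack of 1-D convolutions.

    Each layer with dilation d adds (kernel_size - 1) * d positions
    to the span of input the output can see.
    """
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d
    return rf

# Four layers of kernel size 3 with dilations doubling each layer
# cover 31 input positions; the same stack undilated covers only 9.
print(receptive_field(3, [1, 2, 4, 8]))  # 31
print(receptive_field(3, [1, 1, 1, 1]))  # 9
```

This exponential growth is why a dilated stack can "see the entire input at once" with only a handful of layers, while staying shallow enough for gradients to survive.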
5. [Blog] Using Long Short-Term Memory Networks and TensorFlow for Image Captioning
From this blog post, you will learn how to enable a machine to describe what is shown in an image and generate a caption for it, using long short-term memory networks and TensorFlow. You will also find out how to make use of TensorBoard for visualizing graphs, better understand what’s under the hood, and debug the performance of a model if necessary.
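The caption-generation side of such a model boils down to a decoding loop: feed each predicted word back in until an end token appears. A minimal greedy-decoding sketch follows; the `step_fn` interface, token ids, and toy vocabulary are assumptions for illustration, not code from the post:

```python
import numpy as np

def greedy_decode(step_fn, h0, start_id, end_id, max_len=20):
    """Greedily decode a caption from a recurrent step function.

    step_fn(token, state) -> (logits, new_state); the loop picks the
    argmax word each step and stops at end_id or after max_len steps.
    """
    h, tok, out = h0, start_id, []
    for _ in range(max_len):
        logits, h = step_fn(tok, h)
        tok = int(np.argmax(logits))
        if tok == end_id:
            break
        out.append(tok)
    return out

# Toy step function over a 4-word vocabulary that always predicts
# (previous token + 1), so decoding from <start>=0 emits [1, 2]
# and then stops at <end>=3.
def toy_step(tok, h):
    logits = np.zeros(4)
    logits[(tok + 1) % 4] = 1.0
    return logits, h

print(greedy_decode(toy_step, None, start_id=0, end_id=3))  # [1, 2]
```

In a real captioning model, `step_fn` would be an LSTM cell conditioned on CNN image features, and the argmax would usually be replaced with beam search for better captions.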