1.【Blog】Matching Networks for One Shot Learning
This is a paper on one-shot learning, where we'd like to learn a class from very few training examples (or indeed, just one). E.g., it suffices to show a child a single giraffe, not a few hundred thousand examples, before it can recognize more giraffes.
This paper falls into the "duh, of course" category of papers: something very interesting and powerful, yet somehow obvious only in retrospect. I like it.
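The core mechanism of the paper is simple enough to sketch: a query example is classified by an attention-weighted (softmax over similarities) sum of the labels of a small support set. Below is a minimal NumPy sketch of that classification step, assuming the embeddings already come from some trained encoder; all names, dimensions, and the random "embeddings" are illustrative, not from the paper's code:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between a query vector a and each row of b."""
    a_norm = a / (np.linalg.norm(a) + 1e-8)
    b_norm = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-8)
    return b_norm @ a_norm

def matching_net_predict(query_emb, support_embs, support_labels, n_classes):
    """Predict a class distribution for one query given a small support set.

    query_emb:      (d,)   embedded query example
    support_embs:   (k, d) embedded support examples
    support_labels: (k,)   integer class labels of the support examples
    """
    sims = cosine_similarity(query_emb, support_embs)   # (k,) similarities
    attn = np.exp(sims) / np.exp(sims).sum()            # softmax attention over the support set
    one_hot = np.eye(n_classes)[support_labels]         # (k, n_classes) label vectors
    return attn @ one_hot                               # weighted sum of support labels

# Toy 1-shot, 5-way usage with random vectors standing in for a trained encoder:
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 16))                     # one 16-dim example per class
labels = np.array([0, 1, 2, 3, 4])
query = support[2] + 0.1 * rng.normal(size=16)         # a noisy copy of class 2
print(matching_net_predict(query, support, labels, n_classes=5))  # peaks at class 2
```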
2.【Blog】Building Machine Learning Estimator in TensorFlow
The purpose of this post is to help you better understand the underlying principles of estimators in TensorFlow Learn, and to point out some tips and hints if you ever want to build your own estimator suitable for your particular application. This post will be helpful whenever you wonder how everything works internally and get overwhelmed by the large codebase.
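As a rough illustration of the custom-estimator pattern the post discusses, here is a minimal sketch using the tf.estimator API of TensorFlow 1.x (the post itself targets the older tf.contrib.learn module, whose model_fn contract differs slightly). The toy linear-regression model_fn is my own illustration, not from the post:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x style

def model_fn(features, labels, mode):
    """A toy model_fn: linear regression on feature 'x'."""
    preds = tf.layers.dense(features["x"], units=1)
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=preds)
    loss = tf.losses.mean_squared_error(labels, preds)
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, predictions=preds,
                                      loss=loss, train_op=train_op)

# Wrap the model_fn in an Estimator and train it on synthetic data.
estimator = tf.estimator.Estimator(model_fn=model_fn)
x = np.random.rand(100, 1).astype(np.float32)
y = 3 * x + 1
input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x}, y, batch_size=16, num_epochs=None, shuffle=True)
estimator.train(input_fn, steps=200)
```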
3.【Resources】Deep Learning Resources
Deep learning resources I've collected here for reading and self-study.
4.【Blog】Unfolding RNNs: RNN Concepts and Architectures
RNN is one of those toys that eluded me for a long time: I just couldn't figure out how to make it work. Ever since I read Andrej Karpathy's blog post on the Unreasonable Effectiveness of RNNs, I have been fascinated by what RNNs are capable of, and at the same time confused by how they actually work. I couldn't follow his code for text generation (language modeling). Then I came across Denny Britz's blog, from which I understood how exactly they work and how to build them. This blog post is addressed to my past self, who was confused about the internals of RNNs. Through this post, I hope to help people interested in RNNs develop a basic understanding of what they are, how they work, the different variants of RNN, and their applications.
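To make the "unfolding" concrete, here is a minimal NumPy sketch of a vanilla RNN, assuming the standard recurrence h_t = tanh(x_t W_xh + h_{t-1} W_hh + b); all names and sizes are illustrative:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One RNN step: mix the current input with the previous hidden state."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def rnn_forward(xs, h0, W_xh, W_hh, b_h):
    """Unfold the RNN over a sequence, collecting all hidden states."""
    h, hs = h0, []
    for x_t in xs:                      # xs: sequence of (input_dim,) vectors
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
        hs.append(h)
    return np.stack(hs)

# Toy usage: 4-dim inputs, 8-dim hidden state, sequence of length 5.
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(4, 8))
W_hh = rng.normal(scale=0.1, size=(8, 8))
b_h = np.zeros(8)
xs = rng.normal(size=(5, 4))
hs = rnn_forward(xs, np.zeros(8), W_xh, W_hh, b_h)
print(hs.shape)  # (5, 8): one hidden state per time step
```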
5.【Code】Neural Variational Document Model
TensorFlow implementation of Neural Variational Inference for Text Processing.
This implementation contains (see the sketch after this list):
- Neural Variational Document Model
  - Variational inference framework for a generative model of text
  - Combines a stochastic document representation with a bag-of-words generative model
- Neural Answer Selection Model (in progress)
  - Variational inference framework for a conditional generative model of text
  - Combines LSTM embeddings with an attention mechanism to extract the semantics between question and answer
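As referenced above, here is a minimal NumPy sketch of the NVDM's forward pass and negative ELBO for a single document, assuming the standard setup from the paper: an MLP inference network over a bag-of-words vector, a diagonal-Gaussian latent, the reparameterization trick, and a softmax decoder over the vocabulary. Parameter names and sizes are illustrative, and the weights here are random rather than trained:

```python
import numpy as np

rng = np.random.default_rng(0)
V, H, Z = 1000, 64, 32   # vocab size, encoder hidden size, latent size (illustrative)

# Hypothetical randomly initialized parameters; the real model learns these.
W_enc = rng.normal(scale=0.05, size=(V, H))
W_mu = rng.normal(scale=0.05, size=(H, Z))
W_logvar = rng.normal(scale=0.05, size=(H, Z))
W_dec = rng.normal(scale=0.05, size=(Z, V))

def nvdm_neg_elbo(bow):
    """Negative ELBO for one bag-of-words document vector of shape (V,)."""
    h = np.tanh(bow @ W_enc)                     # inference (encoder) network
    mu, logvar = h @ W_mu, h @ W_logvar          # diagonal-Gaussian posterior q(z|d)
    z = mu + np.exp(0.5 * logvar) * rng.normal(size=Z)  # reparameterization trick
    logits = z @ W_dec                           # decoder: softmax over the vocabulary
    log_p = (logits - logits.max()
             - np.log(np.exp(logits - logits.max()).sum()))  # stable log-softmax
    recon = -(bow * log_p).sum()                 # multinomial reconstruction loss
    kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))  # KL(q || N(0, I))
    return recon + kl

doc = rng.poisson(0.05, size=V).astype(float)    # toy word-count vector
print(nvdm_neg_elbo(doc))
```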