Artificial Intelligence Resource Library: Issue 27 (20170208)
Posted by AllenOR灵感 5 months ago


1.【Blog】DeepMind's PathNet: A Modular Deep Learning Architecture for AGI

Summary:


PathNet is a new Modular Deep Learning (DL) architecture, brought to you by none other than DeepMind, that highlights the latest trend in DL research: melding Modular Deep Learning, Meta-Learning and Reinforcement Learning into a solution that leads to more capable DL systems. The Arxiv paper "PathNet: Evolution Channels Gradient Descent in Super Neural Networks" (Fernando et al.), submitted on January 20th, 2017, has in its abstract the following interesting description of the work:

Link: https://medium.com/intuitionmachine/pathnet-a-modular-deep-learning-architecture-for-agi-5302fcf53273#.9uwk331d5


2.【Blog】On the intuition behind deep learning & GANs—towards a fundamental understanding

Summary:

A generative adversarial network (GAN) is composed of two separate networks: the generator and the discriminator. It poses the unsupervised learning problem as a game between the two. In this post we will see why GANs have so much potential, and frame GANs as a boxing match between two opponents.
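The two-player game can be made concrete with a toy sketch (the one-dimensional data, the single-parameter networks, and all names below are illustrative assumptions, not code from the post): the discriminator D tries to maximize the value function V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))], while the generator tries to minimize it.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def discriminator(x, w, b):
    """A one-parameter logistic discriminator: D(x) = sigmoid(w*x + b)."""
    return sigmoid(w * x + b)

def generator(z, theta):
    """A trivial generator that shifts input noise by a learned offset theta."""
    return z + theta

def gan_value(real, noise, w, b, theta):
    """The two-player value V(D, G): the discriminator maximizes it,
    the generator minimizes it."""
    d_real = discriminator(real, w, b)
    d_fake = discriminator(generator(noise, theta), w, b)
    return np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake))

rng = np.random.default_rng(0)
real = rng.normal(4.0, 0.5, size=1000)   # "real" data centered at 4
noise = rng.normal(0.0, 0.5, size=1000)  # generator input noise

# With a fixed discriminator, a generator that matches the real data
# (theta = 4) scores fake samples as "real" and drives V down, while an
# untrained generator (theta = 0) is easy to tell apart.
v_untrained = gan_value(real, noise, w=1.0, b=-2.0, theta=0.0)
v_trained = gan_value(real, noise, w=1.0, b=-2.0, theta=4.0)
```

The "boxing match" is exactly this tug-of-war over V: the trained generator achieves a lower value against the same discriminator, which is what the generator's gradient step pushes toward.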

Link: https://hackernoon.com/introduction-to-gans-a-boxing-match-b-w-neural-nets-b4e5319cc935#.lyaaodaih


3.【Code】nmtpy

Summary:

nmtpy is a suite of Python tools, primarily based on the starter code provided in dl4mt-tutorial, for training neural machine translation networks using Theano.

The basic motivation behind forking dl4mt-tutorial was to create a framework where it would be easy to implement a new model by just copying and modifying an existing model class (or even inheriting from it and overriding some of its methods).

Link: https://github.com/lium-lst/nmtpy


4.【Blog】Demystifying Word2Vec

Summary:

Research into word embeddings is one of the most interesting areas of the deep learning world at the moment, even though they were introduced as early as 2003 by Bengio et al. Most prominent among these new techniques is a group of related algorithms commonly referred to as Word2Vec, which came out of Google research.

In this post we are going to investigate the significance of Word2Vec for NLP research going forward and how it relates and compares to prior art in the field. In particular we are going to examine some desired properties of word embeddings and the shortcomings of other popular approaches centered around the concept of a Bag of Words (henceforth referred to simply as BoW), such as Latent Semantic Analysis. This shall motivate a detailed exposition of how and why Word2Vec works and whether the word embeddings derived from this method can remedy some of the shortcomings of BoW-based approaches. Word2Vec and the concept of word embeddings originate in the domain of NLP; however, as we shall see, the idea of words in the context of a sentence or a surrounding word window can be generalized to any problem domain dealing with sequences or sets of related data points.
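The "surrounding word window" idea can be sketched with a few lines of Python (a hypothetical illustration, not code from the post): skip-gram training data is just (center, context) pairs drawn from a sliding window, which is exactly the word-order information a BoW representation throws away.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs from a token list,
    taking every word within `window` positions of the center word."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
pairs = skipgram_pairs(sentence, window=1)
# With window=1, "quick" pairs with its neighbors "the" and "brown",
# but never with the more distant "fox" or "jumps".
```

A BoW model would reduce the same sentence to unordered counts; the pairs above are what lets Word2Vec learn that words appearing in similar windows should get similar vectors.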

Link: http://www.deeplearningweekly.com/blog/demystifying-word2vec


5.【Blog】Highlights and tutorials for concepts discussed in "Richard Socher on the future of deep learning"

Summary:


Bruner, Jon. "The O'Reilly Bots Podcast." Audio blog post. "Richard Socher on the Future of Deep Learning." O'Reilly, December 1, 2016.

Raw interview: I highly encourage listening to the podcast because the questions were so well crafted.

TL;DR: Richard Socher of Salesforce (formerly Stanford and MetaMind) offers insight into the current and future states of deep learning used for NLP. We need one model that can do lots of different tasks and need to be wary of bias in our models. The future of conversational bots is multimodal, and Salesforce research is awesome.

Disclaimer: This is my interpretation of the interview. I have included the pertinent questions I found interesting.

Link: https://theneuralperspective.com/2016/12/20/highlights-and-tutorials-for-concepts-discussed-in-richard-socher-on-the-future-of-deep-learning/

