AI Resources Digest: Issue 18 (20170127)
Posted by AllenOR灵感, 5 months ago



1.【Code】Sugar Tensor - A slim tensorflow wrapper that provides syntactic sugar for tensor variables

Introduction:

Sugar Tensor aims to help deep learning researchers and practitioners. It adds syntactic sugar functions to tensorflow to avoid tedious, repetitive tasks. Sugar Tensor was developed under the following principles:

  1. Don't mess up tensorflow. We provide no wrapper classes. Instead, we use the tensor itself, so developers can program as freely as before with tensorflow.
  2. Don't mess up the python style. We believe python source code should look pretty and simple. Practical deep learning code is very different from that of complex GUI programs. Do we really need inheritance and/or encapsulation in our deep learning code? Instead, we seek simplicity and readability. To that end, we use pure python functions only and avoid class-style conventions.

Link: https://github.com/buriburisuri/sugartensor
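The "no wrapper classes" principle can be sketched in plain Python. This is NOT the real sugartensor API — the `Tensor` class and the `sg_*` functions below are made up for illustration — but it shows the injection idea: attach free functions directly to the tensor class so chained calls read left-to-right, without wrapping the object.

```python
class Tensor:
    """Stand-in for tf.Tensor; holds a plain list of numbers."""
    def __init__(self, data):
        self.data = list(data)

def sg_scale(self, k):
    """Elementwise multiply by k, returning a new Tensor."""
    return Tensor(x * k for x in self.data)

def sg_shift(self, b):
    """Elementwise add b, returning a new Tensor."""
    return Tensor(x + b for x in self.data)

# Inject the sugar functions as methods -- no subclassing, no wrapping.
for fn in (sg_scale, sg_shift):
    setattr(Tensor, fn.__name__, fn)

# Chained calls now read left-to-right, as sugartensor encourages.
t = Tensor([1, 2, 3]).sg_scale(2).sg_shift(1)
print(t.data)  # [3, 5, 7]
```

Because the methods live on the class itself, any tensor produced anywhere in the program gains them, which is why the library can coexist with ordinary tensorflow code.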


2.【Code】illustration2vec

Introduction:


illustration2vec (i2v) is a simple library for estimating a set of tags and extracting semantic feature vectors from given illustrations. For details, please see our project page or our main paper.

Demo

Link: https://github.com/rezoo/illustration2vec
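The tag-estimation side of i2v boils down to ranking candidate tags by similarity in a semantic feature space. The sketch below illustrates that step only, under stated assumptions: the tag names and three-dimensional vectors are made up for the example (the real library extracts high-dimensional CNN features from images), and `cosine` is a plain helper, not part of the i2v API.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy "semantic feature vectors" for a few tags (invented for illustration).
tag_vectors = {
    "long_hair": [0.9, 0.1, 0.0],
    "short_hair": [0.1, 0.9, 0.0],
    "glasses": [0.0, 0.2, 0.9],
}

# Pretend this vector was extracted from a query illustration.
query = [0.8, 0.2, 0.1]

# Rank tags by similarity to the query's feature vector.
ranked = sorted(tag_vectors, key=lambda t: cosine(query, tag_vectors[t]),
                reverse=True)
print(ranked[0])  # long_hair
```

In the actual library the feature vectors come from a pretrained network, so nearby vectors correspond to visually and semantically similar illustrations.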


3.【Resources】30 Top Videos, Tutorials & Courses on Machine Learning & Artificial Intelligence from 2016

Introduction:


2016 has been the year of "Machine Learning and Deep Learning". We have seen the likes of Google, Facebook, Amazon and many more come out in the open and acknowledge the impact machine learning and deep learning have had on their business.

Last week, I published top videos on deep learning from 2016. I was blown away by the response. I could understand the response to some degree - I found these videos extremely helpful. So, I decided to do a similar article on top videos on machine learning from 2016.

Link: https://www.analyticsvidhya.com/blog/2016/12/30-top-videos-tutorials-courses-on-machine-learning-artificial-intelligence-from-2016/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+AnalyticsVidhya+%28Analytics+Vidhya%29


4.【Blog】20+ hottest research papers on Computer Vision, Machine Learning

Introduction:

Computer Vision used to be cleanly separated into two schools: geometry and recognition. Geometric methods like structure from motion and optical flow usually focus on measuring objective real-world quantities, such as 3D distances, directly from images, while recognition techniques like support vector machines and probabilistic graphical models traditionally focus on perceiving high-level semantic information (e.g., is this a dog or a table) directly from images.

Link: http://www.kdnuggets.com/2016/01/iccv-2015-21-hottest-papers.html


5.【Course】Neural Networks and Deep Learning

Introduction:

Neural networks have enjoyed several waves of popularity over the past half century. Each time they become popular, they promise to provide a general purpose artificial intelligence--a computer that can learn to do any task that you could program it to do. The first wave of popularity, in the late 1950s, was crushed by theoreticians who proved serious limitations to the techniques of the time. These limitations were overcome by advances that allowed neural networks to discover distributed representations, leading to another wave of enthusiasm in the late 1980s. The second wave died out as more elegant, mathematically principled algorithms were developed (e.g., support-vector machines, Bayesian models).

Around 2010, neural nets had a third resurgence. What happened over the past 20 years? Basically, computers got much faster and data sets got much larger, and the algorithms from the 1980s--with a few critical tweaks and improvements--appear to once again be state of the art, consistently winning competitions in computer vision, speech recognition, and natural language processing.

Below is a comic strip circa 1990, when neural nets reached public awareness. You might expect to see the same comic today, touting neural nets as the hot new thing, except that now the field has been rechristened deep learning to emphasize the architecture of neural nets that leads to discovery of task-relevant representations.

Link: https://www.cs.colorado.edu/~mozer/Teaching/syllabi/DeepLearning2015/

