AllenOR灵感 · Posted 8 months ago


1. [Code] Machine Learning From Scratch

Python implementations of some of the fundamental Machine Learning models and algorithms from scratch.

While some of the matrix operations implemented here by hand (such as computing the covariance matrix) are available in numpy, I decided to implement them as well to make sure I understand how the linear algebra is applied. The project uses scikit-learn only to evaluate the implementations on sklearn.datasets.
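As an illustration of the kind of hand-rolled operation described above, a covariance matrix can be computed from first principles and checked against numpy in a few lines. The function name here is just for illustration, not necessarily what the repository uses:

```python
import numpy as np

def covariance_matrix(X):
    """Sample covariance of the columns of X (rows are observations)."""
    n_samples = X.shape[0]
    centered = X - X.mean(axis=0)            # subtract each column's mean
    return centered.T @ centered / (n_samples - 1)

X = np.array([[2.0, 1.0], [4.0, 3.0], [6.0, 5.0]])
# Agrees with numpy's built-in (rowvar=False treats columns as variables):
assert np.allclose(covariance_matrix(X), np.cov(X, rowvar=False))
```

Re-deriving results like this and asserting they match the library is exactly the self-educational exercise the project describes.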

The purpose of this project is purely self-educational.

Feel free to reach out if you can think of ways to expand this project.

2. [Paper] RNN models for image generation

Today we’re looking at the remaining papers from the unsupervised learning and generative networks section of the 'top 100 awesome deep learning papers' collection. These are:

• DRAW: A recurrent neural network for image generation, Gregor et al., 2015
• Pixel recurrent neural networks, van den Oord et al., 2016
• Auto-encoding variational Bayes, Kingma & Welling, 2014
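As a taste of the last paper, the "reparameterization trick" at the heart of auto-encoding variational Bayes, together with the closed-form KL term from the paper, can be sketched in a few lines of numpy. This is a toy illustration, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, I),
    which keeps the sample differentiable with respect to mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

z = reparameterize(np.zeros(3), np.zeros(3))   # a sample from N(0, I)
```

Note that when mu = 0 and log_var = 0 the posterior equals the prior, so the KL term is exactly zero.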

3. [Blog] An Interactive Tutorial on Numerical Optimization

Numerical Optimization is one of the central techniques in Machine Learning. For many problems it is hard to figure out the best solution directly, but it is relatively easy to set up a loss function that measures how good a solution is - and then minimize that function over its parameters to find the solution.
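The minimize-a-loss-function recipe described above can be sketched with the simplest such routine, gradient descent (a generic illustration, not the blog's JavaScript code):

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# Loss f(x, y) = (x - 3)^2 + (y + 1)^2, whose minimum is at (3, -1).
grad = lambda p: np.array([2 * (p[0] - 3), 2 * (p[1] + 1)])
print(gradient_descent(grad, [0.0, 0.0]))   # close to [3, -1]
```

The learning rate and the initial location `x0` are exactly the kinds of hyperparameters the interactive visualizations let you play with.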

I ended up writing a bunch of numerical optimization routines back when I was first trying to learn JavaScript. Since I had all this code lying around anyway, I thought it might be fun to provide some interactive visualizations of how these algorithms work.

The cool thing about this post is that the code all runs in the browser, meaning you can interactively set hyperparameters for each algorithm, change the initial location, and change which function is being minimized to get a better sense of how these algorithms work.

All the code for this post is up on GitHub if you want to check it out; it includes both the minimization functions and all of the visualizations.

4. [Code] Approximate Nearest Neighbor Search for Sparse Data in Python

This library is well suited to finding nearest neighbors in sparse, high-dimensional spaces (such as text documents).

Out of the box, PySparNN supports Cosine Distance (i.e. 1 - cosine_similarity).

PySparNN benefits:

• Designed to be efficient on sparse data (memory & cpu).
• Implemented leveraging existing python libraries (scipy & numpy).
• Easily extended with other metrics: Manhattan, Euclidean, Jaccard, etc.
• Max distance thresholds can be set at query time (not index time). I.e. return the k closest items no more than max_distance from the query point.
• Supports incremental insertion of elements.
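The cosine distance that PySparNN supports out of the box is straightforward to compute yourself on scipy sparse vectors; the snippet below is a generic illustration of the metric, not PySparNN's internal code:

```python
import numpy as np
from scipy.sparse import csr_matrix

def cosine_distance(a, b):
    """1 - cosine_similarity for two sparse row vectors."""
    dot = a.multiply(b).sum()
    norms = np.sqrt(a.multiply(a).sum()) * np.sqrt(b.multiply(b).sum())
    return 1.0 - dot / norms

a = csr_matrix([[1.0, 0.0, 2.0, 0.0]])
b = csr_matrix([[1.0, 0.0, 2.0, 0.0]])
print(cosine_distance(a, b))   # ~0 for identical vectors
```

Because the computation only touches nonzero entries, this metric stays cheap even when the vectors have millions of dimensions - which is why it is a natural default for sparse text data.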

5. [Blog & Video] A gentle introduction to PyTorch and TensorFlow, with a Reddit link

This is the first post of the week. I will use it to introduce some Python libraries that are being widely adopted by the deep learning community. I will also mention today that The Information Age will change its weekly schedule from five posts per week to three. The reason is that I plan to start another blog project soon and will be busy in the meantime. That new project will be closely related to most of the content I have been posting here, so the posts here should gain even more from this diversified schedule. I intend to post on Mondays, Wednesdays, and Fridays; if for some reason this changes, I will give notice in advance or mention the change in the relevant post.

Today I will share a video introducing a course that lectures on the PyTorch and TensorFlow Python/C++ libraries, which are taking ever deeper root in the deep learning and artificial intelligence communities. Below the video, I also share a link to a fascinating Reddit thread featuring a Q&A comparing the advantages and shortcomings of those two libraries, which I thought would be highly appropriate reading for anyone involved or interested in these subjects. Some of the highlights are in bold quotes, as usual on this blog.
