1. [Blog] Types of Optimization Algorithms used in Neural Networks and Ways to Optimize Gradient Descent
Have you ever wondered which optimization algorithm to use for your neural network model to produce slightly better and faster results by updating model parameters such as the weights and bias values? Should we use Gradient Descent, Stochastic Gradient Descent, or Adam?
Before writing this article, I too didn't know the major differences between these optimization strategies, or which one is better than another.
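To make the comparison concrete, here is a minimal sketch of the two simplest update rules the article contrasts: batch gradient descent (one update per pass over the data, using the average gradient) versus stochastic gradient descent (one update per sample). The 1-D linear model, data, and learning rate below are made up purely for illustration:

```python
import random

# Toy data generated from y = 2x; we fit y ≈ w * x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def grad(w, x, y):
    # Derivative of the squared error 0.5 * (w*x - y)^2 w.r.t. w.
    return (w * x - y) * x

def gradient_descent(w, lr=0.01, epochs=100):
    # Batch GD: one parameter update per epoch, using the mean gradient.
    for _ in range(epochs):
        g = sum(grad(w, x, y) for x, y in zip(xs, ys)) / len(xs)
        w -= lr * g
    return w

def sgd(w, lr=0.01, epochs=100, seed=0):
    # SGD: one parameter update per sample, in shuffled order.
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            w -= lr * grad(w, xs[i], ys[i])
    return w

w_gd = gradient_descent(0.0)
w_sgd = sgd(0.0)
# Both estimates approach the true slope 2.0.
```

Adam and the other optimizers discussed in the post build on the SGD update by adding per-parameter adaptive learning rates and momentum terms.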
2. [Blog] Exploring Elliptic Curve Pairings
One of the key cryptographic primitives behind various constructions, including deterministic threshold signatures, zk-SNARKs and other simpler forms of zero-knowledge proofs, is the elliptic curve pairing. Elliptic curve pairings (or “bilinear maps”) are a recent addition to a 30-year history of using elliptic curves for cryptographic applications such as encryption and digital signatures; pairings introduce a form of “encrypted multiplication”, greatly expanding what elliptic curve-based protocols can do. The purpose of this article is to examine elliptic curve pairings in detail and give a general outline of how they work.
You’re not expected to understand everything here the first time you read it, or even the tenth time; this stuff is genuinely hard. But hopefully this article will give you at least a bit of an idea as to what is going on under the hood.
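The defining property of a pairing is bilinearity: e(aP, bQ) = e(P, Q)^(a·b). The snippet below is NOT a real elliptic curve pairing (real constructions use curve points and the Weil or Tate pairing); it is only a toy that mimics the algebra with modular exponentiation, so the “encrypted multiplication” property is visible. The modulus, generator, and scalars are arbitrary choices for illustration:

```python
# Toy illustration of bilinearity, not a real pairing.
p = 1000003          # an arbitrary prime modulus
g = 2                # a generator in the multiplicative group mod p

def e(x, y):
    # A fake "pairing": maps two additive values to g^(x*y) mod p.
    return pow(g, x * y, p)

P, Q = 1, 1          # stand-ins for the pairing's base points
a, b = 1234, 5678    # secret scalars

# Bilinearity: pairing the a-multiple of P with the b-multiple of Q
# equals the pairing of P and Q raised to the power a*b.
left = e(a * P, b * Q)
right = pow(e(P, Q), a * b, p)
print(left == right)  # True
```

This is exactly the property that lets protocols check multiplicative relations between secret scalars while only ever seeing “encrypted” (exponentiated) values.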
3. [Video] Phase-Functioned Neural Networks for Character Control
This year at SIGGRAPH I am presenting Phase-Functioned Neural Networks for Character Control. This paper uses a new kind of neural network called a "Phase-Functioned Neural Network" to create a character controller suitable for games. Our controller requires very little memory, is fast to compute at runtime, and generates high-quality motion in many complex situations. We also present a technique for fitting terrains from virtual environments to separately captured motion data. This is used to train our system so it can naturally traverse rough terrains at runtime.
4. [Code] NLP concepts with spaCy
“Natural Language Processing” is a field at the intersection of computer science, linguistics and artificial intelligence which aims to make the underlying structure of language available to computer programs for analysis and manipulation. It’s a vast and vibrant field with a long history! New research and techniques are being developed constantly.
The aim of this notebook is to introduce a few simple concepts and techniques from NLP—just the stuff that’ll help you do creative things quickly, and maybe open the door for you to understand more sophisticated NLP concepts that you might encounter elsewhere.
We'll be using a library called spaCy, which strikes a good balance between being powerful and state-of-the-art and being easy for newcomers to understand.
(Traditionally, most NLP work in Python was done with a library called NLTK. NLTK is a fantastic library, but it’s also a writhing behemoth: large and slippery and difficult to understand. Also, much of the code in NLTK lags decades behind contemporary practices in NLP.)
This tutorial is written in Python 2.7, but the concepts should translate easily to later versions.
5. [Code] A TensorFlow Implementation of the Transformer: Attention Is All You Need
I tried to implement the idea in Attention Is All You Need. The authors claimed that their model, the Transformer, outperformed the state of the art in machine translation using only attention, with no CNNs and no RNNs. How cool is that! At the end of the paper, they promise to make their code available soon, but apparently it has not been released yet. I have two goals with this project. One is to gain a full understanding of the paper; it's often hard for me to grasp an idea well before writing some code for it. The other is to share my code with people who are interested in this model before the official code is unveiled.
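The core operation of the paper, scaled dot-product attention, is Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V. The repository implements this in TensorFlow; the sketch below restates just that one equation in plain Python with toy shapes and values, to show what the building block computes:

```python
import math

def matmul(A, B):
    # Naive matrix multiply for small lists-of-lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [v / s for v in exps]

def scaled_dot_product_attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V — the Transformer's core operation.
    d_k = len(K[0])
    scores = [[sum(q * k for q, k in zip(qrow, krow)) / math.sqrt(d_k)
               for krow in K] for qrow in Q]
    weights = [softmax(row) for row in scores]
    return matmul(weights, V), weights

# Toy example: 2 queries attending over 3 key/value pairs (d_k = 2).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out, weights = scaled_dot_product_attention(Q, K, V)
# Each row of `weights` sums to 1, so each output row is a convex
# combination of the rows of V.
```

Multi-head attention in the paper simply runs several of these in parallel on learned linear projections of Q, K, and V and concatenates the results.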