1.【Code】Visual Question Answering in PyTorch
This repo was made by Remi Cadene (LIP6) and Hedi Ben-Younes (LIP6-Heuritech), two PhD students working on VQA at UPMC-LIP6, and their professors Matthieu Cord (LIP6) and Nicolas Thome (LIP6-CNAM). We developed this code as part of a research paper called MUTAN: Multimodal Tucker Fusion for VQA, which is (as far as we know) the current state of the art on the VQA-1 dataset.
The goal of this repo is twofold:
- to make it easier to reproduce our results,
- to provide an efficient and modular code base to the community for further research on other VQA datasets.
If you have any questions about our code or model, don't hesitate to contact us or to open an issue. Pull requests are welcome!
2.【Blog】Why Does Deep Learning Not Have a Local Minimum?
Yes, there is a ‘theoretical justification’, and it has taken a couple of decades to flesh it out.
I will first point out, however, that it has been observed in practice. This was pointed out by LeCun in his early work on LeNet, and is actually discussed in the ‘orange book’, “Pattern Classification” by Richard O. Duda, Peter E. Hart, and David G. Stork.
3.【Blog】Graph-based machine learning: Part I
During the seven-week Insight Data Engineering Fellows Program, recent grads and experienced software engineers learn the latest open-source technologies by building a data platform to handle large, real-time datasets.
4.【Blog】Deep Learning the Stock Market
In the past few months I’ve been fascinated with “Deep Learning”, especially its applications to language and text. I’ve spent the bulk of my career in financial technologies, mostly in algorithmic trading and alternative data services. You can see where this is going.
I wrote this to get my ideas straight in my head. While I’ve become a “Deep Learning” enthusiast, I don’t have too many opportunities to brain dump an idea in most of its messy glory. I think that a decent indication of a clear thought is the ability to articulate it to people not from the field. I hope that I’ve succeeded in doing that and that my articulation is also a pleasurable read.
5.【Paper】Deep Learning in Trading
Current state of the art
LSTM is the holy grail of sequence prediction.
A major part of financial modelling is sequence prediction, whether that's volatility models, volume models, or the toughest one of all: return-prediction models. The underlying task in such problems is this: given a sequence of values, can we predict the next number in the sequence?
LSTM models naturally fit this task because of their recursive nature. Additionally, the hidden state and the memory cell help tremendously in retaining the useful features of the sequence.
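To make the roles of the hidden state and memory cell concrete, here is a minimal sketch of a single LSTM cell step in plain NumPy. The weights are random and untrained, and all names and sizes here are illustrative assumptions, not from the paper; the point is only to show how the memory cell `c` carries sequence features forward while the hidden state `h` is re-fed at every step.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: x is the current input; h_prev/c_prev carry
    what the network remembers about the sequence so far."""
    z = W @ np.concatenate([x, h_prev]) + b  # joint pre-activation for all gates
    H = h_prev.size
    f = sigmoid(z[0:H])        # forget gate: what to drop from memory
    i = sigmoid(z[H:2*H])      # input gate: what new info to store
    o = sigmoid(z[2*H:3*H])    # output gate: what to expose
    g = np.tanh(z[3*H:4*H])    # candidate memory content
    c = f * c_prev + i * g     # memory cell retains useful features
    h = o * np.tanh(c)         # hidden state fed recursively to the next step
    return h, c

# Toy usage: run a short sequence through the cell (random weights --
# this illustrates the recursion only, not a trained predictor).
rng = np.random.default_rng(0)
H, D = 4, 1
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for value in [1.0, 2.0, 3.0, 4.0]:
    h, c = lstm_step(np.array([value]), h, c, W, b)
# A linear readout on h would then produce the next-value prediction.
```

In practice one would use a framework implementation such as PyTorch's `nn.LSTM` rather than hand-rolling the cell; the gating structure is the same.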
Feature engineering is a thing of the past in the era of neural networks.
Neural networks are really good at coming up with features on their own. A number of people in finance work day in, day out on coming up with features. Neural nets are poised to take over this segment of the market.
Neural networks provide an easy way to combine market data and other data sources.
Since neural nets work in a latent space, it's super easy to combine your market-data input with other data sources you might have. That can be anything from sentiment analysis and summaries of SEC filings to visual or audio inputs. Additionally, neural networks make it easy to do multivariate modelling where there are a lot of relationships.
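The "combine in latent space" idea can be sketched in a few lines: separate encoders map each modality to a vector, the vectors are concatenated, and an ordinary dense layer mixes them. Everything below (the variable names, the embedding sizes, the random stand-in embeddings) is a hypothetical illustration of this fusion pattern, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for latent embeddings produced by separate, per-modality encoders.
market_latent = rng.standard_normal(16)    # e.g. encoded price/volume history
sentiment_latent = rng.standard_normal(8)  # e.g. encoded news sentiment
filing_latent = rng.standard_normal(8)     # e.g. encoded SEC-filing summary

# Fusion is just concatenation in the latent space...
joint = np.concatenate([market_latent, sentiment_latent, filing_latent])

# ...followed by an ordinary dense layer over the joint representation,
# which learns cross-modality relationships end to end.
W = rng.standard_normal((4, joint.size)) * 0.1
fused = np.tanh(W @ joint)  # shared features across all modalities
```

Adding a new data source under this design is just one more encoder and a wider concatenation, which is why the latent-space view makes multimodal modelling cheap.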
It's important to understand when neural networks do not work: they don't work if you don't have enough data.
Small datasets are bottlenecks when it comes to convergence; large datasets come with computation problems.