Code

Introduction to TensorFlow and Neural Networks

Here I include Python notebooks to get started with TensorFlow, Neural Networks (NNs), Convolutional NNs, Word Embeddings and Recurrent NNs. This is a personal wrap-up of the material provided by Google's Deep Learning course on Udacity, so all credit goes to them. Python 3.5 required!

Notebook 1: How to train a logistic regressor and a 2-layer NN with L2-norm regularization using TensorFlow. [Files]
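Notebook 1 builds the model in TensorFlow; as a framework-agnostic illustration of the objective it minimizes, here is a minimal NumPy sketch of logistic regression with an L2-norm penalty on the weights (all function names and hyperparameters are my own, not the notebook's):

```python
import numpy as np

def l2_logistic_loss(W, b, X, y, lam):
    """Cross-entropy loss of a logistic regressor plus an L2 penalty on W."""
    z = X @ W + b                      # logits, shape (n,)
    p = 1.0 / (1.0 + np.exp(-z))      # sigmoid probabilities
    eps = 1e-12                        # avoid log(0)
    ce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return ce + lam * np.sum(W ** 2)   # L2-norm regularization term

def grad_step(W, b, X, y, lam, lr=0.1):
    """One gradient-descent update; gradients written out by hand,
    whereas TensorFlow would derive them automatically."""
    z = X @ W + b
    p = 1.0 / (1.0 + np.exp(-z))
    err = (p - y) / len(y)
    W_new = W - lr * (X.T @ err + 2 * lam * W)
    b_new = b - lr * np.sum(err)
    return W_new, b_new
```

On a linearly separable toy set, a few dozen `grad_step` calls drive the regularized loss down from its initial value.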

Notebook 2: Convolutional NNs and Dropout Regularization [Files]
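The dropout regularization used in Notebook 2 can be summarized in a few lines. This is the standard "inverted dropout" formulation in NumPy (a sketch of the idea, not the notebook's TensorFlow code):

```python
import numpy as np

def dropout(a, keep_prob, rng, train=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob and
    rescale survivors by 1/keep_prob, so the expected activation matches
    the unmodified forward pass used at test time."""
    if not train:
        return a                       # no-op at test time
    mask = rng.random(a.shape) < keep_prob
    return a * mask / keep_prob
```

Because of the rescaling, the mean activation is (in expectation) unchanged between training and test mode.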

Notebook 3: Word Embeddings and the word2vec model [Files]
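The skip-gram variant of word2vec trains on (center word, context word) pairs drawn from a sliding window. A minimal sketch of that pair generation (the function name and signature are illustrative, not from the notebook):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in the skip-gram
    formulation of word2vec: every word within `window` positions of the
    center word becomes one of its contexts."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                       # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs
```

For example, `skipgram_pairs(["the", "quick", "brown", "fox"], window=1)` yields six pairs, two per interior word and one per boundary word.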

Notebook 4: Recurrent NNs and sequential character prediction [Files]
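At the core of the character-prediction RNN in Notebook 4 is a recurrence that maps the current input and previous hidden state to a new state and a distribution over the next character. A single vanilla-RNN step in NumPy (the notebook itself uses TensorFlow cells; shapes and names here are my own):

```python
import numpy as np

def rnn_step(x, h, Wxh, Whh, Why, bh, by):
    """One step of a vanilla RNN: combine input x and previous hidden
    state h into a new state, then emit logits over the character
    vocabulary for next-character prediction."""
    h_new = np.tanh(Wxh @ x + Whh @ h + bh)   # recurrent state update
    logits = Why @ h_new + by                  # unnormalized char scores
    return h_new, logits
```

Unrolling this step over a character sequence and training with cross-entropy on the logits is the essence of sequential character prediction.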

Notebook 5: Recurrent NNs and sequential character prediction from MFCC features with Connectionist Temporal Classification [Files]

Notebook 6: Bi-directional LSTM RNN and sequential character prediction from MFCC features with Connectionist Temporal Classification [Files]

Data Sets

Mini-Course on Inference and Learning in discrete Bayesian Networks

A Python library to perform Belief Propagation (BP) inference over discrete BNs, defined by the user by means of tabular Conditional Probability Distributions (CPDs). Four Python notebooks are included: two describing how to use the library and run BP inference, and two showing how to learn the BN's CPDs from partially hidden observations using EM. Slides are also provided [code]. Python 3.5 required! (Last modified 15th November 2016)
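The library's own API is not reproduced here; as a minimal sketch of the kind of tabular-CPD inference it performs, consider a two-node network A → B, where BP's message passing reduces exactly to Bayes' rule (all names and numbers below are illustrative):

```python
import numpy as np

# Tabular CPDs for a two-node discrete BN, A -> B (hypothetical values).
p_a = np.array([0.6, 0.4])            # prior P(A)
p_b_given_a = np.array([[0.9, 0.1],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)

def posterior_a(b_obs):
    """P(A | B=b_obs) by Bayes' rule. On a tree-shaped BN, belief
    propagation computes exactly these posteriors via its messages."""
    joint = p_a * p_b_given_a[:, b_obs]   # P(A, B=b_obs), elementwise
    return joint / joint.sum()            # normalize over A
```

Observing B=1 here shifts belief toward A=1, since that state explains the evidence far better under the CPD.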

Expected graph evolution during peeling decoding of LDPC codes and SC-LDPC codes constructed from protographs

This MATLAB/MEX software can be used to analyze finite-length LDPC ensembles constructed from protographs over the binary erasure channel (BEC). The user specifies an arbitrary base matrix for the LDPC code and a channel erasure parameter. As a result, the script gives the expected evolution of the fraction of degree-one check nodes in the residual graph during peeling decoding [code].
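The software above is MATLAB/MEX; as a language-neutral illustration of the peeling decoder whose graph evolution it tracks, here is a tiny Python sketch (a toy reference, not the released code): a check node connected to exactly one erased bit determines that bit, the bit is removed from the residual graph, and the process repeats until no degree-one check remains.

```python
def peel(H, erased):
    """Peeling decoding over the BEC. H is a binary parity-check matrix
    (list of rows), `erased` the indices of erased bits. Returns the set
    of bits still erased on exit (empty set = decoding success)."""
    erased = set(erased)
    while True:
        # check-node degrees restricted to the residual (erased) graph
        deg = [sum(H[c][v] for v in erased) for c in range(len(H))]
        ones = [c for c, d in enumerate(deg) if d == 1]
        if not ones:                 # no degree-one check: stop (done or stuck)
            return erased
        c = ones[0]
        v = next(v for v in erased if H[c][v])  # the single erased neighbour
        erased.remove(v)             # bit recovered from the parity check
```

Decoding succeeds exactly when degree-one checks keep appearing until the residual graph is empty, which is why the expected fraction of degree-one checks is the quantity the analysis follows.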