Rio vs Josh, French Defense
Table of Contents
- 1 C10 French Defense: Rubinstein Variation, Blackburne Defense
- 2 Comments
- 3 Starting Position
- 3.1 Nxf6+
- 3.2 Bg5 (game)
- 3.2.1 6. Bg5 h6
- 3.2.2 6. Bg5 Be7
- 3.2.3 If Black takes back with the bishop
- 3.2.4 If Black takes back with the knight (game)
- 3.2.4.1 9...h6?? loses to h4, White offering to sacrifice the bishop; if Black takes
- 3.2.4.2 9...h6?? loses to h4, White offering to sacrifice the bishop; if Black does not take
- 3.2.4.3 9...c5
- 3.2.4.4 9...Bd7
- 3.2.4.5 9...b6 (weakens the light-square diagonal)
- 3.2.4.6 9...c5
- 3.2.4.7 9...b6 (a good move; Black wants to play Bb7)
- 3.2.5 9. O-O (game)
Kubernetes notes
A simple collection of notes from my explorations of a Kubeflow cluster.
Using Roberta classification head for fine-tuning a pre-trained model
An example showing how to use the Huggingface Roberta model for fine-tuning on a classification task, starting from a pre-trained model. The task is binary classification of SMILES representations of molecules.
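A minimal sketch of the idea, assuming a generic 'roberta-base' checkpoint and toy SMILES strings (a chemistry-specific checkpoint and tokenizer would be the realistic choice):

```python
import torch
from transformers import RobertaTokenizer, RobertaForSequenceClassification

# Hypothetical setup: a generic checkpoint and toy SMILES labels.
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

smiles = ["CCO", "c1ccccc1"]   # toy molecules
labels = torch.tensor([0, 1])  # toy binary targets

batch = tokenizer(smiles, padding=True, return_tensors="pt")
outputs = model(**batch, labels=labels)  # loss comes from the classification head
outputs.loss.backward()                  # follow with any optimizer step to fine-tune
```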
Siamese Network using Pytorch with simulated scatter plot data.
Siamese network in Keras to detect pairs of scatter plots that are similar
Simple Seq2Seq machine translation using GRU-based encoder-decoder architecture
Adapted from https://d2l.ai/chapter_recurrent-modern/seq2seq.html
Pytorch - simple GRU experiment
Create a simple GRU layer using Pytorch. Feed a tensor of shape batch_size × num_steps × input_size and observe the GRU output. Next, feed the same input tensor one time step at a time, ensuring that the previous time step's hidden state becomes the initial state for the current time step. The outputs should be the same.
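A small sketch of the experiment (layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
gru = nn.GRU(input_size=4, hidden_size=8, batch_first=True)
x = torch.randn(2, 5, 4)  # batch_size x num_steps x input_size

# Whole sequence in one call.
out_full, h_full = gru(x)

# One time step at a time, carrying the hidden state forward.
h = None
steps = []
for t in range(x.size(1)):
    out_t, h = gru(x[:, t:t + 1, :], h)  # previous hidden state seeds this step
    steps.append(out_t)
out_steps = torch.cat(steps, dim=1)

print(torch.allclose(out_full, out_steps, atol=1e-6))  # True
```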
How is cross entropy computed in Pytorch?
cross_entropy(logits, class) = -log(softmax(logits)[class])
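A quick check of that identity:

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 0.5, -1.0]])
target = torch.tensor([0])

# Built-in loss.
loss = F.cross_entropy(logits, target)

# Manual computation: negative log of the softmax probability of the true class.
manual = -torch.log(F.softmax(logits, dim=1)[0, target[0]])

print(torch.allclose(loss, manual))  # True
```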
Comparing Dense with and without TimeDistributed when return_sequences=True in Keras. The results are identical.
```python
from __future__ import print_function
```
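The post's full code is not reproduced here; below is a minimal sketch of the comparison, assuming tf.keras and copying the Dense weights into the TimeDistributed wrapper so the two paths are directly comparable:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

x = np.random.rand(2, 5, 3).astype("float32")   # batch x timesteps x features
seq = layers.LSTM(4, return_sequences=True)(x)  # 3-D output: batch x timesteps x units

dense = layers.Dense(6)
y_dense = dense(seq)                             # Dense maps the last axis of a 3-D tensor

td = layers.TimeDistributed(layers.Dense(6))
_ = td(seq)                                      # build the wrapped layer
td.layer.set_weights(dense.get_weights())        # same kernel and bias in both
y_td = td(seq)

print(np.allclose(y_dense.numpy(), y_td.numpy()))  # True: the results are identical
```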
Building a multilayer GRU from single GRU cells with Pytorch.
First use nn.GRU with 3 layers for processing sequences. Then use nn.GRUCell for doing the same.
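A sketch of the manual stack (weights are not copied from the nn.GRU reference, so this shows the structure rather than a numerical match):

```python
import torch
import torch.nn as nn

input_size, hidden_size, num_layers = 4, 8, 3
seq = torch.randn(5, 2, input_size)  # num_steps x batch x input_size

# Reference: a 3-layer nn.GRU.
gru = nn.GRU(input_size, hidden_size, num_layers=num_layers)

# The same stack built from individual GRUCells.
cells = nn.ModuleList(
    [nn.GRUCell(input_size if l == 0 else hidden_size, hidden_size)
     for l in range(num_layers)]
)

h = [torch.zeros(2, hidden_size) for _ in range(num_layers)]
outputs = []
for t in range(seq.size(0)):
    inp = seq[t]
    for l, cell in enumerate(cells):
        h[l] = cell(inp, h[l])  # each layer feeds the next
        inp = h[l]
    outputs.append(inp)          # top-layer output at step t
out = torch.stack(outputs)       # num_steps x batch x hidden_size
```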
Simple character level LSTM using Pytorch.
Implements simple character level name classification using Pytorch. Training is done using about 20K names across 18 languages. The names are clubbed into three categories for simplicity: English, Russian, and Other. Using SGD as the optimizer produces poor results; Adam performs better, and Nadam better still.
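A minimal sketch of such a model, assuming names arrive as integer index tensors (the ASCII encoding below is just for illustration):

```python
import torch
import torch.nn as nn

N_CHARS, N_CLASSES = 128, 3  # ASCII vocabulary; English / Russian / Other

class NameClassifier(nn.Module):
    def __init__(self, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(N_CHARS, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, N_CLASSES)

    def forward(self, x):                 # x: batch x name_length
        _, (h, _) = self.lstm(self.embed(x))
        return self.fc(h[-1])             # classify from the final hidden state

model = NameClassifier()
name = torch.tensor([[ord(c) for c in "ivanov"]])  # toy encoding
logits = model(name)
loss = nn.functional.cross_entropy(logits, torch.tensor([1]))
```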
Simple character level LSTM using Keras.
Implements simple character level name classification using Keras LSTM and Dense layers. Training is done using about 20K names across 18 languages. The names are clubbed into three categories for simplicity: English, Russian, and Other. Using SGD as the optimizer produces poor results; Adam performs better, and Nadam better still.
Simple Generative Adversarial Network
Simple Generative Adversarial Network to generate datapoints from a simple one-dimensional function (adapted from https://machinelearningmastery.com/how-to-develop-a-generative-adversarial-network-for-a-1-dimensional-function-from-scratch-in-keras/).
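A compact sketch of the setup, assuming a quadratic target \(f(x) = x^2\) (the sort of one-dimensional function the linked tutorial uses):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 5

# Discriminator: real (x, f(x)) pairs vs generated ones.
disc = keras.Sequential([layers.Dense(25, activation="relu", input_dim=2),
                         layers.Dense(1, activation="sigmoid")])
disc.compile(loss="binary_crossentropy", optimizer="adam")

# Generator: latent noise -> 2-D points.
gen = keras.Sequential([layers.Dense(15, activation="relu", input_dim=latent_dim),
                        layers.Dense(2)])

# Combined model trains the generator through a frozen discriminator.
disc.trainable = False
gan = keras.Sequential([gen, disc])
gan.compile(loss="binary_crossentropy", optimizer="adam")

for step in range(1000):
    x = np.random.rand(64, 1) - 0.5           # real samples from f(x) = x^2
    real = np.hstack([x, x * x])
    fake = gen.predict(np.random.randn(64, latent_dim), verbose=0)
    disc.train_on_batch(np.vstack([real, fake]),
                        np.vstack([np.ones((64, 1)), np.zeros((64, 1))]))
    # The generator tries to make the discriminator say "real".
    gan.train_on_batch(np.random.randn(64, latent_dim), np.ones((64, 1)))
```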
Encode a protein to an image using Hilbert curves.
```python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os
import itertools

import numpy as np
```
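A minimal sketch of the encoding, using the classic distance-to-coordinate Hilbert conversion and a made-up amino-acid-to-intensity mapping:

```python
import numpy as np

def d2xy(n, d):
    """Map distance d along a Hilbert curve to (x, y) on an n x n grid
    (n a power of two). Classic iterative conversion."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                       # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Toy protein; amino acids become integer intensities laid along the curve.
protein = "MKTAYIAKQR"
aa_codes = {a: i + 1 for i, a in enumerate(sorted(set(protein)))}
img = np.zeros((16, 16), dtype=np.uint8)
for d, aa in enumerate(protein):
    x, y = d2xy(16, d)
    img[y, x] = aa_codes[aa]
```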
Visualize a given substructure in a given molecule using rdkit and python.
Given the smiles of a molecule and the smiles of a possible substructure, find the atoms of the substructure in the molecule. Visualize the molecule with the substructure atoms highlighted in green.
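A small sketch with rdkit, using aspirin and a carboxyl fragment as stand-in inputs:

```python
from rdkit import Chem
from rdkit.Chem import Draw

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin (toy molecule)
patt = Chem.MolFromSmiles("C(=O)O")                 # toy substructure

match = mol.GetSubstructMatch(patt)                 # atom indices in mol
img = Draw.MolToImage(mol, highlightAtoms=list(match),
                      highlightColor=(0.0, 1.0, 0.0))  # green
img.save("highlighted.png")
```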
Encoding a set of Graphs using Neural Message Passing
You will need a GPU and CUDA with Pytorch. The data used in the code is linked from the post.
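A toy sketch of one message-passing round on a dense adjacency matrix (the message and update choices here are illustrative, not the post's exact model):

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One round of neural message passing on a dense adjacency matrix."""
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)   # message function
        self.upd = nn.GRUCell(dim, dim)  # node update function

    def forward(self, h, adj):
        # Aggregate messages from neighbours: m_v = sum over u in N(v) of msg(h_u)
        m = adj @ self.msg(h)
        # Update each node state with its aggregated message.
        return self.upd(m, h)

# Toy graph: 4 nodes with 8-dim states.
h = torch.randn(4, 8)
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 1],
                    [0, 1, 0, 0],
                    [0, 1, 0, 0]], dtype=torch.float)
h = MessagePassingLayer(8)(h, adj)
```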
Implementation of End-To-End Memory Networks for Language Modeling
Tensorflow implementation of End-To-End Memory Networks for the language modeling task. I tried to name the variables as closely as possible to those in the paper, following the equations, to help understand the paper. Don’t forget to change the “input_file” to your input file. Some of the ideas are borrowed from an earlier implementation. The python notebook of the code can be found here.
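For orientation, a numpy sketch of a single memory hop following the paper's equations (toy dimensions):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

d, n_mem = 4, 6
u = np.random.randn(d)         # query/controller state
m = np.random.randn(n_mem, d)  # input memory representations (A-embedded)
c = np.random.randn(n_mem, d)  # output memory representations (C-embedded)

p = softmax(m @ u)             # p_i = softmax(u^T m_i)
o = p @ c                      # o = sum_i p_i c_i
u_next = u + o                 # controller state for the next hop
```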
Simple data feeding to Deepchem framework in tensorflow style
Deepchem provides a wonderful framework and library for developing deep learning and machine learning predictive models for small molecules. However, its understandably complex pythonic architecture and equally inexplicable lack of documentation (except the raw python function descriptions and a handful of tutorials) make it very hard to get beneath the surface and engineer it to fit your own needs, particularly if you are not a physics, chemistry, deep learning, and programming major. Here I chronicle my efforts to enable training with deepchem where we feed the data to the tensorflow graph using feed_dict (a term familiar to people who use tensorflow), rather than using deepchem's standard fit or fit_generator functions (which make deepchem something of a black box).
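A generic sketch of the feed_dict pattern in TF1-style code (the model here is a stand-in, not deepchem's graph):

```python
import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()  # feed_dict belongs to the TF1-style graph API

# Placeholders stand in for the data deepchem would normally feed for us.
X = tf.placeholder(tf.float32, shape=[None, 10])
y = tf.placeholder(tf.float32, shape=[None, 1])

pred = tf.layers.dense(X, 1)
loss = tf.reduce_mean(tf.square(pred - y))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        batch_x = np.random.rand(32, 10)
        batch_y = np.random.rand(32, 1)
        # Each step binds concrete numpy arrays to the placeholders.
        _, l = sess.run([train_op, loss], feed_dict={X: batch_x, y: batch_y})
```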
Graph Convolutions to predict Solubility for Molecules
This writeup shows how to use graph convolutions for a regression-like problem using the DeepChem library.
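A minimal sketch of the DeepChem workflow, assuming the Delaney (ESOL) solubility benchmark:

```python
import deepchem as dc

# Load the Delaney solubility benchmark with graph-conv featurization.
tasks, datasets, transformers = dc.molnet.load_delaney(featurizer="GraphConv")
train, valid, test = datasets

# Graph-convolution model in regression mode.
model = dc.models.GraphConvModel(n_tasks=len(tasks), mode="regression")
model.fit(train, nb_epoch=10)

metric = dc.metrics.Metric(dc.metrics.pearson_r2_score)
print(model.evaluate(valid, [metric], transformers))
```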
Graph Convolutions using Protein structures
This post shows how to implement a simple graph convolutional deep learning method to predict interfaces between protein residues, i.e. given a pair of interacting proteins, can we classify a pair of amino acid residues as interacting or not. This is based on the paper published in NIPS 2017 (Protein Interface Prediction using Graph Convolutional Networks).
Read csv file with variable number of fields using Tensorflow
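One way to sketch this in TF2, where splitting each scalar line yields a variable-length 1-D tensor, so rows can differ in length:

```python
import tensorflow as tf

# Lines with different numbers of comma-separated fields.
lines = tf.data.Dataset.from_tensor_slices(["1,2,3", "4,5", "6,7,8,9"])

fields = lines.map(lambda s: tf.strings.to_number(tf.strings.split(s, ",")))
for row in fields:
    print(row.numpy())  # arrays of length 3, 2, 4
```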
Simple Named Entity Recognition (NER) with tensorflow
Given a piece of text, NER seeks to identify named entities in the text and classify them into various categories such as names of persons, organizations, locations, expressions of time, quantities, percentages, etc. Here we just want to build a model to predict \(N_c = 5\) classes for every word in a sentence: PER (person), ORG (organization), LOC (location), MISC (miscellaneous) and O (null class, not a named entity).
Tensorflow Recursive Neural Network (ReNN) simple example
A simple example demonstrating the use of TensorArray to create a recursive neural network, essentially a tree structured neural network.
Tensorflow TensorArray Simple Example
A small example on how to use Tensorflow TensorArray.
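A tiny example:

```python
import tensorflow as tf

# Accumulate squares of 0..4 in a dynamically sized TensorArray.
ta = tf.TensorArray(tf.float32, size=0, dynamic_size=True, clear_after_read=False)
for i in range(5):
    ta = ta.write(i, float(i * i))  # write returns the updated handle

print(ta.stack().numpy())  # [ 0.  1.  4.  9. 16.]
print(ta.read(3).numpy())  # 9.0
```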
Parameter learning and updates in simple word2vec
A lot of materials on word2vec models such as Skipgram and CBOW are available that explain the models really well. This post is just a drop in that ocean, trying to clarify some of the details that I found useful in understanding the internals, explaining the models in line with (almost the same) terminology used in the NLP lectures CS224n. Two other resources that I find very useful are word2vec Parameter Learning Explained and word2vec Explained: deriving Mikolov et al.’s negative-sampling word-embedding method.
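For reference, the skip-gram softmax and the gradient of its loss with respect to the center vector, in the CS224n-style notation the post follows:

\[ P(o \mid c) = \frac{\exp(u_o^\top v_c)}{\sum_{w \in V}\exp(u_w^\top v_c)}, \qquad \frac{\partial}{\partial v_c}\bigl(-\log P(o \mid c)\bigr) = -u_o + \sum_{w \in V} P(w \mid c)\, u_w \]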
Tensorflow Cross Entropy Functions
Multinomial (or multiclass) logistic regression (aka softmax regression) with tensorflow
An example of fitting a parameterized model with Tensorflow: define logistic regression with multiple classes to predict.
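A minimal sketch in TF2 style (random data, arbitrary sizes):

```python
import tensorflow as tf

n_samples, n_features, n_classes = 100, 4, 3
X = tf.random.uniform([n_samples, n_features])
y = tf.random.uniform([n_samples], maxval=n_classes, dtype=tf.int32)

# Parameters of softmax regression: logits = X W + b.
W = tf.Variable(tf.zeros([n_features, n_classes]))
b = tf.Variable(tf.zeros([n_classes]))
opt = tf.keras.optimizers.SGD(0.5)

for _ in range(200):
    with tf.GradientTape() as tape:
        logits = tf.matmul(X, W) + b
        loss = tf.reduce_mean(
            tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
    grads = tape.gradient(loss, [W, b])
    opt.apply_gradients(zip(grads, [W, b]))
print(float(loss))
```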
Denoising Images using Ising model
This post develops on the Ising model concepts from my previous blog, see Ising model. Consider the problem of reconstructing a black-and-white image (i.e. each pixel is either 1 or -1) from corrupted observations. We assume that there is an underlying (hidden) noise-free image from which the observed image (Figure 1) is generated by adding noise: randomly flipping the pixel values with a small probability (say 20%). Given the observed noisy image (modeled by random variable \(\textbf{Y}\)), we want to recover the original noise-free image (modeled by random variable \(\textbf{X}\)).
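A common energy function for this setup (the form used in Bishop's PRML treatment; the coefficient names here are illustrative) couples each pixel to its neighbors and to its noisy observation:

\[ E(\textbf{x}, \textbf{y}) = h\sum_i x_i \;-\; \beta\sum_{\{i,j\}} x_i x_j \;-\; \eta\sum_i x_i y_i \]

Lower energy favors agreement between neighboring pixels and between each \(x_i\) and its observation \(y_i\); it can be minimized, for example, by flipping one \(x_i\) at a time.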
Ising Model
Please refer to my earlier posts on MRF and GRF for getting used to the notations. If we consider clique potentials for sizes of up to 2, the energy function is the sum of the clique potentials over all cliques of size 1 and size 2,
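\[ E(\textbf{x}) = \sum_{i \in S} V_1(x_i) \;+\; \sum_{i \in S}\sum_{j \in N_i} V_2(x_i, x_j) \]

(the standard second-order form, assuming the neighborhood notation \(N_i\) from the earlier posts).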
Gibbs Random Field (GRF)
Let's see what a GRF is and how it is connected to the MRF. If you are new to my blogs, please visit my blog on MRF to get familiar with the notations. A GRF can be thought of as a graphical representation of a family of random variables \( \textbf{X} =\{ X_1,X_2,…,X_n \} \) on a set \( S=\{1,2,…,n\} \) of sites. The relationship between the random variables is defined using a neighborhood system. A GRF obeys the Gibbs distribution given by,
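\[ P(\textbf{x}) = \frac{1}{Z}\, e^{-U(\textbf{x})/T}, \qquad Z = \sum_{\textbf{x}} e^{-U(\textbf{x})/T} \]

where \(U(\textbf{x})\) is the energy function, \(T\) the temperature, and \(Z\) the partition function (the standard form of the Gibbs distribution).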
Markov Random Fields (MRF)
A short intro to MRFs. Let \( \textbf{X} =\{ X_1,X_2,…,X_n \} \) be a family of random variables defined on a set \( S=\{1,2,…,n\} \) of sites. As an example, \( S \) can represent the pixel positions of an \( m \times m \) image in a 2-D lattice \( \{(i,j) | 1 \leq i,j \leq m\} \) where the double indexing can be recoded to univariate indexing by \( (i,j) \rightarrow (i-1)m+j \) so that \( S=\{ 1,2,…,m^2 \} \).
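For reference, the two defining MRF conditions in this notation are positivity and Markovianity:

\[ P(\textbf{x}) > 0, \qquad P(x_i \mid x_{S \setminus \{i\}}) = P(x_i \mid x_{N_i}) \]

where \(N_i\) is the neighborhood of site \(i\).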
Basic machine learning with python sklearn for classification
A demonstration of how to use python package sklearn for a basic machine learning task : classification.
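A minimal end-to-end example with sklearn's built-in iris data:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Classic toy classification task.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```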
You're up and running!
Next you can update your site name, avatar and other options using the _config.yml file in the root of your repository.