# Incremental Learning - a continuous learning approach

A few days back I read an article about how Bitcoin's price can be sufficiently explained by Metcalfe's law. As you read through this post, resist the temptation to click the links to learn more about Bitcoin, Metcalfe's law, etc. (though if you have not heard of Bitcoin, you may want to read about it first). I have provided all the information you need for a first read of this post.

Bitcoin is a cryptocurrency and worldwide payment system. It is the first decentralized digital currency, as the system works without a central bank or single administrator. The network is peer-to-peer and transactions take place between users directly, without an intermediary. These transactions are verified by network nodes through the use of cryptography and recorded in a public distributed ledger called a blockchain. Bitcoin was invented by an unknown person or group of people under the name Satoshi Nakamoto and released as open-source software in 2009.

Read More

# DSSM (Deep Semantic Similarity Model) - Building in TensorFlow

DSSM is a Deep Neural Network (DNN) used to model semantic similarity between a pair of strings. In simple terms semantic similarity of two sentences is the similarity based on their meaning (i.e. semantics), and DSSM helps us capture that.
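Before diving into DSSM itself, the underlying idea of "similarity between meanings" is often measured as the cosine similarity between two embedding vectors. As a minimal sketch (the 3-dimensional "embeddings" below are made-up toy vectors, not DSSM outputs):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors: 1 = same direction, -1 = opposite."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings for two sentences (hypothetical values, for illustration only)
emb_a = np.array([0.9, 0.1, 0.3])
emb_b = np.array([0.8, 0.2, 0.4])
print(cosine_similarity(emb_a, emb_b))
```

DSSM learns the mapping from raw strings to such vectors so that semantically similar sentences land close together.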

Read More

# Matrix Multiplication - A different perspective

Matrix multiplication is a common binary operation we come across in engineering and mathematics. We see it a lot in machine learning algorithms. Unlike multiplication of scalars, matrices have a prerequisite: the number of columns in the first matrix must equal the number of rows in the second matrix. The output of a valid matrix multiplication has as many rows as the first matrix and as many columns as the second matrix. I visualize matrix multiplication on an `XY-grid` to validate the feasibility of the multiplication and to determine the shape of the output matrix. We will explore this method in this article.
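The shape rule above can be checked in a couple of lines of numpy (the shapes here are arbitrary examples):

```python
import numpy as np

A = np.ones((2, 3))  # 2 rows, 3 columns
B = np.ones((3, 4))  # 3 rows, 4 columns

# Valid: columns of A (3) == rows of B (3)
C = A @ B
print(C.shape)  # (2, 4): rows of A x columns of B
```

Swapping the operands (`B @ A`) would fail here, since B has 4 columns but A has only 2 rows.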

Read More

# BUN functions - Applying the Pareto Principle for generating random numbers in numpy

The easy way to create an array of numbers is to get a bunch of zeros or ones using convenient functions.

```python
np.zeros(shape=(n_rows, n_cols))
np.ones(shape=(n_rows, n_cols))
```

While this works for some cases, in many others we want the elements of the array to be diverse rather than repeating. At this point hardly anyone thinks about creating a magic square! Magic squares do satisfy the diversity criterion, but numpy does not natively have methods to create them, so we will skip them. The popular choice is random numbers, and numpy has a whole bunch of methods for generating them. https://docs.scipy.org/doc/numpy-1.13.0/reference/routines.random.html lists all the methods `numpy.random` has, and they are sufficient for almost all our needs. Numpy has a good collection of simple methods for generating random numbers - `rand`, `random`, `ranf`, `randn` etc. The problem with these is their non-intuitive names and the overlap in the features they provide.
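To see the overlap concretely, here is a quick sketch of a few of those functions side by side (shapes chosen arbitrarily):

```python
import numpy as np

np.random.seed(0)  # fix the seed for reproducibility

u = np.random.rand(2, 3)      # uniform on [0, 1), shape given as positional args
v = np.random.random((2, 3))  # uniform on [0, 1), shape given as a tuple
g = np.random.randn(2, 3)     # standard normal, shape given as positional args

print(u.shape, v.shape, g.shape)
```

Note how `rand` and `random` draw from the same distribution but take their shape arguments differently, which is exactly the kind of inconsistency the article title alludes to.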

Read More

# Beyond the Rule of 72

The Rule of 72 is a useful shortcut to determine the approximate time required to double one's investment. By this rule, it takes roughly `72/r` years to double your principal, where `r` is the annual compound interest rate in percent. So, with a `12% p.a.` interest rate (compounded annually), you can double your investment in 6 (`72/12`) years. The rule can also be used to calculate the rate of interest `r` required to double a sum of money in `y` years (`r = 72/y`).
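We can check how good the approximation is. The exact doubling time follows from the compound-interest formula `P*(1 + r/100)^t = 2*P`, which gives `t = ln(2) / ln(1 + r/100)`:

```python
import math

r = 12  # annual interest rate, in percent

approx_years = 72 / r                              # Rule of 72 estimate
exact_years = math.log(2) / math.log(1 + r / 100)  # exact doubling time

print(approx_years)  # 6.0
print(exact_years)   # ~6.12
```

For typical interest rates the rule is off by only a small fraction of a year, which is why it survives as mental-math shorthand.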

This is a useful rule, but a more general rule of thumb that estimates the time required to grow an amount `P` into `X*P` would be better. In this article we will learn how to build one.

Read More