# Essence of Inductive/Supervised Learning

November 8, 2015

**Inductive/Supervised Learning:** Given training examples [x, f(x)] for some unknown function f, supervised learning is the task of finding a good approximation of f.

Appropriate applications of Supervised Learning

- Situations where humans can perform the task but cannot describe how they do it.
  - x: bitmap of a handwritten character; f(x): ASCII code of the character.
- Situations where the desired function changes very frequently.
  - Predicting stock prices.
- Situations where each user needs a customized function.
  - Amazon user recommendations.

Let us take a simple example:

```
 #  x1 x2 x3 x4 | y
 1   0  0  1  0 | 0
 2   0  1  0  0 | 0
 3   0  0  1  1 | 1
 4   1  0  0  1 | 1
 5   0  1  1  0 | 0
 6   1  1  0  0 | 0
 7   0  1  0  1 | 0
 ... and so on
```

We have to predict y for a new combination of (x1, x2, x3, x4). Since there are 4 binary variables, there are 2^4 = 16 possible inputs, and hence 2^16 possible functions (one for each way of assigning y to the 16 inputs).

Since the number of functions is **double exponential** (2^(2^n) for n binary variables), and real problems have thousands of variables, a brute-force search over all functions is infeasible; therefore we need generalization.
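To see how quickly this blows up, here is a minimal sketch that counts the distinct boolean functions over n binary inputs (the function and variable names are illustrative, not from the original post):

```python
# Each boolean function assigns 0 or 1 to each of the 2**n possible
# input rows, so there are 2**(2**n) distinct functions in total.
def num_boolean_functions(n):
    return 2 ** (2 ** n)

print(num_boolean_functions(2))   # 16
print(num_boolean_functions(4))   # 65536
print(num_boolean_functions(10))  # a number with over 300 digits
```

Already at n = 10 the count is astronomically large, which is why enumerating every candidate function is hopeless in practice.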

Two views of how Machine Learning generalizes:

- Removal of remaining uncertainty: suppose we know the unknown function is "m of n" (output 1 when at least m of n chosen features are 1); then we use the training examples to infer which specific function it is.
- Guessing a good, small hypothesis class: we can start with a very small class and enlarge it until it contains a hypothesis that fits the data.
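The first view can be sketched in a few lines: if we assume the target is an "m of n" function, we enumerate the (small) space of such hypotheses and keep only those consistent with the training rows from the table above. The helper names and the exact encoding are assumptions for illustration, not from the original post.

```python
from itertools import combinations

# Training rows from the table above: (x1, x2, x3, x4) -> y.
data = [
    ((0, 0, 1, 0), 0),
    ((0, 1, 0, 0), 0),
    ((0, 0, 1, 1), 1),
    ((1, 0, 0, 1), 1),
    ((0, 1, 1, 0), 0),
    ((1, 1, 0, 0), 0),
    ((0, 1, 0, 1), 0),
]

def m_of_n(m, subset, x):
    # Predict 1 when at least m of the chosen features are 1.
    return int(sum(x[i] for i in subset) >= m)

# Keep every (m, subset) hypothesis consistent with all examples.
consistent = [
    (m, subset)
    for n in range(1, 5)
    for subset in combinations(range(4), n)
    for m in range(1, n + 1)
    if all(m_of_n(m, subset, x) == y for x, y in data)
]
print(consistent)
```

Each training example rules out the hypotheses that disagree with it, shrinking the remaining uncertainty; for instance, "at least 2 of {x1, x3, x4}" survives all seven rows here, while "x4 alone" is eliminated by row 7.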
