Machine Learning Algorithms: Brief Introduction
Supervised Learning
You have examples of inputs and outputs. You have to learn a model that turns inputs into outputs.
- Decision Tree Learning
- If you want to learn a propositional logic-based theory.
- Learning from features
- Propositional sentences on features.
- If 18 < age < 35, sex = male, location = Chittagong, Then the person is likely to vote for “Nagorik Shakti”.
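A propositional rule like the one above is just a path through a decision tree: each branch tests one feature, and a leaf gives the prediction. Here is a minimal Python sketch of that tree as code; the function name and the rule itself are hypothetical, taken from the example above, not from real data.

```python
def likely_votes_nagorik_shakti(age, sex, location):
    """Hypothetical decision-tree rule from the example above.

    Each nested `if` is one split on a feature; reaching the
    innermost branch corresponds to reaching a "yes" leaf.
    """
    if 18 < age < 35:                      # first split: age
        if sex == "male":                  # second split: sex
            if location == "Chittagong":   # third split: location
                return True                # leaf: likely "Nagorik Shakti" voter
    return False                           # every other leaf: no
```

A real decision-tree learner (ID3, C4.5, CART) would pick these splits automatically by measures such as information gain; the sketch only shows what the learned tree encodes.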
- Bayesian Learning
- Flat input (vectors, etc.) rather than structured features.
- Language applications heavily use Bayesian Learning.
- Counting occurrences (frequencies become probability estimates).
- Apply Bayes' rule. Rigorous definitions of terms are not required; rather, define terms to suit your purpose and your computational power.
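The "count occurrences, then apply Bayes' rule" recipe is exactly what a naive Bayes text classifier does. Below is a minimal self-contained sketch (the toy documents and labels are made up for illustration): training is nothing but counting, and classification applies Bayes' rule in log space with add-one smoothing.

```python
import math
from collections import Counter, defaultdict

def train(labeled_docs):
    """Training is just counting: frequency becomes probability."""
    word_counts = defaultdict(Counter)   # class -> word -> count
    class_counts = Counter()             # class -> number of documents
    for words, label in labeled_docs:
        class_counts[label] += 1
        word_counts[label].update(words)
    return word_counts, class_counts

def classify(words, word_counts, class_counts):
    """Bayes' rule: argmax over classes of P(class) * prod P(word | class)."""
    total_docs = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        total = sum(word_counts[label].values())
        vocab = len(word_counts[label])
        score = math.log(class_counts[label] / total_docs)   # log prior
        for w in words:
            # Add-one smoothing so an unseen word doesn't zero the product
            score += math.log((word_counts[label][w] + 1) / (total + vocab + 1))
        if score > best_score:
            best, best_score = label, score
    return best
```

Note how loosely the "terms" are defined: a document is just a bag of words, and the smoothing constant is picked for convenience, which is precisely the point made above.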
- Neural Network Learning
- Learn weights on inputs.
- Learning parameters of equations.
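"Learning weights on inputs" can be shown with the smallest possible case: a single weight fitted by gradient descent on squared error. This is a sketch, not a real network; a network stacks many such units with nonlinearities, but the update rule is the same idea.

```python
def learn_weight(xs, ys, lr=0.05, epochs=200):
    """Fit y ~ w * x by gradient descent on squared error.

    A one-weight "network": repeatedly nudge w against the
    gradient of the loss (w*x - y)^2 until predictions fit.
    """
    w = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            pred = w * x
            grad = 2 * (pred - y) * x   # d/dw of (w*x - y)^2
            w -= lr * grad              # move weight downhill
    return w
```

On data generated by y = 2x, the learned weight converges to 2: the "equation's parameter" has been learned from examples.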
- Genetic Programming
- You combine (cross over) and mutate programs to see whether the resulting program can generate the required outputs from the inputs.
- How do you represent programs?
- You represent programs as trees (remember context-free grammars, CFGs?). Lisps are a natural fit, but any language can be used.
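To make the tree representation concrete, here is a small Python sketch (the tuple encoding and the point-mutation strategy are illustrative choices, not a standard): a program is a nested tuple, just like a Lisp s-expression, which makes evaluation and mutation simple recursive walks.

```python
import random

# A program is a tree: a leaf is a variable name or a constant,
# an internal node is (operator, left_subtree, right_subtree).
def evaluate(tree, env):
    """Interpret a program tree given variable bindings in env."""
    if isinstance(tree, str):
        return env[tree]              # variable leaf
    if isinstance(tree, (int, float)):
        return tree                   # constant leaf
    op, left, right = tree
    a, b = evaluate(left, env), evaluate(right, env)
    return a + b if op == "+" else a * b

def mutate(tree, rng):
    """Point mutation: replace one randomly chosen constant leaf."""
    if isinstance(tree, (int, float)):
        return rng.randint(0, 9)      # swap the constant for a new one
    if isinstance(tree, str):
        return tree                   # keep variables as-is
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng), right)
    return (op, left, mutate(right, rng))

prog = ("+", ("*", "x", 2), 3)        # the program x * 2 + 3
```

A genetic-programming loop would keep a population of such trees, score each by how well its outputs match the required outputs, and breed the best by crossover (swapping subtrees) and mutation.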
Knowledge Based Learning
You have some knowledge. As you gain new knowledge, how do you incorporate it into your existing knowledge base?
More on Machine Learning
- Explanation Based Learning
- You solve a problem and extract general principles from your solution, for future reuse on other problems.
- Suppose, you have logically inferred that A => B => C => D. Then you can conclude (and learn) that A => D. From that point on, whenever you see A, you can replace it with D. This is Explanation Based Learning in action.
- One kind of memoization (storing results to avoid recomputation), but much more general.
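The A => D shortcut can be sketched in a few lines of Python (the rule encoding is an illustrative assumption: each premise maps to a single direct conclusion, and chains are acyclic): after following a chain of implications once, we store the direct shortcut so it never has to be re-derived.

```python
def compose_rules(rules):
    """From chained implications, learn direct shortcuts.

    `rules` maps a premise to its direct conclusion, e.g. A => B.
    Having once derived A => B => C => D, we store A => D, so any
    later occurrence of A jumps straight to D (memoization, but of
    a derived general rule rather than a single computed value).
    Assumes the rule chains contain no cycles.
    """
    learned = {}
    for start in rules:
        current = start
        while current in rules:      # follow the chain to its end
            current = rules[current]
        learned[start] = current     # shortcut: start => final conclusion
    return learned
```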
- Relevance Based Learning
- You know a general rule. You learn something. And then using the general rule that you knew before and the newly learned knowledge, you infer (learn) a new generalization.
- You knew A & B => C. You learn that A and B are true. You infer (learn) that C is true.
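This inference step is ordinary forward chaining, sketched below in Python (the representation of rules as premise-set/conclusion pairs is an illustrative choice): whenever every premise of a known rule is established, its conclusion is added to what we know.

```python
def forward_chain(rules, facts):
    """Infer new facts from known rules plus newly learned facts.

    `rules` is a list of (premises, conclusion) pairs,
    e.g. ({"A", "B"}, "C") encodes A & B => C.
    """
    known = set(facts)
    changed = True
    while changed:                   # keep applying rules until nothing new
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)    # all premises hold: learn conclusion
                changed = True
    return known
```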
- Inductive Logic Programming
- From examples of logical sentences, you infer (and hence learn) general rules.
- Example: from examples of family relations (say, Jack is Father of Anthony, Sam is Grandparent of Jill, etc.), you learn that
- for all Z and X, Grandparent(Z, X) => Parent(Z, Y) & Parent(Y, X) for some Y.
- Can induce new terms / relations. For example, ILP can learn "Parent" from examples of Father and Mother relations.
- Inductive Logic Programming has been used to make Scientific Discoveries!
- Learning new relations and predicates is as important a step in scientific discovery as forming new rules. Introduction of the concept "acceleration" helped Newton form his famous law F (force) = ma (mass * acceleration). Without the concept of acceleration (which Galileo had invented earlier), it would have been very difficult for Newton to come up with a law describing the relationship between force and the rate of change in an object's position.
- In the above example, we saw that our ILP program invented the concept “Parent” and used it to learn (and compactly represent) the rule for “Grandparent”.
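Inducing the rule itself takes a real ILP system (e.g. FOIL or Progol), but what the learned clause does can be sketched in a few lines of Python. The family facts below are hypothetical, echoing the names used above; note how the invented "Parent" relation is just the union of Father and Mother examples.

```python
def derive_grandparents(parent_facts):
    """Apply the learned clause:
    Grandparent(Z, X) holds if Parent(Z, Y) and Parent(Y, X) for some Y.
    """
    return {(z, x)
            for (z, y1) in parent_facts     # Parent(Z, Y)
            for (y2, x) in parent_facts     # Parent(Y, X)
            if y1 == y2}                    # same intermediate Y

# Hypothetical examples of the Father and Mother relations:
fathers = {("Jack", "Anthony")}
mothers = {("Anthony", "Jill")}

# The invented intermediate concept "Parent" = Father or Mother,
# which lets one compact clause cover both lineages.
parents = fathers | mothers
```

Representing "Grandparent" through the invented "Parent" relation is what makes the rule compact: without it, you would need separate clauses for every Father/Mother combination.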
This is my 200th Post on tahsinversion2.blogspot.com!