Particle Swarm Optimization, Genetic Algorithm, Ant Colony Optimization, Artificial Immune System, Decision Support System, Artificial Neural Network

Thursday, May 15, 2014

GREAT IDEAS

Some researchers have said:
  • "Evolutionary methods are most often used on hard optimization problems rather than pattern recognition."
  • Some decades ago, people thought that genetic and evolutionary algorithms were Swiss-army knives, fueled by spectacular early results. Statements like the building block hypothesis were made in an effort to prove that they were good strategies in general.
  • This view was later challenged, most prominently by the No Free Lunch Theorem:
    • It became evident that genetic/evolutionary algorithms are often decent heuristics but never optimal in any sense.
    • Let's research the No Free Lunch Theorem further... (will continue)
Building block hypothesis (BBH)
Genetic algorithms are simple to implement, but their behavior is difficult to understand. In particular, it is difficult to understand why these algorithms frequently succeed at generating solutions of high fitness when applied to practical problems. The building block hypothesis (BBH) consists of:
  1. A description of a heuristic that performs adaptation by identifying and recombining "building blocks", i.e. low-order, low defining-length schemata with above-average fitness.
  2. A hypothesis that a genetic algorithm performs adaptation by implicitly and efficiently implementing this heuristic.
Goldberg describes the heuristic as follows:
"Short, low order, and highly fit schemata are sampled, recombined [crossed over], and resampled to form strings of potentially higher fitness. In a way, by working with these particular schemata [the building blocks], we have reduced the complexity of our problem; instead of building high-performance strings by trying every conceivable combination, we construct better and better strings from the best partial solutions of past samplings.
"Because highly fit schemata of low defining length and low order play such an important role in the action of genetic algorithms, we have already given them a special name: building blocks. Just as a child creates magnificent fortresses through the arrangement of simple blocks of wood, so does a genetic algorithm seek near optimal performance through the juxtaposition of short, low-order, high-performance schemata, or building blocks."
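The heuristic Goldberg describes can be sketched as a minimal genetic algorithm in Python. This is only an illustrative toy, not code from any of the cited authors: the OneMax objective, the parameter values, and all function names are assumptions chosen for demonstration. On OneMax, short runs of 1s act as the building blocks that selection and one-point crossover juxtapose into fitter strings.

```python
import random

def one_max(bits):
    # Toy fitness: count of 1s. Short runs of 1s play the role of
    # low-order, highly fit schemata ("building blocks").
    return sum(bits)

def crossover(a, b):
    # One-point crossover recombines partial solutions from two parents.
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    # Flip each bit with small probability (rate is an assumed value).
    return [1 - b if random.random() < rate else b for b in bits]

def genetic_algorithm(n_bits=32, pop_size=40, generations=60, seed=0):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: fitter strings (and the schemata they
        # carry) are sampled and resampled more often.
        def select():
            return max(random.sample(pop, 3), key=one_max)
        pop = [mutate(crossover(select(), select()))
               for _ in range(pop_size)]
    return max(pop, key=one_max)

best = genetic_algorithm()
print(one_max(best))  # approaches the optimum of 32 on this toy problem
```

Note that nothing in the code manipulates schemata explicitly; per the BBH, the schema-recombining heuristic is implemented only implicitly by selection plus crossover.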

Tuesday, April 29, 2014

PSO


Particle Swarm Optimization

• Particle Swarm Optimization (PSO) was introduced by Kennedy and Eberhart in the mid-1990s.

• It is a population-based stochastic approach which has been grouped under swarm intelligence (Kennedy, 2006; Engelbrecht, 2007; Parsopoulos & Vrahatis, 2007) and evolutionary computation (Trelea, 2003).

• PSO can be used to solve continuous and discrete problems.

• PSO was derived from the behavior of a flock of birds flying around in search of food.

• Each bird is represented as a particle.

• Each particle moves stochastically through the search space looking for a feasible solution.

• Each particle has its own velocity and position.
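The bullets above can be sketched as a minimal global-best PSO in Python. This is a generic illustration rather than code from the cited papers; the sphere objective, search bounds, and parameter values (inertia w, acceleration coefficients c1, c2) are assumptions chosen for demonstration.

```python
import random

def sphere(x):
    # Toy continuous objective: minimize the sum of squares (optimum at 0).
    return sum(xi * xi for xi in x)

def pso(dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    random.seed(seed)
    # Each particle has its own position and velocity.
    pos = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-so-far
    gbest = min(pbest, key=sphere)[:]      # the swarm's best-so-far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Stochastic velocity update: inertia + pull toward the
                # particle's own best + pull toward the swarm's best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso()
print(sphere(best))  # converges near 0 on this toy problem
```

The random coefficients r1 and r2 are what make each particle's movement stochastic, while pbest and gbest play the roles of a bird's own memory and the flock's shared knowledge of where food was found.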

Sunday, April 27, 2014

Nature-Inspired Video


Introduction to Swarm Intelligence

Swarm intelligence systems are typically made up of a population of simple agents interacting locally with one another and with their environment.

The agents follow very simple rules, and although there is no centralized control structure dictating how individual agents should behave, local interactions between such agents lead to the emergence of complex global behavior.

Natural examples of SI include ant colonies, bird flocking, animal herding, bacterial growth, and fish schooling.