
Notes: Stochastic Local Searches

Tutorial session at AAAI

slides

Stochastic Local Searches

  • performance depends strongly on parameters/tuning
  • tuning can be time-intensive
  • limited understanding of how performance depends on parameters (lots of experience required)
  • many algorithms leave important implementation choices to the user

Stochastic Local Methods

  • Simulated Annealing
  • Tabu Search
  • Iterated Local Search
  • Evolutionary algorithms
  • Ant Colony Optimization

Simulated Annealing

  • Parameterized local search extension
  • Features

    • Solution generation: pick a random neighboring solution
    • Acceptance criterion: better solutions always accepted; a worse solution s' is accepted with probability:

      exp((g(s) - g(s')) / T)

    • Annealing: slowly decrease T

  • Parameters
    • Cooling schedule
    • Initial temperature T0
    • Number of iterations at each temperature
    • Stopping criterion
  • Use a 'cooling schedule' to converge on a solution
  • easy to implement
  • good performance but long run time
  • parallelizing:
    • coarse-grained: independent parallel runs
    • fine-grained: parallel evaluation of moves
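The annealing loop described above can be sketched as follows; the geometric cooling schedule and all parameter values here are illustrative assumptions, not the tutorial's choices:

```python
import math
import random

def simulated_annealing(initial, neighbor, cost, t0=10.0, alpha=0.95,
                        iters_per_temp=100, t_min=1e-3):
    """Minimize `cost` starting from `initial`, proposing moves via `neighbor`.

    Uses a geometric cooling schedule T <- alpha * T (illustrative choice)."""
    s = initial
    best = s
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            s2 = neighbor(s)
            delta = cost(s) - cost(s2)
            # Accept better solutions always; worse ones with prob exp(delta / T)
            if delta > 0 or random.random() < math.exp(delta / t):
                s = s2
                if cost(s) < cost(best):
                    best = s
        t *= alpha  # annealing: slowly decrease T
    return best
```

For example, minimizing x^2 over the integers with a random ±1 neighborhood drives the incumbent toward 0 as T falls.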

Tabu Search

  • Parameterized local search extension
  • Short term and long term memory implementations
  • Performs well even with just short term memory
  • May require time-intensive fine-tuning
  • Simple implementations use short term memory (store attributes of the tl most recently visited solutions)
  • Tries to avoid cycles in search using memory
  • Don't necessarily move to solutions that are better than the current one
  • Parameters
    • tl: tabu list length / tabu tenure: choice of this parameter critical for performance
    • Stopping criteria: all neighboring solutions are tabu | max iterations exceeded | number of iterations w/o improvement
    • Aspiration criteria: can override tabu status (e.g. when a tabu move improves on the best solution found), since attribute-based tabu lists can otherwise forbid solutions never actually visited
  • Features:
    • Single trajectory
    • Memory: short/long term
    • Attributes: store specific attributes of solution rather than complete solution
    • Solutions which contain tabu attributes are forbidden
    • At each step move to best neighboring solution even if not better than the current one.
  • Long term features:
    • Long term memory: often based on some measure of frequency (e.g. frequency of local search moves)
    • Intensification strategies: focus search on promising areas of the search space by restarting around elite solutions or locking certain attributes.
    • Diversification strategies: move towards previously unexplored regions
  • exponential run-time distributions (QAP)
  • cooperative approaches [Crainic et al.]
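A minimal short-term-memory sketch of the ideas above; for simplicity it stores whole solutions rather than attributes, and the tenure and iteration limits are illustrative values:

```python
from collections import deque

def tabu_search(initial, neighbors, cost, tl=7, max_iters=200):
    """Short-term-memory tabu search; tl is the tabu tenure (illustrative)."""
    s = initial
    best = s
    # Store the tl most recently visited solutions (attributes, in practice)
    tabu = deque([s], maxlen=tl)
    for _ in range(max_iters):
        candidates = [n for n in neighbors(s) if n not in tabu]
        # Aspiration: allow a tabu move if it beats the best solution found
        candidates += [n for n in neighbors(s)
                       if n in tabu and cost(n) < cost(best)]
        if not candidates:
            break  # stopping criterion: all neighboring solutions are tabu
        # Move to the best neighbor, even if worse than the current solution
        s = min(candidates, key=cost)
        tabu.append(s)
        if cost(s) < cost(best):
            best = s
    return best
```

On x^2 over the integers with neighbors x±1, the tabu list prevents immediate backtracking while the search descends to 0.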

Iterated Local Search

  • Hybrid strategy
  • build chain of local minima
  • reduce search space to the space of local minima
  • Some of best TSP solutions based on ILS
  • Few parameters
  • Easy to implement
  • Highly effective
  • Related to Variable Neighborhood Search
  • Features:
    • Limited memory
    • Multiple neighborhoods
  • Parameters
    • Initial solution
    • Solution modification: too strong (random restart), too weak (stuck in local minima). e.g. non-sequential 4-opt move (double-bridge move)
    • Acceptance criterion: strength of bias towards best found solutions. e.g. apply solution modification to best solutions since start of algorithm.
    • Local search technique: effectiveness vs. speed. 2-opt, 3-opt, Lin-Kernighan, Helsgaun LK
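The build-a-chain-of-local-minima idea can be sketched as below; `local_search` and `perturb` are caller-supplied placeholders, and the non-worsening acceptance criterion is one illustrative choice among those listed above:

```python
import random

def iterated_local_search(initial, local_search, perturb, cost, max_iters=30):
    """ILS skeleton: search only the space of local minima."""
    current = local_search(initial)  # first local minimum in the chain
    best = current
    for _ in range(max_iters):
        # Solution modification, then re-optimization to the next local minimum
        candidate = local_search(perturb(current))
        if cost(candidate) <= cost(current):  # acceptance criterion
            current = candidate
        if cost(candidate) < cost(best):
            best = candidate
    return best
```

With a simple hill-climbing `local_search` and a random-jump `perturb`, the chain settles on the minimum of a one-dimensional cost function.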

Evolutionary algorithms

  • Hybrid strategy
  • Only discussed genetic algorithms
  • Features
    • Population
    • Limited memory: e.g. passing of information from parents to children
    • Limited multiple neighborhoods
  • Parameters/representation
    • Solution representation: binary vs. problem-specific
    • Fitness function
    • Mutation operator: background vs. driving search
    • Crossover operator: parent selection scheme, problem-specific vs. general purpose crossover, passing of information from parents to children
    • Stopping criteria: fixed number of generations, population convergence.
  • fine-grained and coarse-grained parallelization implementations
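A minimal GA sketch using the binary representation mentioned above; the tournament parent selection, one-point crossover, and all parameter values are illustrative choices, not prescriptions:

```python
import random

def genetic_algorithm(fitness, length=16, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=None):
    """Maximize `fitness` over binary strings of the given length."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def tournament():
        # Parent selection scheme: better of two random individuals
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):  # stopping criterion: fixed generation count
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)  # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation as background operator
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

On the onemax problem (fitness = number of ones), the population converges toward the all-ones string.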

Ant Colony Optimization

  • Hybrid strategy
  • Features
    • Population: multiple ants
    • Limited memory: pheromone trails left on edges
    • Solution construction: define solution components; ants construct solutions by iteratively adding components.
    • Pheromone evaporation
    • Solution components selected probabilistically, with probability proportional to utility(component) * pheromone(transition)
  • coarse-grained parallelization more successful than fine-grained
  • Good performance demonstrated with protein-folding
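The construction-plus-evaporation cycle can be sketched for a small TSP; the pheromone deposit rule (1 / tour length), the 1/distance utility, and all parameter values are illustrative assumptions:

```python
import random

def ant_colony_tsp(dist, n_ants=10, iters=50, rho=0.5, seed=None):
    """ACO sketch for TSP on a distance matrix; rho is the evaporation rate."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]  # pheromone trails on edges

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    best, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:  # iteratively add solution components
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # Probability proportional to pheromone * utility (1/distance)
                weights = [tau[i][j] * (1.0 / dist[i][j]) for j in choices]
                tour.append(rng.choices(choices, weights)[0])
            tours.append(tour)
        for i in range(n):  # pheromone evaporation
            for j in range(n):
                tau[i][j] *= 1 - rho
        for tour in tours:  # deposit pheromone proportional to tour quality
            l = tour_length(tour)
            for i in range(n):
                a, b = tour[i], tour[(i + 1) % n]
                tau[a][b] += 1.0 / l
                tau[b][a] += 1.0 / l
            if l < best_len:
                best, best_len = tour, l
    return best, best_len
```

On a four-city instance whose side edges cost 1 and diagonals cost 2, the colony finds the length-4 perimeter tour.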

Performance measurement

Two measures:

  • search space ruggedness
  • linear correlation between solution fitness and distance to global optima (fitness-distance correlation)

Asymptotic behavior

  • complete: for each problem instance P there is a time bound tmax(P) for the time required to find a solution
  • probabilistic approximate completeness (PAC Property): for each soluble problem instance a solution is found with probability -> 1 as run-time -> inf.
  • essential incompleteness: for some soluble problem instances, the probability of finding a solution remains strictly smaller than 1 as run-time -> inf.

Cannot use a t-test: run-time distributions are exponential, not normal.

Estimating optimal solution quality:

  • use quasi-solutions (obtained by running a stochastic algorithm and making bold assumptions), or use known solutions from running complete algorithms.

Applications

  • SAT
  • MAX-SAT
  • TSP
  • Scheduling
  • Time-tabling
  • Protein-folding


This page contains a single entry from kwc blog posted on July 26, 2004 11:29 AM.
