
TOPIC: ADVANCED SEARCH AND OPTIMIZATION TECHNIQUES
ARTIFICIAL INTELLIGENCE (OE-EE 701A)
CA-1
NAME: SAURABH JAISWAL
ROLL NO.: 35500721002
REG. NO.: 213550101610039
DEPT.: ELECTRICAL ENGINEERING
SEMESTER: 7TH SEM.
SUBMITTED TO: DR. SUMANTA ROY (ASSOCIATE PROFESSOR, DEPT. OF COMPUTER SCIENCE)

GHANI KHAN CHOUDHURY INSTITUTE OF ENGINEERING & TECHNOLOGY
(A CFTI ESTD. BY THE MINISTRY OF EDUCATION, GOVT. OF INDIA; NARAYANPUR, MALDA, PIN-732141)
INTRODUCTION TO ADVANCED SEARCH AND OPTIMIZATION TECHNIQUES

Advanced search and optimization techniques encompass a range of methods and algorithms used to efficiently find optimal solutions in complex problem spaces. These techniques are widely applied in fields such as artificial intelligence, operations research, data science, and engineering. They aim to improve the performance and accuracy of search and optimization processes, particularly in large, complex, or high-dimensional spaces.
MEMORY BOUNDED HEURISTIC SEARCH:
LOCAL SEARCH ALGORITHMS
• Memory Bounded Heuristic Search aims to optimize the search process while using limited memory. It balances between depth-first search (which uses less memory) and breadth-first search (which is more memory-intensive but thorough). Examples include:

• Iterative Deepening A* (IDA*): Combines depth-first search's space efficiency with the A* algorithm's optimality. It performs a series of depth-first searches bounded by the f-cost (g + h), raising the bound after each iteration.

• Simplified Memory-Bounded A* (SMA*): Expands the least-cost node but keeps only the best alternatives in memory, discarding the worst node when memory is full while retaining enough information to regenerate essential paths.
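The IDA* idea above can be sketched in a few lines of Python. This is an illustrative toy, not material from the slides: the graph, its edge costs, and the heuristic values below are invented for the example (the heuristic is chosen to be admissible).

```python
def ida_star(start, goal, neighbors, h):
    """Iterative Deepening A*: depth-first searches bounded by f = g + h."""
    def search(node, g, bound, path):
        f = g + h(node)
        if f > bound:
            return f, None          # bound exceeded; report candidate new bound
        if node == goal:
            return f, path
        minimum = float("inf")
        for nxt, cost in neighbors(node):
            if nxt in path:
                continue            # avoid cycles on the current path
            t, found = search(nxt, g + cost, bound, path + [nxt])
            if found is not None:
                return t, found
            minimum = min(minimum, t)
        return minimum, None

    bound = h(start)
    while True:
        bound, found = search(start, 0, bound, [start])
        if found is not None:
            return found
        if bound == float("inf"):
            return None             # no path exists

# Toy graph with made-up edge costs and an admissible heuristic.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)],
         "C": [("D", 2)], "D": []}
h = {"A": 3, "B": 2, "C": 2, "D": 0}.get
path = ida_star("A", "D", lambda n: graph[n], h)
print(path)  # ['A', 'B', 'C', 'D']
```

Note how each iteration restarts the depth-first search with the smallest f-cost that exceeded the previous bound, which keeps memory usage linear in the path length.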
OPTIMIZATION PROBLEMS
1. Hill Climbing:
An iterative algorithm that starts from an arbitrary solution and repeatedly makes small changes to it, keeping a change only if it improves the objective function.
• Variants:
• Simple Hill Climbing: Moves to the first neighbor that is better than the current state.
• Steepest-Ascent Hill Climbing: Evaluates all neighbors and moves to the best one.
• Limitations: Can get stuck in local optima, plateaus, or ridges.
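A minimal steepest-ascent sketch in Python; the objective f(x) = -(x - 3)^2 and the ±1 neighborhood are toy choices made up for illustration:

```python
def steepest_ascent_hill_climbing(start, objective, neighbors):
    """Move to the best neighbor until no neighbor improves the objective."""
    current = start
    while True:
        best = max(neighbors(current), key=objective)
        if objective(best) <= objective(current):
            return current  # no improving neighbor: a (local) optimum
        current = best

# Toy objective with a single peak at x = 3; neighbors are x - 1 and x + 1.
f = lambda x: -(x - 3) ** 2
result = steepest_ascent_hill_climbing(0, f, lambda x: [x - 1, x + 1])
print(result)  # climbs 0 -> 1 -> 2 -> 3 and stops
```

Because this objective has only one peak, the climb reaches the global optimum; on a multi-peaked objective the same loop would stop at whichever local optimum it climbs first.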

2. Simulated Annealing:
A probabilistic technique that explores the solution space by occasionally accepting moves that worsen the objective function, which helps it escape local optima.
• Process:
• Starts at a high "temperature" that gradually decreases.
• At each step, a random move is accepted if it improves the solution or, with a probability that decreases with temperature, even if it worsens it.
• Applications: Useful in scenarios where the search space has many local optima.
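The acceptance rule above is often written as exp(-Δ/T). A small sketch, assuming a toy one-dimensional objective and a geometric cooling schedule (both invented for the example):

```python
import math
import random

def simulated_annealing(start, objective, neighbor, t0=10.0, cooling=0.95, steps=500):
    """Minimize `objective`; a worse move is accepted with probability exp(-delta/T)."""
    current, temp = start, t0
    best = current
    for _ in range(steps):
        candidate = neighbor(current)
        delta = objective(candidate) - objective(current)
        # Always accept improvements; accept worsening moves with a
        # probability that shrinks as the temperature falls.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
        if objective(current) < objective(best):
            best = current
        temp *= cooling  # geometric cooling schedule
    return best

random.seed(0)                       # reproducible illustration
f = lambda x: (x - 2) ** 2           # toy objective, minimum at x = 2
best = simulated_annealing(10.0, f, lambda x: x + random.uniform(-1, 1))
```

Early on, the high temperature makes almost any move acceptable (broad exploration); as T falls, the loop behaves more and more like greedy hill climbing.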
3. Local Beam Search:
• An algorithm that keeps track of multiple candidate solutions (the "beam") at once.
• Process:
• Starts with a set of random states.
• Expands all states in the beam, keeping only the best resulting states.
• Advantages: More exploration of the search space compared to single-path algorithms.
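The beam idea can be sketched as follows. The objective and neighborhood are toy choices for illustration, and the pool deliberately keeps the current states as well (a small elitist tweak to the basic algorithm, so the best state found is never discarded):

```python
import heapq

def local_beam_search(initial_states, objective, neighbors, k=3, iterations=30):
    """Keep the k best states; expand them all and retain the k best results."""
    beam = list(initial_states)
    for _ in range(iterations):
        successors = [n for s in beam for n in neighbors(s)]
        # Pool current states with their successors before pruning to k.
        beam = heapq.nlargest(k, beam + successors, key=objective)
    return max(beam, key=objective)

# Toy maximization problem with a single peak at x = 7.
f = lambda x: -(x - 7) ** 2
best = local_beam_search([0, 20, -5], f, lambda x: [x - 1, x + 1], k=3)
print(best)  # 7
```

Unlike running k independent hill climbers, the shared pruning step concentrates the beam on the most promising region of the search space.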
GENETIC ALGORITHMS IN AI: CONCEPTS
AND APPLICATIONS
• Genetic Algorithms (GAs) are inspired by the process of natural selection and genetics. They are
used to solve optimization and search problems.
• Basic Concepts:
• Population: A set of candidate solutions.
• Chromosomes: Representations of candidate solutions.
• Genes: Parts of a chromosome representing solution components.
• Fitness Function: Evaluates how close a given solution is to the optimum.
• Selection: Process of choosing parent solutions based on fitness.
• Crossover: Combining parts of two parents to create offspring.
• Mutation: Randomly altering genes to maintain diversity.
• Applications: Used in optimization problems such as scheduling, design,
machine learning hyperparameter tuning, and evolving neural network
architectures.
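The concepts above can be combined into a small working GA. This sketch solves the classic "one-max" toy problem (maximize the number of 1-bits); the population size, tournament selection, one-point crossover, and 2% mutation rate are illustrative choices, not prescribed by the slides:

```python
import random

def genetic_algorithm(pop_size=20, gene_len=12, generations=60, mutation_rate=0.02):
    """Evolve bit-string chromosomes toward all ones; fitness = count of 1-genes."""
    random.seed(1)  # reproducible illustration
    population = [[random.randint(0, 1) for _ in range(gene_len)]
                  for _ in range(pop_size)]
    fitness = sum   # a chromosome's fitness is simply its number of 1s

    def select():
        # Tournament selection of size 2: the fitter of two random individuals.
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        next_gen = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            cut = random.randrange(1, gene_len)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ 1 if random.random() < mutation_rate else g
                     for g in child]                    # bit-flip mutation
            next_gen.append(child)
        population = next_gen

    return max(population, key=fitness)

best = genetic_algorithm()
```

Each generation applies selection, crossover, and mutation exactly as listed above; mutation keeps diversity so the population does not collapse onto one mediocre chromosome.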
CONSTRAINT SATISFACTION PROBLEMS AND
LOCAL SEARCH SOLUTIONS
• Constraint Satisfaction Problems (CSPs) involve finding a solution that satisfies a set of constraints.
• Definition:
• Variables: Elements to be assigned values.
• Domains: Possible values for each variable.
• Constraints: Restrictions on variable assignments.

• Local Search Solutions for CSPs:
• Min-Conflicts Heuristic: Focuses on minimizing the number of constraint violations.
• Process:
• Starts with an arbitrary assignment.
• Selects a variable involved in conflicts and changes its value to minimize conflicts.
• Advantages: Effective for large CSPs, such as scheduling and map coloring.
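Min-conflicts can be sketched on the n-queens problem, a standard CSP demonstration (the choice of n-queens, the step limit, and the random tie-breaking are illustrative assumptions). Variables are columns, domains are rows, and the constraints forbid two queens sharing a row or diagonal:

```python
import random

def min_conflicts_nqueens(n=8, max_steps=10000):
    """Solve n-queens as a CSP: variables = columns, values = rows."""
    random.seed(42)
    rows = [random.randrange(n) for _ in range(n)]  # rows[c] = queen's row in column c

    def conflicts(col, row):
        # Number of other queens that would attack a queen at (row, col).
        return sum(1 for c in range(n) if c != col and
                   (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows                              # every constraint satisfied
        col = random.choice(conflicted)              # pick a variable in conflict
        # Reassign it to a value that minimizes conflicts (random tie-break).
        lowest = min(conflicts(col, r) for r in range(n))
        rows[col] = random.choice([r for r in range(n)
                                   if conflicts(col, r) == lowest])
    return None  # step budget exhausted

solution = min_conflicts_nqueens()
```

Starting from a complete (but conflicting) assignment and repairing one variable at a time is what makes min-conflicts scale to very large instances of problems like scheduling and map coloring.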
CONCLUSION
Advanced search and optimization techniques, including memory-bounded
heuristic search, various local search algorithms, genetic algorithms, and
solutions for constraint satisfaction problems, are essential for tackling
complex computational problems. These techniques are widely used in
artificial intelligence, operations research, and many other fields to find
optimal or near-optimal solutions efficiently.
REFERENCE

Image
• https://www.slideshare.net/slideshow/i-hill-climbing-algorithm-ii-steepest-hill-climbing-algorithm/247883377
• https://www.geeksforgeeks.org/search-algorithms-in-ai/
• https://wikidocs.net/189100
THANK YOU
