Class 10 AI Notes

The document provides an overview of Artificial Intelligence (AI), highlighting its three domains: Data Science, Computer Vision, and Natural Language Processing (NLP). It discusses the importance of data collection, data mining, data privacy, and various types of intelligence, as well as the AI project cycle, deep learning, and AI bias. Additionally, it covers the evaluation of AI models, the role of neural networks, and the significance of data science and computer vision in extracting insights from data.


INTRODUCTION TO AI

Q. What are the 3 domains of AI?


The 3 domains of AI are:
Data Science, Computer Vision, and Natural Language Processing (NLP); each is explained in its own section below.

Q. Why do we need to collect data?


Data is to a machine what food is to a human being: it is needed to function. The
world of Artificial Intelligence revolves around data. Every company, whether small
or big, is collecting data from as many sources as possible. Data is called the
new gold today. It is through data collection that a business or management
gets the quality information it needs to make informed decisions through
further analysis, study, and research. Data collection allows organisations to stay on
top of trends, provide answers to problems, and analyze new insights to great
effect.

Q. What is data mining? Explain with an example.


Data mining is the process of analyzing large data sets and extracting
useful information from them. Data mining is used by companies to turn raw
data into useful information. Data mining is also known as Knowledge Discovery in
Data (KDD).
Example:
Price comparison websites collect data about a product from different
sites, analyze trends from it, and show the most appropriate results.
Q. What do you understand by Data Privacy?
Proper and ethical handling of one's own data or user data is called data privacy. It is all about the rights of
individuals with respect to their personal information. Data privacy, or information privacy, is a branch of
data security concerned with the proper handling of data – consent, notice, and regulatory obligations.
More specifically, practical data privacy concerns often revolve around whether or how data is shared
with third parties.

Q. List the different types of intelligence.


The different types of intelligence are as follows:
1. Mathematical Logical Reasoning
2. Linguistic Intelligence
3. Spatial Visual Intelligence
4. Kinaesthetic Intelligence
5. Musical Intelligence
6. Intrapersonal Intelligence
7. Existential Intelligence
8. Naturalist Intelligence
9. Interpersonal intelligence
Note: Human beings possess all 9 types of intelligence.

Q. What is Deep Learning?


Deep Learning is the most advanced form of Artificial Intelligence. In Deep
Learning, the machine is trained with huge amounts of data, which helps it
train itself around that data. Such machines are intelligent enough to
develop algorithms for themselves.

Q. What is AI bias?
AI bias is the underlying prejudice in the data used to create AI
algorithms, which can ultimately result in discrimination and other social
consequences.
AI bias can creep into algorithms in several ways. AI systems learn to make
decisions based on training data, which can include biased human decisions
or reflect historical or social inequities, even if sensitive variables such as
gender, race, or sexual orientation are removed.

AI PROJECT CYCLE

What is an AI Project Cycle?


Project Cycle is a step-by-step process to solve problems using proven
scientific methods and drawing inferences from them.
Let us take a daily example as a project requiring
steps to solve the problem: creating a birthday card.
1. Checking factors like budget, etc., which will help us decide
the next steps and understand the project.
2. Acquiring data from different sources, like online or from friends,
for designs and ideas.
3. Making a list of the gathered data.
4. Creating or modelling a card on the basis of the data collected.
5. Showing it to parents or cousins to let them check or evaluate it.

Steps or Components of the AI Project Cycle


a) Problem Scoping / Problem Identification – Understanding the problem.
b) Data Acquisition / Data Collection – Collecting accurate and reliable data.
c) Data Exploration – Arranging the data uniformly.
d) Data Modelling – Creating models from the data.
e) Data Evaluation – Evaluating the project.
Problem Scoping: Problem Scoping refers to understanding a problem,
finding out the various factors which affect the problem, and defining the goal or aim of
the project. It has 4 Ws: Who, What, Where and Why.
Data Acquisition
Data Acquisition is the process of collecting accurate and reliable data to work
with. Data can be in the format of text, video, images, audio, and so on. It can be collected from
various sources like the internet, journals, surveys, newspapers, Application Programming Interfaces (APIs), etc.

Data Exploration
Data Exploration is the process of arranging the gathered data uniformly for a
better understanding, i.e. data can be arranged in the form of a table, by plotting
a chart, or by making a database.
Tools for Data Exploration: Google Charts, Tableau, Fusion Charts,
Highcharts, etc.

Modelling
Modelling is the process in which different models based on the visualized
data are created and checked for their advantages and
disadvantages. To make a machine learning model, there are 2
ways/approaches: the Learning-Based Approach and the Rule-Based Approach.

What is the Rule-Based Approach?


In this approach, the rules are defined by the developer. The machine follows the rules or instructions
mentioned by the developer and performs its task accordingly. So, it is a static model, i.e. the machine,
once trained, does not take into consideration any changes made in the
original training dataset.
In the Rule-Based Approach we deal with 2 divisions of the dataset:
1. Training Data – a subset required to train the model.
2. Testing Data – a subset required while testing the trained model.
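The two divisions above can be sketched in plain Python. This is a minimal illustration, not a real library call: the 80/20 ratio, the fixed seed, and the toy dataset are all assumptions made for the example.

```python
import random

def train_test_split(data, test_ratio=0.2, seed=42):
    """Shuffle the dataset and split it into training and testing subsets."""
    rng = random.Random(seed)   # fixed seed so the split is repeatable
    shuffled = data[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    split = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:split], shuffled[split:]

# A toy dataset of 10 labelled samples: (features, label)
dataset = [([i], i % 2) for i in range(10)]
train, test = train_test_split(dataset)
print(len(train), len(test))  # prints: 8 2
```

With a 0.2 test ratio, 8 samples go to Training Data and 2 to Testing Data, so the model is never evaluated on examples it was trained on.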

What is the Learning-Based Approach?


It is a type of AI modelling where the machine learns by itself. Under the Learning-Based Approach, the AI
model gets trained on the data fed to it and is then able to design a model which is adaptive to changes
in data. It may be Supervised Learning, Unsupervised Learning, or Reinforcement Learning.
Supervised learning is where a computer algorithm is trained on input data that
has been labeled for a particular output. For example, a shape with three
sides is labeled as a triangle. Classification and Regression models are also types
of supervised learning.
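The labelled-triangle example can be sketched as a tiny classifier. This is only an illustration of "learning from labelled data": the three training pairs and the nearest-match rule are assumptions, and a real supervised model would learn from many examples.

```python
# Labelled training data: (number of sides, shape name)
training_data = [(3, "triangle"), (4, "square"), (5, "pentagon")]

def classify(sides):
    """Return the label whose training example has the closest side count."""
    best = min(training_data, key=lambda pair: abs(pair[0] - sides))
    return best[1]

print(classify(3))  # prints: triangle
print(classify(4))  # prints: square
```

The key point is that the outputs ("triangle", "square") come entirely from labels a human attached to the training data.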

An unsupervised learning model works on an unlabeled dataset. This means that the data fed to
the machine is random, and there is a possibility that the person training the model does not have
any information about it. Unsupervised learning models are used to identify relationships,
patterns and trends in the data fed into them. This helps
the user understand what the data is about and what major features the
machine has identified in it.
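Finding patterns in unlabelled data can be illustrated with a tiny one-dimensional k-means clustering sketch. The data, the choice of k = 2, and the min/max initialisation are assumptions for the example; note that no labels are ever given to the algorithm.

```python
def kmeans_1d(values, k=2, rounds=10):
    """Tiny 1-D k-means: discover k groups in unlabelled numbers."""
    centres = [min(values), max(values)]   # crude initial guesses (works for k=2)
    for _ in range(rounds):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest centre
            nearest = min(range(k), key=lambda i: abs(v - centres[i]))
            clusters[nearest].append(v)
        # move each centre to the mean of its cluster (keep it if empty)
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return clusters

data = [1, 2, 3, 10, 11, 12]   # no labels attached
groups = kmeans_1d(data)
print(groups)  # prints: [[1, 2, 3], [10, 11, 12]]
```

The two groups emerge purely from the structure of the data, which is exactly what "identifying patterns and trends" means here.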
Reinforcement Learning
It is a type of machine learning technique that enables an agent (model) to learn in an interactive
environment by trial and error, using feedback from its own actions and experiences. Though both
supervised and reinforcement learning use a mapping between input and output, unlike supervised
learning, where the feedback provided to the agent (model) is the correct set of actions for
performing a task, reinforcement learning uses rewards and punishments as signals for positive
and negative behaviour. Reinforcement learning is all about making decisions sequentially.
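The reward-and-punishment idea can be sketched with a toy agent choosing between two actions. The environment, the two action names, the exploration rate, and the learning rate are all made-up assumptions for the illustration.

```python
import random

# A toy environment: action "B" is rewarded (+1), action "A" is punished (-1).
def environment(action):
    return 1 if action == "B" else -1

values = {"A": 0.0, "B": 0.0}   # the agent's estimate of each action's worth
rng = random.Random(0)

for step in range(100):
    # Explore occasionally; otherwise exploit the best-known action
    if rng.random() < 0.1:
        action = rng.choice(["A", "B"])
    else:
        action = max(values, key=values.get)
    reward = environment(action)          # trial and error: act, then observe
    # Nudge the estimate toward the observed reward
    values[action] += 0.1 * (reward - values[action])

print(max(values, key=values.get))  # prints: B
```

No correct action is ever shown to the agent; it discovers that "B" is better purely from the reward signal, decision after decision.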

What is Evaluation?
Evaluation is a process of understanding the reliability of any AI model, based on outputs obtained by feeding the
test dataset into the model and comparing them with the actual answers, i.e. once a model has been made and
trained, it needs to go through proper testing so that one can calculate the efficiency and performance of
the model.

What is a Neural Network?


A Neural Network is essentially a system of organizing machine learning algorithms
to perform certain tasks. It is a fast and efficient way to solve problems for which the
dataset is very large, such as with images.

Neural Networks are collections of neurons containing specific algorithms, which are
networked together to solve a particular set of problems irrespective of data size.
The key advantage of Neural Networks is that they are able to extract data
features automatically, without needing input from the programmer.

FEATURES OF NEURAL NETWORKS:


• Neural Network systems are modeled on the human brain and nervous system.
• They are able to automatically extract features without input from the
programmer.
• Every neural network node is essentially a machine learning algorithm.
• It is useful when solving problems for which the data set is very large.

DATA SCIENCE

What is Data science?


Data science is a field that combines statistics, computer science, and domain expertise to
extract insights and knowledge from data. It involves using various techniques such as
machine learning, data visualization, and statistical modeling to analyze and interpret
complex data sets.

Write the advantages and disadvantages of data science?


Advantages:
- Better decision-making: Data science helps businesses and organizations make better-informed
decisions.
- Improved efficiency: Data science can help companies and organizations streamline
their operations by identifying inefficiencies and areas for improvement.
- Enhanced customer experience: Data science can help businesses and organizations deliver a better, more personalized customer experience.

Disadvantages:
- Data privacy concerns: There is a risk of data privacy concerns when data is collected
and analyzed.
- Bias in data: Data can be biased due to many factors, such as the selection of the data
or the way it is collected.
- Misinterpretation of data: Data science involves complex statistical analysis, which
can sometimes lead to misinterpretation of the data.

What is the goal of data visualization?


Answer: To communicate insights and patterns in data through graphical representations. Data
visualization is the process of creating graphical representations of data.

Describe the steps involved in the data science workflow.


Answer: The data science workflow typically involves:
- Problem definition and hypothesis formation
- Data collection and cleaning
- Data exploration and visualization
- Modeling and evaluation
- Deployment and maintenance

Explain the concept of bias and variance in machine learning and how they affect
model performance.
Answer: Bias refers to systematic error, while variance refers to a model's sensitivity to the
data. High bias leads to underfitting, while high variance leads to overfitting. The
goal is to balance bias and variance to achieve optimal model performance.

COMPUTER VISION

Explain the term "Computer Vision" and its primary goal.


Computer Vision is a field of Artificial Intelligence that enables computers to
interpret and make decisions based on visual data, with the primary goal of
automating tasks that the human visual system can do.

Explain the Basics of Images.


The word “pixel” means a picture element.
Pixels
• Pixels are the fundamental elements of a photograph.
• They are the smallest units of information that make up a picture.
• They are typically arranged in a 2-dimensional grid.
• In general terms, the more pixels you have, the more closely the image resembles the
original.
Resolution
• The number of pixels in an image, i.e. the area covered by the pixels, is
conventionally known as the resolution.
• For example, 1080 x 720 pixels is a resolution giving the number of pixels in the width and height
of the picture.
• A megapixel is a million pixels.

Pixel value
• A pixel value represents the brightness of the pixel.
• The range of a pixel value is 0 to 255, since 2^8 - 1 = 255.
• 0 is taken as black (no colour) and 255 is taken as white.
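These basics can be tried out in plain Python. The 3x3 grid of brightness values below is a made-up example image; the 1080 x 720 figure is the one from the notes.

```python
# A tiny 3x3 grayscale image: each pixel value is a brightness from 0-255
image = [
    [0,   128, 255],   # 0 = black, 255 = white
    [64,  128, 192],
    [255, 128, 0],
]

height = len(image)        # number of rows of pixels
width = len(image[0])      # number of pixels per row
print(f"resolution: {width} x {height} = {width * height} pixels")

# Resolution arithmetic for the example from the notes
pixels = 1080 * 720
print(f"1080 x 720 = {pixels} pixels, about {pixels / 1_000_000:.2f} megapixels")
```

So a 1080 x 720 image contains 777,600 pixels, i.e. a little under one megapixel.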

Explain Image Features.


• A feature is a description of an image.
• Features are specific structures in the image, such as points, edges or objects.
• Other examples of features relate to Computer Vision tasks such as motion in image sequences, or
to shapes defined in terms of curves or boundaries between different image regions.
OpenCV, or the Open Source Computer Vision Library, is a tool that helps a computer
extract these features from images. It is capable of processing images and videos
to identify objects, faces, or even handwriting.

Describe the concept of feature extraction and its importance in different Computer Vision tasks.

Feature extraction involves identifying and isolating significant information
from an image, such as edges, textures, and shapes. This process is crucial for
various Computer Vision tasks:
Object Recognition: Features help in identifying objects within an image by
matching extracted features with known patterns.
Image Classification: Features are used to classify images into categories based
on their content.
Tracking: Extracted features allow for tracking objects across frames in video
analysis, crucial for surveillance and motion detection.
Augmented Reality: Features are used to overlay virtual objects accurately
onto real-world scenes

Compare and contrast object detection, image classification, and image segmentation in Computer Vision.

Object Detection: This task involves identifying and locating objects within an
image, providing bounding boxes around detected objects. It focuses on
detecting multiple objects and their positions.
Image Classification: This task assigns a single label to an entire image based
on its content. It does not provide the locations of objects, only categorizes the
image as a whole.
Image Segmentation: This task divides an image into segments, each
representing a different object or region. It provides pixel-level classification,
offering detailed information about the structure and boundaries within the
image.

NATURAL LANGUAGE PROCESSING

Natural Language Processing


Natural Language Processing (NLP) refers to the AI method of communicating with
intelligent systems using a natural language such as English.
Application of Natural Language Processing :
- Automatic Text Summarization
- Sentiment and Emotion Analysis
- Text Classification
- Virtual Assistants

Components of NLP:
There are two components of NLP, as given below.
Natural Language Understanding (NLU)
Understanding involves the following tasks:
• Mapping the given input in natural language into useful representations.
• Analyzing different aspects of the language.
Natural Language Generation (NLG)
It is the process of producing meaningful phrases and sentences in the form of natural
language from some internal representation.
It involves:
• Text planning – retrieving the relevant content from the knowledge
base.
• Sentence planning – choosing the required words, forming meaningful
phrases, and setting the tone of the sentence.
• Text realization – mapping the sentence plan into sentence structure.
NLU is harder than NLG.

Steps in NLP
There are five general steps:
• Lexical Analysis – It involves identifying and analyzing the structure of words.
The lexicon of a language means the collection of words and phrases in that language.
Lexical analysis divides the whole chunk of text into paragraphs, sentences,
and words.
• Syntactic Analysis (Parsing) – It involves analysis of the words in a sentence
for grammar, and arranging the words in a manner that shows the relationship
among them. A sentence such as “The school goes to boy” is rejected
by an English syntactic analyzer.
• Semantic Analysis – It draws the exact meaning or the dictionary meaning
from the text. The text is checked for meaningfulness. It is done by mapping
syntactic structures to objects in the task domain.
• Discourse Integration – The meaning of any sentence depends upon the
meaning of the sentence just before it. In addition, it also contributes to the
meaning of the immediately succeeding sentence.
• Pragmatic Analysis – During this step, what was said is re-interpreted based on what it
actually meant. It involves deriving those aspects of language which require
real-world knowledge.
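The lexical analysis step, dividing a chunk of text into sentences and words, can be sketched with Python's standard `re` module. The sample text and the simple split-on-punctuation rule are assumptions; real NLP tokenizers handle many more cases.

```python
import re

text = "The school is open. The boy goes to school."

# Lexical analysis: divide the chunk of text into sentences, then words
sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
words = [re.findall(r"[A-Za-z]+", s) for s in sentences]

print(sentences)  # prints: ['The school is open', 'The boy goes to school']
print(words[1])   # prints: ['The', 'boy', 'goes', 'to', 'school']
```

The later steps (syntactic, semantic, discourse, pragmatic analysis) then work on these word and sentence units rather than on raw characters.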

EVALUATION

What is evaluation?
Evaluation is the process of understanding the reliability of any AI model, based on
outputs obtained by feeding the test dataset into the model and comparing them with the actual answers.
There can be different evaluation techniques, depending on the type and purpose of
the model.
Evaluation is the final stage in the AI Project Cycle. Once a model has been made and
trained, it needs to go through proper testing so that one can calculate the efficiency
and performance of the model. Hence, the model is tested with the help of the Testing
Data.
Evaluation is basically done using two things:
1. Prediction: The output given by the machine after training and testing on the data
is known as the Prediction (the output of the machine).
2. Reality: Reality is the real situation and real scenario for which the prediction has been
made by the machine (the reality or truth).

What is a Confusion Matrix?


The result of the comparison between the prediction and the reality can be recorded in
what we call the confusion matrix. The confusion matrix allows us to understand
the prediction results. Note that it is not an evaluation metric, but a record which
can help in evaluation.
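Building a confusion matrix from Prediction and Reality can be sketched in plain Python. The two lists of 0/1 labels below are made-up toy data for the illustration.

```python
# Prediction (model output) vs Reality (true answers) for 10 test cases
prediction = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
reality    = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

# Count the four cells of the confusion matrix
tp = sum(p == 1 and r == 1 for p, r in zip(prediction, reality))  # True Positive
tn = sum(p == 0 and r == 0 for p, r in zip(prediction, reality))  # True Negative
fp = sum(p == 1 and r == 0 for p, r in zip(prediction, reality))  # False Positive
fn = sum(p == 0 and r == 1 for p, r in zip(prediction, reality))  # False Negative

print(f"TP={tp} TN={tn} FP={fp} FN={fn}")  # prints: TP=4 TN=4 FP=1 FN=1
```

The four counts are the record from which evaluation metrics such as accuracy can later be calculated; the matrix itself is just this tally of agreements and disagreements.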
