
GRIFFINS INTERNATIONAL SCHOOL KHARAGPUR

ARTIFICIAL INTELLIGENCE
(SUBJECT CODE - 417)
PRE-BOARD III EXAMINATION
Class X (Session 2023-2024)
Marking Scheme
SECTION A: OBJECTIVE TYPE QUESTIONS
Q. 1 Answer any 4 out of the given 6 questions on Employability Skills (1 x 4 = 4 marks)
1.1 (b) Self-motivation 1
1.2 (a) Consideration 1
1.3 (b) Graphical User Interface 1
1.4 The statement is a myth. 1
1.5 Soil nutrients are also getting depleted and lots of chemicals are spoiling the soil due to the use of chemical fertilizers. 1
1.6 (d) Linguistic barrier 1
Q. 2 Answer any 5 out of the given 6 questions (1 x 5 = 5 marks)
2.1 (c) Musical Intelligence 1
2.2 (a) Both A and R are correct and R is the correct explanation of A. 1
2.3 (a) Training Data 1
2.4 (d) Neural Network 1
2.5 (a) Data Science 1
2.6 (d) All of the above 1
Q. 3 Answer any 5 out of the given 6 questions (1 x 5 = 5 marks)
3.1 (d) YouTube 1
3.2 (b) 4Ws Problem 1
3.3 (a) Both Statement 1 and Statement 2 are correct. 1
3.4 3 1
3.5 (c) NLP 1
3.6 (b) F1 Score 1
Q. 4 Answer any 5 out of the given 6 questions (1 x 5 = 5 marks)
4.1 (b) Both statements 1 and 2 are correct 1
4.2 (b) Accuracy 1
4.3 (b) Fraud and risk detection 1
4.4 (b) Object detection 1
4.5 (d) Text and Speech 1
4.6 (d) True Positive, False Positive 1
Q.5 Answer any 5 out of the given 6 questions (1 x 5 = 5 marks)
5.1 (c) AI Bias 1
5.2 (a) defined as the fraction of positive cases that are correctly identified. 1

5.3 (c) Pixel 1
5.4 Text classification 1
5.5 Natural Language Processing is the sub-field of AI that is focused on enabling computers to understand and process human languages. 1
5.6 In an AI model, when the prediction is true and it matches with the reality, it is a true positive. 1

SECTION B: SUBJECTIVE TYPE QUESTIONS


Answer any 3 out of the given 5 questions on Employability Skills (2 x 3 = 6 marks)
Answer each question in 20 – 30 words
Q.6 1. Use simple language
2. Be respectful of others' opinions
3. Do not form assumptions based on culture, religion or geography
4. Try to communicate in person as much as possible
5. Use visuals
6. Take the help of a translator to overcome differences in language. 2
Q.7 Doing meditation and deep-breathing exercises helps in proper blood circulation and relaxes the body. 2
Q.8 1. Run a full system virus scan
2. Check whether the anti-virus software has expired and renew it 2
Q.9 1. Making decisions
2. Managing the business
3. Dividing income
4. Taking risks
5. Creating a new method, idea or product 2
Q.10 To reduce inequalities we can:
1. be helpful to one another.
2. be friendly with everyone.
3. include everyone while working or playing.
4. help others by including everyone, whether they are small or big, girl or boy, or belong to any class or caste. 2
Answer any 4 out of the given 6 questions in 20 – 30 words each (2 x 4 = 8 marks)
Q.11 Amazon's AI recruiting tool taught itself that male candidates were preferable. It penalized resumes that included the word "women". This led to the failure of the tool. This is an example of AI bias. (Any relevant correct answer) 2
Q.12 We humans are able to visualise only up to 3 dimensions, but according to a lot of theories and algorithms, there are various entities which exist beyond 3 dimensions. For example, in Natural Language Processing, words are considered to be N-dimensional entities, which means that we cannot visualise them as they exist beyond our visualisation ability. Hence, to make sense of them, we need to reduce their dimensions. Here, a dimensionality reduction algorithm is used. 2
Q.13 While accessing data from any of the data sources, the following points should be kept in mind:
1. Only data which is available for public usage should be taken up.
2. Personal datasets should only be used with the consent of the owner.
3. One should never breach someone's privacy to collect data.
4. Data should only be taken from reliable sources, as data collected from random sources can be wrong or unusable.
5. Reliable sources of data ensure the authenticity of the data, which helps in proper training of the AI model.
6. Data should be relevant to the problem. 2
Q.14 Resolution of an image refers to the number of pixels in an image, across its width and height. For example, a monitor may have a resolution of 1280 x 1024, which means there are 1280 pixels from one side to the other and 1024 from top to bottom. 2
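The arithmetic behind such a resolution figure can be sketched in one line (using the 1280 x 1024 value from the answer above):

```python
# Total pixel count implied by a 1280 x 1024 monitor resolution.
width, height = 1280, 1024
total_pixels = width * height
print(total_pixels)  # → 1310720
```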
Q.15 After stopword removal, we convert the whole text into the same case, preferably lower case. This ensures that the machine does not treat the same words as different just because they appear in different cases. Here in this example, all six forms of "hello" would be converted to lower case and hence would be treated as the same word by the machine. 2
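A minimal sketch of this lower-casing step; the six differently cased spellings below are invented for illustration, not taken from the question paper:

```python
# Six differently cased forms of the same word.
words = ["Hello", "HELLO", "hello", "HeLLo", "heLLO", "hellO"]

# Convert everything to lower case, as described above.
normalised = [w.lower() for w in words]

print(set(normalised))  # all six forms collapse to the single token {'hello'}
```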
Q.16 Accuracy is defined as the percentage of correct predictions out of all the observations. A prediction can be said to be correct if it matches the reality. Here, we have two conditions in which the Prediction matches the Reality: True Positive and True Negative. Hence, the formula for Accuracy becomes:
Accuracy = (Correct Predictions / Total Cases) x 100%
= ((TP + TN) / (TP + TN + FP + FN)) x 100% 2
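The formula can be sketched as a small function; the confusion-matrix counts in the example call are hypothetical, chosen only to illustrate the calculation:

```python
def accuracy(tp, tn, fp, fn):
    """Percentage of correct predictions (TP + TN) out of all observations."""
    return (tp + tn) / (tp + tn + fp + fn) * 100

# Hypothetical counts: 50 TP, 20 TN, 5 FP, 25 FN → 70 correct out of 100 cases.
print(accuracy(50, 20, 5, 25))  # → 70.0
```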
Answer any 3 out of the given 5 questions in 50 – 80 words each (4 x 3 = 12 marks)
Q.17 No, not all the devices which are termed "smart" are AI-enabled. For example:
• A TV does not become AI-enabled just because it is a smart one; it gets the power of AI when it is able to think and process on its own.
• A fully automatic washing machine can work on its own, but it requires human intervention to select the washing parameters and to do the necessary preparation before each wash for it to function correctly, which makes it an example of automation, not AI.

Just as humans learn how to walk and then improve this skill with the help of their experiences, an AI machine too gets trained first on the training data and then optimizes itself according to its own experiences, which makes AI different from any other technological device/machine. 4
Q.18 To develop an AI project, the AI Project Cycle provides us with an appropriate framework which can lead us towards the goal. The AI Project Cycle mainly has 5 stages:
1. Problem Scoping
2. Data Acquisition
3. Data Exploration
4. Modelling
5. Evaluation
Data Acquisition: As the term clearly mentions, this stage is about acquiring data for the project. Data can be a piece of information or facts and statistics collected together for reference or analysis. Data can be collected through surveys, interviews, web scraping, sensors, cameras, observation, and API programs. Whenever we want an AI project to be able to predict an output, we need to train it first using data.
Data Exploration: While acquiring data, we must have noticed that the data is a complex entity: it is full of numbers, and if anyone wants to make some sense out of it, they have to work some patterns out of it. We need to explore the data so that we can:
 Quickly get a sense of the trends, relationships and patterns contained within the data.
 Define a strategy for which model to use at a later stage.
 Communicate the same to others effectively. To visualise data, we can use various types of visual representations. 4
Q.19 Who: Stakeholders – our people
What: Have a problem of – air pollution, which has damaging effects on human health
Where: While – harmful gases like SO2, NO2 and CO2 are emitted directly into the air
Why: An ideal solution would be – to develop an air quality index monitor so that one can know the local air quality and take action to protect their health 4
Q.20 Bag of Words algorithm: Bag of Words is a Natural Language Processing model which helps in extracting features out of the text, which can be helpful in machine learning algorithms. In Bag of Words, we get the occurrences of each word and construct the vocabulary for the corpus. Here is the step-by-step approach to implement the Bag of Words algorithm:
1. Text Normalisation: Collect data and pre-process it.
2. Create Dictionary: Make a list of all the unique words occurring in the corpus. (Vocabulary)
3. Create document vectors: For each document in the corpus, find out how many times each word from the unique list of words has occurred.
4. Create document vectors for all the documents.
(Any relevant example, with all the four steps) 4
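The four steps can be sketched in Python; the two-document corpus below is invented for illustration and is not part of the marking scheme:

```python
import string

# A toy corpus of two invented documents.
docs = ["Aman and Anil are stressed.", "Aman went to a therapist."]

# Step 1: Text Normalisation - lower-case the text and strip punctuation.
tokens = [[w.strip(string.punctuation) for w in d.lower().split()] for d in docs]

# Step 2: Create Dictionary - the unique words in the corpus (the vocabulary).
vocab = sorted({w for doc in tokens for w in doc})

# Steps 3 and 4: Create a document vector of word counts for every document.
vectors = [[doc.count(w) for w in vocab] for doc in tokens]

print(vocab)
print(vectors)
```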


Q.21 (i) True Negative = 20
(ii) Precision = 0.91, Recall = 0.71 and F1 Score = 0.8 4
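The F1 Score in (ii) can be verified from the given Precision and Recall using the standard formula F1 = 2 x (Precision x Recall) / (Precision + Recall):

```python
# Verify the F1 Score from the Precision and Recall given in the answer.
precision, recall = 0.91, 0.71
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 1))  # → 0.8
```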
