Unit-4 AI Notes

Uploaded by nsagarishant

Unit-4

Natural Language Processing (NLP) is about teaching computers to understand and work with human language. Here’s a simple breakdown of some important parts of NLP:

1. Tokenization: Splitting sentences into words or parts to make them easier for
computers to understand.

2. Syntax and Semantics: Figuring out the structure of sentences (syntax) and
the meaning of words and phrases (semantics) to understand what’s being said.

3. Named Entity Recognition (NER): Recognizing important names, dates, and places in text, like spotting "India" as a country.

4. Sentiment Analysis: Detecting the emotional tone of text, like if it’s positive,
negative, or neutral.

5. Language Modeling: Predicting which words might come next in a sentence, which helps make responses more natural.

6. Machine Translation: Translating text from one language to another, like converting English to Spanish.

7. Speech Recognition: Turning spoken language into text so that computers can process what we say.
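
The tokenization step above can be sketched with a small regex-based tokenizer (a minimal illustration using only Python’s standard library; production systems use trained tokenizers):

```python
import re

def tokenize(text):
    # Split into word tokens and separate punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("NLP helps computers understand language!"))
# ['NLP', 'helps', 'computers', 'understand', 'language', '!']
```

Note how the punctuation becomes its own token, so later steps can treat words and symbols separately.
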
NLP approaches:

1. Rule-Based Systems: Uses fixed rules for language tasks. Good for simple
tasks but can’t handle complex language variations.

2. Statistical NLP: Analyzes language using probability from large datasets. Works well for language patterns but needs a lot of data.

3. Machine Learning-Based NLP: Trains models using labeled data to recognize patterns. Useful for tasks like spam detection but needs quality training data.

4. Deep Learning: Uses neural networks to handle complex patterns, like language translation, but requires lots of data and computing power.

5. Transformers: Advanced models (like BERT, GPT) that understand context better and can handle long texts. These are used in applications like text generation.

6. Hybrid Models: Combines various methods to improve accuracy, useful for specialized tasks, though complex to set up.

7. Transfer Learning: Uses pre-trained models on general tasks, then fine-tunes them for specific tasks, saving time and resources.
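
The statistical and language-modeling ideas above can be combined in a tiny bigram model that counts, from a toy corpus, which word most often follows another (a minimal sketch; real models need far larger datasets and smoothing):

```python
from collections import Counter, defaultdict

def train_bigrams(sentences):
    # Count how often each word follows another across the corpus.
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    # Return the most frequent follower of `word`, or None if unseen.
    followers = counts.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

corpus = ["the cat sat", "the cat ran", "the dog sat"]
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # 'cat' (seen twice, vs 'dog' once)
```

This is the "needs a lot of data" point in miniature: with only three sentences, the model can only predict words it has already seen in context.
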
Parsing techniques in artificial intelligence help computers understand the
structure and meaning of sentences by breaking them down into components like
words, phrases, and grammar. Here are some common parsing techniques
explained simply:

1. Top-Down Parsing

How it works: Starts with the overall structure (sentence) and breaks it down into
smaller parts, matching each word or phrase with grammar rules.

Example: If the rule says a sentence is "subject + verb + object," the parser tries
to identify each of these parts in the sentence.

Limitation: Can sometimes fail if the sentence structure doesn’t fit neatly with the
rules.
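
This top-down strategy can be sketched as a recursive-descent parser for the toy rule "sentence = subject + verb + object", using a small hand-made lexicon (both the lexicon and the rules are illustrative assumptions):

```python
# Toy lexicon: maps words to their parts of speech (an assumption for illustration).
LEXICON = {"the": "DET", "cat": "NOUN", "mouse": "NOUN", "chased": "VERB"}

def parse_np(words, i):
    # NP -> DET NOUN: try to match a noun phrase starting at position i.
    if i + 1 < len(words) and LEXICON.get(words[i]) == "DET" \
            and LEXICON.get(words[i + 1]) == "NOUN":
        return ("NP", words[i], words[i + 1]), i + 2
    return None, i

def parse_sentence(words):
    # S -> NP VERB NP: match each expected part left to right.
    subj, i = parse_np(words, 0)
    if subj and i < len(words) and LEXICON.get(words[i]) == "VERB":
        verb, i = words[i], i + 1
        obj, i = parse_np(words, i)
        if obj and i == len(words):
            return ("S", subj, verb, obj)
    return None  # the sentence did not fit the rule

print(parse_sentence("the cat chased the mouse".split()))
# ('S', ('NP', 'the', 'cat'), 'chased', ('NP', 'the', 'mouse'))
```

The `return None` branch shows the limitation mentioned above: any sentence that does not match the fixed rule is rejected outright.
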

2. Bottom-Up Parsing

How it works: Begins with individual words and gradually combines them into
larger structures until a full sentence is formed.

Example: First, it identifies words, then combines them into phrases, and finally
joins them into a sentence.

Advantage: Good for handling complex sentence structures.
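
Bottom-up parsing can be sketched as a shift-reduce loop: words are pushed onto a stack and reduced into phrases whenever the top of the stack matches a rule (toy grammar and lexicon assumed for illustration):

```python
LEXICON = {"the": "DET", "cat": "NOUN", "mouse": "NOUN", "chased": "VERB"}
# Rules map a sequence of categories to the phrase they form.
RULES = {("DET", "NOUN"): "NP", ("NP", "VERB", "NP"): "S"}

def shift_reduce(words):
    stack = []
    for word in words:
        stack.append(LEXICON[word])      # shift: push the word's category
        reduced = True
        while reduced:                   # reduce as long as any rule matches
            reduced = False
            for pattern, result in RULES.items():
                n = len(pattern)
                if tuple(stack[-n:]) == pattern:
                    stack[-n:] = [result]
                    reduced = True
                    break
    return stack

print(shift_reduce("the cat chased the mouse".split()))  # ['S']
```

Parsing succeeds when the whole stack collapses into a single S, built up from words, to phrases, to a sentence, exactly as described above.
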

3. Dependency Parsing

How it works: Focuses on the relationships between words (like “subject” and
“object”) rather than just their order.

Example: In “The cat chased the mouse,” it identifies that “cat” is the subject,
“chased” is the verb, and “mouse” is the object.

Advantage: Useful for understanding meaning, especially in languages with flexible word order.
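
The dependency analysis in the example can be stored as (head, relation, dependent) triples; the parse below is written out by hand to show the data structure (real parsers such as spaCy produce these automatically):

```python
# Dependency parse of "The cat chased the mouse", hand-written
# as (head, relation, dependent) triples.
dependencies = [
    ("chased", "nsubj", "cat"),    # "cat" is the subject of "chased"
    ("chased", "obj", "mouse"),    # "mouse" is the object of "chased"
    ("cat", "det", "The"),         # "The" modifies "cat"
    ("mouse", "det", "the"),
]

def dependents_of(head, deps):
    # Collect every word attached to a given head, with its relation.
    return [(rel, dep) for h, rel, dep in deps if h == head]

print(dependents_of("chased", dependencies))
# [('nsubj', 'cat'), ('obj', 'mouse')]
```

Because the triples record relations rather than positions, the same structure would describe the sentence even if a language allowed the words in a different order.
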
4. Constituency Parsing

How it works: Breaks sentences into nested parts, like a tree structure, to show
how smaller parts form larger phrases.

Example: “The quick brown fox” might be parsed as a noun phrase within a
larger sentence structure.

Advantage: Helps identify the grammar structure, which is useful for sentence
generation and translation.
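
The nested, tree-like structure can be represented with nested tuples; here "The quick brown fox jumps" is written out by hand, with the noun phrase inside the larger sentence (an illustration of the representation, not parser output):

```python
# Constituency tree as nested tuples: each node is (label, child, ...),
# and the leaves are the actual words.
tree = ("S",
        ("NP", ("DET", "The"), ("ADJ", "quick"),
               ("ADJ", "brown"), ("NOUN", "fox")),
        ("VP", ("VERB", "jumps")))

def leaves(node):
    # Walk the tree and collect the words at the leaves, left to right.
    if isinstance(node, str):
        return [node]
    label, *children = node
    return [word for child in children for word in leaves(child)]

print(" ".join(leaves(tree)))  # The quick brown fox jumps
```

Reading the words back off the leaves recovers the original sentence, which is what makes this representation convenient for generation and translation.
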

5. Chart Parsing

How it works: Uses a table (or "chart") to store possible interpretations of each
part of a sentence, helping avoid re-analysis of the same parts.

Example: As it parses a sentence, it notes down all possible structures to find the
best fit.

Advantage: Efficient for complex sentences, as it reduces repeated work.
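
The chart idea is what the classic CYK algorithm implements: a table where cell (i, j) stores every category that can cover words i..j, so no span is ever analyzed twice (toy grammar in Chomsky normal form, assumed for illustration):

```python
# Toy grammar: a word maps to its categories, and a pair of adjacent
# categories maps to the phrases it can form.
LEXICON = {"the": {"DET"}, "cat": {"N"}, "mouse": {"N"}, "chased": {"V"}}
BINARY = {("DET", "N"): {"NP"}, ("V", "NP"): {"VP"}, ("NP", "VP"): {"S"}}

def cyk(words):
    n = len(words)
    # chart[i][j] holds every category that can span words[i:j].
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, ()))
    for span in range(2, n + 1):              # widen spans gradually
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # try every split point
                for left in chart[i][k]:
                    for right in chart[k][j]:
                        chart[i][j] |= BINARY.get((left, right), set())
    return chart[0][n]                        # categories covering it all

print(cyk("the cat chased the mouse".split()))  # {'S'}
```

The sentence parses if S appears in the top cell; every sub-span is computed once and reused, which is the "reduces repeated work" advantage.
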


Context-Free Grammar (CFG)

Definition: A set of rules that describe how to create valid sentences in a


language.

Components:

1. Terminals: Actual words.
2. Non-terminals: Symbols that represent groups of words.
3. Production Rules: Instructions for constructing sentences.

Use in AI: Helps computers analyze and understand the structure of sentences.
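
The three components map directly onto code: production rules keyed by non-terminals, with terminals at the leaves; expanding from the start symbol S generates valid sentences (a toy grammar invented for illustration):

```python
import random

# Production rules: each non-terminal maps to its possible expansions.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["DET", "N"]],
    "VP":  [["V", "NP"]],
    "DET": [["the"]],                  # terminals: actual words
    "N":   [["cat"], ["mouse"]],
    "V":   [["chased"], ["saw"]],
}

def generate(symbol="S"):
    # Symbols without rules are terminals and are emitted as-is.
    if symbol not in GRAMMAR:
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))  # e.g. "the cat chased the mouse"
```

Every sentence this produces is valid under the grammar, which is exactly what the production rules are for; running the rules in reverse is parsing.
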

Transformational Grammar

Definition: A framework that explains how sentences can change form while
maintaining meaning.

Components:

1. Base Structures: Simple sentences.
2. Transformations: Rules that modify sentences (e.g., active to passive voice).

Use in AI: Aids in understanding different ways to express the same idea,
enhancing natural language processing.
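
The active-to-passive transformation can be sketched as a rule that rearranges a (subject, verb, object) base structure; the participle table below is a toy assumption, nothing like a full grammar:

```python
# Toy table of past participles (an illustrative assumption).
PARTICIPLES = {"chased": "chased", "saw": "seen"}

def to_passive(subject, verb, obj):
    # "X verb Y" -> "Y was <participle> by X": new form, same meaning.
    return f"{obj} was {PARTICIPLES[verb]} by {subject}"

print(to_passive("the cat", "chased", "the mouse"))
# the mouse was chased by the cat
```

Both forms express the same idea, which is the point of transformational grammar: one base structure, many surface forms.
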

Summary

Together, CFG and transformational grammar enable AI systems to comprehend and generate human language effectively.
