
Module 2: Key Technologies and Infrastructure
• Application programming interface-API
• Artificial Intelligence
• Big Data
• Natural Language Processing
• Blockchain
• Cloud Computing (Cloud)
• Edge Computing
Application Programming Interface (API)
• An API (Application Programming Interface) is an interface between multiple applications.
• In other words, it allows two applications to communicate with each other.
• An API follows a question-and-answer principle, much as humans do.
Meaning of API
• An Application Programming Interface is a software interface that connects computers or computer programs.
• In simple words, it is a software interface that offers a service to other pieces of software.
• Example – well-known styles of web service APIs are SOAP (Simple Object Access Protocol) and REST (Representational State Transfer).
How do APIs Work?
• Think of a client-server architecture where the client sends a request via a medium to the server and receives the response through the same medium.
• An API acts as that communication medium between two programs or systems.
• The client is the user/customer (who sends the request), the medium is the API, and the server is the backend (where the request is accepted and a response is provided).
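To make this concrete, here is a minimal sketch of the client → API → server round trip in Python, using the widely used requests library; the endpoint URL and parameters are placeholders, not a real service:

import requests  # third-party HTTP client library

# The client's "question": a GET request sent through the API (the medium).
response = requests.get(
    "https://api.example.com/weather",   # hypothetical endpoint
    params={"city": "Chennai"},          # hypothetical query parameter
    timeout=10,
)

# The server's "answer": a response delivered back through the same medium.
if response.status_code == 200:
    print(response.json())               # e.g., a JSON weather forecast
else:
    print("Request failed:", response.status_code)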
When?
• APIs originated in the 1940s with the British computer scientists Maurice Wilkes and David Wheeler, who at the time were working on a modular software library for EDSAC (Electronic Delay Storage Automatic Calculator), an early computer.
Features of an API
• An application programming interface is software that allows two applications to talk to each other.
• It enables applications to exchange data and functionality easily.
• It is often described as a middleman between two systems.
• It helps in data monetization.
• It helps in improving collaboration.

Different types of APIs
Applications of APIs in the real world
• Weather snippets – APIs are used to access large weather datasets and retrieve forecast information, which is very helpful in day-to-day life.

• Login – APIs are widely used to support login via Google, LinkedIn, GitHub, or Twitter, allowing users to access a portal through the provider's API.

• Entertainment – APIs are used to access huge databases of movies, web series, comedy, etc.
Applications of APIs in the real world

• E-commerce websites – After you have purchased something and want to pay, APIs provide the interface through which you can pay using different bank debit cards, UPI (Unified Payments Interface), credit cards, wallets, etc.

• Gaming – APIs provide an interface through which you can access game information, connect with other users, and play with many different users at the same time.
Artificial Intelligence - Overview
What is Artificial Intelligence?
• Artificial Intelligence is a collection of many different technologies working together to
enable machines to sense, comprehend, act, and learn with human-like levels of
intelligence.
• Maybe that’s why it seems as though everyone’s definition of artificial intelligence is
different: AI isn’t just one thing.
• Technologies like Machine Learning and Natural Language Processing are all part of the
AI landscape.
• Each one is evolving along its own path and, when applied in combination with data,
analytics and automation, can help businesses achieve their goals, be it improving
customer service or optimizing the supply chain.
What is Artificial Intelligence?
• According to the father of Artificial Intelligence, John McCarthy, it is “The
science and engineering of making intelligent machines, especially
intelligent computer programs”.
• Artificial Intelligence is a way of making a computer, a computer-controlled robot, or a piece of software think intelligently, in a manner similar to the way intelligent humans think.
• AI is accomplished by studying how the human brain thinks, and how humans learn, decide, and work while trying to solve a problem, and then using the outcomes of this study as a basis for developing intelligent software and systems.
What is Intelligence?

The ability of a system to calculate, reason, perceive relationships and analogies, learn from experience, store and retrieve information from memory, solve problems, comprehend complex ideas, use natural language fluently, classify, generalize, and adapt to new situations.


Types of Intelligence
History of AI
Goals of AI
• To Create Expert Systems − Systems which exhibit intelligent behavior: they learn, demonstrate, explain, and advise their users.
• To Implement Human Intelligence in Machines − Creating systems that understand, think, learn, and behave like humans.

What Contributes to AI?
• Artificial intelligence is a science and technology based on disciplines such as Computer Science, Biology, Psychology, Linguistics, Mathematics, and Engineering.
• A major thrust of AI is the development of computer functions associated with human intelligence, such as reasoning, learning, and problem solving.
• One or more of these areas can contribute to building an intelligent system.
Key Components of AI
• AI applications generally involve the use of
• Data,
• Algorithms, and
• Human interaction.
• Ensuring each of these components is appropriately structured and validated is important for the development and implementation of AI applications.
Data:
• AI applications are generally designed to analyze data by identifying patterns and to make determinations or predictions based on those patterns.
• Examples include facial recognition in surveillance systems, recommendation engines in e-commerce platforms, and predictive maintenance in manufacturing.
• Data in AI applications is used for training machine learning models, making predictions, and improving decision-making.
Algorithms:
• An algorithm is a set of well-defined, step-by-step instructions for a machine to solve a specific problem and generate an output using a set of input data.
• AI algorithms, particularly those used for ML, involve complex mathematical code designed to enable machines to continuously learn from new input data and develop new or adjusted output based on those learnings.
• An AI algorithm is "not programmed to perform a task, but is programmed to learn to perform the task."
• The availability of open-source AI algorithms, including those from some of the largest technology companies, has helped fuel AI innovation and made the technology more accessible to the financial industry.
Machine Learning Algorithms
• Dimensionality reduction methods are key to several real-life applications, including text categorization, image retrieval, face recognition, neuroscience, gene expression analysis, email categorization, etc.
• Regression examples include financial forecasting (such as house price estimates or stock prices), sales and promotions forecasting, etc.
• Decision tree algorithms (see the sketch below).
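As a brief, hedged illustration of one such supervised algorithm, the sketch below trains a decision tree classifier on scikit-learn's built-in Iris dataset; it assumes scikit-learn is installed and is only meant to show the general workflow (fit on training data, score on held-out data):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load a small labeled dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a shallow decision tree and evaluate it on unseen data.
model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))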
Human interaction:
• Human involvement is imperative throughout the lifecycle of any AI application, from preparing the
data and the algorithms to testing the output, retraining the model, and verifying results.

• As data is collected and prepared, human reviews are essential to curate the data as appropriate for
the application.

• As algorithms sift through data and generate output (e.g., classifications, outliers, and predictions),
the next critical component is human review of the output for relevancy, accuracy, and usefulness.

• Business and technology stakeholders typically work together to analyze AI-based output and give
appropriate feedback to the AI systems for refinement of the model.

• Absence of such human review and feedback may lead to irrelevant, incorrect, or inappropriate
results from the AI systems, potentially creating inefficiencies, foregone opportunities, or new risks
if actions are taken based on faulty results.
The benefits of AI
❖End-to-end efficiency: AI improves analytics and resource utilization across your
organization, resulting in significant cost reductions. It can also automate complex
processes and minimize downtime by predicting maintenance needs.

❖Improved accuracy and decision-making: AI augments human intelligence with rich analytics and pattern prediction capabilities to improve the quality, effectiveness, and creativity of employee decisions.

❖Intelligent offerings: Because machines think differently from humans, they can
uncover gaps and opportunities in the market more quickly, helping you introduce new
products, services, channels and business models with a level of speed and quality that
wasn’t possible before.
The benefits of AI
❖Empowered employees: AI can tackle everyday activities while employees spend time
on more fulfilling high-value tasks. By fundamentally changing the way work is done
and reinforcing the role of people to drive growth, AI is projected to boost labor
productivity. Using AI can also unlock the incredible potential of talent with disabilities,
while helping all workers thrive.

❖Superior customer service: Continuous machine learning provides a steady flow of 360-degree customer insights for hyper-personalization. From 24/7 chatbots to faster help desk routing, businesses can use AI to curate information in real time and provide high-touch experiences that drive growth, retention and overall satisfaction.
Disadvantages of AI
The following are some disadvantages of AI.
• Expensive.
• Requires deep technical expertise.
• Limited supply of qualified workers to build AI tools.
• Reflects the biases of its training data, at scale.
• Lack of ability to generalize from one task to another.
• Eliminates human jobs, increasing unemployment rates.
Big Data
What is Data?

• The quantities, characters, or symbols on which operations are performed

by a computer, which may be stored and transmitted in the form of

electrical signals and recorded on magnetic, optical, or mechanical

recording media.
What is Big Data??
• Data which is very large in size is called Big Data.
• Normally we work on data of size MB (Word documents, Excel files) or at most GB (movies, code), but data of petabyte size, i.e. 10^15 bytes, is called Big Data.
• It is stated that almost 90% of today's data has been generated in the past 3 years.
• Big Data is a collection of data that is huge in volume, yet growing exponentially with time.
• It is data of such large size and complexity that no traditional data management tool can store or process it efficiently.
• In short, big data is still just data, but of enormous size.


Types Of Big Data

1. Structured
2. Unstructured
3. Semi-structured
1. Structured
• Any data that can be stored, accessed, and processed in a fixed format is termed 'structured' data.
• It is highly organized and follows a pre-defined schema or format.
• Each data element has a specific data type and is associated with predefined fields and tables.
• Structured data is characterized by its consistency and uniformity, which makes it easier to query, analyze, and process using traditional database management systems.
1. Structured-Examples
• Online booking. Different hotel booking and ticket reservation services
leverage the advantages of the pre-defined data model as all booking data
such as dates, prices, destinations, etc. fit into a standard data structure with
rows and columns.
• ATMs. Any ATM is a great example of how relational databases and
structured data work. All the actions a user can do follow a pre-defined
model.
• Inventory control systems. There are lots of variants of inventory control
systems companies use, but they all rely on a highly organized environment
of relational databases.
• Banking and accounting. Different companies and banks must process and
record huge amounts of financial transactions. Consequently, they make use
of traditional database management systems to keep structured data in
place.
2. Unstructured
• Any data with unknown form or structure is classified as unstructured data.
• It does not have a predefined structure and may or may not establish clear relationships between different data entities.
• Identifying patterns, sentiments, relationships, and relevant information within unstructured data typically requires advanced AI tools such as Natural Language Processing (NLP), Natural Language Understanding (NLU), and computer vision.
• Examples: free-form text, images, videos, and audio files. (XML, with its tagged fields, is better described as semi-structured; see below.)

2.Unstructured Examples
• Sound recognition. Call centers use speech recognition to identify
customers and collect information about their queries and emotions.
• Image recognition. Online retailers take advantage of image
recognition so that customers can shop from their phones by posting a
photo of the desired item.
• Text analytics. Manufacturers make use of advanced text analytics to
examine warranty claims from customers and dealers and elicit
specific items of important information for further clustering and
processing.
• Chatbots. Using Natural Language Processing (NLP) for text
analysis, chatbots help different companies boost customer satisfaction
from their services. Depending on the question input, customers are
routed to the corresponding representatives that would provide
comprehensive answers.
3. Semi-structured
• Contains elements of both structured and unstructured data.
• Semi-structured data refers to data that is not captured or formatted in conventional ways.
• Examples – Emails: emails are a classic example of semi-structured data. They have defined fields like the sender, recipient, subject, and date, but the body of the email is unstructured text.
• XML, JSON, and CSV files: these file types are commonly used to store and transmit data on the web (see the sketch below).
• Other commonly cited examples: media logs, application logs, etc.
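The sketch below illustrates why JSON is considered semi-structured, using only Python's standard json module: the email-like record has fixed, queryable fields, while the body remains free-form text. The field values are made up for illustration:

import json

# An email-like record: defined fields plus an unstructured body.
email = {
    "from": "alice@example.com",
    "to": "bob@example.com",
    "subject": "Quarterly report",
    "body": "Hi Bob, here is the free-form text of the report...",
}

raw = json.dumps(email)    # serialize to JSON text, as stored or transmitted
record = json.loads(raw)   # parse it back into a structured object
print(record["from"], "->", record["to"], "|", record["subject"])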


Characteristics Of Big Data
The three Vs of big data: Volume (the sheer scale of the data), Velocity (the speed at which data is generated and processed), and Variety (the many forms data takes: structured, semi-structured, and unstructured).
Big Data Tools
• Azure Data Lake: A Microsoft cloud service known for simplifying the complexities of ingesting and storing massive amounts of data.
• Beam: An open-source unified programming model and set of APIs for batch and stream processing across different big data frameworks.
• Cassandra: An open-source, highly scalable, distributed NoSQL database designed for handling massive amounts of data across multiple commodity servers.
• Databricks: A unified analytics platform that combines data engineering and data science capabilities for processing and analyzing massive data sets.
Big Data Tools
• Elasticsearch: A search and analytics engine that enables fast
and scalable searching, indexing, and analysis for extremely
large data sets.
• Google Cloud: A collection of big data tools and services
offered by Google Cloud, such as Google BigQuery and Google
Cloud Dataflow.
• Hadoop: A widely used open-source framework for processing
and storing extremely large datasets in a distributed
environment.
• Hive: An open-source data warehousing and SQL-like querying
tool that runs on top of Hadoop to facilitate querying and
analyzing large data sets.
Big Data Tools
• Kafka: An open-source distributed streaming platform that allows for real-time data processing and messaging.
• KNIME Big Data Extensions: Integrates the power of Apache Hadoop and Apache Spark with KNIME Analytics Platform and KNIME Server.
• MongoDB: A document-oriented NoSQL database that provides high performance and scalability for big data applications.
• Pig: An open-source high-level data flow scripting language and execution framework for processing and analyzing large datasets.
• Redshift: Amazon's fully-managed, petabyte-scale data warehouse service.
• Spark: An open-source data processing engine that provides fast and flexible analytics and data processing capabilities for extremely large data sets (see the sketch after this list).
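As a hedged illustration of Spark's programming model, here is a minimal PySpark word count; it assumes pyspark is installed and running in local mode, and the two hard-coded input lines stand in for a large distributed dataset:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("WordCountSketch").getOrCreate()

# A tiny in-memory stand-in for a massive distributed text dataset.
lines = spark.sparkContext.parallelize(["big data tools", "big data frameworks"])

# Classic map/reduce pipeline: split lines into words, then count each word.
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

print(counts.collect())   # e.g., [('big', 2), ('data', 2), ...]
spark.stop()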
Big Data Tools
• Splunk: A platform for searching, analyzing, and visualizing machine-generated data, such as logs and events.
• Tableau: A powerful data visualization tool that helps users explore and present insights from large data sets.
• Talend: An open-source data integration and ETL (Extract, Transform, Load) tool that facilitates the integration and processing of extremely large data sets.
Natural Language Processing (NLP)
Natural Language Processing (NLP)
• A fascinating and rapidly evolving field that intersects Computer Science, Artificial
Intelligence, and linguistics.

• NLP focuses on the interaction between computers and human language, enabling machines to
understand, interpret, and generate human language in a way that is both meaningful and
useful.

• With the increasing volume of text data generated every day, from social media posts to
research articles, NLP has become an essential tool for extracting valuable insights and
automating various tasks.
Natural Language Processing (NLP)
• NLP can be divided into two overlapping subfields:
• Natural Language Understanding (NLU), which focuses on semantic analysis, or determining the intended meaning of text, and
• Natural Language Generation (NLG), which focuses on text generation by a machine.
• NLP is separate from, but often used in conjunction with, speech recognition, which seeks to parse spoken language into words, turning sound into text and vice versa.
Working of Natural Language Processing (NLP)

1. Text Input and Data Collection
• Data Collection: Gathering text data from various sources such as websites, books, social media, or proprietary databases.
• Data Storage: Storing the collected text data in a structured format, such as a database or a collection of documents.
Working of Natural Language Processing (NLP)
2. Text Preprocessing
• Preprocessing is crucial to clean and prepare the raw text data for analysis.
Common preprocessing steps include:
• Tokenization: Splitting text into smaller units like words or sentences.
• Lowercasing: Converting all text to lowercase to ensure uniformity.
• Stopword Removal: Removing common words that do not contribute
significant meaning, such as “and,” “the,” “is.”
• Punctuation Removal: Removing punctuation marks.
• Stemming and Lemmatization: Reducing words to their base or root
forms. Stemming cuts off suffixes, while lemmatization considers the
context and converts words to their meaningful base form.
• Text Normalization: Standardizing text format, including correcting
spelling errors, expanding contractions, and handling special characters.
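A minimal, dependency-free Python sketch of several of these steps follows; the tiny stopword set is illustrative only, and libraries such as NLTK or spaCy provide more complete versions of each step:

import re

STOPWORDS = {"and", "the", "is", "a", "on", "of"}   # tiny illustrative set

def preprocess(text):
    text = text.lower()                               # lowercasing
    tokens = re.findall(r"[a-z']+", text)             # tokenization + punctuation removal
    return [t for t in tokens if t not in STOPWORDS]  # stopword removal

print(preprocess("The cat is sitting on the mat, and the dog is barking!"))
# ['cat', 'sitting', 'mat', 'dog', 'barking']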
Working of Natural Language Processing (NLP)
3. Text Representation
• Bag of Words (BoW): Representing text as a collection of words,
ignoring grammar and word order but keeping track of word
frequency.
• Term Frequency-Inverse Document Frequency (TF-IDF): A
statistic that reflects the importance of a word in a document relative
to a collection of documents.
• Word Embeddings: Using dense vector representations of words
where semantically similar words are closer together in the vector
space (e.g., Word2Vec, GloVe).
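As a brief sketch of the first two representations, the snippet below applies scikit-learn's CountVectorizer (Bag of Words) and TfidfVectorizer to two toy documents; it assumes scikit-learn is installed:

from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the cat sat on the mat", "the dog sat on the log"]

bow = CountVectorizer()                  # Bag of Words: raw term counts
print(bow.fit_transform(docs).toarray())
print(bow.get_feature_names_out())       # the learned vocabulary

tfidf = TfidfVectorizer()                # TF-IDF: counts reweighted by rarity
print(tfidf.fit_transform(docs).toarray().round(2))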
Working of Natural Language Processing (NLP)
4. Feature Extraction
• Extracting meaningful features from the text data that can be used for
various NLP tasks.
• N-grams: Capturing sequences of N words to preserve some context and word order. Types of N-grams:
1. Unigrams: Single words (n=1)
2. Bigrams: Pairs of words (n=2)
3. Trigrams: Sequences of three words (n=3)
4. Four-grams: Sequences of four words (n=4)
5. Five-grams: Sequences of five words (n=5)
• Syntactic Features: Using parts of speech tags, syntactic dependencies, and
parse trees.
• Semantic Features: Leveraging word embeddings and other
representations to capture word meaning and context.
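A small pure-Python sketch of n-gram extraction over a token list is shown below; the helper name ngrams is ours, chosen for illustration:

def ngrams(tokens, n):
    """Return all contiguous n-word sequences from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = ["natural", "language", "processing", "is", "fun"]
print(ngrams(tokens, 1))   # unigrams: single words
print(ngrams(tokens, 2))   # bigrams: pairs of adjacent words
print(ngrams(tokens, 3))   # trigrams: sequences of three words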
Working of Natural Language Processing (NLP)
5. Model Selection and Training
• Selecting and training a Machine Learning or Deep Learning model to
perform specific NLP tasks.
• Supervised Learning: Using labeled data to train models like Support
Vector Machines (SVM), Random Forests, or deep learning models
like Convolutional Neural Networks (CNNs) and Recurrent Neural
Networks (RNNs).
• Unsupervised Learning: Applying techniques like clustering or topic
modeling (e.g., Latent Dirichlet Allocation) on unlabeled data.
• Pre-trained Models: Utilizing pre-trained language models such as
BERT, GPT, or transformer-based models that have been trained on
large corpora.
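As a hedged sketch of using a pre-trained model, the snippet below loads a default sentiment-analysis pipeline from the Hugging Face transformers library; it assumes transformers (and a backend such as PyTorch) is installed and will download a model on first use:

from transformers import pipeline

# pipeline() with no model argument selects a default pre-trained model.
classifier = pipeline("sentiment-analysis")
print(classifier("NLP makes machines understand human language."))
# e.g., [{'label': 'POSITIVE', 'score': 0.99...}]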
Working of Natural Language Processing (NLP)
6. Model Deployment and Inference
• Deploying the trained model and using it to make predictions or
extract insights from new text data.
• Text Classification: Categorizing text into predefined classes (e.g.,
spam detection, sentiment analysis).
• Named Entity Recognition (NER): Identifying and classifying
entities in the text.
• Machine Translation: Translating text from one language to another.
• Question Answering: Providing answers to questions based on the
context provided by text data.
Working of Natural Language Processing (NLP)
7. Evaluation and Optimization
• Evaluating the performance of the NLP algorithm using metrics such as accuracy, precision, recall, F1-score, and others.
• The F1-score is a measure of a model's accuracy in NLP tasks. It is the harmonic mean of precision and recall, F1 = 2 × (precision × recall) / (precision + recall), providing a balanced measure of both.
• The F1-score ranges from 0 (worst) to 1 (best); a higher F1-score indicates better performance.
• Interpretation: F1-score = 1: perfect precision and recall; F1-score = 0: worst possible performance; F1-score > 0.5: generally considered good performance; F1-score < 0.5: room for improvement.
• Hyperparameter Tuning: Adjusting model parameters to improve performance.
• Error Analysis: Analyzing errors to understand model weaknesses and improve robustness.
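A short sketch of computing these metrics with scikit-learn follows; the toy label lists are made up purely for illustration:

from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1]   # made-up gold labels
y_pred = [1, 0, 0, 1, 0, 1]   # made-up model predictions

p = precision_score(y_true, y_pred)
r = recall_score(y_true, y_pred)
print("precision:", p, "recall:", r)
print("F1:", f1_score(y_true, y_pred))   # equals 2*p*r / (p + r)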
Working of Natural Language Processing (NLP)
8. Iteration and Improvement
• Continuously improving the algorithm by incorporating new
data, refining preprocessing techniques, experimenting with
different models, and optimizing features.
Technologies related to Natural Language Processing
1. Machine learning: NLP relies heavily on ML techniques such as supervised and unsupervised learning, deep learning, and reinforcement learning to train models to understand and generate human language.
2. Natural Language Toolkit (NLTK) and other libraries: NLTK is a popular open-source library in Python that provides tools for NLP tasks such as tokenization, stemming, and part-of-speech tagging. Other popular libraries include spaCy, OpenNLP, and CoreNLP.
3. Parsers: Parsers are used to analyze the syntactic structure of sentences, through techniques such as dependency parsing and constituency parsing.
Technologies related to Natural Language Processing
4.Text-to-Speech (TTS) and Speech-to-Text (STT) systems: TTS
systems convert written text into spoken words, while STT systems
convert spoken words into written text.
5.Named Entity Recognition (NER) systems: NER systems identify
and extract named entities such as people, places, and organizations
from the text.
6.Sentiment Analysis: A technique to understand the emotions or
opinions expressed in a piece of text, by using various techniques like
Lexicon-Based, Machine Learning-Based, and Deep Learning-based
methods.
Technologies related to Natural Language Processing
7. Machine Translation: NLP is used for language translation from one language to another through a computer.
8. Chatbots: NLP is used for chatbots that communicate with other chatbots or humans through auditory or textual methods.
9. AI Software: NLP is used in question-answering software for knowledge representation, analytical reasoning, as well as information retrieval.
Applications of Natural Language Processing (NLP)
• Spam Filters:
• One of the most irritating things about email is spam.
• Gmail uses Natural Language Processing (NLP) to discern which
emails are legitimate and which are spam.
• These spam filters look at the text in all the emails you receive and try
to figure out what it means to see if it’s spam or not.
• Algorithmic Trading:
• Used for predicting stock market conditions.
• Using NLP, this technology examines news headlines about companies
and stocks and attempts to comprehend their meaning in order to
determine if you should buy, sell, or hold certain stocks.
Applications of Natural Language Processing (NLP)
• Questions Answering:
• NLP can be seen in action by using Google Search or Siri Services.
• A major use of NLP is to make search engines understand the meaning
of what we are asking and generate natural language in return to give
us the answers.
• Summarizing Information:
• On the internet, there is a lot of information, and a lot of it comes in
the form of long documents or articles.
• NLP is used to decipher the meaning of the data and then provides
shorter summaries of the data so that humans can comprehend it more
quickly.
NLP Future Enhancements
• Companies like Google are experimenting with Deep Neural Networks
(DNNs) to push the limits of NLP and make it possible for human-to-
machine interactions to feel just like human-to-human interactions.
• Basic words can be further subdivided into proper semantics and used
in NLP algorithms.
• NLP algorithms can be extended to languages that are currently not well supported, such as regional languages or languages spoken in rural areas.
• Translation of a sentence in one language to the same sentence in
another Language at a broader scope.
CLOUD COMPUTING
Cloud Computing
• What is Cloud?
• The term Cloud refers to a Network or the Internet.
• In other words, the Cloud is something which is present at a remote location.
• The Cloud can provide services over public and private networks, i.e., LAN, MAN, WAN, or VPN.
• Applications such as e-mail, web conferencing, and Customer Relationship Management (CRM) execute on the cloud.
What is Cloud Computing?
• Cloud computing is a model of delivering computing services over the internet,
where resources such as servers, storage, databases, software, and applications
are provided as a service to users.
Types of cloud computing
❖Public Cloud: The public cloud allows systems and services to be easily accessible to the general public. A public cloud may be less secure because of its openness.
Ex: Amazon Web Services, Microsoft Azure, Google Cloud Platform, etc.
Public clouds are scalable, on-demand, and pay-as-you-go, making them attractive for businesses of all sizes.

❖Private Cloud: The private cloud allows systems and services to be accessible within an organization. It is more secure because of its private nature.
Ex: Microsoft System Center, OpenStack, IBM Cloud Private, etc.
• Private clouds are often used by large enterprises, government agencies, and organizations with sensitive data.
Types of cloud computing
❖Community Cloud: The community cloud allows systems and services to be accessible by a
group of organizations.

Ex: 1. Microsoft Azure for Research: a community cloud for researchers to access cloud resources and collaborate.
2. AWS GovCloud: a community cloud for US government agencies to share resources and collaborate.
Types of cloud computing
❖Hybrid Cloud: A mixture of public and private cloud, in which the critical activities are
performed using private cloud while the non-critical activities are performed using public
cloud.

Ex: IBM Cloud and IBM Cloud Private: IBM Cloud Private is a hybrid cloud platform that
brings IBM Cloud services to on-premises environments.

Oracle Cloud and Oracle Cloud at Customer: Oracle Cloud at Customer is a hybrid cloud
service that brings Oracle Cloud infrastructure and services to on-premises environments.
Types of cloud services:
• Infrastructure-as-a-Service (IaaS)
• Provides virtualized computing resources, such as:
- Servers (e.g., Amazon EC2, Google Compute Engine)
- Storage (e.g., Amazon S3, Google Cloud Storage)
- Networking (e.g., virtual firewalls, load balancers)
- Users manage and configure resources, while the provider manages the underlying infrastructure.
Types of cloud services:
• Platform as a Service (PaaS):
- Provides a complete platform for developing, running, and managing applications, including:
- Tools (e.g., compilers, debuggers)
- Libraries (e.g., frameworks, databases)
- Infrastructure (e.g., servers, storage)
- Users focus on application development, while the provider manages the platform.
- Examples: Heroku, Google App Engine, Microsoft Azure.

Types of cloud services:
• Software-as-a-Service (SaaS): The SaaS model allows software applications to be used as a service by end-users.
- Users access applications through a web browser or API.
- Examples: Microsoft Office 365, Salesforce, Dropbox.

• Anything-as-a-Service (XaaS) is yet another service model, which includes Network-as-a-Service, Business-as-a-Service, Identity-as-a-Service, Database-as-a-Service, or Strategy-as-a-Service.
Other types of cloud services:
1. Function as a Service (FaaS): Provides a platform for running event-driven code, without managing servers or infrastructure.
- Examples: AWS Lambda, Google Cloud Functions, Azure Functions

2. Serverless Computing: A cloud computing model where the provider manages server resources, and users only pay for consumed resources.
- Examples: AWS Lambda, Google Cloud Run, Azure Functions
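As a minimal sketch of event-driven FaaS code, the handler below follows AWS Lambda's documented Python handler convention (an event payload plus a context object); the greeting logic itself is ours, purely for illustration:

import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload (e.g., an HTTP body or queue message);
    # the platform, not the user, provisions and manages the servers running this.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }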


Edge Computing
Edge Computing
• Edge computing allows devices in remote locations to process data at the "edge" of the network, either on the device itself or on a local server.
• When data needs to be processed in the central datacenter, only the most important data is transmitted, thereby minimizing latency.
• Businesses use edge computing to improve the response times of their remote devices and to get richer, more timely insights from device data.
• Edge devices include smart cameras, thermometers, robots, drones, vibration sensors, and other IoT devices.
An example of Edge Computing:
• A security camera in a remote warehouse uses AI to identify
suspicious activity and only sends that specific data to the main
datacenter for immediate processing.
• So, rather than the camera burdening the network 24 hours per day by
constantly transmitting all of its footage, it only sends relevant video
clips.
• This frees up the company's network bandwidth and compute
processing resources for other uses.
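A toy Python sketch of this edge-side filtering pattern follows; detect_suspicious and send_to_datacenter are hypothetical stand-ins for a real on-device model and network uplink:

import random

def detect_suspicious(frame):
    # Hypothetical stand-in for local AI inference running on the camera.
    return random.random()

def send_to_datacenter(frame, score):
    # Hypothetical stand-in for the uplink to the central datacenter.
    print(f"uploading clip {frame}, score={score:.2f}")

for frame in range(100):                  # simulate a stream of frames
    score = detect_suspicious(frame)      # inference happens at the edge
    if score > 0.9:                       # only suspicious footage leaves the device
        send_to_datacenter(frame, score)  # everything else is dropped locally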
Benefits of Edge Computing
Moving some data functions, like storage, processing, and analysis, away from the cloud and to the edge, closer to where data is generated, can offer several key benefits:
• More efficient operations
• Faster response times
• Heightened security
• Improved workplace safety
• Reduced IT costs, etc.

Benefits of Edge Computing
1. More efficient operations:
Edge computing helps enterprises optimize their day-to-day operations by rapidly processing large volumes of data at or near the local sites where that data is collected.
This is more efficient than sending all of the collected data to a centralized cloud or a primary datacenter several time zones away, which would cause excessive network delays and performance issues.
Benefits of Edge Computing
2. Faster response times:
Edge computing enables devices at or near a network's edge to instantly alert key personnel and equipment to mechanical failures, security threats, and other critical incidents so that swift action can be taken.
Benefits of Edge Computing
3. Heightened security:
For enterprises, the security risk of adding thousands of internet-connected sensors and devices to their network is a real concern.
Edge computing helps to mitigate this risk by allowing enterprises to process data locally and store it offline.
This decreases the data transmitted over the network and helps enterprises be less vulnerable to security threats.
Benefits of Edge Computing
4. Improved workplace safety:
In work environments where faulty equipment or changes to working conditions can cause injuries or worse, IoT sensors and edge computing can help keep people safe.
Example: On offshore oil rigs, oil pipelines, and other remote industrial use cases, predictive maintenance and real-time data analyzed at or close to the equipment site can help increase the safety of workers and minimize environmental impacts.
Benefits of Edge Computing
5. Reduced IT costs:
With edge computing, businesses can optimize their IT expenses by processing data locally rather than in the cloud.
Besides minimizing companies' cloud processing and storage costs, edge computing decreases transmission costs by weeding out unnecessary data at or near the location where it's collected.
Cloud Computing Vs Edge Computing
Let's use a few examples to highlight the differences between cloud computing and edge computing in terms of data processing location, latency, connectivity, and real-time decision making.
1. Smart Traffic Management:
Cloud Computing: Data from cameras and sensors is sent to the cloud for processing and analysis.
- Traffic light control decisions are made in the cloud and sent back to the traffic lights.
- Higher latency due to data transmission to the cloud.
Edge Computing: Data from cameras and sensors is processed and analyzed in real time at the edge (traffic lights or local servers).
- Traffic light control decisions are made locally, reducing latency and improving response time.
Cloud Computing Vs Edge Computing
2. Industrial Automation:
Cloud Computing: Sensor data is sent to the cloud for processing and analysis.
- Predictive maintenance insights are generated in the cloud and sent back to the factory.
- Higher latency and dependence on cloud connectivity.
Edge Computing: Sensor data is processed and analyzed in real time at the edge (local servers or machines).
- Predictive maintenance insights are generated locally, enabling faster response times and reduced downtime.
Cloud Computing Vs Edge Computing
3. Retail Customer Experience:
Cloud Computing: Customer behavior data is sent to the cloud for processing and analysis.
- Personalized recommendations are generated in the cloud and sent back to the store.
- Higher latency and dependence on cloud connectivity.
Edge Computing: Customer behavior data is processed and analyzed in real time at the edge (local servers or in-store analytics platforms).
- Personalized recommendations are generated locally, enabling faster response times and improved customer engagement.
Cloud Computing Vs Edge Computing
To conclude:
Cloud computing is ideal for applications requiring centralized processing, scalability, and data analytics,
while edge computing is suited for applications requiring real-time processing, low latency, and autonomy.
