W1 ANN
A basic example of using neural networks is predicting house prices based on features like
house size.
A linear regression model fits a straight line to the data, but neural networks allow more
flexible, non-linear fits.
ReLU (Rectified Linear Unit) is an activation function commonly used in neural networks. It
clips negative values to zero, making the neuron's output non-linear:
\[ f(x) = \max(0, w \cdot x + b) \]
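To make the formula concrete, here is a minimal Python sketch of a single ReLU neuron
predicting price from house size. The weight w and bias b are made-up placeholder values,
not fitted parameters.

```python
import numpy as np

def relu_neuron(x, w, b):
    # f(x) = max(0, w*x + b): linear in x, but clipped at zero,
    # so the predicted price can never go negative
    return np.maximum(0.0, w * x + b)

sizes = np.array([500.0, 1000.0, 2000.0, 3000.0])  # house sizes (sq ft)
w, b = 0.1, -40.0                                  # hypothetical parameters
print(relu_neuron(sizes, w, b))                    # [ 10.  60. 160. 260.]
```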
Stacking Neurons: Larger neural networks are created by connecting multiple neurons (like
stacking LEGO bricks).
With more features (e.g., number of bedrooms, zip code, wealth), each feature is
represented by a "node" or neuron in the network.
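As a rough illustration of stacking, the sketch below wires four input features into a small
hidden layer of ReLU neurons whose outputs feed one final price neuron. All weights here are
random placeholders, not trained values.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([2000.0, 3.0, 94301.0, 7.0])      # size, bedrooms, zip code, wealth

W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)  # 4 inputs -> 3 hidden ReLU neurons
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)  # 3 hidden neurons -> 1 output

h = np.maximum(0.0, W1 @ x + b1)  # hidden layer: each unit is one ReLU neuron
y = W2 @ h + b2                   # output neuron combines the hidden outputs
print(h, y)
```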
Types of Neural Networks:
Standard Neural Networks: Used for general problems (e.g., price prediction).
Convolutional Neural Networks (CNNs): Specialized for image data, extracting spatial
features.
Recurrent Neural Networks (RNNs): Designed for sequential data, such as audio or text,
where time or sequence matters.
Although the core principles of neural networks have existed for decades, they only recently
became practical for real-world applications. The success of deep learning today is driven by
three main factors: data, computation, and algorithmic innovation.
1. Data: Traditionally, algorithms like Support Vector Machines (SVMs) and logistic regression
perform well on smaller datasets, but their performance plateaus when given large amounts
of data. In contrast, deep neural networks thrive on large datasets. Over the past 20 years,
the digitization of society (online activity, IoT devices, smartphones, and sensors) has led
to an explosion of digital data, enabling neural networks to surpass traditional algorithms
by taking full advantage of this abundance.