Introduction to Deep Learning - Session 3: Ravi Shukla

This document provides an overview of several deep learning topics:
- The Keras Sequential and Functional APIs for building deep learning models: the Sequential API builds models layer by layer, while the Functional API connects layers to one another directly.
- LSTM models for stock price prediction and text generation.
- Word2Vec, an algorithm that produces word embeddings: dense vectors that capture semantic meaning based on word context.
- An introduction to BERT, a pre-trained language model based on bidirectional transformers; transfer learning with large pre-trained models like BERT has proven powerful for natural language processing tasks.

Introduction to Deep Learning - Session 3
Ravi Shukla
Agenda
• Assignment
• Questions from Previous Topics
• Keras Sequential and Functional API
• LSTM Stock Price Prediction
• LSTM Text Generation
• Word2Vec
• BERT – Introduction
Keras Sequential and Functional API
Sequential API
• The Sequential model API is a way of creating deep learning models where an instance of the Sequential class is created and layers are added to it one at a time.
• Layers can be defined and passed to the Sequential constructor as a list:
from keras.models import Sequential
from keras.layers import Dense
model = Sequential([Dense(2, input_dim=1), Dense(1)])

• Layers can also be added piecewise:
from keras.models import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(2, input_dim=1))
model.add(Dense(1))

• The Sequential API makes it hard to define models that have multiple input sources, produce multiple outputs, or re-use layers.
Functional API
• Models are defined by creating instances of layers and connecting them directly to each
other in pairs, then defining a Model that specifies the layers to act as the input and
output to the model.

• The input layer takes a shape argument, a tuple that indicates the dimensionality of the input data:
from keras.layers import Input
visible = Input(shape=(2,))

• The shape tuple does not include the mini-batch dimension; the batch size is supplied automatically when the data is split into batches during training, which is why a one-dimensional input is written with a hanging dimension, e.g. (2,).
Functional API
The layers in the model are connected pairwise.
This is done by specifying where the input comes from when defining each new layer.
A bracket notation is used: after a layer is created, it is called on the layer that provides its input.

from keras.layers import Input
from keras.layers import Dense
visible = Input(shape=(2,))
hidden = Dense(2)(visible)

Keras provides a Model class that you can use to create a model from your created layers. It requires only that you specify the input and output layers.
from keras.models import Model
from keras.layers import Input
from keras.layers import Dense
visible = Input(shape=(2,))
hidden = Dense(2)(visible)
model = Model(inputs=visible, outputs=hidden)
Example of Complex Model
from keras.utils import plot_model  # needed for plot_model() at the end
from keras.models import Model
from keras.layers import Input
from keras.layers import Dense
from keras.layers import Flatten
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import concatenate
# input layer
visible = Input(shape=(64,64,1))
# first feature extractor
conv1 = Conv2D(32, kernel_size=4, activation='relu')(visible)
pool1 = MaxPooling2D(pool_size=(2, 2))(conv1)
flat1 = Flatten()(pool1)
# second feature extractor
conv2 = Conv2D(16, kernel_size=8, activation='relu')(visible)
pool2 = MaxPooling2D(pool_size=(2, 2))(conv2)
flat2 = Flatten()(pool2)
# merge feature extractors
merge = concatenate([flat1, flat2])
# interpretation layer
hidden1 = Dense(10, activation='relu')(merge)
# prediction output
output = Dense(1, activation='sigmoid')(hidden1)
model = Model(inputs=visible, outputs=output)
# summarize layers
model.summary()
# plot graph
plot_model(model, to_file='shared_input_layer.png')
LSTM in action
Predicting Stock Prices
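The slide demonstrates this with a live example; the minimal sketch below is not that notebook but an illustration of the usual pattern, with a synthetic price series, a window of 20 past steps, and small layer sizes all chosen as assumptions: scale the series, slice it into fixed-length windows, and fit a small LSTM to predict the next value.

# Minimal sketch of next-step price prediction with an LSTM (illustrative values)
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Stand-in for real closing prices; replace with actual market data
prices = np.sin(np.linspace(0, 50, 500)) + np.linspace(0, 2, 500)
prices = (prices - prices.min()) / (prices.max() - prices.min())  # scale to [0, 1]

window = 20  # number of past steps used to predict the next one
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X.reshape((X.shape[0], window, 1))  # (samples, timesteps, features)

model = Sequential()
model.add(LSTM(32, input_shape=(window, 1)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# Predict the value following the most recent window
next_value = model.predict(prices[-window:].reshape(1, window, 1))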
Generating Text
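The text-generation demo is likewise shown live in the session; a minimal character-level sketch (the toy corpus, sequence length, and greedy sampling loop are all illustrative assumptions) looks roughly like this:

# Minimal sketch of character-level text generation with an LSTM (illustrative)
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.utils import to_categorical

text = "deep learning with keras " * 50  # assumed toy corpus
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for c, i in char_to_idx.items()}

seq_len = 10
X = [[char_to_idx[c] for c in text[i:i + seq_len]] for i in range(len(text) - seq_len)]
y = [char_to_idx[text[i + seq_len]] for i in range(len(text) - seq_len)]
X = to_categorical(X, num_classes=len(chars))  # (samples, seq_len, vocab)
y = to_categorical(y, num_classes=len(chars))

model = Sequential()
model.add(LSTM(64, input_shape=(seq_len, len(chars))))
model.add(Dense(len(chars), activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Generate new characters by repeatedly predicting the most likely next one
seed = text[:seq_len]
for _ in range(40):
    x = to_categorical([[char_to_idx[c] for c in seed[-seq_len:]]], num_classes=len(chars))
    seed += idx_to_char[int(np.argmax(model.predict(x, verbose=0)))]
print(seed)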
Word Vectors
Problems with Traditional NLP techniques

• Words are treated as atomic symbols: Hotel, Motel, Food
• In a vector-space representation, each word is a 1 surrounded by a lot of zeroes:
[00000000100000000000000000000000000]
• Dimensionality: ~20K words even for a speech vocabulary
• This is called a one-hot representation, and in it
Hotel [00000000100000000000000000000000000] is not similar to
Motel [00000000000000000000001000000000000]: every pair of distinct one-hot vectors is orthogonal (a quick numerical check follows below).
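A small illustrative sketch of that orthogonality claim, with a made-up 35-word vocabulary and arbitrary one-hot positions:

import numpy as np

vocab_size = 35  # illustrative vocabulary size
hotel = np.zeros(vocab_size); hotel[8] = 1   # arbitrary one-hot index
motel = np.zeros(vocab_size); motel[22] = 1

# The dot product (and hence cosine similarity) of two distinct one-hot vectors is 0,
# so this representation encodes no similarity between related words
print(hotel @ motel)  # 0.0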
Represent a word by its neighbors

• "You shall know a word by the company it keeps" – John R. Firth (linguist)
Word Embedding or Word Vector
Word Vectors and Their Relationship with Deep Learning

Word vectors are used as a "secret sauce" for NLP tasks using deep learning, for example:
• Sentiment Analysis
• Bilingual Word Embeddings
Applying Word2Vec on Some Data
(the slides apply Word2Vec to a sample text corpus)

Five closest words to "cracked": crack, broken, water, spill, shattered
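The slides report this result from a trained model; a hedged sketch of how such a query is typically made with the gensim library (using gensim 4.x parameter names; the toy corpus, vector size, and hyperparameters below are assumptions, so the neighbours it returns will differ from the slide's):

# Minimal sketch: train Word2Vec on tokenized sentences and query nearest neighbours
from gensim.models import Word2Vec

sentences = [  # assumed toy corpus; the session uses a larger text dataset
    ["the", "glass", "cracked", "and", "shattered", "on", "the", "floor"],
    ["water", "leaked", "through", "the", "cracked", "pipe"],
    ["he", "repaired", "the", "broken", "window"],
]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Each word is now a dense 50-dimensional vector; words in similar contexts get similar vectors
print(model.wv["cracked"][:5])
print(model.wv.most_similar("cracked", topn=5))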
BERT
BERT (Power of Transfer Learning)
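The deck introduces BERT at a high level only; as a hedged illustration of the transfer-learning idea (the Hugging Face transformers library and the bert-base-uncased checkpoint are assumptions, not something the slides prescribe), a pre-trained BERT can be loaded and its contextual embeddings reused for downstream tasks:

# Minimal sketch: reuse a pre-trained BERT as a feature extractor (transfer learning)
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("The hotel was cracked but the motel was fine.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings: one vector per token; unlike Word2Vec, the same word
# gets different vectors in different sentence contexts
token_embeddings = outputs.last_hidden_state       # shape: (1, num_tokens, 768)
sentence_embedding = token_embeddings.mean(dim=1)  # simple pooled sentence vector
print(token_embeddings.shape, sentence_embedding.shape)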
