Biology Vs ANN

This document discusses the biological motivation for artificial neural networks. It describes how biological neural networks in the brain, composed of interconnected neurons, provide the ability to learn and process information rapidly in parallel. The document outlines key aspects of biological neural systems like neurophysiology, neurobiology, neuron structure and function, synaptic transmission, and how neurons communicate via action potentials. It notes the goals of neural computation are to understand how the brain works, explore a new style of computation inspired by biology, and solve practical problems using novel learning algorithms.


Machine Learning Srihari

Neural Networks: Biological Motivation
Sargur Srihari


Two Groups Of Researchers in Neural Networks

1. Study and model biological learning
•  Networks of neurons in the brain give people the ability to assimilate information
•  Will simulations of such networks reveal the underlying mechanisms of learning?
2. Obtain highly effective learning machines
•  Biological realism imposes unnecessary constraints
•  Primarily the multilayer perceptron


Neural Computation
Biological Motivation for Artificial Neural Networks


Biological Motivation

•  The study of neural computation is inspired by the observation that:
•  Biological learning systems are built of very complex webs of interconnected neurons
•  Each unit takes real-valued inputs (possibly from other units)
•  and produces a single real-valued output (which becomes the input to many other units)


Switching Time

•  Human Brain
•  Densely interconnected network of 10^11 (100 billion) neurons
•  Each connected to 10^4 (10,000) others
•  Fastest neuron switching time is 10^-3 seconds
•  Activity is excited or inhibited through connections to other neurons
•  Slow compared to computer switching speed: 10^-10 secs


Human Information Processing Speed

•  Humans can make certain decisions (e.g., visually recognizing your mother) in 10^-1 secs
•  This implies that a 10^-1 sec interval cannot possibly contain more than a few hundred serial steps, given the switching speed
•  Therefore
•  the information-processing abilities of biological systems must follow from highly parallel operations distributed over many neurons
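The hundred-step argument can be checked with back-of-the-envelope arithmetic (a sketch in Python, using the slide's order-of-magnitude figures):

```python
# Order-of-magnitude figures taken from the slides
switching_time = 1e-3   # fastest neuron switching time (seconds)
decision_time = 1e-1    # time to visually recognize a familiar face (seconds)

# Upper bound on the number of strictly serial processing steps
# that fit inside the decision interval
max_serial_steps = round(decision_time / switching_time)
print(max_serial_steps)  # -> 100
```

A few hundred serial steps are far too few for any step-by-step algorithm of interesting complexity, which is why the slide concludes that the brain's power must come from massive parallelism.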


Neurophysiology

Receptors:
Rods and Cones of eyes,
Pain, touch, hot and cold receptors of skin,
Stretch receptors of muscles
Effectors:
Muscles and glands, speech generators

Neurobiology

•  Basic morphology of neurons, including axons, dendrites, cell bodies (somata), and synapses
•  Chemical transmitters at synapses, and how the conduction of nerve impulses is affected by the actions of various ions in and around the cells


Neuron is a cell

The neuron's nucleus is contained in the soma, or body, of the cell


More than 10 billion neurons in human brain

Network of Neurons

Photomicrograph of a section of cerebral cortex


Dendrites

•  Dendrites form a fine filamentary bush
•  each fiber is thinner than an axon


Axon

•  A long thin cylinder carrying impulses from the soma to other cells
•  Splits into endbulbs
•  almost touching dendrites
•  The place of near contact is a synapse


Inter-Neuronal Transmission

•  Impulses reaching a synapse
•  set up graded electrical signals in the dendrites of the neuron on which the synapse impinges
•  Inter-neuronal transmission
•  is sometimes electrical
•  but is usually by diffusion of chemical transmitters


Synapses
•  When a spike travels along an axon and arrives at a synapse it causes
vesicles of transmitter chemical to be released
•  There are several kinds of transmitter
•  The transmitter molecules diffuse across the synaptic cleft and bind to
receptor molecules in the membrane of the post-synaptic neuron thus
changing their shape.
•  This opens up holes that allow specific ions in or out.
•  The effectiveness of the synapse can be changed by
•  varying the number of vesicles of transmitter
•  varying the number of receptor molecules
•  Synapses are slow, but they have advantages over RAM
•  Very small
•  They adapt using locally available signals (but how?)


Chemical synapse operation

•  The transmitting neuron, or presynaptic cell,
•  liberates a transmitter substance that diffuses across the synaptic junction
•  The electrical signal is converted to a chemical signal
•  which changes the postsynaptic cell's membrane potential
•  The chemical signal is converted back to an electrical signal


Nerve Impulse Waveform


As it appears on an oscilloscope when a microelectrode is placed near an axon.
Action potential: an all-or-none electric potential that can travel a meter or more and trigger electrochemical coupling.

[Figure: membrane potential (mV, inside vs. outside of the membrane) plotted against time (msec, 0-4 shown), rising from the -70 mV resting level to about +20 mV at the spike peak.]

Neuron Firing

•  A neuron will fire an electrical impulse along its axon only if sufficient impulses reach the endbulbs impinging on its dendrites within a short period of time, called the period of latent summation


Excitatory and Inhibitory Impulses

•  Impulses may help or hinder the firing of an impulse
•  Excitation must exceed inhibition by a critical amount called the threshold of the neuron
•  A neuron fires only if the total weight of the synapses that receive impulses during the period of latent summation exceeds the threshold
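The firing rule above can be sketched as a simple threshold unit (the weights and threshold here are made-up illustrative values, not biological measurements):

```python
def fires(weights, impulses, threshold):
    """Return True if the weighted sum of impulses received during the
    period of latent summation exceeds the neuron's threshold.

    weights  -- synaptic weights (positive = excitatory, negative = inhibitory)
    impulses -- 1 if that synapse received an impulse in the period, else 0
    """
    total = sum(w * x for w, x in zip(weights, impulses))
    return total > threshold

# Hypothetical neuron: two excitatory synapses, one inhibitory, threshold 0.5
weights = [0.6, 0.4, -0.7]
print(fires(weights, [1, 1, 0], 0.5))  # excitation 1.0 > 0.5    -> True
print(fires(weights, [1, 1, 1], 0.5))  # net input 0.3 <= 0.5    -> False
```

The second call shows inhibition at work: the inhibitory synapse pulls the net input below threshold, so the neuron stays silent.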


The goals of neural computation


•  To understand how the brain actually works
•  It's big and very complicated, and made of yucky stuff that dies when you poke it around
•  To understand a new style of computation
•  Inspired by neurons and their adaptive connections
•  A very different style from sequential computation
•  should be good for things that brains are good at (e.g. vision)
•  should be bad for things that brains are bad at (e.g. 23 x 71)
•  To solve practical problems by using novel learning algorithms
•  Learning algorithms can be very useful even if they have nothing to do with how the brain works


Idealization of a Neuron


ANN


•  ANNs are built of a
•  densely interconnected set of simple units
•  each unit
•  takes several real-valued inputs
•  produces a single real-valued output
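A common idealization of such a unit is a weighted sum passed through a squashing function (a minimal sketch, assuming a sigmoid activation; the weights are illustrative, not from the slides):

```python
import math

def unit_output(inputs, weights, bias):
    """A simple ANN unit: several real-valued inputs in, one real-valued
    output out. That output can then feed into many other units."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid squashes to (0, 1)

# Illustrative weights; the output of one unit becomes an input to the next
y1 = unit_output([0.5, -1.2], [0.8, 0.3], bias=0.1)
y2 = unit_output([y1, 0.9], [1.5, -0.4], bias=0.0)
print(0.0 < y2 < 1.0)  # -> True
```

Chaining `y1` into the second call mirrors the slide's point that each unit's single output becomes an input to many other units.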


Common ANN


ANNs

•  One motivation is to capture highly parallel computation over distributed processes
•  Most ANN software runs on sequential machines emulating distributed processes


Use of ANNs

•  A general, practical method
•  A robust approach
•  Used to learn functions that are
•  real-valued,
•  discrete-valued, or
•  vector-valued


Limitations of Neural Networks

•  Need a substantial number of training samples
•  Slow learning (convergence times)
•  Inadequate parameter-selection techniques, which can lead to poor minima


Three Mechanisms of Convolutional Neural Networks

1.  Local Receptive Fields
2.  Subsampling
3.  Weight Sharing


Convolution and Sub-sampling


•  Instead of treating the input with a fully connected network, two layers of neural network units are used:
1.  A layer of convolutional units, which consider overlapping 5 x 5 pixel regions of the input image
2.  A layer of subsampling units
•  Several feature maps and sub-sampling units are used
•  The gradual reduction of spatial resolution is compensated by an increasing number of features
•  The final layer has a softmax output
•  The whole network is trained using backpropagation

[Figure: input image -> convolutional plane of 10 x 10 = 100 units (called a feature map), each unit looking at a 5 x 5 pixel patch -> 2 x 2 subsampling down to 5 x 5 units. The weights are shared across the plane, so only 25 weights are needed; due to weight sharing this is equivalent to convolution. Different features have different feature maps.]
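The weight-sharing arithmetic can be illustrated with a minimal pure-Python sketch (illustrative sizes: a hypothetical 14 x 14 input, so 5 x 5 patches yield a 10 x 10 feature map, and 2 x 2 subsampling then yields 5 x 5 units):

```python
def conv2d(image, kernel):
    """Valid convolution: slide one shared kernel over all patches,
    so the whole output plane is defined by the kernel's weights alone."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(kernel[i][j] * image[r + i][c + j]
                 for i in range(kh) for j in range(kw))
             for c in range(out_w)] for r in range(out_h)]

def subsample2x2(fmap):
    """Average each non-overlapping 2 x 2 block (one form of subsampling)."""
    return [[(fmap[r][c] + fmap[r][c + 1] + fmap[r + 1][c] + fmap[r + 1][c + 1]) / 4.0
             for c in range(0, len(fmap[0]), 2)]
            for r in range(0, len(fmap), 2)]

image = [[1.0] * 14 for _ in range(14)]   # toy 14 x 14 input
kernel = [[0.04] * 5 for _ in range(5)]   # one shared 5 x 5 kernel: 25 weights
fmap = conv2d(image, kernel)              # 10 x 10 = 100 units, all sharing 25 weights
sub = subsample2x2(fmap)                  # 5 x 5 units after subsampling
print(len(fmap), len(fmap[0]))            # -> 10 10
print(len(sub), len(sub[0]))              # -> 5 5
```

The 100 feature-map units are parameterized by just the 25 kernel weights, which is the saving that weight sharing buys over a fully connected layer.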


ConvNet Inspired by Visual Neuroscience


•  Classic notions of simple cells and complex cells
•  Architecture is similar to the LGN-V1-V2-V4-IT hierarchy in the ventral pathway of the visual cortex
•  LGN: the lateral geniculate nucleus, which receives input from the retina
•  There are 30 different areas of visual cortex; V1 and V2 are the principal ones
•  The infero-temporal (IT) cortex performs object recognition

