Control System Analysis
Dr. Mbazingwa E. Mkiramweni
([email protected], 0759069772)
Dar es Salaam Institute of Technology (DIT)
Classical Control
• Introduction
• Mathematical Models of Systems
• System Analysis
• Time Domain Analysis
• Frequency Domain Analysis
• Root Locus
• System Design
• Compensation Techniques
• PID Control

Modern Control
• State Space Modelling
• Eigenvalue Analysis
• Observability and Controllability
• So
• State Space to Transfer Function
• Transfer Function to State Space
• Direct Decomposition of Transfer Function
• Cascade Decomposition of Transfer Function
• Parallel Decomposition of Transfer Function
• State Space Design Techniques
Recommended Textbooks
Chapter 1: Introduction to Control Systems
A control system is a system that controls the operation of another system, or one that can regulate itself and another system. More formally, a control system is a device, or set of devices, that manages, commands, directs, or regulates the behaviour of other devices or systems.
Definitions
System – An interconnection of elements and devices for a
desired purpose.
Control System – An interconnection of components forming a
system configuration that will provide a desired response.
An automatic control system is a combination of components that act together so that the overall system behaves automatically in a prespecified, desired manner.
Process – The device, plant, or system under control. The input-output relationship represents the cause-and-effect relationship of the process.

Input → Process → Output
Definitions
Controlled Variable – The quantity or condition that is measured and controlled. Normally the controlled variable is the output of the control system.
Manipulated Variable – The quantity or condition that is varied by the controller so as to affect the value of the controlled variable.
Control – Control means measuring the value of the controlled variable of the system and applying the manipulated variable to the system to correct or limit the deviation of the measured value from a desired value.
Definitions
Set point (reference input) → Controller → Manipulated Variable → Process → Output (Controlled Variable)
Disturbance – A disturbance is a signal that tends to adversely affect the value of the output of a system. It is an unwanted input to the system.
• If a disturbance is generated within the system, it is called an internal disturbance, while an external disturbance is generated outside the system.
The interaction between a system and its environment is defined in terms of three variables:
i. System input
ii. System output
iii. Environmental disturbances
Types of Control System
Natural Control System
Universe
Human Body
Types of Control System
Manmade Control System
Aeroplanes
Chemical Process
Types of Control System
Manual Control Systems
Room Temperature regulation Via Electric Fan
Water Level Control
Automatic Control System
Home Water Heating Systems (Geysers)
Room Temperature regulation Via A.C
Human Body Temperature Control
Types of Control System
Manual Vs Automatic Control
Control is a process of causing a system variable such as temperature or position to conform to some desired value or trajectory, called the reference value or trajectory.
For example, driving a car implies controlling the vehicle to follow the desired path to arrive safely at a planned destination.
i. If you are driving the car yourself, you are performing manual control of the car.
ii. If you design a machine, or use a computer to do it, then you have built an automatic control system.
Types of Control System
Open-Loop Control Systems
Open-Loop Control Systems utilize a controller or control actuator to obtain the desired response.
• The output has no effect on the control action.
• In other words, the output is neither measured nor fed back.

Input → Controller → Process → Output

Examples: washing machine, toaster, electric fan, microwave oven, etc.
Types of Control System
Open-Loop Control Systems
• Since in open-loop control systems the reference input is not compared with the measured output, for each reference input there is a fixed operating condition. Therefore, the accuracy of the system depends on calibration.
• The performance of an open-loop system is severely affected by the presence of disturbances, or by variations in operating/environmental conditions.
Types of Control System
Closed-Loop Control Systems
Closed-Loop Control Systems utilize feedback to compare the actual output to the desired output response.

Input → Comparator → Controller → Process → Output, with a Measurement block feeding the output back to the comparator.

Examples: refrigerator, electric iron, air conditioner.
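As a minimal sketch of the difference between the two schemes, the following simulates a simple first-order plant under a constant disturbance, once with a calibrated open-loop input and once with proportional feedback. All numbers (plant parameters, gain, disturbance) are illustrative assumptions, not taken from the slides.

```python
# First-order plant: dy/dt = -a*y + b*u + d, where d is a constant
# disturbance. Open loop uses a fixed calibrated input; closed loop
# acts on the error between set point and measured output.

a, b = 1.0, 1.0          # illustrative plant parameters
dt, steps = 0.01, 2000   # Euler integration step and horizon
setpoint = 1.0           # desired output (reference input)
d = 0.3                  # constant disturbance entering the plant

def simulate(closed_loop, Kp=10.0):
    y = 0.0
    for _ in range(steps):
        if closed_loop:
            u = Kp * (setpoint - y)   # feedback: act on the error
        else:
            u = setpoint * a / b      # calibration-based open-loop input
        y += dt * (-a * y + b * u + d)  # Euler step of the plant
    return y

y_open = simulate(False)
y_closed = simulate(True)
# Open loop: the disturbance shifts the output fully away from the
# set point. Closed loop: the steady-state error shrinks as Kp grows.
```

With these numbers the open-loop output settles near 1.3 (shifted by the whole disturbance), while the feedback loop settles much closer to the set point; raising the hypothetical gain Kp shrinks the remaining error further.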
Types of Control System
Multivariable Control System
A multivariable control system has several controlled outputs (e.g. temperature, humidity, pressure):

Set points → Comparator → Controller → Process → Outputs (Temp, Humidity, Pressure), with Measurements of each output fed back to the comparator.
Types of Control System
Feedback Control System
• A system that maintains a prescribed relationship between the
output and some reference input by comparing them and using the
difference (i.e. error) as a means of control is called a feedback
control system.
Input → (+) → error → Controller → Process → Output, with the output fed back to the (−) input of the comparator.

• Feedback can be positive or negative.
Types of Control System
Servo System
• A Servo System (or servomechanism) is a feedback control
system in which the output is some mechanical position, velocity
or acceleration.
Antenna Positioning System
Types of Control System
Linear Vs Nonlinear Control System
• A Control System in which output varies linearly with the input is
called a linear control system.
u(t) → Process → y(t)

Examples of linear input-output relations (plots of y(t) against u(t) are straight lines):
y(t) = 3u(t) + 5
y(t) = -2u(t) + 1
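Strictly speaking, a map is linear when it satisfies superposition, f(a·u1 + u2) = a·f(u1) + f(u2); a straight line with a non-zero offset, such as y = 3u + 5, has a linear graph but fails this stricter test. A small sketch of the check (the candidate maps and test values are illustrative assumptions):

```python
# Superposition check: f is linear iff f(a*u1 + u2) == a*f(u1) + f(u2)
# for all inputs. We probe one combination with fixed test values.

def is_linear(f, u1=2.0, u2=3.0, a=5.0, tol=1e-9):
    return abs(f(a * u1 + u2) - (a * f(u1) + f(u2))) < tol

print(is_linear(lambda u: 3 * u))       # y = 3u: passes superposition
print(is_linear(lambda u: 3 * u + 5))   # y = 3u + 5: offset breaks it
print(is_linear(lambda u: u ** 2))      # y = u^2: nonlinear
```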
Types of Control System
Linear Vs Nonlinear Control System
• When the input and output have a nonlinear relationship, the system is said to be nonlinear.

(Figure: adhesion characteristics of a road, plotting adhesion coefficient against creep; the relationship is nonlinear.)
Types of Control System
Time invariant vs Time variant
• When the characteristics of the system do not depend on time itself, the system is said to be a time-invariant control system.

y(t) = 2u(t) + 1

• A time-varying control system is a system in which one or more parameters vary with time.

y(t) = 2u(t) + 3t
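For the memoryless examples above, time invariance reduces to whether the input-output map depends explicitly on t: delaying the input must simply delay the output. A sketch under that assumption (the input signal and test values are illustrative):

```python
# Time invariance: the response to a T-delayed input equals the
# T-delayed response. For memoryless maps y = f(u, t) this holds
# exactly when f has no explicit dependence on t.

def ti_system(u, t):
    return 2 * u + 1          # y(t) = 2u(t) + 1, no explicit t

def tv_system(u, t):
    return 2 * u + 3 * t      # y(t) = 2u(t) + 3t, explicit t dependence

u = lambda t: t ** 2          # an illustrative input signal
T = 4.0                       # time shift
t = 1.5                       # evaluation instant

# Compare: output at time t for the delayed input, versus the
# undelayed output evaluated at time t - T.
ti_ok = ti_system(u(t - T), t) == ti_system(u(t - T), t - T)
tv_ok = tv_system(u(t - T), t) == tv_system(u(t - T), t - T)
print(ti_ok)   # time-invariant system passes
print(tv_ok)   # time-varying system fails
```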
Types of Control System
Continuous Data Vs Discrete Data System
• In a continuous-data control system, all system variables are functions of a continuous time t.
x(t)
• A discrete-time control system involves one or more variables that are known only at discrete time instants.

x[n]
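A discrete-time sequence x[n] is commonly obtained by sampling a continuous-time signal x(t) every Ts seconds, so that x[n] = x(nTs). A brief sketch (the 1 Hz sinusoid and 10 Hz sampling rate are illustrative assumptions):

```python
# Sampling a continuous-time signal x(t) to form the discrete-time
# sequence x[n] = x(n * Ts), defined only at the sample instants.
import math

def x_continuous(t):
    return math.sin(2 * math.pi * 1.0 * t)   # 1 Hz sinusoid

Ts = 0.1                                      # sampling period (10 Hz)
x_discrete = [x_continuous(n * Ts) for n in range(10)]
# x_discrete[n] holds the value of x(t) at t = n * Ts
```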
Types of Control System
Deterministic vs Stochastic Control System
• A control system is deterministic if the response to a given input is predictable and repeatable.

(Plots: a deterministic input x(t) and its output y(t) against time t.)

• If not, the control system is a stochastic control system.

(Plot: a stochastic signal z(t).)
Classification of Control Systems
Control Systems
• Natural
• Man-made
  • Manual
  • Automatic
    • Open-loop
      • Linear: time variant or time invariant
      • Non-linear: time variant or time invariant
    • Closed-loop
      • Linear: time variant or time invariant
      • Non-linear: time variant or time invariant

(The linear, time-invariant branches are the LTI control systems: linear time-invariant control systems.)
Examples of Control Systems
Water-level float regulator
Purpose of Control Systems
i. Power Amplification (Gain)
Positioning of a large radar antenna by low-power rotation of a knob
ii. Remote Control
Robotic arm used to pick up radioactive materials
iii. Convenience of Input Form
Changing room temperature by thermostat position
iv. Compensation for Disturbances
Controlling antenna position in the presence of large wind disturbance torque
Historical Developments
i. Ancient Greece (300 to 1 BC)
Water float regulation, water clock, automatic oil lamp
ii. Cornelis Drebbel (17th century)
Temperature control
iii. James Watt (18th century)
Flyball governor
iv. Late 19th to mid 20th century
Modern control theory
Watt’s Flyball Governor
Human System
The Vitruvian Man
Human System
i. Pancreas
Regulates blood glucose level
ii. Adrenaline
Automatically generated to increase the heart rate and oxygen in times of flight
iii. Eye
Follows a moving object
iv. Hand
Picks up an object and places it at a predetermined location
v. Temperature
Regulated at 36°C to 37°C
History
18th Century James Watt’s centrifugal governor for the speed control of a steam
engine.
1920s Minorsky worked on automatic controllers for steering ships.
1930s Nyquist developed a method for analyzing the stability of controlled systems.
1940s Frequency-response methods made it possible to design linear closed-loop control systems.
1950s The root-locus method due to Evans was fully developed.
1960s State-space methods, optimal control, and adaptive control were developed.
1980s Learning control began to be investigated and developed.
Present: on-going research. Recent applications of modern control theory include non-engineering systems such as biological, biomedical, economic, and socio-economic systems.
Control System Components
i. System, plant, or process
To be controlled
ii. Actuators
Convert the control signal to a power signal
iii. Sensors
Provide measurement of the system output
iv. Reference input
Represents the desired output
General Control System
Set-point (reference input) → (+) → Error Signal → Controller → Actuator → Manipulated Variable → Process → Actual Output (Controlled Variable)

A disturbance enters at the process; a Sensor measures the actual output and returns the Feedback Signal to the (−) input of the comparator.
Examples of Modern Control Systems
(a) Automobile steering
control system.
(b) The driver uses the
difference between the
actual and the desired
direction of travel
to generate a controlled
adjustment of the
steering wheel.
(c) Typical direction-of-
travel response.
The Future of Control Systems
Questions??