EMI Unit 1 Notes Latest

Introduction

• The measurement of any quantity plays a very important role not only in science but also in all branches of engineering, medicine, and almost all day-to-day human activities.
• The technology of measurement is the basis of the advancement of science. The role of science and engineering is to discover new phenomena, new relationships, and the laws of nature, and to apply these discoveries to human as well as other scientific needs.
• Science and engineering are also responsible for the design of new equipment. The operation, control, and maintenance of such equipment and processes are also among the important functions of the science and engineering branches.
• All these activities are based on the proper measurement and recording of physical, chemical, mechanical, optical, and many other types of parameters.
• A good measuring instrument and measurement procedure minimize the error. The measuring instrument should not affect the quantity to be measured.
• The measurement of any electronic or electrical quantity or variable is termed an electronic measurement.
Instrumentation: Instrumentation is the use of measuring instruments to monitor and control a process. It is the art and science of measurement and control of process variables within a production, laboratory, or manufacturing area.

Instrument: A device for determining the value or magnitude of a quantity or a variable.
MEASURING INSTRUMENTS
“The device used for comparing the unknown quantity with the unit of measurement or standard quantity is called a Measuring Instrument.”
OR
“An instrument may be defined as a machine or system which is designed to maintain a functional relationship between prescribed properties of physical variables and could include means of communication to a human observer.”
CLASSIFICATION OF INSTRUMENTS
Electrical instruments may be divided into two categories:
1. Absolute instruments
2. Secondary instruments
- Absolute instruments give the quantity to be measured in terms of an instrument constant and its deflection.
- In secondary instruments, the deflection directly gives the magnitude of the electrical quantity to be measured. These instruments must be calibrated by comparison with a standard instrument before being put into use.
Advantages of Instrumentation Systems

• Remote measurement.
• Accurate measurement.
• Measurement in adverse conditions: nuclear reactors, space
applications, etc.
• Convenience: recording of data, printout, etc.
• Reduction in size.
Few Definitions
Measurement: The act, or the result, of a quantitative comparison between a predetermined standard and an unknown magnitude. Two quantities are compared and the result is expressed as a numerical value.

Measurand: The physical quantity or characteristic condition which is the object of measurement in an instrumentation system is termed the measurand, measurement variable, or process variable.
e.g. Fundamental quantities: length, mass, time, etc.
Derived quantities: speed, velocity, pressure, etc.

Process of comparison: Standard + Unknown quantity (measurand, the qty. to be measured) → Comparison → Result (read-out)
Significance of Measurement
“When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind.” – Lord Kelvin

Measurement confirms the validity of a hypothesis and also adds to its understanding. This eventually leads to new discoveries that require new and sophisticated measuring techniques.

Through measurement, a product can be designed or a process operated with maximum efficiency, minimum cost, and the desired degree of reliability and maintainability.
Measured value: Any value or reading obtained from a measurement system or measuring instrument.

True value: The value of the actual quantity, known as the true value or actual value.
e.g. a motor's actual speed.
(True value → Measuring instrument → Measured value)

Error: Any deviation of the measured value from the true value.
Error = Measured value − True value
Method of Measurement

Direct method: The unknown quantity (measurand) is directly compared against a standard. The result is expressed as a numerical number and a unit. Direct methods are common for the measurement of physical quantities like length, mass, and time.

Indirect method: The comparison with a standard is done through the use of a calibrated system. These methods are used in cases where the desired parameter cannot conveniently be measured directly, e.g. acceleration, power.
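As a minimal sketch of the indirect method, power can be obtained from two direct measurements (voltage and current); the function name and the numbers are illustrative, not from the notes.

```python
def power_indirect(voltage_v, current_a):
    """Indirect measurement: power is computed from two
    directly measured quantities, voltage and current."""
    return voltage_v * current_a

# A 230 V supply delivering 2 A corresponds to 460 W.
print(power_indirect(230.0, 2.0))
```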
General-purpose measuring system

• Measurand: The quantity to be measured.
• Transducer: A device that converts a physical quantity into an electrical quantity, or vice versa.
• Signal conditioner: Performs amplification, filtering, modulation, demodulation, A/D conversion, etc.
• Display/Record: The quantity is recorded using X–Y or strip-chart recorders, or displayed on monitors, etc.
Characteristics of Instruments
The performance of an instrument is described by means of quantitative qualities termed characteristics. These are broken down into:
1. Static characteristics: These pertain to a system where the quantities to be measured are constant or vary slowly with time.
2. Dynamic characteristics: Performance criteria based on dynamic relations (involving rapidly varying quantities).
The various static characteristics are:
i) Accuracy
ii) Precision
iii) Sensitivity
iv) Resolution
Accuracy
• Accuracy is the closeness with which an instrument reading approaches the true value of the variable being measured.
• Accuracy, in other words, indicates the maximum error which will not be exceeded, as assured by the manufacturer of the instrument.
• If the accuracy of a 100 V voltmeter is ±1%, the maximum error for any reading will not exceed ±1 V.
• The term accuracy can be explained as ‘conforming to truth’.
Precision
• The meaning may be given as ‘sharply or clearly defined’.
• It is the measure of the degree to which a particular parameter is measured. This term is also distinctly different from the term ‘reproducibility’.
• A voltage reading expressed as 75.2347 V is a precise value. Precision therefore indicates the number of decimal places to which a particular quantity can be measured. But precision does not guarantee accuracy.
• The difference between the two terms must be understood clearly.
• For example, π = 3.14 is a correct or true value and can be called accurate. π = 3.1415926 is a precise as well as an accurate value. However, if the value of π is given as 3.2415926, it is still a precise value, because it is expressed to many decimal places, but it is not accurate. Thus, precision does not guarantee accuracy.
• Example: Consider 100 V, 101 V, 102 V, 103 V and 105 V as different voltage readings taken by a voltmeter. The readings are close to each other but not exactly the same, because of error. Since the readings are close to each other, we say that the readings are precise.
Resolution
• Resolution is the smallest change in the measured value to which the
instrument can respond. It is the smallest change the instrument can
measure.
• For example, a 100 V voltmeter may not be able to measure 100 mV. Only when the input reaches 0.5 V does the needle deflect or the reading change from 0; any input, or change in input, of less than 0.5 V has no effect on the instrument. The resolution of that particular instrument is therefore 0.5 V.
Sensitivity
• Sensitivity of an instrument indicates the capacity of the instrument to respond truly to a change in the input with a corresponding change in the output.
• For a voltmeter, sensitivity is referred to as Δqο/Δqi, the ratio of the change in the output to the change in the input. If the input voltage changes by a few millivolts, the output should also change by the same amount in the ideal case.
• The ratio of the change in the output signal to the change in the input is called sensitivity.
• For a voltmeter, it is the ratio ΔVo/ΔVi. If Vi changes by 0.1 V, the output reading should also change by 0.1 V. For a given meter it may change by only 0.08 V or less; this change in Vo for a change in Vi is expressed as the sensitivity.
Errors
• Error is defined as the algebraic difference between the measured value and the true value of the quantity. It is also called the static error.
• Static error = measured value − true value.
Types of Errors
• The static errors are classified as:
1) Gross errors
2) Systematic errors
3) Random errors
Gross errors
• These are basically human errors caused by the operator or person using the instrument. The instrument may be good and may not itself introduce any error, but the measurement may still go wrong because of the operator.
• These errors cannot be treated mathematically.
• One of the most frequent gross errors is the improper use of an instrument.
• The error can be minimized by taking proper care in reading and recording the measurement parameter.
Systematic/Fixed errors
• Systematic errors mainly result from the shortcomings of the instrument and the characteristics of the materials used in it, such as defective or worn parts, ageing effects, environmental effects, etc.
• A constant, uniform deviation in the operation of an instrument is known as a systematic error.
• There are three types of systematic errors:
1) Instrumental errors
2) Environmental errors
3) Observational errors
Instrumental errors
• Instrumental errors are inherent in measuring instruments, because of their
mechanical structure.
• These errors are mainly due to the following three reasons:
1) Inherent shortcomings in the instrument
2) Misuse of the instrument
3) Loading effect of the instrument
• Instrumental errors may be eliminated by:
1) Selecting a proper measuring device or instrument for the particular application.
2) Recognizing the effect of such errors and applying the proper correction factors.
3) Calibrating the instrument carefully against a standard.
Environmental errors
• These errors are due to the conditions external to the measuring instrument.
• The various factors causing these environmental errors are temperature changes, pressure changes, thermal EMFs, ageing of equipment, and the frequency sensitivity of an instrument.
• Environmental errors are eliminated by:
1) Using the instrument in controlled conditions
2) Automatic compensation
3) Using magnetic or electrostatic shields
Observational errors
• These are the errors introduced by the observer.
• There are many sources of observational error, such as parallax error while reading a meter, wrong scale selection, etc.
• For example, an observer may always introduce an error by consistently holding his head too far to the left while reading the needle and scale.
• Observational errors are eliminated by using:
1. instruments with mirrored scales,
2. instruments with knife-edge pointers,
3. instruments having modern digital displays.
Random/Residual Errors
• Some errors still remain even after the systematic and instrumental errors are reduced or at least accounted for.
• The causes of such errors are unknown, and hence the errors are called random errors. These errors cannot be determined in the ordinary process of taking measurements.
• Random errors are reduced by increasing the number of measurements, i.e., by measuring the same parameter many times. The error in such measurements can then be estimated by statistical analysis.
Standards of Measurement
Classification of standards
• For convenience and local use by industries, laboratories, and
research organizations, the standards of measurement are classified
as:
1. International standards.
2. Primary/Basic standards.
3. Secondary standards.
4. Working standards.
International standards
• The International standards are defined by International agreement
and they represent certain units of measurement to the closest
possible accuracy that production and measurement technology
allow.
 These International Standards of Measurement are not available to ordinary users for measurements and calibrations.
International ohm:
Defined as the resistance offered, to the flow of a constant current, by a column of mercury of uniform cross-sectional area having a mass of 14.4521 g and a length of 106.300 cm, at the temperature of melting ice.
Primary/Basic standards
• The Primary standards are maintained by National Standard
Laboratories in different parts of the world.
• These standards represent the fundamental units and some of the
derived mechanical and electrical units.
• The principal function of primary standards is the calibration and verification of secondary standards.

• The primary standards are not available for use outside the
National Laboratory.
Secondary standards
• Secondary standards are basic reference standards used by measurement
and calibration laboratories in industries.
• These standards are maintained by the particular involved industry and are
checked locally against reference standards.
• Each industry has its own secondary standard.
• Each laboratory periodically sends its secondary standard to the National
standards laboratory for calibration and comparison against the primary
standard.

• After comparison and calibration, the National Standards Laboratory


returns the Secondary standards to the particular industrial laboratory
with a certification of measuring accuracy in terms of a primary standard.
Difference between primary and secondary standards

• To put it simply, a primary standard is a factual universal measurement, while a secondary standard is a device previously calibrated directly against the primary standard.
Working standards
• These are the principal tools of a measurement laboratory.
• These are used to check and calibrate general laboratory instruments
for accuracy and performance.
• For example, manufacturers of electronic components such as capacitors, resistors, etc. use working standards for checking the values of the components being manufactured, e.g. a standard resistor for checking the resistance values of manufactured resistors.
IEEE Standards
Introduction to Virtual Instrumentation
Advantages of Virtual Instruments versus Traditional Instruments

1) Flexibility: Additional functions can be added to a virtual instrument.

2) Storage: Computers have hard disks that can store dozens of gigabytes of data.

3) Display: Computers have better color depth and pixel resolution than traditional instruments.

4) Cost & Size: Smaller size with lower cost.
