Introduction To Instrumentation and Measurements
PART 1
Be able to define and explain the following types of errors that occur in
measurements: gross, systematic, and random.
1. Introduction
3. Standard of Measurement
The device used for comparing the unknown quantity with the unit of
measurement or a standard quantity is called a measuring instrument.
1. Introduction (Cont...)
Use of a formula
e.g. the resistance of a conductor may be determined by measuring the voltage across
the conductor, V, and the current flowing through the conductor, I, and calculating it by
Ohm's law:
R = V / I
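A minimal numerical sketch of this indirect measurement; the voltage and current values below are hypothetical examples, not from the original:

# Indirect determination of resistance from measured voltage and current (Ohm's law)
V = 12.0    # measured voltage across the conductor, in volts (example value)
I = 0.25    # measured current through the conductor, in amperes (example value)
R = V / I   # resistance in ohms
print(R)    # 48.0 ohm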
1. Introduction (Cont...)
Deflection methods
Comparison methods
Nature of units
The SI system consists of more than 28 units: 7 base units, 2 supplementary units, and
more than 19 derived units.
For convenience, some derived units have been given special names; for example, the
SI derived unit of force is called the newton (N) instead of its dimensionally correct
name, kg·m/s^2.
2. Dimensions
Example:
Find the dimensions of the derived units below:
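The exercise items themselves are not reproduced above; as a hypothetical illustration of the working intended, the dimensions of force and energy (work) follow from F = ma and W = Fd:

\[
[F] = [m][a] = M\,L\,T^{-2} \quad\Rightarrow\quad 1\,\text{N} = 1\,\text{kg·m/s}^2
\]
\[
[W] = [F][d] = M\,L^{2}\,T^{-2} \quad\Rightarrow\quad 1\,\text{J} = 1\,\text{kg·m}^2/\text{s}^2
\]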
3. Standards of Measurements
Standards are used to determine the values of other physical quantities by the
comparison method.
Categories of Standards
3. Secondary Standards
These standards are the basic reference standards used by measurement and
calibration laboratories in the industry to which they belong. Each laboratory
periodically sends its secondary standards to the national standards laboratories for
calibration against the primary standards. After calibration, the secondary standards
are returned to the industrial laboratory with a certification of measuring accuracy
in terms of the primary standard.
4. Working Standards
The principal tools of a measurement laboratory. They are used to
1. check and calibrate the instruments used in the laboratory, or
2. make comparison measurements in industrial applications.
For example, manufacturers of electronic components such as
capacitors, resistors and many more use a working standard for
checking the values of the components being manufactured.
3. Standards of Measurements
Categories of Standards
[Summary diagram: examples at each level of the standards hierarchy: International Standards, e.g. the International Bureau of Weights and Measures (BIPM); Primary Standards, e.g. SIRIM; Secondary Standards, e.g. industry laboratories; Working Standards, e.g. users.]
4. Standard of Electrical Measurements
Current Standard
Voltage Standard
Resistance Standard
Inductance Standard
Capacitance Standard
4. Standard of Electrical Measurements
Current Standard
Since it is much easier to measure current accurately than to measure charge, the
unit of current has become the fundamental electrical unit in the SI system.
4. Standard of Electrical Measurements
The ampere is defined as the constant current which, when flowing in each of two
infinitely long parallel conductors placed 1 meter apart, exerts a force of 2 x 10^-7
newton per meter of length on each conductor.
The coulomb, in turn, is defined as the charge which passes a given point in a
conductor in one second when a current of 1 ampere flows.
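A minimal numerical check of the ampere definition, assuming the standard formula for the force per unit length between two long parallel conductors, F/l = mu0·I1·I2/(2·pi·d); the function name is illustrative:

import math

MU_0 = 4 * math.pi * 1e-7  # permeability of free space, in N/A^2

def force_per_meter(i1_amps, i2_amps, spacing_m):
    # Force per meter of length between two long parallel conductors
    return MU_0 * i1_amps * i2_amps / (2 * math.pi * spacing_m)

# Two conductors each carrying 1 A, spaced 1 m apart
print(force_per_meter(1.0, 1.0, 1.0))  # 2e-07 N per meter, as in the definition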
•Voltage Standard
The volt (V) is defined as the potential difference between two points on a
conductor carrying a constant current of 1 ampere when the power dissipated
between these points is 1 watt. Volt (V) is the unit of electromotive force (emf)
and potential difference.
(1 Watt = 1 Joule/s)
Volt = Watt / Ampere
Watt = J/s = power
Joule = N·m = work / energy
Newton = kg·m/s^2
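Chaining these relations expresses the volt in SI base units (a worked expansion of the lines above):

\[
1\,\text{V} = \frac{1\,\text{W}}{1\,\text{A}} = \frac{1\,\text{J}}{1\,\text{A·s}} = \frac{1\,\text{N·m}}{1\,\text{A·s}} = 1\,\frac{\text{kg·m}^2}{\text{A·s}^3}
\]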
4. Standard of Electrical Measurements
•Voltage Standard
1 joule of work is considered done when 6.24 x 10^18 electrons (1 coulomb) are moved
through a potential difference of 1 V.
One electron carries a charge of 1/(6.24 x 10^18) coulomb. When only 1 electron is
moved through 1 V, the energy involved is known as an electron volt (eV).
The eV is used for the very small energy levels associated with electrons in
orbit around the nucleus of an atom.
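A minimal sketch converting 1 eV to joules, using the electron charge quoted above; the variable names are illustrative:

ELECTRON_CHARGE = 1 / 6.24e18   # charge of one electron in coulombs (~1.602e-19 C)
potential_difference = 1.0      # volts

energy_in_joules = ELECTRON_CHARGE * potential_difference
print(energy_in_joules)         # ~1.6e-19 J, i.e. 1 electron volt (eV)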
4. Standard of Electrical Measurements
•Resistance Standard
The ohm (Ω) is the unit of resistance. The ohm is defined as the resistance which
permits a current flow of 1 ampere when a potential difference of 1 volt is
applied across it.
The henry (H) is the unit of inductance. The inductance of a circuit is 1 henry,
when an emf of 1 volt is induced by the current changing at the rate of 1 A/s.
The weber (Wb) is the unit of magnetic flux. The weber is defined as the magnetic
flux which, linking a single-turn coil, produces an emf of 1 V when the flux is
reduced to zero at a constant rate in 1 s.
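A minimal numerical sketch of these two defining relations (emf = L·dI/dt for the henry, emf = N·ΔΦ/Δt for the weber); the function names are illustrative:

def emf_from_inductance(inductance_h, current_rate_a_per_s):
    # emf (V) induced across an inductance when the current changes at a given rate
    return inductance_h * current_rate_a_per_s

def emf_from_flux_change(flux_wb, time_s, turns=1):
    # emf (V) induced in an N-turn coil when the flux falls to zero at a constant rate
    return turns * flux_wb / time_s

print(emf_from_inductance(1.0, 1.0))   # 1 H with current changing at 1 A/s -> 1.0 V
print(emf_from_flux_change(1.0, 1.0))  # 1 Wb removed in 1 s (single turn)  -> 1.0 V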
4. Standard of Electrical Measurements
•Inductance Standard
The tesla (T) is the unit of magnetic flux density. The tesla is the flux density in
a magnetic field when 1 weber of flux passes through an area of 1 square meter:
1 T = 1 Wb/m^2.
Therefore,
1 H = 1 Wb / 1 A
1 Wb = 1 V x 1 s
where (for the solenoid inductance formula these symbols accompany, see the sketch below):
N is the number of wire turns
r is the solenoid radius in meters
l is the length of the solenoid in meters
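The formula these symbols refer to is not reproduced above (it was presumably shown as a figure). A minimal sketch, assuming the standard expression for the inductance of a long air-cored solenoid, L = mu0·N^2·pi·r^2 / l:

import math

MU_0 = 4 * math.pi * 1e-7  # permeability of free space, in H/m

def solenoid_inductance(turns, radius_m, length_m):
    # Approximate inductance (H) of a long air-cored solenoid
    area = math.pi * radius_m ** 2             # cross-sectional area, m^2
    return MU_0 * turns ** 2 * area / length_m

# Example: 500 turns, 1 cm radius, 10 cm long
print(solenoid_inductance(500, 0.01, 0.10))    # ~9.87e-4 H, about 1 mH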
4. Standard of Electrical Measurements