Metrology - Chapter 1 Notes - Metrology
Definition: Metrology is defined by the International Bureau of Weights and Measures (BIPM) as:
"The science of measurement, embracing both experimental and theoretical determinations at any level
of uncertainty in any field of science and technology."
Metrology includes establishing the units of measurement, reproducing these units in the form of
standards, and ensuring the uniformity of measurements; it also includes analysing the accuracy of
methods of measurement, researching the causes of measuring errors, and eliminating them.
The seven SI base quantities and units are: 1) Length (metre, m) 2) Mass (kilogram, kg) 3) Time (second, s) 4) Electric current (ampere, A) 5) Thermodynamic temperature (kelvin, K) 6) Amount of substance (mole, mol) 7) Luminous intensity (candela, cd).
Objectives of Metrology
1. Thorough evaluation of newly developed products to ensure that components designed are within
the process capabilities.
2. To determine process capabilities and ensure that they are better than component tolerances.
3. To determine measuring instrument capabilities and ensure that they are adequate for the
respective measurements.
4. To minimize the cost of inspection by effective and efficient use of available facilities.
5. To reduce the cost of rejections and rework through application of SQC techniques.
Types of Metrology
Scientific or Fundamental Metrology
Legal Metrology
Industrial Metrology
Deterministic Metrology
Scientific or Fundamental Metrology: Scientific or fundamental metrology concerns the establishment
of quantity systems, unit systems, unit of measurements, the development of new measurement methods,
realization of measurement standards, and the transfer of traceability from these standards to users in
society. The BIPM (International Bureau of Weights and Measures) maintains a database of the
metrological calibration and measurement capabilities of various institutes around the world. These
institutes, whose activities are peer-reviewed, provide the top-level reference points for metrological
traceability.
Legal Metrology: Legal Metrology is that part of metrology which treats units of measurement, methods of
measurement and the measuring instruments, in relation to the statutory, technical and legal requirements.
It assures security and appropriate accuracy of measurements. Legal metrology is directed by a national
organisation, viz. National Service of Legal Metrology whose object is to resolve problems of legal
metrology in a particular country. Its functions are to ensure the conservation of national standards and to
guarantee their accuracy by comparison
with international standards; and also to impart proper accuracy to the secondary standards of the country
by comparison with international standards. The contemporary organisation of metrology includes a
number of international organisations viz. (a) The International Organisation of Weights and Measures:
and (b) National Service of Legal Metrology whose ultimate object is to maintain uniformity of
measurements throughout the world. The activities of the service of Legal Metrology are: control (testing,
verification, and standardisation) of measuring instruments; testing of prototypes/models of measuring
instruments; examination of a measuring instrument to verify its conformity to the statutory requirements,
etc.
Industrial Metrology: Industrial or Applied metrology concerns the application of measurement science to
manufacturing and other processing industries and their use in society, ensuring the suitability of
measuring instruments, their calibration and quality control of measurements. Although the emphasis in
this area of metrology is on the measurements themselves, traceability of the calibration of the
measurement devices is necessary to ensure confidence in the measurements.
Deterministic Metrology: This is a new philosophy in which part measurement is replaced by process
measurement. In the deterministic metrology, full advantage is taken of the deterministic nature of
production machines (machines under automatic control are totally deterministic in performance) and all of
the manufacturing sub-systems are optimised to maintain deterministic performance within acceptable
quality levels. In this approach, the system processes are monitored by temperature, pressure, flow, force,
vibration and acoustic "fingerprinting" sensors, these sensors being fast and non-intrusive.
The sequence of operations necessary for the execution of a measurement is called the process of measurement.
• Measurand: the physical quantity to be measured, such as length, angle, diameter or thickness.
Methods of Measurement:
1. Direct Method: In this method the value of a quantity is obtained directly by comparing the unknown
with the standard. It involves no mathematical calculations to arrive at the results.
For example: Measurement of length by a graduated scale.
2. Indirect Method: In this method several parameters (to which the quantity to be measured is linked
with) are measured directly and the value is determined by mathematical relationship.
For example: Measurement of density by measuring mass and geometrical dimensions.
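As a sketch of the indirect method, the snippet below derives density from directly measured mass and dimensions of a cylindrical specimen. All numbers are illustrative, not taken from the text:

```python
import math

# Indirect method sketch: density is not measured directly but computed from
# directly measured quantities (mass and geometric dimensions of a cylinder).
# All numbers are illustrative, not taken from the text.
mass_g = 123.3          # measured on a balance
diameter_mm = 20.0      # measured with a micrometer
height_mm = 50.0        # measured with a vernier caliper

volume_mm3 = math.pi * (diameter_mm / 2) ** 2 * height_mm
density_g_per_cm3 = mass_g / (volume_mm3 / 1000.0)  # 1 cm^3 = 1000 mm^3

print(f"density = {density_g_per_cm3:.2f} g/cm^3")  # close to steel's ~7.85
```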
3. Comparison Method: This method involves comparison with either a known value of the same quantity
or another quantity which is a function of the quantity to be measured.
4. Coincidence method: In this method the very small difference between the given quantity and the
reference is determined by the observation of the coincidence of scale marks.
For example: Measurement on vernier caliper and micrometer.
5. Complementary method: This is the method of measurement by comparison in which the value of the
quantity to be measured is combined with a known value of the same quantity so adjusted that the sum
of these two values is equal to predetermined comparison value.
For example: Determination of the volume of a solid by liquid displacement.
6. Deflection method: In this method the value of the quantity is directly indicated by the deflection of a
pointer on a calibrated scale.
7. Substitution method: In this method the quantity to be measured is measured by direct comparison on
an indicating device by replacing the measuring quantity with some other known quantity which
produces same effect on the indicating device.
For example: Determination of mass by Borda method.
Measuring Instruments
Measuring instruments are devices that transform the measured quantity into information. They can
either indicate directly the value of the measured quantity (Ex: vernier caliper) or only indicate its equality
to a known measure of the same quantity (Ex: equal arm balance). They may also indicate the small
difference between the quantity being measured and known standard (Ex: comparators). Measuring
instruments may contain internal parts to reproduce the unit (Ex: precision threads).
Classification of Measuring Instruments
1. Length measuring instruments (Ex: engineer’s scale, micrometer, vernier height gauge etc.)
2. Angle measuring instruments (Ex: bevel protractor, indexing head, sine bar etc.)
Calibration:
Calibration is nothing but “Comparison of instrument’s performance with the standards of known
accuracy”
Accuracy and Precision: Accuracy is the closeness with which the measured value of a quantity agrees with
the true value, whereas precision is the variation that occurs when measuring the same quantity repeatedly
with the same instrument. Both terms are associated with the measuring process. Accuracy is quantified by the
measurement error, E = measured value − true value.
Figure below illustrates the distinction between accuracy and precision by measuring a component several
times and plotting the readings by three instruments.
From the figure it is obvious that precision concerns a process or a set of measurements, not a single
measurement. In any set of measurements the individual readings are scattered about the mean, and the
precision tells us how well the various measurements performed by the same instrument on the same
component agree with each other. Error is the difference between the mean of a set of readings on the same
component and the true value; the smaller the error, the more accurate the instrument.
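The distinction can be sketched numerically with a set of hypothetical repeated readings (values assumed for illustration):

```python
import statistics

# Hypothetical repeated readings (mm) of a component whose true value is known.
true_value = 25.000
readings = [25.012, 25.010, 25.013, 25.011, 25.012]

mean = statistics.mean(readings)
error = mean - true_value            # accuracy: closeness of the mean to the true value
spread = statistics.stdev(readings)  # precision: scatter of readings about the mean

# A small spread with a large error means the instrument is precise but not
# accurate: it repeats well but carries a consistent bias.
print(f"mean = {mean:.4f} mm, error = {error:+.4f} mm, spread = {spread:.4f} mm")
```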
Factors affecting the accuracy of a measuring system:
1. Calibration standards – coefficient of thermal expansion, calibration interval, stability with time,
elastic properties….
2. Work piece – Cleanliness, Surface finish, Support arrangement, Waviness, Datum references….
3. Inherent characteristics of Instrument – backlash, Friction, Calibration error, Contact geometry for
both work piece and standard
4. Person – Skills, Training, Ability to select measuring instruments and standards, Attitude towards
personal accuracy achievements….
5. Measuring environment – Vibration, Temperature, Humidity, Adequate illumination, Clean
surrounding….
Repeatability: It is the ability of a measuring instrument to repeat the same results for measurements of the
same quantity, when the measurements are carried out by the same observer, with the same instrument,
under the same conditions, without any change in location, without change in the method of measurement,
and within short intervals of time.
Reproducibility: Reproducibility is the closeness of the agreement between the results of measurements of
the same quantity, when individual measurements are carried out: by different operator, by different
methods, using different instruments under different conditions, locations, times etc.
Sources of Errors:
In any measurement, there is always a degree of uncertainty resulting from measurement error, i.e. all
measurements are inaccurate to some extent. Measurement error is the difference between the indicated
and actual values of the measurand. The error could be expressed either as an absolute error or on a relative
scale, most commonly as a percentage of full scale. Each component of the measuring system has sources
of errors that can contribute to measurement error. Instrument or indication errors may be caused by
defects in manufacture or adjustment of an instrument, imperfections in design, etc. The error of
measurement is the combined effect of component errors due to various causes. There may be errors due to
method of location, environmental errors, errors due to the properties of object of measurement, viz. form
deviation, surface roughness, rigidity and change in size due to ageing etc. During measurement several
types of errors may arise such as static errors, instrument loading errors or dynamic errors, and these errors
can be broadly classified into two categories viz. controllable errors and random errors.
1. Controllable (Systematic) Errors:
These are controllable in both their magnitude and sense. They can be determined and reduced if attempts
are made to analyse them. These errors either have a constant value or a value changing according to a
definite law. They can be due to:
1. Calibration Errors: The actual length of standards such as slip gauges and engraved scales will vary
from nominal value by small amount. Sometimes the instrument inertia, hysteresis effect does not let
the instrument translate with complete fidelity. Often signal transmission errors such as drop in
voltage along the wires between the transducer and the electric meter occur. For high-order accuracy
these variations are significant, and calibration curves must be used to minimise them.
2. Ambient Conditions: Variations in the ambient conditions from internationally agreed standard value
of 20°C, barometric pressure 760 mm of mercury, and 10 mm of mercury vapour pressure, can give
rise to errors in the measured size of the component. Temperature is by far the most significant of
these ambient conditions and due correction is needed to obtain error free results.
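The temperature correction mentioned above can be sketched with a simple linear-expansion model. The expansion coefficient below is a typical handbook value for steel, and the lengths and temperatures are assumed for illustration:

```python
# Correcting a length reading for deviation from the 20 °C reference temperature,
# assuming simple linear expansion. The expansion coefficient is a typical
# handbook value for steel, used here only for illustration.
ALPHA_STEEL = 11.5e-6       # per °C (assumed)
REFERENCE_TEMP_C = 20.0     # internationally agreed standard temperature

measured_length_mm = 100.0  # reading taken on a steel part
part_temp_c = 27.0          # part temperature at the time of measurement

delta_t = part_temp_c - REFERENCE_TEMP_C
length_at_20c = measured_length_mm / (1 + ALPHA_STEEL * delta_t)
correction_um = (measured_length_mm - length_at_20c) * 1000.0

print(f"length at 20 C = {length_at_20c:.5f} mm (correction ~ {correction_um:.1f} um)")
```

Even a 7 °C rise above the standard temperature introduces a correction of several micrometres on a 100 mm steel part, which is why temperature is the most significant of the ambient conditions.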
3. Stylus Pressure: Error induced due to stylus pressure is also appreciable. Whenever any component is
measured under a definite stylus pressure both the deformation of the workpiece surface and
deflection of the workpiece shape will occur.
4. Avoidable Errors: These errors include the errors due to parallax and the effect of misalignment of the
workpiece centre. Instrument location errors such as placing a thermometer in sunlight when
attempting to measure air temperature also belong to this category.
2. Random Errors:
These occur randomly, and the specific causes of such errors cannot be determined. Likely sources of this
type of error are small variations in the position of the setting standard and workpiece, slight displacement
of lever joints in the measuring instrument, transient fluctuations in friction in the measuring instrument,
and operator errors in reading scale-and-pointer type displays or engraved scale positions.
Standards of Measurement:
Introduction:
Metrology concerns itself with the science of measurements and nearly all measurements in workshop
involve measurement of dimensions. Production engineer is especially concerned with the measurement of
the length and angle. Length is of fundamental importance as even angles can be measured by combination
of linear measurements. Thus it is very essential that some standards be prescribed for the length. The
importance of study of length has been realised long back and man has always sought a fixed and
unvarying natural standard of length (fundamental unit). It is a well-known fact that without standards of
fundamental units (length, mass, time), it would not have been possible for civilisation to exist.
Modern manufacturing technology is based on precise reliable dimensional measurements. Ultimately, all
of these measurements are comparisons with standards developed and maintained by bureaus of standards
throughout the world.
Standards:
The two standard systems for linear measurement used throughout the world are English and Metric (yard
and metre). Realising the importance and advantages of metric system, most of the countries are adopting
metre as the fundamental unit of linear measurement.
Search for a suitable unit of length has always been there and attempts made to keep that unit of length
constant irrespective of the environmental conditions. Actually, previously the standards were made of
materials which could change their size with temperature and other conditions. Thus a great care and
attention was needed to maintain same conditions so that the fundamental unit remains same. Finally the
natural and invariable unit of length was sought when it was found that the wavelength of monochromatic
light never changed with other conditions. Due to it, the previously defined yard and metre could be very
easily expressed in terms of the wavelength of light.
Sub-division of standards:
The Imperial standard yard and metre defined previously are just like master standards and can’t be used
for ordinary purposes. Thus depending upon the importance of standard, standards are sub-divided into
four grades.
1. Primary Standards: In order that the standard unit of length (yard or metre) does not change its value
and is precisely defined, there should be one, and only one, material standard, preserved under the
most careful conditions. It has no direct application to measuring problems encountered in engineering,
and is used only at intervals of 10 to 20 years, solely for comparison with secondary standards.
2. Secondary Standards: These are close copies of primary standards as regards design, material and length.
These are made, as far as possible, exactly similar to primary standards. Any error existing in these bars
is recorded by comparison with primary standards after long intervals. These are kept at a number of
places under great supervision and are used for comparison with tertiary standards whenever desired.
This also acts as safeguard against the loss or destruction of primary standard.
3. Tertiary Standards: The primary and secondary standards exist as the ultimate controls for reference at
the rare intervals. Tertiary standards are reference standards employed by N.P.L., and are the first
standards to be used for reference in laboratories and workshops. These are also made as true copy of
secondary standards and are kept as reference for comparison with working standards.
4. Working Standards: These are also line standards and similar in design to (1), (2) and (3) but being less
in cost, are made of lower grades of materials. These are of general application in metrology
laboratories.
Sometimes standards are also classified as (a) reference standards (used for reference purposes) (b)
calibration standards (used for calibration of inspection and working standards), (c) inspection standards
(used by inspectors), and (d) working standards (used by operators).
LINE STANDARDS
When the length being measured is expressed as the distance between two lines, then it is called “Line
Standard”.
Examples: Measuring scales, Imperial standard yard, International prototype meter, etc.
END STANDARDS
When the length being measured is expressed as the distance between two parallel faces, then it is called
‘End standard’. End standards can be made to a very high degree of accuracy. Ex: Slip gauges, Gap
gauges, Ends of micrometer anvils, etc.
1. End standards are highly accurate and are well suited for measurements to close tolerances, as small as
0.0005 mm.
2. They are time-consuming in use and measure only one dimension at a time.
3. Groups of blocks may be "wrung" together to build up any length, but faulty wringing leads to inaccuracy.
4. The accuracy of both end and line standards is affected by temperature change.
Slip gauges are rectangular blocks of steel of standard cross-section (30 mm face length). They are
hardened and stabilized by heat treatment, and are ground and lapped to size to very high standards of
accuracy and surface finish.
A gauge block (also known as a Johansson gauge, slip gauge, or Jo block) is a precision length measuring
standard consisting of a ground and lapped metal or ceramic block. Slip gauges were invented in 1896 by
Swedish machinist Carl Edward Johansson.
Slip gauges are wrung together to give a stack of the required dimension. In order to achieve the
maximum accuracy the following precautions must be taken.
• Wipe the measuring faces clean using soft clean chamois leather.
• Wring the individual blocks together by first pressing at right angles, sliding & then twisting.
M-87 slip gauges:-
Range (mm)       Step (mm)   Pieces
1.001 - 1.009    0.001       9
1.01 - 1.49      0.01        49
0.5 - 9.5        0.5         19
10 - 90          10          9
1.005            -           1
Total                        87
M-45 slip gauges:-
Range (mm)       Step (mm)   Pieces
1.001 - 1.009    0.001       9
1.01 - 1.09      0.01        9
1.1 - 1.9        0.1         9
1 - 9            1           9
10 - 90          10          9
Total                        45
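Wringing gauges into a stack follows the standard practice of eliminating the finest decimal place first. A minimal sketch of that selection procedure, assuming the M-45 set composition and a target large enough that the remainder stays positive at every step:

```python
# Sketch of selecting slip gauges to wring up a target dimension, assuming the
# M-45 set composition tabulated above. Standard practice: eliminate the finest
# decimal place first. Assumes the target is large enough that the remainder
# stays positive at every step.

def select_stack(target_mm):
    stack = []
    remaining = round(target_mm, 3)
    # Clear the 0.001 place with a 1.00x gauge (1.001-1.009 series)
    d = round(remaining * 1000) % 10
    if d:
        g = round(1.0 + d / 1000, 3)
        stack.append(g)
        remaining = round(remaining - g, 3)
    # Clear the 0.01 place with a 1.0x gauge (1.01-1.09 series)
    d = round(remaining * 100) % 10
    if d:
        g = round(1.0 + d / 100, 2)
        stack.append(g)
        remaining = round(remaining - g, 3)
    # Clear the 0.1 place with a 1.x gauge (1.1-1.9 series)
    d = round(remaining * 10) % 10
    if d:
        g = round(1.0 + d / 10, 1)
        stack.append(g)
        remaining = round(remaining - g, 3)
    # Make up the integer part from the 1-9 and 10-90 series
    d = int(remaining) % 10
    if d:
        stack.append(d)
        remaining = round(remaining - d, 3)
    if remaining:
        stack.append(int(remaining))
    return stack

print(select_stack(28.758))  # [1.008, 1.05, 1.7, 5, 20]
```

For 28.758 mm the stack is 1.008 + 1.05 + 1.7 + 5 + 20, five gauges in all, which keeps the number of wrung joints (and hence the accumulated wringing error) small.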
Line standard:
According to the line standard, the yard or metre is defined as the distance between scribed lines on a bar
of metal under certain conditions of temperature and support. These are legal standards, and an Act of
Parliament authorises their use.
Imperial Standard Yard: The Imperial Standard Yard is a bronze bar of one inch square cross-section and
38 inches long. A round recess, one inch from each end, is cut at both ends up to the central plane of the
bar. A gold plug of (1/10)" diameter, having three lines engraved transversely and two lines longitudinally,
is inserted into each of these holes so that the lines lie in the neutral plane.
Yard is then defined as the distance between the two central transverse lines of the plug at 62°F. Secondary
standards were also made as copy of the above International yard for occasional comparisons. These
standards are kept in safe custody. The purpose of keeping the gold plug lines at the neutral axis is that, on
bending of the bar, the neutral axis remains unaffected in length. Secondly, the plug, being in a recess, is
protected from accidental damage.
International prototype Metre:
This is the distance between the centre portions of two lines engraved on the polished surface of a bar of
pure platinum-iridium alloy (90% platinum and 10% iridium). It is inoxidisable and can have good polish
required for ruling good-quality lines. The bar is kept at 0°C and under normal atmospheric pressure. It
is supported by two rollers of at least 1 cm diameter, symmetrically situated in the same horizontal plane at
a distance of 571 mm apart, so as to give minimum deflection (these support positions are known as the
points of Bessel).
End Standards:
For all practical measurements in workshop, we employ end standards e.g. slip gauges, gap gauges, end of
micrometer anvils etc. Thus the importance of end standards (which are actually used in general
measurement applications) arose. Length bars and slip gauges were then made which were equal in length
to the legal line standard. The only difficulty realised with end standard was that of forming two accurately
parallel surfaces at the end of a bar and to heat treat the ends so that they remained stable.
ANGULAR MEASUREMENT
For measuring angles no absolute standard is required; the measurement is done in degrees, minutes
and seconds. The measurement of angular and circular divisions is an important part of inspection. It is
concerned with the measurement of individual angles, angular changes and deflections on components,
gauges and tools. Precision measurement of angles requires greater skill. Like linear measurements,
angular measurements have their own importance. There are several methods of measuring angles and
tapers; the various instruments used are angle gauges, clinometers, bevel protractors, sine bars, sine
centres, and taper plug and ring gauges.
Sine Bar:
The sine bar is used for measuring an angle on a given job or for setting out an angle. Sine bars are
hardened and precision-ground tools for accurate angle setting, and can be used in conjunction with a slip
gauge set and a dial gauge for the measurement of angles and tapers from a horizontal surface. As shown
in the figure, two accurately lapped rollers are located at the extreme positions. The centre-to-centre
distance between the rollers (plugs) is fixed, i.e. l = 100, 200, 250 or 300 mm. The diameters of the plugs
must be of the same size, and the centre distance between them accurate. The important condition for the
sine bar is that its upper surface must be parallel to the line joining the centres of the plugs.
Principle of Working:
As shown in the figure, the taper angle θ of the job WXYZ is to be measured by the sine bar. The job is
placed on the surface plate, and the sine bar is placed on the job with the roller at one end of the bar
touching the surface plate; the other end rests on slip gauges. The angle of the job is first measured by
some non-precision instrument, such as a bevel protractor; that angle gives an idea of the approximate slip
gauges required at the other end of the sine bar. Finally, slip gauges are adjusted to the exact height h such
that the topmost slip gauge touches the lower end of the roller, and the height of the slip-gauge stack is
measured. The taper angle is then obtained by treating the sine bar as the hypotenuse of a right-angled
triangle and the slip-gauge stack as the side opposite the angle, as shown in the figure.
sin θ = h / l
where h = height of the slip-gauge stack in mm and l = centre distance between the rollers in mm.
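A numerical sketch of the sine-bar relation sin θ = h/l, with illustrative values for a 200 mm sine bar (not taken from the text):

```python
import math

# Sine bar relation: sin(theta) = h / l, where h is the slip-gauge stack height
# and l the centre distance between the rollers. Values are illustrative.
l_mm = 200.0    # centre distance of the sine bar
h_mm = 34.730   # measured height of the slip-gauge stack

theta_deg = math.degrees(math.asin(h_mm / l_mm))
print(f"taper angle = {theta_deg:.3f} degrees")

# Inverse problem: slip-gauge height needed to set a given angle
required_h = l_mm * math.sin(math.radians(10.0))
print(f"h to set 10 degrees = {required_h:.3f} mm")
```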
COMPARATORS
A comparator works on relative measurements, i.e. to say, it gives only dimensional differences in relation
to a basic dimension. So a comparator compares the unknown dimensions of a part with some standard or
master setting which represents the basic size and dimensional variations from the master setting are
amplified and measured. The advantages of comparators are that not much skill is required on the part of
operator in its use. Further the calibration of instrument over full range is of no importance as comparison
is done with a standard end length. Zero error of instrument also does not lead to any problem. Since range
of indication is very small, being the deviation from set value, a high magnification resulting into great
accuracy is possible. The comparators are generally used for linear measurements, and various
comparators available differ principally in the method used for amplifying and recording the variations
measured. According to the principles used for obtaining suitable degrees of magnification of the
indicating device relative to the change in the dimension being measured, the various comparators may be
classified as follows:
1. Mechanical comparators
2. Mechanical-optical comparators
3. Electrical and Electronic comparators
4. Pneumatic comparators
5. Fluid displacement comparators
6. Projection comparators
7. Multi-check comparators
8. Automatic gauging machines.
Characteristics of Comparators:
Before we discuss the various types of comparators, let us first look into various fundamental requirements
which every comparator must fulfil. These are as follows:
1. The instrument must be of robust design and construction so as to withstand the effect of ordinary
usage without impairing its measuring accuracy.
2. The indicating device must be such that readings are obtained in least possible time and for this,
magnification system used should be such that the readings are dead beat. The system should be free
from backlash, and wear effects and the inertia should be minimum possible.
3. Provision must be made for maximum compensation for temperature effects.
4. The scale must be linear and must have straight line characteristic.
5. The indicator should be consistent in its return to zero.
6. Instrument, though very sensitive, must withstand a reasonable ill usage without permanent harm.
7. Instrument must have the maximum versatility, i.e., its design must be such that it can be used for a
wide range of operations.
8. Measuring pressure should be low and constant.
Mechanical Comparators:
In these comparators, magnification is obtained by mechanical linkages and other mechanical devices.
1. Rack and Pinion: In this type, the measuring spindle, integral with a rack, engages a pinion which
amplifies the movement of the plunger through a gear train.
2. Cam and gear train: In this case the measuring spindle acts on a cam which transmits the motion to
the amplifying gear train.
3. Lever with toothed sector: In this case a lever with a toothed sector at its end engages a pinion in the
hub of a crown gear sector which further meshes with a final pinion to produce indication.
4. Compound Levers: Here levers forming a couple with compound action are connected through
segments and pinion to produce final pointer movement.
5. Twisted Taut Strip: The movement of measuring spindle tilts the knee causing straining which further
causes the twisted taut band to rotate proportionally. The motion of strip is displayed by the attached
pointer.
6. Lever combined with band wound around drum: In this case, the movement of the measuring spindle
tilts the hinged block, causing swing of the fork which induces rotation of the drum.
7. Reeds combined with optical display: In this case parallelogram reeds are used which transfer
measuring spindle movement to a deflecting reed whose extension carries a target utilised in optical
path.
The Sigma Comparator:
The various movements in the ‘sigma’ comparator are shown in diagrammatic form below.
A phosphor-bronze strip is attached to the two extremities of the Y-arm and is passed around a radius r
attached to the pointer spindle. If the length of the pointer is R, then R/r is the second stage of magnification.
In order to adjust the magnification, the distance a from the hinge pivot to the knife edge must be changed
by slackening and tightening the two screws attaching the knife edge to the plunger.
1. As the knife edge moves away from the moving member of the hinge and is followed by it, a too-robust
movement of the plunger caused by a shock load will not be transmitted through the movement.
2. By mounting a non-ferrous disc on the pointer spindle and making it move in field of a permanent
magnet, dead beat readings can be obtained.
3. The error due to parallax is avoided by having a reflective strip on the scale.
4. The constant measuring pressure over the range of the instrument is obtained by the use of a magnet
plunger on the frame and a keeper bar on the top of the plunger.
Johansson Mikrokator:
This comparator was made by C.E. Johansson Ltd. and is therefore named after the firm. It is shown
diagrammatically in Fig. 5.7. The instrument uses the simplest and most ingenious method of obtaining
mechanical magnification, designed by H. Abramson and called the Abramson movement. It works on the principle of
a button spinning on a loop of string. A twisted thin metal strip carries at the centre of its length a very
light pointer made of thin glass. The two halves of the strip from the centre are twisted in opposite
directions so that any pull on the strip will cause the centre to rotate. One end of the strip is fixed to the
adjustable cantilever strip and the other end is anchored to the spring elbow, one arm of which is carried on
the measuring plunger. As the measuring plunger moves either upwards or downwards, the elbow acts as a
bell crank lever and causes twisted strip to change its length thus making it further twist or untwist. Thus
the pointer at the centre of the twisted strip rotates by an amount proportional to the change in length of
strip and hence proportional to the plunger movement.
The bell crank lever is formed of flexible strips with a diagonal which is relatively stiff. The length of the
cantilever can be varied to adjust the magnification of the instrument. Since the centre line of the strip is
straight even when twisted, it is directly stretched by the tension applied to the strip. Thus, in order to
prevent excessive stress on the central portion, the strip is perforated along the centre line as shown in the
figure.
If θ is the twist of the mid-point of the strip with respect to its ends, l the length of the twisted strip
measured along its neutral axis, w the width of the strip, and n the number of turns, then the amplification
dθ/dl is proportional to l/(n·w²); a longer and thinner strip therefore gives greater amplification.
It is thus obvious that in order to increase the amplification of the instrument a very thin rectangular strip
must be used. Further amplification can be adjusted by the cantilever strip- which also provides anchorage.
It increases or decreases the effective length of the strip. The final setting of the instrument amplification
is made by a simple adjustment of the free length of the cantilever strip. A slit 'C' washer, as shown in the
figure, is used for the lower mounting of the plunger. Thus this instrument has no mechanical points or
slides at which wear can take place. The magnification is of the order of 5000×.
Advantages:
• They do not require any external source such as electricity or compressed air.
Disadvantages:
• They have more moving linkages, due to which friction is higher and hence accuracy is lower.
• The mechanisms used in these comparators have more inertia, which may make them sensitive to
vibration.
• Errors, if any, such as backlash, play and wear are also magnified.
• The range of the instrument is limited, as the pointer moves over a fixed scale.
Optical Comparators:
The disadvantages of mechanical comparators are overcome in optical comparators, which are capable of
giving a higher degree of precision.