
Unit-1

Mechanical Measurement
Introduction of Measurement
• Measurement is defined as the process
of numerical evaluation of a dimension
or the process of comparison with
standard measuring instruments.
• The basic aim of measurement in industries is to
check whether a component has been manufactured
to the requirement of a specification or not.
• Example: consider the measurement of the length of a bar. We make use of a scale or steel ruler (i.e. a standard).
Need of Measurement
Basic Definitions
Hysteresis

• It is the maximum difference between two outputs (indicated values) at the same input (measurand) value within the specified range, one taken while the input is continuously increased from zero and the other while the input is continuously decreased from the maximum value.
• Hysteresis arises because of:
– mechanical friction
– magnetic effects
– elastic deformation
– thermal effects.
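
As a rough illustration (not from the slides), hysteresis can be estimated numerically by sweeping the input up and then down and taking the largest gap between the two output curves at the same input value. The readings below are hypothetical.

```python
# Hypothetical calibration data: output readings taken while the input is
# swept up from zero and then back down from the maximum value.
inputs      = [0, 10, 20, 30, 40, 50]               # measurand values
output_up   = [0.0, 9.6, 19.4, 29.5, 39.7, 50.0]    # increasing sweep
output_down = [0.0, 10.5, 20.9, 30.8, 40.4, 50.0]   # decreasing sweep

# Hysteresis = maximum difference between the two outputs at the same input.
hysteresis = max(abs(u - d) for u, d in zip(output_up, output_down))
print(f"Maximum hysteresis = {hysteresis:.2f} units")   # 1.5 units here
```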
Linearity

• A measuring system is said to be linear if the output is linearly proportional to the input.
• A linear system can be easily calibrated, while calibration of a non-linear system is tedious, cumbersome and time consuming.
• Most systems are therefore designed for linear behaviour, i.e. output linearly proportional to input.
Linearity
The working range of most instruments provides a linear relationship between output and input.

(Figure: output (O/P) vs. input (I/P), comparing a linear and a non-linear response)
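
To see why a linear system is easy to calibrate, the sketch below (hypothetical readings, ordinary least-squares fit) recovers the calibration as just a slope and an intercept; a non-linear instrument would instead need a full look-up table or curve fit.

```python
# Hypothetical readings from a nearly linear instrument.
x = [0, 10, 20, 30, 40]             # known inputs (standards)
y = [0.1, 10.2, 19.9, 30.3, 40.0]   # indicated outputs

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares line y = m*x + c: two numbers fully calibrate the instrument.
m = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
c = mean_y - m * mean_x
print(f"calibration: output ≈ {m:.4f} * input + {c:.4f}")
```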
Threshold

• If the instrument input is increased very gradually from zero, there will be some minimum value of input below which no output change can be detected. This minimum value defines the threshold of the instrument.
• Caused by backlash or internal noise
Resolution (Discrimination)

• Resolution is defined as the smallest increment of input signal that a measuring system is capable of displaying.
• Difference:
1. Resolution defines the smallest measurable input change, while threshold defines the smallest measurable input.
2. Threshold is measured when the input is varied from zero, while resolution is measured when the input is varied from any arbitrary non-zero value.
Drift

• Drift is an undesirable gradual departure of the instrument output over a period of time that is unrelated to changes in the input, operating conditions or load.
• Drift may occur in obstruction-type flow meters because of wear and erosion of the orifice plate, nozzle or venturimeter.
• The following factors are related to drift in an instrument:
1. Wear and tear in mating parts
2. Mechanical vibration
3. Contamination in primary sensing elements
4. Temperature changes, magnetic fields.
Zero stability

• It is the ability of the instrument to return to a zero reading after the measurand has returned to zero and other variations have been removed.
Loading effect

• The presence of a measuring instrument in a medium to be measured will always lead to extraction of some energy from the medium, thus making perfect measurements theoretically impossible.
• This effect is known as ‘loading effect’ which must be
kept as small as possible for better measurements.
• For example, in electrical measuring systems, the detector stage receives energy from the signal source, while the intermediate modifying devices and output indicators receive energy from an auxiliary source.
• The loading effects are due to the impedances of the various elements connected in a system.
System response

• Response of a system may be defined as the ability of the system to transmit & present all the relevant information contained in the input signal & to exclude all others.
• If the output is faithful to the input, i.e. the output signals have the same phase relationships as the input signal, the system is said to have a good system response.
• If there is a lag or delay in the output signal, which may be due to the natural inertia of the system, it is known as ‘measurement lag’.
• “Rise time” is defined as the time taken for the system to change from 5% to 95% of its final value.
• It is a measure of the speed of response of a measuring system, and a short rise time is desirable.
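
A minimal sketch (hypothetical step-response samples) of how rise time, as defined above, could be read off a recorded response, i.e. the time to go from 5% to 95% of the final value.

```python
# Hypothetical step response: (time in s, output) samples.
response = [(0.0, 0.0), (0.1, 0.08), (0.2, 0.35), (0.3, 0.63),
            (0.4, 0.83), (0.5, 0.93), (0.6, 0.97), (0.7, 0.99), (0.8, 1.0)]

final = response[-1][1]
t_5  = next(t for t, y in response if y >= 0.05 * final)   # first crossing of 5%
t_95 = next(t for t, y in response if y >= 0.95 * final)   # first crossing of 95%

rise_time = t_95 - t_5
print(f"rise time ≈ {rise_time:.2f} s")   # coarse estimate, limited by sampling
```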
Process of Measurement
• Measurand: the physical quantity to be measured, such as length, angle, diameter, thickness, etc.
• Reference: the physical quantity or standard with which quantitative comparisons are made.
• Comparator: the means of comparing the measurand with the reference.


Measurement Methods
1. Direct method
2. Indirect method
3. Comparative method
4. Transposition method
5. Coincidence method
6. Deflection method
7. Complementary method
8. Contact type
9. Contactless method
10. Absolute or Fundamental method
Direct Method

This is a simple method of measurement, in which the value of the quantity to be measured is obtained directly without any calculations.
For example, measurements by using scales, vernier callipers, micrometers, bevel protractor, etc.
This method is most widely used in production.
This method is not very accurate because it depends on the limitations of the human senses in making the judgement.
Indirect method

In the indirect method, the value of the quantity to be measured is obtained by measuring other quantities which are functionally related to the required value.
E.g. Angle measurement by sine bar, measurement of screw pitch diameter by
three wire method etc.
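
For the sine bar example above, the angle is obtained indirectly from two measured lengths: sin θ = h / L, where h is the slip-gauge height under one roller and L is the distance between the roller centres. A small sketch with assumed values:

```python
import math

# Sine bar: the angle is measured indirectly from two lengths.
L = 200.0   # centre distance between rollers, mm (e.g. a 200 mm sine bar)
h = 51.76   # height of the slip-gauge stack under one roller, mm

theta = math.degrees(math.asin(h / L))
print(f"angle ≈ {theta:.2f}°")   # ≈ 15° for these values
```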
Comparative method

In this method the value of the quantity to be measured is compared with a known value of the same quantity.
Transposition method

• It is a method of measurement by direct comparison in which the value of the quantity measured is first balanced by an initial known value A of the same quantity, and then the value of the quantity measured is put in place of this known value and is balanced again by another known value B.
• For example, determination of a mass by means of a
balance and known weights, using the Gauss double
weighing.
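
A short sketch of the arithmetic behind the Gauss double-weighing example: with unequal balance arms, an unknown mass X balanced first by a known value A and then, after transposition, by a known value B satisfies X = √(A·B), which cancels the arm-length ratio. The numbers below are illustrative.

```python
import math

# Illustrative simulation of transposition (Gauss double weighing).
true_mass = 100.0   # g, the unknown X
arm_ratio = 1.02    # hidden ratio l1/l2 of the balance arms (unknown to the user)

A = true_mass * arm_ratio   # known weights needed in the first weighing
B = true_mass / arm_ratio   # known weights needed after transposing X

estimate = math.sqrt(A * B)  # the arm ratio cancels out
print(f"A = {A:.2f} g, B = {B:.2f} g, X ≈ {estimate:.2f} g")
```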
Coincidence method

It is a differential method of measurement in which a very small difference between the value of the quantity to be measured and the reference is determined by the observation of the coincidence of certain lines or signals.
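
The vernier scale on a caliper is a familiar instance of the coincidence method: the reading is completed by noting which vernier line coincides with a main-scale line. A minimal sketch with assumed scale values:

```python
# Assumed vernier caliper: 1 mm main-scale divisions, least count 0.02 mm.
main_scale_reading  = 24.0   # mm, last main-scale mark before the vernier zero
coinciding_division = 7      # vernier line observed to coincide with a main-scale line
least_count         = 0.02   # mm

reading = main_scale_reading + coinciding_division * least_count
print(f"reading = {reading:.2f} mm")   # 24.14 mm
```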
Deflection method
In this method the value of the quantity to be
measured is directly indicated by a deflection of a
pointer on a calibrated scale.
Complementary method
• In this method the value of the quantity to be
measured is combined with a known value of
the same quantity.
• The combination is so adjusted that the sum of these two values is equal to a predetermined comparison value.
• For example, determination of the volume of a
solid by liquid displacement.
Contact type method
• In the contact-type method, the measuring tip of the instrument or sensor physically touches the component being measured.
Contactless method
• In the contactless (non-contact) method, the instrument does not make physical contact with the component, e.g. a stroboscope or a tool maker's microscope.
Absolute or Fundamental method
• It is based on the measurement of the base
quantities used to define the quantity.
• For example, measuring a quantity directly in
accordance with the definition of that
quantity, or measuring a quantity indirectly by
direct measurement of the quantities linked
with the definition of the quantity to be
measured.
GENERALIZED MEASUREMENT SYSTEM
Characteristics of Measuring Devices
Static characteristics: criteria defined for quantities that are either constant or vary slowly with time.
Examples:
- Accuracy
- Precision
- Calibration
- Sensitivity
- Reproducibility
- Repeatability
- Readability
- Dead Zone
- Static Error
- Linearity
- Hysteresis
- Resolution
- Threshold
- Drift
Dynamic characteristics: criteria defined for quantities that vary rapidly with time.
Examples:
- Speed of response
- Measuring lag
- Fidelity
- Dynamic Error
Accuracy
The purpose of measurement is to determine the true dimensions of a part, but no measurement can be made absolutely accurate.
The amount of error depends upon the following factors:
• The accuracy and design of the measuring instrument
• The skill of the operator
• Method adopted for measurement
• Temperature variations
• Elastic deformation of the part or instrument etc.
Thus the term accuracy denotes the closeness of the measured
value with the true value.
The difference between the measured value and the true value is the error of measurement. The smaller the error, the greater the accuracy.
Precision and Accuracy
Precision is the repeatability of the measuring
process.

If the instrument is not precise, it will give different (widely varying) results for the same dimension when measured again and again.

The less the scattering, the more precise the instrument.
Accuracy is the degree to which the measured
value of the quality characteristic agrees with
the true value.

The difference between the true value and the measured value is known as the error of measurement.
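
A hedged numerical illustration of the distinction: for repeated readings of a known standard, the closeness of the mean to the true value reflects accuracy, while the scatter of the readings reflects precision. The readings below are hypothetical.

```python
import statistics

true_value = 25.000   # mm, known standard
readings = [25.012, 25.009, 25.011, 25.010, 25.013]   # hypothetical repeated readings

mean_reading = statistics.mean(readings)
accuracy_error = mean_reading - true_value       # systematic offset -> accuracy
precision_spread = statistics.stdev(readings)    # scatter -> precision

print(f"mean error = {accuracy_error:+.3f} mm  (accuracy)")
print(f"std. dev.  = {precision_spread:.4f} mm (precision)")
# Here the instrument is quite precise but not perfectly accurate.
```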
Distinction between Precision and Accuracy
Calibration
It is very much essential to calibrate the
instrument so as to maintain its accuracy.

Calibration is usually carried out by making adjustments such that when the instrument has zero measured input it reads zero, and when the instrument is measuring some dimension it reads the value closest to the true (accurate) value.
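
A minimal sketch of the zero-and-span adjustment described above, using two assumed reference points: the raw indication is corrected so that a zero input reads zero and a known standard reads its true value.

```python
# Two-point (zero and span) calibration against assumed reference standards.
raw_at_zero     = 0.35    # instrument indication with zero input
raw_at_standard = 50.60   # indication when measuring a 50.00 mm standard
standard_value  = 50.00

gain   = standard_value / (raw_at_standard - raw_at_zero)
offset = -raw_at_zero * gain

def corrected(raw_reading):
    """Apply the zero/span correction to a raw indication."""
    return gain * raw_reading + offset

print(corrected(0.35))    # ≈ 0.00  (reads zero at zero input)
print(corrected(50.60))   # ≈ 50.00 (reads the standard correctly)
```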
Sensitivity
It is defined as the ratio of output response to
a specific change in input.
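
Since sensitivity is simply the ratio of output change to input change, a short hypothetical example:

```python
# Hypothetical dial-indicator style example: sensitivity = Δoutput / Δinput.
delta_input  = 0.10    # mm change in the measured dimension
delta_output = 12.0    # mm of pointer movement on the scale

sensitivity = delta_output / delta_input
print(f"sensitivity = {sensitivity:.0f} (scale mm per mm of input)")   # 120
```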
Reproducibility
It is the closeness of output readings for the same
input when there are changes in the method of
measurement, observer, measuring instrument,
location, condition of use and time of measurement.
Repeatability
It is the ability of the measuring instrument to repeat the same results for measurements of the same quantity when the measurements are carried out by the same observer, with the same instrument, under the same conditions, without any change in location and without any change in the method of measurement.
Readability
Readability refers to the ease with which the readings of a measuring instrument can be read. It is the susceptibility of a measuring device to have its indications converted into meaningful numbers.

“To make micrometers more readable they are provided with a vernier scale. Readability can also be improved by using magnifying devices.”
Dead Zone
• It is the largest change of input quantity for which the instrument does not indicate any output.
• It is caused by friction and hysteresis effects in the instrument.
(Figure: output vs. input, showing the dead zone near zero input)
Range:
The region between the limits within which an instrument is designed to operate for
measuring, indicating or recording a physical quantity is called the range of the
instrument. The range is expressed by stating the lower and upper values.

Span :
Span represents the algebraic differences between the upper and lower range values
of the instrument.
For example,
Range: −10 °C to 80 °C ; Span: 90 °C
Range: 5 bar to 100 bar ; Span: 95 bar
Range: 0 V to 75 V ; Span: 75 V
ERRORS IN MEASUREMENTS
Error in measurement = Measured value - True value
Classification of Errors:
1. Gross errors or mistakes
– Blunders
– Computational errors
– Chaotic errors
2. Systematic Error
– Instrument errors
– Environmental Error
– Observation errors
– Operational errors
– System interaction errors
3. Random Error
Systematic Error
These errors include calibration errors, errors due to variation in atmospheric conditions, variation in contact pressure, etc.

As these errors are controllable, they are also called controllable errors.
Random error
These errors are caused by variation in the position of the setting standard and the workpiece.

These errors are non-consistent and hence are called random errors.
