Topic: Metrology and Quality Control

UNIT 1
Metrology is the science of
measurement. It deals with the
principles and techniques of obtaining,
analyzing, and interpreting
measurements. In the context of quality
control, metrology plays a crucial role in
ensuring that products and processes
meet the required standards and
specifications.
Factors Affecting Measurement:
Several factors can influence the
accuracy and precision of
measurements:
* Environmental Factors: Temperature,
humidity, pressure, and vibration can
affect the behavior of measuring
instruments and the measured object.
* Instrument Factors: Calibration errors,
resolution limitations, and wear and tear
of measuring instruments can introduce
inaccuracies.
* Human Factors: Observer bias,
fatigue, and lack of training can lead to
errors in reading and interpreting
measurements.
* Method Factors: The chosen
measurement method itself can have
inherent limitations and uncertainties.
Types of Measurement:
* Direct Measurement: The quantity is
measured directly using a measuring
instrument. Examples include using a
ruler to measure length or a
thermometer to measure temperature.
* Indirect Measurement: The quantity is
calculated from other measured
quantities. For example, measuring the
volume of a liquid by measuring the
height and diameter of the container.
* Continuous Measurement: The
quantity is measured continuously over
time. Examples include monitoring
temperature or pressure in a process.
* Discrete Measurement: The quantity
is measured at specific intervals.
Examples include taking periodic
readings of a machine’s performance.
Errors in Measurement:
Errors are unavoidable in any
measurement process. They can be
broadly classified into two categories:
* Systematic Errors: These errors are
consistent and predictable. They can be
caused by factors such as instrument
calibration errors or environmental
conditions.
* Random Errors: These errors are
unpredictable and vary randomly. They
can be caused by factors such as human
error or instrument limitations.
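The distinction between the two error types can be seen numerically. Below is a minimal Python sketch (all values are invented for illustration) in which every reading carries a fixed systematic bias plus Gaussian random noise; averaging many readings cancels the random part but leaves the bias untouched:

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 20.000   # mm, the quantity being measured (illustrative)
BIAS = 0.050          # mm, a systematic (calibration) error
SIGMA = 0.020         # mm, spread of the random error

def read_instrument():
    """One reading = true value + systematic bias + random noise."""
    return TRUE_VALUE + BIAS + random.gauss(0.0, SIGMA)

readings = [read_instrument() for _ in range(1000)]
mean = statistics.mean(readings)

# Averaging many readings cancels random error but not the bias:
print(f"mean of 1000 readings: {mean:.3f} mm")   # close to 20.050, not 20.000
print(f"residual random scatter: {statistics.stdev(readings):.3f} mm")
```

This is why random errors can be reduced by repetition and averaging, while systematic errors must be found and corrected through calibration.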
Quality Control and Metrology:
Metrology is an integral part of quality
control systems. It provides the
foundation for:
* Setting Standards: Accurate and
precise measurements are essential for
establishing and maintaining quality
standards.
* Product Inspection and Testing:
Measurements are used to verify that
products meet the specified
requirements.
* Process Control: Measurements are
used to monitor and control
manufacturing processes to ensure
consistency and quality.
* Problem Solving: Identifying and
analyzing measurement errors can help
in troubleshooting and improving
manufacturing processes.
Conclusion:
Metrology is a fundamental aspect of
quality control. By understanding the
factors affecting measurement,
choosing appropriate measurement
methods, and minimizing errors,
organizations can ensure the accuracy
and reliability of their products and
processes.
Measurement Uncertainty
Measurement uncertainty is an inherent
characteristic of any measurement
process. It quantifies the range of values
within which the true value of a
measurement is likely to lie. In essence,
it reflects the degree of confidence we
can have in a measurement result.
Key Aspects of Measurement
Uncertainty:
* Sources of Uncertainty:
* Random errors: Fluctuations in
readings that cannot be predicted or
explained systematically.
* Systematic errors: Consistent and
reproducible deviations from the true
value due to factors like instrument
calibration errors or environmental
conditions.
* Expressing Uncertainty:
* Standard uncertainty: A measure of
the dispersion of the possible values of
the measurand.
* Expanded uncertainty: A wider
interval that covers a larger proportion
of possible values, often expressed as a
multiple of the standard uncertainty.
* Significance of Uncertainty:
* Quality control: Understanding
uncertainty helps in setting appropriate
tolerances and specifications for
products.
* Scientific research: Enables accurate
comparison and interpretation of
experimental results.
* Legal and regulatory compliance:
Ensures that measurements meet the
requirements of various standards and
regulations.
Related Problems
* Uncertainty analysis: The process of
identifying and quantifying the sources
of uncertainty in a measurement.
* Uncertainty propagation: Determining
how uncertainties in individual
measurements affect the uncertainty in
the final result, especially in complex
calculations.
* Minimizing uncertainty: Employing
techniques like calibration, using more
precise instruments, and controlling
environmental conditions to reduce
uncertainty.
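Uncertainty propagation can be sketched with a small worked example. Assuming invented values for a cylinder's radius and height, the first-order root-sum-of-squares rule gives the standard uncertainty of the computed volume, and a coverage factor k = 2 gives the expanded uncertainty:

```python
import math

# Illustrative measured values and their standard uncertainties (assumed)
r, u_r = 10.00, 0.05     # mm, cylinder radius
h, u_h = 50.00, 0.10     # mm, cylinder height

V = math.pi * r**2 * h   # derived quantity: volume

# First-order (root-sum-of-squares) propagation:
# u_V^2 = (dV/dr * u_r)^2 + (dV/dh * u_h)^2
dV_dr = 2 * math.pi * r * h
dV_dh = math.pi * r**2
u_V = math.hypot(dV_dr * u_r, dV_dh * u_h)

# Expanded uncertainty with coverage factor k = 2 (~95 % confidence)
U = 2 * u_V

print(f"V = {V:.0f} mm^3, standard uncertainty = {u_V:.0f} mm^3")
print(f"expanded uncertainty (k=2) = {U:.0f} mm^3")
```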
Types of Standards
Standards provide a common reference
point for consistent and comparable
measurements. They play a crucial role
in ensuring the accuracy and reliability
of measurement systems.
* Primary Standards:
* Realized through fundamental
physical constants and definitions.
* Serve as the ultimate reference for
all other standards.
* Examples: The kilogram (defined
using Planck’s constant), the meter
(defined using the speed of light).
* Secondary Standards:
* Derived from primary standards
through calibration.
* Used for routine calibration of
working standards.
* Examples: National standards
maintained by metrology institutes.
* Working Standards:
* Used directly in industrial and
laboratory settings for routine
measurements.
* Calibrated against secondary
standards at regular intervals.
* Examples: Gauges, thermometers,
balances.
Calibration of Measuring
Instruments
Calibration is the process of comparing
a measuring instrument against a
known standard to determine its
accuracy and correct any deviations. It
is essential for maintaining the reliability
and traceability of measurement results.
Key Steps in Calibration:
* Selection of appropriate standards:
Choosing standards that are traceable
to primary standards.
* Comparison of instrument readings
with standard values.
* Determination of calibration errors.
* Adjustment or correction of the
instrument if necessary.
* Issuance of a calibration certificate.
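The comparison and error-determination steps above can be sketched as a short script. The readings, standard values, and tolerance below are all hypothetical:

```python
# Hypothetical calibration run: instrument readings vs. traceable standard values
standard_values  = [10.000, 20.000, 30.000, 40.000, 50.000]  # mm
instrument_reads = [10.012, 20.009, 30.015, 40.011, 50.014]  # mm

# Calibration error at each point (reading - standard)
errors = [r - s for r, s in zip(instrument_reads, standard_values)]
max_error = max(abs(e) for e in errors)

TOLERANCE = 0.020  # mm, acceptance limit assumed for this sketch
for s, e in zip(standard_values, errors):
    print(f"at {s:6.3f} mm: error {e:+.3f} mm")

# Decide whether adjustment is needed before issuing the certificate
needs_adjustment = max_error > TOLERANCE
print("within tolerance" if not needs_adjustment else "adjust instrument")
```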
UNIT 2
Measurement of Linear Dimensions
Linear dimensions refer to
measurements of length, width, height,
or any other distance along a straight
line. Accurate measurement of linear
dimensions is crucial in various
industries, including manufacturing,
engineering, and construction.
Instruments Used for Measuring
Linear Dimensions:
* Vernier Calipers:
* Versatile instrument for measuring
internal, external, and depth
dimensions.
* Provides higher accuracy than a
simple ruler.
* Uses a vernier scale to read fractions
of the main scale divisions.
* Micrometers:
* Used for precise measurements of
small distances.
* Employ a screw mechanism with a
thimble scale for accurate readings.
* Types include outside micrometers,
inside micrometers, and depth
micrometers.
* Height Gauges:
* Used to measure the height or
vertical distance of an object.
* Consist of a base, a vertical column,
and a measuring head.
* Commonly used in inspection and
assembly processes.
* Comparators:
* Used to compare the dimensions of a
part to a master or standard.
* Highly sensitive and capable of
detecting small variations.
* Types include mechanical, optical,
and electronic comparators.
* Profile Projectors:
* Project an enlarged image of the
object onto a screen.
* Used to inspect complex shapes and
contours.
* Enable precise measurement of
angles, radii, and other features.
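As an illustration of how a micrometer reading is built up, the sketch below uses the common metric micrometer parameters (0.5 mm spindle pitch, 50 thimble divisions, hence a least count of 0.01 mm); the example values are invented:

```python
# Outside micrometer reading (standard metric micrometer: 0.5 mm pitch,
# 50 thimble divisions, so least count = 0.5 / 50 = 0.01 mm)
PITCH = 0.5
THIMBLE_DIVISIONS = 50
least_count = PITCH / THIMBLE_DIVISIONS   # 0.01 mm

def micrometer_reading(main_scale_mm, thimble_division):
    """Total reading = main scale + thimble division x least count."""
    return main_scale_mm + thimble_division * least_count

# Example: main scale shows 7.5 mm, thimble reads division 23
reading = micrometer_reading(7.5, 23)
print(f"reading = {reading:.2f} mm")   # 7.5 + 23 * 0.01 = 7.73 mm
```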
Angular Dimensions
Angular dimensions refer to
measurements of angles, such as the
inclination of a surface or the angle
between two lines. Instruments used for
measuring angular dimensions include:
* Bevel Protractor:
* Used to measure and lay out angles.
* Consists of a base and a movable
arm with a protractor scale.
* Sine Bar:
* Used to measure and set angles with
high precision.
* Consists of a hardened and ground
steel bar with two cylindrical ends.
* Autocollimator:
* An optical instrument used to
measure small angles with high
accuracy.
* Measures the angular deviation of a
reflecting surface.
* Clinometer:
* Used to measure the angle of
inclination or elevation of a surface.
* Commonly used in surveying and
construction.
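The sine bar's working principle reduces to sin(theta) = h / L, where h is the slip-gauge stack height and L the distance between the roller centres. A minimal sketch, assuming a standard 200 mm sine bar:

```python
import math

def sine_bar_angle(gauge_height_mm, centre_distance_mm=200.0):
    """Angle set by a sine bar: sin(theta) = h / L."""
    return math.degrees(math.asin(gauge_height_mm / centre_distance_mm))

def gauge_height_for(angle_deg, centre_distance_mm=200.0):
    """Slip-gauge stack needed to set a given angle."""
    return centre_distance_mm * math.sin(math.radians(angle_deg))

# Example: a 200 mm sine bar with a 100 mm slip-gauge stack sets 30 degrees
print(f"{sine_bar_angle(100.0):.2f} deg")
print(f"{gauge_height_for(30.0):.3f} mm stack")
```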
Screw Thread Measurement
Screw threads are essential components
in many mechanical assemblies.
Accurate measurement of screw thread
parameters like pitch, diameter, and
angle is crucial for proper functioning.
Instruments used for screw thread
measurement include:
* Thread Micrometers:
* Specially designed micrometers for
measuring the major and minor
diameters of screw threads.
* Thread Gauges:
* Sets of gauges (go and no-go) used
to check if a thread meets specified
tolerances.
* Thread Measuring Wires:
* Used to measure the effective
diameter of a screw thread.
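For 60-degree metric threads, the three-wire method gives the effective diameter as E = M - 3d + 0.866025 p, where M is the measurement over the wires, d the wire diameter, and p the pitch; the "best" wire size is d = 0.57735 p. A sketch with an assumed measurement over wires:

```python
import math

def best_wire_size(pitch):
    """Best wire touches the flanks at the pitch line (60 deg metric thread)."""
    return pitch / (2 * math.cos(math.radians(30)))   # = 0.57735 * p

def effective_diameter(M, wire_dia, pitch):
    """Three-wire method, 60 deg thread: E = M - 3d + 0.866025 p."""
    return M - 3 * wire_dia + 0.866025 * pitch

# Example: M12 x 1.75 thread, measurement over wires M = 12.379 mm (assumed)
p = 1.75
d = best_wire_size(p)              # about 1.010 mm
E = effective_diameter(12.379, d, p)
print(f"wire {d:.3f} mm, effective diameter {E:.3f} mm")
```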
Conclusion
Accurate measurement of linear and
angular dimensions is fundamental in
various industries. By employing
appropriate instruments and techniques,
organizations can ensure the quality and
precision of their products and
processes.

Measurement of Gears
Gears are essential components in
various mechanical systems,
transmitting power and motion between
rotating shafts. Accurate measurement
of gear parameters is crucial for
ensuring proper meshing, smooth
operation, and overall performance.
Key Parameters Measured in Gears:
* Tooth Profile: The shape of the tooth
surface, which determines the contact
and load-carrying capacity. It is typically
measured using a profile projector or a
gear tooth vernier caliper.
* Tooth Thickness: The distance
between the two flanks of a tooth at a
specific point. It is measured using a
gear tooth caliper or a specialized
measuring machine.
* Runout: The radial variation of the
tooth profile from the ideal position. It is
measured using a dial indicator or a
runout measuring machine.
* Pitch: The distance between
corresponding points on adjacent teeth.
It is measured using a gear tooth caliper
or a pitch measuring machine.
* Helix Angle: The angle at which the
teeth are cut on helical gears. It is
measured using an angle gauge or a
specialized measuring machine.
* Backlash: The clearance between the
non-driving flanks of meshing teeth
when the gears are in mesh. It is
measured using a backlash gauge or a
specialized measuring machine.
* Rolling Gear Test: A dynamic test to
evaluate the meshing performance of
gears under load. It involves rolling the
gears together and measuring
parameters like noise, vibration, and
temperature.
Instruments Used for Gear
Measurement:
* Gear Tooth Calipers: Used for
measuring tooth thickness, pitch, and
other linear dimensions.
* Profile Projectors: Project an enlarged
image of the gear tooth profile onto a
screen for inspection and measurement.
* Runout Measuring Machines: Used to
measure the radial variation of the tooth
profile.
* Pitch Measuring Machines: Used to
measure the pitch of gears with high
accuracy.
* Angle Gauges: Used to measure the
helix angle of helical gears.
* Backlash Gauges: Used to measure
the clearance between meshing teeth.
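A gear tooth vernier caliper is set using the chordal thickness w = N m sin(90/N deg) and the chordal addendum d = m + (N m / 2)(1 - cos(90/N deg)) for a standard spur gear of module m and tooth count N. A sketch with assumed values:

```python
import math

def chordal_settings(module_mm, teeth):
    """Gear-tooth vernier caliper settings for a standard spur gear.
    Chordal thickness  w = N*m*sin(90/N deg)
    Chordal addendum   d = m + (N*m/2)*(1 - cos(90/N deg))
    """
    half_tooth_angle = math.radians(90.0 / teeth)
    w = teeth * module_mm * math.sin(half_tooth_angle)
    d = module_mm + (teeth * module_mm / 2) * (1 - math.cos(half_tooth_angle))
    return w, d

# Example: module 3 mm, 30-tooth spur gear (illustrative)
w, d = chordal_settings(3.0, 30)
print(f"chordal thickness {w:.3f} mm, chordal addendum {d:.3f} mm")
```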
Importance of Gear Measurement:
* Ensuring Quality: Accurate
measurement helps in identifying and
correcting defects in gear
manufacturing.
* Predicting Performance: Measured
parameters can be used to predict the
performance and durability of gears.
* Troubleshooting: Gear measurements
can help in identifying the root cause of
problems like noise, vibration, and wear.
* Meeting Specifications: Accurate
measurements are essential for
ensuring that gears meet the required
standards and specifications.
Conclusion
Accurate measurement of gear
parameters is crucial for ensuring the
quality, performance, and reliability of
gear drives. By employing appropriate
instruments and techniques,
manufacturers can ensure that gears
meet the required standards and
specifications.
UNIT 3

Tolerance Analysis
Tolerance analysis is a critical aspect of
engineering design and manufacturing.
It deals with determining the acceptable
variations in the dimensions and other
characteristics of a part or assembly to
ensure proper function and
interchangeability.
Tolerance Detail
* Types of Tolerances:
* Bilateral Tolerance: Specifies a
variation above and below a nominal
dimension. Example: 20 ± 0.1 mm.
* Unilateral Tolerance: Specifies a
variation in one direction only. Example:
20 +0.1/-0 mm.
* Terminology:
* Nominal Dimension: The ideal or
theoretical dimension of a part.
* Tolerance: The permissible variation
in a dimension.
* Upper Limit: The maximum
permissible dimension.
* Lower Limit: The minimum
permissible dimension.
* Basic Dimension: A theoretical
dimension used as a reference for other
dimensions.
* Limits and Fits:
* Limits: The maximum and minimum
permissible dimensions of a part.
* Fits: The relationship between the
mating parts, determined by the
difference in their dimensions.
Types of Fits
* Clearance Fit: The shaft is smaller
than the hole, resulting in a clearance
between them.
* Interference Fit: The shaft is larger
than the hole, resulting in an
interference between them.
* Transition Fit: The fit can be either
clearance or interference depending on
the actual dimensions of the parts.
Problems Based on Fits
Tolerance analysis is often used to solve
problems related to fits, such as:
* Determining the appropriate fit for a
specific application.
* Calculating the maximum and
minimum clearances or interferences.
* Ensuring interchangeability of parts.
* Minimizing the cost of manufacturing
and assembly.
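Classifying a fit from the limit dimensions follows directly from the definitions above. A minimal sketch with invented hole and shaft limits:

```python
def classify_fit(hole_low, hole_high, shaft_low, shaft_high):
    """Classify a hole/shaft pair from its limit dimensions (all in mm)."""
    max_clearance = hole_high - shaft_low
    min_clearance = hole_low - shaft_high   # negative => interference
    if min_clearance >= 0:
        kind = "clearance fit"
    elif max_clearance <= 0:
        kind = "interference fit"
    else:
        kind = "transition fit"
    return kind, max_clearance, min_clearance

# Example: hole 25.000-25.021 mm, shaft 24.980-24.993 mm (illustrative limits)
kind, cmax, cmin = classify_fit(25.000, 25.021, 24.980, 24.993)
print(f"{kind}: clearance ranges from {cmin:.3f} to {cmax:.3f} mm")
```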
Conclusion
Tolerance analysis is a critical tool for
ensuring the quality and functionality of
engineering components. By carefully
considering tolerances and fits,
engineers can optimize designs,
minimize manufacturing costs, and
improve the overall performance of
products.

1. Limit Gauges
* Details: Limit gauges are specialized
measuring tools designed to quickly
check if a part’s dimensions fall within
the specified tolerance limits. They are
typically used for mass production and
inspection.
* Go Gauge: Made to the maximum
material limit of the part; it should pass
(go) if the part is within tolerance.
* No-Go Gauge: Made to the minimum
material limit; it should not pass (not
go) if the part is within tolerance.
* Types:
* Plug Gauges: Used to check the
diameter of holes.
* Ring Gauges: Used to check the
diameter of shafts.
* Snap Gauges: Used to check the
thickness of objects.
* Thread Gauges: Used to check the
dimensions of screw threads.
2. Problems
* Tolerance Stack-up: When multiple
dimensions are combined in an
assembly, the individual tolerances can
accumulate, potentially leading to an
unacceptable overall variation. This is
known as tolerance stack-up.
* Process Capability: This refers to the
ability of a manufacturing process to
consistently produce parts within the
specified tolerance limits. Process
capability analysis helps to identify and
address potential issues that may lead
to excessive variation.
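Tolerance stack-up can be quantified two ways: worst case (individual tolerances add linearly) and statistically (root-sum-of-squares, which assumes independent, centred variation). A sketch with a hypothetical four-part stack:

```python
import math

# Hypothetical assembly: four parts stacked end to end, each 10 +/- 0.05 mm
tolerances = [0.05, 0.05, 0.05, 0.05]

# Worst case: individual tolerances simply add up
worst_case = sum(tolerances)                      # 0.20 mm

# Statistical (root-sum-of-squares): assumes independent, centred variation
rss = math.sqrt(sum(t**2 for t in tolerances))    # 0.10 mm

print(f"worst-case stack-up: +/- {worst_case:.2f} mm")
print(f"RSS stack-up:        +/- {rss:.2f} mm")
```

The RSS figure is smaller because it is unlikely that every part sits at its extreme limit at the same time.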
3. Tolerance Charting
* Tolerance Charting: A graphical
method used to visualize and analyze
the effects of tolerance stack-up. It
helps to identify critical dimensions that
contribute most to the overall variation
and to determine the necessary
adjustments to achieve the desired
assembly performance.
Conclusion
Tolerance analysis, including the use of
limit gauges, understanding tolerance
stack-up, and performing process
capability analysis, is crucial for
ensuring the quality and functionality of
engineering components. By carefully
considering tolerances and employing
appropriate techniques, manufacturers
can optimize designs, minimize
manufacturing costs, and improve the
overall performance of products.
UNIT 4
Metrology of Surfaces
Surface metrology deals with the
measurement and characterization of
surface features such as roughness,
waviness, and form errors. It plays a
crucial role in understanding the
functional performance of components,
especially in precision engineering and
manufacturing.
Fundamentals of GD&T
GD&T (Geometric Dimensioning and
Tolerancing) is a standardized system
for defining and communicating
engineering tolerances. It provides a
precise and unambiguous way to specify
the allowable variations in the shape,
orientation, and location of features on a
part.
Conventional vs. Geometric Tolerance
* Conventional Tolerance: Specifies only
the permissible variation in the size of a
feature. It doesn’t provide information
about the shape or orientation of the
feature.
* Geometric Tolerance: Specifies the
permissible variation in the shape,
orientation, or location of a feature,
independent of its size. It provides more
precise control over the functional
performance of the part.
Datums
Datums are theoretical reference
features used to define the location and
orientation of other features on a part.
They provide a stable and consistent
basis for measurement and inspection.
Surface Deviation
Surface deviations refer to the
variations in the shape, orientation, and
roughness of a surface compared to its
ideal or nominal form. Common types of
surface deviations include:
* Straightness Deviation: The deviation
of a line or surface from a true straight
line.
* Flatness Deviation: The deviation of a
surface from a true plane.
* Roundness Deviation: The deviation of
a circular feature from a true circle.
Conclusion
Surface metrology and GD&T are
essential tools for ensuring the quality
and functionality of engineering
components. By understanding and
applying these concepts, manufacturers
can achieve higher precision, improve
product performance, and reduce
manufacturing costs.

Surface Finish Measurement


Surface finish refers to the texture or
topography of a surface, encompassing
features like roughness, waviness, and
lay. It significantly influences the
performance and functionality of
components in various applications.
Surface Parameters
Several parameters are used to quantify
surface finish:
* Roughness: Measures the small-scale
irregularities on the surface, such as
peaks and valleys. It is typically
quantified by parameters like:
* Ra (Average Roughness): The
average deviation of the surface profile
from the mean line.
* Rq (Root Mean Square Roughness):
The root mean square of the surface
profile deviations.
* Rz (Mean Roughness Depth): The
average peak-to-valley height of the
profile, taken over several sampling
lengths.
* Waviness: Refers to the larger-scale
undulations on the surface. It is typically
quantified by parameters like:
* Waviness Height: The peak-to-valley
distance of the waviness profile.
* Waviness Wavelength: The average
distance between two successive peaks
or valleys in the waviness profile.
* Lay: Describes the direction and
orientation of the surface texture
features. It is typically characterized by
the direction of the predominant pattern
of surface irregularities.
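Ra and Rq follow directly from their definitions. The profile deviations below are invented for illustration:

```python
import math

# Hypothetical profile deviations from the mean line (micrometres),
# sampled at equal spacing along the surface
profile = [0.8, -0.5, 1.2, -1.0, 0.6, -0.7, 0.9, -1.1]

n = len(profile)
Ra = sum(abs(z) for z in profile) / n                 # average roughness
Rq = math.sqrt(sum(z * z for z in profile) / n)       # RMS roughness

print(f"Ra = {Ra:.3f} um, Rq = {Rq:.3f} um")  # Rq >= Ra always holds
```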
Measurement Techniques
* Stylus-Based Techniques:
* Profilometer: A mechanical
instrument that uses a stylus to trace
the surface profile. It measures the
vertical displacement of the stylus as it
moves across the surface.
* Surface Roughness Tester: A
computerized instrument that uses a
stylus and a transducer to measure
surface roughness parameters.
* Optical Techniques:
* Optical Profiler: Uses optical
interferometry or confocal microscopy to
measure surface topography with high
resolution.
* Laser Scanning Confocal Microscopy:
Uses a focused laser beam to scan the
surface and generate a 3D image.
3D Surface Metrology
3D surface metrology involves
measuring and analyzing the surface
topography in three dimensions,
providing a more comprehensive
understanding of surface features. It
enables the characterization of complex
surfaces with features like curvatures
and slopes.
Conclusion
Surface finish measurement is crucial
for ensuring the quality and
performance of engineering
components. By employing appropriate
techniques and analyzing surface
parameters, manufacturers can optimize
surface properties, improve component
performance, and enhance product
reliability.

UNIT IV: STATISTICAL QUALITY CONTROL
* Surface Finish: This refers to the
texture or topography of a surface. It
encompasses features like roughness,
waviness, and lay. Surface finish
significantly influences the performance
and functionality of components in
various applications.
* Terminology: In the context of surface
finish, terminology refers to the specific
terms and definitions used to describe
and quantify surface characteristics.
These terms include:
* Roughness: Small-scale
irregularities on the surface, like peaks
and valleys.
* Waviness: Larger-scale undulations
on the surface.
* Lay: The direction and orientation of
the surface texture features.
* Measurements: Surface finish is
measured using various techniques,
including:
* Stylus-based techniques: Using a
stylus that mechanically traces the
surface profile.
* Optical techniques: Employing
optical methods like interferometry or
confocal microscopy to capture surface
topography.
* Optical Measuring Instruments: These
instruments utilize optical principles to
measure surface finish. Examples
include:
* Optical Profilers: Use interferometry
or confocal microscopy to create 3D
images of the surface.
* Laser Scanning Confocal Microscopy:
Uses a focused laser beam to scan the
surface and generate a 3D image.
* Acceptance Test: This is a procedure
used to determine whether a product,
material, or system conforms to
specified requirements. It involves
inspecting and testing the product to
ensure it meets the acceptance criteria.
* Machines: In this context, machines
refer to the equipment used in
manufacturing and production
processes. Acceptance testing may
involve evaluating the performance and
functionality of these machines.
* Statistical Quality Control: This is a set
of statistical techniques used to monitor
and control the quality of products and
processes. It involves collecting and
analyzing data to identify and address
sources of variation that can affect
quality.
* Control: Refers to the activities and
processes implemented to maintain a
desired level of quality and prevent
defects.
* Control Charts: Graphical tools used
to monitor process variation over time.
They help to identify trends and
patterns that may indicate a process is
out of control.
* Sampling Plans: A set of rules that
determine the number of samples to be
inspected and the acceptance criteria
for a batch of products.
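A common control chart is the X-bar chart, whose limits are the grand mean plus or minus A2 times the mean subgroup range (A2 = 0.577 is the standard constant for subgroups of five). The subgroup data below are hypothetical:

```python
# X-bar control chart limits from subgroup data (subgroup size n = 5)
A2 = 0.577   # standard control-chart constant for n = 5

subgroup_means  = [10.02, 9.98, 10.01, 10.03, 9.97, 10.00]
subgroup_ranges = [0.05, 0.04, 0.06, 0.05, 0.04, 0.06]

grand_mean = sum(subgroup_means) / len(subgroup_means)
mean_range = sum(subgroup_ranges) / len(subgroup_ranges)

ucl = grand_mean + A2 * mean_range   # upper control limit
lcl = grand_mean - A2 * mean_range   # lower control limit

print(f"centre line {grand_mean:.4f}, UCL {ucl:.4f}, LCL {lcl:.4f}")

# A point outside [LCL, UCL] signals the process may be out of control
out_of_control = [m for m in subgroup_means if not lcl <= m <= ucl]
print("out-of-control points:", out_of_control)
```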

UNIT 5
Advances in Metrology
Metrology, the science of measurement,
is constantly evolving with
advancements in technology and
research. These advancements enable
more precise, accurate, and efficient
measurements, leading to significant
improvements in various fields like
manufacturing, engineering, and
science.
Key Advancements in Metrology:
* Laser Interferometry: Laser
interferometers are highly precise
instruments that use the interference of
laser beams to measure distances with
extremely high accuracy. They are
widely used in dimensional metrology,
calibration laboratories, and scientific
research.
* Coordinate Measuring Machines
(CMMs): CMMs are versatile measuring
systems that use touch probes or non-
contact sensors to measure the
dimensions and geometry of complex
parts. Advancements in CMM technology
include:
* Increased accuracy and precision:
Through the use of advanced sensors,
improved control systems, and
environmental compensation.
* Non-contact measurement
techniques: Such as laser scanning and
optical interferometry, enabling the
measurement of complex surfaces and
fragile parts.
* Integration with CAD/CAM systems:
Allowing for direct comparison of
measured data with design models.
* Machine Vision: Machine vision
systems use cameras and image
processing algorithms to analyze and
interpret visual information. They are
used in metrology for tasks such as:
* Dimensional inspection: Measuring
the size, shape, and orientation of
objects.
* Surface inspection: Detecting
defects and irregularities on surfaces.
* Pattern recognition: Identifying and
classifying objects based on their visual
characteristics.
* X-ray Computed Tomography (CT): CT
scanning uses X-rays to create 3D
images of objects. It is used in
metrology for:
* Internal measurement: Measuring
the dimensions and geometry of internal
features of objects.
* Defect detection: Identifying internal
defects such as cracks and voids.
* Reverse engineering: Creating 3D
models of complex objects.
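In a fringe-counting laser interferometer, each counted fringe corresponds to half a wavelength of mirror travel, so displacement d = N * lambda / 2. A sketch assuming a helium-neon laser (632.8 nm):

```python
# Displacement from fringe counting in a Michelson-type laser interferometer:
# each fringe corresponds to half a wavelength of travel, d = N * lambda / 2
HE_NE_WAVELENGTH_NM = 632.8   # helium-neon laser wavelength

def displacement_um(fringe_count):
    """Mirror displacement in micrometres for N counted fringes."""
    return fringe_count * (HE_NE_WAVELENGTH_NM / 2) / 1000.0

print(f"{displacement_um(1000):.2f} um")  # 1000 fringes -> 316.40 um
```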
Impact of Advances in Metrology:
* Improved Product Quality: More
accurate measurements enable
manufacturers to produce parts with
higher precision and consistency.
* Increased Efficiency: Advanced
metrology techniques can automate
inspection processes, reducing labor
costs and improving productivity.
* Enhanced Innovation: Advancements
in metrology enable the development of
new materials, processes, and products
with improved performance.
* Improved Research: Precise
measurements are essential for
scientific research in fields such as
physics, chemistry, and materials
science.
Conclusion
Advancements in metrology are
continuously transforming the way we
measure and understand the world
around us. By leveraging these
technologies, we can achieve higher
levels of precision, accuracy, and
efficiency in various fields, leading to
significant advancements in science,
engineering, and manufacturing.
UNIT V: SIX SIGMA
* Six Sigma: A data-driven methodology
aimed at significantly improving the
quality of processes and products by
reducing defects and variability.
* Define: The first phase of the DMAIC
cycle, where the project scope, goals,
and objectives are defined.
* Measure: The second phase of DMAIC,
where data is collected and analyzed to
understand the current process
performance.
* Analyze: The third phase of DMAIC,
where data is analyzed to identify the
root causes of defects and problems.
* Improve: The fourth phase of DMAIC,
where solutions are implemented to
address the identified root causes and
improve the process.
* Control: The fifth and final phase of
DMAIC, where changes are implemented
to maintain the improved process
performance and prevent future
problems.
* Analyze Phase Tools: These are tools
used in the Analyze phase to analyze
data and identify root causes.
* Common Tools:
* Histogram: A bar graph that shows
the frequency distribution of data.
* Box Plot: A graphical representation
of the distribution of data, showing the
median, quartiles, and outliers.
* Control Chart: A graphical tool used
to monitor process variation over time.
* Scatter Chart: A graph that shows
the relationship between two variables.
* Cause and Effect Diagram: A
graphical tool used to identify potential
causes of a problem.
* Pareto Analysis: A technique for
identifying and prioritizing the most
important causes of problems.
* Interrelationship Diagram: A
graphical tool used to identify and
analyze relationships between different
factors.
* Special Tools:
* Regression Analysis: A statistical
method used to model the relationship
between a dependent variable and one
or more independent variables.
* Hypothesis Testing: A statistical
method used to test hypotheses about
population parameters.
* ANOVA (Analysis of Variance): A
statistical method used to compare the
means of two or more groups.
* Multivariate Analysis: A set of
statistical methods used to analyze data
with multiple variables.
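Pareto analysis from the list above can be sketched as follows; the defect categories and counts are invented:

```python
# Pareto analysis of hypothetical defect counts: rank causes and find the
# "vital few" that account for roughly 80 % of all defects
defects = {"scratches": 120, "misalignment": 45, "burrs": 280,
           "porosity": 30, "dimensional error": 25}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
vital_few = []
for cause, count in ranked:
    cumulative += count
    vital_few.append(cause)
    print(f"{cause:18s} {count:4d}  cumulative {100 * cumulative / total:5.1f} %")
    if cumulative / total >= 0.80:
        break

print("vital few causes:", vital_few)
```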

2-mark questions
UNIT I: BASICS OF MEASUREMENT
SYSTEM AND DEVICES
* What is Metrology?
* Answer: Metrology is the science of
measurement. It deals with the
principles and techniques of obtaining,
analyzing, and interpreting
measurements.
* Differentiate between Accuracy and
Precision.
* Answer:
* Accuracy: Refers to how close a
measurement is to the true value.
* Precision: Refers to the
repeatability of a measurement. A
precise measurement gives consistent
results, even if they are not accurate.
* What is Abbe’s principle?
* Answer: Abbe’s principle states that
for maximum accuracy, the axis of the
measuring scale should be collinear
with the axis of the dimension being
measured; any offset between them
introduces an error that grows with
angular misalignment.
* What are the three stages of a
generalized measurement system?
* Answer:
* Sensing (detector-transducer) stage:
Detects the physical quantity and
converts it into a measurable signal.
* Intermediate modifying stage:
Conditions and transmits the signal to
the output unit.
* Terminating stage: Processes and
presents the signal as the desired
output.
* What are some common sources of
error in measurement?
* Answer:
* Systematic errors: Consistent and
predictable errors (e.g., instrument
calibration errors).
* Random errors: Unpredictable and
fluctuating errors (e.g., environmental
variations).
* Human errors: Errors due to
observer bias, fatigue, or incorrect
reading.
UNIT II: CALIBRATION OF INSTRUMENTS
AND QUALITY STANDARDS
* What is calibration?
* Answer: Calibration is the process of
comparing a measuring instrument
against a known standard to determine
its accuracy and correct any deviations.
* Why is calibration important?
* Answer: Calibration ensures that
measuring instruments provide accurate
and reliable results, which is crucial for
quality control and product consistency.
* Mention some commonly used
instruments for calibration.
* Answer: Vernier caliper, Micrometer,
Feeler gauges, Dial indicator, Surface
plates, Slip gauges.
* What are ISO 9000 quality standards?
* Answer: ISO 9000 is a family of
international standards that provide a
framework for quality management
systems. They help organizations ensure
that their products and services meet
customer requirements and consistently
improve.
UNIT III: GEOMETRICAL MEASUREMENT
AND MACHINE ELEMENTS
* What is angular measurement?
* Answer: Angular measurement is the
process of determining the size of an
angle.
* What is a sine bar?
* Answer: A sine bar is a precision
measuring instrument used to set and
measure angles with high accuracy.
* What is the purpose of a limit gauge?
* Answer: Limit gauges are used to
quickly check if a part’s dimensions fall
within the specified tolerance limits.
* What is GD&T?
* Answer: GD&T stands for Geometric
Dimensioning and Tolerancing. It is a
standardized system for defining and
communicating engineering tolerances.
UNIT IV: STATISTICAL QUALITY CONTROL
* What are control charts?
* Answer: Control charts are graphical
tools used to monitor the variation in a
process over time. They help to identify
trends and patterns that may indicate a
process is out of control.
* What are sampling plans?
* Answer: Sampling plans are used to
determine the number of samples to be
inspected and the acceptance criteria
for a batch of products.
UNIT V: SIX SIGMA
* What are the phases of Six Sigma?
* Answer: Define, Measure, Analyze,
Improve, Control (DMAIC).
* What is a histogram?
* Answer: A histogram is a graphical
representation of the distribution of
data. It shows the frequency of data
points within certain intervals.
A second set of 2-mark questions
UNIT I: BASICS OF MEASUREMENT
SYSTEM AND DEVICES
* What is Sensitivity of an instrument?
* Answer: Sensitivity is the ratio of the
change in output signal of the
measuring instrument to the
corresponding change in the input
quantity.
* Explain the difference between
Systematic and Random errors.
* Answer:
* Systematic Errors: Consistent and
repeatable errors, always occurring in
the same direction. They can be
identified and corrected. Examples:
Calibration errors, environmental
effects.
* Random Errors: Unpredictable and
fluctuating errors, occurring in both
directions. They cannot be easily
eliminated but can be reduced by taking
multiple readings and averaging.
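The effect of averaging can be sketched by simulation. Assuming a zero-mean Gaussian random error (the true value and sigma below are hypothetical), the mean of n readings has standard deviation σ/√n, so the random scatter shrinks as n grows:

```python
import random
import statistics

random.seed(42)
true_value = 25.000  # mm, hypothetical true dimension
sigma = 0.01         # mm, hypothetical spread of the random error

def one_reading() -> float:
    # Random error only: zero-mean Gaussian noise added to the true value.
    return true_value + random.gauss(0.0, sigma)

single = one_reading()
mean_of_30 = statistics.mean(one_reading() for _ in range(30))
# The mean of 30 readings has standard deviation sigma / sqrt(30),
# roughly 5.5x smaller scatter than a single reading.
print(abs(single - true_value), abs(mean_of_30 - true_value))
```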
UNIT II: CALIBRATION OF INSTRUMENTS
AND QUALITY STANDARDS
* What is Traceability in calibration?
* Answer: Traceability refers to the
ability to link a measurement result to
national or international standards
through an unbroken chain of
comparisons.
* What is the purpose of a Calibration
Certificate?
* Answer: A calibration certificate
provides documentary evidence of the
accuracy and performance of a
measuring instrument at a specific time.
It is used to establish confidence in
measurement results.
UNIT III: GEOMETRICAL MEASUREMENT
AND MACHINE ELEMENTS
* What is a CMM?
* Answer: CMM stands for Coordinate
Measuring Machine. It is a versatile
measuring system used to measure the
dimensions and geometry of complex
parts.
* What is the difference between a Plug
Gauge and a Ring Gauge?
* Answer:
* Plug Gauge: Used to check the
inside diameter of a hole.
* Ring Gauge: Used to check the
outside diameter of a shaft.
UNIT IV: STATISTICAL QUALITY CONTROL
* What is a Histogram?
* Answer: A histogram is a graphical
representation of the distribution of
data. It shows the frequency of data
points within certain intervals.
* What is the purpose of a Control
Chart?
* Answer: Control charts are used to
monitor the variation in a process over
time. They help to identify trends and
patterns that may indicate a process is
out of control.
UNIT V: SIX SIGMA
* What is the goal of Six Sigma?
* Answer: The goal of Six Sigma is to
achieve near-perfect quality by reducing
defects and variability in processes. It
aims for a defect level of no more than
3.4 defects per million opportunities
(DPMO).
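The "3.4 defects per million opportunities" figure is an instance of the DPMO metric, which is a straightforward calculation (the batch numbers below are hypothetical):

```python
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities:
    DPMO = defects / (units * opportunities per unit) * 1,000,000."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical batch: 17 defects found in 5000 units,
# each unit having 4 opportunities for a defect.
print(dpmo(17, 5000, 4))  # ~850 DPMO, far from the Six Sigma target of 3.4
```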
* What is a Pareto Diagram?
* Answer: A Pareto diagram is a bar
chart that ranks problems or causes of
defects in order of their importance. It
helps to focus improvement efforts on
the most critical issues.
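The ranking and cumulative-percentage idea behind a Pareto diagram can be sketched with hypothetical defect counts:

```python
# Hypothetical defect tally from an inspection report.
defect_counts = {"scratches": 48, "misalignment": 27, "burrs": 15,
                 "porosity": 7, "other": 3}
total = sum(defect_counts.values())

# Rank causes from most to least frequent, then accumulate percentages;
# the first bars typically account for most of the defects (80/20 rule).
ranked = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
cumulative = 0
for cause, count in ranked:
    cumulative += count
    print(f"{cause}: {count} ({100 * cumulative / total:.0f}% cumulative)")
```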
THIRD SET OF 2-MARK QUESTIONS
UNIT I: BASICS OF MEASUREMENT
SYSTEM AND DEVICES
* What is the difference between
sensitivity and resolution of an
instrument?
* Answer: Sensitivity refers to the ratio
of the change in output signal to the
corresponding change in the input
quantity. Resolution refers to the
smallest change in the input quantity
that can be detected by the instrument.
* What are the characteristics of a good
measuring instrument?
* Answer: Accuracy, Precision,
Sensitivity, Repeatability, Reliability,
Range, Resolution, Ergonomics.
UNIT II: CALIBRATION OF INSTRUMENTS
AND QUALITY STANDARDS
* What is the difference between
calibration and verification?
* Answer: Calibration is the process of
comparing a measuring instrument
against a known standard and adjusting
it to correct any deviations. Verification
is the process of checking if a measuring
instrument meets specified
requirements without necessarily
making adjustments.
* What are the benefits of
implementing ISO 9000 standards?
* Answer: Improved customer
satisfaction, increased efficiency,
reduced costs, enhanced
competitiveness, and improved
employee morale.
UNIT III: GEOMETRICAL MEASUREMENT
AND MACHINE ELEMENTS
* What is the difference between a
comparator and a micrometer?
* Answer: A comparator is used to
compare the dimensions of a part to a
master or standard. A micrometer is
used to directly measure linear
dimensions.
* What is the purpose of a gear tooth
vernier caliper?
* Answer: It is used to measure the
chordal thickness of a gear tooth at a
specified depth from the tooth tip (the
chordal addendum).
UNIT IV: STATISTICAL QUALITY CONTROL
* What is a process capability study?
* Answer: A process capability study is
used to determine if a manufacturing
process is capable of producing parts
within the specified tolerance limits.
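A capability study typically boils down to the indices Cp and Cpk (the tolerance and process figures below are hypothetical); Cp ≥ 1.33 is a common benchmark for a capable process:

```python
def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process capability index Cp = (USL - LSL) / (6 * sigma):
    how wide the tolerance band is relative to the process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mean: float, sigma: float) -> float:
    """Cpk also penalises an off-centre process mean:
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical shaft: tolerance 10.00 +/- 0.06 mm,
# process mean 10.01 mm, process sigma 0.015 mm.
print(cp(10.06, 9.94, 0.015))          # ~1.33 - spread fits the tolerance
print(cpk(10.06, 9.94, 10.01, 0.015))  # ~1.11 - off-centre mean reduces capability
```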
* What is a sampling plan?
* Answer: A sampling plan is a set of
rules that determines the number of
samples to be inspected and the
acceptance criteria for a batch of
products.
UNIT V: SIX SIGMA
* What is a Fishbone diagram?
* Answer: A Fishbone diagram (also
known as a cause-and-effect diagram) is
a graphical tool used to identify the root
causes of problems.
* What is a Control Chart?
* Answer: Control charts are graphical
tools used to monitor the variation in a
process over time. They help to identify
trends and patterns that may indicate a
process is out of control.