MSA Training RevAF
Supplier Quality
Introduction to MSA
Lise Robert SQS (Supplier Quality Specialist)
Rev. 01 – Dec 17th 2018
Introduction to Measurement System Analysis (MSA)
Every day our lives are impacted by more and more data. We have become a
data-driven society.
In business and industry, we are using data in more ways than ever before.
Today manufacturing companies gather massive amounts of information through
measurement and inspection. When this measurement data is used to make
decisions regarding the process and the business in general, it is vital that the data is
accurate. If there are errors in our measurement system, we will be making decisions
based on incorrect data, and we could be making incorrect decisions or producing non-
conforming parts. A properly planned and executed Measurement System Analysis
(MSA) can help build a strong foundation for any data-based decision-making
process.
What is Measurement System Analysis (MSA)
Variation
Think of Measurement as a
Process
What is a Measurement System?
Measurement
The assignment of numbers to material things to represent the relationships
among them with respect to particular properties.
C. Eisenhart (1963)
What is a Measurement System?
An effective MSA process can help assure that the data being
collected is accurate and that the system used to collect it is
appropriate to the process.
Good, reliable data can prevent wasted time, labor, and scrap in a
manufacturing process.
Why Perform Measurement System Analysis (MSA)
Example
A major manufacturing company began receiving calls from several of its customers
reporting non-compliant material received at their facilities. The parts were not
snapping together properly to form an even surface, or would not lock in place.
An audit of the process found that the parts were being produced out of spec.
The operator was following the inspection plan and using the assigned gages for the
inspection. The problem was that the gage did not have adequate resolution to detect
the non-conforming parts.
An ineffective measurement system can allow bad parts to be accepted and good
parts to be rejected, resulting in dissatisfied customers and excessive scrap. MSA
could have prevented the problem and assured that accurate, useful data was being
collected.
How to Perform Measurement System Analysis (MSA)
Data Classifications
Prior to analyzing the data and/or the gages, tools, or fixtures, we
must determine the type of data being collected. The data could be
attribute data or variable data.
Attribute data is classified into specific values, whereas variable (continuous)
data can have an infinite number of values.
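As a minimal illustration of the distinction (with invented readings), the two data types are summarized differently:

```python
# Attribute data: discrete classifications (e.g. pass/fail results).
# Variable data: continuous measurements that can take any value in a range.
# All values below are invented for illustration.

attribute_data = ["pass", "fail", "pass", "pass"]   # hypothetical go/no-go results
variable_data = [10.02, 9.98, 10.01, 9.97]          # hypothetical diameters in mm

# Attribute data is summarized by counting categories:
fail_rate = attribute_data.count("fail") / len(attribute_data)

# Variable data is summarized with continuous statistics:
mean_dia = sum(variable_data) / len(variable_data)

print(fail_rate)             # 0.25
print(round(mean_dia, 4))    # 9.995
```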
How to Perform Measurement System Analysis (MSA)
The resulting Gage R & R percentage is used as a basis for accepting the
gage. Guidelines for making the determination are found below:
The measurement system is acceptable if the Gage R & R score falls below 10%
The measurement system may be determined acceptable depending upon the
relative importance of the application or other factors if the Gage R & R falls
between 10% and 30%
Any measurement system with Gage R & R greater than 30% requires action to
improve
Any actions identified to improve the measurement system should be
evaluated for effectiveness
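A small sketch of how these guidelines might be applied in practice. The thresholds follow the common AIAG convention of 10% and 30%; the function name is an invention for illustration:

```python
# Hypothetical sketch: classify a measurement system from its %GRR score
# using the 10% / 30% acceptance guidelines.

def classify_grr(grr_percent):
    """Return an acceptance decision for a Gage R&R percentage."""
    if grr_percent < 10:
        return "acceptable"
    elif grr_percent <= 30:
        return "conditionally acceptable (depends on application importance)"
    else:
        return "unacceptable - improve the measurement system"

print(classify_grr(8))    # acceptable
print(classify_grr(18))   # conditionally acceptable (depends on application importance)
print(classify_grr(35))   # unacceptable - improve the measurement system
```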
How to Perform Measurement System Analysis (MSA)
Variable data – Data that can be measured; data that has a value that can vary from one sample to the next; continuous
variable data can have an infinite number of values
Bias – Difference between the average or mean observed value and the target value
Linearity – A change in bias value within the range of normal process operation
Resolution – Smallest unit of measure of a selected tool, gage, or instrument; the sensitivity of the measurement system to
process variation for a particular characteristic being measured
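The bias and linearity definitions above can be sketched numerically. The reference values and readings below are invented for illustration; a nonzero slope of bias against reference value indicates a linearity problem:

```python
# Hypothetical sketch: bias at several reference (master) values, and
# linearity as the least-squares slope of bias vs. reference value.

references = [2.0, 4.0, 6.0]                       # invented master values
readings = {2.0: [2.1, 2.0, 2.2],                  # invented repeated readings
            4.0: [4.2, 4.1, 4.3],
            6.0: [6.3, 6.4, 6.2]}

biases = []
for ref in references:
    obs = readings[ref]
    biases.append(sum(obs) / len(obs) - ref)       # bias = mean observed - reference

# Linearity check: slope of bias vs. reference
# (a nonzero slope means bias changes across the operating range)
n = len(references)
mx = sum(references) / n
my = sum(biases) / n
slope = sum((x - mx) * (y - my) for x, y in zip(references, biases)) / \
        sum((x - mx) ** 2 for x in references)

print([round(b, 2) for b in biases])  # [0.1, 0.2, 0.3]
print(round(slope, 3))                # 0.05 -> bias grows with size: linearity issue
```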
Key terms and definitions
Accuracy – The closeness of the data to the target or exact value or to an accepted
reference value
Repeatability – A measure of the effectiveness of the tool being used; the variation of
measurements obtained by a single operator using the same tool to measure the same
characteristic
Reproducibility – A measure of the operator variation; the variation in a set of data collected
by different operators using the same tool to measure the same part characteristic
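A minimal numeric sketch of these two definitions, with invented readings. Note this toy example uses one part and a simple range for reproducibility; a real AIAG study uses the average-and-range or ANOVA method over multiple parts and trials:

```python
# Hypothetical sketch: repeatability (within-operator variation) vs.
# reproducibility (between-operator variation) on one part measured
# three times by each of two operators. All readings invented.
import statistics

measurements = {
    "operator_A": [10.01, 10.03, 10.02],
    "operator_B": [10.06, 10.08, 10.07],
}

# Repeatability: spread of repeated readings by the same operator, same tool
repeatability = {op: statistics.stdev(vals) for op, vals in measurements.items()}

# Reproducibility: spread between the operator averages
operator_means = [statistics.mean(v) for v in measurements.values()]
reproducibility = max(operator_means) - min(operator_means)

print({op: round(s, 3) for op, s in repeatability.items()})  # each ~0.01
print(round(reproducibility, 3))                             # 0.05
```

Here each operator is individually consistent (small repeatability) but the two disagree by a systematic offset (reproducibility), which is exactly the distinction the definitions draw.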
Measurement Systems
Analysis
Measurement Systems Analysis
Measurement as a Process
Mechanical Aspects (vs Destructive)
Piece part
Continuous (fabric)
Features of a Measurement System
Methods of Analysis
Gauge R&R Studies
Special Gauging Situations
Go/No-Go
Destructive Tests
Place Timeline Here
The Target & Goal: Continuous Improvement
(Figure: process distributions narrowing within LSL and USL across the Prototype, Pre-Launch, and Production phases)
Key Words
Discrimination
Ability to tell things apart
Bias [per AIAG] (Accuracy)
Repeatability [per AIAG] (Precision)
Reproducibility
Linearity
Stability
Terminology
Error ≠ Mistake
Error ≠ Uncertainty
Percentage Error ≠ Percentage Uncertainty
Accuracy ≠ Precision
Measurement Uncertainty
Typical Reports
Measurement as a Process
Basic Concepts
Components of the Measurement System
Requirements of a Measurement System
Factors Affecting a Measurement System
Characteristics of a Measurement System
Features (Qualities) of a Measurement Number
Units (Scale)
Accuracy
Precision (Consistency or Repeatability)
Resolution (Discrimination)
Measurement Related Systems
Material to be Inspected
Piece
Continuous
Characteristic to be Measured
Collecting and Preparing Specimens
Type and Scale of Measurement
Instrument or Test Set
Inspector or Technician
AIAG calls these ‘Appraisers’
Conditions of Use
Where Does It Start?
Determine Required Resolution – Product Engineer
How will the data be used?
Consideration of the Entire Measurement System for the Characteristic (Variables) – Cross-Functional
Determine What Equipment is Already Available – Metrology
Measurement Systems Variables (fishbone categories)
Material – Sample Collection, Sample Preparation, Samples
Inspector – Eyesight, Training, Workmanship, Practice, Parallax, Fatigue, Ergonomics, Reproducibility
Methods – Test Method, Standards, Measurement
Instrument – Fixture, Discrimination, Bias, Repeatability, Calibration, Linearity
Environment – Air Pressure, Air Movement, Vibration, Temperature, Lighting, Humidity
Voice of the Customer
External Requirements and Internal Requirements must be converted to Technical Features
Technical Features feed Failure Modes Analysis and the Control Plan
Design FMEA
Process FMEA
Identify Key Features
Identify Control Needs
Process FMEA example (excerpt) – Part Name: SIR
Columns: Operation Number / Process Function | Potential Failure Mode | Potential Effects of Failure | Potential Cause of Failure | Current Controls | Occ | Sev | Det | RPN | Recommended Actions and Status | Actions Taken | Responsible Activity

Op 1 – Take TPPE Container From Storage Area:
Wrong Material (Misidentified Material) | Fragmented Container, Unpredictable Deployment | Insufficient Supplier Control, Improper Handling | Material Certification Required With Each Shipment, Release Verification | Occ 1, Sev 9, Det 2, RPN 18
Out Of Spec Material | Fragmented Container, Unpredictable Deployment | Supplier Process Control | Periodic Audit Of Supplier Material | Occ 3, Sev 10, Det 3, RPN 90
Contaminated Material | Fragmented Container, Unpredictable Deployment | Open Boxes | Visual Inspection | Occ 1, Sev 9, Det 7, RPN 63
Material Composition Change | Fragmented Container, Unpredictable Deployment | Engineering Change, Supplier Change | Release Verification, Green "OK" Tag, Customer Notification | Occ 1, Sev 10, Det 7, RPN 70

Op 2 – Move To Inspection:
Unreleased Material | Fragmentation | Untrained LTO | Check For Green "OK" Tag | Occ 5, Sev 10, Det 1, RPN 50
Inspection Points
Inspection Frequency
Instrument
Measurement Scale
Sample Preparation
Inspection/Test Method
Inspector (who?)
Method of Analysis
GM Process Flow Chart
Process Flow Diagram (Approved By:)
Symbols: Inspect, Move, Store

Step 1 (Move): Move "OK" Vinyl Material From Storage Area and Load Into Press – Key Product Characteristic 1.0: Material Specs – Key Control Characteristic 1.0: Material Certification Tag
Step 2 (Operation): Auto Injection Mold Cover In Tool # – Key Product Characteristic 2.0: Tearstrip In Cover – Key Control Characteristics 2.1: Tool Setup, 2.2: Machine Setup
Step 3 (Inspect): Visually Inspect Cover – Key Product Characteristic 6.0: Pressure Control Protrusions Filled Out – Key Control Characteristics 2.1: Tool Setup, 2.2: Machine Setup
Standard Control Plan Example
Header fields: Control Plan Number; Key Contact / Phone; Date (Orig.); Date (Rev.); Part No. / Latest Change No.; Core Team; Customer Engineering Approval/Date; Supplier/Plant; Supplier Code; Other Approval/Date (If Req'd)
Columns: Part/Process Number; Process Name / Operation Description; Machine, Device, Jig, Tools for Mfg.; Characteristics (No., Product, Process, Special Char. Class); Product/Process Spec/Tolerance; Evaluation Measurement Technique; Sample Size; Sample Frequency; Control Method; Reaction Plan
Sensitivity (Threshold)
Chemical Indicators
Discrimination
Precision (Repeatability)
Accuracy (Bias)
Damage
Differences in use by Inspector (Reproducibility)
Training Issues
Differences Among Instruments and Fixtures
Differences Among Methods of Use
Differences Due to Environment
Types of Measurement Scales
Variables
Can be measured on a continuous scale
Defined, standard Units of Measurement
Attributes
No scale
Derived ‘Unit of Measurement’
Can be observed or counted
Either present or not
Needs large sample size because of low information content
How We Get Data
Test
Operational Definitions
(Figure: Method 1 vs. Method 2 – without an operational definition, the same result can be judged Out of Spec by one method and In Spec by the other)
Measurement System Variability
Environmental Factors
Human Factors
System Features
Measurement Studies
Environmental Factors
Temperature
Humidity
Vibration
Lighting
Corrosion
Wear
Contaminants
Oil & Grease
Aerosols

Where is the study performed?
1. Lab?
2. Where used?
3. Both?
Human Factors
Training
Skills
Fatigue
Boredom
Eyesight
Comfort
Complexity of Part
Speed of Inspection (parts per hour)
Misunderstood Instructions
Unaware of problem

Human Measurement Errors – Sources of Errors
Discrimination
Indicates Poor Precision
Bias and Repeatability
(Figure: accuracy vs. precision grid – precise & accurate; imprecise & accurate; precise but inaccurate (bias); imprecise & inaccurate)
Error in Master
Worn components
Instrument improperly calibrated
Instrument damaged
Instrument improperly used
Instrument read incorrectly
Part set incorrectly (wrong datum)
Bias
The auditor may want evidence that the concept of bias is understood.
Remember that bias is essentially an offset from ‘zero’. Bias is linked to
stability in the sense that an instrument may be ‘zeroed’ during
calibration verification; knowing this, we can deduce that bias changes
with instrument use. This is, in part, the concept of drift.
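The offset-and-drift idea above can be sketched numerically (all readings and the reference value are invented):

```python
# Hypothetical sketch: bias is an offset from the reference ("zero"), and
# drift is that offset changing over time between calibrations.
import statistics

reference = 25.00  # invented certified master value

# Readings of the same master shortly after calibration, then weeks later
readings_after_cal = [25.01, 24.99, 25.00, 25.02]
readings_weeks_later = [25.06, 25.08, 25.07, 25.05]

bias_initial = statistics.mean(readings_after_cal) - reference
bias_later = statistics.mean(readings_weeks_later) - reference
drift = bias_later - bias_initial

print(round(bias_initial, 3))  # 0.005 -> nearly zeroed
print(round(bias_later, 3))    # 0.065 -> offset has grown
print(round(drift, 3))         # 0.06  -> the drift between the two studies
```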
Bias
Measurement Steps
Sample preparation
Setting up the instrument
Locating on the part
How much of the measurement process should we repeat?
Using Shewhart Charts I
Repeatability
Using Shewhart Charts II
Evaluating Bias & Repeatability
People variance
Times done
Trials
Stability
Variation in measurements of a single characteristic
On the same master
Over an extended period of time
Evaluate using Shewhart charts
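A minimal sketch of such a stability check using an individuals (Shewhart) chart, with the standard moving-range constant 2.66 and invented readings:

```python
# Hypothetical sketch: individuals chart over repeated readings of the same
# master, taken over time. Control limits use the conventional moving-range
# constant 2.66 (= 3 / d2, with d2 = 1.128 for a moving range of 2).
import statistics

# Invented weekly readings of one master part
readings = [5.01, 5.02, 5.00, 5.03, 5.01, 5.02, 5.00, 5.01]

center = statistics.mean(readings)
moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
mr_bar = statistics.mean(moving_ranges)

ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

# A reading outside the limits suggests the measurement system is not stable
out_of_control = [x for x in readings if x > ucl or x < lcl]
print(round(center, 4), round(lcl, 4), round(ucl, 4))
print(out_of_control)  # [] -> no evidence of instability in this sample
```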
Evaluate Stability with Run Charts
Stability
Methods of Analysis
Analysis Tools
Describe Relationships
Substitute measurement for desired measurement
Actual measurement to reference value
Inexpensive gaging method versus Expensive gaging method
Appraiser A with appraiser B
Substitute Measurements
Line of Correlation
Stripping
Measurements vs. Reference Data
Measurements vs. Reference Correlation
Disparity
Comparing Two Appraisers
Run Charts Examine Stability
Multiple Run Charts
Displays 3 points: length of bar; bar-to-bar; bar cluster to cluster
Plot High and Low readings as the length of the bar
Each appraiser on a separate bar
Each piece in a separate bar cluster
(Chart labels: High Reading, Average Reading, Low Reading)
Multi-Vari Type I
Piece-to-piece variation is the biggest source of variation
Bar length (repeatability) is small in comparison
Appraiser differences (bar-to-bar) are small in comparison
Ideal Pattern
Multi-Vari Chart Example
Normalized Data
Multi-Vari Chart, Joined
Left over Repeatability
Remember – Nonconsecutive Pieces
Accumulation of Variances
Product/Process Variation
Measurement System Variation
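The accumulation idea can be stated numerically: variances add, standard deviations do not. The numbers below are invented:

```python
# Sketch of variance accumulation: observed variation is the sum (in variance
# terms) of product/process variation and measurement system variation.

product_sd = 0.40       # invented true part-to-part standard deviation
measurement_sd = 0.30   # invented measurement system standard deviation

# Variances add; standard deviations do not:
observed_var = product_sd ** 2 + measurement_sd ** 2   # 0.16 + 0.09 = 0.25
observed_sd = observed_var ** 0.5

print(observed_sd)  # 0.5 (not 0.7 -- the SDs themselves do not add)
```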
Evaluating R&R
Go/No-Go
Destructive Testing
If Gauges were Perfect
But Repeatability Means We Never Know The Precise Value
So - Actual Part Acceptance Will Look Like This:
The Effect of Bias on Part Acceptance
Go/No-Go gauges
Summary
Measurement Variation
Bias (Inaccuracy)
Repeatability (Imprecision)
Discrimination
Linearity
Stability
Measurement Systems
Material
Characteristic
Sampling and Preparation
Operational Definition of Measurement
Instrument
Appraiser
Environment and Ergonomics
Measurement Systems Evaluation Tools
Histograms
Probability paper
Run Charts
Scatter diagrams
Multi-Vari Charts
Gage “R&R” analysis
Analysis of Variance (ANOVA)
Shewhart “Control” Charts
Shewhart Charts
Rule of Ten
Operating Characteristic Curve
Special Problems
Go/No-Go Gages
Attribute Inspection
Destructive Testing