CA-1012134-Agc-A01-A01-MSA GRR - 2. Process Customer Label & Homologation.
Location
Standard
Annex 01
Attributive GRR
Instructions:
The following spreadsheet is used to calculate an attribute MSA, in which 50 samples will be evaluated
using 3 operators and 3 trials per sample.
Please specify the general product information, such as the product name, the part number, the characteristic that the operators are checking, and the name of the person who performs the MSA with the operators.
Please specify the general gauge / equipment information, such as the gauge name and identification, the tolerance of the product for the PASS / FAIL decision, and the date on which the MSA was conducted.
Internal CA XXXXXXX
© Continental AG. 2018 Version 1
0 = Not OK parts
1 = OK parts
3. Evaluate the MSA result (kappa and operator effectiveness) according to the acceptance criteria:
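Operator effectiveness in this step can be sketched as the share of an appraiser's 0/1 decisions that match the reference (true) value over all samples and trials. A minimal illustration — the function name and data below are hypothetical, not part of the template:

```python
# Operator effectiveness: correct decisions / total decisions, where a
# decision is correct when it matches the reference (true) value.
# Illustrative sketch; names and data are not from the template.

def effectiveness(decisions, reference):
    """decisions: list of trials, each a list of 0/1 calls per sample.
    reference: true 0/1 value for each sample."""
    total = 0
    matched = 0
    for trial in decisions:
        for call, truth in zip(trial, reference):
            total += 1
            matched += (call == truth)
    return matched / total

# One appraiser, 3 trials on 5 samples (1 = OK, 0 = Not OK).
ref = [1, 1, 0, 1, 0]
trials = [
    [1, 1, 0, 1, 0],  # 5 correct
    [1, 1, 0, 1, 1],  # 4 correct
    [1, 0, 0, 1, 0],  # 4 correct
]
print(effectiveness(trials, ref))  # 13/15 ≈ 0.867
```

Commonly cited MSA guidelines treat effectiveness of roughly 90% or more as acceptable, but the acceptance criteria defined in this workbook take precedence.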
Troubleshooting:
Attribute Agreement Analysis consists of creating cross tabulation tables that display the observed frequency distribution of the potential outcomes (Accept or Reject) as tables of counts.
The Kappa statistic is the main metric used to measure how good or bad an attribute measurement system is. It summarizes the level of agreement between raters after agreement by chance has been removed, and it tests how well raters agree with themselves (repeatability) and with each other (reproducibility).
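As a rough sketch of the kappa computation described above — assuming Cohen's kappa for two raters and 0/1 outcomes; the function name and sample data are illustrative, not taken from the workbook:

```python
# Cohen's kappa for two raters classifying the same parts as
# 1 (OK) or 0 (Not OK). Illustrative sketch.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: fraction of parts where both raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions,
    # summed over the two categories.
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in (0, 1))
    return (p_o - p_e) / (1 - p_e)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # rater A's calls (hypothetical)
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]  # rater B's calls (hypothetical)
print(round(cohens_kappa(a, b), 3))  # → 0.524
```

Here the observed agreement is 0.80 and the chance agreement 0.58, so kappa = (0.80 − 0.58) / (1 − 0.58) ≈ 0.524 — agreement well above chance, but far from perfect.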
Sources of Variability
The AIAG MSA reference manual and VDA 5 give us some potential sources of variation in the measurement system, which we can use for problem solving and risk analysis.
VDA 5 approach; Section 4.1 Influences causing the uncertainty of the measurement result; Page 26
AIAG approach; Section B, source of variation; Page 17
ATTRIBUTIVE GRR CA 1012134–AGC-A01-A01
Internal CA 0608454
© Continental AG. 2018 Version 1
Numerical Analysis
a) Pairwise Agreement Between Appraisers – Per Evaluation (Reproducibility)
b) Agreement Between Each Appraiser versus the Reference Standard – Per Evaluation (Bias)
(Cross tabulation and Kappa tables for the appraiser pairs A * B, A * C, B * C, and for each appraiser A, B, C against the reference standard.)
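The pairwise cross tabulations behind these tables are plain counts of decision combinations; a minimal sketch using hypothetical data (1 = OK, 0 = Not OK):

```python
# Cross tabulation of two appraisers' decisions: count how often each
# (A, B) combination occurs over the same evaluations.
from collections import Counter

a = [1, 1, 0, 1, 0, 1]  # appraiser A's calls (hypothetical)
b = [1, 0, 0, 1, 1, 1]  # appraiser B's calls (hypothetical)

table = Counter(zip(a, b))
for cell in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"A={cell[0]}, B={cell[1]}: {table[cell]}")
```

The diagonal cells (0, 0) and (1, 1) are the agreements; the off-diagonal cells feed the disagreement counts and, together with the marginals, the kappa computation.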
e) Agreement Between Appraisers – Across Trials (Reproducibility)
f) Agreement Across All Appraisers versus the Reference Standard (Bias)
(A * B * C cross tabulation table.)
                         e) Appraisers A * B * C    f) Appraisers vs. Reference
Number Evaluated:        0                          0
Number Matched:          0                          0
Number Not Matched:      0                          0
Upper 95% Conf. Bound:
Proportion Matched:
Lower 95% Conf. Bound:
Kappa:
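The "Proportion Matched" rows with 95% confidence bounds come from a binomial interval on matched / evaluated. The workbook does not state which interval it uses; this sketch uses the Wilson score interval as one common choice, with hypothetical counts:

```python
# Wilson score interval for a binomial proportion (matched / evaluated).
# Illustrative sketch; the template may use a different interval.
import math

def wilson_interval(matched, evaluated, z=1.96):
    p = matched / evaluated
    denom = 1 + z**2 / evaluated
    center = (p + z**2 / (2 * evaluated)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / evaluated
                                   + z**2 / (4 * evaluated**2))
    return center - half, center + half

# Hypothetical: 42 of 50 evaluations matched.
lo, hi = wilson_interval(matched=42, evaluated=50)
print(f"Proportion Matched: {42 / 50:.2f}")
print(f"95% bounds: [{lo:.3f}, {hi:.3f}]")
```

The interval narrows as the number of evaluations grows, which is one reason the template calls for 50 samples × 3 trials rather than a handful of checks.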
g) Effectiveness
Acceptance Criteria